
CrewAI Flow

Last Updated : 15 Sep, 2025

CrewAI Flows are used for building, orchestrating and managing AI workflows in a structured, event-driven manner. They allow developers to combine tasks, Crews (groups of agents) and plain Python code into multi-step processes with control over state, execution paths, conditional logic and persistence.

Flows are designed to ensure that complex automations can be built without losing clarity, reproducibility or control. They integrate closely with Crews (CrewAI’s agent groups) and support event-listening, branching, loops and state management.

Workflow of CrewAI Flows

Here’s how a typical Flow in CrewAI is structured:

  1. Start: A Flow begins from a method decorated with @start(). This is the entry point.
  2. Task Execution: The Flow runs tasks, Crews or direct LLM calls, depending on what’s defined. Each step can produce outputs that later steps consume.
  3. Event or Listener: Using @listen(...) decorated methods, the Flow can respond to the completion of earlier steps. For example, a method might wait (“listen”) for a prior method to finish before running.
  4. Conditional Routing: With constructs like routers (via @router) or logical operators (or_, and_), the Flow can branch based on previous outputs.
  5. State Management: Throughout the flow, there is a state object (structured via Pydantic models or unstructured as a dict) that carries data between steps. Flow state also gets a unique identifier (UUID) for tracking.
  6. Persistence: Optional state persistence ensures that flows (or parts of them) can resume across restarts or that prior state can be retrieved.
  7. Visualization and Output: Flows support plotting or visualizing workflow structure (methods, relationships) and final output, including intermediate states.

Components of CrewAI Flows

These are the core building blocks or features that make CrewAI Flows useful and flexible:

1. Decorators

CrewAI Flows use Python decorators to define how and when each part of a workflow runs.

  • @start(): marks the entry method.
  • @listen(): listens to outputs or completion of other methods.
  • @router(): for conditional branching or routing based on state or outputs.

2. State

Flows maintain a state object that carries information between steps and helps us track execution:

  • Unstructured state (dictionary-style) for maximum flexibility.
  • Structured state (using Pydantic models) for type safety, schema validation and auto-completion.
  • Unique identifiers (UUIDs) assigned automatically to track each flow execution.

3. Event-Driven Execution

Methods can trigger other methods upon completion. Flows support logical composition like or_ / and_ to combine events.

4. Conditional Logic and Routing

Based on output or state, flows can take different execution paths. This is useful for branching, error handling and supporting multiple possible workflows.

5. Integration with Crews

Crews are groups of agents meant for collaborative tasks. Flows can orchestrate Crews, combining them with direct LLM calls and custom code.

6. Persistence and Resumption

Flows allow saving state, so workflows can resume or be inspected later.

7. Visualization Tools

Methods like plot() help us see the flow structure visually.

Implementation of Example Flow with CrewAI

We will build a simple Flow using CrewAI to demonstrate how these components work in practice:

1. Setting Up the Environment

Before working with Flows, we need to install CrewAI:

!pip install crewai

We also need to set any LLM API keys if external LLMs are used:

Python
import os
os.environ["OPENAI_API_KEY"] = "your-api-key-here"

2. Importing Required Libraries

First, we import the necessary classes to define Flows and interact with LLMs.

  • Flow: Base class for defining a Flow.
  • start: Decorator marking the entry point of a Flow.
  • listen: Decorator for methods that run after another method completes.
  • completion: Helper function to call the LLM with model and messages.
Python
from crewai.flow.flow import Flow, listen, start
from litellm import completion

3. Defining the Flow Class

We define a Flow class that orchestrates the steps.

  • model: Specifies the LLM model to use.
  • self.state: Dictionary holding intermediate and final results.
Python
class DishFlow(Flow):
    model = "gpt-4o-mini"

4. Start Method – Suggest a Cuisine

This is the entry point of the Flow, decorated with @start():

  • @start(): Marks this method as the Flow entry point.
  • completion(model, messages): Sends a prompt to the LLM.
  • self.state["cuisine"]: Stores the chosen cuisine for later steps.
Python
    @start()
    def suggest_cuisine(self):
        print("Flow started")
        response = completion(
            model=self.model,
            messages=[
                {"role": "user", "content": "Suggest a cuisine for a meal."}
            ]
        )
        cuisine = response["choices"][0]["message"]["content"]
        self.state["cuisine"] = cuisine
        print(f"Chosen cuisine: {cuisine}")
        return cuisine

5. Listener Method – Suggest a Dish

This method runs after suggest_cuisine and produces a dish:

  • @listen(suggest_cuisine): Ensures this method runs after the start method.
  • cuisine: Argument automatically passed from the previous step.
  • self.state["dish"]: Stores the suggested dish.
Python
    @listen(suggest_cuisine)
    def suggest_dish(self, cuisine):
        response = completion(
            model=self.model,
            messages=[
                {"role": "user", "content": f"Suggest a popular dish from {cuisine} cuisine."}
            ]
        )
        dish = response["choices"][0]["message"]["content"]
        self.state["dish"] = dish
        print(f"Suggested dish: {dish}")
        return dish

6. Listener Method – Provide a Cooking Tip

This method runs after suggest_dish and produces a useful cooking tip:

  • @listen(suggest_dish): Ensures this method runs after the dish is suggested.
  • dish: Argument automatically passed from the previous step.
  • self.state["tip"]: Stores the tip for later inspection or output.
Python
    @listen(suggest_dish)
    def give_tip(self, dish):
        response = completion(
            model=self.model,
            messages=[
                {"role": "user", "content": f"Give a useful cooking tip for making {dish}."}
            ]
        )
        tip = response["choices"][0]["message"]["content"]
        self.state["tip"] = tip
        return tip

7. Running and Visualizing the Flow

Finally, we instantiate the Flow, visualize it and execute it.

  • flow.plot(): Visualizes the Flow structure.
  • flow.kickoff(): Executes the Flow from start to finish.
  • result: Contains the final output from the last step.
Python
flow = DishFlow()
flow.plot()
result = flow.kickoff()
print("Final output:", result)

Output:

[Flow Diagram generated by flow.plot(), showing suggest_cuisine → suggest_dish → give_tip]

Running this Flow produces:

  • cuisine: The cuisine suggested by the model.
  • dish: A dish from that cuisine.
  • tip: A helpful cooking tip for that dish.

Applications of CrewAI Flows

These are real-world use cases where CrewAI Flows shine:

  • Content generation pipelines: combining outline creation, content writing, review and revision.
  • Automations with conditional branching: e.g. reading an email and deciding, based on its content, whether to reply, archive or escalate.
  • Document or data processing: e.g. OCR → extraction → analysis → reporting.
  • Chatbots with memory and decision logic: responses depend on the user’s previous interactions and stored state.
  • Integrations with external systems or APIs: Flows that trigger external services, wait for responses and branch accordingly.
