Build Flows with the Flow Builder#
Prerequisites
This guide assumes you are familiar with the following concepts:
- Flows and basic nodes/edges
- Selecting an LLM configuration
Overview#
The FlowBuilder provides a concise, readable way to assemble flows without manually wiring every edge. It supports:
- build_linear_flow lets you quickly assemble a sequence of connected nodes.
- add_node and add_edge enable you to manually add individual nodes and wire control edges for custom flow topologies.
- add_data_edge lets you specify data flow between nodes.
- add_sequence adds multiple nodes in order and automatically connects them with control edges.
- set_entry_point and set_finish_points declare where your flow begins and ends.
- add_conditional allows you to branch execution to specific nodes based on outputs from a source node.
See the full API in API › Flows and quick snippets in the Reference Sheet.
1. Build a linear flow#
Create two LLM nodes and connect them linearly with a single call.
from pyagentspec.flows.flowbuilder import FlowBuilder
from pyagentspec.flows.nodes import LlmNode
from pyagentspec.llms import VllmConfig
llm_config = VllmConfig(
name="Llama 3.1 8B instruct",
url="your_url",
model_id="meta-llama/Meta-Llama-3.1-8B-Instruct",
)
greet = LlmNode(name="greet", llm_config=llm_config, prompt_template="Say hello")
reply = LlmNode(name="reply", llm_config=llm_config, prompt_template="Say world")
linear_flow = FlowBuilder.build_linear_flow([greet, reply])
API Reference: FlowBuilder
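The same two-node flow can also be assembled step by step with the chained builder methods listed in the overview. The sketch below assumes this explicit form produces a flow equivalent to the one returned by build_linear_flow.
# Equivalent construction with the chained builder API
# (assumed to be equivalent to build_linear_flow)
linear_flow_explicit = (
    FlowBuilder()
    .add_sequence([greet, reply])   # greet -> reply, connected with a control edge
    .set_entry_point(greet)         # the flow starts at greet
    .set_finish_points([reply])     # reply connects to an automatically created EndNode
    .build()
)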
2. Add a conditional branch#
Add a branching step where an output of a node (for example, an output named decision) determines which node to run next. You can also define multiple finish points.
decider = LlmNode(
name="decider",
llm_config=llm_config,
prompt_template="Return success or fail",
)
on_success = LlmNode(name="on_success", llm_config=llm_config, prompt_template="OK")
on_fail = LlmNode(name="on_fail", llm_config=llm_config, prompt_template="KO")
flow_with_branch = (
FlowBuilder()
.add_sequence([decider])
.add_node(on_success)
.add_node(on_fail)
.add_conditional(
source_node=decider,
source_value=LlmNode.DEFAULT_OUTPUT,
destination_map={"success": on_success, "fail": on_fail},
default_destination=on_fail,
)
.set_entry_point(decider)
.set_finish_points([on_success, on_fail])
.build()
)
Notes:
- add_conditional accepts the branch key as a string output name (e.g., "decision") or as a tuple (node_or_name, output_name); the tuple form is sketched below.
- set_finish_points declares which nodes connect to automatically created EndNode(s).
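For illustration, here is the same branching flow with the branch key written in the tuple form. This sketch reuses the decider, on_success, and on_fail nodes defined above and assumes the node object itself can be passed as the first tuple element (the note above also allows a node name string).
flow_with_branch_tuple = (
    FlowBuilder()
    .add_sequence([decider])
    .add_node(on_success)
    .add_node(on_fail)
    .add_conditional(
        source_node=decider,
        # Tuple form of the branch key: (node_or_name, output_name)
        source_value=(decider, LlmNode.DEFAULT_OUTPUT),
        destination_map={"success": on_success, "fail": on_fail},
        default_destination=on_fail,
    )
    .set_entry_point(decider)
    .set_finish_points([on_success, on_fail])
    .build()
)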
3. Manually connect nodes#
Add individual nodes and explicitly define both control and data edges for full control over your flow.
producer = LlmNode(name="producer", llm_config=llm_config, prompt_template="Say Hello")
consumer1 = LlmNode(name="consumer1", llm_config=llm_config, prompt_template="{{generated_text}}")
consumer2 = LlmNode(name="consumer2", llm_config=llm_config, prompt_template="{{also_value}}")
flow_with_connections = (
FlowBuilder()
.add_node(producer)
.add_node(consumer1)
.add_node(consumer2)
.add_edge(producer, consumer1)
.add_edge(producer, consumer2)
# Pass the producer's default output to consumer1 under its original name (LlmNode.DEFAULT_OUTPUT)
.add_data_edge(producer, consumer1, LlmNode.DEFAULT_OUTPUT)
# Rename the same output to "also_value" so it fills consumer2's template variable
.add_data_edge(producer, consumer2, (LlmNode.DEFAULT_OUTPUT, "also_value"))
.set_entry_point(producer)
.set_finish_points([consumer1, consumer2])
.build()
)
Notes:
- add_node lets you add each node individually.
- add_edge adds a control edge that defines the order in which your nodes execute.
- add_data_edge passes specific output data from a source node to a target node.
This approach allows you to design custom topologies beyond simple sequences or branches.
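As an illustration of a custom topology, the sketch below fans the outputs of two nodes into a single consumer. The node names, prompts, and template variables (text_a, text_b) are hypothetical; the builder calls are the same ones shown above.
summary_a = LlmNode(name="summary_a", llm_config=llm_config, prompt_template="Summarize topic A")
summary_b = LlmNode(name="summary_b", llm_config=llm_config, prompt_template="Summarize topic B")
merger = LlmNode(
    name="merger",
    llm_config=llm_config,
    prompt_template="Combine: {{text_a}} and {{text_b}}",
)

fan_in_flow = (
    FlowBuilder()
    .add_node(summary_a)
    .add_node(summary_b)
    .add_node(merger)
    # Control edges: execute summary_a, then summary_b, then merger
    .add_edge(summary_a, summary_b)
    .add_edge(summary_b, merger)
    # Data edges: rename each generated_text output to the template variable it fills
    .add_data_edge(summary_a, merger, (LlmNode.DEFAULT_OUTPUT, "text_a"))
    .add_data_edge(summary_b, merger, (LlmNode.DEFAULT_OUTPUT, "text_b"))
    .set_entry_point(summary_a)
    .set_finish_points([merger])
    .build()
)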
4. Export the flow#
Serialize your flow to Agent Spec JSON for execution in a compatible runtime.
from pyagentspec.serialization import AgentSpecSerializer
serialized_linear = AgentSpecSerializer().to_json(linear_flow)
serialized_branch = AgentSpecSerializer().to_json(flow_with_branch)
API Reference: AgentSpecSerializer
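If you want to persist the result, the serialized string can be written to a file with the standard library. The file name is only an example, and this assumes to_json returns the JSON document as a string.
from pathlib import Path

# Write the Agent Spec JSON to disk so a compatible runtime can load it
Path("linear_flow.json").write_text(serialized_linear)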
Here is what the Agent Spec representation will look like ↓#
Click here to see the flow configuration.
{
"component_type": "Flow",
"id": "ff3b388c-63dc-4efb-bde6-8fb9f2cbd98c",
"name": "Flow",
"description": null,
"metadata": {},
"inputs": [],
"outputs": [],
"start_node": {
"$component_ref": "fea10024-86cd-4af7-a04c-e0e2fb943edd"
},
"nodes": [
{
"$component_ref": "3cfe5496-6f81-4440-af26-39b2a06db16f"
},
{
"$component_ref": "14c8019a-ae69-4a25-82d6-12ab845cb778"
},
{
"$component_ref": "fea10024-86cd-4af7-a04c-e0e2fb943edd"
},
{
"$component_ref": "277a121e-3c4a-494d-8030-e6f9414b73c5"
}
],
"control_flow_connections": [
{
"component_type": "ControlFlowEdge",
"id": "3b1bdfa8-dfd8-4b2d-9740-faa9c01f7363",
"name": "control_edge_greet_reply_None",
"description": null,
"metadata": {},
"from_node": {
"$component_ref": "3cfe5496-6f81-4440-af26-39b2a06db16f"
},
"from_branch": null,
"to_node": {
"$component_ref": "14c8019a-ae69-4a25-82d6-12ab845cb778"
}
},
{
"component_type": "ControlFlowEdge",
"id": "976c1074-8e6b-45f7-8b6f-bf2b6f504367",
"name": "control_edge_StartNode_greet_None",
"description": null,
"metadata": {},
"from_node": {
"$component_ref": "fea10024-86cd-4af7-a04c-e0e2fb943edd"
},
"from_branch": null,
"to_node": {
"$component_ref": "3cfe5496-6f81-4440-af26-39b2a06db16f"
}
},
{
"component_type": "ControlFlowEdge",
"id": "6752d469-6a3b-498c-8c92-0324ddd76a88",
"name": "control_edge_reply_EndNode_1_None",
"description": null,
"metadata": {},
"from_node": {
"$component_ref": "14c8019a-ae69-4a25-82d6-12ab845cb778"
},
"from_branch": null,
"to_node": {
"$component_ref": "277a121e-3c4a-494d-8030-e6f9414b73c5"
}
}
],
"data_flow_connections": [],
"$referenced_components": {
"3cfe5496-6f81-4440-af26-39b2a06db16f": {
"component_type": "LlmNode",
"id": "3cfe5496-6f81-4440-af26-39b2a06db16f",
"name": "greet",
"description": null,
"metadata": {},
"inputs": [],
"outputs": [
{
"description": "Raw text generated by the LLM",
"title": "generated_text",
"type": "string"
}
],
"branches": [
"next"
],
"llm_config": {
"$component_ref": "9e8fc43b-215f-4dfb-a7bd-c8b02fd97cd5"
},
"prompt_template": "Say hello"
},
"9e8fc43b-215f-4dfb-a7bd-c8b02fd97cd5": {
"component_type": "VllmConfig",
"id": "9e8fc43b-215f-4dfb-a7bd-c8b02fd97cd5",
"name": "Llama 3.1 8B instruct",
"description": null,
"metadata": {},
"default_generation_parameters": null,
"url": "http://localhost:8000",
"model_id": "meta-llama/Meta-Llama-3.1-8B-Instruct"
},
"14c8019a-ae69-4a25-82d6-12ab845cb778": {
"component_type": "LlmNode",
"id": "14c8019a-ae69-4a25-82d6-12ab845cb778",
"name": "reply",
"description": null,
"metadata": {},
"inputs": [],
"outputs": [
{
"description": "Raw text generated by the LLM",
"title": "generated_text",
"type": "string"
}
],
"branches": [
"next"
],
"llm_config": {
"$component_ref": "9e8fc43b-215f-4dfb-a7bd-c8b02fd97cd5"
},
"prompt_template": "Say world"
},
"fea10024-86cd-4af7-a04c-e0e2fb943edd": {
"component_type": "StartNode",
"id": "fea10024-86cd-4af7-a04c-e0e2fb943edd",
"name": "StartNode",
"description": null,
"metadata": {},
"inputs": [],
"outputs": [],
"branches": [
"next"
]
},
"277a121e-3c4a-494d-8030-e6f9414b73c5": {
"component_type": "EndNode",
"id": "277a121e-3c4a-494d-8030-e6f9414b73c5",
"name": "EndNode_1",
"description": null,
"metadata": {},
"inputs": [],
"outputs": [],
"branches": [],
"branch_name": "next"
}
},
"agentspec_version": "25.4.1"
}
The same configuration in YAML:

component_type: Flow
id: ff3b388c-63dc-4efb-bde6-8fb9f2cbd98c
name: Flow
description: null
metadata: {}
inputs: []
outputs: []
start_node:
$component_ref: fea10024-86cd-4af7-a04c-e0e2fb943edd
nodes:
- $component_ref: 3cfe5496-6f81-4440-af26-39b2a06db16f
- $component_ref: 14c8019a-ae69-4a25-82d6-12ab845cb778
- $component_ref: fea10024-86cd-4af7-a04c-e0e2fb943edd
- $component_ref: 277a121e-3c4a-494d-8030-e6f9414b73c5
control_flow_connections:
- component_type: ControlFlowEdge
id: 3b1bdfa8-dfd8-4b2d-9740-faa9c01f7363
name: control_edge_greet_reply_None
description: null
metadata: {}
from_node:
$component_ref: 3cfe5496-6f81-4440-af26-39b2a06db16f
from_branch: null
to_node:
$component_ref: 14c8019a-ae69-4a25-82d6-12ab845cb778
- component_type: ControlFlowEdge
id: 976c1074-8e6b-45f7-8b6f-bf2b6f504367
name: control_edge_StartNode_greet_None
description: null
metadata: {}
from_node:
$component_ref: fea10024-86cd-4af7-a04c-e0e2fb943edd
from_branch: null
to_node:
$component_ref: 3cfe5496-6f81-4440-af26-39b2a06db16f
- component_type: ControlFlowEdge
id: 6752d469-6a3b-498c-8c92-0324ddd76a88
name: control_edge_reply_EndNode_1_None
description: null
metadata: {}
from_node:
$component_ref: 14c8019a-ae69-4a25-82d6-12ab845cb778
from_branch: null
to_node:
$component_ref: 277a121e-3c4a-494d-8030-e6f9414b73c5
data_flow_connections: []
$referenced_components:
3cfe5496-6f81-4440-af26-39b2a06db16f:
component_type: LlmNode
id: 3cfe5496-6f81-4440-af26-39b2a06db16f
name: greet
description: null
metadata: {}
inputs: []
outputs:
- description: Raw text generated by the LLM
title: generated_text
type: string
branches:
- next
llm_config:
$component_ref: 9e8fc43b-215f-4dfb-a7bd-c8b02fd97cd5
prompt_template: Say hello
9e8fc43b-215f-4dfb-a7bd-c8b02fd97cd5:
component_type: VllmConfig
id: 9e8fc43b-215f-4dfb-a7bd-c8b02fd97cd5
name: Llama 3.1 8B instruct
description: null
metadata: {}
default_generation_parameters: null
url: http://localhost:8000
model_id: meta-llama/Meta-Llama-3.1-8B-Instruct
14c8019a-ae69-4a25-82d6-12ab845cb778:
component_type: LlmNode
id: 14c8019a-ae69-4a25-82d6-12ab845cb778
name: reply
description: null
metadata: {}
inputs: []
outputs:
- description: Raw text generated by the LLM
title: generated_text
type: string
branches:
- next
llm_config:
$component_ref: 9e8fc43b-215f-4dfb-a7bd-c8b02fd97cd5
prompt_template: Say world
fea10024-86cd-4af7-a04c-e0e2fb943edd:
component_type: StartNode
id: fea10024-86cd-4af7-a04c-e0e2fb943edd
name: StartNode
description: null
metadata: {}
inputs: []
outputs: []
branches:
- next
277a121e-3c4a-494d-8030-e6f9414b73c5:
component_type: EndNode
id: 277a121e-3c4a-494d-8030-e6f9414b73c5
name: EndNode_1
description: null
metadata: {}
inputs: []
outputs: []
branches: []
branch_name: next
agentspec_version: 25.4.1
Recap#
This how-to guide showed how to:
- Build a linear flow in one line with build_linear_flow
- Add a conditional branch with add_conditional
- Declare entry and finish points and serialize your flow
Next steps#
- Explore more patterns in the Reference Sheet
- See the complete API in API › Flows
- Learn about branching and loops in How to Develop a Flow with Conditional Branches