How to Execute an Agent Spec Configuration with WayFlow#
Prerequisites
This guide assumes you have already defined either a Flow or an Agent using PyAgentSpec and exported its Agent Spec configuration.
This guide demonstrates how to:
Install WayFlow to use its Agent Spec adapter
Define a tool execution registry
Load an Agent Spec configuration using the WayFlow adapter
Run and interact with the assistant
The example uses a minimal Agent configured with a single tool for performing multiplications. The same approach can be applied to load and execute more complex assistants.
1. Installation#
Executing the code in this guide requires installing the wayflowcore package.
pip install "wayflowcore==25.4.1"
You can find more information about the Agent Spec adapter in the WayFlow API Reference.
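To quickly verify the installation, you can query the installed version from Python. This check is illustrative and uses only the standard library:

from importlib.metadata import version

# Illustrative sanity check: prints the installed wayflowcore version (expected: 25.4.1).
print(version("wayflowcore"))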
2. Defining the tool registry#
Before loading the configuration, define a tool registry that specifies how each tool is executed. The registry is required because tool implementations are not included in the Agent Spec configuration of the assistant.
The example below registers a single tool that performs multiplications. In practice, the registry should contain an implementation for every tool the assistant uses (see the sketch after the code example below).
This guide focuses on the simplest tool type: ServerTool.
For information on other types, such as ClientTool and RemoteTool, refer to the Agent Spec Language Specification.
from wayflowcore.tools import ServerTool

multiplication_tool = ServerTool(
    name="multiplication_tool",
    description="Tool that allows to compute multiplications",
    parameters={"a": {"type": "integer"}, "b": {"type": "integer"}},
    output={"title": "product", "type": "integer"},
    func=lambda a, b: a * b,
)

tool_registry = {
    "multiplication_tool": multiplication_tool,
}
3. Loading the Agent Spec configuration#
The agent configuration is shown below, in both JSON and YAML formats. Note that it first defines two components, multiplication_tool and vllm_config, which are then referenced in the definition of the agent itself.
{
  "component_type": "Agent",
  "id": "e52d2c57-0bdc-4f25-948a-2e9d9f670008",
  "name": "Math homework assistant",
  "description": null,
  "metadata": {},
  "inputs": [],
  "outputs": [],
  "llm_config": {
    "component_type": "VllmConfig",
    "id": "vllm_config",
    "name": "llama-3.1-8b-instruct",
    "description": null,
    "metadata": {},
    "default_generation_parameters": {},
    "url": "LLAMA_PUBLIC_ENDPOINT",
    "model_id": "meta-llama/Meta-Llama-3.1-8B-Instruct"
  },
  "system_prompt": "You are an assistant for helping with math homework.",
  "tools": [
    {
      "component_type": "ServerTool",
      "id": "multiplication_tool",
      "name": "multiplication_tool",
      "description": "Tool that allows to compute multiplications",
      "metadata": {},
      "inputs": [
        {
          "title": "a",
          "type": "integer"
        },
        {
          "title": "b",
          "type": "integer"
        }
      ],
      "outputs": [
        {
          "title": "product",
          "type": "integer"
        }
      ]
    }
  ],
  "agentspec_version": "25.4.1"
}
component_type: Agent
id: e52d2c57-0bdc-4f25-948a-2e9d9f670008
name: Math homework assistant
description: null
metadata: {}
inputs: []
outputs: []
llm_config:
  component_type: VllmConfig
  id: vllm_config
  name: llama-3.1-8b-instruct
  description: null
  metadata: {}
  default_generation_parameters: {}
  url: LLAMA_PUBLIC_ENDPOINT
  model_id: meta-llama/Meta-Llama-3.1-8B-Instruct
system_prompt: You are an assistant for helping with math homework.
tools:
- component_type: ServerTool
  id: multiplication_tool
  name: multiplication_tool
  description: Tool that allows to compute multiplications
  metadata: {}
  inputs:
  - title: a
    type: integer
  - title: b
    type: integer
  outputs:
  - title: product
    type: integer
agentspec_version: 25.4.1
Loading the configuration into the WayFlow executor is straightforward once the tool_registry has been defined.
from wayflowcore.agentspec import AgentSpecLoader

# AGENTSPEC_CONFIG is the YAML configuration shown above, stored as a string.
loader = AgentSpecLoader(tool_registry=tool_registry)
assistant = loader.load_yaml(AGENTSPEC_CONFIG)
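If the configuration is stored on disk rather than in a Python string, it can be read with the standard library before being passed to the loader. The file name below is purely illustrative:

from pathlib import Path

from wayflowcore.agentspec import AgentSpecLoader

# "math_assistant.yaml" is an illustrative path to a file containing the YAML configuration shown above.
serialized_config = Path("math_assistant.yaml").read_text()

loader = AgentSpecLoader(tool_registry=tool_registry)
assistant = loader.load_yaml(serialized_config)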
4. Running the assistant#
To start the interaction, first create a conversation.
The assistant then prompts the user for input, and responses from the agent are printed continuously until the process is interrupted (e.g., via Ctrl+C).
Messages of type TOOL_REQUEST and TOOL_RESULT are also displayed.
These indicate when the agent invokes one of the available tools to respond to the user.
For more information on message types, refer to the WayFlow API documentation.
from wayflowcore import MessageType

if __name__ == "__main__":
    conversation = assistant.start_conversation()
    message_idx = 0
    while True:
        user_input = input("\nUSER >>> ")
        conversation.append_user_message(user_input)
        assistant.execute(conversation)
        messages = conversation.get_messages()
        # Print every message produced this turn, skipping the user message that was just appended.
        for message in messages[message_idx + 1 :]:
            if message.message_type == MessageType.TOOL_REQUEST:
                print(f"\n{message.message_type.value} >>> {message.tool_requests}")
            else:
                print(f"\n{message.message_type.value} >>> {message.content}")
        message_idx = len(messages)
You may also want to execute a non-conversational flow. In this case, the execution loop could be implemented as shown below:
if __name__ == "__main__":
    conversation = assistant.start_conversation({ "some_input_name": "some_input_value", ... })
    status = assistant.execute(conversation)
    for output_name, output_value in conversation.state.input_output_key_values.items():
        print(f"{output_name} >>> \n{output_value}")
Recap#
This guide covered how to:
Install the Agent Spec adapter for WayFlow.
Define the execution of tools in a tool registry.
Load an Agent Spec configuration using the WayFlow adapter.
Run the assistant and interact with it.
Below is the complete code from this guide.
import logging
import warnings

from wayflowcore import MessageType
from wayflowcore.agentspec import AgentSpecLoader
from wayflowcore.tools import ServerTool

warnings.filterwarnings("ignore")
logging.basicConfig(level=logging.CRITICAL)

AGENTSPEC_CONFIG = """
component_type: Agent
id: e52d2c57-0bdc-4f25-948a-2e9d9f670008
name: Math homework assistant
description: null
metadata: {}
inputs: []
outputs: []
llm_config:
  component_type: VllmConfig
  id: vllm_config
  name: llama-3.1-8b-instruct
  description: null
  metadata: {}
  default_generation_parameters: {}
  url: LLAMA_PUBLIC_ENDPOINT
  model_id: meta-llama/Meta-Llama-3.1-8B-Instruct
system_prompt: You are an assistant for helping with math homework.
tools:
- component_type: ServerTool
  id: multiplication_tool
  name: multiplication_tool
  description: Tool that allows to compute multiplications
  metadata: {}
  inputs:
  - title: a
    type: integer
  - title: b
    type: integer
  outputs:
  - title: product
    type: integer
"""

multiplication_tool = ServerTool(
    name="multiplication_tool",
    description="Tool that allows to compute multiplications",
    parameters={"a": {"type": "integer"}, "b": {"type": "integer"}},
    output={"title": "product", "type": "integer"},
    func=lambda a, b: a * b,
)

tool_registry = {
    "multiplication_tool": multiplication_tool,
}

loader = AgentSpecLoader(tool_registry=tool_registry)
assistant = loader.load_yaml(AGENTSPEC_CONFIG)

if __name__ == "__main__":
    conversation = assistant.start_conversation()
    message_idx = 0
    while True:
        user_input = input("\nUSER >>> ")
        conversation.append_user_message(user_input)
        assistant.execute(conversation)
        messages = conversation.get_messages()
        for message in messages[message_idx + 1 :]:
            if message.message_type == MessageType.TOOL_REQUEST:
                print(f"\n{message.message_type.value} >>> {message.tool_requests}")
            else:
                print(f"\n{message.message_type.value} >>> {message.content}")
        message_idx = len(messages)
Next steps#
To discover more advanced capabilities, refer to the WayFlow documentation and learn what you can build with Flows and Agents.