How to Use Disaggregated Configurations#
Disaggregated configurations enable you to separate certain components or values from the main Agent Spec configuration JSON/YAML, referencing them externally rather than serializing them inline. This pattern is commonly used to manage agent or application setups that require modularity and environment-specific customization.
Disaggregated configurations improve security and maintainability by allowing sensitive or frequently changing components, such as LLM configurations, URLs, or other runtime parameters, to be stored separately from the main configuration. This separation makes it easier to securely manage environment-specific settings, reuse shared components across multiple agents, and update or swap components without altering the core configuration, reducing duplication and the risk of accidental data exposure.
This guide will show you how to:
Create disaggregated components for agents and tools and use them effectively.
Save your main spec and disaggregated components into separate files.
Load the main spec with different versions of the disaggregated components.
Basic Implementation#
Let’s demonstrate by creating a weather agent that uses a client tool to fetch weather data. This example will show how disaggregation works in a simple scenario.
from pyagentspec.llms.vllmconfig import OpenAiCompatibleConfig
from pyagentspec.tools import ClientTool
from pyagentspec.property import StringProperty
from pyagentspec.agent import Agent
from pyagentspec.serialization import AgentSpecSerializer, AgentSpecDeserializer
llm_config_dev = OpenAiCompatibleConfig(
    name="llm-dev",
    model_id="llm-model_1",
    url="http://dev.llm.url",
)
city_input = StringProperty(title="city_name", default="zurich")
weather_output = StringProperty(title="forecast")
weather_tool = ClientTool(
    id="weather_tool",
    name="get_weather",
    description="Gets the weather for a city",
    inputs=[city_input],
    outputs=[weather_output],
)
agent = Agent(
    id="agent_id",
    name="Weather Agent",
    llm_config=llm_config_dev,
    system_prompt="You are a helpful weather assistant.",
    tools=[weather_tool],
)
Serialization/Deserialization#
Consider an LLM config, which often varies between development and production environments. Disaggregating this component enables you to load the appropriate version dynamically. Similarly, client tools may be mocked during development but swapped with real ones, such as database tools, in production. Storing these separately safeguards them while ensuring they remain accessible to the relevant agents. These are just a few examples; many other components can benefit from this modular approach.
serializer = AgentSpecSerializer()
main_yaml, disagg_yaml = serializer.to_yaml(
    agent,
    disaggregated_components=[
        (llm_config_dev, "llm_config"),
        (weather_tool, "client_weather_tool"),
    ],
    export_disaggregated_components=True,
)
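Because to_yaml returns the two YAML documents as strings, saving the main spec and the disaggregated components into separate files is a plain write to disk. Here is a minimal sketch; the file names are only illustrative, not an Agent Spec convention:
from pathlib import Path

# Write the main spec and the disaggregated components to separate files
# (the file names below are an example, not prescribed by Agent Spec)
Path("weather_agent.yaml").write_text(main_yaml)
Path("disaggregated_components.yaml").write_text(disagg_yaml)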
Now, at deserialization time, you can swap the disaggregated components as shown below:
deserializer = AgentSpecDeserializer()
component_registry = deserializer.from_yaml(
    disagg_yaml,
    import_only_referenced_components=True,
)
# Change the components dynamically
# For example, in this case we want to use a different LLM from the one we built the agent with
llm_config_prod = OpenAiCompatibleConfig(
    name="llm-prod",
    model_id="llm_model_2",
    url="http://prod.llm.url",
)
component_registry["llm_config"] = llm_config_prod
# The `client_weather_tool` remains the one that was deserialized from `disagg_yaml`
# Load the agent with the updated component registry
loaded_agent = deserializer.from_yaml(
    main_yaml,
    components_registry=component_registry,
)
Defining components dynamically in this way is the preferred approach for handling sensitive information, as it ensures that such data is never unnecessarily exposed or stored.
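For example, instead of hardcoding the production endpoint as above, you could build the replacement config from environment variables, so the URL and model name never appear in any serialized file. A minimal sketch; the environment variable names are hypothetical:
import os

# Build the production LLM config from the environment rather than from a file,
# so the endpoint details are never written into the spec
# (PROD_LLM_MODEL_ID and PROD_LLM_URL are hypothetical variable names)
llm_config_prod = OpenAiCompatibleConfig(
    name="llm-prod",
    model_id=os.environ["PROD_LLM_MODEL_ID"],
    url=os.environ["PROD_LLM_URL"],
)
component_registry["llm_config"] = llm_config_prod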
Now you can run the spec using any supported framework. For more information, see AgentSpec with Wayflow and AgentSpec across Frameworks.
Here is what the Agent Spec representation looks like, first in JSON and then in YAML. In each format, the first document is the main spec and the second contains the disaggregated components.
{
  "component_type": "Agent",
  "id": "agent_id",
  "name": "Weather Agent",
  "description": null,
  "metadata": {},
  "inputs": [],
  "outputs": [],
  "llm_config": {
    "$component_ref": "llm_config"
  },
  "system_prompt": "You are a helpful weather assistant.",
  "tools": [
    {
      "$component_ref": "client_weather_tool"
    }
  ],
  "agentspec_version": "25.4.1"
}
{
  "$referenced_components": {
    "llm_config": {
      "component_type": "OpenAiCompatibleConfig",
      "id": "9612d107-0e69-4169-85d1-8e4e3aee41ec",
      "name": "llm-dev",
      "description": null,
      "metadata": {},
      "default_generation_parameters": null,
      "url": "http://dev.llm.url",
      "model_id": "llm-model_1",
      "agentspec_version": "25.4.1"
    },
    "client_weather_tool": {
      "component_type": "ClientTool",
      "id": "weather_tool",
      "name": "get_weather",
      "description": "Gets the weather for a city",
      "metadata": {},
      "inputs": [
        {
          "title": "city_name",
          "default": "zurich",
          "type": "string"
        }
      ],
      "outputs": [
        {
          "title": "forecast",
          "type": "string"
        }
      ],
      "agentspec_version": "25.4.1"
    }
  }
}
component_type: Agent
id: agent_id
name: Weather Agent
description: null
metadata: {}
inputs: []
outputs: []
llm_config:
  $component_ref: llm_config
system_prompt: You are a helpful weather assistant.
tools:
  - $component_ref: client_weather_tool
agentspec_version: 25.4.1
$referenced_components:
  llm_config:
    component_type: OpenAiCompatibleConfig
    id: 9612d107-0e69-4169-85d1-8e4e3aee41ec
    name: llm-dev
    description: null
    metadata: {}
    default_generation_parameters: null
    url: http://dev.llm.url
    model_id: llm-model_1
    agentspec_version: 25.4.1
  client_weather_tool:
    component_type: ClientTool
    id: weather_tool
    name: get_weather
    description: Gets the weather for a city
    metadata: {}
    inputs:
      - title: city_name
        default: zurich
        type: string
    outputs:
      - title: forecast
        type: string
    agentspec_version: 25.4.1
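Because the disaggregated components live in their own document, another common pattern is to keep one components file per environment and load the matching one at deserialization time, leaving the main spec untouched. A minimal sketch reusing the calls shown above; the file names are hypothetical:
from pathlib import Path

# Load a production variant of the disaggregated components
# ("disaggregated_components.prod.yaml" is a hypothetical file name)
prod_registry = deserializer.from_yaml(
    Path("disaggregated_components.prod.yaml").read_text(),
    import_only_referenced_components=True,
)

# Resolve the main spec against the production components
loaded_agent = deserializer.from_yaml(
    Path("weather_agent.yaml").read_text(),
    components_registry=prod_registry,
)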
Recap#
In this guide, you’ve learned to split your agent setups into manageable, secure pieces using disaggregated configurations in Agent Spec.
Below is the complete code from this guide.
from pyagentspec.llms.vllmconfig import OpenAiCompatibleConfig
from pyagentspec.tools import ClientTool
from pyagentspec.property import StringProperty
from pyagentspec.agent import Agent
from pyagentspec.serialization import AgentSpecSerializer, AgentSpecDeserializer
llm_config_dev = OpenAiCompatibleConfig(
    name="llm-dev",
    model_id="llm-model_1",
    url="http://dev.llm.url",
)
city_input = StringProperty(title="city_name", default="zurich")
weather_output = StringProperty(title="forecast")
weather_tool = ClientTool(
    id="weather_tool",
    name="get_weather",
    description="Gets the weather for a city",
    inputs=[city_input],
    outputs=[weather_output],
)
agent = Agent(
    id="agent_id",
    name="Weather Agent",
    llm_config=llm_config_dev,
    system_prompt="You are a helpful weather assistant.",
    tools=[weather_tool],
)
serializer = AgentSpecSerializer()
main_yaml, disagg_yaml = serializer.to_yaml(
    agent,
    disaggregated_components=[
        (llm_config_dev, "llm_config"),
        (weather_tool, "client_weather_tool"),
    ],
    export_disaggregated_components=True,
)
deserializer = AgentSpecDeserializer()
component_registry = deserializer.from_yaml(
    disagg_yaml,
    import_only_referenced_components=True,
)
# Change the components dynamically
# For example, in this case we want to use a different LLM from the one we built the agent with
llm_config_prod = OpenAiCompatibleConfig(
    name="llm-prod",
    model_id="llm_model_2",
    url="http://prod.llm.url",
)
component_registry["llm_config"] = llm_config_prod
# The `client_weather_tool` remains the one that was deserialized from `disagg_yaml`
# Load the agent with the updated component registry
loaded_agent = deserializer.from_yaml(
    main_yaml,
    components_registry=component_registry,
)
Next Steps#
Now that you understand disaggregated configurations, explore more with: