How to build a simple ReAct Agent#

Agents can be configured to tackle many different scenarios, and configuring their instructions properly is essential to getting the behavior you want.

In this how-to guide, you will learn how to:

  • Configure the instructions of an Agent.

  • Set up instructions that vary based on some inputs.

  • Add tools to create a ReAct Agent.

Basic implementation#

In the scenario used throughout this guide, you need an agent that assists a user in writing articles. In its simplest implementation, you only have to specify a few basic elements that compose the agent. The first is the LLM, which can be defined as any of the LlmConfig implementations offered by Agent Spec. The Agent prompts this LLM to accomplish the tasks that you (in the system prompt) or the user (during the conversation) assign to it.

from pyagentspec.llms import VllmConfig

llm_config = VllmConfig(
    name="vllm-llama-4-maverick",
    model_id="llama-4-maverick",
    url="http://url.to.my.vllm.server/llama4mav",
)

Now that you have defined your LLM, you can create your Agent by giving it a name and a system prompt that instructs it on what to do.

from pyagentspec.agent import Agent

agent = Agent(
    name="Helpful agent",
    llm_config=llm_config,
    system_prompt="""Your a helpful writing assistant. Answer the user's questions about article writing.
Make sure to welcome the user first, but keep it short""",
)

Sometimes there is contextual information relevant to the conversation. Assume a user is interacting with the assistant and you know some relevant information about them. In such a scenario, you might want to make the agent aware of this information by injecting it into its system prompt. To make the assistant more context-aware, Agent Spec lets you define input properties through placeholders in the system_prompt: just write the name of the property between double curly brackets. In the following example, we inject the user's name into the agent's system prompt.

from pyagentspec.agent import Agent
from pyagentspec.property import StringProperty

agent = Agent(
    name="Helpful agent with username",
    llm_config=llm_config,
    system_prompt="""Your a helpful writing assistant. Answer the user's questions about article writing.
Make sure to welcome the user first, their name is {{user_name}}, but keep it short""",
    inputs=[StringProperty(title="user_name")]
)
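
At runtime, the value supplied for the user_name input replaces the {{user_name}} placeholder before the system prompt is sent to the LLM. The following plain-Python sketch only illustrates these substitution semantics; the Agent Spec runtime performs the substitution for you, and the name "Ada" is just an example value.

template = """You are a helpful writing assistant. Answer the user's questions about article writing.
Make sure to welcome the user first, their name is {{user_name}}, but keep it short"""

# Illustration only: the runtime resolves placeholders from the agent's input properties.
print(template.replace("{{user_name}}", "Ada"))
# You are a helpful writing assistant. Answer the user's questions about article writing.
# Make sure to welcome the user first, their name is Ada, but keep it short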

Lastly, you can add tools that the Agent can use, as part of its ReAct loop, to accomplish the assigned tasks. You do not need to update the system prompt to make the agent aware of these tools: simply define them and pass them when instantiating the Agent. After the code below, a hypothetical sketch shows what a server-side implementation of one of these tools could look like.

from pyagentspec.property import ListProperty
from pyagentspec.tools import ServerTool

tools = [
    ServerTool(
        name="get_synonyms",
        description="Given a word, return the list of synonyms according to the vocabulary",
        inputs=[StringProperty(title="word")],
        outputs=[ListProperty(title="synonyms", item_type=StringProperty(title="word"))]
    ),
    ServerTool(
        name="pretty_formatting",
        description="Given a paragraph, format the paragraph to fix spaces, newlines, indentation, etc.",
        inputs=[StringProperty(title="paragraph")],
        outputs=[StringProperty(title="formatted_paragraph")]
    ),
]

agent = Agent(
    name="Helpful agent with username and tools",
    llm_config=llm_config,
    system_prompt="""Your a helpful writing assistant. Answer the user's questions about article writing.
Make sure to welcome the user first, their name is {{user_name}}, but keep it short""",
    tools=tools,
    inputs=[StringProperty(title="user_name")]
)
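
Note that a ServerTool only declares the tool's contract: its name, description, inputs, and outputs. The code that actually executes the tool lives in whatever runtime runs the agent. As a purely hypothetical sketch (this function is not part of Agent Spec, and a real implementation would query an actual thesaurus), a server-side implementation matching the get_synonyms contract could look like this:

# Hypothetical server-side implementation of the "get_synonyms" contract declared above.
# It mirrors the declared interface: one string input ("word"), one list-of-strings output.
def get_synonyms(word: str) -> list[str]:
    # Toy in-memory vocabulary; a real implementation would call a thesaurus service.
    vocabulary = {
        "quick": ["fast", "rapid", "swift"],
        "article": ["piece", "essay", "write-up"],
    }
    return vocabulary.get(word.lower(), [])

How such a function gets bound to the ServerTool declaration depends on the execution framework you use to run the agent.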

Agent Spec Serialization#

You can export the assistant configuration using the AgentSpecSerializer.

from pyagentspec.serialization import AgentSpecSerializer

serialized_assistant = AgentSpecSerializer().to_json(agent)

Here is what the Agent Spec representation of this agent looks like:

{
  "component_type": "Agent",
  "id": "c0018da9-2d1d-4e5d-bfb5-63c3e6989d37",
  "name": "Helpful agent with username and tools",
  "description": null,
  "metadata": {},
  "inputs": [
    {
      "title": "user_name",
      "type": "string"
    }
  ],
  "outputs": [],
  "llm_config": {
    "component_type": "VllmConfig",
    "id": "6706d322-b10f-4432-a087-45bf22aabfb2",
    "name": "vllm-llama-4-maverick",
    "description": null,
    "metadata": {},
    "default_generation_parameters": null,
    "url": "http://url.to.my.vllm.server/llama4mav",
    "model_id": "llama-4-maverick"
  },
  "system_prompt": "Your a helpful writing assistant. Answer the user's questions about article writing.\nMake sure to welcome the user first, their name is {{user_name}}, but keep it short",
  "tools": [
    {
      "component_type": "ServerTool",
      "id": "40c7909f-a972-4497-90db-d7ba76047e28",
      "name": "get_synonyms",
      "description": "Given a word, return the list of synonyms according to the vocabulary",
      "metadata": {},
      "inputs": [
        {
          "title": "word",
          "type": "string"
        }
      ],
      "outputs": [
        {
          "title": "synonyms",
          "items": {
            "title": "word",
            "type": "string"
          },
          "type": "array"
        }
      ]
    },
    {
      "component_type": "ServerTool",
      "id": "a532a148-0dfc-4de6-a954-349b01b04275",
      "name": "pretty_formatting",
      "description": "Given a paragraph, format the paragraph to fix spaces, newlines, indentation, etc.",
      "metadata": {},
      "inputs": [
        {
          "title": "paragraph",
          "type": "string"
        }
      ],
      "outputs": [
        {
          "title": "formatted_paragraph",
          "type": "string"
        }
      ]
    }
  ],
  "agentspec_version": "25.4.1"
}
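
Since to_json appears to return the configuration as a JSON string (as the output above suggests), persisting or inspecting it only takes standard Python. A minimal sketch, using an arbitrary file name:

import json
from pathlib import Path

# Minimal sketch: write the serialized configuration to disk and check it parses as JSON.
# "assistant.json" is an arbitrary file name chosen for this example.
Path("assistant.json").write_text(serialized_assistant)
config = json.loads(serialized_assistant)
print(config["name"])  # Helpful agent with username and tools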

Recap#

In this guide, you learned how to configure Agent instructions with:

  • Pure text instructions;

  • Specific context-dependent variables;

  • Tools.

Below is the complete code from this guide.
from pyagentspec.agent import Agent
from pyagentspec.llms import VllmConfig
from pyagentspec.property import ListProperty, StringProperty
from pyagentspec.serialization import AgentSpecSerializer
from pyagentspec.tools import ServerTool

llm_config = VllmConfig(
    name="vllm-llama-4-maverick",
    model_id="llama-4-maverick",
    url="http://url.to.my.vllm.server/llama4mav",
)

tools = [
    ServerTool(
        name="get_synonyms",
        description="Given a word, return the list of synonyms according to the vocabulary",
        inputs=[StringProperty(title="word")],
        outputs=[ListProperty(title="synonyms", item_type=StringProperty(title="word"))]
    ),
    ServerTool(
        name="pretty_formatting",
        description="Given a paragraph, format the paragraph to fix spaces, newlines, indentation, etc.",
        inputs=[StringProperty(title="paragraph")],
        outputs=[StringProperty(title="formatted_paragraph")]
    ),
]

agent = Agent(
    name="Helpful agent with username and tools",
    llm_config=llm_config,
    system_prompt="""Your a helpful writing assistant. Answer the user's questions about article writing.
Make sure to welcome the user first, their name is {{user_name}}, but keep it short""",
    tools=tools,
    inputs=[StringProperty(title="user_name")]
)

serialized_assistant = AgentSpecSerializer().to_json(agent)

Next steps#

Having learned how to configure agent instructions, you may now proceed to: