Custom MCP Tools
Details
The AI Optimizer exposes an MCP server built on FastMCP. All registered tools are available over the MCP protocol at the /mcp endpoint and through the REST API at /mcp/tools.
Developers can add custom tools by dropping a Python file into the tools package; no other files need to be edited.
Quick Example
Create a new file in src/server/app/mcp/tools/:
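For instance, a minimal `add.py` exposing the `optimizer_add` tool used later in this guide (the exact decorator arguments shown are illustrative):

```python
"""src/server/app/mcp/tools/add.py: a minimal custom tool."""
from server.app.core.mcp import mcp  # shared FastMCP instance


def _add_impl(a: float, b: float) -> float:
    """Business logic, kept separate from registration boilerplate."""
    return a + b


def register_add() -> None:
    """Auto-discovered at startup because the name starts with register_."""

    @mcp.tool(
        name="optimizer_add",
        title="Add Two Numbers",
        tags={"math", "optimizer"},
    )
    def optimizer_add(a: float, b: float) -> float:
        """Add two numbers and return their sum."""
        return _add_impl(a, b)
```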
Restart the server and the tool is immediately available to any MCP client. The registry automatically discovers every register_* function in the src/server/app/mcp/tools/ package at startup.
How Auto-Discovery Works
During startup, register_mcp_tools() in src/server/app/mcp/tools/registry.py scans the package for Python modules, imports each one, and calls every function whose name starts with register_. Utility modules (schemas.py, registry.py, __init__.py) are skipped automatically.
This means adding a tool is a single-file operation:
- Create a new `.py` file in `src/server/app/mcp/tools/`.
- Define a function named `register_<something>`.
- Inside that function, decorate your tool with `@mcp.tool()`.
- Restart the server.
Step-by-Step Guide
1. Create the Tool File
Add a new Python file under src/server/app/mcp/tools/. Each file should contain:
- A registration function (named `register_*`) that decorates the tool with `@mcp.tool()`.
- An optional private implementation function (`_impl`) to keep business logic separate from the registration boilerplate.
Import the shared mcp instance from server.app.core.mcp:
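For example:

```python
from server.app.core.mcp import mcp
```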
2. Decorate with @mcp.tool()
The @mcp.tool() decorator accepts the following parameters:
| Parameter | Description |
|---|---|
| `name` | Unique tool identifier. Prefix with `optimizer_` by convention. |
| `title` | Human-readable display name. |
| `tags` | Set of strings for categorization (e.g. `{"math", "optimizer"}`). |
| `annotations` | Optional hints: `readOnlyHint`, `idempotentHint`, `openWorldHint`. |
| `timeout` | Execution timeout in seconds (default varies by FastMCP). |
The decorated function’s docstring is sent to the LLM as the tool description, so make it clear and concise. Function parameters with type hints become the tool’s input schema automatically.
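A sketch combining these parameters. The tool itself is illustrative, and the exact `annotations` and `timeout` values accepted may vary by FastMCP version:

```python
from server.app.core.mcp import mcp


def register_multiply() -> None:
    @mcp.tool(
        name="optimizer_multiply",  # unique identifier, prefixed by convention
        title="Multiply Two Numbers",
        tags={"math", "optimizer"},
        annotations={"readOnlyHint": True, "idempotentHint": True},
        timeout=10,  # seconds
    )
    def optimizer_multiply(a: float, b: float) -> float:
        """Multiply two numbers and return the product."""  # sent to the LLM
        return a * b
```

The type hints on `a` and `b` become the tool's input schema; the docstring becomes its description.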
3. Define a Response Model (Optional)
For tools that return structured data, define a Pydantic BaseModel in src/server/app/mcp/tools/schemas.py:
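For instance, a result model for an addition tool might look like this (the class and field names are illustrative):

```python
# src/server/app/mcp/tools/schemas.py
from pydantic import BaseModel, Field


class AddResult(BaseModel):
    """Structured result for the optimizer_add tool."""

    a: float = Field(description="First operand")
    b: float = Field(description="Second operand")
    total: float = Field(description="Sum of a and b")
```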
Then use it as the return type:
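A sketch, assuming an `AddResult` model (an illustrative name) is defined in `schemas.py`:

```python
from server.app.core.mcp import mcp
from server.app.mcp.tools.schemas import AddResult  # illustrative model name


def register_add() -> None:
    @mcp.tool(name="optimizer_add", title="Add Two Numbers")
    def optimizer_add(a: float, b: float) -> AddResult:
        """Add two numbers and return a structured result."""
        return AddResult(a=a, b=b, total=a + b)
```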
4. Verify
After restarting the server, confirm the tool is registered:
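For example, via the REST endpoint (the host, port, and API key are placeholders, and the exact response shape may differ in your deployment):

```python
import requests

resp = requests.get(
    "http://localhost:8000/mcp/tools",        # REST listing of registered tools
    headers={"X-API-Key": "<your-api-key>"},  # placeholder key
)
print(resp.json())  # should list optimizer_add alongside the built-in tools
```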
The response will include your new tool alongside the built-in tools.
Using Custom Tools
Registering a tool makes it available on the MCP server, but something still needs to call it. There are two ways a custom tool can be used:
External MCP Clients
Any MCP-compatible client can connect to the AI Optimizer server and use registered tools directly. Configure the client to connect to the /mcp endpoint with an X-API-Key header.
Examples of MCP clients that can consume tools this way:
- Claude Desktop
- Claude Code
- VS Code Copilot
- Cursor
- Any client that supports the MCP specification
With this approach the tool is available immediately after registration; no additional server-side code is needed.
Internal Agent Use (AgentSpec)
The AI Optimizer uses AgentSpec to define agents and flows as portable configurations. There are two ways to bind MCP tools to an agent:
- **MCPToolBox**: connects to the MCP server and discovers all available tools at runtime. The built-in NL2SQL Agent uses this pattern. Any custom tool registered on the server is automatically available without code changes.
- **MCPTool**: references a single tool by name with explicit inputs and outputs, wired into a flow graph. The built-in VecSearch Flow uses this pattern. Adding a tool requires modifying the flow definition.
Example: Agent with MCPToolBox
The simplest way to use the optimizer_add tool in a custom agent is with an MCPToolBox, which auto-discovers all registered tools:
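A rough sketch of the idea. The import paths, class names, and fields below are hypothetical, not the exact AgentSpec API; consult the Agents and Flows documentation for the real schema:

```python
# Illustrative sketch only: import paths, class names, and fields are
# hypothetical, not the exact AgentSpec API.
from pyagentspec import Agent            # hypothetical import path
from pyagentspec.tools import MCPToolBox  # hypothetical import path

agent = Agent(
    name="math-agent",
    llm_config=...,  # your model configuration
    tools=[
        MCPToolBox(
            server_url="http://localhost:8000/mcp",   # the /mcp endpoint
            headers={"X-API-Key": "<your-api-key>"},  # placeholder key
        )
    ],
)
```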
When a user asks “What is 3 + 4?”, the LLM sees optimizer_add in its available tools, calls it with a=3, b=4, and returns the result.
See the Agents and Flows documentation for the full define → load → execute pattern, including how to wire an agent into the chat endpoint.
Tips
- **Async tools**: Use `async def` when your tool performs I/O (database queries, HTTP calls, etc.).
- **Context**: Add an optional `ctx: Context` parameter (from `fastmcp`) to emit progress messages back to the MCP client via `await ctx.info(...)`.
- **Naming**: Prefix tool names with `optimizer_` to avoid collisions with tools from other MCP servers.
- **Testing**: See `src/server/tests/mcp/tools/` for examples of how the built-in tools are tested.
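As a sketch, the async and context tips combined in one illustrative tool (assuming `httpx` is available for the HTTP call):

```python
import httpx
from fastmcp import Context

from server.app.core.mcp import mcp


def register_fetch_status() -> None:
    @mcp.tool(name="optimizer_fetch_status", title="Fetch HTTP Status")
    async def optimizer_fetch_status(url: str, ctx: Context) -> int:
        """Fetch a URL and return its HTTP status code."""
        await ctx.info(f"Fetching {url} ...")  # progress message to the MCP client
        async with httpx.AsyncClient() as client:
            resp = await client.get(url)
        await ctx.info(f"Got status {resp.status_code}.")
        return resp.status_code
```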