
API Reference

Typed server specifications and conversion helpers for FastMCP configuration.

ServerSpec = Union[StdioServerSpec, HTTPServerSpec] module-attribute

Union of supported server specifications.

HTTPServerSpec

Bases: _BaseServer

Specification for a remote MCP server reachable via HTTP/SSE.

Attributes:

    url (str): Full endpoint URL for the MCP server (e.g., http://127.0.0.1:8000/mcp).
    transport (Literal['http', 'streamable-http', 'sse']): The transport mechanism ("http", "streamable-http", or "sse").
    headers (Dict[str, str]): Optional request headers (e.g., Authorization tokens).
    auth (str | None): Optional auth hint if your FastMCP deployment consumes it.

Source code in src/deepmcpagent/config.py
class HTTPServerSpec(_BaseServer):
    """Specification for a remote MCP server reachable via HTTP/SSE.

    Attributes:
        url: Full endpoint URL for the MCP server (e.g., http://127.0.0.1:8000/mcp).
        transport: The transport mechanism ("http", "streamable-http", or "sse").
        headers: Optional request headers (e.g., Authorization tokens).
        auth: Optional auth hint if your FastMCP deployment consumes it.
    """

    url: str
    transport: Literal["http", "streamable-http", "sse"] = "http"
    headers: Dict[str, str] = Field(default_factory=dict)
    auth: str | None = None
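
The snippet below is a minimal usage sketch, not part of the generated reference: it builds an HTTPServerSpec for a hypothetical remote server. The import path follows the source location shown above (src/deepmcpagent/config.py); the URL and header values are placeholders.

from deepmcpagent.config import HTTPServerSpec

# Hypothetical remote MCP server reachable over HTTP.
# The endpoint and bearer token are placeholders, not real values.
math_server = HTTPServerSpec(
    url="http://127.0.0.1:8000/mcp",
    transport="http",
    headers={"Authorization": "Bearer <token>"},
)

servers = {"math": math_server}  # name -> spec mapping consumed by the helpers below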

StdioServerSpec

Bases: _BaseServer

Specification for a local MCP server launched via stdio.

NOTE

The FastMCP Python client typically expects HTTP/SSE transports. Using StdioServerSpec requires a different adapter or an HTTP shim in front of the stdio server. Keep this for future expansion or custom runners.

Attributes:

    command (str): Executable to launch (e.g., "python").
    args (List[str]): Positional arguments for the process.
    env (Dict[str, str]): Environment variables to set for the process.
    cwd (str | None): Optional working directory.
    keep_alive (bool): Whether the client should try to keep a persistent session.

Source code in src/deepmcpagent/config.py
class StdioServerSpec(_BaseServer):
    """Specification for a local MCP server launched via stdio.

    NOTE:
        The FastMCP Python client typically expects HTTP/SSE transports. Using
        `StdioServerSpec` requires a different adapter or an HTTP shim in front
        of the stdio server. Keep this for future expansion or custom runners.

    Attributes:
        command: Executable to launch (e.g., "python").
        args: Positional arguments for the process.
        env: Environment variables to set for the process.
        cwd: Optional working directory.
        keep_alive: Whether the client should try to keep a persistent session.
    """

    command: str
    args: List[str] = Field(default_factory=list)
    env: Dict[str, str] = Field(default_factory=dict)
    cwd: str | None = None
    keep_alive: bool = True
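
As a sketch only (the NOTE above explains that the FastMCP client path expects HTTP/SSE, so this spec is intended for future or custom runners), a StdioServerSpec could be constructed like this; the command, script name, and environment values are placeholders:

from deepmcpagent.config import StdioServerSpec

# Hypothetical local server launched over stdio; "math_server.py" is a placeholder.
local_server = StdioServerSpec(
    command="python",
    args=["math_server.py"],
    env={"LOG_LEVEL": "debug"},
    keep_alive=True,
)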

servers_to_mcp_config(servers)

Convert programmatic server specs to the FastMCP configuration dict.

Parameters:

    servers (Mapping[str, ServerSpec], required): Mapping of server name to specification.

Returns:

    Dict[str, Dict[str, object]]: Dict suitable for initializing fastmcp.Client({"mcpServers": ...}).

Source code in src/deepmcpagent/config.py
def servers_to_mcp_config(servers: Mapping[str, ServerSpec]) -> Dict[str, Dict[str, object]]:
    """Convert programmatic server specs to the FastMCP configuration dict.

    Args:
        servers: Mapping of server name to specification.

    Returns:
        Dict suitable for initializing `fastmcp.Client({"mcpServers": ...})`.
    """
    cfg: Dict[str, Dict[str, object]] = {}
    for name, s in servers.items():
        if isinstance(s, StdioServerSpec):
            cfg[name] = {
                "transport": "stdio",
                "command": s.command,
                "args": s.args,
                "env": s.env or None,
                "cwd": s.cwd or None,
                "keep_alive": s.keep_alive,
            }
        else:
            entry: Dict[str, object] = {
                "transport": s.transport,
                "url": s.url,
            }
            if s.headers:
                entry["headers"] = s.headers
            if s.auth is not None:
                entry["auth"] = s.auth
            cfg[name] = entry
    return cfg
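
A small illustrative call, based on the source above: an HTTP spec with the default transport and no headers or auth collapses to a two-key entry. The server name "math" and the URL are placeholders.

from deepmcpagent.config import HTTPServerSpec, servers_to_mcp_config

cfg = servers_to_mcp_config({"math": HTTPServerSpec(url="http://127.0.0.1:8000/mcp")})
# cfg == {"math": {"transport": "http", "url": "http://127.0.0.1:8000/mcp"}}
# Wrap it as {"mcpServers": cfg} when handing it to fastmcp.Client.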

FastMCP client wrapper that supports multiple servers via a single configuration.

FastMCPMulti

Create a single FastMCP client wired to multiple servers.

The client is configured using the mcpServers dictionary generated from the typed server specifications.

Parameters:

    servers (Mapping[str, ServerSpec], required): Mapping of server name to server spec.

Source code in src/deepmcpagent/clients.py
class FastMCPMulti:
    """Create a single FastMCP client wired to multiple servers.

    The client is configured using the `mcpServers` dictionary generated from
    the typed server specifications.

    Args:
        servers: Mapping of server name to server spec.
    """

    def __init__(self, servers: Mapping[str, ServerSpec]) -> None:
        mcp_cfg = {"mcpServers": servers_to_mcp_config(servers)}
        self._client = FastMCPClient(mcp_cfg)

    @property
    def client(self) -> FastMCPClient:
        """Return the underlying FastMCP client instance."""
        return self._client

client property

Return the underlying FastMCP client instance.
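
For illustration, a minimal sketch of wiring FastMCPMulti to one server and grabbing the underlying client; the import paths follow the source locations shown in this reference, and the URL is a placeholder.

from deepmcpagent.clients import FastMCPMulti
from deepmcpagent.config import HTTPServerSpec

multi = FastMCPMulti({"math": HTTPServerSpec(url="http://127.0.0.1:8000/mcp")})
client = multi.client  # the underlying fastmcp.Client; used as an async context manager elsewhere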


MCP tool discovery and conversion to LangChain tools.

MCPToolLoader

Discover MCP tools via FastMCP and convert them to LangChain tools.

Source code in src/deepmcpagent/tools.py
class MCPToolLoader:
    """Discover MCP tools via FastMCP and convert them to LangChain tools."""

    def __init__(self, multi: FastMCPMulti) -> None:
        self._multi = multi

    async def get_all_tools(self) -> List[BaseTool]:
        """Return all available tools as LangChain `BaseTool` instances."""
        c = self._multi.client
        async with c:
            tools = await c.list_tools()
            out: List[BaseTool] = []
            for t in tools:
                name = t.name
                desc = getattr(t, "description", "") or ""
                schema = getattr(t, "inputSchema", None) or {}
                model = _jsonschema_to_pydantic(schema)
                out.append(
                    _FastMCPTool(
                        name=name,
                        description=desc,
                        args_schema=model,
                        tool_name=name,
                        client=c,
                    )
                )
            return out

    async def list_tool_info(self) -> List[ToolInfo]:
        """Return human-readable tool metadata for introspection or debugging."""
        c = self._multi.client
        async with c:
            tools = await c.list_tools()
            return [
                ToolInfo(
                    server_guess="",
                    name=t.name,
                    description=getattr(t, "description", "") or "",
                    input_schema=getattr(t, "inputSchema", None) or {},
                )
                for t in tools
            ]

get_all_tools() async

Return all available tools as LangChain BaseTool instances.

Source code in src/deepmcpagent/tools.py
async def get_all_tools(self) -> List[BaseTool]:
    """Return all available tools as LangChain `BaseTool` instances."""
    c = self._multi.client
    async with c:
        tools = await c.list_tools()
        out: List[BaseTool] = []
        for t in tools:
            name = t.name
            desc = getattr(t, "description", "") or ""
            schema = getattr(t, "inputSchema", None) or {}
            model = _jsonschema_to_pydantic(schema)
            out.append(
                _FastMCPTool(
                    name=name,
                    description=desc,
                    args_schema=model,
                    tool_name=name,
                    client=c,
                )
            )
        return out

list_tool_info() async

Return human-readable tool metadata for introspection or debugging.

Source code in src/deepmcpagent/tools.py
async def list_tool_info(self) -> List[ToolInfo]:
    """Return human-readable tool metadata for introspection or debugging."""
    c = self._multi.client
    async with c:
        tools = await c.list_tools()
        return [
            ToolInfo(
                server_guess="",
                name=t.name,
                description=getattr(t, "description", "") or "",
                input_schema=getattr(t, "inputSchema", None) or {},
            )
            for t in tools
        ]
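
An end-to-end sketch of tool discovery, assuming the module paths shown in this reference and a placeholder server URL:

import asyncio

from deepmcpagent.clients import FastMCPMulti
from deepmcpagent.config import HTTPServerSpec
from deepmcpagent.tools import MCPToolLoader

async def main() -> None:
    multi = FastMCPMulti({"math": HTTPServerSpec(url="http://127.0.0.1:8000/mcp")})
    loader = MCPToolLoader(multi)

    # Human-readable metadata for debugging/introspection.
    for info in await loader.list_tool_info():
        print(f"{info.name}: {info.description}")

    # LangChain BaseTool instances ready to hand to an agent.
    tools = await loader.get_all_tools()
    print(f"discovered {len(tools)} tool(s)")

asyncio.run(main())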

ToolInfo dataclass

Human-friendly metadata for a discovered MCP tool.

Source code in src/deepmcpagent/tools.py
@dataclass(frozen=True)
class ToolInfo:
    """Human-friendly metadata for a discovered MCP tool."""
    server_guess: str
    name: str
    description: str
    input_schema: Dict[str, Any]

System prompt definition for deepmcpagent.

Edit this file to change the default system behavior of the agent without modifying code in the builder.


Agent builders that use the FastMCP client and MCP-only tools.

build_deep_agent(*, servers, model, instructions=None) async

Build an MCP-only agent graph.

This function discovers tools from the configured MCP servers, converts them into LangChain tools, and then builds an agent. If the optional deepagents package is installed, a Deep Agent loop is created. Otherwise, a LangGraph ReAct agent is used.

Parameters:

    servers (Mapping[str, ServerSpec], required): Mapping of server name to spec (HTTP/SSE recommended for FastMCP).
    model (ModelLike, required): Either a LangChain chat model instance, a provider id string accepted by init_chat_model, or a Runnable.
    instructions (Optional[str], default None): Optional system prompt. If not provided, uses DEFAULT_SYSTEM_PROMPT.

Returns:

    Tuple[Runnable, MCPToolLoader]: Tuple of (graph, loader) where:
        - graph is a LangGraph or DeepAgents runnable with .ainvoke.
        - loader can be used to introspect tools.

Source code in src/deepmcpagent/agent.py
async def build_deep_agent(
    *,
    servers: Mapping[str, ServerSpec],
    model: ModelLike,
    instructions: Optional[str] = None,
) -> Tuple[Runnable, MCPToolLoader]:
    """Build an MCP-only agent graph.

    This function discovers tools from the configured MCP servers, converts them into
    LangChain tools, and then builds an agent. If the optional `deepagents` package is
    installed, a Deep Agent loop is created. Otherwise, a LangGraph ReAct agent is used.

    Args:
        servers: Mapping of server name to spec (HTTP/SSE recommended for FastMCP).
        model: REQUIRED. Either a LangChain chat model instance, a provider id string
            accepted by `init_chat_model`, or a Runnable.
        instructions: Optional system prompt. If not provided, uses DEFAULT_SYSTEM_PROMPT.

    Returns:
        Tuple of `(graph, loader)` where:
            - `graph` is a LangGraph or DeepAgents runnable with `.ainvoke`.
            - `loader` can be used to introspect tools.
    """
    if model is None:  # Defensive check; CLI/code must always pass a model now.
        raise ValueError("A model is required. Provide a model instance or a provider id string.")

    multi = FastMCPMulti(servers)
    loader = MCPToolLoader(multi)
    tools: List[BaseTool] = await loader.get_all_tools()
    chat = _normalize_model(model)
    sys_prompt = instructions or DEFAULT_SYSTEM_PROMPT

    try:
        # Optional deep agent loop if the extra is installed.
        from deepagents import create_deep_agent  # type: ignore

        graph = create_deep_agent(tools=tools, instructions=sys_prompt, model=chat)
    except ImportError:
        # Solid fallback with LangGraph's ReAct agent.
        graph = create_react_agent(model=chat, tools=tools, state_modifier=sys_prompt)

    return graph, loader
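
Finally, a usage sketch for build_deep_agent; the provider id string, server URL, and message payload shape are illustrative assumptions, not values prescribed by this reference.

import asyncio

from deepmcpagent.agent import build_deep_agent
from deepmcpagent.config import HTTPServerSpec

async def main() -> None:
    servers = {"math": HTTPServerSpec(url="http://127.0.0.1:8000/mcp")}

    # Placeholder provider id accepted by init_chat_model; any LangChain chat model instance also works.
    graph, loader = await build_deep_agent(servers=servers, model="openai:gpt-4o-mini")

    result = await graph.ainvoke({"messages": [{"role": "user", "content": "What is 2 + 2?"}]})
    print(result)

asyncio.run(main())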