
pydantic_ai.mcp

MCPServer

Bases: ABC

Base class for attaching agents to MCP servers.

See https://modelcontextprotocol.io for more information.

Source code in pydantic_ai_slim/pydantic_ai/mcp.py
class MCPServer(ABC):
    """Base class for attaching agents to MCP servers.

    See <https://modelcontextprotocol.io> for more information.
    """

    is_running: bool = False

    _client: ClientSession
    _read_stream: MemoryObjectReceiveStream[JSONRPCMessage | Exception]
    _write_stream: MemoryObjectSendStream[JSONRPCMessage]
    _exit_stack: AsyncExitStack

    @abstractmethod
    @asynccontextmanager
    async def client_streams(
        self,
    ) -> AsyncIterator[
        tuple[MemoryObjectReceiveStream[JSONRPCMessage | Exception], MemoryObjectSendStream[JSONRPCMessage]]
    ]:
        """Create the streams for the MCP server."""
        raise NotImplementedError('MCP Server subclasses must implement this method.')
        yield

    async def list_tools(self) -> list[ToolDefinition]:
        """Retrieve tools that are currently active on the server.

        Note:
        - We don't cache tools as they might change.
        - We also don't subscribe to the server to avoid complexity.
        """
        tools = await self._client.list_tools()
        return [
            ToolDefinition(
                name=tool.name,
                description=tool.description or '',
                parameters_json_schema=tool.inputSchema,
            )
            for tool in tools.tools
        ]

    async def call_tool(self, tool_name: str, arguments: dict[str, Any]) -> CallToolResult:
        """Call a tool on the server.

        Args:
            tool_name: The name of the tool to call.
            arguments: The arguments to pass to the tool.

        Returns:
            The result of the tool call.
        """
        return await self._client.call_tool(tool_name, arguments)

    async def __aenter__(self) -> Self:
        self._exit_stack = AsyncExitStack()

        self._read_stream, self._write_stream = await self._exit_stack.enter_async_context(self.client_streams())
        client = ClientSession(read_stream=self._read_stream, write_stream=self._write_stream)
        self._client = await self._exit_stack.enter_async_context(client)

        await self._client.initialize()
        self.is_running = True
        return self

    async def __aexit__(
        self, exc_type: type[BaseException] | None, exc_value: BaseException | None, traceback: TracebackType | None
    ) -> bool | None:
        await self._exit_stack.aclose()
        self.is_running = False
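
The `__aenter__`/`__aexit__` pair above is a standard `AsyncExitStack` pattern: each async context manager entered via `enter_async_context` is tied to the stack, and one `aclose()` unwinds them all in reverse order. A minimal stdlib-only sketch of that pattern, with a stand-in `fake_streams` context manager in place of the abstract `client_streams` (all names below are illustrative, not part of pydantic_ai):

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager


@asynccontextmanager
async def fake_streams():
    # Stand-in for client_streams(): yields a (read, write) pair.
    yield 'read-stream', 'write-stream'


class Connection:
    is_running: bool = False

    async def __aenter__(self):
        self._exit_stack = AsyncExitStack()
        # enter_async_context ties the streams' lifetime to the exit stack.
        self.read, self.write = await self._exit_stack.enter_async_context(fake_streams())
        self.is_running = True
        return self

    async def __aexit__(self, exc_type, exc_value, traceback):
        # A single aclose() unwinds every context entered on the stack.
        await self._exit_stack.aclose()
        self.is_running = False


async def main():
    async with Connection() as conn:
        print(conn.is_running, conn.read)
    print(conn.is_running)


asyncio.run(main())
```

Collecting resources on an `AsyncExitStack` rather than nesting `async with` blocks is what lets `__aenter__` acquire a variable number of contexts and release them all from `__aexit__`.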

client_streams abstractmethod async

client_streams() -> AsyncIterator[
    tuple[
        MemoryObjectReceiveStream[
            JSONRPCMessage | Exception
        ],
        MemoryObjectSendStream[JSONRPCMessage],
    ]
]

Create the streams for the MCP server.

Source code in pydantic_ai_slim/pydantic_ai/mcp.py
@abstractmethod
@asynccontextmanager
async def client_streams(
    self,
) -> AsyncIterator[
    tuple[MemoryObjectReceiveStream[JSONRPCMessage | Exception], MemoryObjectSendStream[JSONRPCMessage]]
]:
    """Create the streams for the MCP server."""
    raise NotImplementedError('MCP Server subclasses must implement this method.')
    yield

list_tools async

list_tools() -> list[ToolDefinition]

Retrieve tools that are currently active on the server.

Note:
- We don't cache tools, as they might change.
- We also don't subscribe to the server, to avoid complexity.

Source code in pydantic_ai_slim/pydantic_ai/mcp.py
async def list_tools(self) -> list[ToolDefinition]:
    """Retrieve tools that are currently active on the server.

    Note:
    - We don't cache tools as they might change.
    - We also don't subscribe to the server to avoid complexity.
    """
    tools = await self._client.list_tools()
    return [
        ToolDefinition(
            name=tool.name,
            description=tool.description or '',
            parameters_json_schema=tool.inputSchema,
        )
        for tool in tools.tools
    ]
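
The comprehension above maps each MCP `Tool` onto a `ToolDefinition`, normalizing a missing description to an empty string. The same mapping can be sketched with plain dicts standing in for the SDK types (the tool entries below are made up for illustration):

```python
# Plain-dict sketch of the list_tools mapping; entries are illustrative,
# not real tools from any server.
mcp_tools = [
    {'name': 'run_python', 'description': 'Execute Python code.', 'inputSchema': {'type': 'object'}},
    {'name': 'read_file', 'description': None, 'inputSchema': {'type': 'object'}},
]

tool_defs = [
    {
        'name': tool['name'],
        'description': tool['description'] or '',  # None is normalized to ''
        'parameters_json_schema': tool['inputSchema'],
    }
    for tool in mcp_tools
]

print([t['name'] for t in tool_defs])
print(repr(tool_defs[1]['description']))
```

Since tools are not cached, this mapping runs on every `list_tools` call, so a server can add or remove tools between agent runs.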

call_tool async

call_tool(
    tool_name: str, arguments: dict[str, Any]
) -> CallToolResult

Call a tool on the server.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| tool_name | str | The name of the tool to call. | required |
| arguments | dict[str, Any] | The arguments to pass to the tool. | required |

Returns:

| Type | Description |
|------|-------------|
| CallToolResult | The result of the tool call. |

Source code in pydantic_ai_slim/pydantic_ai/mcp.py
async def call_tool(self, tool_name: str, arguments: dict[str, Any]) -> CallToolResult:
    """Call a tool on the server.

    Args:
        tool_name: The name of the tool to call.
        arguments: The arguments to pass to the tool.

    Returns:
        The result of the tool call.
    """
    return await self._client.call_tool(tool_name, arguments)
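
`call_tool` is a one-line delegation to the underlying `ClientSession`. The forwarding shape can be exercised with a fake session standing in for the MCP SDK client (both `FakeSession` and its echo-style result are illustrative, not real SDK behavior):

```python
import asyncio


class FakeSession:
    # Stand-in for ClientSession.call_tool; echoes its inputs back in a
    # dict shaped loosely like a CallToolResult.
    async def call_tool(self, tool_name, arguments):
        return {'tool': tool_name, 'arguments': arguments, 'isError': False}


class Server:
    def __init__(self):
        self._client = FakeSession()

    async def call_tool(self, tool_name, arguments):
        # Same one-line delegation as in the source above.
        return await self._client.call_tool(tool_name, arguments)


async def main():
    result = await Server().call_tool('run_python', {'code': 'print(1)'})
    print(result['tool'], result['isError'])


asyncio.run(main())
```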

MCPServerStdio dataclass

Bases: MCPServer

Runs an MCP server in a subprocess and communicates with it over stdin/stdout.

This class implements the stdio transport from the MCP specification. See https://spec.modelcontextprotocol.io/specification/2024-11-05/basic/transports/#stdio for more information.

Note

Using this class as an async context manager will start the server as a subprocess when entering the context, and stop it when exiting the context.

Example

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

server = MCPServerStdio('npx', ['-y', '@pydantic/mcp-run-python', 'stdio'])  # (1)!
agent = Agent('openai:gpt-4o', mcp_servers=[server])

async def main():
    async with agent.run_mcp_servers():  # (2)!
        ...

  1. See MCP Run Python for more information.
  2. This will start the server as a subprocess and connect to it.
Source code in pydantic_ai_slim/pydantic_ai/mcp.py
@dataclass
class MCPServerStdio(MCPServer):
    """Runs an MCP server in a subprocess and communicates with it over stdin/stdout.

    This class implements the stdio transport from the MCP specification.
    See <https://spec.modelcontextprotocol.io/specification/2024-11-05/basic/transports/#stdio> for more information.

    !!! note
        Using this class as an async context manager will start the server as a subprocess when entering the context,
        and stop it when exiting the context.

    Example:
    ```python {py="3.10"}
    from pydantic_ai import Agent
    from pydantic_ai.mcp import MCPServerStdio

    server = MCPServerStdio('npx', ['-y', '@pydantic/mcp-run-python', 'stdio'])  # (1)!
    agent = Agent('openai:gpt-4o', mcp_servers=[server])

    async def main():
        async with agent.run_mcp_servers():  # (2)!
            ...
    ```

    1. See [MCP Run Python](../mcp/run-python.md) for more information.
    2. This will start the server as a subprocess and connect to it.
    """

    command: str
    """The command to run."""

    args: Sequence[str]
    """The arguments to pass to the command."""

    env: dict[str, str] | None = None
    """The environment variables the CLI server will have access to.

    By default the subprocess will not inherit any environment variables from the parent process.
    If you want to inherit the environment variables from the parent process, use `env=os.environ`.
    """

    @asynccontextmanager
    async def client_streams(
        self,
    ) -> AsyncIterator[
        tuple[MemoryObjectReceiveStream[JSONRPCMessage | Exception], MemoryObjectSendStream[JSONRPCMessage]]
    ]:
        server = StdioServerParameters(command=self.command, args=list(self.args), env=self.env)
        async with stdio_client(server=server) as (read_stream, write_stream):
            yield read_stream, write_stream

command instance-attribute

command: str

The command to run.

args instance-attribute

args: Sequence[str]

The arguments to pass to the command.

env class-attribute instance-attribute

env: dict[str, str] | None = None

The environment variables the CLI server will have access to.

By default the subprocess will not inherit any environment variables from the parent process. If you want to inherit the environment variables from the parent process, use env=os.environ.
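
Because `env=None` gives the subprocess a clean environment, selectively passing variables means building the dict yourself. A sketch of the two common choices (`LOG_LEVEL` here is an illustrative variable, not one any server requires):

```python
import os

# Inherit everything from the parent and add/override one variable:
inherit_all = {**os.environ, 'LOG_LEVEL': 'debug'}

# Or pass an allow-list of just the variables the subprocess needs:
allow_list = {k: v for k, v in os.environ.items() if k in ('PATH', 'HOME')}

print('LOG_LEVEL' in inherit_all)
print(set(allow_list) <= {'PATH', 'HOME'})
```

Either dict can then be passed as the `env` argument to `MCPServerStdio`; the allow-list form avoids leaking unrelated secrets from the parent process into the subprocess.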

MCPServerHTTP dataclass

Bases: MCPServer

An MCP server that connects over streamable HTTP connections.

This class implements the SSE transport from the MCP specification. See https://spec.modelcontextprotocol.io/specification/2024-11-05/basic/transports/#http-with-sse for more information.

The name "HTTP" is used because this implementation will be adapted in the future to use the new Streamable HTTP transport, which is currently in development.

Note

Using this class as an async context manager will create a new pool of HTTP connections to connect to a server which should already be running.

Example

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerHTTP

server = MCPServerHTTP('https://127.0.0.1:3001/sse')  # (1)!
agent = Agent('openai:gpt-4o', mcp_servers=[server])

async def main():
    async with agent.run_mcp_servers():  # (2)!
        ...

  1. E.g. you might be connecting to a server run with npx @pydantic/mcp-run-python sse; see MCP Run Python for more information.
  2. This will connect to a server running on localhost:3001.
Source code in pydantic_ai_slim/pydantic_ai/mcp.py
@dataclass
class MCPServerHTTP(MCPServer):
    """An MCP server that connects over streamable HTTP connections.

    This class implements the SSE transport from the MCP specification.
    See <https://spec.modelcontextprotocol.io/specification/2024-11-05/basic/transports/#http-with-sse> for more information.

    The name "HTTP" is used since this implementation will be adapted in future to use the new
    [Streamable HTTP](https://github.com/modelcontextprotocol/specification/pull/206) currently in development.

    !!! note
        Using this class as an async context manager will create a new pool of HTTP connections to connect
        to a server which should already be running.

    Example:
    ```python {py="3.10"}
    from pydantic_ai import Agent
    from pydantic_ai.mcp import MCPServerHTTP

    server = MCPServerHTTP('https://127.0.0.1:3001/sse')  # (1)!
    agent = Agent('openai:gpt-4o', mcp_servers=[server])

    async def main():
        async with agent.run_mcp_servers():  # (2)!
            ...
    ```

    1. E.g. you might be connecting to a server run with `npx @pydantic/mcp-run-python sse`,
      see [MCP Run Python](../mcp/run-python.md) for more information.
    2. This will connect to a server running on `localhost:3001`.
    """

    url: str
    """The URL of the SSE endpoint on the MCP server.

    For example for a server running locally, this might be `https://127.0.0.1:3001/sse`.
    """

    @asynccontextmanager
    async def client_streams(
        self,
    ) -> AsyncIterator[
        tuple[MemoryObjectReceiveStream[JSONRPCMessage | Exception], MemoryObjectSendStream[JSONRPCMessage]]
    ]:  # pragma: no cover
        async with sse_client(url=self.url) as (read_stream, write_stream):
            yield read_stream, write_stream

url instance-attribute

url: str

The URL of the SSE endpoint on the MCP server.

For example, for a server running locally, this might be https://127.0.0.1:3001/sse.