Server
Pydantic AI models can also be used within MCP servers.
MCP Server
Here's a simple example of a Python MCP server that uses Pydantic AI within a tool call:
mcp_server.py
from mcp.server.fastmcp import FastMCP

from pydantic_ai import Agent

server = FastMCP('Pydantic AI Server')
server_agent = Agent(
    'anthropic:claude-3-5-haiku-latest', system_prompt='always reply in rhyme'
)


@server.tool()
async def poet(theme: str) -> str:
    """Poem generator"""
    r = await server_agent.run(f'write a poem about {theme}')
    return r.output


if __name__ == '__main__':
    server.run()
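Note that server.run() defaults to the stdio transport, which is what the client in the next section expects. Recent versions of the mcp SDK also let you pick the transport explicitly; for example (a sketch reusing the server object from above, with the argument name as in recent SDK releases):

server.run(transport='streamable-http')  # 'stdio' is the default; 'sse' is also supported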
Simple client
This server can be queried with any MCP client. Here's an example using the Python SDK directly:
mcp_client.py
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def client():
    server_params = StdioServerParameters(
        command='python', args=['mcp_server.py'], env=os.environ
    )
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool('poet', {'theme': 'socks'})
            print(result.content[0].text)
            """
            Oh, socks, those garments soft and sweet,
            That nestle softly 'round our feet,
            From cotton, wool, or blended thread,
            They keep our toes from feeling dread.
            """


if __name__ == '__main__':
    asyncio.run(client())
MCP Sampling
What is MCP Sampling?
See the MCP client documentation for details of what MCP sampling is, and how you can support it when using Pydantic AI as an MCP client.
When Pydantic AI agents are used within MCP servers, they can use sampling via MCPSamplingModel.
We can extend the example above to use sampling so that the agent doesn't connect to the LLM directly, but instead calls back through the MCP client to make LLM calls.
mcp_server_sampling.py
from mcp.server.fastmcp import Context, FastMCP

from pydantic_ai import Agent
from pydantic_ai.models.mcp_sampling import MCPSamplingModel

server = FastMCP('Pydantic AI Server with sampling')
server_agent = Agent(system_prompt='always reply in rhyme')


@server.tool()
async def poet(ctx: Context, theme: str) -> str:
    """Poem generator"""
    r = await server_agent.run(
        f'write a poem about {theme}',
        model=MCPSamplingModel(session=ctx.session),
    )
    return r.output


if __name__ == '__main__':
    server.run()  # run the server over stdio
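Under the hood, MCPSamplingModel forwards each model request the agent makes to the connected client as a sampling/createMessage MCP request on the current session, so it's the client, not the server, that actually talks to an LLM.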
The client above doesn't support sampling, so you would get an error if you tried to use it with this server.
The easiest way to support sampling in an MCP client is to use a Pydantic AI agent as the client, as sketched next.
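Here is a minimal sketch of that approach, assuming a recent pydantic-ai release where MCP servers are passed to the agent as toolsets; the file name, model name, and prompt are illustrative placeholders, not part of the original example:

pydantic_ai_client_sampling.py
import asyncio

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

# Launch the sampling server defined above over stdio.
server = MCPServerStdio('python', args=['mcp_server_sampling.py'])
agent = Agent('openai:gpt-4o', toolsets=[server])


async def main():
    async with agent:
        # Serve the server's sampling requests with the agent's own model.
        agent.set_mcp_sampling_model()
        result = await agent.run('Use the poet tool to write a poem about socks.')
    print(result.output)


if __name__ == '__main__':
    asyncio.run(main())

If you want to support sampling with the vanilla MCP SDK instead, you can do so like this: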
mcp_client_sampling.py
import asyncio
from typing import Any

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from mcp.shared.context import RequestContext
from mcp.types import (
    CreateMessageRequestParams,
    CreateMessageResult,
    ErrorData,
    TextContent,
)


async def sampling_callback(
    context: RequestContext[ClientSession, Any], params: CreateMessageRequestParams
) -> CreateMessageResult | ErrorData:
    print('sampling system prompt:', params.systemPrompt)
    #> sampling system prompt: always reply in rhyme
    print('sampling messages:', params.messages)
    """
    sampling messages:
    [
        SamplingMessage(
            role='user',
            content=TextContent(
                type='text',
                text='write a poem about socks',
                annotations=None,
                meta=None,
            ),
        )
    ]
    """

    # TODO get the response content by calling an LLM...
    response_content = 'Socks for a fox.'

    return CreateMessageResult(
        role='assistant',
        content=TextContent(type='text', text=response_content),
        model='fictional-llm',
    )


async def client():
    server_params = StdioServerParameters(
        command='python', args=['mcp_server_sampling.py']
    )
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(
            read, write, sampling_callback=sampling_callback
        ) as session:
            await session.initialize()
            result = await session.call_tool('poet', {'theme': 'socks'})
            print(result.content[0].text)
            #> Socks for a fox.


if __name__ == '__main__':
    asyncio.run(client())
(This example is complete, it can be run "as is".)
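If you want the sampling callback to call a real model instead of returning a canned line, one possible shape for the TODO above is the following sketch. It uses the OpenAI SDK; the client object, helper name, and model name are illustrative assumptions, and only text content is handled:

from mcp.types import CreateMessageRequestParams, TextContent
from openai import AsyncOpenAI

openai_client = AsyncOpenAI()


async def call_llm(params: CreateMessageRequestParams) -> str:
    # Map the MCP sampling request onto a chat completion request.
    messages: list[dict[str, str]] = []
    if params.systemPrompt:
        messages.append({'role': 'system', 'content': params.systemPrompt})
    for m in params.messages:
        if isinstance(m.content, TextContent):  # skip image/audio content
            messages.append({'role': m.role, 'content': m.content.text})
    completion = await openai_client.chat.completions.create(
        model='gpt-4o-mini', messages=messages
    )
    return completion.choices[0].message.content or ''

The returned text would then replace the hard-coded response_content in sampling_callback.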