Engineering Notes - A First Look at the MCP Protocol for Agent Engineering


1. Minimal MCP Example

Let's run a minimal MCP implementation first, just to get a feel for what MCP actually is.

Overview

sequenceDiagram
    participant MCP Client
    participant MCP Server
    participant Resources
    MCP Client->>MCP Server: Initialize Connection
    MCP Server-->>MCP Client: Connection Established
    MCP Client->>MCP Server: List Resources Request
    MCP Server->>Resources: Get Available Resources
    Resources-->>MCP Server: Return Resource List
    MCP Server-->>MCP Client: Resource List Response
    MCP Client->>MCP Server: Read Resource Request (e.g. greeting.txt)
    MCP Server->>Resources: Fetch Resource Content
    Resources-->>MCP Server: Return Resource Content
    MCP Server-->>MCP Client: Resource Content Response

SHOW ME THE CODE

Create the files strictly according to the directory tree below (or throw this into Cursor and let it write them for you :P). Everything you need to write is marked below with a vim command.

(mcp-quick) warren@L-MBP simple_resource % tree
.
├── README.md
├── mcp_simple_resource
│   ├── __init__.py
│   ├── __main__.py
│   ├── client.py
│   └── server.py
├── pyproject.toml
└── uv.lock
touch __init__.py
touch __main__.py
touch README.md

vim __main__.py :

import sys

from .server import main

sys.exit(main())

vim server.py :

import anyio
import click
import mcp.types as types
from mcp.server.lowlevel import Server
from pydantic import AnyUrl

SAMPLE_RESOURCES = {
    "greeting": "Hello! This is a sample text resource.",
    "help": "This server provides a few sample text resources for testing.",
    "about": "This is the simple-resource MCP server implementation.",
}


@click.command()
@click.option("--port", default=8000, help="Port to listen on for SSE")
@click.option(
    "--transport",
    type=click.Choice(["stdio", "sse"]),
    default="stdio",
    help="Transport type",
)
def main(port: int, transport: str) -> int:
    app = Server("mcp-simple-resource")

    @app.list_resources()
    async def list_resources() -> list[types.Resource]:
        return [
            types.Resource(
                uri=AnyUrl(f"file:///{name}.txt"),
                name=name,
                description=f"A sample text resource named {name}",
                mimeType="text/plain",
            )
            for name in SAMPLE_RESOURCES.keys()
        ]

    @app.read_resource()
    async def read_resource(uri: AnyUrl) -> str | bytes:
        assert uri.path is not None
        name = uri.path.replace(".txt", "").lstrip("/")

        if name not in SAMPLE_RESOURCES:
            raise ValueError(f"Unknown resource: {uri}")

        return SAMPLE_RESOURCES[name]

    if transport == "sse":
        from mcp.server.sse import SseServerTransport
        from starlette.applications import Starlette
        from starlette.routing import Mount, Route

        sse = SseServerTransport("/messages/")

        async def handle_sse(request):
            async with sse.connect_sse(
                request.scope, request.receive, request._send
            ) as streams:
                await app.run(
                    streams[0], streams[1], app.create_initialization_options()
                )

        starlette_app = Starlette(
            debug=True,
            routes=[
                Route("/sse", endpoint=handle_sse),
                Mount("/messages/", app=sse.handle_post_message),
            ],
        )

        import uvicorn

        uvicorn.run(starlette_app, host="0.0.0.0", port=port)
    else:
        from mcp.server.stdio import stdio_server

        async def arun():
            async with stdio_server() as streams:
                await app.run(
                    streams[0], streams[1], app.create_initialization_options()
                )

        anyio.run(arun)

    return 0

vim client.py :

import asyncio
from mcp.types import AnyUrl
from mcp.client.session import ClientSession
from mcp.client.stdio import StdioServerParameters, stdio_client


async def main():
    async with stdio_client(
        StdioServerParameters(command="uv", args=["run", "mcp-simple-resource"])
    ) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List available resources
            resources = await session.list_resources()
            print(resources)

            # Get a specific resource
            resource = await session.read_resource(AnyUrl("file:///greeting.txt"))
            print(resource)


asyncio.run(main())

vim pyproject.toml :

[project]
name = "mcp-simple-resource"
version = "0.1.0"
description = "A simple MCP server exposing sample text resources"
readme = "README.md"
requires-python = ">=3.10"
authors = [{ name = "Anthropic, PBC." }]
maintainers = [
    { name = "David Soria Parra", email = "davidsp@anthropic.com" },
    { name = "Justin Spahr-Summers", email = "justin@anthropic.com" },
]
keywords = ["mcp", "llm", "automation", "web", "fetch"]
license = { text = "MIT" }
classifiers = [
    "Development Status :: 4 - Beta",
    "Intended Audience :: Developers",
    "License :: OSI Approved :: MIT License",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.10",
]
dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp"]

[project.scripts]
mcp-simple-resource = "mcp_simple_resource.server:main"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["mcp_simple_resource"]

[tool.pyright]
include = ["mcp_simple_resource"]
venvPath = "."
venv = ".venv"

[tool.ruff.lint]
select = ["E", "F", "I"]
ignore = []

[tool.ruff]
line-length = 88
target-version = "py310"

[tool.uv]
dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"]

Run it:

# Install uv
pip install uv

# Start the server
uv run mcp-simple-resource

# Start the client
(mcp-quick) warren@L-MBP mcp_simple_resource % python client.py 
meta=None nextCursor=None resources=[Resource(uri=AnyUrl('file:///greeting.txt'), name='greeting', description='A sample text resource named greeting', mimeType='text/plain'), Resource(uri=AnyUrl('file:///help.txt'), name='help', description='A sample text resource named help', mimeType='text/plain'), Resource(uri=AnyUrl('file:///about.txt'), name='about', description='A sample text resource named about', mimeType='text/plain')]
meta=None contents=[TextResourceContents(uri=AnyUrl('file:///greeting.txt'), mimeType='text/plain', text='Hello! This is a sample text resource.')]

At this point, the whole flow of building and using an MCP server and client is complete.

2. What is MCP?

Having worked through the code above, you should already have a rough picture of MCP. Here is the official architecture diagram:

Architecture

graph LR
    subgraph Your_Computer
        Client[Host with MCP Client - Claude, IDEs, Tools]
        ServerA[MCP Server A]
        ServerB[MCP Server B]
        ServerC[MCP Server C]
        DataA[Local Data Source A]
        DataB[Local Data Source B]
        Client -->|MCP Protocol| ServerA
        Client -->|MCP Protocol| ServerB
        Client -->|MCP Protocol| ServerC
        ServerA --> DataA
        ServerB --> DataB
    end
    RemoteC[Remote Service C]
    ServerC -->|Web APIs| RemoteC

The protocol has three main parts: the MCP Client, the MCP Server, and the Resource/Service.

  • The MCP Client is the part that talks to the LLM. If we are building an Agent, the Agent itself is the MCP Client.

  • The MCP Server is the part that connects to resources and services, and it is the bridge through which the MCP Client obtains them.

  • Resource/Service is straightforward: the various external resources, such as knowledge bases, databases, or other third-party services.

As you can see, the protocol defines a unified way for clients and servers to communicate. Its origin mirrors LSP (the Language Server Protocol, which standardizes how editors talk to third-party language services for code completion, type checking, error hints, and so on): just as LSP became the common protocol for IDEs, MCP aims to provide a single, unified way for models to access external resources and use tools. Under the hood, MCP communicates over JSON-RPC 2.0 and supports two transport mechanisms (see the message sketch after this list):

  • Standard input/output (stdio, for local inter-process communication)

  • Server-Sent Events (SSE over HTTP, for remote communication)
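For intuition, everything that flows over either transport is plain JSON-RPC 2.0. The sketch below, written as Python dict literals, shows roughly what a resources/list exchange looks like on the wire; the method name follows the MCP spec, but the id and the resource entry are made-up values, not captured from a real session:

# Illustrative only: the approximate shape of MCP's JSON-RPC 2.0 traffic.
# The method name "resources/list" follows the MCP spec; the id and the
# resource entry below are made-up values for demonstration.
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "resources/list",
    "params": {},
}

list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "resources": [
            {
                "uri": "file:///greeting.txt",
                "name": "greeting",
                "mimeType": "text/plain",
            }
        ]
    },
}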

Back to the minimal example:

  • server: defines two handlers, list_resources and read_resource, i.e. it exposes one interface for listing resources and one for reading a resource

  • client: after connecting to the server, it calls list_resources and read_resource to retrieve the resources

Following the interfaces defined by the MCP protocol, you register handlers through decorators in a fixed format, much like annotations in Java Spring Boot. The function names themselves are arbitrary; you could call them list_resources_a(), list_resources_b(), and so on. What matters is that the decorator, the async keyword, and the signature follow the expected format, as in the sketch below.
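As a quick illustration, here is a hypothetical extra handler reusing the imports and the app object from server.py above; the decorator, not the function name, is what registers it:

# Hypothetical handler, reusing the imports and app object from server.py:
# the function name is arbitrary; the @app.list_resources() decorator is
# what binds this coroutine to the "resources/list" request.
@app.list_resources()
async def whatever_name_you_like() -> list[types.Resource]:
    return [
        types.Resource(
            uri=AnyUrl("file:///greeting.txt"),
            name="greeting",
            mimeType="text/plain",
        )
    ]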

Why it matters

"After all this jargon, what actually changed before and after MCP? What is the point?"

Back to the essence of the problem: the point is a grand unification of Tool Use and Resource Retrieval.

First, in most cases the LLM lives on the public internet, while the user's resources usually do not. For example:

  • Personally, you want to organize and analyze your own material: your PDF documents and Excel spreadsheets sit on your local machine, right?

  • At work, you want the model to analyze your database directly: that database sits inside the company intranet, right?

So how can an LLM on the public internet reach resources inside your own (local) network?

Before MCP, LLM vendors introduced Function Calling, which lets you register tools for the LLM to use, such as searching papers or checking the weather. A function is simply a piece of code you write yourself; the LLM emits an instruction telling your program to call that code, the code fetches the information, and the result is sent back to the LLM. So even without MCP, Function Calling alone can cover the Tool Use and Resource Retrieval needs.
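For reference, here is a minimal sketch of bare Function Calling with no MCP involved, using the OpenAI Python SDK against an OpenAI-compatible endpoint. The get_weather tool, its schema, and the model name are illustrative assumptions, not part of the example above:

# A sketch of bare Function Calling (no MCP). The get_weather tool and its
# schema are hypothetical; the model only *asks* for the call, your code runs it.
import json

from openai import OpenAI

client = OpenAI()  # assumes an API key / compatible endpoint is configured

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather in Shenzhen?"}],
    tools=tools,
)

# If the model decides to use the tool, it returns a tool call for your code
# to execute; the model never runs the function itself.
tool_call = response.choices[0].message.tool_calls[0]
print(tool_call.function.name, json.loads(tool_call.function.arguments))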

Then what does MCP add?

  • A unified way to develop plugins: existing platforms such as GPTs (OpenAI), Coze (ByteDance), and Dify each have their own plugin interfaces. As a plugin provider, you currently have to adapt one plugin to multiple platforms. With MCP, you develop one MCP Client-Server pair and every platform can use it (provided MCP becomes the industry standard).

  • Cross-tool context: Function Calling emphasizes single, isolated calls to fetch information, with no context shared across tools. With MCP, multiple tools can share one context, which noticeably improves the experience.

3. Minimal LLM with MCP Example

An example of using MCP through an OpenAI-compatible API: the LLM directly operates a local SQLite database, so you can create tables, run CRUD operations, and so on in plain natural language.

Overview

sequenceDiagram
    participant User
    participant Main
    participant MCP Client
    participant MCP Server
    participant LLM
    participant Docker SQLite
    Main->>MCP Client: Create Client Instance
    MCP Client->>MCP Server: Initialize Connection
    MCP Server-->>MCP Client: Connection Established
    Note right of MCP Client: Tool Registration Phase
    MCP Client->>MCP Server: get_available_tools()
    MCP Server-->>MCP Client: Return Tool List
    MCP Client->>LLM: Register Tools via Function Calling
    loop Interactive Session
        User->>Main: Enter Prompt
        Main->>LLM: Send Query
        Note right of LLM: LLM decides whether to use tools
        alt LLM Decides to Use Tool
            LLM-->>Main: Request Tool Call
            Main->>MCP Client: Execute Tool Call
            MCP Client->>MCP Server: Call Tool
            MCP Server->>Docker SQLite: Execute Query
            Docker SQLite-->>MCP Server: Query Result
            MCP Server-->>MCP Client: Tool Response
            MCP Client-->>Main: Tool Result
            Main->>LLM: Send Tool Result
            LLM-->>Main: Final Response with Tool Info
        else Direct Response
            LLM-->>Main: Direct Response without Tool
        end
        Main-->>User: Display Response
    end

SHOW ME THE CODE

mcp-client

The code is on my GitHub: https://github.com/jalr4ever/Tiny-OAI-MCP-Agent

Run it and the result looks like this:

Enter your prompt (or 'quit' to exit): show tables

Response: Here are the tables available in the database:

1. **users**
2. **stock**

If you need more information about either of these tables or want to perform operations on them, just let me know!

Enter your prompt (or 'quit' to exit): create table students with 4 columuns for me, decide column detail by yourself

Response: The table **students** has been created successfully with the following columns:

1. **id**: INTEGER PRIMARY KEY
2. **name**: TEXT NOT NULL
3. **age**: INTEGER NOT NULL
4. **enrollment_date**: DATE NOT NULL

If you need to add data to this table or perform any further actions, feel free to ask!

Enter your prompt (or 'quit' to exit): show tables

Response: Now, in addition to the previous tables, the database also contains the newly created **students** table. Here’s the updated list:

1. **users**
2. **stock**
3. **students**

If you have any tasks in mind for these tables or need more information, just let me know!

mcp_server.py

This uses the official mcp_server provided by Anthropic, run directly with Docker. The code is here: https://github.com/modelcontextprotocol/servers/tree/main/src/sqlite

If you want to run this server yourself, download it locally and follow its README. A hedged sketch of driving this server directly from Python follows.
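To make the server side concrete, here is a sketch (not code from either repo above) of talking to that SQLite server over stdio with no LLM in the loop. The uvx entry point, the --db-path flag, and the list_tables tool name are assumptions taken from the server's README and should be verified against your version:

# A sketch, not verified code: launch the official SQLite MCP server over
# stdio and call one of its tools directly. The uvx command, the --db-path
# flag, and the "list_tables" tool name are assumptions from the README.
import asyncio

from mcp.client.session import ClientSession
from mcp.client.stdio import StdioServerParameters, stdio_client


async def main():
    params = StdioServerParameters(
        command="uvx", args=["mcp-server-sqlite", "--db-path", "./test.db"]
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Ask the server which tools it exposes.
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

            # Call a tool by name, exactly as an LLM-driven client would.
            result = await session.call_tool("list_tables", {})
            print(result)


asyncio.run(main())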

4. How Does an LLM Use MCP?

An LLM only offers Function Calling; it has no idea what an MCP Client or an MCP Server is. So how does an LLM get wired up with MCP?

Answer: through Function Calling!

As the previous sections show, the MCP Client starts by calling mcp_tools = await mcp_client.get_available_tools() to fetch the tool list (if you browse an MCP Server on GitHub, you will find it likewise has a method that returns its tool list). These tools are then registered with the LLM via Function Calling, and during the conversation the LLM decides on its own which tools to use. That also means a tool may occasionally be called incorrectly, since the decision is entirely the LLM's, but models keep getting smarter, so this should not be a problem going forward. A condensed sketch of this bridging logic follows.
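The sketch below is simplified from the idea described above, not the exact code from the linked repo. It assumes an OpenAI-compatible client object (llm), an MCP client wrapper exposing get_available_tools() and call_tool() that return/accept MCP Tool objects, and an illustrative model name:

# A sketch of the MCP <-> Function Calling bridge. The helper names
# get_available_tools/call_tool and the model name are assumptions.
import json


async def chat_once(llm, mcp_client, messages):
    # 1. Fetch the MCP server's tools and expose them via Function Calling.
    mcp_tools = await mcp_client.get_available_tools()
    tools = [
        {
            "type": "function",
            "function": {
                "name": t.name,
                "description": t.description,
                "parameters": t.inputSchema,  # MCP tools describe args as JSON Schema
            },
        }
        for t in mcp_tools
    ]

    # 2. Let the LLM decide on its own whether a tool is needed.
    response = llm.chat.completions.create(
        model="gpt-4o-mini", messages=messages, tools=tools
    )
    message = response.choices[0].message
    if not message.tool_calls:
        return message.content  # direct answer, no tool used

    # 3. Route the requested call back through the MCP client, then hand the
    #    tool result to the LLM so it can produce the final response.
    call = message.tool_calls[0]
    result = await mcp_client.call_tool(
        call.function.name, json.loads(call.function.arguments)
    )
    messages += [
        message,
        {"role": "tool", "tool_call_id": call.id, "content": str(result)},
    ]
    final = llm.chat.completions.create(
        model="gpt-4o-mini", messages=messages, tools=tools
    )
    return final.choices[0].message.content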

sequenceDiagram
    participant User
    participant Main
    participant MCP Client
    participant MCP Server
    participant LLM
    Note right of MCP Client: Initialization & Tool Registration
    Main->>MCP Client: Create Client Instance
    MCP Client->>MCP Server: get_available_tools()
    MCP Server-->>MCP Client: Return Tool List
    MCP Client->>LLM: Register Tools via Function Calling
    loop Interactive Session
        User->>Main: Enter Prompt
        Main->>LLM: Send Query
        Note right of LLM: LLM autonomously decides tool usage
        alt LLM Decides to Use Tool
            LLM-->>Main: Request Tool Call
            Main->>MCP Client: Execute Tool Call
            MCP Client->>MCP Server: Call Tool
            MCP Server-->>MCP Client: Tool Response
            MCP Client-->>Main: Tool Result
            Main->>LLM: Send Tool Result
            LLM-->>Main: Final Response
        else Direct Response
            LLM-->>Main: Direct Response
        end
        Main-->>User: Display Response
    end

