What is MCP?
MCP (Model Context Protocol) is an open standard introduced by Anthropic that unifies how large language models (LLMs) communicate with external data sources and tools. Its main goal is to solve the problem that today's AI models cannot reach their full potential because of data silos: MCP lets AI applications safely access and operate on both local and remote data, giving AI applications a single interface for connecting to everything.
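Concretely, MCP messages are framed as JSON-RPC 2.0. As a rough sketch (the tool name `web_search` and its arguments below are made up for illustration, not part of the protocol), a client asking an MCP server to run a tool sends a `tools/call` request shaped roughly like this:

```python
import json

# Sketch of an MCP "tools/call" request. MCP transports JSON-RPC 2.0 messages;
# the tool name "web_search" and its arguments are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "web_search",                   # which tool to run
        "arguments": {"query": "what is MCP"},  # arguments produced by the LLM
    },
}
print(json.dumps(request, indent=2))
```

The server replies with a matching JSON-RPC response carrying the tool's result, which the client then feeds back to the LLM.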
PlantUML code:

@startuml
'https://plantuml.com/sequence-diagram
autonumber
actor "User\n(You/I/Other_Humans/...)" as u
entity "AI_Client\n(Cherry/OWU/Cursor/Cline/...)" as c
entity "AI_LLM\n(GPT/Claude/Gemini/Deepseek/...)" as l
collections "Functions\n(calculator/command_executor/searcher/...)" as f
collections "MCP_Server\n(Yours/Mine/Other_Human's/...)" as m
u -> c: Set up some Function properties before use,\n such as a google-searcher key for searching the web.
opt
c -> c: Check some Function settings,\n such as whether the google-searcher key is valid.
end
u -> c: Type some natural-language questions,\n **with some Functions turned on, such as web_search.**\n (A special prompt may be needed for better results)
alt Some Functions turned on
c -> c: Build the request payload\n with the **TOOLS** property.
autonumber 1 "<font color=red><b>0 Request_to_LLM"
c -> l: Natural-language prompt as the message content for the "user" role,\n the **TOOLS** property formed as an object containing the Functions the user set up before.
autonumber 6
l -> l: SOMETHING GOD-MAGIC HAPPENS!!!\n The LLM **DECIDES** whether or not to use the Functions based on the prompt.\n It tells us by returning the NAME and ARGUMENTS of the Function\n it thinks we should use.
c <-- l: Return the response of THE GOD-MAGIC THING,\n containing the Function NAME and ARGUMENTS.
alt no MCP
c -> c: Gets what it wants and runs its feature!\n <font color=red>Different feature code for each Function, **ONE ON ONE**. It's sooooo painful for the clients.</font>\n Such as a 'Search from google' feature, a 'run some shell' feature, and so on...
c -> f: Invoke the Function with the arguments taken from the LLM response.
f -> f: Just runs whatever its code says.\n Searching... Executing... Calculating...\n Or something more complex.
c <-- f: Result of the Function, such as the result of a web search, a calculation, or something more complex.
else <font color=red>MCP!MCP!!MCP!!!MCP!!!!HERE!!!!!</font>
autonumber 8
c -> c: Gets what it wants and runs its <font color=red>MCPClient</font> feature!\n <font color=red>One piece of feature code for all the different MCP servers, **ONE FOR ALL OF THEM**. It's sooooo HAPPY for the clients.</font>\n Such as a 'Search from google MCP SERVER' feature, a 'run some shell MCP SERVER' feature, and so on...
c -> m: Invoke the MCP Server with the arguments taken from the LLM response.
m -> m: Just runs whatever its code says.\n Searching... Executing... Calculating...\n Or something more complex.
c <-- m: Result of the tool call, such as the result of a web search, a calculation, or something more complex.
end
c -> c: Build the next request to the LLM with the function's result.
autonumber 2 "<font color=red><b>0 Request_to_LLM"
c -> l: Request the GOD MAGIC from the LLM again, **but this time carrying the result of the function run**,\n plus, usually, all the previous messages, like a normal multi-turn dialogue.
autonumber 14
l -> l: SOMETHING GOD-MAGIC HAPPENS!!! AGAIN!!!\n Actually it's just an ordinary completion, like every other call we make.
c <-- l: Get the final response from LLM!
c -> c: Render the beautiful things to show for us.
u <-- c: Get what we want!!!\n OR not, so sad, but then we just need one mooooooore request.
else
autonumber 4
c -> l: Just a normal request, like alllllllll the other times.
end
@enduml
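The loop in the diagram can be condensed into a self-contained sketch. `fake_llm` and `calculator` here are invented stand-ins: a real client would POST `messages` plus a tools list to an actual LLM API over HTTP, and dispatch to real Functions or MCP servers.

```python
# Self-contained sketch of the tool-calling loop above.
# fake_llm and calculator are stand-ins invented for illustration.

def calculator(expr):
    """Toy Function: only evaluates 'a+b'."""
    a, b = expr.split("+")
    return str(int(a) + int(b))

TOOLS = {"calculator": calculator}

def fake_llm(messages, tools):
    """Pretend LLM. It 'DECIDES' to call a tool by returning its NAME and
    ARGUMENTS (steps 5-7 in the diagram), or answers in plain text."""
    last = messages[-1]
    if last["role"] == "user" and "calculate" in last["content"] and "calculator" in tools:
        return {"tool_call": {"name": "calculator", "arguments": {"expr": "2+3"}}}
    if last["role"] == "tool":
        # Second request: fold the tool result into the final answer.
        return {"content": f"The result is {last['content']}."}
    return {"content": "Hello!"}

def run_turn(user_text):
    messages = [{"role": "user", "content": user_text}]
    reply = fake_llm(messages, TOOLS)             # Request_to_LLM #1
    while "tool_call" in reply:                   # the LLM asked for a tool run
        call = reply["tool_call"]
        result = TOOLS[call["name"]](**call["arguments"])
        messages.append({"role": "tool", "content": result})
        reply = fake_llm(messages, TOOLS)         # Request_to_LLM #2, with result
    return reply["content"]

print(run_turn("please calculate 2+3"))  # → The result is 5.
```

In the "no MCP" branch, `TOOLS` is a pile of per-feature code inside the client; in the MCP branch, the same lookup is replaced by one generic MCP client talking to any number of servers.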
Rendering site: click to take a look

Summary
It decouples function invocation from the client.
So it doesn't actually bring anything new, but it is the cornerstone on which an ecosystem can grow.
Simply put, it is just multiple LLM calls, where each call is followed by running the corresponding function or script. It automates things a little.
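That decoupling can be sketched in a few lines (all names below are hypothetical): without MCP, the client hard-codes one branch per feature; with an MCP-style uniform call shape, a single dispatcher serves every server.

```python
# Hypothetical sketch of the decoupling. All names are made up for illustration.

# Without MCP: one dedicated branch (and one dedicated stub) per feature.
def do_web_search(args):
    return "search: " + args["query"]

def do_run_shell(args):
    return "shell: " + args["cmd"]

def invoke_without_mcp(name, arguments):
    if name == "web_search":                 # one branch...
        return do_web_search(arguments)
    elif name == "run_shell":                # ...per feature, forever
        return do_run_shell(arguments)
    raise ValueError(f"unknown function: {name}")

# With an MCP-style uniform interface: every server exposes the same call
# shape, so one generic dispatcher handles all of them.
class FakeMCPServer:
    """Stand-in for a real MCP server (real ones speak JSON-RPC over stdio
    or HTTP, but from the client's side the shape is the same)."""
    def __init__(self, tools):
        self.tools = tools

    def call(self, name, arguments):
        return self.tools[name](**arguments)

def invoke_with_mcp(servers, name, arguments):
    for server in servers:                   # the SAME code path for every server
        if name in server.tools:
            return server.call(name, arguments)
    raise ValueError(f"no server offers: {name}")

search_server = FakeMCPServer({"web_search": lambda query: f"results for {query!r}"})
shell_server = FakeMCPServer({"run_shell": lambda cmd: f"ran {cmd!r}"})
print(invoke_with_mcp([search_server, shell_server], "run_shell", {"cmd": "ls"}))  # → ran 'ls'
```

Adding a third capability to `invoke_without_mcp` means editing the client; adding one to the MCP version just means plugging in another server.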
Finally
Appendix: a curated list of MCP servers
