Feature request: ability to use MCP servers #696
Comments
I think this would be a very interesting feature. There's a nifty little project called fastmcp (https://github.com/jlowin/fast mcp). Think FastAPI, but it allows your LLM to access external functionality. For example, you could create an MCP server that connects to your email, and then on the command line you could ask llm to search or send email, etc. Currently MCP is partially implemented in Claude Desktop, but implementing it in llm would allow for complete integration and the ability to use other open source models. Think of it as a standardised way of implementing function calling.
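For readers who haven't seen FastMCP, here is a rough sketch of what such a server can look like; the `search_email` tool and its body are made up purely for illustration:

```python
# Minimal FastMCP server sketch (hypothetical email example).
from fastmcp import FastMCP

mcp = FastMCP("Email tools")

@mcp.tool()
def search_email(query: str) -> str:
    """Search the mailbox for messages matching the query."""
    # A real server would talk to an IMAP account here; this is a stub.
    return f"(pretend results for {query!r})"

if __name__ == "__main__":
    # Runs the server over stdio so an MCP client (e.g. Claude Desktop,
    # or a future llm integration) can launch it and call its tools.
    mcp.run()
```

Any client that speaks MCP can then discover and call `search_email` without knowing anything about the underlying email code.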
Shouldn't be too hard with FastMCP, no?
Related / required: #607
Corrected link for fastmcp: https://github.com/jlowin/fastmcp
Spotted this nice CLI that you can configure to use MCP servers!
Interesting that this is also named llm.
What does MCP stand for?
Right
Are you considering implementing MCP, @simonw?
FastMCP is now the official MCP SDK: https://github.com/modelcontextprotocol/python-sdk But yeah, this would be super cool. I've recently seen just how powerful and useful MCP can be and would love to play with it more, but I don't want to have to use Claude Desktop to do it. (It's not available on Linux, anyway.) Most of my AI queries these days are via llm. I wouldn't mind contributing to this integration; I'm just not familiar enough yet with the inner workings of llm.
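For anyone sizing up what an llm integration would need to do, the client side of the official python-sdk looks roughly like this, following its documented stdio quickstart; the server command and tool name here are placeholders, not real components of llm:

```python
# Sketch of an MCP client using the official python-sdk (stdio transport).
# "email_server.py" and "search_email" are made-up examples.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(command="python", args=["email_server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the server exposes...
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # ...and call one of them with JSON-style arguments.
            result = await session.call_tool("search_email", {"query": "invoices"})
            print(result.content)

asyncio.run(main())
```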
So I've actually written my own CLI MCP tool using Typer and Python, and it's open sourced here:
This CLI uses LangChain under the hood, so it should work for any model where the LangChain folks support tool calling. This support is mostly about knowing how to tell the model about the available tools. For example, look at this code: And compare it with the first code listing on this page:
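For comparison, the basic pattern for telling a model about its tools in LangChain looks roughly like this; the tool itself is a dummy example, not code from the CLI above:

```python
# Rough LangChain tool-calling sketch; search_email is a made-up tool.
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def search_email(query: str) -> str:
    """Search the mailbox for messages matching the query."""
    return f"(pretend results for {query!r})"

llm = ChatOpenAI(model="gpt-4o-mini")
llm_with_tools = llm.bind_tools([search_email])

# The model decides whether to call the tool; tool_calls holds its requests.
response = llm_with_tools.invoke("Find the invoice emails from last week")
print(response.tool_calls)
```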
I plan to implement MCP support as a plugin on top of tools, once that lands:
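To make that concrete, an MCP plugin layered on top of a future tools mechanism might look something like the sketch below. The `register_tools` hook and the helper shown here are assumptions about an API that had not landed at the time of this thread, not something llm documents here:

```python
# Hypothetical sketch of an llm plugin bridging MCP tools, assuming a
# register_tools plugin hook exists once tools support lands.
import llm

def fetch_mcp_tools():
    """Placeholder: connect to configured MCP servers and return their tools
    as plain Python callables with names, docstrings and type hints."""
    return []

@llm.hookimpl
def register_tools(register):
    # Expose every discovered MCP tool to llm's tool-calling machinery.
    for tool_fn in fetch_mcp_tools():
        register(tool_fn)
```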