Tools #898


Open
simonw opened this issue Apr 8, 2025 · 7 comments
Comments

simonw (Owner) commented Apr 8, 2025

Starting a new tracking issue for tool support, the next big LLM feature.

Tools will be Python functions; LLM will provide an abstraction to get them working against different providers. This will be covered by both the Python library and the CLI tool.

Plugins will be able to provide new tools.
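Since the API hasn't been designed yet, here's a hedged sketch of what "tools as Python functions" might look like: a registry decorator that derives a provider-neutral tool definition from a function's signature and docstring. Every name here (`tool`, `TOOLS`, the schema shape) is invented for illustration, not LLM's actual API.

```python
import inspect

# Hypothetical registry: maps tool names to provider-neutral definitions.
TOOLS = {}

def tool(fn):
    """Register a plain Python function as a tool.

    Derives a naive parameter schema from the signature (every
    parameter treated as a string for simplicity)."""
    params = {
        name: {"type": "string"}
        for name in inspect.signature(fn).parameters
    }
    TOOLS[fn.__name__] = {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": params,
        "implementation": fn,
    }
    return fn

@tool
def lookup_weather(city):
    "Return a canned weather string for a city."
    return f"Sunny in {city}"
```

A plugin could then contribute tools simply by importing the decorator and defining functions; the abstraction layer would translate the registered definitions into each provider's tool-calling format.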

Previous related issues:

The relatively new Schemas feature is a useful point of reference too.

simonw (Owner, Author) commented Apr 8, 2025

I think the first step here is going to be designing and adding a tool definition abstraction to the Prompt class. Next step is the same thing but for tool requests on the Response class.

The fiddly bit will be which part of the code handles acting on those tool requests, executing the associated Python functions and triggering a follow-up prompt with the results.
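That fiddly loop could look something like the following sketch. `ToolRequest` and `execute_tool_requests` are invented names for illustration; the real abstraction on the `Response` class hasn't been designed yet.

```python
from dataclasses import dataclass

@dataclass
class ToolRequest:
    """A single tool call requested by the model (hypothetical shape)."""
    name: str
    arguments: dict

def execute_tool_requests(requests, functions):
    """Run each requested Python function and collect the results.

    The caller would then feed `results` back to the model as a
    follow-up prompt."""
    results = []
    for request in requests:
        fn = functions[request.name]
        results.append({"tool": request.name, "output": fn(**request.arguments)})
    return results

requests = [ToolRequest("add", {"a": 2, "b": 3})]
results = execute_tool_requests(requests, {"add": lambda a, b: a + b})
# results == [{"tool": "add", "output": 5}]
```

The open design question is exactly where this loop lives: in `llm.Conversation`, in a `response.reply(...)` chain, or in some dedicated executor.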

I had originally intended this to be something that llm.Conversation handles - see #90 - but I'm open to doing it through response.reply(...) instead. I'm beginning to suspect that linear conversations may be a limiting design decision, since I may some day want to support branching conversations, especially to take advantage of prompt caching, fragments, and attachments.

I'm going to want asyncio support for this stuff too - which could be as simple (at first) as executing non-async tool functions in a thread. That might only be necessary for functions marked as long-running; there's no need to do it for Python functions that execute quickly.
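The "run sync tools in a thread" idea maps directly onto `asyncio.to_thread`. A minimal sketch, assuming a hypothetical `long_running` flag on the tool:

```python
import asyncio
import time

def slow_tool(x):
    time.sleep(0.01)  # stand-in for a blocking call (HTTP request, subprocess)
    return x * 2

async def run_tool(fn, *args, long_running=False):
    """Dispatch a sync tool function from async code.

    Long-running tools go to a worker thread so they don't block the
    event loop; fast tools run inline."""
    if long_running:
        return await asyncio.to_thread(fn, *args)
    return fn(*args)

result = asyncio.run(run_tool(slow_tool, 21, long_running=True))
# result == 42
```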

yibie commented Apr 10, 2025

Could I ask, will MCP support be included in the tools feature?

simonw (Owner, Author) commented Apr 10, 2025

I don't plan to implement MCP directly in LLM core, but I anticipate building a plugin that adds MCP support to LLM and builds on top of the new tools facility.

That way I can iterate on the plugin as MCP itself evolves independently of LLM core.

ghukill commented Apr 12, 2025

I stumbled on this issue just a day before learning about Anthropic's prompt caching feature, and kind of glazed over the prompt caching mention at the time.

But as I found myself getting acquainted with both at the same time, it does feel like there is some affinity there.

shane-kercheval commented Apr 14, 2025

Hey, I came across this issue through your blog. I think I’ve built something along similar lines, though it looks like we’ve taken different approaches. My project is mostly for learning (not trying to promote anything here), so feel free to take or repurpose anything that might be useful.

Here are a few relevant files:

gwbischof commented Apr 26, 2025

It would be great if this could support human-in-the-loop (HITL) interaction.
I think Codex (https://github.com/openai/codex) did a good job with making a CLI/TUI for interfacing with its coding agent.

For example, if you give your LLM a tool to execute commands on your computer, then I think it makes sense that you get a prompt to accept the action or suggest changes to it.

Codex looks like this:
(screenshot of Codex's command-approval prompt)

My use case is that I would like to make custom agents with google-adk and have a CLI interface for them.

simonw (Owner, Author) commented Apr 26, 2025

That's a really good call. I was going to leave that entirely up to plugins but it would make sense for the core library to include an easy "make this action a human-in-the-loop one" flag.
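A flag like that could be as simple as a boolean on the tool definition, checked before execution. Everything here (`requires_approval`, `run_with_approval`, the dict shape) is a hypothetical sketch, not LLM's actual API:

```python
def run_with_approval(tool, arguments, ask_user=input):
    """Execute a tool, pausing for human confirmation if it is flagged.

    `ask_user` defaults to input() for an interactive CLI, but is
    injectable so a TUI (or a test) can supply its own prompt."""
    if tool.get("requires_approval"):
        answer = ask_user(f"Run {tool['name']} with {arguments}? [y/N] ")
        if answer.strip().lower() != "y":
            return {"status": "rejected"}
    return {"status": "ok", "output": tool["fn"](**arguments)}

# Example: a dangerous tool that should always prompt first.
delete_file = {
    "name": "delete_file",
    "requires_approval": True,
    "fn": lambda path: f"deleted {path}",
}
```

Plugins could then layer richer behaviour (edit-before-run, per-session allowlists) on top of the same flag.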

5 participants