History Cleaning #545


Open
butterflai-ai opened this issue Apr 18, 2025 · 6 comments
Labels
enhancement New feature or request

Comments

@butterflai-ai

We need a way to manage context more precisely. In the current implementation, every call, tool invocation, and handoff is appended to the history, but we can’t control how many—or which—items are sent to the LLM each turn (e.g., only tool calls, only messages, only handoffs, or any combination).

It would also be helpful to add a sliding‑window history selector that retains only the last ten turns of conversation. That way, tools could leverage both short‑term (fast) and long‑term (slow) memories as needed.
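A sliding-window selector like the one described could be a small pure function over the history, assuming history items are plain dicts with a `"role"` key as in the Responses API input format. The helper name `keep_last_turns` is hypothetical:

```python
def keep_last_turns(history: list[dict], max_turns: int = 10) -> list[dict]:
    """Return only the items belonging to the last `max_turns` user turns.

    A "turn" is assumed to start at each item with role "user"; everything
    from the start of the Nth-from-last user turn onward is kept.
    """
    turn_starts = [i for i, item in enumerate(history) if item.get("role") == "user"]
    if len(turn_starts) <= max_turns:
        return history
    return history[turn_starts[-max_turns]:]
```

The result could then be passed as the input on the next run, while the full history stays available to tools as the "slow" long-term memory.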

What do you think? How are you currently handling history needs?

@butterflai-ai added the enhancement label on Apr 18, 2025
@rm-openai
Collaborator

Couldn't you truncate the input you send to `Runner.run`? Just filter or truncate it before the call. Let me know if there's something else you're picturing.
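For the "only tool calls, only messages" part of the request, that filtering could be a one-liner over the input items, assuming they carry a `"type"` field (`"message"`, `"function_call"`, ...). The helper name `filter_history` is hypothetical:

```python
def filter_history(history: list[dict], keep_types: set[str]) -> list[dict]:
    """Keep only the history items whose type is in `keep_types`."""
    return [item for item in history if item.get("type") in keep_types]

# e.g. send only messages, dropping tool calls and their outputs:
# result = await Runner.run(agent, filter_history(history, {"message"}))
```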

@butterflai-ai
Author

Yeah, but what I want is that when I call an agent as a tool, I can select which input the "orchestrator" agent sends to the following agent.

@rm-openai
Collaborator

You can do that already. For example:

```python
@function_tool
async def call_my_agent(input: MyType) -> str:
    # Create a new input object (e.g., a filtered or truncated history)
    new_input = ...
    # Call the agent with the new input
    result = await Runner.run(new_agent, new_input)
    return result.final_output

orchestrator = Agent(
    ...,
    tools=[call_my_agent],
)
```

Would that work?

@butterflai-ai
Author

But can I select a subset of the general history, with the `RunContextWrapper` or something like that?

@rm-openai
Collaborator

Oh hmm, no. That's only available for handoffs. We could/should add this to the context, as you requested. PR welcome, otherwise I can get to it at some point.
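For the handoff case mentioned above, the selection itself is just a function over the history; a minimal sketch of the pure filtering logic (the wiring comment assumes the `handoff(...)` helper and its `input_filter` parameter from the SDK, and `last_n_items` is a hypothetical name):

```python
def last_n_items(items: list[dict], n: int) -> list[dict]:
    """Keep only the most recent n history items for the next agent."""
    return items[-n:]

# Wiring sketch: the input_filter receives the handoff input data and
# returns a filtered version of it, e.g.
# handoff(next_agent, input_filter=my_filter)
```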

@koakuma-chan

@rm-openai It would be helpful if there was a built-in feature to compress the history after it reaches a certain size threshold. A la /compact in Claude Code.
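A threshold-triggered compaction like that could look roughly like the following, assuming a `summarize` callable (e.g. an LLM call) that collapses older items into a single summary item; all names here are hypothetical:

```python
from typing import Callable

def compact(
    history: list[dict],
    max_items: int,
    summarize: Callable[[list[dict]], dict],
) -> list[dict]:
    """Once history exceeds max_items, replace the older half with a summary.

    The most recent max_items // 2 items are kept verbatim; everything
    before them is collapsed into one summary item.
    """
    if len(history) <= max_items:
        return history
    keep = max_items // 2
    head, tail = history[:-keep], history[-keep:]
    return [summarize(head)] + tail
```

In practice `summarize` would be a model call; the sketch only shows where the threshold check and the split would sit.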
