Returning function call's responses in raw_response_event
#328
Not sure if this is exactly what you're after, but if you have the function call return an object (a dataclass, for example) and set the output_type of the agent making the call to that dataclass type, then the results returned from the agent are not reinterpreted by the LLM. So in a weather_agent you could have code along these lines to set up the function call, and then when you instantiate the weather_agent you pass in the output_type.
Sorry, my question might be confusing. What I want to ask is: I use this to restream the raw output from the LLM so the stream can be read on the client side:

```python
async for event in stream.stream_events():
    if event.type == "raw_response_event":
        event_data = event.data
        event_type = event_data.type
        yield "event: " + event_type + "\n"
        yield "data: " + event_data.model_dump_json() + "\n\n"
```

There isn't any official event type that indicates the finish of a function call (https://platform.openai.com/docs/api-reference/responses-streaming/response/output_text). The closest thing may be
@levulinh Would a use case for this be to retrieve the id of the previous conversation?
@abbas-khaku I suppose handling the history on the server side would work in this case. However, my UI library doesn't seem to support including the message ids in the request. I think I should rephrase my question to "Is there any good React UI library that works with the Agents SDK?".
This issue is stale because it has been open for 7 days with no activity.
This issue was closed because it has been inactive for 3 days since being marked as stale.
I am currently trying to build a chat web app for my agent, which is hosted using fastapi. Since the UI library I'm using accepts the OpenAI Responses API format, I return exactly what is yielded in raw_response_event. However, I found that there's no way to retrieve the responses from my custom function calls, so I can't build a complete message history with function call results. I understand that I can explicitly get the function's result from run_item_stream_event, but is there a way to do this without introducing a new type of event in the stream? Does anyone have a solution for this? And how are you building a web UI for the agents?
Any suggestions are much appreciated. Thanks.