Returning function call's responses in raw_response_event #328

Closed
levulinh opened this issue Mar 25, 2025 · 6 comments
Labels
question Question about using the SDK stale

Comments

@levulinh
I am currently trying to build a chat web app for my agent, which is hosted with FastAPI. Since the UI library I'm using accepts the OpenAI Responses API format, I return exactly what is yielded in raw_response_event. However, I found that there's no way to retrieve the responses from my custom function calls, so I can't build a complete message history with function call results. I understand that I can explicitly get a function's result from run_item_stream_event, but is there a way to do this without introducing a new type of event in the stream?
Does anyone have a solution for this? And how are you building a web UI for the agents?
Any suggestions are much appreciated. Thanks.

@levulinh levulinh added the question Question about using the SDK label Mar 25, 2025
@neuronwave
Not sure if this is exactly what you're after, but if your function tool returns an object (e.g. a dataclass or Pydantic model) and you set the output_type of the agent making the call to that class, then the results returned from the agent are not reinterpreted by the LLM.

So in a weather_agent you could have the following code to set up the function tool:

from pydantic import BaseModel

from agents import RunContextWrapper, function_tool


class Weather(BaseModel):
    city: str
    temperature_range: str
    conditions: str


@function_tool
def get_weather(ctx: RunContextWrapper, city: str | None) -> Weather:
    print("[debug] get_weather called")
    print(city)
    return Weather(city=city, temperature_range="14-20C", conditions="Sunny with wind")

and then, when you instantiate the weather_agent, you can pass in the output_type, e.g.:

class WeatherAgent(Agent):

    def __init__(self, raw=False):
        super().__init__(
            name="Weather agent",
            instructions="You are an agent who can get the weather",
            handoff_description="A weather predictor that can provide accurate forecasts",
            tools=[get_weather],
            output_type=Weather,
        )

@levulinh
Author

Sorry, my question might have been confusing. What I want to ask is: when I use this to re-stream the raw output from the LLM, so the stream can be read by openai.responses():

async for event in stream.stream_events():
    if event.type == "raw_response_event":
        event_data = event.data
        event_type = event_data.type
        yield "event: " + event_type + "\n"
        yield "data: " + event_data.model_dump_json() + "\n\n"

there isn't any official event type that indicates the completion of a function call (https://platform.openai.com/docs/api-reference/responses-streaming/response/output_text). The closest thing may be response.output_item.done, but it doesn't include the function call's results. So I wonder whether there's a way to deliver the function results using only the raw_response_event event type.
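As a workaround (not an official SDK feature), you could merge the two streams yourself: keep forwarding raw_response_event data unchanged, and when a run_item_stream_event carries a finished tool call, serialize its output as an extra custom SSE event that your UI can handle separately. A minimal sketch, where the item-type string and attribute names are assumptions about the Agents SDK, not confirmed API:

```python
import json


def sse_event(event_type: str, payload: dict) -> str:
    """Format a payload as a single Server-Sent Events message."""
    return f"event: {event_type}\ndata: {json.dumps(payload)}\n\n"


# Hypothetical merged loop (requires the openai-agents SDK at runtime;
# "tool_call_output_item" and item.output are assumed names):
#
# async for event in stream.stream_events():
#     if event.type == "raw_response_event":
#         yield sse_event(event.data.type, event.data.model_dump())
#     elif event.type == "run_item_stream_event":
#         item = event.item
#         if item.type == "tool_call_output_item":
#             # Custom event name, chosen here; not part of the Responses API.
#             yield sse_event("tool.output", {"output": str(item.output)})
```

The trade-off is that the client must ignore or specially handle the custom "tool.output" event name, since it is not part of the official Responses streaming event set.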

@abbas-khaku
@levulinh Would a use case for this be to retrieve the id of the previous conversation?

@levulinh
Author

@abbas-khaku I suppose handling the history on the server side would work in this case. However, my UI library doesn't seem to support including message ids in the request. I think I should rephrase my question as: "Is there any good React UI library that works with the Agents SDK?"

github-actions bot commented Apr 8, 2025

This issue is stale because it has been open for 7 days with no activity.

@github-actions github-actions bot added the stale label Apr 8, 2025
This issue was closed because it has been inactive for 3 days since being marked as stale.

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale Apr 24, 2025