
Start and finish streaming trace in impl method #540


Merged: 1 commit merged into main on Apr 21, 2025

Conversation

rm-openai (Collaborator) commented on Apr 17, 2025

Closes #435 and closes #538.

Unit tests.
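
For context, here is a minimal sketch of what "start and finish the streaming trace in the impl method" amounts to. This is an illustration, not the actual runner code: it assumes the SDK's top-level trace() factory and the Trace.start()/Trace.finish() methods, and _streaming_impl / run_the_streamed_loop are hypothetical names.

    from agents import trace

    async def _streaming_impl(agent, input):
        # Hypothetical shape of the impl task: create and start the trace in the
        # background task that actually runs the streamed agent loop, so the trace
        # contextvar is set where the work happens...
        streaming_trace = trace(workflow_name="Agent workflow")
        streaming_trace.start(mark_as_current=True)
        try:
            await run_the_streamed_loop(agent, input)  # hypothetical streamed loop
        finally:
            # ...and finish it when the stream completes, rather than in
            # run_streamed(), which returns before any events are consumed.
            streaming_trace.finish(reset_current=True)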

JilinJL commented on Apr 18, 2025

@rm-openai Is there any problem here? Please take a look when you get a chance, 🌹🌹
#516

    @@ -404,10 +404,6 @@ def run_streamed(
            disabled=run_config.tracing_disabled,
        )
    )
    # Need to start the trace here, because the current trace contextvar is captured at

A contributor commented:

I might be wrong but won't this break the following use case:

    result = Runner.run_streamed(
        agent, "Write ten haikus about recursion in programming."
    )

    i = 0
    async for event in result.stream_events():
        with custom_span("Processing event " + str(i)):
            print(event)
            i += 1

rm-openai (Collaborator, Author) replied:

Yup, you're right. We talked on Slack; the fix for this would be to manually attach to the trace via:

    async for event in result.stream_events():
        with custom_span("Processing event " + str(i), parent=result.trace):
            print(event)
            i += 1
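
Put together, a self-contained version of that workaround could look like the sketch below (assuming the top-level agents exports for Agent, Runner, and custom_span; the agent definition itself is just illustrative):

    import asyncio

    from agents import Agent, Runner, custom_span

    async def main():
        # Illustrative agent; any streamed run works the same way.
        agent = Agent(name="Haiku bot", instructions="You write haikus.")
        result = Runner.run_streamed(
            agent, "Write ten haikus about recursion in programming."
        )

        i = 0
        async for event in result.stream_events():
            # Attach each span to the run's trace explicitly, since the trace
            # contextvar is now set inside the streaming impl task, not here.
            with custom_span("Processing event " + str(i), parent=result.trace):
                print(event)
                i += 1

    asyncio.run(main())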

rm-openai merged commit 616d8e7 into main on Apr 21, 2025
10 checks passed
rm-openai deleted the rm/pr540 branch on April 21, 2025 at 17:08
hankun11 pushed a commit to hankun11/openai-agents-python that referenced this pull request Apr 23, 2025
* upstream/main:
  Examples: Fix financial_research_agent instructions (openai#573)
  Adding extra_headers parameters to ModelSettings (openai#550)
  v0.0.12 (openai#564)
  Pass through organization/project headers to tracing backend, fix speech_group enum (openai#562)
  Docs and tests for litellm (openai#561)
  RFC: automatically use litellm if possible (openai#534)
  Fix visualize graph filename to without extension. (openai#554)
  Start and finish streaming trace in impl metod (openai#540)
  Enable non-strict output types (openai#539)
  Examples for image inputs (openai#553)