Support AWS Bedrock #86


Closed
pontus-devoteam opened this issue Mar 12, 2025 · 12 comments
Labels
needs-more-info Waiting for a reply/more info from the author question Question about using the SDK stale

Comments

@pontus-devoteam

pontus-devoteam commented Mar 12, 2025

I would like to run this through an AWS Bedrock instance instead of your API.

@pontus-devoteam pontus-devoteam added the enhancement New feature or request label Mar 12, 2025
@rm-openai rm-openai added question Question about using the SDK needs-more-info Waiting for a reply/more info from the author and removed enhancement New feature or request labels Mar 12, 2025
@rm-openai
Collaborator

You can use any model or model provider. The easy way is to use an OpenAI-compatible endpoint with base_url. The hard way is to build your own implementation of Model. Does that work?
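As a sketch of the easy way: the official OpenAI Python client (which the SDK uses by default) reads OPENAI_BASE_URL and OPENAI_API_KEY from the environment, so an OpenAI-compatible gateway can be pointed at without code changes. Both values below are placeholders, not real endpoints:

```shell
# Point the default OpenAI client at any OpenAI-compatible endpoint.
# URL and key are placeholders for your own gateway.
export OPENAI_BASE_URL="https://my-openai-compatible-gateway.example.com/v1"
export OPENAI_API_KEY="your-gateway-api-key"
```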

@aristide1997

Unfortunately, Bedrock is not compatible with OpenAI-style API requests unless it goes through a Bedrock gateway (third party). Is a native Bedrock client implementation on the table?

@rm-openai
Collaborator

It would be hard for the OpenAI team to prioritize this. The SDK has the abstractions to support it (i.e. the Model interface), so we'd be open to PRs adding support via that interface.
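To illustrate the "hard way" being discussed: a provider adapter can satisfy a Model-style interface by translating between the SDK's calls and Bedrock's invoke API. The class and method names below are illustrative only, not the SDK's actual Model interface, and the injected callable stands in for a real bedrock-runtime client:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch: an adapter that delegates to a Bedrock-style
# invoke call. BedrockModel and `invoke` are illustrative names, not
# part of the openai-agents SDK.
@dataclass
class BedrockModel:
    model_id: str
    invoke: Callable[[str, str], str]  # (model_id, prompt) -> completion text

    def get_response(self, prompt: str) -> str:
        # A real implementation would translate the SDK's message format
        # to the Bedrock request schema and back; here we just delegate.
        return self.invoke(self.model_id, prompt)

# Stand-in for a real bedrock-runtime client call:
def fake_invoke(model_id: str, prompt: str) -> str:
    return f"[{model_id}] echo: {prompt}"

model = BedrockModel("anthropic.claude-3-sonnet", fake_invoke)
print(model.get_response("hello"))  # → [anthropic.claude-3-sonnet] echo: hello
```

In a real adapter, `invoke` would wrap boto3's bedrock-runtime client and handle Bedrock's per-model request/response schemas.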

@pontus-devoteam
Author

I will have a look and see how I could contribute to that. One way would be to add support for LiteLLM; it takes care of all the variations in schemas, authentication, etc.
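For context, one way LiteLLM bridges this gap is its proxy mode, which exposes Bedrock (and other providers) behind an OpenAI-compatible endpoint that the SDK could then reach via base_url. A rough sketch, where the model ID is illustrative and AWS credentials are assumed to be configured in the environment:

```shell
pip install 'litellm[proxy]'
# Serve a Bedrock model behind an OpenAI-compatible API
# (model ID is illustrative; requires AWS credentials in the env).
litellm --model bedrock/anthropic.claude-3-sonnet-20240229-v1:0
# The proxy listens on port 4000 by default; point the SDK's
# base_url at http://localhost:4000.
```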

@pontus-devoteam
Author

pontus-devoteam commented Mar 13, 2025

Created a PR to support this: #125

@rm-openai
Collaborator

Thanks, this is a great PR. I think the right move is to add a separate "extensions" library as an optional dependency, and put this there. Let me talk to the team internally and see what they think!

@pontus-devoteam
Author

Yeah, from a dependency standpoint I understand, but it's also kind of part of the core of the Agent SDK to give developers a choice of which provider to use. Offering only a base_url and api_key wouldn't come close to covering many providers' configuration requirements.

Also, a big hassle LiteLLM solves is rate-limit management for each provider, and so on; you get the idea.

Happy to contribute

@pontus-devoteam
Author

In the meantime, if anyone wants to build using Go, you could use:
https://github.com/pontus-devoteam/agent-sdk-go


This issue is stale because it has been open for 7 days with no activity.

@github-actions github-actions bot added the stale label Mar 23, 2025
@EnggQasim
Contributor

Created a PR to support this: #125

Love to see a LiteLLM contribution to the openai-agents SDK. Thank you @pontus-devoteam!

@github-actions github-actions bot removed the stale label Mar 24, 2025

This issue is stale because it has been open for 7 days with no activity.

@github-actions github-actions bot added the stale label Mar 31, 2025

This issue was closed because it has been inactive for 3 days since being marked as stale.

@github-actions github-actions bot closed this as not planned Apr 26, 2025
4 participants