## Custom Model Provider Not Working #485
### Comments

> What errors are you running into?

> This issue is stale because it has been open for 7 days with no activity.

> I have a similar issue which might help:
> Then:
> Raises:
### 📌 Context
I'm currently working on integrating a custom LLM into my application. Specifically, I'm using Groq's `llama3-8b-8192` model through the `ChatOpenAI` class from the `langchain_openai` package:

I aim to integrate this model into my existing setup, which utilizes a custom `ModelProvider`:

Please guide me on how to do this.
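The code snippets from the original post did not survive the page scrape, so the following is only a minimal, self-contained sketch of the pattern being described: a provider that maps model names to configured chat clients. `ChatModel` is a hypothetical stand-in for `ChatOpenAI` pointed at Groq's OpenAI-compatible endpoint; all names here are illustrative, not taken from the issue.

```python
# Hedged sketch, assuming the goal is a name -> configured-client registry.
# ChatModel is a placeholder for a real client such as
# langchain_openai.ChatOpenAI(base_url=..., api_key=..., model=...).
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class ChatModel:
    """Placeholder for a chat client like ChatOpenAI."""
    model: str
    base_url: str
    api_key: str


class ModelProvider:
    """Resolves model names to configured chat clients."""

    def __init__(self) -> None:
        self._factories: Dict[str, Callable[[], ChatModel]] = {}

    def register(self, name: str, factory: Callable[[], ChatModel]) -> None:
        # Store a factory so each lookup gets a freshly configured client.
        self._factories[name] = factory

    def get_model(self, name: str) -> ChatModel:
        try:
            return self._factories[name]()
        except KeyError:
            raise ValueError(f"unknown model: {name}") from None


provider = ModelProvider()
provider.register(
    "llama3-8b-8192",
    lambda: ChatModel(
        model="llama3-8b-8192",
        # Groq exposes an OpenAI-compatible API at this base URL.
        base_url="https://api.groq.com/openai/v1",
        api_key="YOUR_GROQ_API_KEY",  # read from an env var in real code
    ),
)

model = provider.get_model("llama3-8b-8192")
```

With a real `ChatOpenAI` in place of `ChatModel`, the same registry shape lets the rest of the application ask the provider for a model by name without knowing which backend serves it.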