# langchain-google-genai: 2.1.7
LangChain Google Generative AI Integration
This module integrates Google’s Generative AI models, specifically the Gemini series, with the LangChain framework. It provides classes for interacting with chat models and generating embeddings, leveraging Google’s advanced AI capabilities.
## Chat Models

The `ChatGoogleGenerativeAI` class is the primary interface for interacting with Google’s Gemini chat models. It allows users to send and receive messages with a specified Gemini model, making it suitable for a wide range of conversational AI applications.
## LLMs

The `GoogleGenerativeAI` class is the primary interface for interacting with Google’s Gemini LLMs. It allows users to generate text with a specified Gemini model.
## Embeddings

The `GoogleGenerativeAIEmbeddings` class generates embeddings using Google’s models. These embeddings can be used for a range of NLP tasks, including semantic analysis and similarity comparisons.
## Installation

To install the package, use pip:
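```shell
pip install langchain-google-genai
```

The package is published on PyPI under the same name as this module.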
## Using Chat Models
After setting up your environment with the required API key, you can interact with the Google Gemini models.
## Using LLMs
The package also supports generating text with Google’s models.
## Embedding Generation
The package also supports creating embeddings with Google’s models, useful for textual similarity and other NLP applications.
## chat_models

| Class | Description |
| --- | --- |
| `ChatGoogleGenerativeAI` | Google AI chat models integration. |
| `ChatGoogleGenerativeAIError` | Custom exception class for errors associated with the Google GenAI API. |
## embeddings

| Class | Description |
| --- | --- |
| `GoogleGenerativeAIEmbeddings` | Google Generative AI Embeddings. |
## genai_aqa

| Class | Description |
| --- | --- |
| `AqaInput` | Input to `GenAIAqa.invoke`. |
| `AqaOutput` | Output from `GenAIAqa.invoke`. |
| `GenAIAqa` | Google's Attributed Question and Answering service. |
## google_vector_store

| Class | Description |
| --- | --- |
| `GoogleVectorStore` | Google GenerativeAI Vector Store. |
| `ServerSideEmbedding` | Do-nothing embedding model where the embedding is done by the server. |
## llms

| Class | Description |
| --- | --- |
| `GoogleGenerativeAI` | Google GenerativeAI models. |