langchain-google-vertexai: 2.0.27
LangChain Google Vertex AI Integration
This module contains the LangChain integrations for the Vertex AI service: Google foundational models and third-party foundational models available on Vertex Model Garden.
Supported integrations
- Google's foundational models: Gemini family, Codey, embeddings - ChatVertexAI, VertexAI, VertexAIEmbeddings.
- Other Google foundational models: Imagen - VertexAIImageCaptioning, VertexAIImageCaptioningChat, VertexAIImageEditorChat, VertexAIImageGeneratorChat, VertexAIVisualQnAChat.
- Third-party foundational models available as an API (model-as-a-service) on Vertex Model Garden (Mistral, Llama, Anthropic) - model_garden.ChatAnthropicVertex, model_garden_maas.VertexModelGardenLlama, model_garden_maas.VertexModelGardenMistral.
- Third-party foundational models deployed on Vertex AI endpoints from Vertex Model Garden or HuggingFace - VertexAIModelGarden.
- Gemma deployed on Vertex AI endpoints or locally - GemmaChatLocalHF, GemmaChatLocalKaggle, GemmaChatVertexAIModelGarden, GemmaLocalHF, GemmaLocalKaggle, GemmaVertexAIModelGarden.
- Vector Search on Vertex AI - VectorSearchVectorStore, VectorSearchVectorStoreDatastore, VectorSearchVectorStoreGCS.
- Vertex AI evaluators for generative AI - VertexPairWiseStringEvaluator, VertexStringEvaluator.
See the documentation of each class for further details.
Installation
- You need to enable the required Google Cloud APIs (depending on the integration you're using) and set up credentials by either:
  - having credentials configured for your environment (gcloud, workload identity, etc.), or
  - storing the path to a service account JSON file as the GOOGLE_APPLICATION_CREDENTIALS environment variable.
This codebase uses the google.auth library, which first looks for the application credentials variable mentioned above and then looks for system-level auth.
More information: the google.auth documentation.
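As a quick sanity check, the sketch below resolves Application Default Credentials the same way this library does. It is only illustrative; the credential source and the printed project id depend on your environment.

```python
import google.auth

# google.auth.default() first honours GOOGLE_APPLICATION_CREDENTIALS,
# then falls back to system-level credentials (gcloud, workload identity, ...).
credentials, project_id = google.auth.default()
print("Authenticated for project:", project_id)
```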
API reference
callbacks
Classes
- Callback Handler that tracks VertexAI info.
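A minimal sketch of attaching the callback handler to a chat model. The handler class name (VertexAICallbackHandler), its counter attributes, and the model name are assumptions here; check the class documentation for your installed version.

```python
from langchain_google_vertexai import ChatVertexAI
from langchain_google_vertexai.callbacks import VertexAICallbackHandler

handler = VertexAICallbackHandler()
llm = ChatVertexAI(model_name="gemini-1.5-flash-002")  # placeholder model name

# The handler is passed per call and accumulates usage statistics across invocations.
llm.invoke("Say hello.", config={"callbacks": [handler]})
print(handler.prompt_tokens, handler.completion_tokens)  # attribute names assumed
```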
chains
Functions
- Create a runnable sequence that uses OpenAI functions.
- Get the appropriate function output parser given the user functions.
chat_models
Classes
- Google Cloud Vertex AI chat model integration.
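A minimal usage sketch of the chat integration (ChatVertexAI), assuming credentials are configured as described under Installation; the model name is a placeholder for any Gemini model enabled in your project.

```python
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-1.5-flash-002", temperature=0)  # placeholder model
response = llm.invoke("In one sentence, what is Vertex AI Model Garden?")
print(response.content)
```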
embeddings
Classes
- Google Cloud VertexAI embedding models.
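A short sketch of embedding a query with VertexAIEmbeddings; the embedding model name is a placeholder.

```python
from langchain_google_vertexai import VertexAIEmbeddings

embeddings = VertexAIEmbeddings(model_name="text-embedding-004")  # placeholder model
vector = embeddings.embed_query("What is Vertex AI Vector Search?")
print(len(vector))  # dimensionality of the returned embedding
```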
evaluators
Classes
- Evaluate the perplexity of a predicted string.
- Evaluate the perplexity of a predicted string.
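A hedged sketch of string evaluation with VertexStringEvaluator. The constructor arguments (metric name and project id) are assumptions and may differ by version; evaluate_strings is the standard LangChain string-evaluator interface.

```python
from langchain_google_vertexai import VertexStringEvaluator

# The metric name and project id below are illustrative assumptions.
evaluator = VertexStringEvaluator(metric="bleu", project_id="my-gcp-project")
result = evaluator.evaluate_strings(
    prediction="Paris is the capital of France.",
    reference="The capital of France is Paris.",
)
print(result)
```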
functions_utils
Classes
- Parse an output as a pydantic object.
gemma
Classes
- Needed for mypy typing to recognize model_name as a valid arg.
- Needed for mypy typing to recognize model_name as a valid arg.
- Local gemma model loaded from HuggingFace.
- Local gemma chat model loaded from Kaggle.
- Create a new model by parsing and validating input data from keyword arguments.
Functions
- Converts a list of messages to a chat prompt for Gemma.
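As an illustration, a sketch of calling a Gemma model deployed from Model Garden via GemmaChatVertexAIModelGarden. The endpoint id, project, and location are placeholders, and the constructor arguments are assumptions; verify them against the class documentation.

```python
from langchain_core.messages import HumanMessage
from langchain_google_vertexai.gemma import GemmaChatVertexAIModelGarden

# All identifiers below are placeholders for your own Gemma deployment.
llm = GemmaChatVertexAIModelGarden(
    endpoint_id="1234567890",
    project="my-gcp-project",
    location="us-central1",
)
print(llm.invoke([HumanMessage(content="What is Gemma?")]).content)
```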
llms
Classes
- Google Vertex AI large language models.
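A minimal sketch of the text-completion style VertexAI wrapper; the model name is a placeholder.

```python
from langchain_google_vertexai import VertexAI

llm = VertexAI(model_name="gemini-1.5-flash-002", max_output_tokens=128)  # placeholder model
print(llm.invoke("Write a one-line summary of Vertex AI."))
```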
model_garden
Classes
- Create a new model by parsing and validating input data from keyword arguments.
- Large language models served from Vertex AI Model Garden.
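A sketch of using ChatAnthropicVertex from this module. The project, location, and model name are placeholders (Claude models are only available in specific regions), and the exact parameter names may differ by version.

```python
from langchain_google_vertexai.model_garden import ChatAnthropicVertex

# Project, location, and model name are placeholders.
llm = ChatAnthropicVertex(
    model_name="claude-3-5-sonnet@20240620",
    project="my-gcp-project",
    location="us-east5",
)
print(llm.invoke("Hello!").content)
```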
model_garden_maas
Classes
- Integration for Llama 3.1 on Google Cloud Vertex AI Model-as-a-Service.
- Create a new model by parsing and validating input data from keyword arguments.
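A rough sketch of calling a Llama model through Model-as-a-Service with VertexModelGardenLlama. The model identifier and constructor parameters are assumptions; check the class documentation before use.

```python
from langchain_google_vertexai.model_garden_maas import VertexModelGardenLlama

# Model id, project, and location are illustrative assumptions.
llm = VertexModelGardenLlama(
    model="meta/llama-3.1-405b-instruct-maas",
    project="my-gcp-project",
    location="us-central1",
)
print(llm.invoke("Hello!").content)
```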
utils
Functions
- Creates a cache for content in some model.
vectorstores
Classes
- Stores documents in Google Cloud DataStore.
- Abstract interface of a key, text storage for retrieving documents.
- Stores documents in Google Cloud Storage.
- VertexAI VectorStore that handles the search and indexing using Vector Search and stores the documents in Google Cloud Storage.
- VectorSearch with DataStore document storage.
- Alias of VectorSearchVectorStore for consistency with the rest of vector stores with different document storage backends.
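A sketch of wiring up VectorSearchVectorStore, assuming an existing Vector Search index and endpoint plus a staging GCS bucket; every identifier below is a placeholder for your own resources.

```python
from langchain_google_vertexai import VectorSearchVectorStore, VertexAIEmbeddings

# Placeholders: substitute your project, region, bucket, index id, and endpoint id.
vector_store = VectorSearchVectorStore.from_components(
    project_id="my-gcp-project",
    region="us-central1",
    gcs_bucket_name="my-staging-bucket",
    index_id="my-index-id",
    endpoint_id="my-endpoint-id",
    embedding=VertexAIEmbeddings(model_name="text-embedding-004"),
)
vector_store.add_texts(["LangChain integrates with Vertex AI Vector Search."])
print(vector_store.similarity_search("Vector Search", k=1))
```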
vision_models
Classes
- Implementation of the Image Captioning model as an LLM.
- Implementation of the Image Captioning model as a chat.
- Given an image and a prompt, edits the image.
- Generates an image from a prompt.
- Chat implementation of a visual QnA model.
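For example, a sketch of image generation with the chat-style wrapper VertexAIImageGeneratorChat; the exact shape of the returned content (typically a base64-encoded image) may vary by version.

```python
from langchain_core.messages import HumanMessage
from langchain_google_vertexai.vision_models import VertexAIImageGeneratorChat

generator = VertexAIImageGeneratorChat()
# The prompt is passed as a chat message; the reply carries the generated image.
response = generator.invoke(
    [HumanMessage(content=["a watercolor painting of a lighthouse"])]
)
print(type(response.content))
```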