This is a sample Apigee proxy that demonstrates Apigee's routing capabilities across different LLM providers. In this sample we use Google Vertex AI, Mistral, and HuggingFace as the LLM providers.
- Provision Apigee X
- Configure external access for API traffic to your Apigee X instance
- Enable the Vertex AI API in your project (a sketch of this step appears after this list)
- Create a HuggingFace account and generate an Access Token. When creating the token, choose 'Read' as the token type.
- Similarly, create a Mistral account and generate an API Key. A quick way to verify both credentials is sketched after this list.
- Make sure the following tools are available in your terminal's $PATH (Cloud Shell has these preconfigured); a quick check is sketched after this list
- gcloud SDK
- apigeecli
- unzip
- curl
- jq
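
Enabling Vertex AI usually means enabling its API in the Google Cloud project that hosts your Apigee instance. A minimal sketch, assuming your project ID is stored in a `PROJECT_ID` placeholder you set yourself:

```sh
# Placeholder: replace with your own Google Cloud project ID
export PROJECT_ID="your-project-id"

# Enable the Vertex AI API in the project
gcloud services enable aiplatform.googleapis.com --project "$PROJECT_ID"

# Confirm the API shows up in the list of enabled services
gcloud services list --enabled --project "$PROJECT_ID" | grep aiplatform
```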
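Once you have the HuggingFace Access Token and the Mistral API Key, you can sanity-check them by calling each provider's API directly. This is only a sketch: the environment variable names are placeholders, and the endpoints assume the providers' current public APIs.

```sh
# Placeholders: replace with the credentials you created above
export HF_TOKEN="hf_xxx"
export MISTRAL_API_KEY="xxx"

# HuggingFace: a valid token should return your account details, not an error
curl -s https://huggingface.co/api/whoami-v2 \
  -H "Authorization: Bearer $HF_TOKEN" | jq .

# Mistral: a valid key should be able to list the available models
curl -s https://api.mistral.ai/v1/models \
  -H "Authorization: Bearer $MISTRAL_API_KEY" | jq .
```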
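To confirm the listed tools are on your $PATH, a simple loop like the one below works in any POSIX shell; install anything it reports as missing before proceeding.

```sh
# Report any required tool that is missing from $PATH
for tool in gcloud apigeecli unzip curl jq; do
  command -v "$tool" >/dev/null 2>&1 || echo "Missing: $tool"
done
```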
Proceed to this notebook and follow the steps in the Setup and Testing sections.