# llm-routing

This is a sample Apigee proxy that demonstrates Apigee's ability to route requests across different LLM providers. In this sample we use Google Vertex AI, Mistral, and Hugging Face as the LLM providers.

*Architecture diagram: an Apigee proxy routing requests to Vertex AI, Mistral, and Hugging Face.*
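The routing idea can be sketched outside Apigee as a simple provider lookup: the proxy inspects which provider a request names and forwards it to the matching backend. The sketch below is illustrative only — the provider keys, endpoints, and model IDs are placeholders, not the sample proxy's actual configuration:

```python
# Minimal sketch of provider-based routing, analogous to what the Apigee
# proxy does: pick a backend target from the provider named in the request.
# All endpoints and model names here are hypothetical placeholders.

PROVIDERS = {
    "google": {
        "endpoint": "https://us-central1-aiplatform.googleapis.com/v1",
        "model": "gemini-pro",
    },
    "mistral": {
        "endpoint": "https://api.mistral.ai/v1",
        "model": "mistral-small-latest",
    },
    "huggingface": {
        "endpoint": "https://api-inference.huggingface.co/models",
        "model": "mistralai/Mistral-7B-Instruct-v0.2",
    },
}

def route(provider: str) -> str:
    """Return the backend target URL for a provider name."""
    try:
        cfg = PROVIDERS[provider]
    except KeyError:
        raise ValueError(f"unsupported provider: {provider}")
    return f"{cfg['endpoint']}/{cfg['model']}"

if __name__ == "__main__":
    for name in PROVIDERS:
        print(name, "->", route(name))
```

In the real proxy this dispatch happens in Apigee policies rather than application code, which keeps per-provider credentials and targets out of the client.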

## Prerequisites

  1. Provision Apigee X.
  2. Configure external access for API traffic to your Apigee X instance.
  3. Enable Vertex AI in your project.
  4. Create a Hugging Face account and generate an access token, choosing 'Read' as the token type.
  5. Similarly, create a Mistral account and generate an API key.
  6. Make sure the following tools are available in your terminal's $PATH (Cloud Shell has these preconfigured).
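A quick way to verify step 6 is to probe each tool with `command -v`. The tool list below is a placeholder, since the required tools are not enumerated here — replace it with the ones the notebook actually uses:

```shell
# Report whether each required CLI tool is resolvable on $PATH.
# The tool list is illustrative -- adjust it to the sample's real requirements.
for tool in sh curl; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```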

## Get started

Proceed to this notebook and follow the steps in the Setup and Testing sections.