LLM Providers
Production - OpenAI, Anthropic, Azure OpenAI, Vertex AI, Bedrock, vLLM
The AI Service supports six LLM providers, each with its own configuration and capabilities.
Provider Matrix
| Provider | Class | Default Model | Key Features |
|---|---|---|---|
| OpenAI | OpenAIProvider | gpt-4o | Function calling, JSON mode, vision |
| Anthropic | AnthropicProvider | claude-3-5-sonnet | Extended thinking, tool use |
| Azure OpenAI | AzureOpenAIProvider | gpt-4o | Enterprise compliance, VNET integration |
| Vertex AI | VertexAIProvider | gemini-1.5-pro | Google Cloud native, grounding |
| AWS Bedrock | BedrockProvider | claude-3-5-sonnet-v2 | AWS native, IAM integration |
| vLLM | VLLMProvider | Self-hosted | Zero cost, custom models, full control |
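A service that supports several providers typically dispatches on a provider name at request time. The sketch below is hypothetical (the registry, `ProviderSpec`, and `resolve` are not part of the service's documented API); it only encodes the matrix above as data:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProviderSpec:
    """Hypothetical record pairing a provider class name with its default model."""
    provider_class: str
    default_model: str

# Contents taken from the provider matrix above; the registry itself is an assumption.
PROVIDERS = {
    "openai": ProviderSpec("OpenAIProvider", "gpt-4o"),
    "anthropic": ProviderSpec("AnthropicProvider", "claude-3-5-sonnet"),
    "azure_openai": ProviderSpec("AzureOpenAIProvider", "gpt-4o"),
    "vertex_ai": ProviderSpec("VertexAIProvider", "gemini-1.5-pro"),
    "bedrock": ProviderSpec("BedrockProvider", "claude-3-5-sonnet-v2"),
    "vllm": ProviderSpec("VLLMProvider", "default"),
}

def resolve(name: str) -> ProviderSpec:
    """Look up a provider by name, failing loudly on an unknown name."""
    try:
        return PROVIDERS[name]
    except KeyError:
        raise ValueError(f"Unknown provider {name!r}; expected one of {sorted(PROVIDERS)}")
```

Keeping the mapping in one table-like structure means adding a seventh provider is a one-line change plus its configuration class.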
Provider Configuration
```python
# OpenAI
OpenAIProvider(
    api_key="sk-...",
    default_model="gpt-4o",
    organization="org-xxx",
)

# Anthropic with Extended Thinking
AnthropicProvider(
    api_key="sk-ant-...",
    default_model="claude-opus-4-5-20250120",
)

# Azure OpenAI
AzureOpenAIProvider(
    api_key="...",
    endpoint="https://myinstance.openai.azure.com/",
    api_version="2024-08-01-preview",
    deployment_name="gpt-4o",
)

# Vertex AI (Gemini)
VertexAIProvider(config=VertexAIConfig(
    project_id="my-project",
    location="us-central1",
    default_model="gemini-1.5-pro",
))

# AWS Bedrock
BedrockProvider(config=BedrockConfig(
    region="us-east-1",
    default_model="anthropic.claude-3-5-sonnet-20241022-v2:0",
))

# vLLM (Self-hosted)
VLLMProvider(
    base_url="http://vllm.matih-data-plane.svc.cluster.local:8000",
    default_model="default",
    cost_per_1k_tokens=0.0,
)
```