MATIH Platform is in active MVP development. Documentation reflects current implementation status.
12. AI Service
LLM Infrastructure
LLM Providers


Status: Production. Providers: OpenAI, Anthropic, Azure OpenAI, Vertex AI, Bedrock, vLLM

The AI Service supports six LLM providers, each with its own configuration and capabilities.


12.6.8.1 Provider Matrix

Provider     | Class               | Default Model        | Key Features
------------ | ------------------- | -------------------- | ---------------------------------------
OpenAI       | OpenAIProvider      | gpt-4o               | Function calling, JSON mode, vision
Anthropic    | AnthropicProvider   | claude-3-5-sonnet    | Extended thinking, tool use
Azure OpenAI | AzureOpenAIProvider | gpt-4o               | Enterprise compliance, VNET integration
Vertex AI    | VertexAIProvider    | gemini-1.5-pro       | Google Cloud native, grounding
AWS Bedrock  | BedrockProvider     | claude-3-5-sonnet-v2 | AWS native, IAM integration
vLLM         | VLLMProvider        | Self-hosted          | Zero cost, custom models, full control

Provider Configuration

# OpenAI
OpenAIProvider(
    api_key="sk-...",
    default_model="gpt-4o",
    organization="org-xxx",
)
 
# Anthropic with Extended Thinking
AnthropicProvider(
    api_key="sk-ant-...",
    default_model="claude-opus-4-5-20250120",
)
 
# Azure OpenAI
AzureOpenAIProvider(
    api_key="...",
    endpoint="https://myinstance.openai.azure.com/",
    api_version="2024-08-01-preview",
    deployment_name="gpt-4o",
)
 
# Vertex AI (Gemini)
VertexAIProvider(config=VertexAIConfig(
    project_id="my-project",
    location="us-central1",
    default_model="gemini-1.5-pro",
))
 
# AWS Bedrock
BedrockProvider(config=BedrockConfig(
    region="us-east-1",
    default_model="anthropic.claude-3-5-sonnet-20241022-v2:0",
))
 
# vLLM (Self-hosted)
VLLMProvider(
    base_url="http://vllm.matih-data-plane.svc.cluster.local:8000",
    default_model="default",
    cost_per_1k_tokens=0.0,
)