Integration Overview
The AI Service integrates with a broad set of internal platform services and external systems through event-driven, real-time, and request-response patterns. These integrations enable feedback loops, data lineage tracking, real-time streaming, personalization, and copilot capabilities that extend the core conversational analytics engine.
Integration Map
| Integration | Protocol | Direction | Purpose |
|---|---|---|---|
| Kafka | aiokafka | Bidirectional | Event streaming for feedback, agent thinking, learning signals |
| WebSocket | FastAPI WebSocket | Server to Client | Real-time streaming of agent responses and status updates |
| Feedback/RLHF | REST + Kafka | Inbound | User feedback collection and reinforcement learning signals |
| Data Lineage | REST + Kafka | Outbound | Query lineage tracking and column-level provenance |
| dbt | REST | Inbound | Semantic model import and metric definitions |
| Personalization | Internal | Bidirectional | User preference learning and response customization |
| AI Copilot | REST + WebSocket | Bidirectional | Contextual assistance and proactive suggestions |
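The WebSocket integration in the map above streams agent responses and status updates as discrete server-to-client messages. As a minimal sketch, the envelope below shows what one such JSON frame might look like; the `type` values and field names are assumptions for illustration, not the service's actual wire format:

```python
import json
import time
from dataclasses import asdict, dataclass, field


@dataclass
class WsMessage:
    """One frame of the server-to-client streaming protocol (hypothetical shape)."""

    type: str  # e.g. "status", "token", "heartbeat", "done" (assumed message types)
    payload: dict = field(default_factory=dict)
    ts: float = field(default_factory=time.time)

    def to_json(self) -> str:
        # WebSocket text frames carry the serialized envelope.
        return json.dumps(asdict(self))


# A status update followed by a streamed response chunk:
status = WsMessage(type="status", payload={"stage": "sql_generation"})
chunk = WsMessage(type="token", payload={"text": "SELECT "})
```

Typed envelopes like this let the client dispatch on `type` without parsing message bodies it does not understand, which keeps the protocol forward-compatible as new message kinds are added.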
Event-Driven Architecture
The AI Service uses Apache Kafka as the backbone for asynchronous event processing. Events flow through the following topics:
| Topic | Producer | Consumer | Schema |
|---|---|---|---|
| ai-feedback-events | AI Service | Feedback Pipeline | FeedbackEvent |
| ai-learning-signals | Feedback Pipeline | Model Tuning | LearningSignal |
| ai-insights | Analysis Agent | Notification Service | InsightEvent |
| agent-thinking-events | Orchestrator | Context Graph | ThinkingEvent |
| agent-decision-events | Router Agent | Analytics Store | DecisionEvent |
| agent-trace-events | All Agents | Trace Store | TraceEvent |
| agent-metrics-events | Quality Metrics | Monitoring | MetricsEvent |
| query-lineage-events | SQL Agent | Lineage Service | LineageEvent |
| audit-events | All (read-only) | Compliance | AuditEvent |
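As a sketch of the producer side of this flow, the snippet below serializes a hypothetical `FeedbackEvent` and shows how it might be published to `ai-feedback-events` with aiokafka. The event fields and helper names are assumptions, not the service's actual schema, and the broker call only runs when invoked against a reachable cluster:

```python
import json
from dataclasses import asdict, dataclass


@dataclass
class FeedbackEvent:
    """Hypothetical schema for the ai-feedback-events topic."""

    conversation_id: str
    message_id: str
    rating: int  # e.g. +1 / -1 for thumbs up/down
    comment: str = ""


def serialize(event: FeedbackEvent) -> bytes:
    # Kafka values are bytes; JSON keeps the payload human-readable.
    return json.dumps(asdict(event)).encode("utf-8")


async def publish_feedback(event: FeedbackEvent, bootstrap: str = "localhost:9092") -> None:
    # Imported lazily so this module still loads where aiokafka is absent.
    from aiokafka import AIOKafkaProducer

    producer = AIOKafkaProducer(bootstrap_servers=bootstrap)
    await producer.start()
    try:
        await producer.send_and_wait("ai-feedback-events", serialize(event))
    finally:
        await producer.stop()
```

Serializing through a dataclass keeps the wire format in one place, so the consumer side of the feedback pipeline can deserialize against the same definition.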
Configuration
All integration endpoints are configured through environment variables bound to Pydantic Settings:
```
# Kafka
KAFKA_BOOTSTRAP_SERVERS=localhost:9092
KAFKA_SECURITY_PROTOCOL=PLAINTEXT
KAFKA_CONSUMER_GROUP=ai-service

# WebSocket
WS_MAX_CONNECTIONS=100
WS_HEARTBEAT_INTERVAL=30

# External services
QUERY_ENGINE_URL=http://query-engine:8080
CATALOG_SERVICE_URL=http://catalog-service:8086
SEMANTIC_LAYER_URL=http://semantic-layer:8086
```
Graceful Degradation
All integrations follow a graceful degradation pattern. If an external dependency is unavailable, the AI Service continues operating with reduced functionality rather than failing entirely:
| Integration | Degraded Behavior |
|---|---|
| Kafka unavailable | Feedback events are buffered in memory with a bounded queue |
| WebSocket failure | Falls back to HTTP polling |
| Lineage service down | Lineage events are dropped silently |
| dbt unavailable | Uses cached semantic models |
| Personalization failure | Returns default non-personalized responses |
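The "Kafka unavailable" row above can be sketched as a bounded deque: when the broker is unreachable, events queue up to a fixed capacity and the oldest are discarded first. The class name and default capacity here are illustrative, not the service's actual implementation:

```python
from collections import deque


class BoundedEventBuffer:
    """In-memory fallback buffer for events that cannot reach Kafka.

    Oldest events are evicted first once capacity is reached, bounding
    memory use at the cost of losing the earliest unflushed events.
    """

    def __init__(self, capacity: int = 1000):
        self._events: deque = deque(maxlen=capacity)

    def offer(self, event: dict) -> None:
        # deque(maxlen=...) silently drops the oldest entry when full.
        self._events.append(event)

    def drain(self) -> list:
        # Called once the broker is reachable again: return everything
        # in arrival order and clear the buffer.
        out = list(self._events)
        self._events.clear()
        return out

    def __len__(self) -> int:
        return len(self._events)
```

Evicting oldest-first trades the earliest feedback for a hard memory ceiling, which is the right bias here: losing a few stale events degrades the learning signal slightly, while unbounded buffering could take the whole service down with it.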
Detailed Sections
Each integration is covered in depth in its own section:
| Section | What You Will Learn |
|---|---|
| Kafka Event Streaming | Topic configuration, producer/consumer patterns, schema management |
| WebSocket Communication | Connection lifecycle, message protocol, heartbeats |
| Feedback and RLHF | Feedback collection, learning signals, model improvement |
| Data Lineage | Query provenance, column-level tracking, lineage graph |
| dbt Integration | Semantic model import, metric definitions, materialization |
| Personalization Engine | User profiles, preference learning, adaptive responses |
| AI Copilot | Contextual suggestions, proactive assistance, copilot modes |