MATIH Platform is in active MVP development. Documentation reflects current implementation status.
8. Platform Services
Canary Deployments

The API Gateway supports canary deployments by splitting traffic between a stable version and a canary version of a service. This is implemented through Kong upstreams with weighted targets, allowing gradual rollout of new service versions with controllable traffic percentages.


Canary Configuration

CanaryConfig Properties

| Property     | Type   | Default  | Description                                        |
|--------------|--------|----------|----------------------------------------------------|
| stableHost   | String | required | Hostname of the stable service version             |
| stablePort   | int    | required | Port of the stable service                         |
| canaryHost   | String | required | Hostname of the canary service version             |
| canaryPort   | int    | required | Port of the canary service                         |
| canaryWeight | int    | 10       | Percentage of traffic routed to the canary (0-100) |
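As a sketch of the validation these properties imply (field names and the default come from the table above; the validation code itself is illustrative, not the platform's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class CanaryConfig:
    """Mirrors the CanaryConfig properties table; validation is illustrative."""
    stableHost: str
    stablePort: int
    canaryHost: str
    canaryPort: int
    canaryWeight: int = 10  # default per the table

    def __post_init__(self):
        # canaryWeight is a traffic percentage, so it must stay in 0-100
        if not (0 <= self.canaryWeight <= 100):
            raise ValueError("canaryWeight must be between 0 and 100")

cfg = CanaryConfig("ai-service-v1.tenant-acme", 8000,
                   "ai-service-v2.tenant-acme", 8000)
```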

Configure Canary Traffic Split

Endpoint: POST /api/v1/gateway/services/:serviceName/canary

curl -X POST http://localhost:8080/api/v1/gateway/services/ai-service/canary \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ${TOKEN}" \
  -d '{
    "stableHost": "ai-service-v1.tenant-acme",
    "stablePort": 8000,
    "canaryHost": "ai-service-v2.tenant-acme",
    "canaryPort": 8000,
    "canaryWeight": 10
  }'

This routes 90% of traffic to the stable version and 10% to the canary version.


How It Works

When configureCanaryTraffic is called, the GatewayManagementService performs these steps:

  1. Creates or updates a Kong upstream named <serviceName>-upstream with the round-robin algorithm
  2. Adds the stable target with weight 100 - canaryWeight (e.g., weight 90)
  3. Adds the canary target with weight canaryWeight (e.g., weight 10)
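The three steps above can be sketched as the Kong Admin API requests they translate into. `PUT /upstreams/{name}` and `POST /upstreams/{name}/targets` are standard Kong Admin endpoints, but the exact payloads GatewayManagementService sends are an assumption; the function builds request tuples instead of sending them so the mapping is easy to inspect:

```python
def canary_admin_calls(service_name: str, cfg: dict) -> list:
    """Build the Kong Admin API requests for a canary traffic split."""
    upstream = f"{service_name}-upstream"
    stable_weight = 100 - cfg["canaryWeight"]
    return [
        # 1. Create or update the upstream (PUT is idempotent in Kong's Admin API)
        ("PUT", f"/upstreams/{upstream}",
         {"name": upstream, "algorithm": "round-robin"}),
        # 2. Stable target receives the remaining share of traffic
        ("POST", f"/upstreams/{upstream}/targets",
         {"target": f"{cfg['stableHost']}:{cfg['stablePort']}",
          "weight": stable_weight}),
        # 3. Canary target receives canaryWeight
        ("POST", f"/upstreams/{upstream}/targets",
         {"target": f"{cfg['canaryHost']}:{cfg['canaryPort']}",
          "weight": cfg["canaryWeight"]}),
    ]

calls = canary_admin_calls("ai-service", {
    "stableHost": "ai-service-v1.tenant-acme", "stablePort": 8000,
    "canaryHost": "ai-service-v2.tenant-acme", "canaryPort": 8000,
    "canaryWeight": 10,
})
```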

Kong distributes traffic proportionally based on the target weights within the upstream.
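To see how target weights become a traffic split, here is a small simulation of weighted round-robin (the "smooth" variant popularized by nginx; Kong's internal balancer implementation may differ, but the proportional outcome is the same):

```python
from collections import Counter

def weighted_round_robin(targets, n):
    """Pick n targets from [(name, weight), ...] proportionally to weight.

    Smooth weighted round-robin: each pick boosts every target by its
    weight, selects the highest current score, then subtracts the total.
    """
    current = {name: 0 for name, _ in targets}
    total = sum(w for _, w in targets)
    picks = []
    for _ in range(n):
        for name, w in targets:
            current[name] += w
        best = max(current, key=current.get)
        current[best] -= total
        picks.append(best)
    return Counter(picks)

# A 90/10 split over 100 requests sends 90 to stable and 10 to the canary
counts = weighted_round_robin([("stable", 90), ("canary", 10)], 100)
```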

Traffic Flow

Client Request
      |
      v
  Kong Route --> Kong Service --> Kong Upstream (ai-service-upstream)
                                       |
                          +------------+------------+
                          |                         |
                   Stable Target            Canary Target
                   (weight: 90)             (weight: 10)
                   ai-service-v1:8000       ai-service-v2:8000

Progressive Rollout Strategy

A typical canary deployment follows a progressive weight increase:

| Stage | Canary Weight | Duration   | Action               |
|-------|---------------|------------|----------------------|
| 1     | 5%            | 15 minutes | Initial smoke test   |
| 2     | 10%           | 30 minutes | Monitor error rates  |
| 3     | 25%           | 1 hour     | Validate performance |
| 4     | 50%           | 2 hours    | Load testing         |
| 5     | 100%          | --         | Full promotion       |
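The stage table lends itself to automation. Below is a hedged sketch: `set_weight` and `error_rate` are hypothetical callbacks, not part of the platform API (in practice `set_weight` would POST to the canary endpoint and `error_rate` would query your monitoring system):

```python
STAGES = [5, 10, 25, 50, 100]  # canary weight per stage, from the table above

def progressive_rollout(set_weight, error_rate, threshold=0.01):
    """Advance through the canary stages, rolling back on a failed gate.

    set_weight(w): apply a canary weight (hypothetical callback).
    error_rate(): current canary error rate, 0.0-1.0 (hypothetical callback).
    """
    for weight in STAGES:
        set_weight(weight)
        # A real pipeline would wait out the stage's duration here
        # before evaluating the gate.
        if error_rate() > threshold:
            set_weight(0)  # rollback: all traffic back to stable
            return "rolled-back"
    return "promoted"
```

A healthy canary walks all five stages and ends at weight 100; a gate failure at any stage immediately resets the weight to 0.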

At each stage, update the canary weight:

curl -X POST http://localhost:8080/api/v1/gateway/services/ai-service/canary \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ${TOKEN}" \
  -d '{
    "stableHost": "ai-service-v1.tenant-acme",
    "stablePort": 8000,
    "canaryHost": "ai-service-v2.tenant-acme",
    "canaryPort": 8000,
    "canaryWeight": 25
  }'

Rollback

To roll back a canary deployment, set the canary weight to 0:

curl -X POST http://localhost:8080/api/v1/gateway/services/ai-service/canary \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ${TOKEN}" \
  -d '{
    "stableHost": "ai-service-v1.tenant-acme",
    "stablePort": 8000,
    "canaryHost": "ai-service-v2.tenant-acme",
    "canaryPort": 8000,
    "canaryWeight": 0
  }'

Canary deployments work at the Kong upstream level. The service route must point to the upstream (not directly to a target host) for traffic splitting to take effect.
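A quick way to sanity-check this precondition (illustrative; assumes a Kong service object fetched from the Admin API, where the `host` field names the upstream):

```python
def uses_upstream(kong_service: dict, service_name: str) -> bool:
    """True if the Kong service proxies to the canary upstream rather than
    directly to a single host, so weighted splitting can take effect."""
    return kong_service.get("host") == f"{service_name}-upstream"

# A service pointing straight at a host bypasses the upstream's weights:
direct = {"host": "ai-service-v1.tenant-acme", "port": 8000}
split = {"host": "ai-service-upstream", "port": 8000}
```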