MATIH Platform is in active MVP development. Documentation reflects current implementation status.
13. ML Service & MLOps
A/B Testing

The A/B testing framework enables statistical comparison of model variants with configurable traffic allocation, significance testing, and early stopping.


Test Configuration

from src.lifecycle.ab_testing import ABTestConfig, AllocationStrategy

config = ABTestConfig(
    # Traffic allocation
    allocation_strategy=AllocationStrategy.RANDOM,
    control_percentage=50.0,        # remaining traffic goes to the treatment variant
    # Statistical design
    confidence_level=0.95,
    minimum_detectable_effect=0.05,
    power=0.8,
    # Sample-size and duration bounds
    min_samples_per_variant=1000,
    max_samples_per_variant=100000,
    min_duration_hours=24,
    max_duration_days=30,
    # Early stopping
    early_stopping_enabled=True,
    early_stopping_threshold=0.01,
    # Metrics
    primary_metric="conversion_rate",
    guardrail_metrics=["latency_p99", "error_rate"],
)
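The `min_samples_per_variant` floor follows from standard power analysis over the configured confidence level, power, and minimum detectable effect. As a sketch (not the service's actual implementation), the required sample size per variant for a two-sided two-proportion z-test can be computed like this; the `baseline_rate` value is an assumed example input:

```python
import math
from statistics import NormalDist


def required_samples_per_variant(
    baseline_rate: float,
    mde: float,
    confidence_level: float = 0.95,
    power: float = 0.8,
) -> int:
    """Samples per variant for a two-sided two-proportion z-test.

    `mde` is the minimum detectable effect as an absolute difference
    in rates (e.g. 0.05 means detecting 10% -> 15%).
    """
    z_alpha = NormalDist().inv_cdf(1 - (1 - confidence_level) / 2)
    z_beta = NormalDist().inv_cdf(power)
    p1 = baseline_rate
    p2 = baseline_rate + mde
    # Sum of per-variant Bernoulli variances under the alternative.
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / mde ** 2)


# Example: 10% baseline conversion, detect a 5-point absolute lift.
n = required_samples_per_variant(0.10, 0.05)
```

Smaller detectable effects drive the required sample size up quadratically, which is why both a floor (`min_samples_per_variant`) and a ceiling (`max_samples_per_variant`) are configured.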

Allocation Strategies

| Strategy | Description |
|---|---|
| `random` | Random assignment per request |
| `sticky` | Consistent assignment by user hash |
| `mab` | Multi-armed bandit (adaptive) |
| `epsilon_greedy` | Explore/exploit with epsilon |
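Sticky allocation keeps a user in the same variant across requests by deriving the bucket from a stable hash rather than a random draw. A minimal sketch, assuming a hash over the test and user identifiers (the function name and signature are illustrative, not the service's API):

```python
import hashlib


def assign_variant(user_id: str, test_id: str, control_percentage: float = 50.0) -> str:
    """Deterministically map a user to 'control' or 'treatment'.

    The same (test_id, user_id) pair always lands in the same bucket,
    so assignment is consistent across requests without server-side state.
    """
    digest = hashlib.sha256(f"{test_id}:{user_id}".encode()).hexdigest()
    # First 8 hex chars -> uniform-ish bucket in [0, 100].
    bucket = int(digest[:8], 16) / 0xFFFFFFFF * 100
    return "control" if bucket < control_percentage else "treatment"
```

Keying the hash on the test ID as well as the user ID decorrelates assignments across concurrent experiments.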

Metric Types

| Type | Example | Statistical Test |
|---|---|---|
| continuous | Latency, revenue | Welch's t-test |
| binary | Conversion, click | Chi-square, proportion z-test |
| count | Events per session | Poisson test |
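For a binary metric like conversion rate, the proportion z-test listed above compares the two variants' observed rates against a pooled estimate. A self-contained sketch of that test (illustrative, not the service's internal code):

```python
from math import sqrt
from statistics import NormalDist


def proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value


# Example: control converts 100/1000, treatment 150/1000.
z, p = proportion_z_test(100, 1000, 150, 1000)
```

The resulting p-value is compared against `1 - confidence_level` for the final decision, or against the stricter `early_stopping_threshold` when evaluating the test before `min_duration_hours` has elapsed.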

Source Files

| File | Path |
|---|---|
| A/B Testing | `data-plane/ml-service/src/lifecycle/ab_testing.py` |
| Versioning A/B | `data-plane/ml-service/src/versioning/ab_testing.py` |