MATIH Platform is in active MVP development. Documentation reflects current implementation status.
9. Query Engine & SQL
Connectors

Trino Connectors

Trino connectors provide the interface between Trino and external data sources. Each connector implements data source-specific protocols for metadata discovery, predicate pushdown, and data retrieval. The MATIH platform includes a curated set of connectors for common data sources.


Built-In Connectors

| Connector | Source Type | Description |
|---|---|---|
| delta_lake | Object Storage | Reads Delta Lake tables from S3/MinIO |
| iceberg | Object Storage | Reads Apache Iceberg tables |
| hive | Object Storage | Reads Hive-compatible tables (Parquet, ORC, Avro) |
| postgresql | RDBMS | Connects to PostgreSQL databases |
| mysql | RDBMS | Connects to MySQL databases |
| sqlserver | RDBMS | Connects to SQL Server databases |
| mongodb | Document Store | Connects to MongoDB collections |
| elasticsearch | Search Engine | Queries Elasticsearch indices |
| kafka | Streaming | Reads from Kafka topics as tables |
| memory | In-Memory | Temporary in-memory tables |
| system | Internal | Trino system metadata |
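Once a catalog is configured, its tables are addressed with fully qualified `catalog.schema.table` names, which also allows joins across connectors in a single query. The catalog, schema, and table names below are hypothetical:

```sql
-- List all configured catalogs
SHOW CATALOGS;

-- Join a Delta Lake table with a PostgreSQL table in one query
-- (sales and crm are placeholder catalog names)
SELECT o.order_id, c.customer_name
FROM sales.analytics.orders AS o
JOIN crm.public.customers AS c
  ON o.customer_id = c.id;
```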

Connector Capabilities

| Connector | Predicate Pushdown | Projection Pushdown | Aggregation Pushdown | Write Support |
|---|---|---|---|---|
| delta_lake | Yes | Yes | Partial | Yes |
| iceberg | Yes | Yes | Partial | Yes |
| postgresql | Yes | Yes | Yes | Yes |
| mysql | Yes | Yes | Partial | Yes |
| mongodb | Partial | Yes | No | No |
| elasticsearch | Partial | Yes | Partial | No |
| kafka | No | Yes | No | No |
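Whether a given predicate is actually pushed down can be verified with `EXPLAIN`: when pushdown applies, the filter condition appears inside the table scan node rather than as a separate filter above it. The catalog and table names here are placeholders:

```sql
-- Inspect the plan to confirm the date predicate reaches the connector
EXPLAIN
SELECT *
FROM mydb.public.orders
WHERE order_date >= DATE '2024-01-01';
```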

Adding a Connector

New catalogs are added by creating a properties file in the Trino catalog directory. The connector.name property selects which built-in connector the catalog uses:

# mydb.properties
connector.name=postgresql
connection-url=jdbc:postgresql://host:5432/database
connection-user=${ENV:DB_USER}
connection-password=${ENV:DB_PASSWORD}

Credentials must be provided via Kubernetes secrets, not hardcoded in properties files.
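A minimal sketch of wiring such a secret into the Trino pod, assuming a Kubernetes Secret named trino-db-credentials with username and password keys (both names are placeholders):

```yaml
# Pod spec fragment: expose secret keys as the environment
# variables referenced by the catalog properties file.
env:
  - name: DB_USER
    valueFrom:
      secretKeyRef:
        name: trino-db-credentials
        key: username
  - name: DB_PASSWORD
    valueFrom:
      secretKeyRef:
        name: trino-db-credentials
        key: password
```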


Connector-Specific Configuration

Delta Lake

| Property | Description |
|---|---|
| hive.metastore.uri | Hive metastore Thrift URI |
| delta.metadata.cache-ttl | Metadata cache duration |
| hive.s3.endpoint | S3-compatible storage endpoint |
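A minimal Delta Lake catalog definition combining these properties might look as follows; the metastore and MinIO addresses are placeholders, and path-style access is typically needed for S3-compatible stores like MinIO:

```properties
# delta.properties
connector.name=delta_lake
hive.metastore.uri=thrift://metastore:9083
hive.s3.endpoint=http://minio:9000
hive.s3.path-style-access=true
delta.metadata.cache-ttl=5m
```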

PostgreSQL / MySQL

| Property | Description |
|---|---|
| connection-url | JDBC connection URL |
| connection-user | Database username (from secret) |
| connection-password | Database password (from secret) |

Kafka

| Property | Description |
|---|---|
| kafka.nodes | Kafka bootstrap servers |
| kafka.table-names | Topics to expose as tables |
| kafka.default-schema | Default schema name |
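Putting these together, a Kafka catalog exposing two topics as tables could look like the sketch below; the broker addresses and topic names are placeholders:

```properties
# kafka.properties
connector.name=kafka
kafka.nodes=kafka-0:9092,kafka-1:9092
kafka.table-names=events,clicks
kafka.default-schema=streaming
```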

Connector Health

The Query Engine monitors connector health through Trino's catalog listing API. Failed connectors are flagged in the platform status dashboard and excluded from query routing decisions.
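The catalog listing that this health check relies on can also be inspected directly through Trino's built-in system connector:

```sql
-- Catalogs currently registered with the coordinator
SELECT *
FROM system.metadata.catalogs;
```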