JSON Export
The Query Engine supports two JSON export formats: standard JSON arrays and JSON Lines (newline-delimited JSON).
JSON Array Export
Exports results as a single JSON array:
```bash
curl -X POST http://query-engine:8080/v1/queries/a1b2c3d4-e5f6-7890/export \
  -H "Content-Type: application/json" \
  -H "X-Tenant-ID: 550e8400-e29b-41d4-a716-446655440000" \
  -H "Authorization: Bearer $JWT_TOKEN" \
  -d '{"format": "JSON"}'
```

Output format:
```json
[
  {"customer_id": "C-001", "total_orders": 42, "revenue": 12500.00},
  {"customer_id": "C-002", "total_orders": 17, "revenue": 8200.50}
]
```

JSON Lines Export
Exports results as one JSON object per line, suitable for streaming processing:
```bash
curl -X POST http://query-engine:8080/v1/queries/a1b2c3d4-e5f6-7890/export \
  -H "Content-Type: application/json" \
  -H "X-Tenant-ID: 550e8400-e29b-41d4-a716-446655440000" \
  -H "Authorization: Bearer $JWT_TOKEN" \
  -d '{"format": "JSON_LINES"}'
```

Output format:
```json
{"customer_id":"C-001","total_orders":42,"revenue":12500.00}
{"customer_id":"C-002","total_orders":17,"revenue":8200.50}
```

Implementation
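Because every record is a complete JSON document on its own line, a JSON Lines export can be processed with ordinary line-oriented tools, no JSON parser required for splitting. A minimal sketch, assuming the export has been saved locally (the `results.jsonl` filename is illustrative):

```shell
# Create a sample JSON Lines file (stand-in for a downloaded export)
printf '%s\n' \
  '{"customer_id":"C-001","revenue":12500.00}' \
  '{"customer_id":"C-002","revenue":8200.50}' > results.jsonl

# Each line is an independent JSON document, so line tools just work:
wc -l < results.jsonl    # record count
head -n 1 results.jsonl  # first record, without reading the rest
```

This is the property that makes the format suitable for streaming: a consumer can start processing before the export finishes downloading.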
Both formats are written using chunked processing with the Jackson ObjectMapper:
```java
private long writeJsonExport(UUID executionId, Path path, boolean lineDelimited)
        throws IOException {
    try (BufferedWriter writer = Files.newBufferedWriter(path, StandardCharsets.UTF_8)) {
        if (!lineDelimited) writer.write("[");
        long rowCount = 0;
        int offset = 0;
        boolean first = true;
        // Fetch results in pages of chunkSize rows so the full result set
        // is never held in memory at once.
        while (offset < maxExportRows) {
            var page = resultStorageService.fetchPage(executionId, offset, chunkSize);
            if (page.getRows().isEmpty()) break;
            for (Map<String, Object> row : page.getRows()) {
                if (lineDelimited) {
                    // JSON Lines: one serialized object per line.
                    writer.write(objectMapper.writeValueAsString(row));
                    writer.newLine();
                } else {
                    // JSON array: comma before every element except the first.
                    if (!first) writer.write(",");
                    writer.newLine();
                    writer.write(objectMapper.writeValueAsString(row));
                    first = false;
                }
                rowCount++;
            }
            offset += page.getRows().size();
            if (!page.isHasMore()) break;
        }
        if (!lineDelimited) { writer.newLine(); writer.write("]"); }
        return rowCount;
    }
}
```

When to Use Each Format
| Format | Use Case |
|---|---|
| JSON | Small results, API consumption, frontend display |
| JSON Lines | Large datasets, streaming pipelines, log ingestion, Spark input |
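One consequence of the table above: a JSON Lines export can be partitioned by line count for parallel processing without ever breaking a record in half, which a single JSON array cannot. A sketch using standard Unix `split` (the `export.jsonl` and `part_` names are illustrative):

```shell
# Sample JSON Lines export
printf '%s\n' '{"id":1}' '{"id":2}' '{"id":3}' '{"id":4}' > export.jsonl

# Partition into 2-record shards; each shard is itself valid JSON Lines
split -l 2 export.jsonl part_
wc -l part_aa part_ab
```

Each resulting shard can be handed to a separate worker (or Spark partition) independently.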