
Data Platform

Everything you need to build, monitor, and ship analytics across all environments.

4 Environments · 50+ dbt Models · 3 AWS Regions · 8 DAGs

Quick Access

Jump directly into any platform tool.

Environments

Four ClickHouse Cloud clusters spanning two AWS regions.

Staging · Single Source
Region: us-east-1
Host:

Prod 1 (US) · Single Source
Region: us-east-1
Host:

Prod 2 (EU) · Single Source
Region: eu-west-1
Host:

Prod Analytics · Federated
Region: us-east-1
Host:

Architecture

Medallion architecture: from raw events to business-ready metrics.

Sources
- Kafka (MSK): event streams
- Kafka Connect: Iceberg sink
- S3 Iceberg: Parquet + metadata

Bronze
- icebergS3() views: query-time, no copy
- Materialized tables: local copies

Silver
- Staging models: ReplacingMergeTree
- Intermediate models: business logic

Gold
- Marts: KPIs, aggregations

Federation
- remoteSecure()

Consumers
- Hex: BI & dashboards
- In-app: customer-facing
- Cube.js: SQL API
- Elementary: observability
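The layers above map onto a handful of ClickHouse primitives. A hedged sketch of what each layer might look like in SQL — the bucket URL, credentials, hostnames, and column names below are placeholders, not the platform's real values:

```sql
-- Bronze: query the Iceberg table in S3 at read time, no local copy.
-- (Placeholder bucket and credentials.)
CREATE VIEW bronze.events_v AS
SELECT *
FROM icebergS3('https://example-bucket.s3.amazonaws.com/events/',
               '<aws_key_id>', '<aws_secret>');

-- Silver: staging table deduplicated by ReplacingMergeTree,
-- keeping the newest row per event_id by updated_at.
CREATE TABLE silver.stg_events
(
    event_id   String,
    updated_at DateTime,
    payload    String
)
ENGINE = ReplacingMergeTree(updated_at)
ORDER BY event_id;

-- Federation: Prod Analytics reads a gold mart from another
-- cluster over remoteSecure(). (Placeholder host and user.)
SELECT *
FROM remoteSecure('prod-host.example:9440', 'gold', 'mart_kpis',
                  'readonly_user', '<password>');
```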

Observability

Data quality, test results, and model health powered by Elementary.

Staging · US East
Prod 1 (US) · US East
Prod 2 (EU) · EU West

Getting Started

Up and running in three steps.

1. Clone & Install

Set up the dbt project and install dependencies.

git clone git@github.com:kustomer/kustomer-analytics.git
cd kustomer-analytics
python3 -m venv venv && source venv/bin/activate
pip install dbt-clickhouse==1.10.0
dbt deps

2. Local Development

Start local ClickHouse via Tilt and build models.

# In ~/Dev/kustomer/clouddev
tilt up

# Back in kustomer-analytics
cp profiles.yml.example ~/.dbt/profiles.yml
dbt build --target clickhouse_local
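For orientation before copying profiles.yml.example into place: a dbt-clickhouse profile for the local target typically looks roughly like this. The profile name, port, schema, and credentials here are guesses — the repo's example file is the authoritative source.

```yaml
# Hypothetical sketch of ~/.dbt/profiles.yml for local development.
kustomer_analytics:
  target: clickhouse_local
  outputs:
    clickhouse_local:
      type: clickhouse
      host: localhost
      port: 8123
      user: default
      password: ""
      schema: analytics
```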

3. Deploy

Open a branch and push; once merged to main, CI deploys staging automatically. Prod deploys require manual approval in CircleCI.

git checkout -b feat/my-model
git push -u origin HEAD
# CircleCI: build → staging auto-deploy
# Approve prod1 / prod2 in CircleCI UI
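The approval gate above is typically modeled in CircleCI as an approval-type job that downstream deploy jobs depend on. A hypothetical sketch of the workflow shape — job names are invented, not the repo's actual config:

```yaml
# Illustrative .circleci/config.yml workflow (job names are placeholders).
workflows:
  deploy:
    jobs:
      - build
      - deploy_staging:          # auto-deploys on main
          requires: [build]
          filters:
            branches:
              only: main
      - hold_prod:               # manual gate in the CircleCI UI
          type: approval
          requires: [deploy_staging]
      - deploy_prod1:
          requires: [hold_prod]
      - deploy_prod2:
          requires: [hold_prod]
```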