The open-source stack for ClickHouse's suite of agentic analytic tools — your chat, your models, your data.
Powered by ClickHouse, LibreChat, and Langfuse.
Learn more at clickhouse.ai
This project runs a fully self-hosted agentic analytics environment with Docker Compose. It connects a chat UI (LibreChat) to your data (ClickHouse) via MCP, with full LLM observability (Langfuse) — all in a single `docker compose up` command.
| Component | Purpose | Port |
|---|---|---|
| LibreChat | Modern Chat UI with multi-model / provider support (OpenAI, Anthropic, Google) | 3080 |
| ClickHouse MCP | MCP server that gives agents access to ClickHouse | 8000 |
| Langfuse | LLM observability — traces, evals, prompt management | 3000 |
| ClickHouse | World's fastest analytical database | 8123 |
| PostgreSQL | Transactional database for Langfuse | 5432 |
| MongoDB | Transactional database for LibreChat | 27017 |
| MinIO | S3-compatible object storage | 9090 |
| Redis | Caching and queue | 6379 |
| Meilisearch | Full-text search for LibreChat | 7700 |
| pgvector | Vector database for RAG | 5433 |
| RAG API | Retrieval-augmented generation service for LibreChat | 8001 |
- Docker and Docker Compose v2+
```shell
./scripts/prepare-demo.sh
```

This is the fastest way to get started with the Agentic Data Stack. It generates a `.env` file with random credentials for all services, then presents an interactive menu to optionally configure API keys for OpenAI, Anthropic, and/or Google. Any providers you skip remain as `user_provided`, letting users enter their own keys in the LibreChat UI.
You can also generate credentials separately and customize the initial administrator account credentials:

```shell
USER_EMAIL="you@example.com" USER_PASSWORD="supersecret" USER_NAME="YourName" ./scripts/generate-env.sh
```

Learn more about configuring your LibreChat instance at https://librechat.ai/docs.
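As a rough illustration of what "random credentials" means here, a generator like `generate-env.sh` can derive fixed-length secrets from `/dev/urandom`. This is a hypothetical sketch, not the actual script; the variable name `MONGO_PASSWORD` is only an example — see `.env.example` for the real field names:

```shell
# Hypothetical sketch of per-service credential generation (not the real script).
# Produces a 32-character hex secret from the kernel's entropy source.
gen_secret() {
  head -c 16 /dev/urandom | od -An -tx1 | tr -d ' \n'
}

MONGO_PASSWORD="$(gen_secret)"   # example variable name only
echo "MONGO_PASSWORD=${MONGO_PASSWORD}"
```

Each run yields a different value, so every deployment gets unique service passwords.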
Note: To use LibreChat's file search / RAG features, the RAG API needs a real API key for embeddings — `user_provided` won't work because the RAG API calls the embeddings endpoint directly. If `OPENAI_API_KEY` is set to `user_provided`, set `RAG_OPENAI_API_KEY` to a valid OpenAI key (it overrides `OPENAI_API_KEY` for RAG only). You can also switch embedding providers via `EMBEDDINGS_PROVIDER` (`openai`, `azure`, `huggingface`, `huggingfacetei`, `ollama`). See the RAG API docs for details.
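For example, to keep chat keys user-provided while giving the RAG API a dedicated key for embeddings, the relevant `.env` entries could look like this (the key value is a placeholder, not a real key):

```
OPENAI_API_KEY=user_provided
RAG_OPENAI_API_KEY=sk-...your-real-key...
EMBEDDINGS_PROVIDER=openai
```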
```shell
docker compose up -d
```

- LibreChat — http://localhost:3080
- Langfuse — http://localhost:3000
- MinIO Console — http://localhost:9091 (find credentials in `.env` under the `MINIO_ROOT_*` fields)
An admin user is created automatically on first startup using the credentials from your .env file.
LibreChat connects to ClickHouse through the MCP server, allowing AI agents to query and analyze your data. All LLM interactions are traced in Langfuse for observability, evaluation, and prompt management.
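That MCP connection is declared in `librechat.yaml`. The sketch below is an assumption about how such an entry might look — the server name, transport type, and URL here are illustrative, so check this repo's `librechat.yaml` for the actual values:

```yaml
# Hypothetical sketch — names, transport, and URL are assumptions.
mcpServers:
  clickhouse:
    type: streamable-http                  # assumed transport
    url: http://clickhouse-mcp:8000/mcp    # assumed in-network hostname and path
```

Inside the Compose network, LibreChat reaches the MCP server by its service hostname rather than `localhost`.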
| Script | Description |
|---|---|
| `scripts/prepare-demo.sh` | Generate `.env` and interactively configure API keys |
| `scripts/generate-env.sh` | Generate `.env` with random credentials |
| `scripts/reset-all.sh` | Stop all containers and wipe all data/volumes |
| `scripts/create-librechat-user.sh` | Manually create a LibreChat admin user |
| `scripts/init-librechat-user.sh` | Auto-init the admin user on container startup (used internally) |
- LibreChat — `librechat.yaml` configures endpoints, MCP servers, and agent capabilities
- Environment — `.env` holds all credentials and service configuration (see `.env.example` for reference)
- Docker — `docker-compose.yml` includes the three compose files:
  - `langfuse-compose.yml` — Langfuse, ClickHouse, PostgreSQL, Redis, MinIO
  - `clickhouse-mcp-compose.yml` — ClickHouse MCP server
  - `librechat-compose.yml` — LibreChat, MongoDB, Meilisearch, pgvector, RAG API
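One way a top-level compose file can pull in the three files is Compose's `include` element (available in Docker Compose v2.20+); this is a guess at the structure, not necessarily how this repo's `docker-compose.yml` is written:

```yaml
# Hypothetical sketch of a top-level docker-compose.yml using `include`.
include:
  - langfuse-compose.yml
  - clickhouse-mcp-compose.yml
  - librechat-compose.yml
```

With `include`, all services land in one project, so a single `docker compose up -d` starts the whole stack.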
To tear down all containers and delete all data:

```shell
./scripts/reset-all.sh
```

Then set up again and start fresh:

```shell
./scripts/prepare-demo.sh
docker compose up -d
```

- clickhouse.ai — Project homepage
- Documentation — Full setup guide for adding ClickHouse MCP to LibreChat
- ClickHouse MCP — MCP server for ClickHouse
- LibreChat — Chat UI
- Langfuse — LLM observability
