GETTING STARTED
CLI Quickstart
Install HatiData, run your first query locally, and push to the cloud in 5 minutes.
1. Install the CLI
The hati CLI ships with the HatiData Agent SDK. One install gets you both.
$ pip install hatidata-agent
# Verify the install
$ hati --version
hatidata-agent 0.4.0
2. Initialize Your Project
Run hati init in any directory. This creates a .hati/ folder with a local database and a config file.
$ hati init
Created .hati/local.duckdb
Created .hati/config.toml
Ready! Run 'hati query' to get started.
3. Query Locally
Run Snowflake-compatible SQL against your local HatiData instance. No cloud account required.
# Create a table
$ hati query "CREATE TABLE events (id INT, name VARCHAR, ts TIMESTAMP)"
# Insert data
$ hati query "INSERT INTO events VALUES (1, 'signup', NOW())"
# Query it back
$ hati query "SELECT * FROM events"
┌────┬────────┬─────────────────────┐
│ id │ name   │ ts                  │
├────┼────────┼─────────────────────┤
│  1 │ signup │ 2026-02-17 10:30:00 │
└────┴────────┴─────────────────────┘
4. Configure Cloud Credentials
Connect your local project to HatiData Cloud. You can find your API key and org ID in the Dashboard.
$ hati config set api_key hd_live_your_key_here
$ hati config set org_id org_your_org_id
5. Push to Cloud
Sync your local tables to HatiData Cloud with a single command. Your data stays in your VPC; only metadata leaves.
$ hati push --target cloud
Pushing events... done (1 row, 0.3s)
Sync complete. 1 table pushed to cloud.
Next Steps
- MCP Quickstart — Connect your AI agent via the Model Context Protocol
- Agent Memory — Give your agents persistent memory across sessions
- LangChain Integration — Use HatiData as a LangChain memory backend and vector store
- CrewAI Integration — Add HatiData tools to your CrewAI workflows