Give Your Agent a Brain
Connect Claude, Cursor, or any MCP-compatible agent to HatiData. 23 tools across SQL, Memory, Chain-of-Thought, Triggers, and Branches.
60-Second Setup
Three steps. No Docker, no cloud account required.
Step 1: Install the MCP server

```bash
$ pip install hatidata-agent
```
Step 2: Add to your MCP config
Claude Desktop / Claude Code (~/.claude/mcp.json)
```json
{
  "mcpServers": {
    "hatidata": {
      "command": "hatidata-mcp-server",
      "args": ["--host", "localhost", "--port", "5439", "--agent-id", "my-agent", "--database", "hatidata"]
    }
  }
}
```

Cursor (.cursor/mcp.json)
```json
{
  "mcpServers": {
    "hatidata": {
      "command": "hatidata-mcp-server",
      "args": ["--host", "localhost", "--port", "5439", "--agent-id", "cursor-agent", "--database", "hatidata"]
    }
  }
}
```

Any MCP host (stdio transport)

```bash
$ hatidata-mcp-server --host localhost --port 5439 --agent-id my-agent --database hatidata
```
Step 3: Initialize a local warehouse

```bash
$ hati init   # creates .hati/ with local DuckDB
```
Your agent now has access to all 23 tools. Ask it to list tables or run `SELECT 1` to verify the connection; the sketch below shows the same check from a plain Python MCP client.
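If you'd rather verify outside a chat client, here is a minimal sketch using the official `mcp` Python SDK. The command and arguments mirror the config above; everything else is standard MCP client boilerplate.

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Same command/args as the MCP config above.
server = StdioServerParameters(
    command="hatidata-mcp-server",
    args=["--host", "localhost", "--port", "5439",
          "--agent-id", "my-agent", "--database", "hatidata"],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the available tools, then run a SELECT 1 smoke test.
            tools = await session.list_tools()
            print(f"{len(tools.tools)} tools available")
            result = await session.call_tool("query", arguments={"sql": "SELECT 1"})
            print(result.content)

asyncio.run(main())
```

The later sketches on this page reuse a connected `session` like this one rather than repeating the setup.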
Want hybrid SQL?
Standard SQL works immediately. For semantic search (semantic_match, JOIN_VECTOR), get a free cloud key (50 queries/day) and set HATIDATA_CLOUD_KEY in your environment, as sketched below.
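As a sketch of what that unlocks, here is a hybrid query issued through the `query` tool, assuming a connected `ClientSession` as in the verification sketch. The `semantic_match` syntax and its 0-to-1 threshold are illustrative, and the `tickets` table is hypothetical; consult the HatiData SQL reference for exact signatures.

```python
from mcp import ClientSession

async def find_billing_complaints(session: ClientSession) -> None:
    # Hybrid SQL: a relational projection plus a semantic predicate.
    # Requires HATIDATA_CLOUD_KEY; syntax below is illustrative.
    rows = await session.call_tool("query", arguments={
        "sql": (
            "SELECT id, subject FROM tickets "
            "WHERE semantic_match(subject, 'billing complaint') > 0.8"
        ),
    })
    print(rows.content)
```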
Cloud Setup ($29/mo)
Connect your agent to a managed HatiData Cloud instance. Requires an API key from your HatiData dashboard.
Claude Desktop / Claude Code
```json
{
  "mcpServers": {
    "hatidata": {
      "command": "hatidata-mcp-server",
      "args": [
        "--host", "your-org.proxy.hatidata.com",
        "--port", "5439",
        "--agent-id", "my-agent",
        "--database", "hatidata"
      ],
      "env": {
        "HATIDATA_API_KEY": "hd_live_your_api_key_here"
      }
    }
  }
}
```

Cursor
```json
{
  "mcpServers": {
    "hatidata": {
      "command": "hatidata-mcp-server",
      "args": [
        "--host", "your-org.proxy.hatidata.com",
        "--port", "5439",
        "--agent-id", "cursor-agent",
        "--database", "hatidata"
      ],
      "env": {
        "HATIDATA_API_KEY": "hd_live_your_api_key_here"
      }
    }
  }
}
```

Any MCP host (stdio transport)

```bash
$ HATIDATA_API_KEY=hd_live_... hatidata-mcp-server --host your-org.proxy.hatidata.com --port 5439 --agent-id my-agent --database hatidata
```
| | Local (Free) | Cloud ($29/mo) |
|---|---|---|
| Host | `localhost` | `your-org.proxy.hatidata.com` |
| API key required | No | Yes (`HATIDATA_API_KEY`) |
| Hybrid SQL | With free cloud key (50 queries/day) | Included (10K queries/day) |
| Data location | Your machine | Managed cloud |
| Setup | `hati init` | Sign up, get connection string |
23 Tools, 5 Categories
Every tool call passes through HatiData's full security pipeline: authentication, ABAC policy evaluation, audit logging, and metering.
SQL Tools
6 tools. Query the warehouse, explore schemas, and track usage. A usage sketch follows the table.
| Tool | Description | Parameters |
|---|---|---|
| `query` | Execute SQL and return JSON results | `sql` |
| `list_tables` | List all tables the agent can access | (none) |
| `describe_table` | Get column names, types, and nullability | `table_name` |
| `list_schemas` | List all schemas in the database | (none) |
| `read_query` | Execute a read-only SQL query and return results | `sql` |
| `get_usage_stats` | Get query usage and resource consumption stats | `agent_id?`, `since?` |
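A typical exploration flow, assuming a connected `ClientSession` as in the setup section (the `orders` table is hypothetical):

```python
from mcp import ClientSession

async def explore(session: ClientSession) -> None:
    # Discover what's visible, then inspect one table's columns.
    tables = await session.call_tool("list_tables")
    print(tables.content)
    schema = await session.call_tool("describe_table",
                                     arguments={"table_name": "orders"})
    print(schema.content)
    # read_query is the safe default for exploration: read-only by contract.
    rows = await session.call_tool("read_query",
                                   arguments={"sql": "SELECT COUNT(*) FROM orders"})
    print(rows.content)
```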
Memory Tools
5 tools. Store, search, and manage long-term agent memory with vector similarity. See the sketch after the table.
| Tool | Description | Parameters |
|---|---|---|
| `store_memory` | Store a memory with type, metadata, and importance | `content`, `memory_type?`, `metadata?`, `importance?` |
| `search_memory` | Semantic search across stored memories | `query`, `top_k?`, `memory_type?`, `min_importance?` |
| `get_agent_state` | Retrieve a named state value for the agent | `key` |
| `set_agent_state` | Set or update a named state value | `key`, `value` |
| `delete_memory` | Permanently delete a memory entry by ID | `memory_id` |
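A store-then-recall round trip, assuming a connected `ClientSession`; the content, type label, and importance value are illustrative:

```python
from mcp import ClientSession

async def remember(session: ClientSession) -> None:
    # Persist a fact worth keeping across sessions.
    await session.call_tool("store_memory", arguments={
        "content": "User prefers CSV exports over JSON",
        "memory_type": "preference",   # illustrative type label
        "importance": 0.8,
    })
    # Recall by meaning, not keyword: "export format" never appears above.
    hits = await session.call_tool("search_memory", arguments={
        "query": "what export format does the user like?",
        "top_k": 3,
    })
    print(hits.content)
```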
Chain-of-Thought Tools
3 tools. Record and replay agent reasoning traces with SHA-256 hash chains. A sketch follows the table.
| Tool | Description | Parameters |
|---|---|---|
| `log_reasoning_step` | Log a step (observation, thought, action, decision, etc.) | `session_id`, `step_type`, `content`, `metadata?`, `importance?` |
| `replay_decision` | Replay a full reasoning chain with hash verification | `session_id`, `verify_chain?` |
| `get_session_history` | List reasoning sessions with optional filtering | `limit?`, `agent_id?`, `since?` |
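Logging two steps and replaying the chain with hash verification, assuming a connected `ClientSession`; the session id and step contents are made up:

```python
from mcp import ClientSession

async def audit_trail(session: ClientSession) -> None:
    sid = "incident-042"  # hypothetical session id
    await session.call_tool("log_reasoning_step", arguments={
        "session_id": sid, "step_type": "observation",
        "content": "Error rate doubled after the 14:00 deploy",
    })
    await session.call_tool("log_reasoning_step", arguments={
        "session_id": sid, "step_type": "decision",
        "content": "Roll back to the previous release",
    })
    # Replay the full chain and check the SHA-256 hash links.
    replay = await session.call_tool("replay_decision", arguments={
        "session_id": sid, "verify_chain": True,
    })
    print(replay.content)
```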
Trigger Tools
4 tools. Semantic triggers that fire when content matches a concept above a threshold. An example follows the table.
| Tool | Description | Parameters |
|---|---|---|
| `register_trigger` | Register a trigger with concept, threshold, and action | `name`, `concept`, `action`, `threshold?`, `cooldown_seconds?` |
| `list_triggers` | List all registered triggers | `status?` |
| `delete_trigger` | Delete a trigger by ID | `trigger_id` |
| `test_trigger` | Test a trigger against sample content (dry run) | `trigger_id`, `content` |
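Registering a trigger and dry-running it against sample text, assuming a connected `ClientSession`. The action string and trigger id are placeholders; in practice the id would come from the `register_trigger` response.

```python
from mcp import ClientSession

async def guard_churn(session: ClientSession) -> None:
    await session.call_tool("register_trigger", arguments={
        "name": "churn-risk",
        "concept": "customer threatening to cancel",
        "action": "notify",            # placeholder action
        "threshold": 0.85,
        "cooldown_seconds": 3600,
    })
    # Dry run: does this message clear the threshold?
    result = await session.call_tool("test_trigger", arguments={
        "trigger_id": "trg_123",       # placeholder; read the real id from the response
        "content": "I'm done. Cancel my subscription today.",
    })
    print(result.content)
```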
Branch Tools
5 tools. Isolated data branches for speculative what-if analysis with zero-copy views. A full round trip is sketched after the table.
| Tool | Description | Parameters |
|---|---|---|
| `branch_create` | Create an isolated branch from main data | `name?`, `description?`, `ttl_seconds?` |
| `branch_query` | Execute SQL within a branch's isolated schema | `branch_id`, `sql` |
| `branch_merge` | Merge branch changes back to main | `branch_id`, `strategy?` |
| `branch_discard` | Discard a branch and clean up | `branch_id` |
| `branch_list` | List all active branches | `status?` |
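A what-if experiment end to end, assuming a connected `ClientSession`. The branch id is a placeholder (read the real one from the `branch_create` result), and the `products` table is hypothetical.

```python
from mcp import ClientSession

async def what_if(session: ClientSession) -> None:
    await session.call_tool("branch_create", arguments={
        "name": "price-test",
        "description": "What if prices rise 10%?",
        "ttl_seconds": 3600,
    })
    branch_id = "br_123"  # placeholder; use the id from the create result
    # Mutations are isolated to the branch; main data stays untouched.
    await session.call_tool("branch_query", arguments={
        "branch_id": branch_id,
        "sql": "UPDATE products SET price = price * 1.10",
    })
    avg_price = await session.call_tool("branch_query", arguments={
        "branch_id": branch_id,
        "sql": "SELECT AVG(price) FROM products",
    })
    print(avg_price.content)
    # Keep the change with branch_merge, or throw it away:
    await session.call_tool("branch_discard", arguments={"branch_id": branch_id})
```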
Works With
| Host | Config |
|---|---|
| Claude Desktop | `~/.claude/mcp.json` |
| Claude Code | `~/.claude/mcp.json` |
| Cursor | `.cursor/mcp.json` |
| Any MCP host | stdio transport |
Configuration
Pass these as environment variables in your MCP config or export them in your shell.
| Variable | Default | Description |
|---|---|---|
| `HATIDATA_HOST` | `localhost` | Proxy hostname (cloud: `your-org.proxy.hatidata.com`) |
| `HATIDATA_PORT` | `5439` | Proxy port |
| `HATIDATA_API_KEY` | (none) | Org API key for cloud auth (`hd_live_...`) |
| `HATIDATA_AGENT_ID` | `mcp-agent` | Agent identifier for audit and billing |
| `HATIDATA_CLOUD_KEY` | (none) | Cloud key for hybrid SQL transpilation |
| `HATIDATA_FRAMEWORK` | `mcp` | Framework tag (`langchain`, `crewai`, etc.) |
| `HATIDATA_PRIORITY` | `normal` | Scheduling priority: `low`, `normal`, `high`, `critical` |
Ready to connect?
Get started locally for free, or sign up for a cloud key to unlock hybrid SQL.