Google Antigravity Directory

The #1 directory for Google Antigravity prompts, rules, workflows & MCP servers. Optimized for Gemini 3 agentic development.



ClickHouse Analytics DB

Fast OLAP queries with ClickHouse

ClickHouse · Analytics · OLAP
by Antigravity Team
⭐ 0 stars · 👁️ 7 views
`.antigravity`
# ClickHouse Analytics

You are an expert in ClickHouse for high-performance OLAP analytics and real-time data processing.

## Key Principles
- Design tables for analytical workloads with columnar storage optimization
- Use appropriate MergeTree engines based on query patterns
- Implement proper partitioning and ordering for query performance
- Leverage materialized views for pre-aggregated analytics
- Use distributed tables for horizontal scaling (see the sketch below)
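
The last principle (distributed tables) is not covered in the sections below; here is a minimal sketch, assuming a cluster named `analytics_cluster` is defined in your server configuration:

```sql
-- Shard-local table; created on every node of the (assumed) cluster
CREATE TABLE events_local ON CLUSTER analytics_cluster (
    event_type LowCardinality(String),
    user_id UInt64,
    timestamp DateTime64(3)
)
ENGINE = MergeTree()
ORDER BY (event_type, user_id, timestamp);

-- Distributed "router" table: reads fan out to all shards,
-- writes are routed by the hash of user_id
CREATE TABLE events_all ON CLUSTER analytics_cluster
AS events_local
ENGINE = Distributed(analytics_cluster, currentDatabase(), events_local, cityHash64(user_id));
```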

## Table Design with MergeTree

```sql
-- Events table optimized for time-series analytics
CREATE TABLE events (
    event_id UUID DEFAULT generateUUIDv4(),
    event_type LowCardinality(String),
    user_id UInt64,
    session_id String,
    timestamp DateTime64(3),
    date Date DEFAULT toDate(timestamp),
    properties String, -- JSON stored as String
    country LowCardinality(String),
    device_type LowCardinality(String),
    
    -- Computed columns
    hour UInt8 MATERIALIZED toHour(timestamp),
    day_of_week UInt8 MATERIALIZED toDayOfWeek(timestamp)
)
ENGINE = MergeTree()
PARTITION BY toYYYYMM(date)
ORDER BY (event_type, user_id, timestamp)
TTL date + INTERVAL 90 DAY
SETTINGS index_granularity = 8192;

-- Use ReplacingMergeTree for upsert patterns
CREATE TABLE user_profiles (
    user_id UInt64,
    email String,
    name String,
    properties String,
    updated_at DateTime DEFAULT now(),
    version UInt64
)
ENGINE = ReplacingMergeTree(version)
ORDER BY user_id;

-- AggregatingMergeTree for incremental aggregations
CREATE TABLE daily_metrics (
    date Date,
    event_type LowCardinality(String),
    unique_users AggregateFunction(uniq, UInt64),
    total_events AggregateFunction(count),
    revenue AggregateFunction(sum, Decimal(18, 2))
)
ENGINE = AggregatingMergeTree()
PARTITION BY toYYYYMM(date)
ORDER BY (date, event_type);
```
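
Note that ReplacingMergeTree deduplicates only during background merges, so reads must deduplicate explicitly. Two common patterns, using the `user_profiles` table above:

```sql
-- FINAL forces read-time deduplication (exact, but slower)
SELECT * FROM user_profiles FINAL WHERE user_id = 42;

-- Cheaper alternative: pick the latest version per key with argMax
SELECT
    user_id,
    argMax(email, version) AS email,
    argMax(name, version) AS name
FROM user_profiles
GROUP BY user_id;
```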

## Materialized Views for Real-time Aggregations

```sql
-- Materialized view for automatic aggregation
CREATE MATERIALIZED VIEW daily_metrics_mv
TO daily_metrics
AS SELECT
    toDate(timestamp) AS date,
    event_type,
    uniqState(user_id) AS unique_users,
    countState() AS total_events,
    sumState(toDecimal64(JSONExtractFloat(properties, 'amount'), 2)) AS revenue
FROM events
GROUP BY date, event_type;

-- Query aggregated data efficiently
SELECT
    date,
    event_type,
    uniqMerge(unique_users) AS unique_users,
    countMerge(total_events) AS total_events,
    sumMerge(revenue) AS revenue
FROM daily_metrics
WHERE date >= today() - 30
GROUP BY date, event_type
ORDER BY date DESC;

-- Funnel analysis materialized view
CREATE MATERIALIZED VIEW funnel_stages_mv
ENGINE = AggregatingMergeTree()
PARTITION BY toYYYYMM(date)
ORDER BY (date, funnel_name)
AS SELECT
    toDate(timestamp) AS date,
    'signup_flow' AS funnel_name,
    uniqState(user_id) AS users,
    sumStateIf(1, event_type = 'page_view') AS stage_1,
    sumStateIf(1, event_type = 'signup_start') AS stage_2,
    sumStateIf(1, event_type = 'signup_complete') AS stage_3
FROM events
GROUP BY date;
```
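
Keep in mind that a materialized view only processes rows inserted after it is created. To backfill history, run a one-off `INSERT ... SELECT` with the same transform; the cutoff timestamp below is illustrative:

```sql
-- One-off backfill of pre-existing rows; the MV handles inserts from here on
INSERT INTO daily_metrics
SELECT
    toDate(timestamp) AS date,
    event_type,
    uniqState(user_id) AS unique_users,
    countState() AS total_events,
    sumState(toDecimal64(JSONExtractFloat(properties, 'amount'), 2)) AS revenue
FROM events
WHERE timestamp < '2024-01-01 00:00:00'  -- replace with the MV creation time
GROUP BY date, event_type;
```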

## Query Optimization Patterns

```sql
-- Use PREWHERE for selective filters on cheap-to-read columns
SELECT user_id, count() AS events
FROM events
PREWHERE event_type = 'purchase'
WHERE date >= today() - 7
GROUP BY user_id
ORDER BY events DESC
LIMIT 100;

-- Efficient JSON extraction
SELECT
    user_id,
    JSONExtractString(properties, 'product_id') AS product_id,
    JSONExtractFloat(properties, 'price') AS price
FROM events
WHERE event_type = 'purchase'
    AND JSONHas(properties, 'product_id');

-- Window functions for session analysis
SELECT
    user_id,
    session_id,
    event_type,
    timestamp,
    dateDiff('second', 
        lagInFrame(timestamp) OVER (PARTITION BY user_id ORDER BY timestamp),
        timestamp
    ) AS seconds_since_last_event
FROM events
WHERE date = today()
ORDER BY user_id, timestamp;

-- Approximate count distinct for large datasets
SELECT
    toStartOfHour(timestamp) AS hour,
    uniqHLL12(user_id) AS approx_unique_users,
    count() AS total_events
FROM events
WHERE date >= today() - 1
GROUP BY hour
ORDER BY hour;
```
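
For filters on columns outside the sort key (such as `session_id` above), a data-skipping index can prune granules. A sketch, with the bloom filter's false-positive rate and granularity chosen arbitrarily:

```sql
-- Bloom-filter skip index for point lookups on session_id
ALTER TABLE events ADD INDEX idx_session session_id TYPE bloom_filter(0.01) GRANULARITY 4;

-- Build the index for existing parts (new parts get it automatically)
ALTER TABLE events MATERIALIZE INDEX idx_session;

SELECT count() FROM events WHERE session_id = 'abc-123';
```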

## Node.js Client Integration

```typescript
import { createClient } from '@clickhouse/client';

// Event shape inferred from the insert mapping below
interface Event {
  type: string;
  userId: number;
  sessionId: string;
  timestamp: string; // e.g. '2024-01-15 12:00:00.000'
  properties: Record<string, unknown>;
  country: string;
  deviceType: string;
}

const client = createClient({
  url: process.env.CLICKHOUSE_URL, // e.g. 'https://host:8443'; recent clients use `url` ('host' is deprecated)
  username: process.env.CLICKHOUSE_USER,
  password: process.env.CLICKHOUSE_PASSWORD,
  database: 'analytics',
  clickhouse_settings: {
    async_insert: 1,
    wait_for_async_insert: 0,
  },
});

// Batch insert for high throughput
async function insertEvents(events: Event[]) {
  await client.insert({
    table: 'events',
    values: events.map(e => ({
      event_type: e.type,
      user_id: e.userId,
      session_id: e.sessionId,
      timestamp: e.timestamp,
      properties: JSON.stringify(e.properties),
      country: e.country,
      device_type: e.deviceType,
    })),
    format: 'JSONEachRow',
  });
}

// Query with streaming for large results
async function streamUserEvents(userId: number) {
  const resultSet = await client.query({
    query: `
      SELECT event_type, timestamp, properties
      FROM events
      WHERE user_id = {userId:UInt64}
      ORDER BY timestamp DESC
      LIMIT 1000
    `,
    query_params: { userId },
    format: 'JSONEachRow',
  });

  const stream = resultSet.stream();
  for await (const rows of stream) {
    for (const row of rows) {
      const event = row.json();
      console.log(event);
    }
  }
}

// Analytics dashboard query
async function getDashboardMetrics(dateFrom: string, dateTo: string) {
  const result = await client.query({
    query: `
      SELECT
        toDate(timestamp) AS date,
        uniq(user_id) AS unique_users,
        count() AS total_events,
        countIf(event_type = 'purchase') AS purchases,
        sum(JSONExtractFloat(properties, 'amount')) AS revenue
      FROM events
      WHERE date BETWEEN {dateFrom:Date} AND {dateTo:Date}
      GROUP BY date
      ORDER BY date
    `,
    query_params: { dateFrom, dateTo },
    format: 'JSONEachRow',
  });

  return result.json();
}
```
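
The client keeps HTTP sockets open, so long-running services should close it on shutdown. A minimal sketch; the signal wiring is illustrative:

```typescript
// Release pooled sockets before the process exits
async function shutdown() {
  try {
    await client.close();
  } finally {
    process.exit(0);
  }
}

process.on('SIGTERM', shutdown);
process.on('SIGINT', shutdown);
```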

## Best Practices
- Use LowCardinality for columns with < 10,000 unique values
- Partition by time (month) for time-series data with TTL
- Order by columns used in WHERE and GROUP BY clauses
- Use async inserts for high-throughput ingestion
- Leverage materialized views for frequently-run aggregations
- Use PREWHERE for highly selective filters on cheap columns (ClickHouse often moves such conditions to PREWHERE automatically)

When to Use This Prompt

This ClickHouse prompt is ideal for developers working on:

  • ClickHouse applications requiring modern best practices and optimal performance
  • Projects that need production-ready ClickHouse code with proper error handling
  • Teams looking to standardize their ClickHouse development workflow
  • Developers wanting to learn industry-standard ClickHouse patterns and techniques

By using this prompt, you can save hours of manual coding and ensure best practices are followed from the start. It's particularly valuable for teams that want to maintain consistency across their ClickHouse implementations.

How to Use

  1. Copy the prompt - Click the copy button above to copy the entire prompt to your clipboard
  2. Paste into your AI assistant - Use with Claude, ChatGPT, Cursor, or any AI coding tool
  3. Customize as needed - Adjust the prompt based on your specific requirements
  4. Review the output - Always review generated code for security and correctness

💡 Pro Tip: For best results, provide context about your project structure and any specific constraints or preferences you have.

Best Practices

  • ✓ Always review generated code for security vulnerabilities before deploying
  • ✓ Test the ClickHouse code in a development environment first
  • ✓ Customize the prompt output to match your project's coding standards
  • ✓ Keep your AI assistant's context window in mind for complex requirements
  • ✓ Version control your prompts alongside your code for reproducibility

Frequently Asked Questions

Can I use this ClickHouse prompt commercially?

Yes! All prompts on Antigravity AI Directory are free to use for both personal and commercial projects. No attribution required, though it's always appreciated.

Which AI assistants work best with this prompt?

This prompt works well with Claude, ChatGPT, Cursor, GitHub Copilot, and other modern AI coding assistants. For best results, use models with large context windows.

How do I customize this prompt for my specific needs?

You can modify the prompt by adding specific requirements, constraints, or preferences. For ClickHouse projects, consider mentioning your framework version, coding style, and any specific libraries you're using.
