
Database Optimization Guide

Optimize database performance with indexing, query tuning, and caching strategies

database · postgresql · optimization · indexing
by antigravity-team
⭐ 0 Stars
.antigravity
# Database Optimization Guide for Google Antigravity

Master database optimization techniques to build high-performance applications with Google Antigravity IDE.

## Query Optimization

```typescript
// lib/db/optimized-queries.ts
import { db } from "./client";
import { users, orders, orderItems, products } from "./schema";
import { eq, and, gte, lte, lt, sql, desc, asc } from "drizzle-orm";

// Avoid N+1 queries with proper joins
export async function getOrdersWithItems(userId: string) {
  // BAD: N+1 query pattern
  // const orders = await db.select().from(orders).where(eq(orders.userId, userId));
  // for (const order of orders) {
  //   order.items = await db.select().from(orderItems).where(eq(orderItems.orderId, order.id));
  // }
  
  // GOOD: Single query with join
  return db
    .select({
      order: orders,
      item: orderItems,
      product: products
    })
    .from(orders)
    .leftJoin(orderItems, eq(orders.id, orderItems.orderId))
    .leftJoin(products, eq(orderItems.productId, products.id))
    .where(eq(orders.userId, userId))
    .orderBy(desc(orders.createdAt));
}

// Use pagination for large datasets
export async function getProducts(options: {
  page: number;
  limit: number;
  category?: string;
  minPrice?: number;
  maxPrice?: number;
}) {
  const { page, limit, category, minPrice, maxPrice } = options;
  const offset = (page - 1) * limit;
  
  const conditions = [];
  if (category) conditions.push(eq(products.category, category));
  if (minPrice !== undefined) conditions.push(gte(products.price, minPrice));
  if (maxPrice !== undefined) conditions.push(lte(products.price, maxPrice));
  
  const [items, countResult] = await Promise.all([
    db.select()
      .from(products)
      .where(conditions.length ? and(...conditions) : undefined)
      .limit(limit)
      .offset(offset)
      .orderBy(asc(products.name)),
    db.select({ count: sql<number>`count(*)::int` }) // cast so the driver returns a number, not a bigint string
      .from(products)
      .where(conditions.length ? and(...conditions) : undefined)
  ]);
  
  return {
    items,
    pagination: {
      page,
      limit,
      total: countResult[0].count,
      totalPages: Math.ceil(countResult[0].count / limit)
    }
  };
}

// Use cursor-based pagination for real-time data
export async function getRecentOrders(cursor?: string, limit = 20) {
  // Fetch one extra row so we can tell whether another page exists
  const results = await db
    .select()
    .from(orders)
    .where(cursor ? lt(orders.createdAt, new Date(cursor)) : undefined)
    .orderBy(desc(orders.createdAt))
    .limit(limit + 1);
  
  const hasMore = results.length > limit;
  const items = hasMore ? results.slice(0, limit) : results;
  
  return {
    items,
    // lt() excludes the cursor row itself; add a tiebreaker column if createdAt is not unique
    nextCursor: hasMore ? items[items.length - 1].createdAt.toISOString() : null
  };
}
```

## Index Strategy

```sql
-- migrations/optimize_indexes.sql

-- Composite index for common query patterns
CREATE INDEX CONCURRENTLY idx_orders_user_created 
ON orders (user_id, created_at DESC);

-- Partial index for active records only
CREATE INDEX CONCURRENTLY idx_products_active 
ON products (category, price) 
WHERE status = 'active';

-- Expression index for case-insensitive search
CREATE INDEX CONCURRENTLY idx_users_email_lower 
ON users (LOWER(email));

-- GIN index for full-text search
CREATE INDEX CONCURRENTLY idx_products_search 
ON products USING GIN (to_tsvector('english', name || ' ' || description));

-- BRIN index for time-series data
CREATE INDEX CONCURRENTLY idx_events_timestamp 
ON events USING BRIN (created_at);

-- Analyze query plans
EXPLAIN (ANALYZE, BUFFERS, FORMAT JSON)
SELECT * FROM orders 
WHERE user_id = 'uuid-here' 
AND created_at > NOW() - INTERVAL '30 days'
ORDER BY created_at DESC 
LIMIT 10;
```

## Connection Pooling

```typescript
// lib/db/pool.ts
import { Pool } from "pg";
import { drizzle } from "drizzle-orm/node-postgres";
import * as schema from "./schema";

// Configure pool for optimal performance
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 20, // Maximum connections
  idleTimeoutMillis: 30000, // Close idle connections after 30s
  connectionTimeoutMillis: 2000, // Fail fast on connection
  maxUses: 7500, // Recycle connections periodically
});

// Monitor pool health
pool.on("error", (err) => {
  console.error("Unexpected pool error:", err);
});

pool.on("connect", () => {
  console.log("New client connected to pool");
});

export const db = drizzle(pool, { schema });

// Graceful shutdown
process.on("SIGTERM", async () => {
  await pool.end();
  process.exit(0);
});
```

## Query Caching

```typescript
// lib/db/cache.ts
import { Redis } from "ioredis";
import { eq } from "drizzle-orm";
import { db } from "./client";
import { products } from "./schema";

const redis = new Redis(process.env.REDIS_URL!);

interface CacheOptions {
  ttl?: number;
  tags?: string[];
}

export async function cachedQuery<T>(
  key: string,
  queryFn: () => Promise<T>,
  options: CacheOptions = {}
): Promise<T> {
  const { ttl = 300, tags = [] } = options;
  
  // Try cache first
  const cached = await redis.get(key);
  if (cached) {
    return JSON.parse(cached);
  }
  
  // Execute query
  const result = await queryFn();
  
  // Cache result
  await redis.setex(key, ttl, JSON.stringify(result));
  
  // Track tags for invalidation
  for (const tag of tags) {
    await redis.sadd(`tag:${tag}`, key);
  }
  
  return result;
}

export async function invalidateTag(tag: string): Promise<void> {
  const keys = await redis.smembers(`tag:${tag}`);
  if (keys.length > 0) {
    await redis.del(...keys);
    await redis.del(`tag:${tag}`);
  }
}

// Usage example
export async function getProductById(id: string) {
  return cachedQuery(
    `product:${id}`,
    () => db.query.products.findFirst({ where: eq(products.id, id) }),
    { ttl: 600, tags: ["products", `product:${id}`] }
  );
}
```

## Best Practices

1. **Use EXPLAIN ANALYZE** to understand query performance
2. **Create targeted indexes** for frequent query patterns
3. **Implement connection pooling** to reduce overhead
4. **Cache frequently accessed data** appropriately
5. **Use pagination** for large result sets
6. **Avoid `SELECT *`** in production queries
7. **Monitor slow queries** and optimize proactively (see the monitoring sketch below)
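
As a starting point for the last item, the sketch below reads PostgreSQL's `pg_stat_statements` view through Drizzle's `sql` helper to surface the statements with the highest mean execution time. This is a minimal sketch, assuming the `pg_stat_statements` extension is enabled on the server and reusing the `db` client from the pooling example; column names follow PostgreSQL 13+.

```typescript
// lib/db/slow-queries.ts
import { sql } from "drizzle-orm";
import { db } from "./client";

// Surface the slowest statements recorded by pg_stat_statements.
// Requires: CREATE EXTENSION IF NOT EXISTS pg_stat_statements;
export async function getSlowestQueries(limit = 10) {
  const result = await db.execute(sql`
    SELECT query,
           calls,
           round(mean_exec_time::numeric, 2)  AS mean_ms,
           round(total_exec_time::numeric, 2) AS total_ms,
           rows
    FROM pg_stat_statements
    ORDER BY mean_exec_time DESC
    LIMIT ${limit}
  `);

  return result.rows;
}
```

Running this periodically (or exposing it on an internal dashboard) gives you concrete candidates to feed back into the `EXPLAIN (ANALYZE, BUFFERS)` workflow from the index section.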

Google Antigravity provides intelligent query analysis and index recommendations for optimal database performance.

When to Use This Prompt

This database prompt is ideal for developers working on:

  • Database applications requiring modern best practices and optimal performance
  • Projects that need production-ready database code with proper error handling
  • Teams looking to standardize their database development workflow
  • Developers wanting to learn industry-standard database patterns and techniques

By using this prompt, you can save hours of manual coding and ensure best practices are followed from the start. It's particularly valuable for teams looking to maintain consistency across their database implementations.

How to Use

  1. Copy the prompt - Click the copy button above to copy the entire prompt to your clipboard
  2. Paste into your AI assistant - Use with Claude, ChatGPT, Cursor, or any AI coding tool
  3. Customize as needed - Adjust the prompt based on your specific requirements
  4. Review the output - Always review generated code for security and correctness

💡 Pro Tip: For best results, provide context about your project structure and any specific constraints or preferences you have.

Best Practices

  • ✓ Always review generated code for security vulnerabilities before deploying
  • ✓ Test the database code in a development environment first
  • ✓ Customize the prompt output to match your project's coding standards
  • ✓ Keep your AI assistant's context window in mind for complex requirements
  • ✓ Version control your prompts alongside your code for reproducibility

Frequently Asked Questions

Can I use this database prompt commercially?

Yes! All prompts on Antigravity AI Directory are free to use for both personal and commercial projects. No attribution required, though it's always appreciated.

Which AI assistants work best with this prompt?

This prompt works excellently with Claude, ChatGPT, Cursor, GitHub Copilot, and other modern AI coding assistants. For best results, use models with large context windows.

How do I customize this prompt for my specific needs?

You can modify the prompt by adding specific requirements, constraints, or preferences. For database projects, consider mentioning your framework version, coding style, and any specific libraries you're using.
