# Database Optimization Guide for Google Antigravity
Master database optimization techniques to build high-performance applications with Google Antigravity IDE.
## Query Optimization
```typescript
// lib/db/optimized-queries.ts
import { db } from "./client";
import { users, orders, orderItems, products } from "./schema";
import { eq, and, gte, lte, lt, sql, desc, asc } from "drizzle-orm";
// Avoid N+1 queries with proper joins
export async function getOrdersWithItems(userId: string) {
  // BAD: N+1 query pattern
  // const userOrders = await db.select().from(orders).where(eq(orders.userId, userId));
  // for (const order of userOrders) {
  //   order.items = await db.select().from(orderItems).where(eq(orderItems.orderId, order.id));
  // }

  // GOOD: Single query with join
  return db
    .select({
      order: orders,
      item: orderItems,
      product: products
    })
    .from(orders)
    .leftJoin(orderItems, eq(orders.id, orderItems.orderId))
    .leftJoin(products, eq(orderItems.productId, products.id))
    .where(eq(orders.userId, userId))
    .orderBy(desc(orders.createdAt));
}
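// The join above returns one flat row per (order, item) pair rather than nested
// objects. A small grouping helper (illustrative, not part of the original module)
// rebuilds the nested shape in memory:
export function groupOrderRows(rows: Awaited<ReturnType<typeof getOrdersWithItems>>) {
  const grouped = new Map<unknown, { order: (typeof rows)[number]["order"]; items: unknown[] }>();
  for (const row of rows) {
    const entry = grouped.get(row.order.id) ?? { order: row.order, items: [] };
    // leftJoin yields a null item for orders that have no line items
    if (row.item) entry.items.push({ item: row.item, product: row.product });
    grouped.set(row.order.id, entry);
  }
  return [...grouped.values()];
}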
// Use pagination for large datasets
export async function getProducts(options: {
  page: number;
  limit: number;
  category?: string;
  minPrice?: number;
  maxPrice?: number;
}) {
  const { page, limit, category, minPrice, maxPrice } = options;
  const offset = (page - 1) * limit;

  const conditions = [];
  if (category) conditions.push(eq(products.category, category));
  if (minPrice !== undefined) conditions.push(gte(products.price, minPrice));
  if (maxPrice !== undefined) conditions.push(lte(products.price, maxPrice));

  // Run the page query and the count query in parallel
  const [items, countResult] = await Promise.all([
    db.select()
      .from(products)
      .where(conditions.length ? and(...conditions) : undefined)
      .limit(limit)
      .offset(offset)
      .orderBy(asc(products.name)),
    // Cast to int so node-postgres returns a JS number rather than a bigint string
    db.select({ count: sql<number>`count(*)::int` })
      .from(products)
      .where(conditions.length ? and(...conditions) : undefined)
  ]);

  return {
    items,
    pagination: {
      page,
      limit,
      total: countResult[0].count,
      totalPages: Math.ceil(countResult[0].count / limit)
    }
  };
}
// Use cursor-based pagination for real-time data
export async function getRecentOrders(cursor?: string, limit = 20) {
  // Fetch one extra row so we can tell whether another page exists
  const results = await db
    .select()
    .from(orders)
    .where(cursor ? lt(orders.createdAt, new Date(cursor)) : undefined)
    .orderBy(desc(orders.createdAt))
    .limit(limit + 1);

  const hasMore = results.length > limit;
  const items = hasMore ? results.slice(0, limit) : results;

  return {
    items,
    // The strict less-than comparison above keeps the cursor row from repeating on the next page
    nextCursor: hasMore ? items[items.length - 1].createdAt.toISOString() : null
  };
}
```
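The cursor contract is easiest to see from the consuming side. The sketch below is a hypothetical caller (not part of the module above) that keeps requesting pages and feeds each `nextCursor` back into `getRecentOrders` until the helper returns `null`:

```typescript
// Hypothetical consumer of getRecentOrders; illustrates the cursor round-trip only.
import { getRecentOrders } from "./optimized-queries";

async function fetchAllRecentOrders() {
  const all: Awaited<ReturnType<typeof getRecentOrders>>["items"] = [];
  let cursor: string | undefined;

  do {
    const page = await getRecentOrders(cursor, 50);
    all.push(...page.items);
    // nextCursor is null on the last page, which ends the loop
    cursor = page.nextCursor ?? undefined;
  } while (cursor);

  return all;
}
```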
## Index Strategy
```sql
-- migrations/optimize_indexes.sql
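-- NOTE: CREATE INDEX CONCURRENTLY cannot run inside a transaction block,
-- so run these statements outside your migration tool's transaction.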
-- Composite index for common query patterns
CREATE INDEX CONCURRENTLY idx_orders_user_created
ON orders (user_id, created_at DESC);
-- Partial index for active records only
CREATE INDEX CONCURRENTLY idx_products_active
ON products (category, price)
WHERE status = 'active';
-- Expression index for case-insensitive search
CREATE INDEX CONCURRENTLY idx_users_email_lower
ON users (LOWER(email));
-- GIN index for full-text search
CREATE INDEX CONCURRENTLY idx_products_search
ON products USING GIN (to_tsvector('english', name || ' ' || description));
-- BRIN index for time-series data
CREATE INDEX CONCURRENTLY idx_events_timestamp
ON events USING BRIN (created_at);
-- Analyze query plans
EXPLAIN (ANALYZE, BUFFERS, FORMAT JSON)
SELECT * FROM orders
WHERE user_id = 'uuid-here'
AND created_at > NOW() - INTERVAL '30 days'
ORDER BY created_at DESC
LIMIT 10;
```
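To confirm the planner actually uses these indexes, the same `EXPLAIN` can be run from application code during development. The sketch below assumes the Drizzle node-postgres client exported as `db` from `./client`; the file name and function are illustrative:

```typescript
// lib/db/explain.ts (illustrative helper, not part of the migration above)
import { sql } from "drizzle-orm";
import { db } from "./client";

export async function explainRecentOrders(userId: string) {
  // db.execute runs raw SQL through the underlying pg pool; the parameter is bound safely.
  const result = await db.execute(sql`
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT * FROM orders
    WHERE user_id = ${userId}
      AND created_at > NOW() - INTERVAL '30 days'
    ORDER BY created_at DESC
    LIMIT 10
  `);

  // Each row is one line of the plan; an index scan on idx_orders_user_created is the goal.
  for (const row of result.rows) {
    console.log(row["QUERY PLAN"]);
  }
}
```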
## Connection Pooling
```typescript
// lib/db/pool.ts
import { Pool } from "pg";
import { drizzle } from "drizzle-orm/node-postgres";
import * as schema from "./schema";
// Configure pool for optimal performance
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 20,                       // Maximum connections
  idleTimeoutMillis: 30000,      // Close idle connections after 30s
  connectionTimeoutMillis: 2000, // Fail fast on connection
  maxUses: 7500,                 // Recycle connections periodically
});

// Monitor pool health
pool.on("error", (err) => {
  console.error("Unexpected pool error:", err);
});

pool.on("connect", () => {
  console.log("New client connected to pool");
});

export const db = drizzle(pool, { schema });

// Graceful shutdown
process.on("SIGTERM", async () => {
  await pool.end();
  process.exit(0);
});
```
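node-postgres also exposes live counters on the `Pool` instance, which makes a lightweight health check possible without extra instrumentation. The sketch below assumes `pool.ts` above additionally exports the `pool` instance (e.g. `export const pool = ...`); the file and function names are illustrative:

```typescript
// lib/db/pool-health.ts (illustrative; assumes `pool` is exported from ./pool)
import { pool } from "./pool";

export function getPoolStats() {
  return {
    total: pool.totalCount,     // clients created so far (idle + checked out)
    idle: pool.idleCount,       // clients currently idle in the pool
    waiting: pool.waitingCount, // callers queued because every client is busy
  };
}

// A sustained non-zero `waiting` count usually means `max` is too low
// or some queries are holding connections for too long.
```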
## Query Caching
```typescript
// lib/db/cache.ts
import { Redis } from "ioredis";
import { eq } from "drizzle-orm";
import { db } from "./client";
import { products } from "./schema";
const redis = new Redis(process.env.REDIS_URL!);
interface CacheOptions {
  ttl?: number;
  tags?: string[];
}

export async function cachedQuery<T>(
  key: string,
  queryFn: () => Promise<T>,
  options: CacheOptions = {}
): Promise<T> {
  const { ttl = 300, tags = [] } = options;

  // Try cache first
  const cached = await redis.get(key);
  if (cached) {
    return JSON.parse(cached);
  }

  // Execute query
  const result = await queryFn();

  // Cache result
  await redis.setex(key, ttl, JSON.stringify(result));

  // Track tags for invalidation
  for (const tag of tags) {
    await redis.sadd(`tag:${tag}`, key);
  }

  return result;
}

export async function invalidateTag(tag: string): Promise<void> {
  const keys = await redis.smembers(`tag:${tag}`);
  if (keys.length > 0) {
    await redis.del(...keys);
    await redis.del(`tag:${tag}`);
  }
}

// Usage example
export async function getProductById(id: string) {
  return cachedQuery(
    `product:${id}`,
    () => db.query.products.findFirst({ where: eq(products.id, id) }),
    { ttl: 600, tags: ["products", `product:${id}`] }
  );
}
```
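The tag set only pays off if writes invalidate it. Below is a minimal write-path sketch that continues `lib/db/cache.ts` above; the `updateProductPrice` mutation and its payload are illustrative, not part of the original module:

```typescript
// Continues lib/db/cache.ts above: invalidate on write so stale reads cannot outlive the change.
export async function updateProductPrice(id: string, price: number) {
  await db.update(products).set({ price }).where(eq(products.id, id));

  // Drop the per-product entry plus any list/aggregate caches tagged "products".
  await invalidateTag(`product:${id}`);
  await invalidateTag("products");
}
```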
## Best Practices
1. **Use EXPLAIN ANALYZE** to understand query performance
2. **Create targeted indexes** for frequent query patterns
3. **Implement connection pooling** to reduce overhead
5. **Cache frequently accessed data** with explicit TTLs and invalidation
5. **Use pagination** for large result sets
6. **Avoid `SELECT *`** in production queries
7. **Monitor slow queries** and optimize proactively (a timing-wrapper sketch follows this list)
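For point 7, here is a minimal sketch of slow-query monitoring at the application layer; the threshold, label, and use of `console.warn` are illustrative, and a real setup would feed a metrics pipeline instead:

```typescript
// Wrap any query function and log it when it exceeds a latency budget.
export async function timed<T>(
  label: string,
  queryFn: () => Promise<T>,
  thresholdMs = 200
): Promise<T> {
  const start = performance.now();
  try {
    return await queryFn();
  } finally {
    const elapsed = performance.now() - start;
    if (elapsed > thresholdMs) {
      console.warn(`[slow query] ${label} took ${elapsed.toFixed(1)}ms`);
    }
  }
}

// Usage: const rows = await timed("orders:list", () => getOrdersWithItems(userId));
```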
Google Antigravity provides intelligent query analysis and index recommendations for optimal database performance.