
Supabase Migrations Patterns

Manage database schema changes safely in Google Antigravity with Supabase migrations.

Tags: database · migrations · supabase · postgresql
by antigravity-team
.antigravity
# Supabase Migrations for Google Antigravity

Manage database schema changes safely with proper migration workflows.
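
These examples assume the standard Supabase CLI layout: timestamped SQL files under `supabase/migrations/`, applied in order with `supabase db push` (remote) or `supabase migration up` (local). A quick way to see what has already been applied is to query the CLI's migration history table (a minimal sketch; assumes the default `supabase_migrations.schema_migrations` table the CLI maintains):

```sql
-- Sketch: list migrations the Supabase CLI has recorded as applied.
SELECT version
FROM supabase_migrations.schema_migrations
ORDER BY version;
```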

## Create Table Migration

```sql
-- supabase/migrations/20240101_create_profiles.sql
CREATE TABLE IF NOT EXISTS public.profiles (
    id UUID PRIMARY KEY REFERENCES auth.users(id) ON DELETE CASCADE,
    email TEXT UNIQUE NOT NULL,
    full_name TEXT,
    avatar_url TEXT,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);

ALTER TABLE public.profiles ENABLE ROW LEVEL SECURITY;

CREATE POLICY "Users can view all profiles" ON public.profiles FOR SELECT USING (true);
CREATE POLICY "Users can update own profile" ON public.profiles FOR UPDATE USING (auth.uid() = id);

CREATE OR REPLACE FUNCTION update_updated_at() RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = NOW();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER profiles_updated_at BEFORE UPDATE ON public.profiles FOR EACH ROW EXECUTE FUNCTION update_updated_at();
```
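
If you want to confirm the table ended up with RLS enabled and the intended policies, the standard PostgreSQL catalogs are enough (a minimal verification sketch, not part of the migration itself):

```sql
-- Sketch: verify RLS is enabled and the policies exist.
SELECT relname, relrowsecurity
FROM pg_class
WHERE relnamespace = 'public'::regnamespace AND relname = 'profiles';

SELECT policyname, cmd
FROM pg_policies
WHERE schemaname = 'public' AND tablename = 'profiles';
```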

## Safe Column Addition

```sql
-- supabase/migrations/20240102_add_settings.sql
ALTER TABLE public.profiles ADD COLUMN IF NOT EXISTS settings JSONB DEFAULT '{}'::jsonb;
CREATE INDEX IF NOT EXISTS idx_profiles_settings ON public.profiles USING gin(settings);
```
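
If the new column should eventually be NOT NULL, a common safe pattern is to add it as nullable (or with a default, as above), backfill, and only then tighten the constraint in a follow-up migration. A minimal sketch, assuming the `settings` column from the previous migration and a hypothetical follow-up file name:

```sql
-- supabase/migrations/20240102_settings_not_null.sql (hypothetical follow-up)
-- Backfill any NULLs, then tighten the constraint as a separate step.
UPDATE public.profiles SET settings = '{}'::jsonb WHERE settings IS NULL;
ALTER TABLE public.profiles ALTER COLUMN settings SET NOT NULL;
```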

## Foreign Key Addition

```sql
-- supabase/migrations/20240103_add_posts.sql
CREATE TABLE IF NOT EXISTS public.posts (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    author_id UUID NOT NULL REFERENCES public.profiles(id) ON DELETE CASCADE,
    title TEXT NOT NULL,
    content TEXT,
    published BOOLEAN DEFAULT false,
    created_at TIMESTAMPTZ DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_posts_author ON public.posts(author_id);
CREATE INDEX IF NOT EXISTS idx_posts_published ON public.posts(published, created_at DESC);

ALTER TABLE public.posts ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Published posts viewable" ON public.posts FOR SELECT USING (published = true);
CREATE POLICY "Authors CRUD own posts" ON public.posts FOR ALL USING (auth.uid() = author_id);
```
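
If a foreign key has to be retrofitted onto a table that already holds data, adding it with `NOT VALID` and validating it separately keeps the expensive row check out of the brief exclusive lock. A sketch for the hypothetical case where `posts` already existed without the constraint; the constraint name is illustrative:

```sql
-- Sketch: add the FK without scanning existing rows (short lock only) ...
ALTER TABLE public.posts
    ADD CONSTRAINT posts_author_id_fkey
    FOREIGN KEY (author_id) REFERENCES public.profiles(id) ON DELETE CASCADE
    NOT VALID;

-- ... then validate existing rows under a weaker lock.
ALTER TABLE public.posts VALIDATE CONSTRAINT posts_author_id_fkey;
```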

## Data Migration

```sql
-- supabase/migrations/20240104_migrate_data.sql
-- Backfill in batches to keep locks short and avoid one huge transaction.
DO $$
DECLARE
    batch_size INT := 1000;
    rows_updated INT;
BEGIN
    LOOP
        UPDATE public.profiles
        SET settings = jsonb_set(COALESCE(settings, '{}'::jsonb), '{migrated}', 'true'::jsonb)
        WHERE id IN (
            SELECT id FROM public.profiles
            WHERE settings->'migrated' IS NULL
            LIMIT batch_size
        );
        GET DIAGNOSTICS rows_updated = ROW_COUNT;
        EXIT WHEN rows_updated = 0;
        -- COMMIT here needs PostgreSQL 11+ and no surrounding transaction.
        COMMIT;
        PERFORM pg_sleep(0.1);  -- brief pause between batches
    END LOOP;
END $$;
```
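
Note that `COMMIT` inside a `DO` block requires PostgreSQL 11+ and fails if the block runs inside an outer transaction, which is how migration tools (including, as far as I'm aware, the Supabase CLI) typically apply each file. A transaction-safe sketch that trades batching for a single set-based update:

```sql
-- Sketch: one set-based UPDATE, no explicit COMMIT, safe inside a
-- transactional migration. Keep the batched version (run outside a
-- wrapping transaction) for very large tables.
UPDATE public.profiles
SET settings = jsonb_set(COALESCE(settings, '{}'::jsonb), '{migrated}', 'true'::jsonb)
WHERE settings->'migrated' IS NULL;
```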

## Rollback Pattern

```sql
-- Forward migration
DO $$
BEGIN
    IF NOT EXISTS (SELECT 1 FROM information_schema.columns WHERE table_name = 'profiles' AND column_name = 'subscription_tier') THEN
        ALTER TABLE public.profiles ADD COLUMN subscription_tier TEXT DEFAULT 'free';
        ALTER TABLE public.profiles ADD CONSTRAINT valid_tier CHECK (subscription_tier IN ('free', 'pro', 'enterprise'));
    END IF;
END $$;

-- Rollback (keep in separate file)
-- ALTER TABLE public.profiles DROP CONSTRAINT IF EXISTS valid_tier;
-- ALTER TABLE public.profiles DROP COLUMN IF EXISTS subscription_tier;
```

## TypeScript Runner

```typescript
// scripts/run-migration.ts
import { createClient } from "@supabase/supabase-js";
import fs from "fs";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_SERVICE_ROLE_KEY!);

async function runMigration(file: string) {
    const sql = fs.readFileSync(`./supabase/migrations/${file}`, "utf8");
    console.log(`Running: ${file}`);
    // NOTE: "exec_sql" is not a built-in Supabase RPC; it must exist as a
    // Postgres function in your database (see the sketch below) and should be
    // callable only with the service-role key.
    const { error } = await supabase.rpc("exec_sql", { sql });
    if (error) {
        console.error("Failed:", error);
        process.exit(1);
    }
    console.log("Completed");
}

const file = process.argv[2];
if (!file) {
    console.error("Usage: run-migration <migration-file>");
    process.exit(1);
}
runMigration(file);
```
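
The runner calls an `exec_sql` RPC that Supabase does not provide out of the box; if you take this route, you have to define the function yourself and lock it down. A minimal sketch, assuming the standard `anon`/`authenticated` roles of a Supabase project (many teams skip this entirely and apply migrations via the Supabase CLI or a direct Postgres connection instead):

```sql
-- Sketch: custom helper the runner above calls via supabase.rpc("exec_sql", ...).
-- Deliberately blunt: it executes arbitrary SQL, so it must never be exposed
-- to client-facing roles.
CREATE OR REPLACE FUNCTION exec_sql(sql TEXT) RETURNS VOID AS $$
BEGIN
    EXECUTE sql;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

REVOKE ALL ON FUNCTION exec_sql(TEXT) FROM PUBLIC, anon, authenticated;
```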

## Best Practices

1. **Backup**: Take backups before production migrations
2. **Test First**: Run in staging before production
3. **Small Changes**: Make incremental changes
4. **Rollback Ready**: Have rollback plans
5. **Monitor**: Watch metrics during migrations (see the monitoring sketch below)
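
For point 5, one lightweight way to monitor a migration from a second session is to watch for long-running or blocked statements (a minimal sketch using standard PostgreSQL views):

```sql
-- Sketch: spot long-running or blocked statements while a migration runs.
SELECT pid,
       now() - query_start AS runtime,
       wait_event_type,
       pg_blocking_pids(pid) AS blocked_by,
       left(query, 80) AS query
FROM pg_stat_activity
WHERE state <> 'idle'
ORDER BY runtime DESC;
```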

When to Use This Prompt

This prompt is ideal for developers working on:

  • Supabase or PostgreSQL projects that need a repeatable, reviewable migration workflow
  • Applications that require production-ready schema changes with row level security and safe rollbacks
  • Teams looking to standardize how schema and data migrations are written and reviewed
  • Developers wanting to learn common PostgreSQL migration patterns and techniques

By using this prompt, you can save hours of manual coding and ensure best practices are followed from the start. It's particularly valuable for teams looking to maintain consistency across their database implementations.

How to Use

  1. Copy the prompt - Click the copy button above to copy the entire prompt to your clipboard
  2. Paste into your AI assistant - Use with Claude, ChatGPT, Cursor, or any AI coding tool
  3. Customize as needed - Adjust the prompt based on your specific requirements
  4. Review the output - Always review generated code for security and correctness

💡 Pro Tip: For best results, provide context about your project structure and any specific constraints or preferences you have.

Best Practices

  • ✓ Always review generated code for security vulnerabilities before deploying
  • ✓ Test the database code in a development environment first
  • ✓ Customize the prompt output to match your project's coding standards
  • ✓ Keep your AI assistant's context window in mind for complex requirements
  • ✓ Version control your prompts alongside your code for reproducibility

Frequently Asked Questions

Can I use this database prompt commercially?

Yes! All prompts on Antigravity AI Directory are free to use for both personal and commercial projects. No attribution required, though it's always appreciated.

Which AI assistants work best with this prompt?

This prompt works excellently with Claude, ChatGPT, Cursor, GitHub Copilot, and other modern AI coding assistants. For best results, use models with large context windows.

How do I customize this prompt for my specific needs?

You can modify the prompt by adding specific requirements, constraints, or preferences. For migration work, consider mentioning your PostgreSQL version, your Supabase setup, naming conventions, and any tooling you already use.
