# Supabase Migrations for Google Antigravity
Manage database schema changes safely with proper migration workflows.
## Create Table Migration
```sql
-- supabase/migrations/20240101_create_profiles.sql
CREATE TABLE IF NOT EXISTS public.profiles (
  id UUID PRIMARY KEY REFERENCES auth.users(id) ON DELETE CASCADE,
  email TEXT UNIQUE NOT NULL,
  full_name TEXT,
  avatar_url TEXT,
  created_at TIMESTAMPTZ DEFAULT NOW(),
  updated_at TIMESTAMPTZ DEFAULT NOW()
);

ALTER TABLE public.profiles ENABLE ROW LEVEL SECURITY;

CREATE POLICY "Users can view all profiles" ON public.profiles
  FOR SELECT USING (true);

CREATE POLICY "Users can update own profile" ON public.profiles
  FOR UPDATE USING (auth.uid() = id);

CREATE OR REPLACE FUNCTION update_updated_at() RETURNS TRIGGER AS $$
BEGIN
  NEW.updated_at = NOW();
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER profiles_updated_at
  BEFORE UPDATE ON public.profiles
  FOR EACH ROW EXECUTE FUNCTION update_updated_at();
```
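Migrations apply in lexicographic order of their filename prefix, which is why the timestamp format matters (the Supabase CLI generates full `YYYYMMDDHHMMSS` prefixes via `supabase migration new`). A minimal sketch of that naming convention; the `migrationFilename` helper is hypothetical, not part of any Supabase API:

```typescript
// Hypothetical helper: build a migration filename whose UTC timestamp
// prefix sorts lexicographically, so later migrations always run later.
function migrationFilename(name: string, date: Date = new Date()): string {
  const pad = (n: number) => String(n).padStart(2, "0");
  const stamp =
    date.getUTCFullYear().toString() +
    pad(date.getUTCMonth() + 1) +
    pad(date.getUTCDate()) +
    pad(date.getUTCHours()) +
    pad(date.getUTCMinutes()) +
    pad(date.getUTCSeconds());
  return `${stamp}_${name}.sql`;
}
```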
## Safe Column Addition
```sql
-- supabase/migrations/20240102_add_settings.sql
ALTER TABLE public.profiles ADD COLUMN IF NOT EXISTS settings JSONB DEFAULT '{}'::jsonb;
CREATE INDEX IF NOT EXISTS idx_profiles_settings ON public.profiles USING gin(settings);
```
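The `'{}'::jsonb` default plus later `jsonb_set` writes are easy to mirror on the client when preparing updates. A sketch of the equivalent of `jsonb_set(COALESCE(settings, '{}'::jsonb), '{key}', value)` for a single-level path; `setSetting` is an illustrative name, not a library function:

```typescript
type Settings = Record<string, unknown>;

// Client-side mirror of jsonb_set(COALESCE(settings, '{}'::jsonb), '{key}', value):
// treat a missing settings object as empty, then overwrite one key.
function setSetting(settings: Settings | null, key: string, value: unknown): Settings {
  return { ...(settings ?? {}), [key]: value };
}
```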
## Foreign Key Addition
```sql
-- supabase/migrations/20240103_add_posts.sql
CREATE TABLE IF NOT EXISTS public.posts (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  author_id UUID NOT NULL REFERENCES public.profiles(id) ON DELETE CASCADE,
  title TEXT NOT NULL,
  content TEXT,
  published BOOLEAN DEFAULT false,
  created_at TIMESTAMPTZ DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_posts_author ON public.posts(author_id);
CREATE INDEX IF NOT EXISTS idx_posts_published ON public.posts(published, created_at DESC);

ALTER TABLE public.posts ENABLE ROW LEVEL SECURITY;

CREATE POLICY "Published posts viewable" ON public.posts
  FOR SELECT USING (published = true);

CREATE POLICY "Authors CRUD own posts" ON public.posts
  FOR ALL USING (auth.uid() = author_id);
```
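Because permissive policies are ORed together, a row satisfies the SELECT check above when either policy passes: the post is published, or the caller is its author. A pure sketch of that combined predicate (the `canSelect` helper and `Post` shape are illustrative, not how Postgres evaluates RLS internally):

```typescript
interface Post {
  authorId: string;
  published: boolean;
}

// Combined effect of the two permissive SELECT policies on public.posts.
function canSelect(post: Post, uid: string | null): boolean {
  // "Published posts viewable": USING (published = true)
  if (post.published) return true;
  // "Authors CRUD own posts": USING (auth.uid() = author_id)
  return uid !== null && uid === post.authorId;
}
```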
## Data Migration
```sql
-- supabase/migrations/20240104_migrate_data.sql
-- NOTE: COMMIT inside a DO block is only allowed when the block runs
-- outside an explicit transaction (PostgreSQL 11+). The Supabase CLI wraps
-- each migration file in a transaction, so run batched backfills like this
-- via psql or a script rather than as a regular migration.
DO $$
DECLARE
  batch_size INT := 1000;
  rows_updated INT;
BEGIN
  LOOP
    UPDATE public.profiles
    SET settings = jsonb_set(COALESCE(settings, '{}'::jsonb), '{migrated}', 'true'::jsonb)
    WHERE id IN (
      SELECT id FROM public.profiles
      WHERE settings->'migrated' IS NULL
      LIMIT batch_size
    );
    GET DIAGNOSTICS rows_updated = ROW_COUNT;
    EXIT WHEN rows_updated = 0;
    COMMIT;              -- release locks between batches
    PERFORM pg_sleep(0.1);
  END LOOP;
END $$;
```
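The same batch loop can be driven from application code instead, which sidesteps the transaction-wrapping issue entirely. In this sketch, `updateBatch` is a hypothetical callback standing in for the SQL UPDATE above; it returns the number of rows it changed, and the loop exits when a batch touches nothing:

```typescript
// Generic batched-update loop: repeatedly apply a bounded update until a
// batch changes zero rows, pausing between batches like pg_sleep(0.1).
async function runInBatches(
  updateBatch: (limit: number) => Promise<number>,
  batchSize = 1000,
  pauseMs = 100
): Promise<number> {
  let total = 0;
  for (;;) {
    const updated = await updateBatch(batchSize);
    if (updated === 0) return total;
    total += updated;
    await new Promise((r) => setTimeout(r, pauseMs));
  }
}
```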
## Rollback Pattern
```sql
-- Forward migration
DO $$
BEGIN
  IF NOT EXISTS (
    SELECT 1 FROM information_schema.columns
    WHERE table_name = 'profiles' AND column_name = 'subscription_tier'
  ) THEN
    ALTER TABLE public.profiles ADD COLUMN subscription_tier TEXT DEFAULT 'free';
    ALTER TABLE public.profiles ADD CONSTRAINT valid_tier
      CHECK (subscription_tier IN ('free', 'pro', 'enterprise'));
  END IF;
END $$;

-- Rollback (keep in a separate file)
-- ALTER TABLE public.profiles DROP CONSTRAINT IF EXISTS valid_tier;
-- ALTER TABLE public.profiles DROP COLUMN IF EXISTS subscription_tier;
```
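A CHECK constraint like `valid_tier` is also worth mirroring in application code, so invalid values are rejected before they ever hit the database. A small illustrative sketch (the `isValidTier` helper is hypothetical):

```typescript
// Client-side mirror of the valid_tier CHECK constraint.
const VALID_TIERS = ["free", "pro", "enterprise"] as const;
type Tier = (typeof VALID_TIERS)[number];

function isValidTier(value: string): value is Tier {
  return (VALID_TIERS as readonly string[]).includes(value);
}
```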
## TypeScript Runner
```typescript
// scripts/run-migration.ts
// Usage: ts-node scripts/run-migration.ts 20240101_create_profiles.sql
// NOTE: "exec_sql" is not a built-in Supabase RPC; this assumes a custom
// Postgres function you have created yourself (restricted to the service
// role). For the standard workflow, prefer `supabase db push`.
import { createClient } from "@supabase/supabase-js";
import fs from "node:fs";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_SERVICE_ROLE_KEY!);

async function runMigration(file: string) {
  const sql = fs.readFileSync(`./supabase/migrations/${file}`, "utf8");
  console.log(`Running: ${file}`);
  const { error } = await supabase.rpc("exec_sql", { sql });
  if (error) {
    console.error("Failed:", error);
    process.exit(1);
  }
  console.log("Completed");
}

if (!process.argv[2]) {
  console.error("Missing migration filename");
  process.exit(1);
}
runMigration(process.argv[2]);
```
## Best Practices
1. **Backup**: Take backups before production migrations
2. **Test First**: Run in staging before production
3. **Small Changes**: Make incremental changes
4. **Rollback Ready**: Have rollback plans
5. **Monitor**: Watch metrics during migrations