# Trigger.dev Background Tasks for Google Antigravity
Build reliable background task infrastructure with Trigger.dev in your Google Antigravity IDE projects. This comprehensive guide covers task definitions, scheduling, webhooks, and long-running job patterns optimized for Gemini 3 agentic development.
## Trigger.dev Configuration
Set up Trigger.dev with proper TypeScript configuration:
```typescript
// trigger.config.ts
import { defineConfig } from '@trigger.dev/sdk/v3';
export default defineConfig({
  project: 'antigravity-ai',
  runtime: 'node',
  logLevel: 'log',
  retries: {
    enabledInDev: true,
    default: {
      maxAttempts: 3,
      minTimeoutInMs: 1000,
      maxTimeoutInMs: 10000,
      factor: 2,
      randomize: true,
    },
  },
  dirs: ['./src/trigger'],
});
```
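The retry settings above describe exponential backoff with jitter: each attempt waits roughly `minTimeoutInMs * factor^(attempt-1)`, capped at `maxTimeoutInMs`, with `randomize` spreading retries out to avoid thundering herds. A minimal sketch of that formula (this mirrors the common backoff scheme, not Trigger.dev's internal implementation):

```typescript
// Sketch of exponential backoff with optional jitter, using the
// minTimeoutInMs / maxTimeoutInMs / factor / randomize values above.
// Not Trigger.dev's internal code; shown to explain the config knobs.
interface BackoffOptions {
  minTimeoutInMs: number;
  maxTimeoutInMs: number;
  factor: number;
  randomize: boolean;
}

function backoffDelay(attempt: number, opts: BackoffOptions): number {
  // attempt is 1-based: the first retry waits ~minTimeoutInMs
  const raw = opts.minTimeoutInMs * Math.pow(opts.factor, attempt - 1);
  const capped = Math.min(raw, opts.maxTimeoutInMs);
  if (!opts.randomize) return capped;
  // Jitter: pick a random delay between the minimum and the computed value
  return opts.minTimeoutInMs + Math.random() * (capped - opts.minTimeoutInMs);
}
```

With the config above (min 1000 ms, max 10000 ms, factor 2), attempts wait roughly 1 s, 2 s, 4 s, … until the 10 s cap.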
## Task Definitions
Create type-safe background tasks:
```typescript
// src/trigger/tasks/process-upload.ts
import { task, logger } from '@trigger.dev/sdk/v3';
import { db } from '@/lib/db';
import { s3 } from '@/lib/s3';
import { imageProcessor } from '@/lib/image-processor';
export const processUpload = task({
  id: 'process-upload',
  maxDuration: 300, // 5 minutes
  retry: {
    maxAttempts: 3,
    minTimeoutInMs: 5000,
  },
  run: async (payload: {
    uploadId: string;
    userId: string;
    fileKey: string;
    fileType: string;
  }) => {
    const { uploadId, userId, fileKey, fileType } = payload;
    logger.info('Starting upload processing', { uploadId, fileType });

    // Update status to processing
    await db.upload.update({
      where: { id: uploadId },
      data: { status: 'processing' },
    });

    try {
      // Download file from S3
      const file = await s3.getObject({
        Bucket: process.env.S3_BUCKET!,
        Key: fileKey,
      });
      const buffer = await file.Body?.transformToByteArray();
      if (!buffer) throw new Error('Failed to download file');

      // Process based on file type
      let processedData;
      if (fileType.startsWith('image/')) {
        processedData = await processImage(buffer, uploadId);
      } else if (fileType === 'application/pdf') {
        processedData = await processPdf(buffer, uploadId);
      } else {
        processedData = await processGeneric(buffer, uploadId);
      }

      // Update with processed data
      await db.upload.update({
        where: { id: uploadId },
        data: {
          status: 'completed',
          processedAt: new Date(),
          metadata: processedData,
        },
      });

      logger.info('Upload processing completed', { uploadId });
      return { success: true, uploadId, processedData };
    } catch (error) {
      logger.error('Upload processing failed', { uploadId, error });
      await db.upload.update({
        where: { id: uploadId },
        data: {
          status: 'failed',
          errorMessage: error instanceof Error ? error.message : 'Unknown error',
        },
      });
      throw error; // Re-throw to trigger retry
    }
  },
});

async function processImage(buffer: Uint8Array, uploadId: string) {
  // Generate thumbnails
  const thumbnails = await imageProcessor.generateThumbnails(buffer, [
    { width: 150, height: 150, suffix: 'thumb' },
    { width: 400, height: 400, suffix: 'medium' },
    { width: 1200, height: 1200, suffix: 'large' },
  ]);

  // Upload thumbnails to S3
  for (const thumb of thumbnails) {
    await s3.putObject({
      Bucket: process.env.S3_BUCKET!,
      Key: `processed/${uploadId}/${thumb.suffix}.webp`,
      Body: thumb.buffer,
      ContentType: 'image/webp',
    });
  }

  // Extract metadata
  const metadata = await imageProcessor.extractMetadata(buffer);
  return {
    thumbnails: thumbnails.map((t) => t.suffix),
    width: metadata.width,
    height: metadata.height,
    format: metadata.format,
  };
}

// processPdf and processGeneric follow the same pattern as processImage
// (transform the buffer, upload results, return metadata); their
// implementations are elided here.
```
## Scheduled Tasks
Implement cron-based scheduled tasks:
```typescript
// src/trigger/tasks/scheduled-reports.ts
import { schedules, logger } from '@trigger.dev/sdk/v3';
import { db } from '@/lib/db';
import { emailService } from '@/lib/email-service';
import { analyticsService } from '@/lib/analytics';
export const dailyDigest = schedules.task({
  id: 'daily-digest',
  cron: '0 9 * * *', // Every day at 9 AM UTC
  run: async () => {
    logger.info('Starting daily digest generation');

    // Get all users with digest enabled
    const users = await db.user.findMany({
      where: {
        emailPreferences: { path: ['dailyDigest'], equals: true },
        subscriptionStatus: 'active',
      },
      select: {
        id: true,
        email: true,
        name: true,
      },
    });

    logger.info(`Generating digests for ${users.length} users`);

    const results = await Promise.allSettled(
      users.map(async (user) => {
        // Get personalized stats
        const stats = await analyticsService.getUserDailyStats(user.id);

        // Get trending prompts from the last 24 hours
        const trending = await db.prompt.findMany({
          where: { createdAt: { gte: new Date(Date.now() - 24 * 60 * 60 * 1000) } },
          orderBy: { viewCount: 'desc' },
          take: 5,
        });

        // Send digest email
        await emailService.sendDailyDigest(user.email, {
          userName: user.name,
          stats,
          trending,
        });

        return { userId: user.id, success: true };
      })
    );

    const successful = results.filter((r) => r.status === 'fulfilled').length;
    const failed = results.filter((r) => r.status === 'rejected').length;
    logger.info('Daily digest completed', { successful, failed });

    return { successful, failed, total: users.length };
  },
});

export const weeklyCleanup = schedules.task({
  id: 'weekly-cleanup',
  cron: '0 3 * * 0', // Every Sunday at 3 AM UTC
  run: async () => {
    logger.info('Starting weekly cleanup');

    // Clean up expired sessions
    const expiredSessions = await db.session.deleteMany({
      where: { expires: { lt: new Date() } },
    });

    // Clean up read notifications older than 30 days
    const oldNotifications = await db.notification.deleteMany({
      where: {
        read: true,
        createdAt: { lt: new Date(Date.now() - 30 * 24 * 60 * 60 * 1000) },
      },
    });

    // Clean up uploads stuck in 'pending' for more than 24 hours
    const orphanedUploads = await db.upload.deleteMany({
      where: {
        status: 'pending',
        createdAt: { lt: new Date(Date.now() - 24 * 60 * 60 * 1000) },
      },
    });

    logger.info('Weekly cleanup completed', {
      sessions: expiredSessions.count,
      notifications: oldNotifications.count,
      uploads: orphanedUploads.count,
    });

    return {
      deletedSessions: expiredSessions.count,
      deletedNotifications: oldNotifications.count,
      deletedUploads: orphanedUploads.count,
    };
  },
});
```
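The `dailyDigest` task above fans out one promise per user in a single `Promise.allSettled` call, which can overwhelm the email service on large user bases. A safer pattern is to process users in bounded batches. A minimal sketch of a generic batching helper (not part of the Trigger.dev SDK):

```typescript
// Split an array into consecutive batches of at most `size` items,
// so at most `size` digest emails are in flight at once.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Usage sketch inside the task body: await each batch sequentially.
// for (const batch of chunk(users, 25)) {
//   await Promise.allSettled(batch.map(sendDigestToUser));
// }
```

Batch size is a tuning knob: larger batches finish faster, smaller batches put less pressure on downstream services and rate limits.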
## Triggering Tasks from API Routes
Trigger background tasks from your application:
```typescript
// app/api/uploads/route.ts
import { NextRequest, NextResponse } from 'next/server';
import { tasks } from '@trigger.dev/sdk/v3';
import { getServerSession } from 'next-auth';
import { processUpload } from '@/trigger/tasks/process-upload';
import { authOptions } from '@/lib/auth'; // adjust to wherever your NextAuth options live
import { uploadToS3 } from '@/lib/s3';    // helper that streams the File to S3
import { db } from '@/lib/db';

export async function POST(request: NextRequest) {
  const session = await getServerSession(authOptions);
  if (!session?.user) {
    return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
  }

  const formData = await request.formData();
  const file = formData.get('file') as File;
  if (!file) {
    return NextResponse.json({ error: 'No file provided' }, { status: 400 });
  }

  // Create upload record
  const upload = await db.upload.create({
    data: {
      userId: session.user.id,
      fileName: file.name,
      fileType: file.type,
      fileSize: file.size,
      status: 'pending',
    },
  });

  // Upload to S3
  const fileKey = `uploads/${session.user.id}/${upload.id}/${file.name}`;
  await uploadToS3(file, fileKey);

  // Trigger background processing
  const handle = await tasks.trigger<typeof processUpload>('process-upload', {
    uploadId: upload.id,
    userId: session.user.id,
    fileKey,
    fileType: file.type,
  });

  return NextResponse.json({
    uploadId: upload.id,
    taskId: handle.id,
    status: 'processing',
  });
}
```
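One detail worth hardening in the route above: the S3 key interpolates the raw `file.name`, which is user-supplied and can contain path separators or characters that complicate S3 keys and downstream URLs. A sketch of a conservative sanitizer (the exact character set is a policy choice, not an S3 requirement):

```typescript
// Normalize a user-supplied file name before embedding it in an S3 key.
function sanitizeFileName(name: string): string {
  return name
    .replace(/[/\\]/g, '_')           // no path separators in the key
    .replace(/[^a-zA-Z0-9._-]/g, '_') // keep a conservative character set
    .replace(/_{2,}/g, '_')           // collapse runs of underscores
    .slice(0, 255);                   // stay within key-length limits
}

// Usage sketch in the route:
// const fileKey = `uploads/${session.user.id}/${upload.id}/${sanitizeFileName(file.name)}`;
```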
## Best Practices
1. **Use idempotency keys** for task deduplication
2. **Implement proper error handling** with meaningful error messages
3. **Set appropriate timeouts** for long-running tasks
4. **Use structured logging** for debugging and monitoring
5. **Break large tasks** into smaller subtasks
6. **Monitor task queues** for backlogs and failures
7. **Test tasks locally** before deploying to production
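For practice #1, a simple approach is to derive a deterministic key from the task ID and payload, so re-triggering with the same payload is deduplicated. The sketch below assumes your Trigger.dev version accepts an `idempotencyKey` option on trigger calls; check the SDK docs for the exact signature:

```typescript
import { createHash } from 'node:crypto';

// Derive a stable idempotency key from the task ID and payload.
// Identical payloads always produce the same key.
function idempotencyKeyFor(taskId: string, payload: unknown): string {
  const digest = createHash('sha256')
    .update(taskId)
    .update(JSON.stringify(payload))
    .digest('hex');
  return digest.slice(0, 32); // short, stable identifier
}

// Usage sketch:
// await tasks.trigger('process-upload', payload, {
//   idempotencyKey: idempotencyKeyFor('process-upload', payload),
// });
```

Note that `JSON.stringify` is order-sensitive: build the payload object with a consistent property order (or sort keys first) so equal payloads hash identically.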