# AWS S3 File Management Patterns
Build robust file management systems with AWS S3 in your Google Antigravity applications. This guide covers secure uploads, presigned URLs, CDN integration, and best practices.
## S3 Client Configuration
Set up the AWS S3 client with proper configuration:
```typescript
// lib/aws/s3-client.ts
import {
  S3Client,
  PutObjectCommand,
  GetObjectCommand,
  DeleteObjectCommand,
} from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3Client = new S3Client({
  region: process.env.AWS_REGION!,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
});

export const BUCKET_NAME = process.env.AWS_S3_BUCKET!;

export interface UploadOptions {
  contentType: string;
  maxSizeBytes?: number;
  folder?: string;
  metadata?: Record<string, string>;
}

export async function generatePresignedUploadUrl(
  key: string,
  options: UploadOptions
): Promise<{ uploadUrl: string; key: string }> {
  // Note: a presigned PUT URL cannot enforce maxSizeBytes by itself; validate
  // the declared size server-side before signing, or use a presigned POST
  // policy if you need S3 to reject oversized uploads.
  const command = new PutObjectCommand({
    Bucket: BUCKET_NAME,
    Key: key,
    ContentType: options.contentType,
    Metadata: options.metadata,
  });

  const uploadUrl = await getSignedUrl(s3Client, command, {
    expiresIn: 3600, // 1 hour
  });

  return { uploadUrl, key };
}

export async function generatePresignedDownloadUrl(
  key: string,
  expiresIn = 3600
): Promise<string> {
  const command = new GetObjectCommand({
    Bucket: BUCKET_NAME,
    Key: key,
  });
  return getSignedUrl(s3Client, command, { expiresIn });
}

export async function deleteObject(key: string): Promise<void> {
  const command = new DeleteObjectCommand({
    Bucket: BUCKET_NAME,
    Key: key,
  });
  await s3Client.send(command);
}
```
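The download helper above is never exercised elsewhere in the guide. One way to use it is a route that checks ownership and redirects to a short-lived presigned URL. This is a sketch, not part of the guide's code: the `files` table is the one created by the upload route, and the `Promise`-wrapped `params` shape assumes Next.js 15 (in Next.js 14 `params` is a plain object).

```typescript
// app/api/files/[key]/route.ts — hypothetical download endpoint built on
// generatePresignedDownloadUrl; ownership check assumes the `files` table
// from the upload route below.
import { NextRequest, NextResponse } from "next/server";
import { generatePresignedDownloadUrl } from "@/lib/aws/s3-client";
import { createClient } from "@/lib/supabase/server";

export async function GET(
  request: NextRequest,
  { params }: { params: Promise<{ key: string }> }
) {
  const supabase = await createClient();
  const { data: { user } } = await supabase.auth.getUser();
  if (!user) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const { key } = await params;
  const decodedKey = decodeURIComponent(key);

  // Only allow access to files the requesting user owns
  const { data: file } = await supabase
    .from("files")
    .select("key")
    .eq("key", decodedKey)
    .eq("user_id", user.id)
    .single();

  if (!file) {
    return NextResponse.json({ error: "Not found" }, { status: 404 });
  }

  // Short expiry (5 minutes) so redirect links can't be shared indefinitely
  const url = await generatePresignedDownloadUrl(decodedKey, 300);
  return NextResponse.redirect(url);
}
```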
## API Routes for File Upload
Create secure API routes for file management:
```typescript
// app/api/upload/route.ts
import { NextRequest, NextResponse } from "next/server";
import { generatePresignedUploadUrl } from "@/lib/aws/s3-client";
import { createClient } from "@/lib/supabase/server";
import { nanoid } from "nanoid";

const ALLOWED_TYPES = ["image/jpeg", "image/png", "image/webp", "application/pdf"];
const MAX_SIZE = 10 * 1024 * 1024; // 10MB

export async function POST(request: NextRequest) {
  const supabase = await createClient();
  const { data: { user } } = await supabase.auth.getUser();

  if (!user) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const { filename, contentType, size } = await request.json();

  // Validate file type
  if (!ALLOWED_TYPES.includes(contentType)) {
    return NextResponse.json(
      { error: "File type not allowed" },
      { status: 400 }
    );
  }

  // Validate file size (the client-declared size; S3 won't re-check it
  // on a presigned PUT)
  if (size > MAX_SIZE) {
    return NextResponse.json(
      { error: "File too large. Maximum size is 10MB" },
      { status: 400 }
    );
  }

  // Generate a unique key scoped to the user to prevent overwrites
  const extension = filename.split(".").pop();
  const key = `uploads/${user.id}/${nanoid()}.${extension}`;

  const { uploadUrl } = await generatePresignedUploadUrl(key, {
    contentType,
    metadata: {
      userId: user.id,
      originalFilename: filename,
    },
  });

  // Store a pending file record; a confirm step marks it uploaded later
  await supabase.from("files").insert({
    key,
    filename,
    content_type: contentType,
    size,
    user_id: user.id,
    status: "pending",
  });

  return NextResponse.json({ uploadUrl, key });
}
```
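One caveat in the route above: `filename.split(".").pop()` returns the entire name when there is no dot, treats `.env` as having the extension `env`, and passes unsafe characters straight into the S3 key. A hardened key builder might look like the following sketch; `buildUploadKey` and its explicit `id` parameter are illustrative, not part of the guide's code:

```typescript
// lib/aws/build-key.ts (hypothetical helper, not defined in the guide)
// Builds a collision-resistant S3 key with a sanitized, optional extension.
export function buildUploadKey(userId: string, filename: string, id: string): string {
  const dot = filename.lastIndexOf(".");
  // No dot, a leading dot (".env"), or a trailing dot means "no usable extension"
  const raw = dot > 0 && dot < filename.length - 1 ? filename.slice(dot + 1) : "";
  // Keep only alphanumerics, lowercase, and cap the length to limit abuse
  const extension = raw.toLowerCase().replace(/[^a-z0-9]/g, "").slice(0, 10);
  return extension
    ? `uploads/${userId}/${id}.${extension}`
    : `uploads/${userId}/${id}`;
}
```

In the route, `buildUploadKey(user.id, filename, nanoid())` would replace the two key-building lines.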
## React Upload Component
Build a complete upload component:
```typescript
// components/FileUpload.tsx
"use client";

import { useState, useCallback } from "react";
import { useDropzone } from "react-dropzone";

interface FileUploadProps {
  onUploadComplete: (key: string) => void;
  accept?: Record<string, string[]>;
  maxSize?: number;
}

export function FileUpload({
  onUploadComplete,
  accept,
  maxSize = 10 * 1024 * 1024,
}: FileUploadProps) {
  const [uploading, setUploading] = useState(false);
  const [progress, setProgress] = useState(0);
  const [error, setError] = useState<string | null>(null);

  const uploadFile = async (file: File) => {
    setUploading(true);
    setError(null);
    setProgress(0);

    try {
      // Get a presigned URL from our API route
      const response = await fetch("/api/upload", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          filename: file.name,
          contentType: file.type,
          size: file.size,
        }),
      });

      if (!response.ok) {
        const data = await response.json();
        throw new Error(data.error || "Upload failed");
      }

      const { uploadUrl, key } = await response.json();

      // Upload directly to S3. Note: fetch does not expose upload progress,
      // so this bar jumps from 0 to 100; use XMLHttpRequest (or a library
      // wrapping it) if you need granular progress events.
      const uploadResponse = await fetch(uploadUrl, {
        method: "PUT",
        body: file,
        headers: {
          "Content-Type": file.type,
        },
      });

      if (!uploadResponse.ok) {
        throw new Error("Failed to upload to S3");
      }

      // Confirm the upload so the server can mark the file record active
      await fetch("/api/upload/confirm", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ key }),
      });

      setProgress(100);
      onUploadComplete(key);
    } catch (err) {
      setError((err as Error).message);
    } finally {
      setUploading(false);
    }
  };

  const onDrop = useCallback((acceptedFiles: File[]) => {
    if (acceptedFiles.length > 0) {
      uploadFile(acceptedFiles[0]);
    }
    // uploadFile is recreated each render, so it is intentionally omitted here
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, []);

  const { getRootProps, getInputProps, isDragActive } = useDropzone({
    onDrop,
    accept,
    maxSize,
    multiple: false,
  });

  return (
    <div
      {...getRootProps()}
      className={`border-2 border-dashed rounded-lg p-8 text-center cursor-pointer transition-colors ${
        isDragActive ? "border-blue-500 bg-blue-50" : "border-gray-300 hover:border-gray-400"
      }`}
    >
      <input {...getInputProps()} />
      {uploading ? (
        <div className="space-y-2">
          <div className="w-full bg-gray-200 rounded-full h-2">
            <div
              className="bg-blue-600 h-2 rounded-full transition-all"
              style={{ width: `${progress}%` }}
            />
          </div>
          <p className="text-sm text-gray-600">Uploading... {progress}%</p>
        </div>
      ) : (
        <>
          <p className="text-gray-600">
            {isDragActive ? "Drop the file here" : "Drag & drop a file, or click to select"}
          </p>
          {error && <p className="text-red-500 text-sm mt-2">{error}</p>}
        </>
      )}
    </div>
  );
}
```
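The component POSTs to `/api/upload/confirm`, a route the guide references but never defines. A minimal sketch, assuming the same Supabase client and the `files` table created by the upload route:

```typescript
// app/api/upload/confirm/route.ts — hypothetical sketch of the confirm
// endpoint the component calls; assumes the `files` table from the upload
// route, with a `status` column that moves from "pending" to "uploaded".
import { NextRequest, NextResponse } from "next/server";
import { createClient } from "@/lib/supabase/server";

export async function POST(request: NextRequest) {
  const supabase = await createClient();
  const { data: { user } } = await supabase.auth.getUser();

  if (!user) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const { key } = await request.json();

  // Only let users confirm their own pending records
  const { error } = await supabase
    .from("files")
    .update({ status: "uploaded" })
    .eq("key", key)
    .eq("user_id", user.id)
    .eq("status", "pending");

  if (error) {
    return NextResponse.json(
      { error: "Could not confirm upload" },
      { status: 500 }
    );
  }

  return NextResponse.json({ ok: true });
}
```

A stricter variant could also issue a `HeadObjectCommand` against the bucket to verify the object actually exists before flipping the status.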
## Best Practices
1. **Presigned URLs**: Always use presigned URLs for direct browser uploads
2. **Validation**: Validate file type and size on both client and server
3. **Unique Keys**: Generate unique keys to prevent overwrites
4. **Metadata**: Store original filename and user ID in S3 metadata
5. **CDN Integration**: Use CloudFront for faster delivery
6. **Cleanup**: Implement lifecycle policies for orphaned files
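For item 5, a common pattern is to serve objects through a CloudFront distribution domain rather than presigned S3 URLs. A minimal sketch, where `CDN_DOMAIN` is an assumed environment variable pointing at a distribution whose origin is the bucket:

```typescript
// lib/aws/cdn.ts (hypothetical helper; CDN_DOMAIN is an assumption, not
// part of the guide's configuration)
const CDN_DOMAIN = process.env.CDN_DOMAIN ?? "cdn.example.com";

// Builds a CDN URL for an object key, encoding each path segment so keys
// containing spaces or non-ASCII characters remain valid URLs.
export function cdnUrl(key: string): string {
  const path = key.split("/").map(encodeURIComponent).join("/");
  return `https://${CDN_DOMAIN}/${path}`;
}
```

Note this only suits objects that may be publicly readable via the distribution; private files should keep using presigned URLs or CloudFront signed URLs.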