# Web Scraping Patterns for Google Antigravity
Master web scraping in Google Antigravity IDE. This guide covers HTML parsing, browser automation, and rate limiting.
## Cheerio for HTML Parsing
```typescript
import * as cheerio from "cheerio";

async function scrapeArticle(url: string) {
  const response = await fetch(url);
  const html = await response.text();
  const $ = cheerio.load(html);

  return {
    title: $("h1").first().text().trim(),
    content: $("article").text().trim(),
    author: $('meta[name="author"]').attr("content"),
    publishedAt: $("time").attr("datetime"),
    images: $("article img")
      .map((_, el) => $(el).attr("src"))
      .get()
  };
}
```
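The `images` array above may contain relative paths. A small helper using the standard `URL` constructor can resolve them against the page URL; the function name here is illustrative:

```typescript
// Resolve possibly-relative image URLs against the page's URL.
// Malformed entries are skipped rather than thrown.
function resolveUrls(base: string, srcs: (string | undefined)[]): string[] {
  const resolved: string[] = [];
  for (const src of srcs) {
    if (!src) continue;
    try {
      resolved.push(new URL(src, base).href);
    } catch {
      // Skip anything the URL parser rejects
    }
  }
  return resolved;
}
```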
## Playwright for Dynamic Content
```typescript
import { chromium } from "playwright";

async function scrapeWithBrowser(url: string) {
  const browser = await chromium.launch({ headless: true });
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: "networkidle" });

    // Wait for dynamic content to render
    await page.waitForSelector(".product-list");

    const products = await page.evaluate(() => {
      return Array.from(document.querySelectorAll(".product")).map(el => ({
        name: el.querySelector(".name")?.textContent,
        price: el.querySelector(".price")?.textContent,
        url: el.querySelector("a")?.href
      }));
    });
    return products;
  } finally {
    // Always close the browser, even if the scrape throws
    await browser.close();
  }
}
```
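The `price` values above come back as raw text such as `"$1,299.99"`. A small parser can normalize them to numbers; this sketch assumes US-style formatting and is not part of Playwright:

```typescript
// Parse a scraped price string like "$1,299.99" into a number.
// Strips thousands separators; returns null if nothing numeric is found.
function parsePrice(text: string | null | undefined): number | null {
  if (!text) return null;
  const match = text.replace(/,/g, "").match(/\d+(\.\d+)?/);
  return match ? parseFloat(match[0]) : null;
}
```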
## Rate Limiting and Politeness
```typescript
import pLimit from "p-limit";

const limit = pLimit(2); // at most 2 concurrent requests

async function scrapeMultiple(urls: string[]) {
  const results = await Promise.all(
    urls.map(url => limit(async () => {
      await delay(1000); // pause 1 second before each request
      return scrapeArticle(url);
    }))
  );
  return results;
}

function delay(ms: number) {
  return new Promise(resolve => setTimeout(resolve, ms));
}
```
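A fixed delay can cause request bursts to line up. Adding random jitter is a common refinement; the helper below is a sketch, not part of `p-limit`:

```typescript
// Delay for baseMs plus a random jitter in [0, jitterMs),
// so concurrent scrapers don't fire in lockstep.
function jitteredDelay(baseMs: number, jitterMs: number): Promise<void> {
  const ms = baseMs + Math.random() * jitterMs;
  return new Promise(resolve => setTimeout(resolve, ms));
}
```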
## Best Practices
1. **Respect robots.txt** - Check allowed paths
2. **Rate limit requests** - Be polite to servers
3. **Handle errors gracefully** - Retry with backoff
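The retry-with-backoff advice above can be sketched as a small wrapper; the function name and default values are illustrative:

```typescript
// Retry an async operation with exponential backoff:
// wait baseMs * 2^attempt between tries, rethrow after the last failure.
async function withRetry<T>(
  fn: () => Promise<T>,
  retries = 3,
  baseMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < retries) {
        await new Promise(resolve => setTimeout(resolve, baseMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```

For example, `withRetry(() => scrapeArticle(url))` would attempt the scrape up to four times before giving up.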