2026-05-05 12:17:25+08
Collecting data from the web manually is impractical at scale. The following prompt asks for a BeautifulSoup or Selenium script that extracts data ethically and efficiently.
Write a Python script using BeautifulSoup to scrape the titles and prices of books from "example.com/books". Include a "User-Agent" header to prevent being blocked and save the data to a CSV file.
Including a "User-Agent" header and a rate limit (time.sleep) is standard practice: the header identifies your client, and the delay ensures you aren't accidentally overwhelming the site with requests, which amounts to an unintentional denial of service.
import requests
from bs4 import BeautifulSoup
...
headers = {'User-Agent': '...'}
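A minimal sketch of the script the prompt describes. The URL "example.com/books" comes from the prompt itself; the CSS selectors (article.book, h3, p.price) and the User-Agent string are assumptions, since the real page structure is unknown and would need to be adjusted to the actual markup.

```python
import csv
import time

import requests
from bs4 import BeautifulSoup

# Assumed User-Agent string; replace with one that identifies your scraper honestly.
HEADERS = {"User-Agent": "Mozilla/5.0 (compatible; book-scraper/1.0)"}
BASE_URL = "https://example.com/books"  # placeholder URL from the prompt


def parse_books(html):
    """Extract (title, price) pairs from one page of HTML.

    Assumes each book is an <article class="book"> containing an <h3> title
    and a <p class="price"> -- adjust these selectors to the real page.
    """
    soup = BeautifulSoup(html, "html.parser")
    books = []
    for item in soup.select("article.book"):
        title = item.select_one("h3").get_text(strip=True)
        price = item.select_one("p.price").get_text(strip=True)
        books.append((title, price))
    return books


def scrape(pages=1, delay=1.0, out_path="books.csv"):
    """Fetch pages politely (rate-limited) and write results to a CSV file."""
    rows = []
    for page in range(1, pages + 1):
        resp = requests.get(f"{BASE_URL}?page={page}", headers=HEADERS, timeout=10)
        resp.raise_for_status()
        rows.extend(parse_books(resp.text))
        time.sleep(delay)  # rate limit: pause between requests
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["title", "price"])
        writer.writerows(rows)


if __name__ == "__main__":
    scrape()
```

Separating parsing (parse_books) from fetching (scrape) makes the extraction logic testable against static HTML without hitting the network.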