Prerequisites
- A Bright Data account (includes $2 free credit)
- cURL, Python 3 or Node.js 18+ installed
Get your API token
Go to the user settings page in your Bright Data account and copy your API token. If you don’t have an account yet, sign up at brightdata.com. New users get $2 free credit for testing.
Send a request
We’ll use the Posts — Collect by URL endpoint with a synchronous request. Replace YOUR_API_TOKEN with your actual token. You should see a 200 status code; the request takes 10 to 30 seconds.
Review the response
The Bright Data Reddit Scraper API returns a JSON array with structured post data. Each post object includes post details, community stats, engagement metrics and attached media. See the full response schema.
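As a sketch, the synchronous call can be built in Python with the standard library. The endpoint path (`/datasets/v3/scrape`) and the dataset ID are assumptions here — copy the exact values from your Bright Data dashboard, and replace YOUR_API_TOKEN with your token:

```python
import json
import urllib.request

# Assumed sync endpoint and placeholder dataset ID — verify both in your dashboard.
API_URL = "https://api.brightdata.com/datasets/v3/scrape"
DATASET_ID = "YOUR_DATASET_ID"  # Reddit "Posts — Collect by URL" dataset

def build_scrape_request(token: str, post_urls: list[str]) -> urllib.request.Request:
    """Build a synchronous scrape request; the input is an array of {"url": ...} objects."""
    payload = json.dumps([{"url": u} for u in post_urls]).encode("utf-8")
    return urllib.request.Request(
        f"{API_URL}?dataset_id={DATASET_ID}",
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_scrape_request(
    "YOUR_API_TOKEN",
    ["https://www.reddit.com/r/programming/comments/abc123/example_post/"],
)
# with urllib.request.urlopen(req, timeout=60) as resp:
#     posts = json.load(resp)  # expect HTTP 200 after roughly 10-30 seconds
```

The `urlopen` call is left commented out so the sketch can be run without credentials; uncomment it once your token and dataset ID are in place.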
Common questions
Can I scrape multiple posts in one request?
Yes. Add more objects to the input array. Synchronous requests support up to 20 URLs. For larger batches, or for discovery by keyword or subreddit, use the async /trigger endpoint.
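For a multi-post request, the input array simply grows to at most 20 objects. The helper below is a hypothetical sketch (the endpoint paths are assumptions — check your dashboard) showing how a client might pick between the sync and async endpoints based on batch size:

```python
SYNC_LIMIT = 20  # max URLs per synchronous request, per the docs

def build_input(urls: list[str]) -> list[dict]:
    """One {"url": ...} input object per Reddit post."""
    return [{"url": u} for u in urls]

def pick_endpoint(urls: list[str]) -> str:
    """Use the sync endpoint for small batches, /trigger (async) for larger ones."""
    if len(urls) <= SYNC_LIMIT:
        return "/datasets/v3/scrape"   # synchronous, assumed path
    return "/datasets/v3/trigger"      # async batch job, returns a snapshot_id

urls = [f"https://www.reddit.com/r/python/comments/id{i}/post/" for i in range(25)]
pick_endpoint(urls)  # 25 URLs exceed the sync limit → "/datasets/v3/trigger"
```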
Can I scrape comments from a post?
Yes, with the separate Comments dataset. Use dataset ID gd_lvzdpsdlw09j6t702 and pass the post URL. You can also pass days_back to limit results to comments posted within the last N days.
Getting a 401 or 403 error?
Verify your API token is correct and hasn’t expired. Generate a new token from Account settings. See the authentication guide for details.
Request is timing out?
Synchronous requests have a 1-minute timeout. If a request exceeds this limit, it automatically switches to async and returns a snapshot_id. Use the async workflow for large batches.
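A client can detect this fallback by inspecting the response body. The sketch below assumes the fallback returns a JSON object carrying snapshot_id and that snapshots are fetched from a `/datasets/v3/snapshot/{id}` path — confirm both against the API reference:

```python
import json

# Assumed snapshot retrieval path — verify in the API reference.
SNAPSHOT_URL = "https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}"

def classify_response(body: bytes) -> tuple[str, object]:
    """Distinguish a finished sync result (a JSON array of posts) from an
    async fallback (an object carrying a snapshot_id to poll later)."""
    data = json.loads(body)
    if isinstance(data, dict) and "snapshot_id" in data:
        return "async", SNAPSHOT_URL.format(snapshot_id=data["snapshot_id"])
    return "sync", data

classify_response(b'{"snapshot_id": "s_abc123"}')
# → ("async", "https://api.brightdata.com/datasets/v3/snapshot/s_abc123")
```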
Empty or partial response data?
Verify the Reddit post URL is publicly accessible and correctly formatted. The URL should follow the pattern https://www.reddit.com/r/{subreddit}/comments/{post_id}/{slug}/. Private subreddits and deleted posts cannot be scraped.
Next steps
Send your first request
Explore every endpoint with full examples in cURL, Python and Node.js.
Async batch requests
Scrape hundreds of posts or run keyword discovery in a single batch job.
API reference
Endpoint specs, parameters and response schemas.