Prerequisites
- A Bright Data account with an active API token
- Familiarity with the async request workflow
- A publicly accessible HTTP endpoint (or a testing tool like webhook.site)
How webhooks work
When you trigger an async collection with a webhook URL, Bright Data sends a POST request to your endpoint with the scraped data once the job completes. No polling required.
Step 1: Set up a test webhook
For testing, use webhook.site to get a temporary public URL:
- Open webhook.site in your browser
- Copy the unique URL displayed (e.g., https://webhook.site/abc-123-def)
- Keep the page open to monitor incoming requests
Step 2: Trigger a collection with the webhook URL
Add the webhook query parameter to your async /trigger request:
| Parameter | Description |
|---|---|
| webhook | Your HTTP endpoint URL that receives the POST payload |
| uncompressed_webhook | Set to true to receive uncompressed JSON (default is gzip) |
| format | Output format: json, ndjson, or csv |
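As a sketch, the trigger URL can be assembled like this. The dataset ID is a placeholder, and the /datasets/v3/trigger path is assumed to be the async trigger endpoint described above:

```python
from urllib.parse import urlencode

# Placeholder values — substitute your own dataset ID and webhook URL
params = {
    "dataset_id": "gd_example123",
    "format": "json",
    "uncompressed_webhook": "true",
    "webhook": "https://webhook.site/abc-123-def",
}

# Send this as a POST with your "Authorization: Bearer <API token>" header
# and a JSON body of inputs (e.g. the profile URLs to scrape)
trigger_url = "https://api.brightdata.com/datasets/v3/trigger?" + urlencode(params)
print(trigger_url)
```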
Step 3: Verify delivery
Once the collection completes (typically 30-60 seconds for a few profiles), check your webhook.site page. You should see a POST request with the scraped data.
The payload is the same JSON array you would receive from a direct API download.
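For illustration only, a payload for two scraped records might look like this (the field names are hypothetical and depend on your dataset):

```json
[
  {"url": "https://example.com/profile/1", "name": "Example One"},
  {"url": "https://example.com/profile/2", "name": "Example Two"}
]
```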
Production webhook setup
For production, point the webhook URL to your own server endpoint.
Express.js handler
server.js
Flask handler
server.py
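An equivalent sketch in Flask (again, the route path and port are illustrative). It also decompresses the body in case `uncompressed_webhook` was omitted on the trigger:

```python
import gzip
import json

from flask import Flask, request

app = Flask(__name__)

@app.route("/webhook/brightdata", methods=["POST"])
def receive_webhook():
    raw = request.get_data()
    # If uncompressed_webhook=true was omitted, the body arrives gzip-compressed
    if raw[:2] == b"\x1f\x8b":  # gzip magic bytes
        raw = gzip.decompress(raw)
    records = json.loads(raw)
    print(f"Received {len(records)} records")
    # Return 200 quickly so the delivery is not treated as failed
    return "", 200

if __name__ == "__main__":
    app.run(port=3000)
```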
Webhook with authorization
If your endpoint requires authentication, add the webhook_header_Authorization parameter:
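For example, appending a bearer token that your own endpoint validates (the secret and URLs below are placeholders):

```python
from urllib.parse import urlencode

# Placeholder secret — use whatever value your endpoint checks for
extra = urlencode({"webhook_header_Authorization": "Bearer my-shared-secret"})

trigger_url = (
    "https://api.brightdata.com/datasets/v3/trigger"
    "?dataset_id=gd_example123&webhook=https://example.com/hook"  # placeholders
)
trigger_url += "&" + extra
print(trigger_url)
```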
Allowlist webhook IPs
If your server uses an IP allowlist, add the following Bright Data webhook source IPs:
Troubleshooting
Webhook not receiving data?
- Verify the URL is publicly accessible (not localhost)
- Check that your endpoint returns a 200 status code within 30 seconds
- Verify the webhook IPs above are allowlisted if you have firewall rules
Receiving compressed data?
If you omit uncompressed_webhook=true, data arrives gzip-compressed. Add uncompressed_webhook=true to your trigger URL, or decompress the payload on your server.
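The server-side option can be sketched with a small helper that works for both compressed and uncompressed deliveries:

```python
import gzip
import json

def parse_webhook_body(raw: bytes) -> list:
    """Decompress a gzip-encoded webhook payload (if needed) and parse it."""
    if raw[:2] == b"\x1f\x8b":  # gzip magic bytes
        raw = gzip.decompress(raw)
    return json.loads(raw)

# Handles a compressed body the same as a plain JSON one
print(parse_webhook_body(gzip.compress(b'[{"id": 1}]')))
```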
Payload too large for your server?
Large collections can produce payloads up to 1 GB. Set express.json({ limit: "100mb" }) in Express.js or the equivalent in your framework. If you need to handle very large datasets, use S3 delivery instead.
Next steps
Deliver to Amazon S3
Store results directly in your S3 bucket.
All delivery options
Snowflake, Azure, GCS, and more.