IDE panel

The IDE panel is where you write and test your scraper code. Each labeled component below corresponds to a section of the IDE interface.

A - Templates

A library of pre-built scraper code created by Bright Data’s engineers, covering common websites and scraping patterns. Templates are starting points and may require adjustments if a target website’s structure has changed.

B - Stages

Stages let a scraper run as a sequence of steps. Each stage receives input from the previous one via next_stage() or run_stage(). Use stages when scraping requires navigating across multiple page types - for example, collecting URLs from a listing page, then extracting details from each URL.
See Functions for a full code example.
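A minimal sketch of the two-stage flow described above. In the IDE, helpers such as navigate(), parse(), next_stage(), and collect() are provided as globals; here each stage is written as a plain function that receives them as parameters so the data flow is visible, and the parser fields (product_urls, title, price) are illustrative assumptions, not a real schema.

```javascript
// Stage 1: open a listing page and queue one child input per product URL.
// Each next_stage() call becomes an input for stage 2 (visible in the
// Children debugging tab after a preview run).
function listingStage({ input, navigate, parse, next_stage }) {
  navigate(input.url);              // load the listing page
  const { product_urls } = parse(); // assumed parser output field
  for (const url of product_urls) {
    next_stage({ url });            // queue a stage-2 input
  }
}

// Stage 2: open each product page and emit a structured record.
function detailStage({ input, navigate, parse, collect }) {
  navigate(input.url);
  const { title, price } = parse(); // assumed parser output fields
  collect({ url: input.url, title, price });
}
```

In real IDE code the stage bodies are top-level scripts and the helpers are called directly; the wrapper functions exist only so this sketch reads end to end.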

C - Functions reference

An in-IDE reference panel listing all available scraping functions with descriptions and usage examples.
See Interaction functions and Parser functions.

D - Debugging tabs

Input: Define input parameters and select an input set to run a preview test.
Output: Structured data returned by the scraper after a preview run.
Children: Input sets passed to the next stage in a multi-stage scraper.
Run log: Full code execution log for the most recent preview.
Browser network: Browser-level network activity log (equivalent to the DevTools Network tab).
Last errors: The most recent error messages, including error codes and affected inputs (last 1,000 stored).
Crawl inspector: All pages crawled during a batch job, including successes and failures. For multi-stage scrapers, use Search for children to view pages generated from each parent.
Output schema: Field names and data types for the scraper's output. Click Edit Schema to modify the input or output schema.

E - Input

Add input parameter: Define a new input parameter by name and type.
New input: Add a value to an input set for testing.
Preview: Run the scraper against a selected input set.
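For example, an input set for a scraper that takes a URL and a page limit might look like the following (the parameter names are illustrative, not a fixed schema):

```json
[
  { "url": "https://example.com/category/shoes", "max_pages": 2 },
  { "url": "https://example.com/category/bags", "max_pages": 1 }
]
```

Each object is one test input; Preview runs the scraper against whichever set is selected.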

F - Settings

Worker: Select Browser Worker or Code Worker for this scraper.
Error mode: Define scraper behavior when an error occurs.
Take screenshot: Capture screenshots of loaded pages during preview runs.
See Worker types for guidance on choosing between Browser and Code Workers.

G - Self-Healing Tool

An AI-powered code refactoring tool. It accepts plain-language prompts to fix errors or modify input/output fields without manual code editing.
See Self-Healing Tool.

H - Preview

Runs the scraper against the currently selected input set. Results appear in the Output debugging tab.

Scraper Dashboard Menu

The Dashboard lists all your scrapers under My Scrapers. Each scraper has an action menu with the following options:
Initiate manually: Start a collection run directly from the UI.
Initiate by API: Trigger a collection programmatically via the API.
Run on schedule: Configure a recurring collection (daily, weekly, or at a custom interval).
Delivery preferences: Set the output format and delivery destination for completed jobs.
Code: Open the scraper in the IDE.
Tickets: View open support tickets for this scraper.
Report an issue: Submit a report for platform, scraper, or data quality issues.
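As a sketch of what an Initiate by API call involves, the helper below assembles a trigger request without sending it. The endpoint path, collector ID, and token here are placeholder assumptions, not the authoritative API: copy the exact URL and payload shown in the Initiate by API dialog for your scraper.

```javascript
// Hypothetical sketch only - the endpoint path and parameter names are
// placeholders; take the real request from the "Initiate by API" dialog.
function buildTriggerRequest({ collectorId, apiToken, inputs }) {
  return {
    method: 'POST',
    url: `https://api.brightdata.com/dca/trigger?collector=${collectorId}`,
    headers: {
      Authorization: `Bearer ${apiToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(inputs), // one object per input set entry
  };
}

// Usage (not sent here): pass the result to fetch(req.url, req).
const req = buildTriggerRequest({
  collectorId: 'YOUR_COLLECTOR_ID',
  apiToken: 'YOUR_API_TOKEN',
  inputs: [{ url: 'https://example.com/category/shoes' }],
});
```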