Getting Started

Integrating Web Unblocker is easy, especially if you have previously used regular proxies for web scraping. The only difference is that we require you to ignore the SSL certificate using the -k or --insecure cURL flags (or an equivalent expression in the language of your choice).

To make a request using Web Unblocker, you need to use the proxy endpoint. See a cURL example below. You can find code samples in other languages here or complete code examples on our GitHub.

Use an IP-checking endpoint to verify the parameters of your IPs. Such services aggregate information from multiple geolocation databases, including MaxMind, IP2Location, and DB-IP. The reported parameters include the IP address, provider, country, city, ZIP code, ASN, organization name, time zone, and metadata (when disclosed by the database).

curl -k -x PROXY_ENDPOINT -U "USERNAME:PASSWORD" "TARGET_URL"
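As a rough equivalent in another language, here is a Python sketch using the popular `requests` library. The credentials, proxy endpoint, and target URL below are placeholders; substitute the values from your dashboard. Disabling certificate verification mirrors curl's `-k`/`--insecure` flag:

```python
import requests

# Placeholders — substitute your own credentials, the Web Unblocker
# proxy endpoint (host:port), and the target URL.
USERNAME = "USERNAME"
PASSWORD = "PASSWORD"
PROXY_ENDPOINT = "PROXY_ENDPOINT"
TARGET_URL = "https://example.com"


def build_proxies(username: str, password: str, endpoint: str) -> dict:
    """Build the proxies mapping that routes traffic through the proxy endpoint."""
    proxy = f"http://{username}:{password}@{endpoint}"
    return {"http": proxy, "https": proxy}


if __name__ == "__main__":
    # verify=False is the equivalent of curl's -k / --insecure flag.
    response = requests.get(
        TARGET_URL,
        proxies=build_proxies(USERNAME, PASSWORD, PROXY_ENDPOINT),
        verify=False,
    )
    print(response.status_code)
```

Routing both `http` and `https` traffic through the same proxy entry is the usual pattern for proxy-based scraping tools.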

If you observe low success rates or retrieve empty content, try adding the "x-oxylabs-render: html" header to your request.

If Web Unblocker is being used to scrape websites dependent on loading data via JavaScript, refer to the JavaScript rendering section. The product is not designed to be used with headless browsers (e.g., Chromium, PhantomJS, Splash, etc.) and their drivers (e.g., Playwright, Selenium, Puppeteer, etc.) directly.

Watch the video below for an example of scraping a difficult target without getting blocked:


If you want to learn more about collecting data at scale with Web Unblocker, we suggest watching this Scraping Experts lesson:
