Proxy Endpoint
Send and receive data via the Oxylabs Web Scraper API Proxy Endpoint. Access target pages directly through a simple URL-based integration.
Endpoint
GET realtime.oxylabs.io:60000

Input
curl -k -x https://realtime.oxylabs.io:60000 \
-U 'USERNAME:PASSWORD' \
-H 'x-oxylabs-user-agent-type: desktop_chrome' \
-H 'x-oxylabs-geo-location: Germany' \
'https://www.example.com'

import requests
from pprint import pprint
# Use your Web Scraper API credentials here.
USERNAME, PASSWORD = 'YOUR_USERNAME', 'YOUR_PASSWORD'
# Define proxy dict.
proxies = {
    'http': f'http://{USERNAME}:{PASSWORD}@realtime.oxylabs.io:60000',
    'https': f'https://{USERNAME}:{PASSWORD}@realtime.oxylabs.io:60000',
}
# To set a specific geo-location or user-agent, or to render JavaScript,
# pass these parameters as request headers.
headers = {
    'x-oxylabs-user-agent-type': 'desktop_chrome',
    'x-oxylabs-geo-location': 'Germany',
    # 'X-Oxylabs-Render': 'html',  # Uncomment to render JavaScript within the page.
}
response = requests.request(
    'GET',
    'https://www.example.com',
    headers=headers,  # Pass the defined headers.
    verify=False,  # Accept our certificate.
    proxies=proxies,
)
# Print result page to stdout.
pprint(response.text)
# Save returned HTML to 'result.html' file.
with open('result.html', 'w') as f:
    f.write(response.text)

Output
Accepted parameters
Parameter
Description
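As a sketch of how the header-based parameters compose, the snippet below builds the header dict from optional values, skipping anything unset. The helper `make_headers` is hypothetical; the header names mirror the ones used in the request example above:

```python
def make_headers(user_agent_type=None, geo_location=None, render=None):
    """Build the x-oxylabs-* request headers, omitting unset parameters."""
    headers = {}
    if user_agent_type:
        headers['x-oxylabs-user-agent-type'] = user_agent_type
    if geo_location:
        headers['x-oxylabs-geo-location'] = geo_location
    if render:
        headers['x-oxylabs-render'] = render  # e.g. 'html' to render JavaScript
    return headers


headers = make_headers(user_agent_type='desktop_chrome', geo_location='Germany')
```

Passing the result as `headers=` to `requests` gives the same behavior as defining the dict by hand.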