SERP Scraper API is designed to extract real-time public data from leading search engines.
Getting started
Create your API user credentials: Sign up for a free trial or purchase the product in the Oxylabs dashboard to create your API user credentials (USERNAME and PASSWORD).
If you need more than one API user for your account, please contact our customer support or message our 24/7 live chat support.
Request sample
cURL Python Node.js PHP C# Golang HTTP Java JSON
cURL

curl 'https://realtime.oxylabs.io/v1/queries' \
  --user 'USERNAME:PASSWORD' \
  -H 'Content-Type: application/json' \
  -d '{
        "source": "google_search",
        "query": "adidas",
        "geo_location": "California,United States",
        "parse": true
      }'
Python

import requests
from pprint import pprint

# Structure payload.
payload = {
    'source': 'google_search',
    'query': 'adidas',
    'geo_location': 'California,United States',
    'parse': True
}

# Get response.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'),
    json=payload,
)

# Print prettified response to stdout.
pprint(response.json())
Node.js

const https = require("https");

const username = "USERNAME";
const password = "PASSWORD";

const body = {
  source: "google_search",
  query: "adidas",
  geo_location: "California,United States",
  parse: true,
};

const options = {
  hostname: "realtime.oxylabs.io",
  path: "/v1/queries",
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization:
      "Basic " + Buffer.from(`${username}:${password}`).toString("base64"),
  },
};

const request = https.request(options, (response) => {
  let data = "";
  response.on("data", (chunk) => {
    data += chunk;
  });
  response.on("end", () => {
    const responseData = JSON.parse(data);
    console.log(JSON.stringify(responseData, null, 2));
  });
});

request.on("error", (error) => {
  console.error("Error:", error);
});

request.write(JSON.stringify(body));
request.end();
PHP

<?php
$params = array(
    'source' => 'google_search',
    'query' => 'adidas',
    'geo_location' => 'California,United States',
    'parse' => true
);

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://realtime.oxylabs.io/v1/queries");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($params));
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_USERPWD, "USERNAME" . ":" . "PASSWORD");

$headers = array();
$headers[] = "Content-Type: application/json";
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);

$result = curl_exec($ch);
if (curl_errno($ch)) {
    echo 'Error:' . curl_error($ch);
}
echo $result;
curl_close($ch);
C#

using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

namespace OxyApi
{
    class Program
    {
        static async Task Main()
        {
            const string Username = "USERNAME";
            const string Password = "PASSWORD";

            var parameters = new {
                source = "google_search",
                query = "adidas",
                geo_location = "California,United States",
                parse = true
            };

            var client = new HttpClient();
            Uri baseUri = new Uri("https://realtime.oxylabs.io");
            client.BaseAddress = baseUri;

            var requestMessage = new HttpRequestMessage(HttpMethod.Post, "/v1/queries");
            requestMessage.Content = JsonContent.Create(parameters);

            var authenticationString = $"{Username}:{Password}";
            var base64EncodedAuthenticationString = Convert.ToBase64String(System.Text.Encoding.ASCII.GetBytes(authenticationString));
            requestMessage.Headers.Add("Authorization", "Basic " + base64EncodedAuthenticationString);

            var response = await client.SendAsync(requestMessage);
            var contents = await response.Content.ReadAsStringAsync();
            Console.WriteLine(contents);
        }
    }
}
Golang

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io/ioutil"
	"net/http"
)

func main() {
	const Username = "USERNAME"
	const Password = "PASSWORD"

	payload := map[string]interface{}{
		"source":       "google_search",
		"query":        "adidas",
		"geo_location": "California,United States",
		"parse":        true,
	}

	jsonValue, _ := json.Marshal(payload)

	client := &http.Client{}
	request, _ := http.NewRequest("POST",
		"https://realtime.oxylabs.io/v1/queries",
		bytes.NewBuffer(jsonValue),
	)
	request.SetBasicAuth(Username, Password)
	request.Header.Set("Content-Type", "application/json")

	response, _ := client.Do(request)
	responseText, _ := ioutil.ReadAll(response.Body)
	fmt.Println(string(responseText))
}
HTTP

https://realtime.oxylabs.io/v1/queries?source=google_search&query=adidas&geo_location=California%2CUnited%20States&parse=true&access_token=12345abcde
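When building the single-request HTTP URL above yourself, the parameter values must be percent-encoded (note the %2C and %20 in geo_location). As a minimal Python sketch of assembling that URL (the access_token value is just the placeholder from the sample):

```python
from urllib.parse import urlencode, quote

# Query parameters for the single-request HTTP endpoint.
# The access_token value is the placeholder from the sample above.
params = {
    "source": "google_search",
    "query": "adidas",
    "geo_location": "California,United States",
    "parse": "true",
    "access_token": "12345abcde",
}

# quote_via=quote percent-encodes spaces as %20 (the default quote_plus
# would encode them as '+').
url = "https://realtime.oxylabs.io/v1/queries?" + urlencode(params, quote_via=quote)
print(url)
```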
Java

package org.example;

import okhttp3.*;
import org.json.JSONObject;

import java.util.concurrent.TimeUnit;

public class Main implements Runnable {
    private static final String AUTHORIZATION_HEADER = "Authorization";
    public static final String USERNAME = "USERNAME";
    public static final String PASSWORD = "PASSWORD";

    public void run() {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("source", "google_search");
        jsonObject.put("query", "adidas");
        jsonObject.put("geo_location", "California,United States");
        jsonObject.put("parse", true);

        Authenticator authenticator = (route, response) -> {
            String credential = Credentials.basic(USERNAME, PASSWORD);
            return response
                    .request()
                    .newBuilder()
                    .header(AUTHORIZATION_HEADER, credential)
                    .build();
        };

        var client = new OkHttpClient.Builder()
                .authenticator(authenticator)
                .readTimeout(180, TimeUnit.SECONDS)
                .build();

        var mediaType = MediaType.parse("application/json; charset=utf-8");
        var body = RequestBody.create(jsonObject.toString(), mediaType);
        var request = new Request.Builder()
                .url("https://realtime.oxylabs.io/v1/queries")
                .post(body)
                .build();

        try (var response = client.newCall(request).execute()) {
            if (response.body() != null) {
                try (var responseBody = response.body()) {
                    System.out.println(responseBody.string());
                }
            }
        } catch (Exception exception) {
            System.out.println("Error: " + exception.getMessage());
        }
        System.exit(0);
    }

    public static void main(String[] args) {
        new Thread(new Main()).start();
    }
}
JSON

{
    "source": "google_search",
    "query": "adidas",
    "geo_location": "California,United States",
    "parse": true
}
Our examples use the synchronous Realtime integration method. If you would like to use the Proxy Endpoint or the asynchronous Push-Pull integration, refer to the integration methods section.
Request parameter values
source* - Set the scraper that will be used to process your request.
url* or query* - Provide the URL or the query for the type of page you want to scrape. Refer to the table below and the corresponding target sub-pages for detailed guidance on when to use each parameter.
Optionally, you can include additional parameters such as geo_location, user_agent_type, parse, render, and more to customize your scraping request. Read more: Features.
* - mandatory parameter
Target: Google
Source (Using Query): google_search, google_ads, google_images, google_lens, google_maps, google_travel_hotels, google_suggest, google_trends_explore
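As an illustration of how the mandatory and optional parameters combine, here is a hedged Python sketch of a google_search payload. The user_agent_type and render values shown are assumptions for illustration; consult the Features section for the accepted values.

```python
# Payload mixing mandatory and common optional parameters for google_search.
# The user_agent_type and render values are illustrative assumptions;
# see the Features section for the accepted values.
payload = {
    "source": "google_search",                   # mandatory: scraper to use
    "query": "adidas",                           # mandatory: search query
    "geo_location": "California,United States",  # optional: localize results
    "user_agent_type": "desktop",                # optional: device profile (assumed value)
    "render": "html",                            # optional: JavaScript rendering (assumed value)
    "parse": True,                               # optional: return structured data
}

# Send it exactly as in the Python request sample above, e.g.:
# requests.post("https://realtime.oxylabs.io/v1/queries",
#               auth=("USERNAME", "PASSWORD"), json=payload)
print(sorted(payload))
```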
If you need any assistance in making your first request, feel free to contact us via our 24/7 live chat.
Testing via Scraper APIs Playground
Log in to the Oxylabs dashboard and try SERP Scraper API in the Scraper APIs Playground.
Testing via Postman
Get started with our API using Postman, a handy tool for making HTTP requests. Download our SERP Scraper API Postman collection and import it. This collection includes examples that demonstrate the functionality of the scraper. Customize the examples to your needs or start scraping right away.
For step-by-step instructions, watch our video tutorial below. If you're new to Postman, check out this short guide.
All information herein is provided on an “as is” basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on this page. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website’s terms of service or receive a scraping license.