Cloud Storage
Scraper API job results are stored in our storage. You can retrieve your results by sending a GET request to the /results endpoint.
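For example, a finished job could be fetched with a request along these lines (a minimal sketch that assumes the https://data.oxylabs.io/v1/queries endpoint, basic-auth API credentials, and a placeholder job ID):

import requests

# Placeholder job ID - use the "id" value returned when you submitted the job.
job_id = "12345678900987654321"

# Fetch the stored results for that job from the /results endpoint.
response = requests.get(
    f"https://data.oxylabs.io/v1/queries/{job_id}/results",
    auth=("YOUR_USERNAME", "YOUR_PASSWORD"),
    timeout=30,
)
response.raise_for_status()
print(response.json())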
As an alternative, we can upload the results to your cloud storage. This way, you don't have to make extra requests to fetch results – everything goes directly to your storage bucket.
Currently, we support these cloud storage services: Google Cloud Storage, Amazon S3, and S3-compatible storage services such as Alibaba Cloud OSS.
If you'd like to use a different type of storage, please contact your account manager to discuss the feature delivery timeline.
The upload path looks like this: YOUR_BUCKET_NAME/job_ID.json. You'll find the job ID in the response that you receive from us after submitting a job.
Input

storage_type - your cloud storage type. Valid values: gcs (Google Cloud Storage), s3 (AWS S3), s3_compatible (any S3-compatible storage).
storage_url - your cloud storage bucket name or URL: an s3 or gcs bucket name, or an S3-compatible storage URL.
Google Cloud Storage
The payload below makes Web Scraper API scrape https://example.com and put the result in a Google Cloud Storage bucket.
{
  "source": "universal",
  "query": "https://example.com",
  "storage_type": "gcs",
  "storage_url": "bucket_name/path"
}
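For reference, submitting this payload could look roughly like the sketch below (it assumes the https://data.oxylabs.io/v1/queries job submission endpoint and basic-auth API credentials):

import requests

payload = {
    "source": "universal",
    "query": "https://example.com",
    "storage_type": "gcs",
    "storage_url": "bucket_name/path",
}

# Submit the job; the returned "id" is the job ID used to name the
# uploaded file in your bucket (YOUR_BUCKET_NAME/job_ID.json).
response = requests.post(
    "https://data.oxylabs.io/v1/queries",
    json=payload,
    auth=("YOUR_USERNAME", "YOUR_PASSWORD"),
    timeout=30,
)
response.raise_for_status()
print(response.json()["id"])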
To get your job results uploaded to your Google Cloud Storage bucket, please set up special permissions for our service as shown below:
Create a custom role
Add the storage.objects.create permission
Assign it to Oxylabs
In the New members field, enter the following Oxylabs service account email:
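If you would rather grant the access from code than through the Cloud Console, a sketch along the lines below may work. It assumes the google-cloud-storage client library, a custom role you have already created in your project with the storage.objects.create permission, and a placeholder for the Oxylabs service account email mentioned above:

from google.cloud import storage  # pip install google-cloud-storage

BUCKET_NAME = "bucket_name"
# Custom role containing storage.objects.create (placeholder project/role IDs).
CUSTOM_ROLE = "projects/YOUR_PROJECT_ID/roles/oxylabsUploader"
# Placeholder - use the Oxylabs service account email provided above.
OXYLABS_SERVICE_ACCOUNT = "OXYLABS_SERVICE_ACCOUNT_EMAIL"

client = storage.Client()
bucket = client.get_bucket(BUCKET_NAME)

# Bind the custom role to the Oxylabs service account on this bucket.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {
        "role": CUSTOM_ROLE,
        "members": {f"serviceAccount:{OXYLABS_SERVICE_ACCOUNT}"},
    }
)
bucket.set_iam_policy(policy)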
Amazon S3
The payload below makes Web Scraper API scrape https://example.com and put the result in an Amazon S3 bucket.
{
  "source": "universal",
  "query": "https://example.com",
  "storage_type": "s3",
  "storage_url": "bucket_name/path"
}
To get your job results uploaded to your Amazon S3 bucket, please set up access permissions for our service. To do that, go to https://s3.console.aws.amazon.com/ → S3 → Storage → Bucket Name (if you don't have one, create a new one) → Permissions → Bucket Policy.

You can find the bucket policy attached below or in the code sample area. Don't forget to replace YOUR_BUCKET_NAME with the name of your bucket. This policy allows us to write to your bucket, grant you access to the uploaded files, and determine the location of the bucket.
{
  "Version": "2012-10-17",
  "Id": "Policy1577442634787",
  "Statement": [
    {
      "Sid": "Stmt1577442633719",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::324311890426:user/oxylabs.s3.uploader"
      },
      "Action": "s3:GetBucketLocation",
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME"
    },
    {
      "Sid": "Stmt1577442633719",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::324311890426:user/oxylabs.s3.uploader"
      },
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}
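If you prefer to attach the policy programmatically rather than through the console, a boto3 sketch like the one below could do it (assuming your AWS credentials are configured locally and YOUR_BUCKET_NAME is replaced with your bucket name):

import json

import boto3  # pip install boto3

BUCKET_NAME = "YOUR_BUCKET_NAME"

# Same policy as above, with the bucket name filled in.
bucket_policy = {
    "Version": "2012-10-17",
    "Id": "Policy1577442634787",
    "Statement": [
        {
            "Sid": "Stmt1577442633719",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::324311890426:user/oxylabs.s3.uploader"},
            "Action": "s3:GetBucketLocation",
            "Resource": f"arn:aws:s3:::{BUCKET_NAME}",
        },
        {
            "Sid": "Stmt1577442633719",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::324311890426:user/oxylabs.s3.uploader"},
            "Action": ["s3:PutObject", "s3:PutObjectAcl"],
            "Resource": f"arn:aws:s3:::{BUCKET_NAME}/*",
        },
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=BUCKET_NAME, Policy=json.dumps(bucket_policy))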
Alibaba Cloud Object Storage Service (OSS)
The payload below makes Web Scraper API scrape https://example.com and put the result in an Alibaba Cloud OSS bucket.
{
  "source": "universal",
  "query": "https://example.com",
  "storage_type": "s3_compatible",
  "storage_url": "https://ACCESS_KEY_ID:ACCESS_KEY_SECRET@BUCKET_NAME.oss-REGION.aliyuncs.com/FOLDER_NAME"
}
Forming the Storage URL
Storage URL format:
https://ACCESS_KEY_ID:ACCESS_KEY_SECRET@BUCKET_NAME.oss-REGION.aliyuncs.com/FOLDER_NAME
Currently, we cannot upload to the root bucket. Please provide a specific folder name for your uploads.
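As a small illustration, the URL can be assembled from its parts like this (all values are placeholders; eu-central-1 stands in for your bucket's region):

access_key_id = "ACCESS_KEY_ID"
access_key_secret = "ACCESS_KEY_SECRET"
bucket_name = "BUCKET_NAME"
region = "eu-central-1"      # the oss-REGION part becomes oss-eu-central-1
folder_name = "FOLDER_NAME"  # a folder is required; uploads to the root bucket are not supported

storage_url = (
    f"https://{access_key_id}:{access_key_secret}"
    f"@{bucket_name}.oss-{region}.aliyuncs.com/{folder_name}"
)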
Here's where you'll find the BUCKET_NAME and oss-REGION of your bucket:

Creating the Access Key and Secret
In order to use the S3-compatible interface with Alibaba OSS, you must create the ACCESS_KEY_ID and ACCESS_KEY_SECRET as shown below. For more information, see How to use Amazon S3 SDKs to access OSS.
Go to the AccessKey Account Menu

Log on to the RAM console
Access the RAM console by using an Alibaba Cloud account or a RAM user who has administrative rights.
Go to Identities → Users in the left-side navigation pane
Select Create User and use the RAM User AccessKey:


Grant permissions to the RAM user
The newly created RAM user has no permissions. You must grant AliyunOSSFullAccess permissions to the RAM user. Then, the RAM user can access the required Alibaba Cloud resources. For more information, see Grant permissions to RAM users.

Get your AccessKey ID and AccessKey Secret
When permissions are granted, return to the Authentication section and, in the Access Key section, select Create AccessKey. Choose to create an AccessKey for a third-party service. You'll then see an ACCESS_KEY_ID and ACCESS_KEY_SECRET, which you can use in your requests.
Alibaba OSS Rate limits
When doing concurrent uploads to Alibaba OSS, it's possible to hit their account/bucket rate limits, and the uploads will start timing out with the following error:

In this case, please contact Alibaba OSS support to increase your OSS rate limits.
Other S3-compatible storage
If you'd like to get your results delivered to an S3-compatible storage location, you'll have to include your bucket's ACCESS_KEY:SECRET auth string in the storage_url value in the payload:
{
  "source": "universal",
  "url": "https://example.com",
  "storage_type": "s3_compatible",
  "storage_url": "https://ACCESS_KEY:SECRET@YOUR_STORAGE_ENDPOINT/my-videos"
}