Cloud Storage

Web Scraper API job results are stored in our storage. You can retrieve your results by sending a GET request to the /results endpoint.
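For illustration, fetching results over Push-Pull might look like the sketch below. The endpoint host and response handling are assumptions, so check the Push-Pull documentation for the exact values:

```python
import base64
import json
import urllib.request

# Assumed Push-Pull results endpoint; verify the host in the Push-Pull docs.
RESULTS_ENDPOINT = "https://data.oxylabs.io/v1/queries/{job_id}/results"

def results_url(job_id: str) -> str:
    """Build the URL that serves a finished job's results."""
    return RESULTS_ENDPOINT.format(job_id=job_id)

def fetch_results(job_id: str, username: str, password: str) -> dict:
    """GET a job's results from Oxylabs storage using HTTP Basic auth."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    request = urllib.request.Request(
        results_url(job_id),
        headers={"Authorization": f"Basic {token}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```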

As an alternative, we can upload the results onto your cloud storage. This way, you don't have to make extra requests to fetch results - everything goes directly to your storage bucket.

Cloud storage integration works only with the Push-Pull integration method.

Currently, we support Amazon S3 and Google Cloud Storage. If you would like to use a different type of storage, please contact your account manager to discuss the feature delivery timeline.

The upload path looks like this: YOUR_BUCKET_NAME/job_ID.json. You will find the job ID in the response that you receive from us after submitting a job.
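As a sketch, the object key where a result lands can be derived from the job ID like this (the bucket name and job ID are illustrative):

```python
def result_object_path(bucket_name: str, job_id: str) -> str:
    """Object key where Web Scraper API uploads a job's result."""
    return f"{bucket_name}/{job_id}.json"

# The job ID comes from the job-submission response.
print(result_object_path("my-bucket", "12345678900987654321"))
# → my-bucket/12345678900987654321.json
```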

Input

Parameter
Description
Valid values

storage_type

Your cloud storage type.

  • s3 (AWS S3);

  • s3_compatible (any S3-compatible storage);

  • gcs (Google Cloud Storage).

storage_url

Your cloud storage bucket name / URL.

  • Any s3 or gcs bucket name;

  • Any S3-compatible storage URL.

Google Cloud Storage

The payload below makes Web Scraper API scrape https://example.com and upload the result to a Google Cloud Storage bucket.

{
    "source": "universal",
    "query": "https://example.com",
    "storage_type": "gcs",
    "storage_url": "bucket_name/path"
}
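A job with storage parameters is submitted like any other Push-Pull job. The sketch below shows one way to do this with the standard library; the endpoint host and the `id` field of the response are assumptions, so verify them against the Push-Pull documentation:

```python
import base64
import json
import urllib.request

# Assumed Push-Pull job-submission endpoint.
PUSH_PULL_ENDPOINT = "https://data.oxylabs.io/v1/queries"

def build_storage_payload(page_url: str, storage_type: str, storage_url: str) -> dict:
    """Job payload asking Web Scraper API to upload the result to your bucket."""
    return {
        "source": "universal",
        "query": page_url,
        "storage_type": storage_type,
        "storage_url": storage_url,
    }

def submit_job(payload: dict, username: str, password: str) -> str:
    """POST the job; returns the job ID (the result appears as <bucket>/<id>.json)."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    request = urllib.request.Request(
        PUSH_PULL_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "Authorization": f"Basic {token}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["id"]
```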

To get your job results uploaded to your Google Cloud Storage bucket, please set up special permissions for our service. To do that, create a custom role with the storage.objects.create permission, and assign it to the Oxylabs service account email [email protected].

Amazon S3

The payload below makes Web Scraper API scrape https://example.com and upload the result to an Amazon S3 bucket.

{
    "source": "universal",
    "query": "https://example.com",
    "storage_type": "s3",
    "storage_url": "bucket_name/path"
}

To get your job results uploaded to your Amazon S3 bucket, please set up access permissions for our service. To do that, go to https://s3.console.aws.amazon.com/ > S3 > Storage > Bucket Name (if you don't have one, create a new one) > Permissions > Bucket Policy.

You can find the bucket policy below.

S3 bucket policy

Don't forget to replace YOUR_BUCKET_NAME with the name of your bucket. This policy allows us to write to your bucket, grant you access to the uploaded files, and determine the bucket's location.

{
    "Version": "2012-10-17",
    "Id": "Policy1577442634787",
    "Statement": [
        {
            "Sid": "Stmt1577442633719",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::324311890426:user/oxylabs.s3.uploader"
            },
            "Action": "s3:GetBucketLocation",
            "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME"
        },
        {
            "Sid": "Stmt1577442633720",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::324311890426:user/oxylabs.s3.uploader"
            },
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
        }
    ]
}
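If you prefer to generate the policy programmatically, the sketch below builds an equivalent document (without the optional Id/Sid fields) for a concrete bucket name, using only the standard library:

```python
import json

# Oxylabs uploader principal from the policy above.
OXYLABS_UPLOADER = "arn:aws:iam::324311890426:user/oxylabs.s3.uploader"

def build_bucket_policy(bucket_name: str) -> str:
    """Render the bucket policy above for a concrete bucket name."""
    bucket_arn = f"arn:aws:s3:::{bucket_name}"
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Lets us determine the bucket's location.
                "Effect": "Allow",
                "Principal": {"AWS": OXYLABS_UPLOADER},
                "Action": "s3:GetBucketLocation",
                "Resource": bucket_arn,
            },
            {
                # Lets us write objects and grant you access to them.
                "Effect": "Allow",
                "Principal": {"AWS": OXYLABS_UPLOADER},
                "Action": ["s3:PutObject", "s3:PutObjectAcl"],
                "Resource": f"{bucket_arn}/*",
            },
        ],
    }
    return json.dumps(policy, indent=4)

# Save the output to policy.json, then apply it with the AWS CLI:
#   aws s3api put-bucket-policy --bucket my-bucket --policy file://policy.json
print(build_bucket_policy("my-bucket"))
```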

Alibaba Cloud Object Storage Service (OSS)

The payload below makes Web Scraper API scrape https://example.com and upload the result to an Alibaba OSS bucket.

{
    "source": "universal",
    "query": "https://example.com",
    "storage_type": "s3_compatible",
    "storage_url": "https://ACCESS_KEY_ID:ACCESS_KEY_SECRET@BUCKET_NAME.oss-REGION.aliyuncs.com/FOLDER_NAME"
}

Forming the Storage URL

Storage URL format:

https://ACCESS_KEY_ID:ACCESS_KEY_SECRET@BUCKET_NAME.oss-REGION.aliyuncs.com/FOLDER_NAME

Currently, we cannot upload to the root of the bucket. Please provide a specific folder name for your uploads.

You will find the BUCKET_NAME and oss-REGION of your bucket in the bucket overview of the OSS console.

Creating the Access Key and Secret

Here we will create the ACCESS_KEY_ID and ACCESS_KEY_SECRET for using the S3-compatible interface to Alibaba OSS. For more information, see How to use Amazon S3 SDKs to access OSS in the Alibaba Cloud documentation.

  1. Open the AccessKey management page from your account menu.

  2. Log on to the RAM console using an Alibaba Cloud account or a RAM user with administrative rights.

  3. In the left-side navigation pane, choose Identities > Users.

  4. On the Users page, click Create User and choose the AccessKey access mode for the RAM user.

  5. Grant permissions to the RAM user. The newly created RAM user has no permissions. You must grant AliyunOSSFullAccess permissions to the RAM user. Then, the RAM user can access the required Alibaba Cloud resources. For more information, see Grant permissions to RAM users.

  6. When permissions are granted, go back to the "Authentication" section and, in the "Access Key" part, pick "Create AccessKey". Choose to create the AccessKey for a "Third Party" service. You will see an AccessKey ID and AccessKey Secret, which you can then use in your requests.

Alibaba OSS Rate Limits

When doing concurrent uploads to Alibaba OSS, it is possible to hit their account or bucket rate limits, causing the uploads to time out.

In this case, please contact Alibaba OSS support to increase your OSS rate limits.

Other S3-compatible storage

If you'd like to get your results delivered to an S3-compatible storage location, you will have to include your bucket's ACCESS_KEY:SECRET auth string in the storage_url value in the payload:

{
    "source": "universal",
    "query": "https://example.com",
    "storage_type": "s3_compatible",
    "storage_url": "https://ACCESS_KEY:[email protected]/my-videos"
}
