To export your Authoritas SEO platform data into Google's Big Query service, you will need to complete the following steps. Once the export is set up, you can run complex SQL queries on large datasets, or use the data as the basis for a dashboard you're creating in Google Data Studio or other visualisation tools.
We will shortly be publishing a series of powerful Google Data Studio reports which utilise these Big Query datasets to help you build and customise a beautiful set of SEO reports in no time at all. The first iteration of these reports will be published in early June 2021.
N.B.: This is currently available free of charge only to paying monthly or annual subscribers on the Business Expert package and to our paying Agency Providers and Agency Partners.
If you have a legacy Big Query integration, please speak to your client manager about migrating to this new, enhanced data feed. Migration is possible, but it is essential that you create a new Big Query project and Service Key and upload it to the platform, or you risk overwriting your existing data feed.
What types of SEO platform data can be exported into Big Query?
We have recently added a new Big Query integration option which allows you to export the following data from your Authoritas platform projects into a single, dedicated Big Query project that you create by following the instructions set out below.
Keyword Ranking Data (including tags)
Share of Search
Google Search Console
Google Analytics
Website crawl
Backlink data
How often is the data updated in Big Query?
Every time a new event is created on the platform, we sync this activity with Big Query as soon as possible. This means that if you create Google Data Studio reports or other visualisations using the data exported from the platform, you can be confident that the data is up to date.
Getting Started - How do I create a Big Query Project and Service Key?
N.B.: You only need to create one Big Query project and Service Key!
We will send all your platform data to this single project, but each platform project will get its own dataset containing tables with identical names, making it easy to copy a single Google Data Studio report and connect it to the Big Query data you want. Each platform project's dataset will contain the tables illustrated below.
The data structure in your Big Query instance will be like this:
Big Query Project > Data Set > Tables
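If you are comfortable on the command line, this Project > Dataset > Tables hierarchy can be browsed with Google's bq CLI. This is just an illustrative sketch: the project ID `my-seo-project`, dataset `my_site_dataset` and table `keyword_rankings` are placeholder names, not the actual names the platform creates.

```shell
# List the datasets in your Big Query project (one per Authoritas project):
bq ls --project_id=my-seo-project

# List the tables inside one dataset:
bq ls my-seo-project:my_site_dataset

# Query a table by its fully qualified project.dataset.table name:
bq query --use_legacy_sql=false \
  'SELECT * FROM `my-seo-project.my_site_dataset.keyword_rankings` LIMIT 10'
```

The same fully qualified `project.dataset.table` name is what you will point Google Data Studio's Big Query connector at.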
(You are responsible for paying your own Google Cloud Platform costs. At the time of writing, the first 1 TB of data processed with BigQuery each month is free - but please check Google's latest pricing for Google Big Query).
1) Log in to your Google Console
Log in to your Google account, then go to https://console.cloud.google.com/
2) Enable Google Big Query Billing
This is essential: you must enable billing on your Google Big Query account for the selected project and add an up-to-date credit card (and keep it up to date).
The integration will not work correctly or at all without billing enabled. We stream data to Big Query and this is not a feature on the free Big Query plan. Furthermore, data is deleted automatically by Google after 60 days on free tier accounts.
Please note, this does not necessarily mean you will start incurring additional costs straightaway as Google offers a reasonably generous free allowance.
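If you prefer the command line to the Console's billing pages, recent versions of the gcloud CLI can link a billing account to a project. This is a hedged sketch: the project ID and billing account ID below are placeholders, and older gcloud releases expose these commands under `gcloud beta billing` instead.

```shell
# Find the ID of the billing account you want to use:
gcloud billing accounts list

# Link your Big Query project to that billing account:
gcloud billing projects link my-seo-project \
  --billing-account=0X0X0X-0X0X0X-0X0X0X
```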
3) Create a Service Account
Go to IAM & Admin > Service Accounts and follow the wizard
3 a)
Just provide a name and (optionally) a description. Google will automatically create the Service account ID:
3 b)
Assign the service account you've just created the role of 'BigQuery User':
3 c) Grant users access to this service account
This is an optional step. You can just click on the DONE button here.
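Steps 3a and 3b can also be done with the gcloud CLI rather than the Console wizard. The sketch below assumes placeholder names: `authoritas-bq` for the service account and `my-seo-project` for your project ID; substitute your own.

```shell
# 3a) Create the service account; Google derives the account's
#     email-style ID from the name you give it:
gcloud iam service-accounts create authoritas-bq \
  --display-name="Authoritas Big Query export" \
  --project=my-seo-project

# 3b) Grant the new account the 'BigQuery User' role on the project:
gcloud projects add-iam-policy-binding my-seo-project \
  --member="serviceAccount:authoritas-bq@my-seo-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.user"
```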
4) Create a key file
From the Service Accounts view (IAM & Admin > Service Accounts), click on the Actions column on the right-hand side and select 'Create key'.
This will ask if you want to create a JSON file or a P12 file:
Leave 'JSON' selected and click on 'CREATE'. This will download a JSON file to your machine.
If you now click on IAM & Admin > IAM, you should see your service account listed as a Member with the 'BigQuery User' role assigned to it. If you don't see this (highlighted in the orange box below), you will need to add a Member manually, making sure the Member name matches the service account and that the 'BigQuery User' role is assigned to it:
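The key creation in step 4 and the IAM check above also have CLI equivalents. Again a sketch with placeholder names (`authoritas-bq`, `my-seo-project`, and the key file name are all hypothetical):

```shell
# Create and download a JSON key for the service account:
gcloud iam service-accounts keys create authoritas-bq-key.json \
  --iam-account=authoritas-bq@my-seo-project.iam.gserviceaccount.com

# Confirm the member holds the 'BigQuery User' role in the project's
# IAM policy (should list your service account):
gcloud projects get-iam-policy my-seo-project \
  --flatten="bindings[].members" \
  --filter="bindings.role:roles/bigquery.user" \
  --format="table(bindings.members)"
```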
5) Add Big Query integration to one or more projects in your Authoritas account
Log in to Authoritas, go to Settings > Integrations and click on 'Google Big Query':
Select one or more Authoritas projects to sync with Big Query:
6) Upload the JSON file
Finally, just upload the JSON key file you created in Step 4. The platform will then send your project data over to Big Query. You will then be able to take a template in Google Data Studio and populate it with your own data by changing the template's data source to point at your Big Query instance.