You will need:

  • A service account with permissions to access the specified dataset and table
  • A base64 encoded service account key in JSON format

Service account

We recommend creating a dedicated service account with permissions to create a table in your BigQuery dataset and write data to it, so the Validio platform can egress data. Details on these permissions and roles can be found in the GCP documentation here.

The service account must be assigned the following role:

  • BigQuery Data Editor (roles/bigquery.dataEditor)

Service account key

  • Obtain a service account key in JSON format for the service account. GCP instructions can be found here
  • Encode the service account key in base64
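The encoding step above can be sketched in Python. The key contents below are a stand-in for illustration; a real downloaded key file contains more fields:

```python
import base64
import json

# Stand-in for the contents of the downloaded key file;
# real service account keys contain more fields than this.
sample_key = json.dumps({"type": "service_account", "project_id": "weather-forecast"})

# Base64 encode the key for pasting into the Credentials field.
encoded_key = base64.b64encode(sample_key.encode()).decode("ascii")
print(encoded_key)

# To encode a real key file instead (path is an assumption):
# with open("key.json", "rb") as f:
#     encoded_key = base64.b64encode(f.read()).decode("ascii")
```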

BigQuery destination configuration parameters

The following fields can be configured when setting up a BigQuery destination connector:

Field        Required  Notes                                                                         Example
Name                   Identifier for the connector; used when setting up pipelines.                 East coast weather forecast - validated
Credentials            Base64 encoded form of the service account key in JSON format.
Project id             ID of the BigQuery project; you can find it in the GCP console.               weather-forecast
Dataset id             Name of the dataset that contains the table.                                  east-coast-validated
Table name             Name of the new table created when the destination connector is initialized.  train-data-validated
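Before saving the configuration, it can help to confirm that the Credentials value decodes to valid JSON. A minimal check, assuming the encoded key is held in a string:

```python
import base64
import json

def is_valid_credentials(credentials: str) -> bool:
    """Return True if the string is base64 whose decoded payload parses as JSON."""
    try:
        payload = base64.b64decode(credentials, validate=True)
        json.loads(payload)
        return True
    except ValueError:  # covers both base64 and JSON decoding errors
        return False

# A well-formed value passes; a raw (unencoded) key does not.
good = base64.b64encode(b'{"type": "service_account"}').decode("ascii")
print(is_valid_credentials(good))                           # True
print(is_valid_credentials('{"type": "service_account"}'))  # False
```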

BigQuery destination table structure

Column           Data type  Description
time             timestamp  The time the data point was egressed.
is_anomaly       bool       Flag indicating whether the data point was filtered out as an anomaly.
applied_filters  string     JSON-formatted list of applied filter IDs and their anomaly flags.
datapoint        string     The original data point in JSON format.
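When reading rows back from the destination table, the applied_filters and datapoint columns are JSON strings and need to be decoded. A sketch using a hand-written sample row (the field names inside the JSON payloads are illustrative, not taken from a real egress):

```python
import json

# An illustrative row shaped like the destination table above;
# the filter id and datapoint values are made up for this example.
row = {
    "time": "2023-05-01T12:00:00Z",
    "is_anomaly": True,
    "applied_filters": json.dumps([{"filter_id": "f-123", "is_anomaly": True}]),
    "datapoint": json.dumps({"temperature": -40.2, "station": "JFK"}),
}

# applied_filters and datapoint are stored as JSON strings, so decode them.
filters = json.loads(row["applied_filters"])
datapoint = json.loads(row["datapoint"])

print(filters[0]["filter_id"])       # f-123
print(datapoint["temperature"])      # -40.2
```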