Summary
Data you can sync
CRM data (including mapping columns to both standard and custom fields on models)
Time-series usage data
Sync direction
CRM data: specify per model and field; choose between both directions, send, or receive
Time-series usage data: BigQuery to Planhat
Sync frequency
CRM data: hourly
Time-series usage data: every 5, 10, 30 or 60 minutes, or daily on a schedule
Data mapping
See table below
Who is this article for?
Anyone who would like an introduction to the BigQuery integration
Series
This article is part of a series on the BigQuery integration:
Setting up the BigQuery integration - an overview ⬅️ You are here
Preparing BigQuery and connecting the integration - part 1 of the main instructions
Data mapping in the BigQuery integration - part 2 of the main instructions
Article contents
Introduction
Summary of setup steps
Sync frequency
Unique IDs
Logs and errors
Next steps
Introduction
Planhat's BigQuery integration enables you to sync CRM (static) data bidirectionally, and time-series usage data from BigQuery to Planhat. Data is synced automatically, on a fast, regular cadence.
With your BigQuery data in Planhat, you can leverage Planhat's workflow platform to take data-driven action, to drive long-term value for your customers and improve internal efficiency. In addition, you can consolidate BigQuery data with additional data types, such as tickets or emails, to give your team a complete 360° view of your customers. You can also quickly and easily perform advanced data analysis and visualise your BigQuery data, helping your team proactively identify risks and opportunities to retain and grow customer revenue.
Here's a summary of some key specifications related to the BigQuery integration; we discuss these in further detail in this article.
| | CRM data | Time-series usage data |
| --- | --- | --- |
| Automatic sync frequency | Hourly | Your choice: every 5, 10, 30 or 60 minutes, or once per day at a specified time |
| ID (key) requirements | Unique and text | Unique, numeric and incrementing |
Data mapping
| Planhat | Sync direction | BigQuery |
| --- | --- | --- |
| CRM data: Planhat models (Company, End User, License and Opportunity etc.) | Bidirectional (or one-way)* | Tables or views |
| Time-series usage data (User Activities and Custom Metrics) | From BigQuery to Planhat | Tables or views |
*Note: Data cannot be synced from Planhat back to BigQuery views, as they are read-only.
Time-series usage data is applicable for the models Company, End User, Asset and Project.
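To make the mapping table above more concrete, here is a minimal, purely illustrative sketch (using the google-cloud-bigquery Python client) of how the two kinds of source table typically differ on the BigQuery side: CRM-style data keyed by a unique text ID, and time-series usage data keyed by a unique, numeric, incrementing ID. All project, dataset, table and column names here are hypothetical - substitute your own schema.

```python
from google.cloud import bigquery

# Hypothetical project name - replace with your own Google Cloud project.
# Assumes a dataset called planhat_sync already exists.
client = bigquery.Client(project="my-gcp-project")

# CRM-style data: one row per record, matched against Planhat on a unique *text* key.
client.query("""
CREATE TABLE IF NOT EXISTS planhat_sync.companies (
  external_id STRING NOT NULL,   -- unique text ID used to match records with Planhat
  name        STRING,
  industry    STRING,
  arr         NUMERIC
)
""").result()

# Time-series usage data: append-only rows, identified by a unique, numeric, incrementing key.
client.query("""
CREATE TABLE IF NOT EXISTS planhat_sync.usage_events (
  event_seq   INT64 NOT NULL,    -- unique, numeric, incrementing key (also used for change tracking)
  external_id STRING,            -- ID of the related record (e.g. a Company or End User)
  activity    STRING,
  event_time  TIMESTAMP
)
""").result()
```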
Summary of setup steps
There are two main stages to setting up the BigQuery integration:
Preparation in BigQuery and integration authentication
For details, see this article
A. In BigQuery
Create a new project in Google Cloud
Set up OAuth consent screen
Begin to configure credentials
B. In Planhat
Open BigQuery
Copy redirect URI
C. In BigQuery
Generate credentials (using URI from Planhat)
D. In Planhat
Enter BigQuery credentials and authenticate
Enable the integration
Select your BigQuery database
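Before moving on to the data mapping stage, you can optionally sanity-check (outside of Planhat) that the project and dataset you intend to sync are reachable. The sketch below assumes the google-cloud-bigquery Python package and local Google Cloud credentials (e.g. via `gcloud auth application-default login`) - this is separate from the OAuth credentials used by the integration itself, and the project/dataset names are placeholders.

```python
from google.cloud import bigquery

PROJECT_ID = "my-gcp-project"   # hypothetical - your Google Cloud project
DATASET_ID = "planhat_sync"     # hypothetical - the dataset you plan to sync

client = bigquery.Client(project=PROJECT_ID)
dataset = client.get_dataset(DATASET_ID)   # raises NotFound if the dataset doesn't exist

print(f"Dataset {dataset.full_dataset_id} is reachable; it contains:")
for table in client.list_tables(dataset):
    print(f" - {table.table_id} ({table.table_type})")   # TABLE or VIEW
```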
Data mapping in Planhat's BigQuery integration
For details, see this article
The next stage is configuring the mapping in the BigQuery integration in Planhat, for CRM and/or usage data, depending on what you want to sync.
The main principles for each sync section are:
Choose which data model/type in Planhat to map to which table or view in BigQuery (one way to prepare a view to map against is sketched after this list)
For CRM data, choose the sync direction
Specify the unique IDs (keys) used in Planhat and BigQuery
Set up field mapping
For usage data, choose the sync frequency and batch size
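As a purely optional illustration of the first principle above: if the column names in your raw BigQuery table don't line up neatly with the Planhat fields you plan to map, you can expose a tidier view and map against that instead. All names below (project, datasets, table, view and columns) are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")   # hypothetical project

# Create a view whose column names line up with the Planhat fields you plan to map.
client.query("""
CREATE OR REPLACE VIEW planhat_sync.companies_for_planhat AS
SELECT
  acct_uuid           AS external_id,   -- the unique text key to select on the BigQuery side
  acct_display_name   AS name,
  annual_contract_usd AS arr
FROM raw_crm.accounts
""").result()
```

Bear in mind the note in the data mapping table above: because views are read-only, a view like this can only feed data into Planhat - Planhat cannot sync data back to it.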
Sync frequency
CRM data
CRM data can be synced automatically, both to and from BigQuery, on an hourly basis.
To enable the automatic sync, turn on the BigQuery automatic synchronization toggle switch at the top of the integration.
You also have the option to sync manually via the "Full sync" button, or using the arrow symbols at the right of each sync section.
The BigQuery sync is always a full data sync (rather than a partial sync).
Usage data
You can set Planhat to fetch usage data from BigQuery every 5, 10, 30, or 60 minutes, or daily at a specified time.
Unique IDs
For both CRM data and usage data, the integration uses unique IDs for data mapping. You may hear these referred to as keys, keyables, IDs or identifiers.
The requirements for these unique IDs are slightly different for CRM data compared to usage data.
CRM data
When configuring CRM data mapping, for each sync section (Planhat data model), you select an ID for both the Planhat side and the BigQuery side, which is used to match records between the two systems.
The ID on the Planhat side is typically Source ID or External ID, although you can use email address in some cases, and it's also possible to select the Planhat native ID (although generally this is not recommended - see here for further details). These are fields on the Planhat models, containing a unique ID to specify that particular record.
For the ID on the BigQuery side, select a suitable column from the dropdown menu. The ID/key must be unique and text.
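If you're unsure whether a candidate BigQuery column satisfies those requirements, a quick check along the following lines can help. This is a hedged sketch using the google-cloud-bigquery Python client; the project, dataset, table and column names (`planhat_sync.companies`, `external_id`) are placeholders for your own schema.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")   # hypothetical project

# 1. The key column should be text (STRING in BigQuery).
table = client.get_table("my-gcp-project.planhat_sync.companies")
key_field = next(f for f in table.schema if f.name == "external_id")
print(f"{key_field.name} is of type {key_field.field_type}")   # expect STRING

# 2. The key column should be unique (no duplicate values).
duplicates = client.query("""
SELECT external_id, COUNT(*) AS n
FROM planhat_sync.companies
GROUP BY external_id
HAVING n > 1
""").result()
print(f"Duplicate keys found: {duplicates.total_rows}")        # expect 0
```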
Usage data
When it comes to the one-way sync of usage data (time-series data), there are some differences:
There isn't an ID on the Planhat side, only the BigQuery side
The ID in BigQuery is also used for change tracking - i.e. to identify which rows are new data (that haven't been synced yet)
Rows in BigQuery must be identified by a unique, numeric, incrementing column for the ID (key).
Incrementing means the value increases with each new row added, so that rows sort from oldest to most recent
An ideal key would be an ingestion sequence number or timestamp - e.g. 1, 2, ..., 2104, ...
It should be a maximum of 32 characters
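To check a candidate usage-data key against these requirements before mapping it, a pre-flight query along these lines can be useful. Again, this is a hedged sketch with hypothetical table and column names (`planhat_sync.usage_events`, `event_seq`, `event_time`), and it assumes `event_time` reflects the order in which rows were ingested.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")   # hypothetical project

# Checks that the candidate key is unique, increases with ingestion order,
# and stays within 32 characters when written out.
rows = client.query("""
SELECT
  COUNT(*)                                        AS total_rows,
  COUNT(DISTINCT event_seq)                       AS distinct_keys,      -- should equal total_rows
  COUNTIF(event_seq <= prev_seq)                  AS out_of_order_rows,  -- should be 0
  COUNTIF(LENGTH(CAST(event_seq AS STRING)) > 32) AS too_long_keys       -- should be 0
FROM (
  SELECT event_seq,
         LAG(event_seq) OVER (ORDER BY event_time) AS prev_seq
  FROM planhat_sync.usage_events
)
""").result()

for row in rows:
    print(f"rows={row.total_rows}, distinct keys={row.distinct_keys}, "
          f"out of order={row.out_of_order_rows}, too long={row.too_long_keys}")
```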
Logs and errors
If you follow the instructions in the next articles in this series, the setup process should be smooth and easy.
If there are any errors with authorisation or sync (for example, if your BigQuery token has expired), they are surfaced in the BigQuery integration.
There are also detailed logs available within the integration, so you can see what actions have occurred and when.
This means it's easy for you to investigate and resolve any issues. Your TAM/CSM and our Support team would be happy to help if you require any additional advice.
Next steps
If you're ready to start setting up the BigQuery integration, part 1 of the setup guide is here.
[Tag for search purposes: Big Query]