Connect to PostgreSQL

Learn how to set up the PostgreSQL integration. We cover what you can sync, how to do it, and some best practices

Written by Christian Dreyer
Updated over a week ago

Planhat's integration with PostgreSQL allows you to push usage data to Planhat seamlessly: you can send time-series data to Planhat as user activities or custom metrics. This guide walks through the key requirements of the PostgreSQL connection and how to configure it. Note that the integration is not intended for CRM data.

📌 Important to Know

In order to push usage data from PostgreSQL, any relevant table must have a unique, incrementing, numeric key so that Planhat can identify rows that have not yet been synced.

For the column that contains the key:

  1. Each row must have a unique value.

  2. The key must be incrementing (increasing in value for each new row added, and limited to a maximum of 32 characters) so that rows sort from least to most recent.

An ideal key is an ingestion timestamp or a sequence number, e.g. 1, 2, 3, ..., 2104, ....
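
If an existing table lacks such a key, one way to add one is a PostgreSQL identity column. This is a minimal sketch, assuming a hypothetical table; "my_events" and "sync_key" are illustrative names, not part of the integration itself:

    -- Add a unique, incrementing numeric key to an existing table so that
    -- Planhat can sort rows from least to most recent and resume syncing
    -- from the last key value it has already processed.
    ALTER TABLE my_events
        ADD COLUMN sync_key BIGINT GENERATED ALWAYS AS IDENTITY;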


Structuring Your Data

The format of the table you map to will vary depending on whether you are ingesting user activities, custom metrics, or both. Beyond the incrementing-key requirement mentioned above, there are several fields that can be mapped, and the details of how each table can be structured are shown in the templates below:

  • User activities - metrics that relate to an end-user (e.g. login, feature click).

  • Custom metrics - often aggregated metrics (total logins for the day, total feature clicks). They can relate to companies, assets, projects, or end-users.

More details on this are in the Field Mappings section below.
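
As a rough sketch of how such tables might look (all table and column names here are illustrative assumptions; the linked templates define the exact expected format):

    -- Hypothetical user activity table: one row per end-user event.
    CREATE TABLE user_activities (
        id         BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY, -- incrementing key
        email      TEXT,               -- or an externalId column for the end-user
        action     TEXT,               -- name of the event, e.g. 'login'
        weight     INTEGER DEFAULT 1,  -- optional multiplier (see the Quick tip below)
        created_at TIMESTAMPTZ DEFAULT now()
    );

    -- Hypothetical custom metrics table: one row per aggregated data point.
    CREATE TABLE custom_metrics (
        id           BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY, -- incrementing key
        dimension_id TEXT,     -- name of the metric, e.g. 'total_logins'
        value        NUMERIC,  -- value of the metric
        external_id  TEXT,     -- externalId of the company, asset, project, or end-user
        metric_date  DATE      -- date the metric should register on
    );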


Connection Setup

Ensure your PostgreSQL instance allows connections from external IPs. If you have a strict IP whitelist policy, contact support or your Technical Account Manager to obtain the Planhat IPs that will connect to your instance.

For the integration setup, you will need:

  • Hostname (e.g. 9.tcp.ngrouch.ai)

  • Port (e.g. 23493)

  • Username

  • Password

  • Database name

  • Schema

Once you have these values available, navigate to the Operations Module >> Integrations and find the PostgreSQL integration section (or use the search bar). Input the hostname and port into the corresponding fields.

Once you have done so, you can click on the blue icon, which will bring up a pop-up.

Input your username and password and click on “Set Up.” Now you are ready to turn the PostgreSQL integration toggle from “Off” to “On.” You can then input the database name, which will prompt the integration to load the available schemas into a drop-down list. Choose the applicable schema from the list.

When you have selected the schema, you can click on "Add New Section", which will present the field configuration section.


Mapping the Data

  • Data type:

    • Select the data type that you wish to sync to: user activities (metrics relating to a specific end-user’s actions) or custom metrics (often aggregated data on the company, asset, project, or end-user level)

  • Sync direction:

    • Set the direction of the sync. You can choose between “Not syncing” and “Receive from provider.”

  • Model in PostgreSQL:

    • The table in your database that you wish to pull data from

  • Batch Size:

    • Choose the batch size of the requests.

    • For instance, if your PostgreSQL table contains 100,000 rows, and your batch size is 10,000, Planhat will issue 10 separate sync requests to your database.

  • Run once in:

    • Set the sync frequency. This parameter determines how often Planhat will fetch data from your PostgreSQL database, with options ranging from every 5 minutes to a specific hour each day:

      • 5 minutes

      • 10 minutes

      • 30 minutes

      • 1 hour

      • day at: a specific hour of the day between 00:00 and 23:00

  • Key in PostgreSQL:

    • The column that contains the unique, incrementing key described in the “Important to Know” section above; Planhat uses it to track which rows have already been synced.

Once you have specified these details, the section configuration is complete.
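
Conceptually, each batched request then behaves like the sketch below. This is only an illustration, with hypothetical table and column names and an assumed last-synced key value; it is not necessarily the exact query Planhat issues:

    -- Fetch the next batch of unsynced rows, assuming the key column is "id",
    -- the last synced key value was 90000, and the batch size is 10,000.
    SELECT *
    FROM user_activities
    WHERE id > 90000   -- only rows added since the last sync point
    ORDER BY id ASC    -- least to most recent, per the key requirement
    LIMIT 10000;       -- the configured batch size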

Field Mappings

In the Field mappings section, click on "Map a Field" to bring up two input boxes. The left-hand side represents Planhat fields for the given metric, and the right-hand side represents the columns in your PostgreSQL table.

Choosing Fields

User Activities:

For user activities, the mandatory fields are:

  • Email OR externalId

For a full list of the available fields, you can go to this section of our API docs. Most companies use a combination of:

  • Email OR externalId → the email of the end-user or the externalId of the end-user.

  • Action → the name of the event

🚀 Quick tip: Please keep in mind that by default, every user activity counts as a single unit (1). However, if you need to assign a different value to each row, you can specify a 'weight' for that row in another column (see our user activity template). The specified 'weight' multiplies the default value (1), effectively transforming the row's count to match the 'weight' value.
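
For example, using the hypothetical user_activities sketch from earlier:

    -- Each row normally counts as 1. A weight of 5 makes that row count
    -- as 5 (the default 1 multiplied by the weight). Names are illustrative.
    INSERT INTO user_activities (email, action, weight) VALUES
        ('ada@example.com', 'login', 1),            -- counts as 1 activity
        ('ada@example.com', 'rows_processed', 5);   -- counts as 5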

Custom Metrics:

For custom metrics, the mandatory fields are:

  • DimensionId → the name of the metric

  • Value → the value of the metric

  • ExternalId → the externalId of the company, asset, project, or end-user

For a full list of the available fields, you can go to this section of our API docs. Most companies use all available fields:

  • DimensionId

  • Value

  • ExternalId

  • Date - the date that you want the metric to register on.

  • Model - this is the model on which the metrics should land. This does not need to be specified in your table as we set this value in the configuration section. The available models are: Company, Enduser, Project, Asset.

For more details on custom metrics, you can go to this section of our API docs.
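
To illustrate how aggregated rows like these might be produced, here is a hedged sketch that rolls daily login counts from the hypothetical user_activities table up into the custom_metrics table sketched earlier:

    -- Aggregate yesterday's logins per end-user into a daily custom metric.
    -- All table and column names are assumptions carried over from the
    -- earlier sketches.
    INSERT INTO custom_metrics (dimension_id, value, external_id, metric_date)
    SELECT 'total_logins',
           COUNT(*),
           email,                -- used here as the end-user's externalId
           CURRENT_DATE - 1
    FROM user_activities
    WHERE action = 'login'
      AND created_at::date = CURRENT_DATE - 1
    GROUP BY email;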

Save and Sync

Now that you have configured the field mapping, you can click on “Save Section.” To add a new section, you would once again click "Add New Section" and configure the fields.

When you are happy with the mapping, confirm that the arrow is green - this means that the sync is active and will run at the frequency (RUN ONCE IN) specified in the configuration settings.

There is no need to press the red sync button unless you are looking to re-fetch data from the last sync point. Please check with your TAM before doing this.


Verification of Ingestion

Once the sync has run, you can see the logs by clicking on "show logs." The logs show whether the sync was successful, the ingestion date, and so on. At this point, you can also navigate to the metric tab in the Data Module and click on “Refresh list.” This will force the newly ingested metrics to surface, which otherwise could take a couple of minutes.


Querying Your Data (for super admins)

If for any reason you are having trouble with the integration, you can ask your Technical Account Manager to query your data to see what might be the issue. Pressing the <> to the left of the blue “Refresh fields” button will bring up a dedicated SQL Query Interface. This feature is designed for super administrators to simplify troubleshooting and eliminate the need to request developer assistance.
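
For example, a super admin might run a quick check like the sketch below in the query interface to see whether any rows exist past a given sync point (the table, column, and key value are hypothetical):

    -- Count rows not yet picked up, assuming the last synced key value
    -- was 90000; a result of 0 means there is nothing new to fetch.
    SELECT COUNT(*) AS unsynced_rows,
           MAX(id)  AS latest_key
    FROM user_activities
    WHERE id > 90000;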
