The Flow PostgreSQL Consumer lets you push Flow Measure data into a PostgreSQL database by specifying your own queries to insert into or update tables in that database. This is useful when integrating with other systems that require staging tables, or when running other analytics/BI tools on top of Flow data.
Measure data, model attributes, comments, limit exceptions, user details, and event context can be automatically inserted into a separate PostgreSQL database with its own custom schema.
Configuring a PostgreSQL Consumer
To create a connection using a PostgreSQL Consumer, right-click on Data Consumers, and select “New”, “Postgres”, then “PostgreSQL Database”.
Select the “PostgreSQL Database” consumer to open its editor:
To successfully push data to a PostgreSQL Consumer, the following properties need to be set.
| Property | Description |
| --- | --- |
| Name | Name to identify the consumer. |
| Server | Location of the PostgreSQL database (server name or IP address). |
| Database | Name of the database to be used. |
| Port | Port number the PostgreSQL instance is running on. The default is port 5432. |
| Username | Set to use PostgreSQL authentication. Leave blank for Windows authentication. |
| Password | Set to use PostgreSQL authentication. Leave blank for Windows authentication. |
| Connection Timeout | Time allowed when attempting to connect to the PostgreSQL database. The default is 15 seconds. |
| Definition File | Use a definition file to populate the insert queries. |
| Max Connections | Maximum number of concurrent connections allowed against the PostgreSQL database. |
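On the database side, the consumer needs an existing database and a login role matching the Server, Database, Username and Password properties above. A minimal preparation sketch, assuming you create a dedicated role and database (the names used here are examples only, not prescribed by Flow):

```sql
-- Example preparation on the PostgreSQL server (names are illustrative).
CREATE ROLE flow_consumer WITH LOGIN PASSWORD 'change_me';
CREATE DATABASE flow_staging OWNER flow_consumer;
```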
After configuring the properties, test the connection by clicking the “Test” button. A pop-up will either confirm the connection or display an error message describing the issue that needs to be rectified.
Configure a Measure to Replicate
From your Flow model, select and open the configuration of a measure that you want to replicate to the configured PostgreSQL data source. Dragging the data consumer onto the specific context to be replicated will create four sample queries that need to be modified to reflect the schema of the database that you are pushing data to.
Note: Both Calendar and Event Context can be pushed to a PostgreSQL consumer.
The four data items that can be pushed to the PostgreSQL database are:
| Data Item | Description |
| --- | --- |
| Values | Configure the query that pushes measure values to a specified schema. Depending on the context, either measure values or event period values can be consumed. |
| Comments | Configure the query that pushes comments added to measure values to the PostgreSQL database. When a new comment is added (even against historical measure values), this query is executed, allowing you to update the database with the comments. |
| Exceptions | Configure the query that pushes limit exceptions to the PostgreSQL database. Exceptions are generated when a measure's value for a time period falls outside the specified High and/or Low limit values. |
| Users | Details of users that either edited measure values or added comments can be used in the consumer queries. |
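These queries assume that the target tables already exist in the PostgreSQL database. As a rough sketch only, a simple staging schema for values, comments and exceptions might look like the following; the table and column names are assumptions and should be replaced by your own schema:

```sql
-- Illustrative staging tables the consumer queries could target.
CREATE TABLE measure_values (
    measure_name text        NOT NULL,
    period_start timestamptz NOT NULL,
    period_end   timestamptz NOT NULL,
    value        numeric,
    PRIMARY KEY (measure_name, period_start)
);

CREATE TABLE measure_comments (
    measure_name text        NOT NULL,
    period_start timestamptz NOT NULL,
    comment_text text        NOT NULL,
    commented_by text,
    commented_at timestamptz DEFAULT now()
);

CREATE TABLE measure_exceptions (
    measure_name text        NOT NULL,
    period_start timestamptz NOT NULL,
    value        numeric,
    limit_type   text,    -- e.g. 'High' or 'Low'
    limit_value  numeric
);
```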
Depending on the context selected for the particular measure, certain placeholders can be used in the PostgreSQL queries; these are replaced by values received from the Integration Engine before the query is executed against the database.
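For illustration, a Values query against the example staging table above might look like the sketch below. The placeholder tokens shown ({MeasureName}, {PeriodStart}, {PeriodEnd}, {Value}) are hypothetical stand-ins, not Flow's actual placeholder syntax; use the placeholder names documented in the article referenced below.

```sql
-- Hypothetical Values query; placeholder names are illustrative only.
INSERT INTO measure_values (measure_name, period_start, period_end, value)
VALUES ('{MeasureName}', '{PeriodStart}', '{PeriodEnd}', {Value})
ON CONFLICT (measure_name, period_start)
DO UPDATE SET value = EXCLUDED.value, period_end = EXCLUDED.period_end;
```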
Examples of the queries and placeholders for both Calendar and Event contexts can be found in "What Placeholders can I use when configuring a database-type Data Consumer?".