
Scale setup

The Scale setup for the Power BI connector shows how to connect Power BI Desktop to your Teneo solution and retrieve large amounts of data from a chosen log data source (LDS) by executing shared queries on your behalf. The connector is intended for a production setup with the option to scale, and uses a data store (Microsoft Azure SQL in this case) to hold the query results. This not only gives a smoother connector experience when dealing with large amounts of data, but is also more sustainable when working with Power BI. It also lets you combine the Conversational AI log data with other sources in your data lake and analyze it as part of your overall business.

[Animation: Power BI Scale setup]

Setup structure

This setup makes use of an extract, transform, load tool, Inquire ETL, to load data into Microsoft Azure SQL to be used in a data lake. Microsoft Power BI is then connected to Azure SQL to access that data and create reports.

[Diagram: Scale setup architecture]

This approach has the following strengths:

  • Scales well with both data and Power BI usage
  • Enables you to combine the Conversational AI log data with other sources in your data lake and lets you analyze it as part of your overall business
  • Allows you to use the data store of your choice (in these pages, we will use Microsoft Azure SQL, but you can easily use a different data store if preferred.)

Getting started

Prerequisites

These instructions assume you have a Power BI Premium license and a published solution. Alternatively, you can publish and use the prebuilt Longberry Baristas solution and queries included in your environment. The published bot should also have been in use, so that it has generated some log data worth analyzing.

Set up a SQL database in Azure

Start off by setting up an Azure SQL database, or request that one be set up for you, depending on your organization. You will need a username and password for a login that is allowed to import data into the database, together with the server URL and database name.
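
If you manage Azure yourself, the database can also be created from the command line. Below is a minimal sketch using the Azure CLI; the resource group, server, database, login names, and service tier are placeholder assumptions to replace with your own values.

shell

# Minimal sketch, assuming the Azure CLI is installed and you are logged in.
# All names and the password below are placeholders.
az sql server create \
  --name teneo-example \
  --resource-group my-resource-group \
  --location westeurope \
  --admin-user etl_user \
  --admin-password 'REPLACE_ME'

# Create the database on that server; pick a service tier that matches your data volume.
az sql db create \
  --resource-group my-resource-group \
  --server teneo-example \
  --name teneo-example \
  --service-objective S0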

Retrieving the Inquire ETL tool

Download Inquire ETL from GitHub. You will then need to set up a configuration file - in Java Properties format - which looks like this:

groovy

# The URL to Teneo Inquire, for example https://teamname.data.teneo.ai/teneo-inquire
inquireBackend=
# Username to reach Teneo Inquire, for example john.doe@example.com
inquireUser=
# Password to reach Teneo Inquire
inquirePassword=
# The name of your LDS, for example longberry_baristas_12345
lds=
# outputDir=xxxxx
# separator=json
# Timeout for the Inquire TQL queries in seconds
timeout=1440
#googleCredentialsPath=xxxxx
#googleCloudAppName=xxxxx
#googleSheetId=xxxxx
# The URL to the Azure SQL database, for example teneo-example.database.windows.net
azureServerName=
# The name of the Azure SQL database, for example teneo-example
azureDatabaseName=
# Username and password allowed to import data
azureUser=
azurePassword=

Once you've created the properties file, you can run Inquire ETL with the following command: java -jar inquire_etl-0.0.3-jar-with-dependencies.jar --config=YOUR_FILE.properties --azure_sql

Two more very useful options for Inquire ETL are --from and --to, which allow you to specify timestamps for the period you wish to import. For example, --from=2021-01-01T00:00:00Z imports data from midnight (UTC) of January 1st onwards.
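
A complete run that imports exactly one day of data could look like the sketch below; the dates are illustrative:

shell

# Import log data for January 1st, 2021 only (dates are illustrative).
java -jar inquire_etl-0.0.3-jar-with-dependencies.jar \
  --config=YOUR_FILE.properties \
  --azure_sql \
  --from=2021-01-01T00:00:00Z \
  --to=2021-01-02T00:00:00Z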

Inquire ETL will then retrieve the published queries of the Log Data Source (LDS), execute the queries, and store the results of the queries in Azure SQL.

It's a good idea to set this up as a cron or scheduled job to continuously feed data into your Azure SQL data lake. In those cases, remember the --from and --to parameters so that only the latest data is retrieved. In the beginning, you will also likely want to bootstrap your data lake with historical data; in that case, we recommend using the bootstrap shell script to load the data one day at a time, as this is faster. Both are sketched below.
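
As a minimal sketch, assuming a Linux host with GNU date: a nightly cron entry plus a day-by-day bootstrap loop might look like the following. The script path, schedule, and date range are illustrative assumptions, not part of Inquire ETL itself.

shell

# Illustrative crontab entry: run the import script at 03:00 every night.
# 0 3 * * * /opt/etl/run_inquire_etl.sh

# run_inquire_etl.sh: import only the previous day's data (GNU date assumed).
FROM=$(date -u -d "yesterday" +%Y-%m-%dT00:00:00Z)
TO=$(date -u +%Y-%m-%dT00:00:00Z)
java -jar inquire_etl-0.0.3-jar-with-dependencies.jar \
  --config=YOUR_FILE.properties --azure_sql \
  --from="$FROM" --to="$TO"

# Bootstrap sketch: load one month of historical data one day at a time.
d=2021-01-01
while [ "$d" != "2021-02-01" ]; do
  next=$(date -u -d "$d + 1 day" +%Y-%m-%d)
  java -jar inquire_etl-0.0.3-jar-with-dependencies.jar \
    --config=YOUR_FILE.properties --azure_sql \
    --from="${d}T00:00:00Z" --to="${next}T00:00:00Z"
  d=$next
done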

Publish queries

The solution needs to have published shared queries. There are a few things you need to do in order to be able to retrieve these in Power BI:

  1. While inside the solution, select the 'SOLUTION' button located in the top left corner. This will take you to the solution backstage.
  2. Select 'Optimization', followed by 'Log Data'.
  3. Expand the 'Manage' button and open up the relevant 'Log Data Source'.
  4. Write a query using Teneo Query Language (a simple example is shown after this list).
  5. While inside the same query, click the 'Share' button and give your query a name. This will make sure other developers who have access to the solution can use the same query without having to re-create it.
  6. As a last step, publish the query. Select the query in the 'Shared Queries' section on the left, then click the 'Publish' button. After pressing that button, another 'Publish' button will appear; click that one as well.
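
A shared query can be as simple as listing user inputs. The query below is just an illustration of the kind of query you might publish:

tql

listAll t.e.userInput as 'User Input'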

Access data in Power BI

Now you're ready to try the Power BI Desktop data connector for Teneo.

  1. Start off by opening your Power BI Desktop.
  2. In the 'Data' section, select the 'SQL Server' button. This will open a window in Power BI Desktop where you enter the information for your Azure server.
  3. Enter the relevant server host name. For example, teneo-powerbi-connector.database.windows.net.
  4. Leave the 'Data connectivity mode' on 'Import' and proceed by clicking on 'OK'.

You will now be greeted with a different window where you can fill in the credentials for the Azure SQL server you are trying to get data from.

  5. Continue by selecting one of the three methods to access your database. In this example, we will select the 'Database' method.
  6. Enter your credentials and click on the 'Connect' button. If the connection was set up successfully, Power BI will start to show your queries inside the Azure SQL server.

Congratulations, you have now made a connection between your Teneo solution and Power BI!

Build your own dashboards

The Teneo connector will transfer queries that are saved as 'Shared' to Power BI. The data retrieved from these queries is then used inside Power BI to create dashboards. In this step, we will demonstrate how you can quickly use the connector to build your first visual graph inside a dashboard.

  1. In the 'Fields' section on the right-hand side, expand one of the query tables and select any data.
  2. Once selected, you will see a table of data appear.
  3. With the data still selected, click on any appropriate visualization to create your first graph.

You have now created your first visualization inside Power BI with the Teneo connector!

Please note that the video above is only a demonstration; for a real use case, the selected queries should be applied to appropriate visualizations.

Add more queries

Once you have connected your Teneo solution and Power BI, the data can be refreshed to automatically include new log data from new conversations with your bot. If necessary, you can also add more queries after the initial data load. To do this, simply write and publish the new queries in your Log Data Source (LDS) and then follow the steps above for loading data into your Power BI document, this time including the new queries, which will be accessible as long as they are published.

Tuning performance

  • See Microsoft's blogs for Power BI best practices.
  • Use Power BI with a Premium license.
  • Make sure to use a suitably sized Azure SQL database.
  • Redesign your Teneo Query Language queries to aggregate transactional data to minutes, hours, or even days, depending on the balance between granularity and performance.

Teneo Query Language queries can be redesigned to aggregate transactional data by combining the catd transformer with prefix operators.

Consider the following query:

tql

listAll s.beginTime as 'Time', s.transactionCount as 'API Calls'

This query will list the number of API calls per session and can be rewritten to:

tql

listAll time, sum s.transactionCount as 'API Calls' : catd(pattern="yyyy-MM-dd'T'HH':00:00Z'") s.beginTime as time

This will aggregate the API Calls per hour.
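
Depending on the granularity you need, the same query can aggregate per day instead by shortening the date pattern, for example:

tql

listAll time, sum s.transactionCount as 'API Calls' : catd(pattern="yyyy-MM-dd") s.beginTime as time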
