Test your bot with Botium

While auto-testing in Teneo allows you to perform quality assurance during the development and maintenance of a Teneo solution, you may want to run end-to-end tests before you release your bot, to ensure that it behaves as intended, that the answers provided are as expected, and that the integrations work as anticipated.

Botium is a quality assurance framework specifically developed for regression testing of chatbots, including Teneo chatbots. Botium is a third-party product developed by Botium.ai.

Test projects in Botium

A Botium test project typically contains the following components:

  • A Chatbot, which in our case is the published Teneo engine that you will run the tests against
  • One or more Test Sets, which define properties used during testing and which contain...
  • One or more Test Cases, multi-step dialogues that the chatbot is supposed to follow

This page will guide you through setting up a Botium project and running tests against a Teneo chatbot.

Prerequisites

Sign up to Botium

You will need access to a Botium server, otherwise known as a "Botium Box". If you don't have access to Botium, you can sign up for a free trial on the Botium.ai website. Don't forget to mention that you are coming from Teneo.ai!

Published Teneo solution

You need to know the URL of the published Teneo solution that you want to test. If you haven't published your solution yet or don't know the URL of your bot, see Publish your bot to learn more.

Setup instructions

Make sure you've logged in to your Botium instance through your favorite browser before you follow the setup instructions.

Register a chatbot

First, we are going to connect a Teneo chatbot to the Botium instance.

  1. Go to 'Chatbots' in the menu to the left
  2. Then click on 'Register New Chatbot'
  3. Give the chatbot a name
  4. Choose the 'Botium connector for Teneo' from the 'Connector/Chatbot Technology' dropdown menu
  5. Add the published endpoint URL to the field 'Teneo chatbot endpoint URL'

In the settings tab of your chatbot configuration, you can specify static input parameters that will be included in every request Botium sends to your bot. For example, here you could specify a parameter that tells Teneo that the requests are coming from Botium. You can specify multiple static input parameters per chatbot.
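To make the mechanics concrete, here is a minimal Python sketch of how static parameters could be merged into an engine request. It assumes the Teneo engine's standard `userinput` request parameter; the endpoint URL and the `source` parameter name are illustrative examples, not Botium or Teneo built-ins.

```python
from urllib.parse import urlencode

# Hypothetical static parameters, as registered in the chatbot settings tab.
STATIC_PARAMS = {"source": "botium"}

def build_request_url(endpoint, user_text, extra_params=None):
    """Merge static and per-request parameters into an engine request URL.

    Assumes the engine accepts the user's text in a 'userinput' parameter.
    """
    params = {"userinput": user_text}
    params.update(STATIC_PARAMS)       # static parameters go on every request
    params.update(extra_params or {})  # per-message parameters, if any
    return endpoint + "?" + urlencode(params)

url = build_request_url("https://example.com/mybot/", "Hello!")
# the query string now carries both userinput and the static parameter
```

Because the static parameters are merged into every request, a Teneo solution can branch on them (for example, to suppress integrations while under test) without any change to the individual test cases.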

Create a new test set

Now we are going to set up a test set in which the test cases are going to be stored.

  1. Go to 'Test Sets' in the menu to the left
  2. Click 'Start Test Set From Scratch'
  3. Give the test set a name and an optional description
  4. Click 'Continue To Test Case Designer', to save the test set

Add a test case

To create a test case, we're going to use Botium's live chat functionality, which allows you to have a dialog with your chatbot directly from within Botium and save that dialog as a test case.

  1. Inside the test set, click on the 'Record Live Chat' button
  2. Choose the chatbot that you want to record the live chat with; note that 'Echo Bot' and 'I am Botium' are default chatbots in Botium
  3. Botium will automatically connect to the chatbot; once connected, Botium will notify you through a pop-up at the bottom of your screen
  4. Now, have a dialog with your chatbot
  5. When you are happy with your dialog, click on 'Save Test Case'
  6. Give the test case a name and click 'OK' to save the test case
  7. Once saved, a notification will pop up at the bottom of the screen telling you that the test case has been saved
  8. Click 'Cancel' to return to the test set

The record live chat functionality is available from multiple views in Botium; you can find it in:

  • Your chatbot configuration view
  • Test set view
  • Test project view

Set up a Test Project

Let's set up a test project that ties your chatbot, test set and test case together. We're going to use the 'Quickstart' wizard, which allows us to select which chatbot we want to test against and which test set(s) we want to bind to the test project.

Create a test project and select your Teneo chatbot

  1. Click on ‘Quickstart’ in the menu to the left
  2. Give the test project a name
  3. Click on 'Connect to Chatbot or enter new connection settings' and choose the chatbot that you connected to Botium before
  4. Click ‘NEXT’

Select test set(s)

In the second step, you'll assign the test set(s) that you want to bind to the test project.

  1. Untick the 'Connect to Chatbot or enter new connection settings'
  2. Go to the search field 'Select from registered Test Set(s)' and search for the test set that you created before and assign it
  3. Click 'NEXT'

Select test environments

In the last step, we'll leave everything at default, and click 'Save'.

You've now set up a test project that contains:

  • A Teneo chatbot that we can interact with and test
  • A test set in which we can store our test cases
  • A test case which is the dialog path that the chatbot is supposed to follow

Note that the test project is a static entity; you cannot add additional test sets after a test project has been set up.

Run your first test session

You can start testing your chatbot directly from the test project by clicking the 'Start Test Session Now' button. Once you've started the test session, Botium will present the test results. Here you can see which test cases failed and which passed. You can also expand the test cases to inspect the dialogs.

If you'd like to run additional tests, navigate back to your test project and click 'Start test session now'.

Interpret failed tests

Botium provides plenty of information when a test case fails:

  • The user input that was sent to the chatbot
  • The chatbot's response
  • An error message displaying which asserter failed and on what line the test case failed

Note that Botium aborts a test case as soon as it fails; it will then continue with the next test case, if available.

It's possible to get even more information per transaction by clicking on the <> symbol found next to the user inputs and the chatbot's responses. This opens the Botium code view, which shows in detail what was sent to and received from your chatbot.

Advanced Options

Most chatbots return more than just text responses. This could include links, parameters, or other important information that the front end requires. It's important that we can test all parts of a chatbot to make sure they are functioning as expected.

We will be using Botium's 'Source editor' to explain some of the advanced testing options in Botium. You can modify the source code of a test case by opening the test case and clicking 'Open in source editor' at the bottom of the test scenario.

The source code of a test case typically looks like this:

User wants to know the time

#me
Hello!

#bot
Hello. It's good to see you!

#me
What time is it?

#bot
My watch says it's time to get some exercise!

Send input parameters

To correctly test the behavior of our chatbots, we may have to send input parameters to see that the chatbot is reacting appropriately. We specify the input parameters in test cases, specifically under the user inputs (#me) like this:

#me
Hello!
UPDATE_CUSTOM QUERY_PARAM | callerCountry | GB

#bot
Hello. It's good to see you!

Let's inspect the part that is sending the input parameter:

The UPDATE_CUSTOM QUERY_PARAM function tells Botium to send an input parameter, specified as a key and value pair separated by pipes. If you want to send multiple input parameters with the same user input, create a new line for each input parameter, using the same syntax.

When Botium sends the request to Teneo, these input parameters are included in the engine request alongside the user input.
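As an illustration of the pipe-separated syntax, here is a small Python sketch that parses a logic hook line of this form into a key/value pair. This only mimics the format of the line; it is not Botium's actual implementation.

```python
def parse_query_param(line):
    """Parse an 'UPDATE_CUSTOM QUERY_PARAM | key | value' line into (key, value).

    Illustrative only: demonstrates the pipe-separated key/value format,
    not Botium's internal parsing.
    """
    prefix, key, value = (part.strip() for part in line.split("|"))
    if prefix != "UPDATE_CUSTOM QUERY_PARAM":
        raise ValueError("not an UPDATE_CUSTOM QUERY_PARAM line")
    return key, value

pair = parse_query_param("UPDATE_CUSTOM QUERY_PARAM | callerCountry | GB")
# → ('callerCountry', 'GB')
```

Each additional line under the same #me section would be parsed the same way, which is why one line per parameter is required.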

Test output parameters

Sometimes you might want to evaluate parameters that are included in the engine response JSON, like output parameters.

To test parameters included in the JSON response from Teneo, Botium offers a JSON path asserter. An example scenario that asserts the value of an output parameter called OUTPUT_NODE_IDS may look as follows:

#me
Hello!

#bot
Hello. It's good to see you!
JSON_PATH $.output.parameters.OUTPUT_NODE_IDS | 0581322c-e766-4b47-8de2-c4a1169a787d

Let's inspect the part that checks the JSON path:


To evaluate the returned JSON, you use the JSON_PATH function along with:

  • The path to the parameter you want to test
  • The value that you want to assert

Note that in the example test scenario above, Botium will assert both the bot's text response (Hello. It's good to see you!) and an output parameter called OUTPUT_NODE_IDS. If you'd prefer to just test a JSON path, you can omit the expected text response. If you want to test multiple JSON paths in the same engine response, you should create a new line for each path, using the same syntax.
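The lookup that JSON_PATH performs can be sketched in a few lines of Python. The response below is a trimmed, hypothetical engine response (real Teneo responses contain additional fields), and the helper only handles simple dotted paths; Botium itself supports the full JSONPath syntax.

```python
import json

# Trimmed, hypothetical engine response JSON; real responses carry more fields.
response = json.loads("""
{
  "output": {
    "text": "Hello. It's good to see you!",
    "parameters": {"OUTPUT_NODE_IDS": "0581322c-e766-4b47-8de2-c4a1169a787d"}
  }
}
""")

def json_path(doc, path):
    """Resolve a simple dotted path like '$.output.parameters.X'.

    Mimics what the JSON_PATH asserter checks; not a full JSONPath engine.
    """
    node = doc
    for key in path.lstrip("$.").split("."):  # walk the dict key by key
        node = node[key]
    return node

value = json_path(response, "$.output.parameters.OUTPUT_NODE_IDS")
# the asserter then compares this value against the one in the test scenario
```

If the resolved value does not equal the expected value from the test scenario, the asserter fails and Botium aborts the test case at that line.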

Check links

It's important to make sure that the links that your chatbot is returning are working as intended. To do this, you can use the link checker that's available in Botium.

#me
What drinks do you serve?

#bot
We serve everything from flat whites to espressos. Please visit https://longberrybaristas.ai/menu/ to see the full menu.
CHECKLINK 200

The CHECKLINK function validates URLs found both in the output text and in the output's URL field. When testing a URL, Botium asserts that the HTTP response code it receives matches the value specified in the test scenario.
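Conceptually, the link checker does two things: find the URLs in the bot's output, then compare each HTTP status code against the expected value. The sketch below illustrates both steps under that assumption; the regex and trailing-punctuation handling are simplifications, not Botium's actual logic.

```python
import re

URL_PATTERN = re.compile(r"https?://\S+")

def extract_urls(text):
    """Find URLs in a bot's text response, stripping trailing punctuation."""
    return [u.rstrip(".,!?") for u in URL_PATTERN.findall(text)]

def link_ok(status_code, expected=200):
    """Mimic the CHECKLINK assertion: does the observed HTTP status
    match the expected code from the test scenario?"""
    return status_code == expected

text = ("We serve everything from flat whites to espressos. "
        "Please visit https://longberrybaristas.ai/menu/ to see the full menu.")
urls = extract_urls(text)
# → ['https://longberrybaristas.ai/menu/']
```

In a real test run, Botium requests each extracted URL itself; the sketch leaves out the HTTP call so it stays self-contained.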

Other useful Botium features

Botium offers many more useful features for testing your bot and we encourage you to browse and explore Botium's own documentation to make the most of Botium.

Here is a list of some of the useful features that might be beneficial when you are composing tests for your Teneo chatbot:

  • Text matching mode: as output nodes can contain dynamic parts, you can change the text matching mode, for example, to allow the use of wildcards in test scenarios
  • Partial conversations: create a subset of a conversation that you can reuse in your test cases
  • Scripting memory: store a part of the bot's response in a scripting memory variable and reuse it later in the same test case
