Teneo and ChatGPT: Better Together for Conversational AI


Teneo and ChatGPT together elevate business interactions. Teneo’s SaaS solution lets users access ChatGPT through Microsoft Azure OpenAI services. This integration brings cutting-edge AI to their projects. In this piece, we’ll explore how these tools shape top-tier conversational AI.

Overview of ChatGPT

ChatGPT, from OpenAI’s GPT model family, is built on the Transformer neural network structure. It features multiple layers of self-attention mechanisms, enabling the model to grasp word relationships and produce cohesive text. Thanks to its vast training data exposure, it excels in tasks such as language translation, summarization, and answering questions.

What are the differences between the GPT models?

The GPT (Generative Pre-trained Transformer) models are a series of natural language processing AI models developed by OpenAI. They are designed for various tasks such as text generation, translation, summarization, and more. The main GPT models include:

GPT (Generative Pre-trained Transformer): The first GPT model was introduced in 2018 and featured 117 million parameters. It demonstrated strong language understanding capabilities but had limitations in terms of scalability and performance.

GPT-2 (Generative Pre-trained Transformer 2): Released in 2019, GPT-2 built upon the success of the original GPT model. It was significantly larger, with 1.5 billion parameters, and demonstrated improved performance in various language tasks. However, OpenAI initially withheld the full model due to concerns about potential misuse.

GPT-3 (Generative Pre-trained Transformer 3): Launched in 2020, GPT-3 is the third iteration in the series and represents a substantial leap forward in terms of size and capabilities. It boasts 175 billion parameters and has demonstrated impressive performance in a wide range of applications, including translation, summarization, and even code generation.

GPT-4 (Generative Pre-trained Transformer 4): Launched in March 2023, GPT-4 is a large multimodal model (accepting image and text inputs, emitting text outputs) that, while less capable than humans in many real-world scenarios, exhibits human-level performance on various professional and academic benchmarks. For example, it passes a simulated bar exam with a score around the top 10% of test takers; in contrast, GPT-3.5’s score was around the bottom 10%.

Remember that each GPT model also has smaller versions with fewer parameters, providing a trade-off between computational requirements and performance.

Teneo and GPT Integration Use Cases

GPT For General World Knowledge

Now, you can design a conversational flow that delivers GPT responses straight to your customers. This leverages the vast knowledge embedded in the large language model (LLM). It not only broadens your bot’s understanding of topics outside your project’s core scope but also saves you time in content creation and maintenance.

By positioning the Trigger of the GPT flow low in the Trigger Ordering, you also make sure that your business- or project-specific content is handled with preference. The example below demonstrates how this can be achieved.
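The routing idea can be sketched outside Teneo as a simple fallback routine. This is a hypothetical Python illustration: in a real solution the trigger ordering and the GPT call live in your Teneo flows and a GPT connector, and `ask_gpt` here is only a placeholder for the actual Azure OpenAI completion call.

```python
# Hypothetical sketch of a low-priority GPT fallback behind business flows.
# In Teneo this is modeled with flow triggers and Trigger Ordering; here the
# same idea is shown as plain Python.

def ask_gpt(user_input):
    # Placeholder for a chat-completion call to Azure OpenAI.
    return f"[GPT answer for: {user_input}]"

def answer(user_input, business_flows):
    """Try business-specific flows first; fall back to GPT only if none match."""
    for trigger, response in business_flows:
        if trigger(user_input):
            return response        # project-specific content wins
    return ask_gpt(user_input)     # GPT handles everything else

flows = [
    (lambda t: "opening hours" in t.lower(), "We are open 9-17, Mon-Fri."),
]

print(answer("What are your opening hours?", flows))  # business flow answers
print(answer("Who wrote Hamlet?", flows))             # falls through to GPT
```

The business flows are checked first, mirroring a low-priority position of the GPT flow in the Trigger Ordering.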

Answer Variation

You can instantly craft new answer variations by invoking the GPT API before replying to your user. By populating your output node with a variable, the response can be dynamically generated with each interaction. If you prefer not to make API calls to GPT for every interaction, Teneo already lets you add multiple answer variations to an output node, delivering them either randomly or in a specific sequence.

Generative AI can assist in brainstorming response phrasings. Just as you’d engage in a conversation with ChatGPT, you can prompt the API to offer varied phrasing options for the response, which you can then save within Teneo. This approach cuts down on latency and expenses when launching your solution, while also giving you control over the response’s tone and content.
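As a rough illustration of the brainstorming step, the sketch below builds a paraphrasing prompt and parses a line-by-line model reply. `variation_prompt` and `parse_variations` are hypothetical helpers, and the actual completion call to the GPT API is omitted.

```python
# Hypothetical helpers for brainstorming answer variations with a GPT model.
# Only prompt building and parsing of the reply are shown; the completion
# call itself is left out.

def variation_prompt(answer, n=3):
    """Ask the model for n rephrasings of an answer, one per line."""
    return (f"Rewrite the following chatbot answer in {n} different ways, "
            f"keeping the meaning and a friendly tone. "
            f"Return one variation per line.\n\nAnswer: {answer}")

def parse_variations(completion_text):
    """Strip list markers and blank lines from the model's reply."""
    return [line.strip(" -0123456789.")
            for line in completion_text.splitlines()
            if line.strip()]

reply = "1. Sure thing!\n2. Happy to help!\n3. Absolutely!"
print(parse_variations(reply))  # ['Sure thing!', 'Happy to help!', 'Absolutely!']
```

The parsed variations are what you would then save as alternative answers on an output node in Teneo.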

Tip: If you want to use this ‘ChatGPT style’ brainstorming inside the Tryout, just implement our GPT Connector solution; it comes ‘out of the box’. You may then want to restrict the trigger of this flow to a specific command and make it available only in Tryout via a Global Scripted Context.

Dialog Summarization

Prompt Engineering is key when you want to use GPT models for specific conversational AI tasks that are not directly related to answering a user input. We can easily create an additional instance of our GPT Helper tasked with writing summaries of the current conversation state.

You can call this additional GPT instance whenever needed, have it create a summary of the available conversation, and finally store the summary in a global variable inside your Teneo solution.

This can be done easily via a global Postprocessing or End Dialog script which updates the conversational context for the summary and calls the GPT Helper’s completion method. You can find the complete code to run this example in our GPT Example solution which you can download at the end of this page.
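The shape of such a script can be sketched in Python as follows. Here `complete` stands in for the GPT Helper’s completion method and `session` for a Teneo global variable; both are hypothetical stand-ins for the actual code in the downloadable example solution.

```python
# Hypothetical end-of-dialog summarization script.

def summary_prompt(transcript):
    """Turn (speaker, text) turns into a summarization prompt."""
    lines = "\n".join(f"{speaker}: {text}" for speaker, text in transcript)
    return "Summarize the following conversation in two sentences:\n" + lines

session = {}  # stands in for a Teneo global variable

def end_dialog(transcript, complete):
    """Called once per dialog; stores the GPT summary for later use."""
    session["dialog_summary"] = complete(summary_prompt(transcript))

# Stubbed completion function for demonstration purposes.
end_dialog(
    [("User", "I lost my card"), ("Bot", "I can block it for you")],
    lambda prompt: "User reported a lost card; the bot offered to block it.",
)
print(session["dialog_summary"])
```

Storing the summary in a global variable makes it available to later flows, for example a handover flow.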

Example Dialog:

A dialog summary can be useful, for example, in handovers to a human agent, as in your favorite Contact Center solution, OpenQuestion.

Sentiment Analysis

As in the previous example, you can create a GPT Helper instance to support you with the task of performing Sentiment Analysis on the conversation.

You can find the complete code to run this example in our GPT Example solution which you can download at the end of this page.
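A minimal sketch of the prompt-and-parse pattern for sentiment classification is shown below. The helper names are hypothetical and the completion call itself is omitted; the parser guards against free-form model replies by falling back to a neutral label.

```python
# Hypothetical prompt-and-parse pattern for GPT-based sentiment analysis.

LABELS = {"positive", "neutral", "negative"}

def sentiment_prompt(utterance):
    """Constrain the model to a single sentiment label."""
    return ("Classify the sentiment of the user message as exactly one of: "
            "positive, neutral, negative.\n"
            f"Message: {utterance}\nSentiment:")

def parse_sentiment(completion_text):
    """Normalize the model reply; fall back to 'neutral' if unparseable."""
    label = completion_text.strip().lower().rstrip(".")
    return label if label in LABELS else "neutral"

print(parse_sentiment(" Positive."))   # -> positive
print(parse_sentiment("hard to say"))  # -> neutral
```

Restricting the output to a fixed label set keeps the result easy to act on inside a flow.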

Dialog Example:

Identified Sentiment:

Tip: You can do Sentiment Analysis for six languages directly within Teneo. Check out all info here: Sentiment & Intensity Analysis | Reference documentation | Teneo Developers.

ML (Machine Learning) Data Via Tryout

We discussed a use case on Data Augmentation in one of our previous articles, and GPT models can also help here. The approach is the same as for Answer Variation above: you can use Tryout to get sample utterances for your dataset.

Using GPT to generate data should only be considered as a starting point for your data gathering. Real user inputs are the source of truth here. These can be easily added to your model with Teneo’s Optimization functionalities.


This example illustrates how GPT can help generate data for your ML model. It provides sample utterances around a Book a Flight intent. The rest is then just a copy-and-paste task to add the data to your Class Manager.
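The generation-and-cleanup step can be sketched like this. The helpers are hypothetical: `raw` stands in for a GPT completion, and the cleaned utterances are what you would paste into the Class Manager.

```python
# Hypothetical sketch of GPT-assisted data augmentation for an intent class.
# Cleaning removes duplicates and utterances already in the training data.

def augmentation_prompt(intent, n=10):
    """Ask the model for n candidate utterances for an intent."""
    return (f"Generate {n} short, varied user utterances expressing the "
            f"intent '{intent}'. One utterance per line, no numbering.")

def clean_utterances(raw, existing):
    """Deduplicate generated utterances against themselves and the dataset."""
    seen = {u.lower() for u in existing}
    cleaned = []
    for line in raw.splitlines():
        utterance = line.strip().strip('"')
        if utterance and utterance.lower() not in seen:
            seen.add(utterance.lower())
            cleaned.append(utterance)
    return cleaned

raw = '"Book me a flight to Paris"\nI need a flight\nI need a flight\n'
print(clean_utterances(raw, existing=["I want to book a flight"]))
```

As noted above, generated data is only a starting point; real user inputs remain the source of truth.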

GPT & Generative AI – Opportunities

By adding a GPT model to your Teneo project, you can expand your bot with general world knowledge at a global scale, while focusing on the creation of the specific business-related flows of your project. You can also automate several functionalities used in a typical Conversational AI project, as we have seen for Dialog Summaries and Sentiment Analysis.

GPT & Generative AI – Things To Think About

Adding Large Language Models (LLMs) and Generative AI to your project can challenge your control over the bot’s responses to end users (your clients). Many projects even bring in UX Designers to ensure the quality and tone of the answers align with the company’s image. After all, a bot represents your company too. Meeting these requirements with Generative AI is currently challenging.

Then there is also the issue of made-up answers by LLMs, known as hallucinations. Basically, an LLM can derive answers that are not based on facts, not even within its own training data. Fact checking and the avoidance of hallucinations have since become an active research area. Read more in articles here by Bret Kinsella and here by Cobus Greyling.

Responsible AI

The idea behind the Teneo SaaS offer is to provide you with all you need to build a state-of-the-art Conversational AI project. At the same time, we’d like to fulfill our vision of responsible AI usage. Here you can read an article with thoughts around the ethical considerations in conversational AI applications. The security of your infrastructure and the data of your users is always a priority for us. The usage of Generative AI is no exception.

A powerful tool needs to be used in the correct way in order for it to be useful. Before using OpenAI services, make sure to check what kind of data you will be sending over to the service. Furthermore, check whether sharing this data with another service is compliant with your company’s regulations.

You can find an overview of data privacy in Azure OpenAI here. Watch Microsoft’s AI Show episode Being Responsible with Generative AI, which goes through the topic in an enjoyable conversation between Sarah Bird, Microsoft’s Responsible AI expert, and the always brilliant Seth Juarez:

For example, it could be that you need to use Teneo to anonymize personal information in the data before sending it to the service. Learn more about how to anonymize data in Teneo here.
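As a simplified illustration of the idea, the sketch below masks e-mail addresses and phone numbers with regular expressions before text leaves your infrastructure. This is not Teneo’s anonymization mechanism, and the patterns are intentionally naive; production anonymization needs much more care.

```python
# Naive anonymization sketch: mask e-mail addresses and phone numbers
# before sending text to an external service.
import re

PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<EMAIL>"),
    (re.compile(r"\+?\d[\d\s-]{7,}\d"), "<PHONE>"),
]

def anonymize(text):
    """Replace each matched pattern with a placeholder token."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

print(anonymize("Call +46 70 123 45 67 or mail jane.doe@example.com"))
# -> Call <PHONE> or mail <EMAIL>
```

The placeholders keep the message structure intact, so the downstream model can still reason about the text without seeing the personal data.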

The seamless integration of GPT with Teneo opens up new possibilities for businesses to leverage the power of advanced conversational AI. By combining these technologies responsibly and effectively, businesses can create state-of-the-art conversational AI projects that improve customer service and enhance user experiences, whether by expanding world knowledge, providing dynamic answer variations, or generating data for machine learning. Teneo and ChatGPT are indeed better together.


