The Teneo platform is built from a set of core modules.
Features such as our unique hybrid approach, combining linguistic and machine-learning methods, are used throughout the modules to overcome the challenges of developing conversational AI applications and to ensure rapid development timescales.
Teneo also includes pre-built resources that enable you to easily expand your chatbot’s capabilities, such as integrating with back-end RPA processes or making your web-based virtual assistant available on Facebook Messenger.
Teneo is a fully fledged, mature and proven natural language platform that covers the full lifecycle of designing, developing, deploying, running, learning, reporting and optimizing highly sophisticated NLI solutions. It is highly visual, productive and collaborative.
The Teneo Platform is a powerful, highly visual and collaborative integrated development environment (IDE) to design and develop Natural Language Interaction (NLI) solutions.
In addition, if the developer already possesses data (such as chat or voice transcripts), Teneo Discovery, another part of the wider Teneo Suite of products, can help provide an overview of what that data contains. This includes identifying what customers are discussing and how they express themselves, finding the typical domain vocabulary of the data, deciding which knowledge areas to focus on in the NLI application, and identifying which customer-specific knowledge needs to be built. This knowledge can then be exported and fed directly into the Teneo Platform.
Whether the developer is already experienced in the platform or not, Teneo allows developers to play around, try new ideas, and work on prototypes, labs, and proofs of concepts. Once the scope of a solution has been defined, Studio can be used to start a more formal implementation process.
In Teneo Studio, a solution is never started from scratch; out of the box, Teneo already provides two different types of language resources: the TLRs, which cover the generic language model and contain the most common words and phrases in the chosen language, and the TDRs, which cover generic knowledge areas to ensure human-like conversations. Teneo has developed resources for 35 different languages.
These generic resources can be used without restriction, so only the language content specific to the project needs to be built. Indeed, the system allows users to create their own language resources, template solutions, script libraries, etc. All of this knowledge and functionality can be shared between teams and departments or reused in different projects.
Moreover, thanks to its modular design, custom or project-specific pre-processors or third-party NLP tools may be added to the Input Processing chain (e.g. part-of-speech taggers, or date and time recognizers). These tools are executed on every input and can add additional information to be used in the NLU rules (annotations) or as part of a hybrid matching process. This modularity also allows Artificial Solutions to quickly expand the Platform to support new languages or extend the existing language support with new functionality – using any number of different technologies.
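The idea of a modular chain of pre-processors, each annotating every input, can be sketched in plain Python. This is an illustrative toy only, not Teneo's actual Input Processing API: the class, function and annotation names below are all invented for the example.

```python
# Illustrative sketch of a modular input-processing chain.
# All names here are hypothetical; they do not reflect Teneo's real API.
from dataclasses import dataclass, field

@dataclass
class ProcessedInput:
    text: str
    annotations: dict = field(default_factory=dict)  # annotation name -> value

def pos_tagger(inp):
    """Toy pre-processor: attach a crude part-of-speech tag per word."""
    inp.annotations["POS"] = ["NOUN" if w[0].isupper() else "OTHER"
                              for w in inp.text.split()]
    return inp

def date_recognizer(inp):
    """Toy pre-processor: flag inputs that mention a weekday."""
    weekdays = {"monday", "tuesday", "wednesday", "thursday",
                "friday", "saturday", "sunday"}
    if any(w.lower().strip(".,!?") in weekdays for w in inp.text.split()):
        inp.annotations["DATE_MENTIONED"] = True
    return inp

def run_chain(text, processors):
    """Run every processor on the input, in order, accumulating annotations."""
    inp = ProcessedInput(text)
    for proc in processors:
        inp = proc(inp)
    return inp

result = run_chain("Book a table for Friday", [pos_tagger, date_recognizer])
```

Downstream matching rules could then consult `result.annotations` alongside the raw text, which is the essence of how annotations feed a hybrid matching process.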
When developing an NLI solution within Studio, the whole team, or even multiple teams, can work at the same time on the same solution, or, of course, different solutions.
Developing with Teneo’s visual editor offers a number of benefits.
Thanks to Master-Local, it is very easy to extend a project into a multilingual or multi-domain solution. This feature makes rolling out the solution to multiple targets straightforward: a central Master is kept, and only the parts that need to differ are changed in the Locals, ensuring consistent coverage across all targets. Multimodal behavior can also be supported within the same solution, enabling recognition of inputs via speech, text, touch and gesture all within the same dialogue.
In many situations, the user already possesses data (for instance, a support knowledge base or an existing product catalogue) to be included in the solution; this data could be inputs that need to be matched, or lists of words and concepts to recognize. Using the Bulk Import feature, all available data can be converted to a CSV-based format and uploaded into Studio directly, allowing the developer to immediately create Machine Learned Intent Classes, Entities, Flows, Syntax Conditions and Language Objects to cover this knowledge.
Bulk Import can be executed at any point in the lifecycle, so it is possible to start the development of the NLI solution without having this data ready and add it as soon as it becomes available – or to use Bulk Import in a later phase of the project to extend an existing Production solution to cover new areas.
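As a rough sketch of the idea, the snippet below builds and re-reads a small CSV of solution content using Python's standard `csv` module. The column layout, type names and content are hypothetical, invented for this example; they are not the actual Bulk Import schema, for which the product documentation should be consulted.

```python
# Hypothetical sketch of CSV-based content for bulk import.
# The columns and type names are illustrative only, not Teneo's real schema.
import csv
import io

rows = [
    ["type",            "name",          "examples"],
    ["intent",          "OPENING_HOURS", "when are you open; what are your opening hours"],
    ["entity",          "CITY",          "london; barcelona; stockholm"],
    ["language_object", "GREETING.PHR",  "hi; hello; hey there"],
]

# Write the rows out as CSV text (in memory here, a file in practice).
buffer = io.StringIO()
csv.writer(buffer).writerows(rows)

# Read it back the way an import step might, grouping names by content type.
by_type = {}
for record in csv.DictReader(io.StringIO(buffer.getvalue())):
    by_type.setdefault(record["type"], []).append(record["name"])
```

Grouping by a type column like this illustrates how one upload can fan out into several kinds of solution objects (intents, entities, language objects) in a single pass.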
As previously mentioned, where this data is not already structured, Teneo Discovery can be used to gain an understanding of it; the results can then be exported in Bulk Import format for direct ingestion into Teneo Studio to create functional solution content.
During the creation of the solution, and all through the editing process, Try out allows users to see the behavior of the solution, simply by sending inputs and seeing whether the responses are correct. Additionally, Try out helps to debug the solution, providing a deep dive into the Engine state and behavior and allowing users to easily see what is going on and why. Among other things, Try out displays information regarding how Engine is splitting the input into words and sentences, with which Intents the input was classified, which annotations are added where, which path was taken through the flows and even what values were assigned to the different variables and metadata. All this is intended to give the user all the information they need to enhance and ensure the behavior of their solution.
On the other hand, Auto-test provides a tool to ensure the continued coverage of matching within the solution: by testing every example the developer has provided, Auto-test validates that the user will end up in the right place. This means not only that the example is covered by its intent, but also that another intent is not going to “steal” it; once again, detail and warnings are provided about all the intents that would match, not only the one that is finally triggered. Thus, potential conflicts are identifiable before they become real ones, and it is easier to get a clear picture of the actual solution scope.
Throughout this process the system proactively checks through the solution as it is modified, looking for potential improvements or things that the user might want to verify as correct. It is important to highlight that the user remains ultimately responsible for deciding whether to make a change to resolve an Improvement Suggestion or simply to acknowledge it. In other words, the user can always choose to leave the solution as it is and ignore the suggestions, which can then be hidden from the general list, or to apply them; acknowledged suggestions can always be retrieved later for re-analysis.
In order to better understand the end user’s experience, Teneo allows users to publish the implemented solution to a Development publication environment and use the solution in situ with a real interface; for example, the development team may connect using the mobile app and check the behavior of the mobile interface themselves. By using a specific, separate environment, dev testing can be carried out without affecting the live deployment at all, enabling an iterative development process. Note that the platform can support several publishing environments for different purposes, or a single environment with multiple servers if needed, either on premises or in the cloud.
Still within the IDE, it is also possible to publish to a QA publication environment (the platform can support several of these too) and make the work available to a wider audience, perhaps for a round of user acceptance testing, demonstrations, or beta testing. Thus, the development team can continue working on the solution, extending it with the next set of features or fixing any issues found in the testing, all without affecting the QA or Production deployment, so the iterative development cycle can continue alongside the validation and acceptance process.
Once the solution has passed the QA process and is ready to go live, the solution is published to the Runtime Production environment (again as with Dev and QA, the platform supports multiple environments simultaneously). Quality Control then enables the user to publish to a Production environment the exact solution that was tested on the QA environment, ensuring that the go live happens with exactly the same versions and resources as those tested. Thus, the end users get the high level of experience expected and validated during testing.
Once the NLI solution is running live on a Teneo Engine (or group of engines, since Teneo Engine is field-proven in both scalability and robustness), end users can enjoy human-like conversations in a solution that has been proven performant and reliable, ensuring the high level of end-user experience envisaged while designing and developing it.
Publishing a solution is not the end of the process, however; it is just the start of the cycle to maintain the highly intelligent and capable solution, as intended. With a working solution as a starting point, this is the phase to ensure that the solution behaves as required and expected when running in the real world, and to expand and cover new areas. To achieve this, all the end user interactions will be logged, and then the Teneo Analytics suite – Teneo Inquire and Teneo Discovery – may be used to gather, interpret and act on this data.
The generated data is held within the company and is immediately accessible to different business units, providing a better understanding of customer attitudes around products and services. Thanks to the Analytics Suite, a Teneo user is able to perform both scheduled and ad-hoc reporting; to analyze, identify and monitor long-term trends as well as rapidly emerging issues or requirements; and to discover concepts and concept associations in the interactions held with end users.
Moreover, Teneo not only supports but enforces an example- and data-driven implementation approach: the analytics tools help to analyze multiple natural language data streams and quickly gain insight into key customer queries and priorities; to use data to confidently identify and prioritize the most prolific issues; and to detect knowledge needs and areas for improvement, thus defining the next implementation scope... that is, back to Studio development! In the end, NLI applications are built on quantified facts about end-user requirements.
Exploring logged data can meet a variety of goals and needs, all of which are covered by the Teneo development cycle.
This section describes the set of products that make up the Teneo suite.
As a Teneo User, I create and maintain my content within the development environment through Studio. This content is known as a “Solution”, which collects together all the data, interfacing with other components of the platform (Learn, Inquire, Engine, Manager) to support the user in defining the behavior for the end user. Multiple Solutions can be created to define different applications, investigate new ideas or for any other reason.
Using a single desktop interface, I will create, try out (preview), test, maintain and deploy (to dev, QA and production) my solution.
As an End User of the solution, I use an interface (Mobile App, Chat Bot, Web) which communicates with the Engine in the runtime environment.
Engine is responsible for runtime execution of the solution as defined within Studio. Engine is also responsible for generating the log files which can be used by Teneo users to analyze and maintain the behavior of the solution. These log files are interpreted by Teneo Inquire.
Inquire provides an API through which the log data can be queried using the bespoke Teneo Query Language (TQL). Queries can be executed to build a report, import into a BI tool or populate a custom dashboard, and they can be run from any code capable of sending JSON queries and handling JSON responses. Pre-built clients are available for some common languages, simplifying the connection via an object in code rather than building the queries directly.
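As a sketch of what a raw (non-client) call might look like, the snippet below assembles a JSON query payload with Python's standard library. The endpoint URL, parameter names and the TQL query text are all assumptions made for illustration; they are not the documented Inquire API, and a pre-built client would normally hide these details.

```python
# Illustrative sketch: the URL, field names and query string are hypothetical,
# not the real Inquire API. A pre-built TQL client would replace this code.
import json
from urllib import request  # stdlib HTTP client

TQL_QUERY = "listAll s.id"                         # placeholder TQL text
INQUIRE_URL = "https://inquire.example.com/query"  # hypothetical endpoint

# Inquire expects and returns JSON, so the query is wrapped in a JSON body.
payload = json.dumps({"query": TQL_QUERY, "timeout": 30}).encode("utf-8")
req = request.Request(
    INQUIRE_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)
# response = request.urlopen(req)         # network call omitted in this sketch
# results = json.loads(response.read())   # responses arrive as JSON

decoded = json.loads(payload)  # round-trip check of the outgoing body
```

The same JSON body could equally be posted from any other language or BI tool's HTTP connector, which is the point of exposing the queries over a JSON API.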