This Java Connector connects Microsoft Teams to a Teneo-built Virtual Assistant (VA) so the Teams messenger acts as a frontend to the Teneo engine (the VA backend). This way, users can chat via Teams with a Teneo engine instead of a real person. One instance of this connector can serve multiple users talking to one published Teneo engine simultaneously.
Teneo Microsoft Teams Connector is a standalone Java console application (an executable `.jar` file). It communicates with the Microsoft Bot API, so make sure no firewalls or similar restrictions prevent this communication.
The functioning of the connector is illustrated in the following diagram:
The sequence of the steps depicted in the diagram is as follows:
1. Users submit their messages in the Teams messenger.
2. The user's input is submitted to the connector by the Microsoft Teams backend. The connector creates a `TurnContext` object (an instance of `com.microsoft.bot.builder.TurnContext`) for the given user interaction and generates a so-called bridge session ID (BSID, an instance of `com.artificialsolutions.teamsconnector.BridgeSessionId`), consisting of the account's object ID within Azure Active Directory (AAD) and the channel ID for the user or bot on this channel. The connector then checks, via its singleton bridge object (an instance of `com.artificialsolutions.teamsconnector.TeneoBot`), whether a session (an instance of `com.artificialsolutions.teamsconnector.TeneoBot.BridgeSession`) identified by this BSID already exists. If no such session exists, it is created and its timeout countdown is started; if it already exists, it is returned and its timeout countdown is restarted. Each session object has its own instance of the Teneo engine client (`com.artificialsolutions.teneoengine.TeneoEngineClient`) to talk to the Teneo engine.
3. The connector submits the user input to the Teneo engine client associated with the current session.
4. The Teneo engine client forwards the user's message to the Teneo engine in a POST request, maintaining its own session with the Teneo engine.
5. The Teneo engine client receives the Virtual Assistant's answer in the Teneo engine's response.
6. The Virtual Assistant's answer is passed to the `TurnContext` object obtained at step 2.
7. The `TurnContext` object submits the answer to the Microsoft Bot API.
8. The answer is displayed to the user.
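The session handling in steps 2 and 3 above can be sketched roughly as follows. This is a simplified illustration only, not the connector's actual code: apart from the class names mentioned above, all names and the timeout mechanism shown here are hypothetical.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Simplified sketch of the bridge's session lookup (steps 2-3 above).
// Class members and names are illustrative, not the connector's real API.
public class BridgeSketch {

    // A bridge session ID: the AAD object ID plus the channel ID, used as a map key.
    record BridgeSessionId(String aadObjectId, String channelId) {}

    static class BridgeSession {
        final BridgeSessionId id;
        long lastAccessMillis; // the timeout countdown is restarted on each access
        BridgeSession(BridgeSessionId id) {
            this.id = id;
            this.lastAccessMillis = System.currentTimeMillis();
        }
    }

    private final Map<BridgeSessionId, BridgeSession> sessions = new ConcurrentHashMap<>();

    // Returns the existing session for this BSID, or creates a new one.
    BridgeSession getOrCreateSession(BridgeSessionId bsid) {
        BridgeSession s = sessions.computeIfAbsent(bsid, BridgeSession::new);
        s.lastAccessMillis = System.currentTimeMillis(); // restart the timeout countdown
        return s;
    }

    public static void main(String[] args) {
        BridgeSketch bridge = new BridgeSketch();
        BridgeSessionId bsid = new BridgeSessionId("aad-object-id", "msteams-channel-id");
        BridgeSession first = bridge.getOrCreateSession(bsid);
        BridgeSession second = bridge.getOrCreateSession(bsid);
        System.out.println(first == second); // true: the same session is reused
    }
}
```

Each session would hold its own Teneo engine client, so parallel users map to parallel, independent engine sessions.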
Your bot needs to be published and you need to know the Engine URL.
To run the connector you need Java (JDK or JRE) version 17 or higher.
```
git clone https://github.com/artificialsolutions/teneoteamsconnector.git
```
The application is configured in the `application.properties` file, found in the folder `src\main\resources` of the source code. The following configuration properties are supported:
- `server.port` - The port on which the connector is available on localhost (for example, `3978`)
- `MicrosoftAppType` - By default, `multitenant` for the Java client
- `MicrosoftAppId` - Can be found in the bot configuration as 'Microsoft App ID'
- `MicrosoftAppPassword` - Can be found under the 'Certificates & Secrets' section; it is the value of the new client secret you generate (not the Secret ID)
- `MicrosoftTenantId` - Can be found under 'Azure Active Directory' on your Azure Portal
- `microsoft.graph.request.params` - The user-related request parameters to be added to Teneo engine requests, such as `PreferredName` or `Surname`, as per https://learn.microsoft.com/en-us/powershell/module/microsoft.graph.users/update-mguser. These parameters will be added to Teneo engine requests in camel case (for example `preferredName`, `surname`). The parameters to add should be separated by commas
- `teneo.engine.endpointUrl` - The Teneo engine URL for your bot
- `teneo.engine.connectTimeoutMillis` - The timeout for connecting to the Teneo engine, in milliseconds
- `teneo.engine.responseTimeoutMillis` - The timeout for waiting for Teneo engine responses, in milliseconds
- `bridge.sessionTimeoutMillis` - The timeout for the sessions created by the bridge, in milliseconds; it is recommended to set it slightly longer than the session timeout of the Teneo engine, which is normally 10 minutes (600 seconds, i.e. 600000 milliseconds)
- `bridge.maxParallelSessions` - The maximum number of simultaneous sessions for the bridge; this number can be kept high (tens of thousands), although not too high, since its purpose is to reduce the risk of the application running out of memory if the number of sessions grows too much
- `application.explicitData` - A Boolean value indicating whether some error and debug information should be added to the requests sent to the Teneo engine and displayed to users in Teams. This property is optional and defaults to `false`; it should only be set to `true` for testing and troubleshooting
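For orientation, a filled-in configuration might look like the sketch below. All IDs, secrets, and URLs are placeholder examples, not real values; substitute your own from the Azure Portal and your published Teneo engine.

```properties
# Example application.properties - all values are placeholders
server.port=3978

MicrosoftAppType=multitenant
MicrosoftAppId=00000000-0000-0000-0000-000000000000
MicrosoftAppPassword=your-client-secret-value
MicrosoftTenantId=00000000-0000-0000-0000-000000000000

# User-related Graph parameters to forward to Teneo engine requests
microsoft.graph.request.params=PreferredName,Surname

teneo.engine.endpointUrl=https://example.com/your-published-engine/
teneo.engine.connectTimeoutMillis=10000
teneo.engine.responseTimeoutMillis=20000

# Slightly longer than the Teneo engine session timeout (normally 10 minutes)
bridge.sessionTimeoutMillis=610000
bridge.maxParallelSessions=20000

application.explicitData=false
```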
Regarding the logger configuration (the file `log4j2.json` in the folder `src\main\resources` of the source code): to test the application, it is highly recommended to set the log level to `trace`. At such sensitive levels the logger may record some PII, like user BSIDs, user inputs, etc. It should therefore be set to a less sensitive level in production (`warn`, for example).
Go to the project folder in your console and execute the following:
```
mvn clean compile package
```
In your console, go to the folder containing your executable `.jar` file (normally the `target` folder inside your project folder; the executable will be named something like `asolteamsconnector-1.0.0.jar`) and execute it:

```
java -jar asolteamsconnector-1.0.0.jar
```
This sets up a service available via HTTP on your local host, on the port specified in the application's configuration file; its URL might look something like `http://localhost:3978`. Microsoft Azure must be able to access that service via HTTPS, so you have to make it publicly accessible. For demo or testing purposes you can use ngrok to create a public URL for a service running on your local host.
The requests received by the Teneo engine contain the following parameters:

- `userinput` - the input text (if the user submitted it)
Additionally, the request will contain all the parameters/values available via `com.microsoft.bot.schema.Activity.getValue()`, provided this call returns a `Map` instance. Moreover, the user-related parameters configured in `microsoft.graph.request.params` will also be added, in camel case (`givenName` instead of `GivenName`, `city` instead of `City`).
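The camel-casing described above amounts to lower-casing the first letter of the configured Graph parameter name. A minimal sketch of such a conversion (the class and method names here are hypothetical, not the connector's actual code):

```java
public class ParamNames {
    // Convert a Graph parameter name such as "GivenName" to the camel-cased
    // form ("givenName") used in Teneo engine requests. Illustrative helper only.
    static String toCamelCase(String name) {
        if (name == null || name.isEmpty()) return name;
        return Character.toLowerCase(name.charAt(0)) + name.substring(1);
    }

    public static void main(String[] args) {
        System.out.println(toCamelCase("GivenName")); // givenName
        System.out.println(toCamelCase("City"));      // city
    }
}
```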
The Virtual Assistant normally returns text as its answer, which is then displayed to the user in Teams. If adaptive cards should be returned, they should be placed in the output parameter `msbotframework` as well-formed JSON. Splitting answers into 'bubbles' is also supported via a dedicated output parameter.
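For illustration, a minimal Adaptive Card payload for the `msbotframework` output parameter might look like the sketch below. The card content is an invented example, and the `data` payload follows the button description given later on this page; your actual cards will differ.

```json
{
  "type": "AdaptiveCard",
  "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
  "version": "1.4",
  "body": [
    { "type": "TextBlock", "text": "Would you like to continue?", "size": "medium" }
  ],
  "actions": [
    {
      "type": "Action.Submit",
      "title": "Yes",
      "data": { "text": "Yes", "buttonChoice": "Yes" }
    }
  ]
}
```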
An Azure account with an active subscription is required. Create an account for free here.
Once created, go to the bot's configuration and add the connector's public URL in the field Messaging endpoint, in the format `https://YOUR-PUBLIC-DOMAIN/`, where `YOUR-PUBLIC-DOMAIN` is the public URL of your connector (the one provided by ngrok or similar).
A channel is how your bot communicates with your application, in this case Microsoft Teams.
- Select your bot in Azure. Under settings, click on Channels, then click on "Available Channels" and select Microsoft Teams.
- Read and Accept the Channel Publication Terms.
- Select "Microsoft Teams Commercial (most common)".
- Click "Apply".
- Once enabled, you can close this tab and the channel should be available under Channels along with Web Chat.
To open in Teams, hit the link under Actions, "Open in Teams". You will be prompted to allow Teams to be opened. Once you agree, you will see your bot in your Teams Chat panel.
Send a message through Microsoft Teams and your bot will respond to your chat!
Simple Output Parameters make it easy to enrich your conversations with, for example, buttons, videos, and cards in Microsoft Teams.
Microsoft Teams supports various rich message types via Adaptive Cards, allowing you to enrich your conversations with buttons, videos, cards, etc. There are different ways of adding these rich messages to outputs in Teneo Studio. This page focuses on an approach that makes it easy to add rich message types by adding a few helpful scripts to your solution. Once added, conversation designers in Studio can add rich messages by populating output parameters in a simplified and intuitive way. For example, attaching an image to an output using this approach is as simple as adding an output parameter with the URL of the image.
First, we'll prepare our solution by adding two scripts:
- Add the following lines of code at the bottom of the 'Post Processing' script in Global Scripts:
```
def msTeamsHelper = new MsTeamsHelper("1.5")
msTeamsHelper.createOutputParamters(_)
```
The argument of the MsTeamsHelper class indicates the Adaptive Card version number. You can omit it if you are not sure which version suits you best; the version number then defaults to 1.4, which should cover all the functionality needed. Click here for more information about the message types supported by the different Adaptive Card versions.
Once we've added the script, we can start adding output parameters with message type details to our outputs. We'll refer to these output parameters as simple output parameters going forward.
To display buttons in Teams using simple output parameters, you need to add an output parameter named teams_buttons and assign the button titles to the output parameter's value. Use the pipe symbol ('|') to separate the button titles.
The JSON for each button option generated by the simple output parameter approach consists of three properties:

- `type` - indicates the type of the action; always has the value `Action.Submit`
- `title` - represents the button text; gets its value from the output parameter
- `data` - contains the text and parameter to be sent to the Teneo engine when the button is clicked. The text has the same value as the title, while the parameter has the key `buttonChoice` and the title as its value
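The pipe-splitting and JSON generation described above could be sketched as follows. This is illustrative Java, not the actual MsTeamsHelper Groovy script; the `data` payload shape follows the property description above and the JSON is built with plain string concatenation for brevity.

```java
import java.util.ArrayList;
import java.util.List;

public class ButtonsSketch {
    // Turn a pipe-separated teams_buttons value such as "Yes|No|Maybe" into
    // one Action.Submit JSON object per title. Illustrative sketch only.
    static List<String> toSubmitActions(String teamsButtons) {
        List<String> actions = new ArrayList<>();
        for (String title : teamsButtons.split("\\|")) {
            title = title.trim();
            actions.add("{\"type\":\"Action.Submit\",\"title\":\"" + title
                    + "\",\"data\":{\"text\":\"" + title
                    + "\",\"buttonChoice\":\"" + title + "\"}}");
        }
        return actions;
    }

    public static void main(String[] args) {
        // Each title becomes one Action.Submit entry in the card's actions array
        toSubmitActions("Yes|No|Maybe").forEach(System.out::println);
    }
}
```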
Due to a limitation of Adaptive Cards, you can add at most 6 buttons to the same card. Using the ShowCard action, up to 11 buttons can be supported by folding the 6th to 11th buttons into a sub-card.
To display link buttons in Teams using simple output parameters, you need to add an output parameter named teams_links and assign the link button details (title and URL) to the output parameter's value. Use the pipe symbol ('|') to separate the link buttons and a comma to separate the title and the URL within each link button.
The JSON for each link button consists of three properties:

- `type` - indicates the type of the action; always has the value `Action.OpenUrl`
- `title` - represents the link button title; gets its value from the output parameter
- `url` - the URL that will be opened when the user clicks the button
As with the buttons, link buttons can carry other properties as well. If your bot requires additional information to be included in a link button, you should add the message type using the advanced approach.
To display a text block in Teams using simple output parameters, you need to add an output parameter named teams_text and assign the plain text to the output parameter's value. Additionally, you can indicate the size of the text (optional) by adding the pipe symbol ('|') and the size at the end of the text. The allowed values for the size are: small, medium, large and extraLarge. If you do not indicate the size, the size of the text in the Text Block will be set as “medium”.
The JSON for the Text Block generated by the simple output parameter approach consists of three properties:

- `type` - indicates the type of the message; always has the value `TextBlock`
- `text` - represents the text content; gets its value from the output parameter
- `size` - indicates the size of the text; defaults to `medium` if not provided
As with the buttons, a Text Block can carry other properties as well. If you need to put additional information in the Text Block, you should add the message type using the advanced approach.
To display an image in Teams using simple output parameters, you need to add an output parameter named teams_image and assign the image's URL to the output parameter's value. You can also add an optional alternative text after the URL, with a pipe symbol ('|') as the separator. If you do not provide it, the alternative text defaults to "This is an image".
The JSON for the image generated by the simple output parameter approach consists of three properties:

- `type` - indicates the type of the message; always has the value `Image`
- `url` - indicates the URL of the image; gets its value from the output parameter
- `altText` - contains the alternative text to be shown to the user when the image URL is broken
As with the buttons, an image can carry other properties as well. If you need to put additional information in an image, you should add the message type using the advanced approach.
Media playback is currently not supported in Adaptive Cards in Teams, but you can still create the JSON for audio and video via Simple Output Parameters, ready for when media files become supported in Teams in the future.
To add a video or audio file, you need to add an output parameter named teams_video and assign the URL of the media file to the output parameter's value. Note that for videos hosted on online video platforms (e.g. YouTube), you need to use the parameter name teams_webvideo instead. As with the image, you can also add an optional alternative text after the URL, with a pipe symbol ('|') as the separator; if you do not provide it, the alternative text defaults to "This is a media file". The JSON for the media file generated by the simple output parameter approach consists of three properties:

- `type` - indicates the type of the message; always has the value `Media`
- `source` - indicates the URL of the media file, which gets its value from the output parameter, and the `mimeType`, which is auto-detected from the file extension. For online video platforms, the `mimeType` is omitted
- `altText` - contains the alternative text to be shown to the user when the URL of the media file is broken
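For illustration, the generated media JSON might look like the sketch below. The `sources` array and its property names follow the standard Adaptive Cards `Media` element schema; the helper's exact output may differ, and the URL is a placeholder.

```json
{
  "type": "Media",
  "sources": [
    { "mimeType": "video/mp4", "url": "https://example.com/intro.mp4" }
  ],
  "altText": "This is a media file"
}
```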
To display a combined message in Teams containing several of the message types mentioned above, you can simply add the corresponding output parameters to the same output node, together with an optional additional parameter called teams_order to specify the order of the components.
In an Adaptive Card, Text Block, Image, and Media belong to the body property, while buttons and links belong to the actions property. The contents of the body are always shown before the contents of the actions; this ordering cannot be changed. You can only control the order within the body or within the actions via the output parameter teams_order.
The value of the teams_order parameter should contain the type names, separated by commas, in the order you want them, for example: `image,text,buttons,links`. This value indicates that the image goes before the text block within the body scope, while in the actions scope, buttons are shown before links. The supported type names in teams_order are:
| Type name in teams_order | Message type in JSON | Scope |
| --- | --- | --- |
| text | TextBlock | body |
| image | Image | body |
| media | Media | body (not supported by Teams yet) |
| buttons | Action.Submit | actions |
| links | Action.OpenUrl | actions |
As described in the Microsoft Bot Framework message type JSON specification, Teams looks for an output parameter msbotframework. The approach outlined on this page uses the MsTeamsHelper class which produces the correct JSON by converting the details provided in 'simple output parameters'. This simplified approach has some limitations. Some message types will be added with certain default values. For example, buttons will get the style 'default' and the send-back input will be the same as the title of the button.
In cases where more customization is needed, you can generate the JSON and populate the msbotframework output parameter using script nodes and flow variables. This gives you full control over the JSON that is added to the output parameter.