Code Integration - Advanced Concepts

This section covers advanced programming concepts related to integrating the Assistant into your app.

Asynchronous Action Handling

While handling a user journey, the app might not be able to determine the state or condition of the journey synchronously. For example, the app might open a new activity (or page) that fires a search request to the backend and then shows the results. The callback itself cannot know the result unless it blocks until the entire activity has loaded, which is undesirable and leads to clunky code.

The Slang programming model therefore provides a way for the app to asynchronously signal the app state and condition to the Assistant.

This is done using the "wait" and "notify" semantics. The process is straightforward:

  • The callback returns the app state WAITING to inform the Assistant that the app is not yet ready to report its state or condition. The Assistant goes into a "wait" state (and remains in the processing state even after the callback returns)

  • The app then calls the notifyAppState method whenever it is ready

Return the WAITING App State

The app should return WAITING from the user journey callback. This keeps the Assistant in the "processing" state (which is the state it was in when it invoked the callback).

Android Native
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // Fire an async search request
    // ...
    return SearchUserJourney.AppState.WAITING;
}
React Native
onSearch: async (searchInfo, searchUserJourney) => {
    // Fire an async search request
    // ...
    return SlangRetailAssistant.SearchAppState.WAITING;
},

Notify App State

The Assistant remains in the "processing" state until the app completes the transaction by notifying the Assistant of the next app state. To do so, the app needs access to the user journey object that was passed to the callback.

The app can retrieve the user journey object via the global helper method getLastSearchUserJourney and then call notifyAppState on that object.

Android Native
SearchUserJourney userJourney = SlangRetailAssistant.getLastSearchUserJourney();
userJourney.setSearchSuccess();
userJourney.notifyAppState(SearchUserJourney.AppState.SEARCH_RESULTS);
React Native
const userJourney = SlangRetailAssistant.getLastSearchUserJourney();
userJourney.setSearchSuccess();
userJourney.notifyAppState(SlangRetailAssistant.SearchAppState.SEARCH_RESULTS);
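Putting the two steps together, a minimal Android Native sketch of the complete flow might look like the following. Here fetchSearchResultsAsync is a hypothetical app-side helper (not part of the Slang SDK) that runs the backend search and invokes a callback when it finishes.

public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // Hypothetical helper: run the backend search off the main thread
    fetchSearchResultsAsync(searchInfo, results -> {
        // Results are in; show them, then tell the Assistant the next app state
        SearchUserJourney userJourney = SlangRetailAssistant.getLastSearchUserJourney();
        userJourney.setSearchSuccess();
        userJourney.notifyAppState(SearchUserJourney.AppState.SEARCH_RESULTS);
    });

    // Not ready yet; keep the Assistant in the "processing" state
    return SearchUserJourney.AppState.WAITING;
}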

Programmatic Conversations

Sometimes the app might want to launch the Voice Assistant on its own, without the user explicitly invoking it (i.e., via the Trigger). The app can do this by calling the startConversation API and passing the user journey it wants to launch. The Assistant will then speak the greeting message associated with that user journey.

Android Native
SlangRetailAssistant.startConversation(AssistantUserJourney.SEARCH);
React Native
SlangRetailAssistant.startConversation(AssistantUserJourney.SEARCH);
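For example, the app might start a search conversation when the user taps a button in its own UI. The button and click listener below are hypothetical app code; only startConversation comes from the Slang SDK.

Button voiceSearchButton = findViewById(R.id.voice_search_button);  // hypothetical view id
voiceSearchButton.setOnClickListener(view ->
    SlangRetailAssistant.startConversation(AssistantUserJourney.SEARCH)
);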

Accessing and Setting User Journey Contexts

When a user journey callback is invoked, Slang passes the details of the command via the Info parameter. When users trigger the same user journey multiple times, Slang preserves the context of the previous invocation and reuses it. For example:

  • User says "onion"

  • Slang will invoke the onSearch callback and pass "onion" as the productType parameter. Everything else will be empty.

  • Now the user says "show me organic ones"

  • Slang will invoke the onSearch callback again, this time passing "organic" via the variant field while still passing "onion" as the productType parameter. That is, it preserves the context between multiple invocations.

But sometimes the app might want to access or set the context outside the callbacks too.

Slang is built for multi-modal use, i.e., users might perform some operations via the UI and others via voice. It is quite natural for a user to search for an item by typing and then use voice to apply additional filters. In such cases, the app can pass data to the Assistant explicitly via the appropriate user journey context.

Android Native
SearchUserJourney.getContext().getProductType();
SearchUserJourney.getContext().setProductType("onions");
React Native
var category = SearchUserJourney.context.getItemCategory();
SearchUserJourney.context.setItemCategory("grocery");
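For instance, when the user types a query into the app's own search box, the app could seed the Assistant's search context so that a voice follow-up such as "show me organic ones" refines that search instead of starting from scratch. The onTextSearch and runTextSearch methods below are hypothetical app code; only the context call comes from the Slang SDK.

private void onTextSearch(String typedQuery) {
    // Hypothetical: the app's normal, non-voice search path
    runTextSearch(typedQuery);

    // Seed the Assistant's context so the next voice command builds on this search
    SearchUserJourney.getContext().setProductType(typedQuery);
}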

Sometimes it is also important not to preserve context. For example, if the user has gone back a page and started a new search, it should be treated as a fresh search rather than a continuation of what was spoken earlier.

The app can clear the context as shown below:

Android Native
SearchUserJourney.getContext().clear();
React Native
SearchUserJourney.context.clear();
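One possible place to do this (purely as an illustration) is when the user leaves the search screen, for example in the search activity's onDestroy, so that the next search starts with a clean slate.

@Override
protected void onDestroy() {
    super.onDestroy();
    // Drop any previously spoken search details so the next search starts fresh
    SearchUserJourney.getContext().clear();
}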

Continuous Conversations

By default, when the app reaches a terminal app state, Slang goes back to its trigger state. If the app wants Slang to keep listening instead, it can enable Continuous Conversation mode.

This feature is still under development.