Advanced Topics

Additional APIs and customizations to measure and further improve the experience

2.4 The onSearch callback

Whenever the Assistant detects that the user is searching for items in the app, it will try to break down the search request into its basic components and invoke the onSearch callback associated with the search user journey. The callback looks like this:

void onSearch(SearchInfo searchInfo, SearchUserJourney userJourney);

When this callback is invoked, the app is expected to:

  1. Consume the details of the search request via the SearchInfo parameter.

  2. Fire the app's search request.

public void onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
   // searchItem holds the relevant part of the end-user's search request
   // and will automatically be in English, even if the user spoke in a
   // different language.
   String searchItem = searchInfo.getSearchString();

   // Launch SearchResultsActivity using "searchItem"
   // ...
}

The following are some examples of commands that could trigger this journey:

  • "Onions"

  • "Show me onions"

  • "3 kgs of organic onions"

  • "Looking for fresho organic onions"

  • "Searching for Maggi Instant noodles in grocery"

  • "2 rs head and shoulders shampoo"

SearchInfo Parameter

The SearchInfo parameter contains the breakdown of the original search request. Its structure is described below:

class SearchInfo {
	public Item getItem();
	public String getSearchString();
	public List<FilterInfo> getFilters();
	public SortingInfo getSorting();
	public boolean isAddToCart();
}

class Item {
	public String getCategory(); // The category that the user specified, if any
	public String getBrand(); // The brand name identified by Slang from what the user spoke
	public String[] getProductNames(); // The product names (like "organic", "pulpy orange") if any
	public Quantity getQuantity(); // The quantity, if spoken by the user
	public Size getSize(); // The size, if spoken by the user
	public Price getPrice(); // The price value, if spoken by the user
	public String getDescription(); // The fully constructed search string without size and quantity
	public String getCompleteDescription(); // The fully constructed search string with size and quantity
}

JSON Representation for the above

//Example:
//User Utterance: "Minute Maid litchi 500 ml 24 bottles"

{
  "item": {
    "brand": "minute maid",
    "description": "minute maid litchi",
    "completeDescription": "minute maid litchi 500ml 24 bottles"
    "quantity": {
      "amount": 24,
      "unit": "BOTTLES"
    },
    "size": {
      "amount": 500,
      "unit": "ML"
    },
    "productNames": [
      "litchi"
    ]
  },
  "isAddToCart": false
}
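
A handler can use this breakdown to build a richer query than the raw search string. Below is a minimal illustrative sketch that assumes only the documented SearchInfo and Item getters; launchSearchResults is a hypothetical app-side helper.

public void onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    Item item = searchInfo.getItem();

    // Prefer the structured description when available, falling back
    // to the raw search string otherwise.
    String query = (item != null && item.getDescription() != null)
            ? item.getDescription()
            : searchInfo.getSearchString();

    // Brand and quantity, if spoken, can pre-populate filters on the
    // search results page.
    String brand = (item != null) ? item.getBrand() : null;         // e.g. "minute maid"
    Quantity quantity = (item != null) ? item.getQuantity() : null; // e.g. 24 BOTTLES

    launchSearchResults(query, brand, quantity); // hypothetical app-side helper
}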

2.5 Specifying Custom Events for Deeper Analytics

As part of its Analytics offering, CONVA provides a few metrics that can be used to track the effectiveness of adding the Voice Search experience.

  • VTS - Voice-to-text - tracks how often users choose voice versus text search in any given search session

  • CTR - Click-through Rate - allows the app to compare the click-throughs that happen on the search results page when the user lands on it via text versus voice search

In order to compute these metrics, the app is expected to share a couple of events at the appropriate times. Let's look at each of these required events.

2.5.1 Text Search Events

CONVA provides an API with which the app can report that a search happened via UI/text-based interaction. This event is used to measure the adoption and engagement of voice compared to traditional UI-based interactions.

SlangRetailAssistant.notifyTextSearch("<SearchString>");
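
For instance, the app could fire this event from its text-search submit handler. The sketch below is illustrative; searchView and launchSearchResults are hypothetical app-side names, and only notifyTextSearch comes from the CONVA API.

searchView.setOnQueryTextListener(new SearchView.OnQueryTextListener() {
    @Override
    public boolean onQueryTextSubmit(String query) {
        // Tell CONVA that this search came from text/UI, not voice.
        SlangRetailAssistant.notifyTextSearch(query);
        launchSearchResults(query); // hypothetical app-side helper
        return true;
    }

    @Override
    public boolean onQueryTextChange(String newText) {
        return false;
    }
});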

2.5.2 CTR Events

Use the API below to inform CONVA when a user clicks on the search results. Note that the user could have clicked on different sections of a search result - wishlist, product, add-to-cart. Providing the exact section that was clicked allows CONVA's Analytics to make more fine-grained comparisons between voice and text searches.

SlangRetailAssistant.notifyCTREvent(
    new HashMap<String, String>(){
        {
            // The section that was clicked. Here are the
            // supported strings:
            // "NavigatedToProductPage"
            // "AddedToCart"
            // "AddedToWishlist"
            // "ShadeSelected"
            put("eventName","<SectionName>");
            // The product item that was clicked
            // Eg: "Organic Onions"
            put("itemName","<ItemName>");
        }
});
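
As an illustration, the app could fire this event from a product card's click listener on the search results page. The view and model names below are hypothetical; only notifyCTREvent and the section strings listed above come from the API.

// Hypothetical click listener on a product card in the search results list.
productCard.setOnClickListener(v -> {
    HashMap<String, String> ctrEvent = new HashMap<>();
    ctrEvent.put("eventName", "NavigatedToProductPage"); // one of the supported strings
    ctrEvent.put("itemName", product.getName());         // e.g. "Organic Onions"
    SlangRetailAssistant.notifyCTREvent(ctrEvent);

    openProductPage(product); // hypothetical app-side navigation
});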

2.5.3 Associating App-Specific UserIds with the CONVA Analytics Events

By default, CONVA-generated analytics events do not capture any PII data about the user. We use a CONVA-generated DeviceId to uniquely identify the device on which the app instance is running. But if an app wants CONVA to associate these events with a user ID that it already knows for the current user (say, a logged-in user), it can use the API below to trigger that association.

SlangRetailAssistant.setUserId("<UserId>");
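
For example, the app could make this association right after a successful login (onLoginSuccess and User are hypothetical app-side names):

// Hypothetical login-success callback in the app.
void onLoginSuccess(User user) {
    // Associate the app's own user ID with CONVA analytics events.
    SlangRetailAssistant.setUserId(user.getId());
}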

2.6 Assistant Lifecycle Events

The Slang Assistant handles most of the heavy lifting of interacting with end users and notifies the app when there is some action for the app to perform. But in some cases, apps may want to be notified of low-level events that the Assistant is aware of - for example, whenever a user clicks on the trigger (the microphone button) or cancels the Slang surface.

Access to low-level Assistant events is available through the Assistant's Lifecycle Events API.

Registering for events

The app can register with the Assistant to be notified of all interesting life-cycle events via the setLifecycleObserver method.

SlangRetailAssistant.setLifecycleObserver(new SlangRetailAssistant.LifecycleObserver() {

            @Override
            public void onAssistantInitSuccess() {
                Log.e("LifecycleObserver", "onAssistantInitSuccess");
            }
            @Override
            public void onAssistantInitFailure(final String description) {
                Log.e("LifecycleObserver", "onAssistantInitFailure");
            }
            @Override
            public void onAssistantInvoked() {
                Log.e("LifecycleObserver", "onAssistantInvoked");
            }
            @Override
            public void onAssistantClosed(boolean isCancelled) {
                Log.e("LifecycleObserver", "onAssistantClosed");
            }
            @Override
            public void onAssistantLocaleChanged(Locale changedLocale) {
                Log.e("LifecycleObserver", "onAssistantLocaleChanged");
            }
            @Override
            public boolean onUnrecognisedUtterance(String utterance) {
                Log.e("LifecycleObserver", "onUnrecognisedUtterance");
                return false;
            }
            @Override
            public void onUtteranceDetected(final String utterance) {
                Log.e("LifecycleObserver", "onUtteranceDetected");
            }
            @Override
            public void onOnboardingSuccess() {
                Log.e("LifecycleObserver", "onOnboardingSuccess");
            }
            @Override
            public void onOnboardingFailure() {
                Log.e("LifecycleObserver", "onOnboardingFailure");
            }
            @Override
            public void onMicPermissionRequested() {
                Log.e("LifecycleObserver", "onMicPermissionRequested");
            }

            @Override
            public void onMicPermissionDenied() {
                Log.e("LifecycleObserver", "onMicPermissionDenied");
            }
            @Override
            public void onMicPermissionGranted() {
                Log.e("LifecycleObserver", "onMicPermissionGranted");
            }
            @Override
            public void onCoachmarkAction(SlangRetailAssistant.CoachmarkType coachmarkType, SlangRetailAssistant.CoachmarkAction coachmarkAction) {
                Log.e("LifecycleObserver", "onCoachmarkAction "+coachmarkType.name() + " " + coachmarkAction.name());
            }
        });

As part of the Lifecycle Events API, an observer will be notified of the following events:

  • onAssistantInitSuccess

    • Called as soon as the Assistant has initialized successfully and is ready to serve the app

  • onAssistantInitFailure

    • Called when the Assistant failed to initialize successfully. The reason is passed as a parameter to this callback

  • onAssistantInvoked

    • Called whenever the Assistant has been launched (this can be either as a result of the user clicking on the trigger or the app invoking the Assistant via the startConversation API)

  • onAssistantClosed

    • Called whenever the Assistant has been dismissed. A boolean parameter isCancelled is passed to indicate whether this happened because the user cancelled the session or if the Assistant was done with its job

  • onAssistantLocaleChanged

    • Called whenever the user changes the locale of the Assistant. A Locale parameter is passed to indicate the current locale.

    • The locale contains the following fields:

      • country : a 2-character country code. Example: "IN", "US"

      • language : a 2-character language code. Example: "en", "hi", "ta"

  • onUnrecognisedUtterance

    • Called whenever the Assistant is not able to understand what the user spoke. The utterance that the user spoke is passed as a parameter.

  • onUtteranceDetected

    • Called whenever the Assistant has detected an utterance that was spoken by the user. The utterance that the user spoke is passed as a parameter.

  • onOnboardingSuccess

    • Called whenever the Assistant has completed the entire onboarding process for the given user.

  • onOnboardingFailure

    • Called whenever the Assistant onboarding process has been cancelled by the user.

  • onMicPermissionRequested

    • Called whenever the Assistant requests the microphone permission from the user.

  • onMicPermissionGranted

    • Called whenever the microphone permission required by the Assistant has been granted by the user.

  • onMicPermissionDenied

    • Called whenever the microphone permission required by the Assistant has been denied by the user.

  • onCoachmarkAction

    • Called whenever the user interacts with the coachmark UI provided by the Assistant. The CoachmarkType and CoachmarkAction parameters describe the interaction.

    • The possible values are:

      • coachmarkType : "ONBOARDING", "USER_HINT", "LOCALE_CHANGE_HINT"

      • coachmarkAction : "VIEW", "CANCEL", "CLICK", "UNKNOWN"
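
For example, inside the observer shown above, the app could use a few of these callbacks to feed its own analytics and keep its language preference in sync. This fragment is illustrative; Analytics and prefs are hypothetical app-side helpers, and the remaining LifecycleObserver methods must still be overridden as in the full example.

@Override
public void onUtteranceDetected(final String utterance) {
    // Feed every detected utterance into the app's own analytics.
    Analytics.log("voice_utterance", utterance);
}

@Override
public boolean onUnrecognisedUtterance(String utterance) {
    // Record what the Assistant could not understand, to guide tuning.
    Analytics.log("voice_unrecognised", utterance);
    return false;
}

@Override
public void onAssistantLocaleChanged(Locale changedLocale) {
    // Keep the app's language preference in sync with the Assistant.
    prefs.edit().putString("voice_locale", changedLocale.getLanguage()).apply();
}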

2.7 Assistant Prompts

Based on how the Assistant was configured in the Console and the conditions that were set, the Assistant will speak an appropriate message to the user.

The prompts spoken by the Assistant are customizable. Refer to the Customizing the Assistant section, if you're interested in customization.

That's it! These are the basic steps required to add Slang's In-App Voice Assistant to your app.

Beyond this integration, Slang Voice Assistants provide a lot more power and flexibility to cater to the more advanced needs of the apps. Please refer to the Advanced Concepts section for more details.

2.8 Nudging Users

We also offer a nudge API to trigger the coachmark on the CONVA trigger on demand.

NOTE: This works only with the Global and Inline Triggers that are managed by the SlangAssistant SDK. It will not work with custom app triggers/mic buttons.

2.8.1 Nudging with messages controlled remotely

The CONVA console allows specifying the message to be shown in the nudges. To dynamically show the coachmark with the messages configured in the console, use the API below:

SlangRetailAssistant.getUI().nudgeUser();

2.8.2 Nudging with custom runtime messages

Sometimes it is useful to show more contextual messages that educate users about the voice capability in the app. E.g., after the user does a text search, the app can show a coachmark pointing at the mic trigger, suggesting they try voice next time.

To perform such contextual nudges, use the API below.

The app can pass language-specific prompts, and CONVA will pick the right language based on the currently selected locale.

There are two strings that can be specified:

  • The title string - the one that shows up in bold in the first line

  • The description string - the one that shows up in regular style in the second line

SlangRetailAssistant.getUI().nudgeUser(sActivity,
    new HashMap<Locale, String>() {
        {
            put(SlangLocale.LOCALE_ENGLISH_IN, "Title");
            put(SlangLocale.LOCALE_HINDI_IN, "शीर्षक");
        }
    },
    new HashMap<Locale, String>() {
        {
            put(SlangLocale.LOCALE_ENGLISH_IN, "Description");
            put(SlangLocale.LOCALE_HINDI_IN, "विवरण");
        }
    }
);
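
Putting the pieces together, the app could nudge the user right after a text search - the exact scenario described above (onTextSearchSubmitted is a hypothetical app-side handler):

// Hypothetical handler invoked when the user submits a text search.
void onTextSearchSubmitted(Activity activity, String query) {
    SlangRetailAssistant.notifyTextSearch(query);

    // Suggest trying voice next time, in the user's current locale.
    SlangRetailAssistant.getUI().nudgeUser(activity,
        new HashMap<Locale, String>() {{
            put(SlangLocale.LOCALE_ENGLISH_IN, "Try voice search");
            put(SlangLocale.LOCALE_HINDI_IN, "वॉइस सर्च आज़माएं");
        }},
        new HashMap<Locale, String>() {{
            put(SlangLocale.LOCALE_ENGLISH_IN, "Tap the mic and just say what you need");
            put(SlangLocale.LOCALE_HINDI_IN, "माइक दबाएं और बोलकर खोजें");
        }});
}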
