CONVA Plus

Adding advanced multi-turn Voice capabilities into your app

By now, you should have configured and published your Assistant via the Slang Console, and perhaps customized it as required. Congratulations! :) If you have not done that already, you can do so by following the instructions here.

Let's start coding!

For testing, we recommend using a physical Android device instead of an emulator because most emulators don't work well with microphones.

1. Configure the build system

The first step is to update the app's build system to include Slang's Retail Assistant SDK.

Add the Slang dependency to your Gradle files

Add the path to the Slang maven repository to your top-level Gradle file

// Add this to your top-level Gradle file

allprojects {  
    repositories {    

        maven { url "http://maven.slanglabs.in:8080/artifactory/gradle-release" }  
    }
}

Add the Slang Retail Assistant dependency to your app's Gradle file

// Add this to your app's Gradle file

dependencies {  

    implementation 'in.slanglabs.assistants:slang-retail-assistant:5.0.5'
}

2. Code integration

2.1 Initialization

The next step is to initialize the SDK with the keys you obtained after creating the Assistant in the Slang console.

The recommendation is to perform the initialization in the onCreate method of the Application class. If the app does not use a custom Application class, the next best place is the onCreate method of the primary Activity class.

// Your Application class

@Override
public void onCreate() {
    ...
    AssistantConfiguration configuration = new AssistantConfiguration.Builder()    
        .setAPIKey(<API Key>)    
        .setAssistantId(<AssistantId>)
        .setEnvironment(STAGING)  // Change this to PRODUCTION once you've published the Assistant to production environment    
        .build();  
    SlangRetailAssistant.initialize(this, configuration);
}

2.2 Show the Assistant Trigger (microphone icon)

Once the Assistant is initialized, the next step is to show the Assistant Trigger (i.e., the microphone button) that your app's users can tap to invoke the Assistant and speak to it.

Add the line below to the onResume method of the Activities where you want the Assistant to be enabled.

@Override
protected void onResume() {
    ... 
    SlangRetailAssistant.getUI().showTrigger(this); // There is a corresponding hideTrigger too if needed
}

By default, the trigger is sticky, which means that it will show up on all Activities after it is made visible. To prevent the trigger from showing up on specific activities, you will need to call:

     SlangRetailAssistant.getUI().hideTrigger(this)

Note that there are two types of Assistant Icons: Global and Inline. The default is the Global Assistant Icon. Refer to this page for more details and for how to make use of each in your app.
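For instance, to keep the Assistant off a particular screen, you can call hideTrigger from that screen's onResume. A minimal sketch (CheckoutActivity is a hypothetical Activity name):

```java
// Hypothetical Activity on which the Assistant should not appear
public class CheckoutActivity extends AppCompatActivity {
    @Override
    protected void onResume() {
        super.onResume();
        // Hide the otherwise-sticky trigger on this screen only
        SlangRetailAssistant.getUI().hideTrigger(this);
    }
}
```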

2.3 Implement Actions

Refresher: A UserJourney represents a path that a user may take to reach their goal when using a web or mobile app. See Voice Assistant Concepts for details.

Refresher: The Actions for the various User Journeys can also be specified directly in the console. Refer to the "Define Actions for various User Journeys" section for details

If the Actions (essentially the visual changes the app should make) corresponding to the various User Journeys have not already been defined in the console, the app needs to implement them in code for each User Journey supported by the Assistant. This can be done as shown below:

SlangRetailAssistant.setAppAction(new SlangRetailAssistant.AppAction() {
    @Override
    public SearchAppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
        // Handle search requests
        // ...

        return new SearchResultsAppState(SearchResultsAppState.SUCCESS);
    }

    @Override
    public OrderManagementAppState onOrderManagement(OrderInfo orderInfo, OrderManagementUserJourney orderJourney) {
        // Handle order management requests
        // ...

        return new OrderViewAppState(OrderViewAppState.SUCCESS);
    }

    @Override
    public OfferAppState onOfferManagement(OfferInfo offerInfo, OfferManagementUserJourney offerManagementUserJourney) {
        // Handle offers requests
        // ...

        return new ViewOfferAppState(ViewOfferAppState.SUCCESS);
    }

    @Override
    public CheckoutAppState onCheckOut(CheckoutInfo checkoutInfo, CheckoutUserJourney checkoutUserJourney) {
        // Handle checkout requests
        // ...

        return new CheckoutCompleteAppState(CheckoutCompleteAppState.SUCCESS);
    }

    @Override
    public NavigationAppState onNavigation(NavigationInfo navInfo, NavigationUserJourney navJourney) {
        // Handle navigation requests
        // ...

        return new NavigationCompleteAppState(NavigationCompleteAppState.SUCCESS);
    }

    @Override
    public void onAssistantError(final AssistantError error) {
        // Handle errors that might have occurred during the Assistant lifecycle

        // Error codes available:
        // FATAL_ERROR, SYSTEM_ERROR, ASSISTANT_DISABLED, INVALID_CREDENTIALS
    }
});

The following user journeys are currently supported by the Slang Retail Assistant:

  • Voice Search

  • Voice Order Management

  • Voice Offers

  • Voice Checkout

  • Voice Navigation

Backward compatibility note: Earlier, Offers and Checkout were targets inside the Navigation user journey. They have now become full-blown user journeys of their own.

The Action Handler interface has an explicit callback for each of the supported user journeys. Whenever the Assistant detects the user's journey (based on what they spoke), it invokes the callback associated with that user journey.

When these callbacks are invoked, the Assistant also passes the parametric data corresponding to the user journey that the Assistant was able to gather. The app is then expected to:

  1. Consume the parametric data as needed

  2. Optionally launch appropriate UI actions

  3. Set appropriate conditions in the Assistant based on the app's internal state

  4. Return the AppState that the app transitioned to

2.4 Return the AppState and Condition

Refresher: An AppState typically corresponds to a screen that the app transitioned to based on user input. See Voice Assistant Concepts for details.

An AppState indicates which state the app transitioned to, based on the user journey and the parametric data that were passed to the app. The list of AppStates that are supported depends on the user journey.

Conditions represent more detailed states of the app within a particular AppState. For example, the search might have failed, or the items might be out of stock. The app can use Conditions to indicate the exact situation to the Assistant. The condition controls the message that the Assistant speaks after the callback returns.

@Override
public SearchAppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
   // Handle the search requests
   // ...

   return new SearchResultsAppState(SearchResultsAppState.SUCCESS);
}
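When the journey does not succeed, the app can set a failure condition before returning, so the Assistant picks an appropriate prompt. The sketch below assumes a condition setter named setItemNotFoundCondition and a FAILURE constant; these names are assumptions for illustration, and runSearch is an app-defined helper — consult the SDK reference for the real condition API.

```java
@Override
public SearchAppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    List<Item> results = runSearch(searchInfo); // runSearch: app-defined helper

    if (results.isEmpty()) {
        // Tell the Assistant why the journey did not succeed
        // (setItemNotFoundCondition is an assumed method name)
        searchJourney.setItemNotFoundCondition();
        return new SearchResultsAppState(SearchResultsAppState.FAILURE);
    }

    return new SearchResultsAppState(SearchResultsAppState.SUCCESS);
}
```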

2.4.1 Assistant Prompts

Based on the AppState returned and the conditions that were set, the Assistant will speak out an appropriate message to the user.

The prompts spoken by the Assistant are customizable. Refer to the Customizing the Assistant section, if you're interested in customization.

That's it! These are the basic set of steps required to add Slang's In-App Voice Assistant into your app.

Beyond this integration, Slang Voice Assistants provide a lot more power and flexibility to cater to the more advanced needs of the apps. Please refer to the Advanced Concepts section for more details.
