As developers, we’re occasionally (okay, maybe more than occasionally) stuck in the middle between designers who ‘know what works’ and executives who ‘know what they want.’ Even in smaller shops, it may not be clear which user experience will more often result in the desired behavior. Beyond simple usage tracking, testing two different options to determine which works better has usually meant building two separate APKs and doing a lot of data mining and management to see which performed best.
Amazon has changed that with the release of the A/B Testing Service, which lets developers run experiments within a single APK. You define the variables to test in each of two variations, then decide what percentage of downloads will get each variation. The app collects data and allows you to make an informed decision about which variation you want to enable. These variations could range from the speed of the ball in a game to the message displayed while trying to upsell an In-App Purchase item, like extra lives. The A/B Testing Service is easy to configure and integrate with your app, and it’s free for any developer distributing apps through the Amazon Mobile App Distribution Program for Android.
In this post, you will learn how to integrate A/B testing into your app. For our example, we will use the “Snake Game”. In the traditional game, the speed of the snake increases every time it is fed. We will run tests to figure out the optimal speed increment so that the game is neither too easy nor too hard and the player stays engaged. In our case, a successful test would be one in which 70%–73% of players are able to feed the snake 20 times before it collides with the boundary or with itself. This gives us objective data on whether the increment is too large, too small, or just right.
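To keep the example concrete, the game needs a small amount of bookkeeping of its own before any A/B testing code comes into play. The sketch below is purely illustrative; noOfFeeds and the surrounding methods are hypothetical game code, not part of the A/B Testing Service.
// Illustrative per-round bookkeeping for the experiment (hypothetical game code).
private int noOfFeeds = 0;          // how many times the snake has been fed this round

private void onSnakeFed() {
    noOfFeeds++;                    // compared against the 20-feed success criterion later
}

private void startNewRound() {
    noOfFeeds = 0;                  // reset the count at the start of each round
}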
Once you have identified your test, you can create an A/B test from the Mobile App Distribution page.
In our example, we will create a project called “Snake Speed Project” and an A/B test called snakeSpeedTest. We will use this to test out various increments in the speed of the snake until we find the optimal one.
To configure an A/B test you will need the following information:
In our example, the test would look like the screenshot below:
For more details on how to set up an A/B test, please visit the startup guide.
Integrating the API
Now that you have a test set up in the Mobile App Distribution page, you’re ready to integrate it into your application. For this, you will need to download the SDK.
After downloading the SDK you will need to integrate it into your project. For more information on how to set up your project, please visit Integrate the SDK.
To initialize the Amazon Insights SDK, you will need to obtain two keys from the Mobile App Distribution page: your Application Key and your Private Key.
You can now initialize the SDK using these two keys.
// Configure the Amazon Insights SDK
AmazonInsights
    .withApplicationKey(YOUR_APPLICATION_KEY)
    .withPrivateKey(YOUR_PRIVATE_KEY)
    .withContext(getApplicationContext())
    .initialize();
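Where you run this initialization is up to you; a common choice is the onCreate() of your Application subclass, so the SDK is ready before any game screen needs it. The sketch below assumes a hypothetical MyApplication class and reuses the same placeholder keys.
import android.app.Application;

// Hypothetical Application subclass that initializes the SDK once at startup.
public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        AmazonInsights
            .withApplicationKey(YOUR_APPLICATION_KEY)
            .withPrivateKey(YOUR_PRIVATE_KEY)
            .withContext(getApplicationContext())
            .initialize();
    }
}
If you take this approach, remember to register the class in your manifest via the android:name attribute on the application element.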
Now that your application is initialized, you can start retrieving the variation allocated for your test. In our case, the variation carries the increment by which to increase the snake speed.
// Get the variation allocated for the "Snake Speed Project" experiment and
// retrieve the speed variable.
ABTest
    .getVariationsByProjectNames("Snake Speed Project")
    .withVariationListener("Snake Speed Project", new VariationListener() {
        public void onVariationAvailable(Variation variation) {
            speedMultiplier = variation.getVariableAsString("speedMultiplier",
                    "feedingTime");
            // ... increase speed.
        }
    });
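What you do with the retrieved value is entirely up to the game. Here is a minimal sketch, assuming speedMultiplier arrives as a numeric string and snakeSpeed is a hypothetical field holding the snake’s current speed.
// Hypothetical game code: apply the multiplier from the variation on every feed.
private double snakeSpeed = 1.0;    // illustrative current speed of the snake

private void increaseSnakeSpeed(String speedMultiplier) {
    double multiplier;
    try {
        multiplier = Double.parseDouble(speedMultiplier);
    } catch (NumberFormatException e) {
        multiplier = 1.0;           // leave the speed unchanged if the value is not numeric
    }
    snakeSpeed *= multiplier;       // speed up the snake each time it is fed
}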
After you have successfully retrieved the variation, you need to notify the Amazon A/B Testing Service of a successful view. You can do that by adding the following code. (Note that snakeSpeedIncremented is the same event we added in the A/B Testing portal page for counting views.)
// record when the snake is fed for the first time only (visit)
CustomEvent
    .create("snakeSpeedIncremented")
    .record();
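Since the event should count as a single visit, you may want to guard it so it fires only once per game session. One possible way to do that, using a hypothetical viewRecorded flag:
// Hypothetical guard so the view event is recorded only once per session.
private boolean viewRecorded = false;

private void recordViewOnce() {
    if (!viewRecorded) {
        CustomEvent
            .create("snakeSpeedIncremented")
            .record();
        viewRecorded = true;
    }
}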
Once the game ends, by the snake colliding with either the boundary or itself, we check how many times it was fed. If it was fed more than 20 times, we record a successful conversion. (Note: snakeLevel20Reached is the same event we added in the A/B Testing portal page for counting conversions.)
// record if number of feeds is more than 20.
if (noOfFeeds > 20) {
    CustomEvent
        .create("snakeLevel20Reached")
        .record();
}
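Putting the pieces together, a game-over handler could record the conversion and then reset for the next round; everything here other than CustomEvent is hypothetical game code.
// Hypothetical game-over handler: record the conversion, then reset for the next round.
private void onGameOver() {
    if (noOfFeeds > 20) {
        CustomEvent
            .create("snakeLevel20Reached")
            .record();
    }
    noOfFeeds = 0;                  // start the next round with a clean count
}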
Once you have incorporated the SDK and deployed your app through the Amazon Mobile App Distribution Program, you can start collecting data.
In our case, we determined that 95% of players reached level 20 for both test increments, which suggests that the game play was easier than our target. We ran additional rounds of tests with new increments and found that the 1.65 multiplier gave the optimal level of difficulty, as the conversion rate was around 71%. Refining the increment amounts for new rounds can be done directly from the A/B test page; no new APK is required.
The Start your first A/B test guide tells you how you can start an A/B test, view results, and end a test.
As you can see, setting up and integrating Amazon’s A/B Testing Service is simple and straightforward. The best part is that it’s free for developers distributing apps through the Amazon Mobile App Distribution Program.