Appstore Blogs

Showing posts tagged with Testing

May 26, 2017

Serena McIntire

Do you have an Android app or game? If so, you may be able to publish it on the Amazon Appstore without needing to make any changes. Eighty-five percent of Android apps are already compatible with the Appstore platform and don’t require any additional development work.

May 08, 2014

Paul Cutsinger

Ready to test your app on the Amazon Fire TV? This video shows you how to enable the device for ADB debugging and sideload your app. You can learn more about developing for the Amazon Fire TV here and get the SDK here.

January 21, 2014

Russell Beattie

The Amazon Mobile App Distribution Program enables developers to create Kindle Fire apps using existing HTML5 mobile web apps. It’s also a good way for web developers to start creating mobile apps using the skills and knowledge they already have.

Back in December we covered how to get your existing web apps onto actual devices with a webinar on Submitting HTML5 Web Apps to the Amazon Appstore, and a companion blog post focusing on setting up the Amazon Web App Tester to debug and test your apps.

We wanted to follow up with some more details to help you get the most out of the tester, which is a key part of the HTML5 web app creation process on Kindle Fire tablets. The Web App Tester has a variety of powerful features that can make development faster and easier. Below are a few ways to take full advantage of that functionality.

Amazon's Web App Tester

What exactly is Amazon’s Web App Tester? It’s a downloadable app that lets you test your web app in a production-like environment on your Kindle Fire or Android device, without first submitting it to our store. The tester contains the same web engine and libraries that will run your web app when it is wrapped into a downloadable app. The tester, however, has an interface that lets you enter your own custom URLs and, most importantly, enables remote debugging for development and testing from your desktop computer.

Because the Web App Tester is based on the same technologies as the final wrapped app, you can better assess your app’s performance and functionality, and more quickly work through any problems you encounter, on the device itself rather than in an emulator or simulator, which may not be as accurate.

Additionally, libraries that are pre-loaded into the final downloadable app, such as the In-App Purchasing API for JavaScript, are also built into the tester, so you can debug your IAP integration before you launch. (Look for a more detailed post about enabling IAP for your web apps coming soon.)

Managing Multiple URLs

The first thing you'll encounter when using the Web App Tester is a place to enter a URL for testing. This is great for easy ad-hoc testing, but if you have complicated URLs to enter, or have multiple URLs that you need to manage, you can create a JSON file of URLs and put it into the root folder of your Kindle Fire. The file has to be named amazon.testerurls.json and placed in the /mnt/sdcard/ folder on your device. Here's how the list of URLs should be formatted:

{
   "urls": [
      "http://m.imdb.com",
      "http://m.amazon.com",
      "https://read.amazon.com"
   ]
}

The easiest way to get it to the correct spot on your device is via the command line using the Android Debug Bridge (ADB), which comes as part of the Android SDK. Assuming you've set up the SDK correctly, you only need to connect your device using USB, open a command line, change to the directory where your JSON file is located and run this command:

$ adb push amazon.testerurls.json /mnt/sdcard/ 
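
To confirm the file landed where the tester expects it (a quick sanity check, not a required step), you can list it back over the same ADB connection:

$ adb shell ls /mnt/sdcard/amazon.testerurls.json

If ADB prints the path back instead of a “No such file or directory” error, the file is in place.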

Remote Debugging Options

If your device and your computer share the same network, you can enable Web Developer Tools for debugging over WiFi. If you’re on a corporate network, or you want to test aspects of your app such as offline functionality or WLAN speeds, you can connect via USB using ADB.

Helpfully, the Web App Tester gives you all the details you need to enable remote debugging once you start the app. Simply tap the full-screen handle at the bottom or side of the screen and swipe down. There you’ll see options to enable remote debugging via ADB or over your network.

Once you choose, a dialog box pops up with instructions and the exact URL to enter into your Chromium-based browser for debugging.

Close that dialog, enter the URL of the web app you want to test, and you’ll be all set to debug from a desktop computer.
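
If you’d like to verify the connection before opening Dev Tools (an optional check, not part of the tester’s instructions), the remote debugging protocol serves a JSON list of debuggable pages at the /json endpoint of whatever host and port the dialog gave you; port 9222 here is simply the value used in the Node.js example later in this post:

$ curl http://<device-ip>:9222/json

Each entry in the response includes a webSocketDebuggerUrl, which is the same endpoint the Dev Tools UI (and the scripting example below) connects to.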

Dev Tools Tips

Once the Dev Tools page is open in your desktop browser, remote testing and debugging of your mobile web app should become as familiar as doing development for desktop browsers. Here are some things to watch for.

On-Device Tools. First, note that all the normal functionality you’d find in Dev Tools is live, but running on the device itself. In fact, the entire UI is a static HTML app served from the device, which communicates back via WebSockets. Viewing network activity, recording the timeline, or profiling all happens on the device’s own hardware.

Reload Shortcut. When developing for a desktop browser, you may be in the habit of clicking the reload icon to refresh the page. Rather than exiting your app on the device and restarting it, you can simply press Command-R/Control-R inside the remote Dev Tools window to refresh the page on the device itself.

Live Inspection. Just as in a desktop browser, you can use the inspect icon to pinpoint elements within the HTML5 markup of your app: activate the inspect icon, then touch the screen to find that element in the Dev Tools. It works in reverse as well; as you hover over the markup in Dev Tools, the corresponding elements light up on the device.

FPS Meter. In the Dev Tools, you can use the settings icon to turn on the FPS meter, which displays in the top corner of your device. This gives you a live view of how fast your app is refreshing, without having to add additional libraries.

On-Device Debugging and Console. You can step through JavaScript code just as you normally would. The console is also live, with the JavaScript engine running on the device. This lets you use console tricks such as $0 to refer to the selected element, navigate using document.location.href, or even pop up an alert() window if needed.
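
As a quick illustration of that last point, here are the kinds of one-liners you might type into the remote console; each executes on the device, not on your desktop (the navigation URL is a hypothetical example):

$0                          // the element currently selected in the Elements panel
document.location.href      // read the web app's current URL
document.location.href = 'http://m.example.com/next';   // navigate the app on the device
alert('Rendered on the device!');   // the dialog appears on the device screen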

Remote Debugging API

Because the Web App Tester’s Dev Tools use the same remote debugging protocol as desktop Chromium browsers, they can be accessed not only from Dev Tools but also from text editors, IDEs, or scripting languages such as Python. Here’s an example using the chrome-remote-interface Node.js module.

First, install the library using NPM:

npm install chrome-remote-interface 

Then create a test.js file with this boilerplate example (modifying the options variable as needed):

var Chrome = require('chrome-remote-interface');

var options = {
    host: 'localhost',
    port: '9222'
};

Chrome(options, function (chrome) {
    // Log every network response the device's web engine receives
    chrome.on('Network.responseReceived', function (message) {
        console.log(message);
    });

    // Log when the page's load event fires
    chrome.on('Page.loadEventFired', function (message) {
        console.log("----------------------- page loaded ");
        console.log(message);
    });

    // Enable the Network and Page domains, then reload the page
    chrome.Network.enable();
    chrome.Page.enable();
    chrome.Page.reload();
}).on('error', function (err) {
    console.log(err);
    console.error('Cannot connect to Chrome');
});

Then run the script:

node test.js 

As you use your app on the device, you’ll see events logged in your console as they fire. The protocol and module also let you send commands to the remote browser, letting you automate testing on the device and record the results. For more info, check out the Remote Debugging Protocol pages here.
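
As a minimal sketch of what sending a command looks like, here’s a script that evaluates an expression in the device’s JavaScript engine and prints the result on the desktop. Runtime.evaluate is part of the same remote debugging protocol, and the host and port match the options used above:

var Chrome = require('chrome-remote-interface');

Chrome({ host: 'localhost', port: '9222' }, function (chrome) {
    // Ask the device-side JavaScript engine to evaluate an expression
    // and send the result back over the debugging connection.
    chrome.Runtime.evaluate({ expression: 'document.title' }, function (err, response) {
        if (err) {
            console.error('Evaluate failed');
        } else {
            console.log('Page title on device:', response.result.value);
        }
        chrome.close();
    });
}).on('error', function (err) {
    console.error('Cannot connect to Chrome');
});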

Hopefully some of these tips will come in handy as you develop your web app. If you have any tips of your own or questions about using Dev Tools, definitely get in touch!

-Russ (@RussB)

 

January 16, 2014

Peter Heinrich

A/B Testing is about using data to challenge assumptions and test new ideas. Watch this video to hear about the “happy accident” that inspired an important A/B test we hadn’t considered and how it led to an increase in retention and monetization for Air Patriots.

Created in-house at Amazon, Air Patriots is a plane-based tower defense game for iOS, Android, and Fire OS. The development team uses A/B Testing to experiment with new ideas, so I recently sat down with Senior Producer Russell Carroll and Game Development Engineer Julio Gorge to discuss how they used the service on Air Patriots. They described the design choices they tested, how the experiments were constructed, and what benefits they derived from the results.

Check out the conversation to learn how Russell and Julio’s experience on Air Patriots made them advocates for A/B Testing in every mobile app, especially those offering in-app purchase.

 

January 13, 2014

Peter Heinrich

Closely following the launch of the A/B Testing service for iOS, Android, and Fire OS apps, we have just released an update addressing one of our most popular feature requests. You can now track up to ten goals in a single A/B test, which means you can see how your experiment affects up to ten metrics at once. This is especially powerful when the metrics aren’t entirely independent and it would be difficult to create A/B tests to isolate them from each other. Let me illustrate with an example.

Say you have a mobile game that generates revenue using a combination of in-app purchasing (IAP) and mobile ads. You know that player engagement is the key to monetization, so you decide to test a hunch that more challenging levels will keep players in the game longer.

You create an A/B test project for your app, adding an experiment that allows you to adjust the overall difficulty of each level. Since you can have up to five variations for each test (see A/B/n testing for more information), you decide to measure player engagement when the game is much harder, slightly harder, slightly easier, and much easier than normal. “Normal” will be a variation of its own, called the Control.

In this case, you create a test variable called difficultyMultiplier, which your code can access and use to modify its behavior for each user. For the control group (60% of players in this example), difficultyMultiplier is 1.00, indicating no change from the default difficulty. The other groups see a slightly different value for difficultyMultiplier, depending on how hard the game should be for those players.

To measure the effect of changing this variable, you define a view event and a conversion event, which your code records as they happen and reports to the A/B Testing service. For the purposes of this test, you count a view whenever a player starts a new game session, and you register a conversion if that player then plays for five minutes or more. The A/B Testing service tabulates these events by variation and reports the conversion rate for each group of users.
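
As a rough sketch of how a game client might consume the test variable and report these events — note that fetchVariation, recordEvent, applyDifficulty, and playerStillInSession are hypothetical stand-ins, not the actual A/B Testing SDK calls:

// Hypothetical client-side sketch; helper names are stand-ins,
// not the real A/B Testing SDK API.
var variation = fetchVariation('DifficultyExperiment');

// The control group (60% of players here) sees 1.00; the other
// variations see values like 0.75, 0.90, 1.10, or 1.25.
var difficultyMultiplier = variation.difficultyMultiplier || 1.00;

function startGameSession(baseDifficulty) {
    recordEvent('sessionStart');                        // the view event
    applyDifficulty(baseDifficulty * difficultyMultiplier);

    // Five minutes later, if the player is still in the game,
    // register the conversion event.
    setTimeout(function () {
        if (playerStillInSession()) {
            recordEvent('playedFiveMinutes');
        }
    }, 5 * 60 * 1000);
}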

Say you run the experiment and discover your hunch was right: harder levels are played longer, leading to an increase in the average amount of time players engage with your game. The logical next step would be to ratchet up game difficulty. But what if improved engagement isn’t the whole story? Changing the difficulty may affect other metrics you care about, but you can’t always tell based on a single type of conversion event. For example, how does this change the way people share their progress on Facebook, a major customer acquisition channel? How does it impact ad click-thru rates? Does it impact how users rate the game? Setting multiple goals can help you detect such unintended consequences and choose the variation that delivers balanced results.

Now that the latest version of the A/B Testing service allows a single view event to be associated with up to ten different conversion events (goals), you can measure and compare the impact of each variation along more than one axis. Each goal can be maximized or minimized independently. For example, you might try to maximize game sessions, in-app purchases, ad clicks, and Facebook shares while minimizing one-star reviews, all in the same experiment, as sketched below.
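
To make the one-view/many-goals relationship concrete, here is a hypothetical sketch of the event calls such an experiment might record — recordEvent is the same stand-in helper used in the earlier sketch, not the actual SDK API:

// One view event per game session...
recordEvent('sessionStart');

// ...and several separate conversion events (goals) measured against it.
recordEvent('iapPurchase');       // goal: maximize in-app purchases
recordEvent('adClick');           // goal: maximize ad clicks
recordEvent('facebookShare');     // goal: maximize Facebook shares
recordEvent('oneStarReview');     // goal: minimize one-star reviews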

When generating reports, the A/B Testing service includes the results for all goals associated with an experiment, organized by variation. The service highlights the “best” variation with respect to each goal, so you can tell at a glance which one resulted in the most game sessions (say, Variation C) or maximized shares on Facebook (Variation A).

When goals overlap or depend on one another, as they do here, there may be no single variation that definitively “wins” every goal. Such a report, however, can help you make an educated choice, weighing the trade-offs of each alternative. In this case, Variation B looks like a good candidate, since it succeeded in minimizing one-star reviews and came close to winning several other goals as well. When you look at the big picture, Variation B appears to have the best performance overall.

In the report, orange checkmarks indicate which results achieved statistical significance—that is, where there are enough measurements to be confident that the observed change is actually due to the test variation. More details are available for each individual goal, so you can drill down on, for example, the ad clicks associated with each variation.

With the addition of up to ten goals for a single experiment, the A/B Testing service expands its flexibility and becomes an even more powerful tool for refining your app and optimizing it based on customer behavior. For more information on A/B testing, multiple goals, and how you can incorporate them into your mobile app or game, check out the online documentation.

 

November 07, 2013

Mike Hines

Now is one of the best times of year to submit your apps to the Amazon Appstore and have them published for Android phones and tablets, including the new Kindle Fire line of tablets. In 2012, we saw a 50% increase in the number of app downloads during Thanksgiving week compared to an average week. During ‘Digital Week’ 2012, the week after Christmas, customers purchased and downloaded 600% more apps than in an average week.

As we’ve noted in earlier blog posts, it’s easy to get started: 75% of the Android tablet apps we’ve tested already work on Kindle Fire without any extra development. Amazon also has a tool that quickly gives your apps the best chance of passing both Amazon Appstore and Kindle Fire compatibility testing. Even if you already have an app in the Amazon Appstore, you can use this service to check any updates you plan to submit.

The testing tool works fast and screens your apps for potential errors or incompatibilities. For example, you’ll learn:

  • If there are structural issues with your implementation of Amazon APIs
  • If you are using any libraries that might impact compatibility
  • If your app has features that are not supported by some Kindle Fire devices

If an issue is found, you will also get some suggestions for fixing the problem. Note that the tool is not designed to replace debugging in your IDE, and it won’t find null pointer exceptions or similar coding errors.

To get started, you can find the tool in the SDK & Tools area of the Developer Portal, where there is now a link for the App Testing Service.

The App Testing Service detail page includes a brief description of the tool and a button that initiates the service. Clicking that button brings you to the app testing page, which contains a control into which you can drag your .apk.

To give you a sense of the experience you can expect, I’ll walk you through the short process. I started with a small quotation app that I created and dragged it into the tool. The testing was complete in under a minute and my app passed. The tool then displayed a ‘Submit to Amazon Appstore’ button that I could use to start the app submission process.

Next, I tested a version of the same app that used Google In-App Billing instead of Amazon In-App Purchasing. The tool caught that error, correctly identified the issue, and offered suggestions for fixing my app.

Other test results similarly identify errors in your In-App Purchasing implementation.

Once the test is complete, you can find the results of this and all your tests in a table at the bottom of the app testing page. This lets you go back and revisit previous issues and recommendations across all the apps you have tested.

So don’t miss out on getting in front of all those customers during the holiday season. Save time: go to https://developer.amazon.com/tya/welcome.html and make sure your apps are ready to submit to the Amazon Appstore.
