Mobile A/B Testing 101
There’s a reason why A/B testing is at the core of Leanplum. It’s suboptimal to make large-scale business decisions based on guesswork — and considering the amount of data that modern analytics software can track, there’s no excuse to do so.
There are plenty of reasons to A/B test, but let’s look at the specifics.
What is Mobile A/B Testing?
An A/B test is an experiment that simultaneously compares two versions of the same website or app while measuring key metrics. Mobile A/B testing is the process of using these experiments to optimize a mobile app. For a quick explanation by Leanplum’s very own CEO, read this post.
Usually an A/B test changes only one variable at a time, so that any difference in the result can be attributed to that variable. A test that changes multiple variables at once is called a multivariate test, or MVT. These tests are useful for measuring how different variables on the same page interact, but they must be designed carefully to yield meaningful conclusions.
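To make the mechanics concrete, here is a minimal sketch of how a test might bucket users into variants. This is an illustration, not Leanplum's implementation; the function name and variant labels are hypothetical. Hashing the user ID together with the experiment name keeps each user's assignment stable across sessions and independent across experiments.

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants=("control", "treatment")):
    """Deterministically bucket a user into a variant (illustrative sketch).

    Hashing user ID + experiment name gives a stable, evenly distributed
    assignment without storing any per-user state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same variant for a given experiment.
first = assign_variant("user-42", "login_button_color")
second = assign_variant("user-42", "login_button_color")
assert first == second
```

Because assignment is deterministic, a returning user sees the same variant every time, which is essential for measuring behavior over multiple sessions.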
What Should I Mobile A/B Test?
For mobile, nearly everything can (and should) be A/B tested. You can refer to this handy list of in-app content that should be A/B tested.
Many people think of A/B testing in terms of superficial design changes, like tweaking the color of a login button. Ideally, your A/B testing process would encompass much more than UI elements.
Leanplum’s A/B testing system is powerful because it can test any variable inside or outside the app.
If you want to optimize your marketing and content, it makes sense to test every variable. Statistically significant changes will be automatically highlighted on the Leanplum analytics dashboard, so don’t worry about being overwhelmed by results. We offer out-of-the-box data science to make analyzing test and campaign results easy.
What Do I Gain From Mobile A/B Testing?
Besides the obvious upside of a higher conversion rate, there’s one big reason why you should A/B test every change to your app. Testing is the only way to make sure that your change didn’t accidentally make the user experience worse.
In our case study with App Annie, product manager Eric MacKinnon explains that enterprise apps worry more about breaking something than about improving conversions. In his own words, A/B testing for popular apps is all about “ensuring that necessary changes (such as updating and modernizing the app) don’t have an adverse effect on user behavior.”
Foregrounding both negative and positive effects is a big part of Leanplum’s mobile A/B testing platform. We’re proud to offer two-tailed testing, which means that the negative changes of an A/B test are displayed alongside the positive ones. For example, if you’re A/B testing a messaging campaign and one push notification increases your open rates but also increases app uninstalls, you’ll see both effects side by side. You get a holistic picture of app performance and a clear view of each campaign’s tradeoffs.
We can daydream about optimizations that boost conversions without any consequences, but in reality, every change has a tradeoff. You can’t make good optimization decisions without knowing what you’re giving up.
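The statistics behind this are straightforward. A two-tailed test asks whether a metric moved in either direction, not just the direction you hoped for. Below is a standard two-proportion z-test as a rough sketch; it is not Leanplum's internal method, just the textbook calculation.

```python
import math

def two_tailed_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test with a two-tailed p-value.

    A two-tailed test flags a significant difference in either
    direction -- an improvement or a regression.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: the variant *lowers* conversion (5.0% -> 4.3%),
# and the two-tailed test flags it just as readily as a gain.
z, p = two_tailed_z_test(conv_a=500, n_a=10_000, conv_b=430, n_b=10_000)
```

A one-tailed test on the same data, looking only for an improvement, would report nothing interesting; the two-tailed version surfaces the regression.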
What Makes a Good Mobile A/B Test?
Deciding to A/B test your product is a step in the right direction, but it’s not the last step. You need to design an effective test in order for analytics software to help you.
For quick reference, we made a list of the four commandments of mobile A/B testing.
In general, A/B tests should be thorough. Take the time to design a test for every meaningful variable you can imagine. Part of our analytics philosophy is that you don’t know what you don’t know, so we automatically show users all statistically significant changes in their campaigns. Even if a variable doesn’t seem crucial to you, the data might tell a different story.
Once your test is set up, you’ll have to consider which segment of your users to target. This decision depends on the goals of your test.
If you’re implementing a brand new feature, it might be best to only expose the feature to your most dedicated users. You can send that segment a message announcing the new feature, and offering a channel for them to provide feedback. They’ll have an easier time understanding the feature, and they’ll provide valuable usage data. Using our Time Estimator, you’ll know exactly how long the campaign will take to reach statistical significance, before you press go on the test.
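To give a sense of where such an estimate comes from: the time to significance follows from the required sample size, which depends on your baseline conversion rate, the smallest lift you want to detect, and your daily eligible traffic. The sketch below uses the standard normal-approximation sample-size formula for comparing two proportions (at a two-tailed alpha of 0.05 and 80% power); it is a generic back-of-the-envelope calculation, not Leanplum's Time Estimator.

```python
import math

def days_to_significance(baseline_rate, min_lift, daily_users):
    """Rough days-to-significance estimate for a two-variant test.

    Standard two-proportion sample-size formula, with alpha = 0.05
    (two-tailed, z = 1.96) and 80% power (z = 0.84) baked in.
    `min_lift` is relative (0.10 = detect a 10% relative improvement).
    """
    z_alpha, z_beta = 1.96, 0.84
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_lift)
    p_bar = (p1 + p2) / 2
    # Required sample size per variant.
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    total_users = 2 * math.ceil(n)  # two variants
    return math.ceil(total_users / daily_users)

# E.g. a 5% baseline, detecting a 10% relative lift, 5,000 users/day:
days = days_to_significance(baseline_rate=0.05, min_lift=0.10,
                            daily_users=5_000)
```

Note how sensitive the estimate is to the minimum detectable lift: halving the lift you want to detect roughly quadruples the required sample, which is why small tweaks take much longer to validate than big ones.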
Once you’ve tested user reactions and you’re ready to roll out that feature to your full audience, you can send all your users a message about the update.
Alternatively, if you’re making a minor layout change, you can place frequent users and infrequent users into different segments. Perhaps infrequent users will be more confused by the change than frequent users (or vice versa). If the two groups are lumped together, their results will blend into a single average that doesn’t tell you whether the layout change was good or bad.
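A toy illustration with entirely hypothetical numbers shows how this happens. Here the layout change helps frequent users and hurts infrequent ones; blended together, the two groups produce identical overall rates and the test looks like a wash.

```python
# Hypothetical conversion counts for a layout-change test.
# segment: (control_users, control_conv, variant_users, variant_conv)
results = {
    "frequent":   (5_000, 600, 5_000, 700),  # 12.0% -> 14.0% (better)
    "infrequent": (5_000, 400, 5_000, 300),  #  8.0% ->  6.0% (worse)
}

# Blended rates: the opposing effects cancel out exactly.
control_rate = (sum(r[1] for r in results.values())
                / sum(r[0] for r in results.values()))
variant_rate = (sum(r[3] for r in results.values())
                / sum(r[2] for r in results.values()))
# Both come out to 10.0% -- "no effect," apparently.

# Per-segment rates tell the real story.
for segment, (cu, cc, vu, vc) in results.items():
    print(f"{segment}: {cc / cu:.1%} -> {vc / vu:.1%}")
```

Segmenting before you analyze is what separates "the test was inconclusive" from "the change helped one group and hurt another," which are very different findings.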
Ready to Start Your First Mobile A/B Test?
A/B testing is the crux of a successful mobile marketing campaign. If you’re using mobile marketing software like Leanplum, it’s worth taking the time to set up detailed tests, especially if you’re making frequent changes to your app. Fortunately, Leanplum’s tools make implementing tests a breeze. Start small, and build up to more sophisticated tests once you’ve gotten the hang of the platform.
Leanplum is the most complete mobile marketing platform, designed for intelligent action. Our integrated solution delivers meaningful engagement across messaging and the in-app experience. We work with top brands such as Expedia, Tesco, and Lyft. Schedule your personalized demo here.