Sticky Smartphone


How to get retention & engagement numbers out of mobile analytics A/B testing

For most people, A/B testing (or split testing) is all about conversion. How can I redesign this webpage to get more people to click through to my goal? How can I get more people to sign up? It’s all about acquisition, acquisition, acquisition… but actually there’s more to it than that.

For mobile app analytics, A/B testing tools generally support campaign or content optimization focused on conversions, clicks or triggers. This is fine if what you are doing is marketing, acquisition or activation focused, but it doesn’t really help much with engagement or retention. What if you wanted to find out the impact of:

  • Giving new players in your game different starting statistics?
  • Changing the order in which the app screens are presented?
  • Modifying a tutorial page?

and so on. What change will increase retention, not just conversions? What change will increase use of other features?

What conventional A/B testing doesn’t tell you - 

  • Engagement - Which users are more engaged with my app as a whole, or with feature X of my app, because of A or B (beyond the feature/layout under test)?
  • Retention - Which users are more likely to be retained because of A or B?
  • Features - Given that A or B increased retention, which features are used more or less by each group?
  • Deeper Activity - Does A or B increase activity anywhere else in my app?

In fact there’s a very easy way to do this using event attributes or event parameters (the name depends on the app analytics tool you use). In your app, tag users in the ‘A’ group with an event attribute/parameter of ‘Test Group A’ on their App Launch event, and likewise ‘Test Group B’ for ‘B’, or whatever naming convention makes sense for you. You can use more specific names to distinguish different tests.
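As a rough sketch (in Python for brevity, even though the real call would live in your app’s own language), here is what that tagging could look like; the `track` function is a stand-in for whatever analytics SDK you actually use, and names like `ab_test_group` are purely illustrative:

```python
import hashlib

def assign_test_group(user_id: str) -> str:
    """Deterministically split users 50/50 so a given user always
    lands in the same group across sessions."""
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 2
    return "Test Group A" if bucket == 0 else "Test Group B"

def track(event_name: str, attributes: dict) -> None:
    """Stand-in for your analytics SDK's event call; most SDKs let you
    attach a dictionary of custom attributes/parameters to an event."""
    print(event_name, attributes)

def on_app_launch(user_id: str) -> None:
    group = assign_test_group(user_id)
    # Tag App Launch (and ideally every later event) with the group so any
    # metric in the analytics tool can be segmented by it afterwards.
    track("App Launch", {"ab_test_group": group})

on_app_launch("user-1234")
```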

By doing this, you have segmented your users into A & B and now you can test the impact across every event/metric in your app.

Segment and extract all the data into a spreadsheet, and you can then test it for statistical significance like this -

In the above chart, anything in green tested as statistically significant. This gives deeper insight into what other changes occurred due to the A/B test, and it means you can run A/B tests that go beyond top-level ‘click’ or ‘tap’ conversions, and remove features that don’t add to engagement, retention or revenue.
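If you prefer to run the test outside a spreadsheet, a minimal sketch of the same idea in Python is below; the counts are made-up placeholders, and scipy’s chi-square test is just one reasonable choice of significance test:

```python
from scipy.stats import chi2_contingency

# Made-up placeholder counts exported from the analytics tool:
# for each group, how many users did / did not trigger a given event.
table = [
    [120, 880],   # Test Group A: [triggered event, did not]
    [155, 845],   # Test Group B: [triggered event, did not]
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")

# At the usual 0.05 threshold, a small p-value suggests the difference
# between A and B on this event is statistically significant.
print("significant" if p_value < 0.05 else "not significant")
```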

Secondly, using tools which provide retention or lifecycle metrics, you can create a segment using the event parameter and see which version really does provide better retention. This allows you to make more discoveries (a rough sketch of the comparison follows the list below):

  • Conversions were higher with A, but retention was lower. Customer Lifetime Value may be lower because of this, so long-term revenue could decrease
  • Conversion was higher with A, but a higher-value conversion on a deeper feature was higher with B
  • There is no significant difference in conversion between A and B, but retention is higher with A
  • There is no significant difference in conversion, but engagement is higher with B

These are some of the potential discoveries from just looking beyond the basic A/B test.
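Here is the retention comparison sketched in Python; the exported rows and the choice of day-7 retention are illustrative assumptions, not a prescription:

```python
from collections import defaultdict

# Illustrative export: (user_id, ab_test_group, day_of_activity_since_install)
events = [
    ("u1", "Test Group A", 0), ("u1", "Test Group A", 7),
    ("u2", "Test Group A", 0),
    ("u3", "Test Group B", 0), ("u3", "Test Group B", 7),
    ("u4", "Test Group B", 0), ("u4", "Test Group B", 7),
]

installed = defaultdict(set)   # users seen on day 0, per group
retained = defaultdict(set)    # users seen again on day 7, per group
for user, group, day in events:
    if day == 0:
        installed[group].add(user)
    if day == 7:
        retained[group].add(user)

for group in sorted(installed):
    rate = len(retained[group] & installed[group]) / len(installed[group])
    print(f"{group}: day-7 retention = {rate:.0%}")
```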

Do you have any mobile analytics tricks to share?

Image by Search Engine People Blog

Filed under: a/b testing, app analytics, conversion, engagement, mobile a/b testing, mobile analytics, mobile apps, mobile metrics, retention, retention and engagement, split testing
