A/B testing is something we come across more and more, typically in organisations with greater digital maturity. While I love A/B testing, too often I see firms taking a shotgun approach: firing off tests without the genuine insight needed to create meaningful hypotheses, and hoping to hit a random target. While this may sound like a rant, I think it is important to consider our ways of working in the digital space and ensure that we gain maximum value from our efforts.

So I was interested to come across this article on the Marvel App blog with the wonderfully click-baity title 'A/B Testing – You’re Doing It Wrong'.


The article includes some interesting findings:

  • Only 10% of experiments resulted in actionable change — formally releasing a new version of a page or feature

  • 50% of teams could not make decisions from A/B testing experiments due to inconclusive or poorly measured data

This sums it up nicely: “Companies may be running A/B tests too frequently for too little time, contributing to a high failure rate that makes A/B test results less valuable and meaningful.” Note that the data above comes from a small and potentially non-representative sample, or as the author states: “This next dataset is from a qualitative and quantitative A/B testing survey of 26 A/B testing practitioners conducted from May 1 to May 30 in 2016 (Northwestern, IDS — Justin Baker, 2016). While this is not the end-all be-all of surveys, it could still give us some meaningful insights.” While the sample may not be perfect (when are they?), the findings reflect my personal experience.
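To put some numbers behind “too little time”, here is a minimal sketch of my own (not from the article or the survey) using the standard two-proportion sample-size formula. The baseline conversion rate and the lift below are illustrative assumptions; the point is simply that detecting a modest lift takes far more traffic than a short test usually collects.

```python
# A rough illustration of why tests run for "too little time" come back
# inconclusive: the sample needed to detect a modest lift is usually far
# larger than a few days of traffic provides. The baseline rate and lift
# used here are illustrative assumptions, not figures from the survey.
from math import ceil

def sample_size_per_variant(p_baseline, p_variant):
    """Approximate visitors per variant needed to detect the difference
    between two conversion rates at 5% significance (two-sided) with 80% power."""
    z_alpha = 1.96   # z-value for two-sided alpha = 0.05
    z_beta = 0.8416  # z-value for power = 0.80
    variance = p_baseline * (1 - p_baseline) + p_variant * (1 - p_variant)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p_baseline - p_variant) ** 2)

# Example: detecting a 10% relative lift on a 5% baseline conversion rate
print(sample_size_per_variant(0.05, 0.055))  # roughly 31,000 visitors per variant
```

At that scale, a page receiving a few thousand visitors a week would need months to reach a conclusive result, which sits comfortably alongside the survey finding that half of teams could not make decisions from their experiments.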

I believe that at the heart of poor A/B testing lies a lack of data triangulation: using multiple data sources to understand the user experience and to create hypotheses that can be tested. Some of the most meaningful A/B tests I have seen were those where qualitative data was used to understand a problem before a solution was identified and tested.

We find that combining A/B testing with some form of qualitative research (such as journey mapping) is ideal for understanding the 'why' behind an issue and for informing test hypotheses.

What are your thoughts?

About the author

Chris Gray

Managing Director & Principal Consultant

Chris is a leader in the Human Centred Design field with a 20+ year track record of improving customer interactions with some of Australia’s largest organisations. He is a strategic thinker who brings a calm and considered approach to tackling complex problems. An accomplished workshop facilitator, Chris excels at engaging with senior stakeholders and guiding projects to success. Chris has expertise in user research, service design and embedding Human Centred Design within organisations.

Interested to know more? Let’s Talk.