Take assumptions out of the equation with A/B Testing

A/B testing is not a new thing. In fact, it's been around for as long as I can remember. Still, I think it's important to pause and take stock of how valuable A/B testing really is when it comes to improving business performance.

The overall premise of an A/B test is to run a B variant against the control (A variant) to see which performs better. That performance could be revenue, conversion rate, clicks, average order value or any other metric you're trying to influence with the test. The idea is that you test before you fully action the change, to make sure it isn't actually a terrible idea. There's nothing worse than making 'incremental improvements' only to realise that what you're doing is creating a worse experience for your customers.
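To make that concrete, here's a minimal sketch of how you might check whether a B variant's conversion rate beats the control by more than chance would explain. This isn't any particular tool's method, just a standard two-proportion z-test with made-up visitor numbers:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of control (A) and variant (B).

    Returns each rate and a z-score; |z| > 1.96 suggests the
    difference is significant at roughly the 95% level.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical numbers: 5,000 visitors per variant
p_a, p_b, z = two_proportion_z(conv_a=200, n_a=5000, conv_b=250, n_b=5000)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}")
```

In practice the testing tool does this maths for you; the point is simply that "better performance" means a difference big enough, over enough traffic, that it isn't just noise.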


However, the key benefit we see with any A/B testing is the shift in mindset of those doing the testing. Knowing that the changes you're putting in place are moving the needle in the right direction is great, but forcing yourself to always think about what could be done better, even if it's just slightly better, is a great place to get to.

I'm not talking button colour changes, which was once a top recommended test by every eCommerce expert out there (I'm also not saying it's not a valid test, but let's face it, you'd need to be pretty far down your list of test hypotheses to get to that one). I'm talking about value propositions highlighted on the homepage, testimonials or product reviews on product, cart or checkout pages, homepage structure hierarchy, or gift with purchase vs free shipping. All of these tests take what we think we know and challenge it, to see what actually works for our customers.

"It's worked before, so it'll work now"

It's so easy to get bogged down with the idea that it's worked before, or it's worked for someone else, or worse, it's what I would want if I were the customer, when the reality is, it's none of those. We simply don't know what our specific customers are looking for, and forcing our own assumptions on them is a very dangerous game. A/B testing allows us to test those assumptions before making the decision, and to use actual data to determine whether what I want as a customer is actually what our customers want.

"I just don't think it's the right thing for our customers"

The other wonderful thing about A/B testing is that it gives you an easy answer to the inevitable pushback on certain ideas. You know the one: someone says "I just don't think it's the right thing for our customers". Without A/B testing, that idea is dead in the water. It's too hard to get it past the powers that be. But, with the help of Google Optimize or any other testing tool, you can now respond with "how about we A/B test it and let the customers decide".

This line has a surprisingly good hit rate when it comes to getting an idea actioned without full buy-in. It's the diplomatic approach to site improvement. You can run the test for a few weeks and present a well-rounded argument to either fully action the change, or leave things the way they are and move on to the next test.


A/B testing has been around a very long time, but it's only recently that I've fallen back in love with the practice. Continuous testing has delivered some phenomenal results for our clients, and really helps us push past our biases and opinions and craft an experience that our customers respond well to.