
How to Get More Out of A/B Testing When Your ‘Better’ Variant Fails

500 Global Team

PUBLISHED

2014.07.02

We’re big advocates of A/B tests on your most important real estate. But what happens when you run an A/B test where your new ‘improved’ version doesn’t win?

Today we look at how Love With Food (a 500 company) extracted an important conversion funnel optimization even when their original version won a major homepage A/B test.

First, an A/B review. It’s all too easy to fall into the shotgun approach to A/B testing. Remember that frequency doesn’t make up for randomness: simply running more tests won’t, on its own, yield meaningful insights or lift.

Instead, the A/B test itself is just one part of the optimization process:

1. Start with a problem
2. INTERVIEW customers
3. Design a hypothesis
4. Test the hypothesis (see the sketch below)
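
Step 4 is where the numbers come in. As a rough illustration only (this is not Love With Food’s actual tooling, and the traffic counts below are invented), here is a minimal two-proportion z-test for deciding whether the difference in conversion rate between two variants is likely to be real or just noise:

```python
# Hypothetical sketch of step 4: a two-proportion z-test comparing the
# conversion rates of two homepage variants. All counts are invented
# placeholders, not Love With Food's real traffic.
from math import sqrt, erf

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Return (z statistic, two-sided p-value) for the difference in rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    # Standard normal CDF via erf, then a two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Original (streamlined) page vs. new (cluttered) variant -- made-up numbers
z, p = two_proportion_z_test(230, 5000, 195, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p-value means the gap is unlikely to be chance
```

Any off-the-shelf testing tool runs an equivalent check for you; the sketch just makes explicit that a “winner” is only a winner once the gap clears the noise floor.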

If done properly, steps 1 – 4 require creativity and a good amount of resource investment, all of which tends to bias us toward the exciting new variant we worked hard to design.

However, just because your experiment tells you that you were right to begin with doesn’t mean you have to call the test a total wash.

Let’s see how Love With Food designed a stronger conversion funnel, even when a homepage A/B test pointed to the original as the winner.

Aihui Ong:

We’re always trying to make sure the messaging is clear on the Love With Food Homepage, so we installed Qualaroo and started asking visitors, “What do you think Love With Food does?”

Some people said, “Ok, you are a snack box subscription.” They understood right away.

Others weren’t so sure. For example, people who didn’t understand subscription models had a harder time.

We decided to take the customer insights a step further because it’s important that customers understand what we offer right away.

So we went to Starbucks and approached strangers with our laptops. We ran a little UX test, then quizzed them: do you know what we do, what is the pricing, how do you cancel, etc.

All of the people we interviewed said basically the same things:

‘I need more information. One sentence telling me I can discover healthy snacks is not good enough. I don’t know what I’m getting, so can you show me what’s been in past boxes?’

and so on.

As a result of these studies, and because it seemed so unanimous, we decided to make a homepage version with a lot of information, answering every question. It was very cluttered (which my team hated).

We tested both versions, cluttered versus streamlined. In the end, our original homepage — with less information — still converted more people.

The moral of this part of the story is you have to test.

We asked detailed questions of a group of people face to face, and all of them said “I need more information in order to be convinced!”

However, testing across thousands of site visitors showed the opposite was true.

So you ended up with the original homepage. What was the benefit of this test then?

The face-to-face part of it helped us to understand the why, but the end result — does it translate to more customers? — was a totally different thing.

We also learned something very valuable from designing the test. We learned that people do want information, and so they like to go from homepage to the About page. Many of our visitors like to read about us, the company and team behind the boxes.

What did you change / improve then? 

A lot of marketers think that the shorter the funnel, the better your conversion rates.

For us, that turned out not to be entirely true.

Our funnel A/B test didn’t change our homepage design, but it did show us that people want to visit our About page and learn about the company and the team before giving us their credit card.

Having the About page in the middle of the funnel actually helped with conversion. It lets us satisfy people’s need for more information without cramming it all onto a single homepage.

We did the face-to-face tests, and then we actually created a version with everything the customer wanted. But it was still the original version — the one based on our gut feeling — that was the winner.

But we still learned something extremely valuable from that test, and the insights work that led up to it.

Just because our gut was right that time doesn’t mean we’re going to stop testing.

We say, OK, this is what customers SAY they want, but this is what our results show… so how can we incorporate the customer feedback in some other way?
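
That kind of funnel insight is easy to verify with basic page-view data. Below is a minimal, hypothetical sketch (the page paths and sample events are invented, not Love With Food’s actual analytics setup) of counting how many visitors step through a homepage → About → checkout funnel in order:

```python
# Hypothetical sketch: measuring step-through on a homepage -> About -> checkout
# funnel from ordered page-view events. Paths and sample events are invented.
from collections import defaultdict

FUNNEL = ["/", "/about", "/checkout"]  # assumed funnel steps, in order

def funnel_counts(events):
    """events: iterable of (visitor_id, path) page views in time order.
    Returns how many visitors reached each funnel step in sequence."""
    next_step = defaultdict(int)      # visitor_id -> index of the step they need next
    reached = [0] * len(FUNNEL)
    for visitor, path in events:
        step = next_step[visitor]
        if step < len(FUNNEL) and path == FUNNEL[step]:
            reached[step] += 1
            next_step[visitor] = step + 1
    return reached

events = [
    ("v1", "/"), ("v1", "/about"), ("v1", "/checkout"),
    ("v2", "/"), ("v2", "/about"),
    ("v3", "/"),
]
for path, count in zip(FUNNEL, funnel_counts(events)):
    print(f"{path}: {count} visitors")   # prints 3, 2, 1 for the sample events
```

Comparing those step-through rates before and after putting the About page into the flow is what tells you whether the longer funnel is actually paying off.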

———-

Love With Food is a snack box subscription service and ecommerce store that specializes in natural, organic and gourmet snacks.

The company is part consumer-facing sub-commerce, part market intelligence platform. As the company engages its female “foodie” target demographic, it’s also collecting product intelligence for consumer food brands of all sizes, including General Mills, Nestle, Green and Blacks Organic Chocolate, SoyJoy, and Lindt Chocolates.

Love With Food participated in the 500 Distribution Program last fall, working closely with Hacker-in-Residence Matt Berman on paid acquisition, better analytics, and drip and trigger emails, and now they’re so awesome they’re hiring.

Check out Love With Food’s current openings here.
