Here’s the truth: A/B testing is likely a giant waste of time for you.
And, to be honest, for 99% of sites on the internet as well.
Here’s why: Most sites have too little traffic and too low of a conversion rate to draw any useful conclusions from A/B tests.
I’m going to show you why this is, and more importantly, how you can optimize your conversions without A/B testing until you get enough traffic.
Why A/B testing is useless (for most online businesses)
Recently, I wrote about how I changed some copy and removed a section to improve a wedding website’s free trial conversion rate by 73%. A few tests later, I thought I had another winning variation. The early signs were promising. Check out what the test looked like after three weeks:
18.3% improvement, 95% statistical confidence, 168 conversions for each variation. Even Optimizely had declared my test a winner! (Look at that beautiful green check mark at the top).
“I just need a little more data,” I thought (keep reading to learn why). Then they will love me. Everyone will love me. The whole world will love me.
Then I ran it for another week, and like a child on Christmas morning, I opened up my Optimizely dashboard:
Oh, that sinking feeling when you realize your baby is not as cute as you thought. Almost identical. No improvement. Zero.
Why did the variation dominate for 3 weeks, then take a nosedive and end up equal to the original? Honestly, we don’t know. A/B testing isn’t good at telling you why a variation wins; it only tells you whether one does. For you A/B testing nerds, there are a few possibilities:
- It could be nothing more than statistical chance that B was winning for a while before reality hit.
- It could be that after 3–4 weeks, enough users cleared their cookies (Ton Wesseling says that within 2 weeks you can expect 10% of users to have cleared their cookies) that the results started getting muddled and both variations regressed to one average rate.
- Or, it could be that the last week was a holiday week (Thanksgiving) and people were using relative’s computers, showing family, etc., so the same users were seeing different variations.
But, we know this: If I had stopped the test a bit earlier, I’d have proudly declared B the winner. The client would have been happy, and I would have gone and ordered this pad thai dish and eaten it all in one sitting, in celebration:
But all those mouthwatering rice noodles would have been for naught, because a year later, the client would not have seen the 18.3% bump in free trial signups I’d promised.
That’s a bad outcome.
Best to avoid that at all costs. You don’t want to think you have a conversion boost when nothing is actually happening. I want to show you why you can’t trust early A/B test results.
How do you know if your A/B test is valid?
The harsh reality is: Just because your testing tool says a variation has won by reaching 95% significance, that doesn’t mean the result is “valid”.
Here’s the brief version: Since we’re dealing with statistics, there is always some chance that after your test shows 95% confidence, collecting more data would push it back below significance. Stopping your test the moment it hits 95% confidence guarantees you never see those reversals.
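To see why this “peeking” is so dangerous, here’s a small simulation (my own sketch, not from any testing tool): it runs A/A tests, where both variations have the exact same true conversion rate, checks significance every 100 visitors, and declares a winner the moment it sees 95% confidence. The rate of bogus “winners” comes out well above the nominal 5%.

```python
import random

def peeking_false_positive_rate(n_tests=1000, n_visitors=2000,
                                check_every=100, z_crit=1.96, seed=1):
    """Simulate A/A tests (both variations share the SAME true
    conversion rate) and count how often repeated peeking produces
    a bogus 'significant' winner."""
    rng = random.Random(seed)
    p = 0.05  # true conversion rate for both A and B
    false_wins = 0
    for _ in range(n_tests):
        a = b = 0
        for i in range(1, n_visitors + 1):
            a += rng.random() < p
            b += rng.random() < p
            if i % check_every == 0 and a + b > 0:
                # Two-proportion z-test at this peek
                pooled = (a + b) / (2 * i)
                se = (2 * pooled * (1 - pooled) / i) ** 0.5
                if se > 0 and abs(a - b) / i / se > z_crit:
                    false_wins += 1  # declared a winner that isn't real
                    break
        # tests that never 'won' are correctly inconclusive
    return false_wins / n_tests

rate = peeking_false_positive_rate()
print(f"False positive rate with peeking: {rate:.0%}")  # well above 5%
```

If you only checked significance once, at the end, you’d be wrong about 5% of the time. Peek 20 times and stop at the first green check mark, and you’re wrong far more often — which is exactly what happened to my test above.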
So more data is always better. But how much is enough? Great question.
There’s no “magic” number, so we just have to be practical.
Here are two options:
1. Use this great tool from Evan Miller: plug in your current conversion rate at the top and the percent difference you want to be able to detect at the bottom, and it will give you an approximate sample size to test:
2. I also like this general rule from Peep Laja of Conversion XL: Each variation should have 200 or more conversions, the test should run for at least 3 weeks, and the statistical confidence should be at least 95%.
Let’s look at rule #2 for a minute: 200 conversions per variation. Minimum.
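If you’d rather sanity-check sample sizes yourself, the textbook two-proportion formula (95% confidence, 80% power) fits in a few lines. This is a standard approximation of what calculators like Evan Miller’s compute, not his exact implementation:

```python
from math import ceil

def sample_size_per_variation(base_rate, relative_lift):
    """Approximate visitors needed per variation to detect a given
    relative lift at 95% confidence (two-sided) and 80% power."""
    z_alpha, z_beta = 1.96, 0.84  # standard normal quantiles
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# A store converting at 1% that wants to detect a 20% relative lift:
n = sample_size_per_variation(0.01, 0.20)
print(n)  # roughly 43,000 visitors per variation
```

Notice how brutal low conversion rates are: at a 1% base rate, even a big 20% lift takes tens of thousands of visitors per variation to detect reliably.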
Say you just started an online t-shirt store with Shopify and you’re getting about 10,000 visitors per month. Now say your overall conversion rate is 1%. So you’re getting about 100 sales a month.
But you want to grow.
What should you do? “Let’s increase our conversion rate!” is not a bad place to start. Lots of big ecommerce brands convert at more than 1%.
And that annoying braggy friend of yours who also has an online store is getting 3%, and you definitely want to beat them. Ugh, they’re so annoying with all their annoying conversions.
Should you open an Optimizely account and start testing? Likely not.
Here’s why: Each of your variations is going to get about 5,000 uniques and 50 conversions on average…per MONTH.
To get 200 conversions per variation at 50 per month, you’d have to wait about four months. That’s dangerous. Remember, roughly 10% of users clear their cookies every 2 weeks, so over four months a large share of your returning visitors will have cleared cookies. When they come back, they may see the opposite variation, and your data will get muddled.
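You can run this arithmetic for your own traffic numbers. Here’s a tiny helper (my own illustration, not a real tool) that tells you how long a two-variation test would take to hit 200 conversions each:

```python
def months_to_target(monthly_visitors, conversion_rate,
                     variations=2, target_per_variation=200):
    """How many months until each variation collects enough
    conversions, assuming traffic splits evenly between variations."""
    conversions_per_month = (monthly_visitors / variations) * conversion_rate
    return target_per_variation / conversions_per_month

# The t-shirt store example: 10,000 visitors/month at 1% conversion
months = months_to_target(10_000, 0.01)
print(months)  # -> 4.0
```

Four months for a single test. At that pace you could run maybe three tests a year, most of which will be inconclusive anyway.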
So what should you do to get more conversions?
The ABC Method: How to optimize conversions without A/B Testing
You know when you think of three things, and they start with A, B, and C, and you think, “OMG, I can call this the ABC method!”
That just happened!
Anyway, here’s how to keep your conversion rate from staying terrible and losing you money, even though A/B testing isn’t practical yet.
A is for Analytics – Check for low-converting segments in Google Analytics
You can often find hidden drains on conversion by looking through your analytics and sorting conversion goals by segment. Here’s how:
1. If you haven’t already, set up goals in Google Analytics. (If you want to know how, I recorded a video of me setting up goals in Google Analytics. Get it here for free.)
2. Give your goals some time to gather data. Then check your conversion rate across different pages. To see conversion rates across different pages, go to Behavior / Site Content / Landing pages.
3. Check those conversion rates across different segments:
For example, is there a sharp drop in conversion rates for people on mobile devices? If so, you can dig into your site’s mobile experience and work on improving it, knowing you’re fixing a real problem rather than guessing.
See the difference? No more guessing and working on some aspect of your business hoping that it will help.
You can get super fancy with segments: Do people who saw a certain page convert better? Returning vs. new visitors? People who arrive via search vs. a referral? It goes on and on.
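As a concrete sketch, here’s what that segment check looks like as code. The numbers below are made up for illustration; in practice they’d come from your Google Analytics report or export:

```python
# Hypothetical numbers standing in for a Google Analytics device report
segments = {
    "desktop": {"sessions": 6200, "conversions": 93},
    "mobile":  {"sessions": 3100, "conversions": 12},
    "tablet":  {"sessions": 700,  "conversions": 9},
}

total_sessions = sum(s["sessions"] for s in segments.values())
total_conversions = sum(s["conversions"] for s in segments.values())
overall_rate = total_conversions / total_sessions

for name, s in segments.items():
    rate = s["conversions"] / s["sessions"]
    # Flag segments converting at less than half the site average
    flag = "  <-- investigate" if rate < overall_rate / 2 else ""
    print(f"{name:8s}{rate:6.2%}{flag}")
```

With these made-up numbers, mobile converts at well under half the site average — that’s your signal to go look at the mobile experience first, instead of guessing.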
In the bonus section, I’ve put a link to a YouTube video of me showing you how to set up Goals in 2 minutes. You can get it free here.
B is for Best Practices – Implement these even without A/B testing
Even though you don’t have enough traffic to A/B test, that doesn’t mean you can’t implement some well-known best practices.
The hardcore optimizers may huff and puff at this – saying that you can’t be sure your improvements are actually improving anything.
And, honestly, they’re right. You can’t prove it.
But we’re not talking about trying 39 variations of your headline or button color.
There are a few key best practices that make things easier for the user and generally improve your conversion rate.
Best Practice Example 1: Reduce form inputs
For example, in many, many cases, people have found that more form inputs mean fewer conversions. In one well-known case study, Expedia removed just one field from their checkout process and increased their profits by $12 million.
Lesson: simplify checkout as much as possible.
Best Practice Example 2: Match Ad and Landing Page Headlines
Another one of my favorites is for PPC ad landing pages: Make sure the headline on the page you’re sending customers to matches the headline of the ad they clicked.
For example, don’t send an ad for a tennis racquet to the homepage of your “All Sports!” store. I have literally never seen a case study showing the opposite to be true (but feel free to link to one in the comments).
If you want to find more best practices, I recently teamed up with Brian Dean from Backlinko to make one of the biggest lists of unique conversion tactics around. It’s interactive, with filters so you can choose what kind of tactics you want:
A word of warning: not all of those techniques (or most CRO techniques) fall under the umbrella of “almost always works”, so tread lightly. Before you have enough traffic to A/B test, just pick large changes with solid case studies backing them.
In the bonus section, I’ve included my top 4 conversion optimization best practices (2 more beyond these). You can get it here.
C is for Customers
This is my favorite strategy because it’s so obvious yet so few businesses do it.
Think about what “converting” a visitor or prospect even means: It just means giving them what they want.
If you know exactly what a customer wants, converting them should be trivial: Just give them that (and do it profitably).
But so many businesses don’t actually ask customers what they’re looking for – they just assume they know.
Deeply understand what your customers actually want, what they fear, what they’ve tried, and what’s holding them back. Then systematically change your product, offering, positioning, or experience to get closer to what they want. That can work wonders.
For example, in the wedding website case study above, I changed the text near the CTA from focusing on “free trial” and “no credit card required” to focusing on what customers were actually looking for: making a beautiful website quickly. We saw a 100%+ increase in clicks on that CTA.
Here’s how to get more insight from customers:
1. Ask over email
When they sign up for your email list, send an email asking one key question you want answered. Not a “7-minute survey” – everyone hates those. Just a personal email from you asking “Hey, what are you struggling with?” or “Hey, what kind of clothes do you wish we carried that we don’t?” or “Hey, what’s your biggest issue with enterprise sales apps?” Something open-ended that lets them vent.
Derek Halpern wrote about this technique for bloggers long ago and here’s the exact email I got from him when I joined his list:
Try something similar for subscribers who join your list.
2. Ask with a Survey
I like Qualaroo-style surveys the best, because they’re less intrusive than pop-ups, and knowing what your customers want can be well worth the $60/month (or whatever they’re charging now).
You can put general questions on your home page and more specific questions on specific product pages.
3. Chat with your customers
Many SaaS and ecommerce sites are using chat these days – and for good reason. Actually engaging with customers tells you a ton.
The best part is that when you’re a small company (that doesn’t have enough traffic for A/B testing yet), chances are it’s you, the business owner, on the other side of the chat, which means you’ll get fabulous insight into what’s holding customers back.
You can use this insight to shape your offer or change your copy.
Again, these services aren’t free, but they’re worth their weight in gold. (I have zero affiliation with any of them, choose whoever you want.)
Helloify is one such service, but there are many others.
How To Take the First Step
- If you can’t get approximately 200 conversions per variation in a month or so, you are likely wasting time with A/B testing.
- While you grow traffic, try these instead:
- Check your analytics for large differences by segments
- Implement a few conversion best practices
- Talk to your customers to learn what they truly want
I don’t want you to consume this information and do nothing, so I put together a bonus for people who want to actually take action on the above three steps.
Here’s what’s in the free bonus:
- A video of me showing you how to set up your first pageview goal in Google Analytics. If you’ve avoided setting up a goal because it sounds complicated, watch this, and in less than 5 minutes you’ll start measuring your own goal.
- My top conversion optimization best practices, culled from the complete list.
- My list of 30 questions you can ask your customers across ecommerce, SaaS, and blogging
You can download them by clicking exactly where I’m pointing to:
I’ll help you in the comments
I’d love to know what you’ve tried and where your biggest hang ups have been. Let me know in the comments and I’ll try to answer every question.