You have a website. But is it really working as effectively as it could be? How do you know? There’s an easy way to test if specific tweaks could make a big difference to your website’s outcomes. It’s called A/B testing.
What is A/B testing?
As a dog trainer you may already be familiar with the concept of A/B testing. If you’ve done food preference testing, you probably did something like this…
Let’s say you usually use hot dogs (they’re the “control” food you’re testing against). Then:
- Test 1: hot dog vs roast chicken (chicken wins!)
- Test 2: hot dog vs cheese (cheese wins!)
- Test 3: roast chicken vs cheese
And the winner is…roast chicken!
Being a savvy dog trainer, you realize that your result is only valid for THIS dog, under THESE conditions, in THIS situation. Your result is context dependent, but it’s useful information, nonetheless.
Like preference testing for training, A/B testing components of your website is a valuable exercise — it gives you information about what’s working well now and what could be more effective.
So let’s look at A/B testing in a bit more detail.
How do I run an A/B test?
First, let’s look at how it’s usually done: if you’re a humongous corporation you spend loads of money paying another big tech business to design and run your tests for you. You give the testing company the details of what you want to improve and they take it from there.
But most dog trainers don’t have BIG marketing/tech budgets to spend.
So, how might you or I do it? By keeping it simple and targeted with 4 questions:
- What is my goal? (increase in blog sign ups, customer inquiries, class bookings, etc.)
- Which one element do I want to test on the specific page? (signup button, CTA (Call To Action), headline, image, form, etc.)
- What am I going to alter and compare? (size, color, wording, layout, image, etc.)
- How long will each version run? (a week, a month, a quarter, etc.)
Start with an uncomplicated plan and iterate as you go. Your plan might initially look like this (I've sketched out how it could work in code just after the list):
- What do I want to achieve? More group class sign ups.
- What will I test? The signup button design on the group class page.
- What am I going to alter and compare? Change the color from green to red.
- How long will each version run? Two months.
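If your website lets you add a small script (and your platform doesn't already have a built-in split-testing tool), here's a minimal sketch of how that green-vs-red plan might be wired up. The button id, the colors, and the /api/track endpoint are all placeholders; swap in your own element and whatever analytics you actually use.

```ts
// Minimal A/B split for the green-vs-red signup button test.
// Assumes a button with id "signup-button" and a hypothetical
// /api/track endpoint for logging events; replace both with your own.

type Variant = "A" | "B"; // A = green (control), B = red (challenger)

function getVariant(): Variant {
  // Keep each visitor in the same variant across visits.
  const stored = localStorage.getItem("signupButtonVariant");
  if (stored === "A" || stored === "B") return stored;
  const variant: Variant = Math.random() < 0.5 ? "A" : "B";
  localStorage.setItem("signupButtonVariant", variant);
  return variant;
}

function track(event: string, variant: Variant): void {
  // sendBeacon logs the event without delaying the page;
  // swap in a call to your own analytics tool here.
  navigator.sendBeacon("/api/track", JSON.stringify({ event, variant }));
}

const variant = getVariant();
const button = document.getElementById("signup-button");
if (button) {
  button.style.backgroundColor = variant === "A" ? "#2e7d32" : "#c62828";
  track("page_view", variant);
  button.addEventListener("click", () => track("signup_click", variant));
}
```

Running both versions at the same time like this, rather than one after the other, also protects you from seasonal effects (more on that in the caveats below). Many website and email platforms have split testing built in; the sketch just shows what's happening under the hood.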
What should I A/B test?
There are many different aspects of your website that can be improved using A/B testing. Below I’ll go through the most common ones and give some examples of results other businesses have found.
Buttons
Buttons are a great place to start A/B testing. Why? Because if their design misses the mark, people won’t click on them! And the whole function of most web pages is to get visitors to click on specific buttons — to act and move through the process of browsing to buying.
Button colors
Buttons are deceptive. They look so simple, but small changes can have big results. Let's talk about their color. For example, when HubSpot changed a “Get Started Now!” button from green to red they found a 21% increase in sign ups. That’s pretty impressive!
So, does that mean you should change all your buttons to red? Probably not. In the HubSpot test, green was the predominant color on the page. Changing the button to red made it pop — it stood out like a greyhound at a pug party.
The thing about button color is that it needs to be taken in context of your brand colors and the design of the whole page. Remember: if you want people to click your buttons, they need to really stand out. Color contrast is just one way to achieve that.
Button size
When it comes to buttons, size really does matter. They need to be big enough to be obvious without being so big as to be annoying. Buttons don’t only appear on your website; you might have them in your emails too.
Delivra ran some tests on their email CTA buttons; one involved increasing the button size from a moderate 49px x 292px to an immense 79px x 436px. No missing that button! Surprisingly, bigger didn’t seem to be better. The click-through rate on mobile devices stayed the same, but click-through on desktops dropped by a whopping 18%.
What does that mean for you and me? Basically, if you think changing the button size might improve effectiveness, give it a shot. It might be that increasing (or decreasing) your button size by a moderate amount is the way to go. You won’t know if you don’t test it!
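If you do test button size, it's worth breaking your results down by device, since (as Delivra found) mobile and desktop visitors can react very differently. Here's a small sketch of that breakdown, assuming you've logged view and click events like the ones in the earlier example, each tagged with a device type:

```ts
// Click-through rate per variant and device, from hypothetical logged events.

interface TrackedEvent {
  variant: "A" | "B";
  device: "mobile" | "desktop";
  event: "view" | "click";
}

function clickThroughRates(events: TrackedEvent[]): Record<string, number> {
  const views: Record<string, number> = {};
  const clicks: Record<string, number> = {};
  for (const e of events) {
    const key = `${e.variant}/${e.device}`; // e.g. "B/desktop"
    if (e.event === "view") views[key] = (views[key] ?? 0) + 1;
    else clicks[key] = (clicks[key] ?? 0) + 1;
  }
  const rates: Record<string, number> = {};
  for (const key of Object.keys(views)) {
    rates[key] = (clicks[key] ?? 0) / views[key]; // clicks per view
  }
  return rates;
}
```

A result like Delivra's (flat on mobile, down 18% on desktop) only shows up when you look at the segments separately; the overall average would hide it.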
Calls To Action
CTAs are another element just ripe for tweaking. Words have power, and small changes in wording can have major consequences for visitor behavior.
Onilab found that changing a CTA on a membership loyalty program signup button from “Sign me up!” to “I’m in!” resulted in an 84% increase in button clicks.
In another test they found that changing the CTA text from “Learn More” to “Show Me App Connections” resulted in 19% more click-throughs.
Who wouldn’t relish improvements like those?
A good CTA is one that connects with your audience. In the first example, “I’m in!” conveys a sense of joining an exclusive club rather than signing on the dotted line. In the second, “Show Me App Connections” gives readers more information and hints at benefits that might be useful for them.
Headlines and subheadings
It’s well known that website users scan rather than read pages. Why? They’re information gathering — their eyes scuttle over the page instead of moving from word to word. Because of this, your headlines and subheadings are super important — and well worth testing for effectiveness.
When Highrise decided to A/B test their signup page they found that tweaking just a few words in their headline and subhead increased subscriptions considerably.
They started with a version that looks OK at first glance: it gives all the information in a no-frills way. But somehow it was missing the mark, and it turned out to be the worst performer!
The winning version highlighted the 30-day free trial and called attention to the fact that signup is quick (less than 60 seconds). The implication is “What have you got to lose? Give it a go!” How much better was it? 30% better. Those are great results!
The second-place version kept the same headline, but its subhead called out different benefits. This resulted in 27% more signups. Again, results not to be sniffed at.
If your pages aren’t working as well as you’d like, try playing around with your headlines and subheadings. The truth is, if they’re not pulling their weight, the rest of your copy won’t get read (no matter how good it is).
Caveats and considerations
Remember how right at the beginning of this post I said that preference testing results could only be taken in context? Well, A/B testing results are the same. If you read an article saying that company X increased their result metrics by Y%, all you can really know is that for their business, with their website and their audience at that time, the tweak made a measurable change.
Does that mean you can’t test their findings on your own website? Of course you can! And you might well find that a similarly easy modification has equally impressive results for you. (Which is why A/B testing on a small scale is worth the effort!)
But be aware that outside factors can mask results. For example, it may not be valid to compare puppy group signup rates during puppy season (spring) with those of winter.
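One more sanity check before you declare a winner: make sure the difference you're seeing is bigger than plain luck. Here's a rough back-of-envelope check, assuming you've counted visitors and signups for both versions over the same period. It's the standard two-proportion z-test; a score above roughly 1.96 (in either direction) suggests the gap probably isn't chance.

```ts
// Two-proportion z-test: is variant B's conversion rate really different
// from variant A's, or could the gap be random noise?
function zScore(
  signupsA: number, visitorsA: number,
  signupsB: number, visitorsB: number,
): number {
  const pA = signupsA / visitorsA;
  const pB = signupsB / visitorsB;
  const pooled = (signupsA + signupsB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// Example: 40 signups from 1000 visitors vs 58 from 1000.
console.log(zScore(40, 1000, 58, 1000).toFixed(2)); // 1.86: suggestive, not conclusive
```

If your numbers are small (a handful of signups either way), keep the test running longer before drawing any conclusions.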
So, caveats out of the way, go have some creative fun with your website. Be sure to let us know what you discover in the comments!