I can’t believe it’s been a full week already since Which Test Won held their second annual The Live Event, a two-day conference focused mainly on A/B testing, but also on other ways to improve conversions on your website. For the second year in a row, it was a show full of great information.
Like last year, there were several true industry experts speaking there, such as Peep Laja, Matt Gershoff, and Jeanne Jennings, among others. And even more so than last year, the conference truly focused on those of you down in the trenches doing the actual conversion work. I’d estimate that 90% of the presentations were from people at private companies rather than from agencies. Steve Rude, of Thomson Reuters, Dominique Charrette, of Quicken Loans, and Melissa Burdon, of Extra Space Storage, had especially good talks, among a great lineup of speakers.
Anne Holland, Justin Rondeau, and the rest of the team at Which Test Won did a fabulous job. They have another run of this show in London later this month. If you can’t make it to that, I’d recommend you get on the list for information for next year’s show. But in the meantime, let me list out a few of the key takeaways I came away with.
A/B Testing Philosophies – Live Them!
In order to get the most out of split testing for your websites, it’s important to develop a testing mindset, which includes following certain procedures consistently and communicating results clearly. There was a lot of talk around this concept at the conference.
Develop a Complete Testing Plan
This was by far the most common theme of the conference. The details of an effective testing plan are beyond the scope of this post, but having a standardized plan is critical: one that evaluates data to generate good testing hypotheses, runs and implements tests reliably, evaluates results completely (including data external to the testing tool), and communicates findings to the entire organization. If you don’t have a documented testing plan, do a little research and put one together.
Schedule Regular Testing, No Matter What
We all face roadblocks to running tests on our sites. You might be waiting on a new page design that is delayed. It could be that there are political issues stopping testing. Maybe you don’t have any good hypotheses to test. None of that matters. You should be testing at all times. If for no other reason, conduct new tests to learn your testing tool and analytics tools better. The more you know your tools inside and out, the better prepared you are to give good answers when someone asks you to run a test.
Segment, Segment, Segment
I’ve preached about the importance of segmentation regarding analytics on a number of occasions. You need to use the same mindset when it comes to designing and running A/B tests, and when looking at their results. You might find that one version narrowly beats another version in total, but that when looking more closely it slaughters the control for search engine traffic or for mobile traffic. Learning such things can allow you to turn around and refine a test to make big gains.
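To make that concrete, here is a small sketch of segmented test analysis. The visitor rows, segment names, and variants below are all made up for illustration; in practice this data would come from your testing tool or analytics export.

```python
from collections import defaultdict

# Hypothetical visitor-level test results; in practice these rows would
# come from your testing tool's export or your analytics platform.
results = [
    {"variant": "A", "segment": "search", "converted": True},
    {"variant": "A", "segment": "search", "converted": False},
    {"variant": "B", "segment": "search", "converted": True},
    {"variant": "B", "segment": "search", "converted": True},
    {"variant": "A", "segment": "mobile", "converted": False},
    {"variant": "B", "segment": "mobile", "converted": True},
]

def conversion_rates_by_segment(rows):
    """Return {(segment, variant): conversion rate} for a list of rows."""
    counts = defaultdict(lambda: [0, 0])  # (segment, variant) -> [conversions, visitors]
    for row in rows:
        key = (row["segment"], row["variant"])
        counts[key][0] += 1 if row["converted"] else 0
        counts[key][1] += 1
    return {key: conv / total for key, (conv, total) in counts.items()}

rates = conversion_rates_by_segment(results)
for (segment, variant), rate in sorted(rates.items()):
    print(f"{segment:>7} / {variant}: {rate:.0%}")
```

A version that looks like a narrow overall winner can show a very different story once the rates are broken out per segment like this.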
Develop A Test Results Template And Communicate Your Test Results
Let’s face it, A/B and multivariate testing is still relatively new as a marketing strategy. A lot of people don’t yet realize the value that testing brings. So for every test you run, communicate the results, and do it in a consistent fashion with an easy-to-read Testing Results Report. Even tests that “fail” teach you something. For one thing, you can estimate how much revenue you preserved by not making that change. Communicate all test results to anyone who will listen, and sell the merits of testing.
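If it helps to see one possible shape for such a report, here is a minimal sketch. The field names and the filled-in values are purely illustrative assumptions, not any standard format; adapt them to what your organization needs to see.

```python
# Hypothetical, minimal Testing Results Report template; the fields and
# values are illustrative, not a standard. Adapt to your organization.
REPORT_TEMPLATE = """\
Test: {name}
Hypothesis: {hypothesis}
Dates run: {start} to {end}
Winner: {winner}
Observed lift: {lift}
What we learned: {learning}
"""

report = REPORT_TEMPLATE.format(
    name="Homepage hero image",
    hypothesis="A product photo will outperform the stock lifestyle image",
    start="2014-03-01",
    end="2014-03-14",
    winner="Variant B (product photo)",
    lift="+4.2% add-to-cart rate",
    learning="Concrete product imagery resonates with our visitors",
)
print(report)
```

The point is less the exact fields than that every test, winner or “loser”, gets written up the same way.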
Quick-Hit Tests To Consider
I know what you’re thinking. All that philosophical stuff above is great, but was there anything we can try right now for an immediate benefit? Well, below are a few things you might consider.
Set Up A/A Tests For Each of Your Web Sites
This is something we do for each client prior to running an actual test. We set up a test with two identical versions and test them against each other. What’s the point? Well, in theory, neither version should provide a lift over the other. And if you ran the test forever, that would probably be the case. But just due to general randomness, one version will outperform the other slightly. (If one wins significantly, then you know you probably set up the test wrong; good thing you ran the A/A test to find that out!)
So if the “winning” version provides an apparent 2% lift, that is what is known as the variance of your page or site. Make note of it. Then, when you run your actual A/B test with a control and a test version, if the test version provides a 9% lift, take that 2% variance into account. The test version really provides a 7% lift.
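The arithmetic above is simple enough to sketch. The conversion rates below are hypothetical numbers chosen only to reproduce the 2% and 9% figures from the example:

```python
def observed_lift(control_rate, variant_rate):
    """Relative lift of the variant over the control, e.g. 0.09 for 9%."""
    return (variant_rate - control_rate) / control_rate

# Hypothetical numbers matching the example: the A/A test showed a 2%
# apparent lift (the page's baseline noise), and the real A/B test
# shows a 9% lift for the test version.
aa_variance = 0.02  # apparent "lift" measured in the A/A test
ab_lift = observed_lift(control_rate=0.100, variant_rate=0.109)  # ~9%

adjusted_lift = ab_lift - aa_variance
print(f"raw lift: {ab_lift:.1%}, adjusted for baseline variance: {adjusted_lift:.1%}")
```

This is a rough sanity-check adjustment, not a substitute for a proper significance calculation in your testing tool.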
Test Email Subject Lines That Tease…
Nick Ellinger of Mothers Against Drunk Driving brought this up. They found that, almost without exception, email subject lines that left the reader hanging a little were more effective in both opens and click-throughs. For example, an email subject such as “Last Chance To…” outperformed something like “Deadline Tomorrow For Our Campaign Matching Fundraiser”. A subject line that withholds some of the information can entice the reader to become engaged.
As always, don’t just accept this as “the way to go” in creating email subject lines. But if you think it might be an effective technique for you, run a few tests to see!
Test Personalizations In Your Marketing Emails
If your opt-in database includes first names of your recipients, this can be a good thing to test. Will your emails get higher open and click-through rates if you include a personalized salutation in them, such as “Dear Tom:”? Consider testing this to find out. This was the first test that email marketing expert Jeanne Jennings of AlchemyWorks suggested.
Of course there are some situations where you may not be able to do this. If your emails need to use a more formal salutation, such as “Dear Dr. Jones”, you will have to have title information (Dr., Ms., Mr., etc.) or possibly be able to determine gender based on first name.
Review Any Contact Us Databases or Chat Transcript Archives For Insight
Andy Edmonds of eBay made a return visit to TLE this year, and while last year’s talk was filled with some heavy math crunching, his presentation this year was much more human. It centered on all of the useful information you can gather to help build test hypotheses by reading through the Contact Us messages your company has received and any Online Chat transcripts it has archived.
Reading through those messages can shed a great deal of light on what your users are trying to do, how they are trying to do it, and in what areas your site is failing them. It can be almost like a free version of an expensive Usability Study. Look through your databases. Read customers’ stories in their own words. If you’re strong with Regular Expressions, you can use them to start to locate communications with similar content.
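As a small, hypothetical sketch of that idea, the snippet below uses Python regular expressions to bucket made-up support messages into themes. The messages, patterns, and theme names are all assumptions for illustration, not anything from the talk:

```python
import re

# Hypothetical support messages; in practice, pull these from your
# Contact Us database or chat transcript archive.
messages = [
    "I can't find the checkout button on the cart page.",
    "Where is the coupon code box? I couldn't find it.",
    "How do I reset my password?",
    "Password reset link never arrived.",
]

# Hypothetical theme patterns worth turning into test hypotheses.
themes = {
    "checkout/coupon confusion": re.compile(r"checkout|coupon|cart", re.IGNORECASE),
    "password problems": re.compile(r"password", re.IGNORECASE),
}

theme_counts = {}
for theme, pattern in themes.items():
    matching = [m for m in messages if pattern.search(m)]
    theme_counts[theme] = len(matching)
    print(f"{theme}: {len(matching)} message(s)")
```

A theme that keeps piling up messages is a strong candidate for a test hypothesis on the page involved.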
I promise you this post is not a paid endorsement from Which Test Won! But I’d be remiss if I did not include two more takeaways before wrapping up.
Sign Up For Which Test Won Access
Their database of nearly 500 case studies of past tests is an invaluable resource for testing ideas. Many people I spoke with during the conference mentioned the usefulness of this service. It is inexpensive, and you can sign up for it here.
Download Which Test Won’s State Of Online Testing Report
This is a free online report of the results of an in-depth study conducted in late 2013. It is full of statistics and charts that can not only give you more ideas on testing, but also be used to help sell the benefits of testing to your boss or to your customers.
So that’s a wrap from this year’s Live Event in Austin, Texas. Try out some of the suggestions listed above and leave a comment with how they work out for you.