What Could a 90% Increase in Clickthroughs Do for Your ROI? Find Out, Start Testing!
In a change of pace from our more technically focused blog posts, we'd like to go over something we've been looking into a great deal ourselves.
Don’t worry, we don’t need you to sharpen your favorite number two pencil or bring an answer booklet. The type of testing we’re referring to is actually a pretty savvy method to increase your click-through rates, subscriptions and sales in future email marketing campaigns.
While there are many types of tests, the one we'd like to review today is the A/B test. Most people who aren't in marketing are probably (blissfully) unaware that marketing is as much about psychology and science as it is about creativity.
The A/B test speaks to this scientific nature by enabling you to determine which layouts, buttons, images, videos, or other aspects of your marketing communications help achieve the specific goals of the message.
A simple A/B test involves deploying two similar copies of an email, landing page, or form that differ in some way, whether minor or drastic. This lets you better understand what creates a more favorable response from your target audience.
Take this test from visualwebsiteoptimizer.com, for example:
By changing only the headline on their page from, “Businesses grow faster online” to “Create a webpage for your business”, CityCliq increased their conversion rate by 90%.
While the tester admitted these findings weren't exactly surprising, they did reveal that prospects were already aware of the benefits of being online and were looking for an easy way to establish an online presence. With this knowledge, future campaigns can focus on communicating ease of use and how CityCliq helps businesses get online, instead of the already-known benefits of being there.
What this also shows is that an A/B test doesn't need to be complicated or dramatic to deliver both performance gains and customer insight. In fact, it's probably better to keep it simple so you can isolate which variable is responsible for the positive or negative change. Once you've gotten more comfortable with the testing process, go big! Try a completely different design and see what kinds of results you get from the change.
Here are some of the things you’d need to run your own A/B test:
- A communications piece (email or landing page)
- A goal for your communications piece (clicks on a 'read more' link, video views, email sign-ups)
- A method to track your goal that enables you to distinguish the results between your two communications pieces
- A chosen variable to test (a different headline, button, color, image, etc.)
If you're testing an email, for instance, send the control (your standard message) to a randomly selected half of your list and the test version to the other half. Then see which email performs better with respect to the specific goal you set for that piece.
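To make the split-and-compare step concrete, here's a minimal sketch in Python of how you might randomly divide a subscriber list into control and test groups and compare click-through rates. The function names, subscriber addresses, and click counts are all hypothetical, purely for illustration; any real email platform would handle the sending and tracking for you.

```python
import random

def split_ab(recipients, seed=42):
    """Randomly split a recipient list into control (A) and test (B) halves."""
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)  # fixed seed so the split is repeatable
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

def click_through_rate(clicks, sends):
    """Clicks divided by messages sent, as a fraction."""
    return clicks / sends if sends else 0.0

# Hypothetical example: 1,000 subscribers, with made-up click counts.
subscribers = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b = split_ab(subscribers)

ctr_a = click_through_rate(40, len(group_a))  # control version: 40 clicks
ctr_b = click_through_rate(76, len(group_b))  # test version: 76 clicks
winner = "B" if ctr_b > ctr_a else "A"
```

The random shuffle matters: sending to the first half of an alphabetized or signup-ordered list could bias one group toward older or newer subscribers and muddy your results.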
Another thing for you to consider when performing A/B testing is the importance of consistent presentation between your two communications.
If either the test message or the control message doesn't render properly, it can skew the results and give you bad data on user engagement. So make sure your messages are as close to identical as possible, aside from the variable you're testing, of course.
Let us know how your testing goes, or clue us in on the results from a previous test.
We would absolutely love to do a case study with one of our partners on the value of consistent rendering and proper testing!