“One problem we have in marketing is that adequacy is the enemy of excellence. In many cases, we’re losing the most money by using campaigns that are working, but are not working well enough.” – Dr. Flint McGlaughlin, CEO of MECLABS

In a 2009 New York Times profile, Marissa Mayer—Google’s vice president of search products and user experience—recounted discovering that Google users were more likely to click on a toolbar rendered in a greener shade than the current blue. The team liked the existing color, so Mayer initially split the difference and made the toolbar a hue exactly between the two. Unsettled by her gut decision, she then tested 41 gradations between the two blues to see which got the most clicks.

41 Shades of Blue

Early and often

You’ve probably heard the somewhat cynical remark: vote early and often. At Masterworks we like to say: test early and often. As soon as you make something new for an email or landing page, start testing it, and once a test has a winner, set up another test to improve on the winning version.

What you should be testing

If testing 41 shades of blue seems like an onerous task, consider that Google’s bottom line depends on people clicking text ads, so it’s in their best interest to optimize for clicks.

You should test the user actions that most directly affect your bottom line. That might be a headline, a call-to-action button, or the way you sign off your emails.

How to run an A/B test

An A/B test is easy to implement. Here is an example in six steps:

  1. Decide the most important thing to test—what optimization will have the biggest impact on your bottom line?
  2. Write a measurable test question—it should always be a sentence that starts with “which,” e.g. “Which button color will get the most donations?”
  3. Create treatments—the variations you will show users, e.g. a green donate button (treatment A) and a blue one (treatment B)
  4. Get a lot of observations of your treatments—statistical validity is its own article; for now, just get as big a sample size as you can and show one half of your audience treatment A and the other half treatment B.
  5. Determine the impact of your test—did one treatment get more donations? Why?
  6. Repeat with a different treatment—e.g. once you have a winning button color, test the button’s wording.
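Steps 4 and 5 above can be sketched in a few lines of Python. Assuming each treatment has already been shown to half your audience, a two-proportion z-test is one standard way to judge whether one treatment really got more donations or whether the difference is just noise. All counts below are invented for illustration.

```python
from math import sqrt

def conversion_rate(conversions, observations):
    """Fraction of people who took the desired action (e.g. donated)."""
    return conversions / observations

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how surprising is the gap between two rates?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under "no difference"
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical results: treatment A (green button) vs. treatment B (blue button)
rate_a = conversion_rate(120, 2000)   # 6.0% of group A donated
rate_b = conversion_rate(90, 2000)    # 4.5% of group B donated
z = z_score(120, 2000, 90, 2000)
```

As a rough rule of thumb, a z-score with absolute value above about 1.96 corresponds to 95% confidence that the difference between treatments is real rather than chance.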

Testing resources

There are a lot of resources out there to help you easily create A/B tests.

  • Testing: An easy way to test emails is to split your list using tools from your email service provider. A tool we recommend for landing pages is Optimizely. Its simple interface makes it incredibly easy for anyone to run an A/B test, even with no knowledge of HTML. Other good tools include Visual Website Optimizer and Google Website Optimizer.
  • Analytics: Make sure you’re using a good analytics tool—like Google Analytics—so you can track your tests accurately.
  • Inspiration: If you need some ideas for what to test, follow these testing sites: Which Test Won, ABtests, Visual Website Optimizer’s Split Testing blog.
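If your email service provider doesn’t offer a built-in split, the shuffle-and-halve split mentioned in the Testing bullet is easy to sketch yourself. The addresses below are made up; a fixed seed keeps the split reproducible between runs.

```python
import random

def split_list(subscribers, seed=42):
    """Shuffle a copy of the email list, then cut it into two halves:
    one half receives treatment A, the other treatment B."""
    rng = random.Random(seed)      # fixed seed -> same split every run
    shuffled = subscribers[:]      # copy so the original list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

group_a, group_b = split_list([
    "ann@example.org", "bob@example.org",
    "cat@example.org", "dan@example.org",
])
```

Every subscriber lands in exactly one group, so the two treatments never overlap.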


Mark Neigh


Mark is a curious observer, turning misfit insights into innovative ideas. He draws from his experience as a Digital Strategist for Fortune 500 companies (LEGO, Nintendo) and forward-thinking non-profits (Feeding America, Prison Fellowship) to expertly use digital media to further our clients’ missions to change the world.

