Neglecting to Split-test Your Ads

When you compare two very different ads head to head, one of them will almost always be better than the other — more compelling, more attractive, or more in tune with the innermost desires of your market. The problem is that you don't know in advance which one it will be. Even the world's best marketers are wrong more often than they're right; the chances of their being right the first time, without any testing, are laughably small. If you run only a single ad, the probability that it's the best of all the possible ads in the universe is smaller still.

When you've run the test long enough to have statistically significant data, you gracefully retire (or unceremoniously fire, whichever you prefer) the losing ad and put up another challenger. You continue the process, directing the survival of the fittest ad until you find the one unbeatable control that maximizes your business goals.
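To make "statistically significant" concrete, here is a minimal sketch of the kind of check a split-tester might run: a two-proportion z-test comparing the click-through rates of two ads. The function name and the example numbers are invented for illustration; Google AdWords reports the raw clicks and impressions, but the significance math is up to you (or a testing tool).

```python
import math

def ctr_z_score(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test: how confidently can we say ad A's
    click-through rate differs from ad B's?"""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    # Pooled click-through rate under the null hypothesis
    # that the two ads perform identically.
    p = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(p * (1 - p) * (1 / imps_a + 1 / imps_b))
    return (p_a - p_b) / se

# Hypothetical results: ad A earned 90 clicks in 3,000 impressions,
# ad B earned 50 clicks in 3,000 impressions.
z = ctr_z_score(90, 3000, 50, 3000)
# |z| > 1.96 corresponds to significance at the 95% confidence level.
print(round(z, 2), abs(z) > 1.96)  # → 3.42 True
```

With |z| above 1.96, you can retire ad B with roughly 95% confidence that its weaker showing wasn't just luck; a smaller |z| means keep both ads running and collect more data.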

In the old pre-Internet days, split-testing was complicated and expensive, a high-level business process reserved for huge companies with giant mainframe computers and millions of dollars on the line. Now, Google AdWords makes split-testing as easy as sending e-mail; when we see advertisers neglecting this fundamental improvement strategy, we feel like the mom telling her kid to eat his peas because there are children starving somewhere else in the world. We want to shake them and shout, “Don't you realize how lucky you are to be able to split-test so easily ...”
