There’s an old-fashioned marketing tool that’s become even more powerful in our age of digital marketing. It’s called split testing, and it can help you refine your marketing efforts so they become far more effective.

Some of today’s companies are unfamiliar with split testing (also referred to as A/B split testing) or assume that since it’s a time-honored tool, it can’t possibly have a place among today’s fast-changing apps and options. While the approach itself is far older than most marketing decision-makers, it has become even more valuable with the onset of the digital age.

Split testing is rooted in direct marketing. Before anyone had heard of digital marketing, many large companies invested a significant share of their marketing budgets in direct mail. If a company was planning to send a major mailing to a half-million prospects and wanted to make sure its message and offer would connect with recipients, it might conduct split tests on smaller segments of the list, varying any number of factors.

For example, they might want to test whether the correct price point is $89 or $99. To conduct a split test, they would pull two random groups of 10,000 names from the larger list and mail each group the proposed package. One group would receive it with an $89 price and the other would get the $99 price. When the responses came in, they could determine which price produced more business … or whether price appeared to be a factor at all.
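The logic behind that price test can be sketched in a few lines of code. This is a minimal illustration, not any particular vendor's tool, and the response counts are hypothetical; it uses a standard two-proportion z-test to judge whether the difference between the two groups is likely real or just noise.

```python
import math

def compare_variants(responses_a, size_a, responses_b, size_b):
    """Compare response rates from two split-test groups using a
    two-proportion z-test. Returns each group's rate and the z
    statistic; |z| greater than about 1.96 suggests the difference
    is statistically significant at the 5% level."""
    rate_a = responses_a / size_a
    rate_b = responses_b / size_b
    # Pool the two groups to estimate the overall response rate.
    pooled = (responses_a + responses_b) / (size_a + size_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / size_a + 1 / size_b))
    z = (rate_a - rate_b) / std_err if std_err else 0.0
    return rate_a, rate_b, z

# Hypothetical results: 240 orders at $89, 180 orders at $99.
rate_89, rate_99, z = compare_variants(240, 10_000, 180, 10_000)
# Here |z| > 1.96, so the $10 price difference looks significant.
```

With those (made-up) numbers, projecting the winning 2.4% response rate across the full 500,000-name list would suggest roughly 12,000 responses, which is exactly the kind of forecast the original direct mailers were after.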

They would run similar tests with different envelope designs, headlines, offer language, and other variables. Once all the tests were complete, they would have a good sense of what combination of factors would produce the highest response and might even be able to project the response across the entire mailing to 500,000 prospects.

Companies that conduct large direct mail efforts continue to make use of this strategy, but it isn’t limited to massive mailings. You can use similar approaches with your blogs, social media efforts, and email marketing.

Suppose you’re planning a large email campaign to your prospect list. One of the most critical factors in whether that effort succeeds is the subject line that appears in the recipient’s email client. So it might make sense to split test subject lines to see which one gets opened the most. We can accomplish much the same for our clients by using software that evaluates and compares subject lines based on what researchers have learned. When drafting a marketing email for a client, we can enter a long list of variations and let the software identify which of them is likely to produce the highest open rate.

Doing the same with all the other elements of your marketing efforts is like peeling an onion, revealing new insight with each layer. While working on an email campaign to CEOs, I tested a number of familiar and unfamiliar phrases to see which would be most likely to catch the recipients’ eyes. You might assume that phrases like “increasing revenue” would be the winners, but I discovered the less common phrase “your company’s position” (as in a position in the stock market) produced higher responses. The software told me it was a phrase this specific audience would recognize and be more likely to respond to.

Some blogging software has similar features built in. One example is a tool that analyzes how well search engines will respond to specific headlines. You might be surprised at how changing just one word or shifting the structure of a blog post’s headline can affect its performance. What you thought would be a dazzling headline receives a “meh” from the software, but change a word or two, and the score jumps way up. Unfortunately, far too few companies take the time to learn about or employ features like these.

A similar type of testing involves the readability of text. When you write something in Microsoft Word and some blogging platforms, you can measure what’s known as the Flesch reading ease score. The score examines your word choices and sentence structure, then rates how easily average readers will understand your writing. A score in the 30s means most people will find the text difficult to understand, while a score in the 90s suggests that even 11-year-olds will grasp what you’re trying to say. Here again, making minor adjustments to your blog posts and other copy and testing the results will help you craft words that best convey your message.
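The Flesch reading ease score isn’t a black box; it has a published formula: 206.835 − 1.015 × (average words per sentence) − 84.6 × (average syllables per word). Here is a rough sketch in Python. The syllable counter is a naive vowel-group heuristic of my own for illustration; Word and other real tools use more sophisticated methods, so their scores will differ somewhat.

```python
import re

def count_syllables(word):
    """Naive syllable estimate: count groups of consecutive vowels,
    dropping a likely-silent trailing 'e'. Good enough for a demo."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and not word.endswith(("le", "ee")) and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text):
    """Flesch reading ease: higher scores mean easier reading."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Run it on a short, plain sentence and then on a sentence full of long words, and you can watch the score drop, which is the whole point: shorter sentences and simpler words push the number up.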

It’s up to you: you can just do what you think is going to work best, or you (and your marketing partner) can take a little extra time and perform predictive tests that will give you a high level of confidence that you’re going to succeed. I know which I would choose.

Deborah Daily is co-owner of Buckaroo Marketing | New Media.

Published: July 7, 2022

Website Link: Inside Indiana Business – 07-07-2022
