Author: Maggie Stack, Account Director
At least once a week, one of my kids, while doing homework, will ask, “When will I ever use this in real life?” I like to point out how often I use my algebra skills, but I never thought I would use science in my marketing career. As my colleague Alan Sherman mentioned in a previous blog, we use direct mail testing to determine the best direct marketing strategy. This is where science comes in. By following the steps of the scientific method with a continuous improvement mindset, we aim to exceed our clients’ marketing goals.
When thinking about our clients’ direct marketing strategy, our question is always the same: how can we improve results? The exact Key Performance Indicator (KPI) we are trying to improve varies by client, but improving it always means a better Return on Marketing Investment (ROMI).
We start by reviewing current marketing efforts. Who is the target audience? What motivates them to respond? What tactics are currently being used? How do those tactics work together?
Once our research is complete, we make recommendations for the elements or options we believe will improve results. Sometimes this is the choice to reduce the cost of a campaign while maintaining response, and sometimes it is a higher-cost option that will improve response. This could be a new data source, a new offer positioning, a new direct mail format, or the addition of a complementary digital tactic. In direct marketing, the options are truly endless. Once we decide on what will be tested, we can begin the experiment!
The two most commonly used experiments in direct mail testing are the split test and the multivariate test. Split tests, or A/B tests, compare two versions of the same package that differ only in the single element being tested. Multivariate tests vary multiple elements at the same time. The right approach depends on several factors: budget, available quantity, the quantity needed for results to be statistically significant, and the number of elements we want to test. Once the experiment is in the hands of the prospects, we wait for results.
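To make the “quantity needed to be statistically significant” factor concrete, here is a rough sketch (standard-library Python only, not a tool from our actual workflow) of the classic two-proportion sample-size formula for a split test. The 1% baseline response rate and 25% target lift are hypothetical figures, not numbers from any client campaign.

```python
import math
from statistics import NormalDist

def split_test_sample_size(baseline_rate, lift, alpha=0.05, power=0.80):
    """Pieces to mail per panel (A and B) to detect a relative `lift`
    over `baseline_rate` with a two-sided test at the given alpha and power."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)           # power requirement
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2
    return math.ceil(n)

# Hypothetical example: detect a 25% lift over a 1% baseline response rate.
per_panel = split_test_sample_size(0.01, 0.25)
print(per_panel)  # roughly 28,000 pieces per panel
```

Note how the required quantity grows as the expected lift shrinks: small improvements demand large mail volumes, which is exactly why budget and available quantity constrain which test design is feasible.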
Depending on what we are measuring, results could take months to gather, and there must be enough responders for us to be confident in the findings. Analysis comes in the form of charts and graphs. Our goal is always to improve results, but sometimes we learn what does not work. As my high school science teacher would say after a failed experiment, “if you learned something, the experiment wasn’t wasted.” In direct marketing, the win in a losing test is that it leads to better, more refined hypotheses.
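When results do arrive, deciding whether a lift is real or just noise comes down to a standard two-proportion z-test. A minimal sketch, again in standard-library Python, with made-up mail quantities and response counts:

```python
import math
from statistics import NormalDist

def two_proportion_z_test(resp_a, n_a, resp_b, n_b):
    """Two-sided z-test: is panel B's response rate different from panel A's?
    Returns (z statistic, p-value)."""
    p_a, p_b = resp_a / n_a, resp_b / n_b
    p_pool = (resp_a + resp_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical split test: 50,000 pieces per panel; the control pulled
# 500 responses (1.0%) and the test package pulled 600 (1.2%).
z, p = two_proportion_z_test(500, 50_000, 600, 50_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented numbers the p-value falls well below 0.05, so the lift would be credible; the same 0.2-point difference on a few thousand pieces would not be, which is why waiting for enough responders matters.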
While there is a conclusion to every individual experiment, direct mail testing should never end. We believe marketers should always be striving to improve their data, improve their messaging, and improve the tactics they use. And we love partnering with those who feel the same. Looking to apply a little bit of the scientific method to your direct marketing strategy? We are here to help.
Bio: Maggie Stack is an Account Director with over 20 years of experience in marketing services and direct mail production. When she isn’t discussing data and creative with her clients, you can find her and her husband cheering on their children in hockey, baseball, and dance.