The growth of Data Science techniques over the last few years has generated vast interest in using analytics to optimise engagement with email campaigns. Whether a company wishes to compare the performance of two email templates, compare several templates at once, or understand the association between multiple characteristics of an email and a single metric, Predictive Analytics techniques can provide the answers it needs.
Below we outline three techniques you can implement, from ‘getting started’ approaches suited to lists of around 500 subscribers through to highly advanced models geared towards enterprise users.
For these techniques to make sense, it is important to have a basic understanding of statistical significance and of the list size your tests will need to be useful. Put simply, the larger the sample size, the more likely your results are to be statistically significant (i.e. meaningful). However, the bigger the impact you expect a change to have, the smaller the sample you need for the results to be deemed “statistically significant”.
As an example, we would expect changing an email subject line from “Acme March Newsletter” to “Your 75% Off Acme Today Only” to have a large impact, whereas changing it to “Acme March News” should have far less. As such, the first test would require a smaller sample size than the second to reach statistical significance.
There are many tools and calculators online, and most good newsletter platforms with A/B testing will have inbuilt functionality to assess whether your test is statistically significant before you decide to hit send.
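To make the sample-size intuition concrete, here is a minimal sketch of the arithmetic those calculators perform, using the standard two-proportion sample-size formula with hardcoded z-values for 95% confidence and 80% power. The open rates are hypothetical, chosen to mirror the subject-line example above.

```python
import math

def required_sample_size(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Approximate subscribers needed per group to detect a change in
    open rate from p1 to p2 at 95% confidence and 80% power."""
    effect = abs(p1 - p2)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil(((z_alpha + z_beta) ** 2) * variance / effect ** 2)

# A bold subject-line change: open rate expected to jump from 20% to 30%
print(required_sample_size(0.20, 0.30))  # 291 per group

# A subtle change: 20% to 22% needs a far larger list
print(required_sample_size(0.20, 0.22))  # 6500 per group
```

Note how a tenfold-smaller expected effect pushes the required list from a few hundred subscribers per group into the thousands, which is exactly why small lists should test bold changes first.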
To begin, we will explore the most basic example of Predictive Analytics: A/B Testing. This simply involves showing one version of an email template to one group of users and a second version to a different group, then comparing the performance of the two templates. Performance can be measured by anything the business is interested in: Open Rates, Click-Through Rates (CTR), Engagement, Conversions etc.
The smaller the list, the more difficult it is to get statistically significant results, so test for major differences and look to Open Rates as your initial metric. As your list grows, start to look at Click-Through Rates and more subtle changes.
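Once both sends are out, deciding the winner comes down to a significance test on the two proportions. The sketch below implements a standard two-sided two-proportion z-test with Python's standard library; the open counts are invented for illustration.

```python
import math

def two_proportion_z_test(opens_a, sent_a, opens_b, sent_b):
    """Two-sided z-test: is the difference in open rates between
    template A and template B statistically significant?"""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # p-value from the standard normal CDF, via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical send: template A opened by 120/500, template B by 90/500
z, p = two_proportion_z_test(120, 500, 90, 500)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05, so significant at 95%
```

With these made-up numbers the 24% vs 18% open-rate gap clears the 95% bar; halve the list sizes and the same gap would not, which is the point made above about small lists.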
The next example of Predictive Analytics is an enhanced version of A/B Testing: Multivariate Testing. Multivariate Testing builds multiple email templates from the combinations of several variables. These templates are then sent to separate groups of people, from which the business can see which combination performed best.
With an A/B test you compare “Email A” to “Email B”, but with Multivariate Testing you could compare A to B to C to D… or you could compare the effect of several individual differences between A and B.
Put simply, Multivariate Testing is a bigger version of A/B Testing, running multiple A/B tests simultaneously.
As you can imagine, the more versions you are testing, the larger the sample size required to find a “winner” with statistical significance.
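To see how quickly the variant count grows, here is a sketch of how a multivariate test expands into template combinations. The test dimensions and the observed CTR figures are entirely hypothetical.

```python
from itertools import product

# Hypothetical test dimensions: each combination becomes one template
subject_lines = ["Acme March Newsletter", "75% Off Acme Today Only"]
send_times = ["08:00", "18:00"]
includes_offer = [True, False]

variants = list(product(subject_lines, send_times, includes_offer))
print(len(variants))  # 2 * 2 * 2 = 8 variants, so 8 recipient groups

# After the send, pair each variant with its observed CTR (made-up data)
observed_ctr = dict(zip(variants, [0.021, 0.034, 0.019, 0.028,
                                   0.045, 0.052, 0.031, 0.040]))
winner = max(observed_ctr, key=observed_ctr.get)
print(winner)  # the combination with the highest click-through rate
```

Adding just one more two-value variable doubles the variant count to 16, and each group still needs enough recipients for significance, which is why Multivariate Testing suits larger lists.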
The above examples of Predictive Analytics have many advantages, one of which is their simplicity. However, both techniques have problems.
First, they only test for a difference between templates; they do not assess correlation. They tell you which variant performed better, but not whether the values of those variables correlate with, or are statistically associated with, the metrics the business is interested in.
Secondly, and more seriously, the conclusions of A/B Testing and Multivariate Testing need to be taken with a pinch of salt. Because two different email templates are sent to two different groups of people, any difference in the results might not be caused by the template per se, but by a difference between the two groups. Since we cannot show two different templates to the same group of people, it must always be remembered that it may not be the design of the template driving the change, but rather the personality, motivations, available time and aims of the people who received the emails.
In this situation, another technique has to be introduced: one which assesses the correlation or association between variables. Referred to as the “Granddaddy of Supervised Artificial Intelligence”, a Regression Model can mitigate the problems of A/B tests.
If you wanted to find out whether patterns in your old emails (subject line, time of day, use of pictures, use of offers or amount of text) influenced their CTR, a Regression Model can answer that very question.
A Regression Model fits a line of best fit for several variables (for example, send time, subject line, content etc.) against one individual variable (for example, the CTR). The several variables are referred to as Predictor Variables, while the individual variable they are fitted against is called the Response Variable.
The idea of Regression Modelling is to use the values of the Predictor Variables to predict the value of the single Response Variable.
That’s exactly what Adoreboard did with their latest tool, toneapi, for low-cost airline EasyJet, in order to see whether there was a correlation between emotion and click-through rate. The team analysed 30 transactional email templates for emotional content, then compared the analysis to the CTR each template received from millions of EasyJet customers. Using the emotions as Predictor Variables, an overall correlation was found between emotions and CTR (the Response Variable), allowing the templates to be optimised accordingly. The result was an uplift from a 13.4% click-through rate to a predicted click-through rate of 23.7%.
Regression Models are regularly used to make predictive decisions in other situations. An example was famously reported a few years ago, when US retail giant Target developed a Regression Model that could predict whether a customer was pregnant, based on her purchase history. Here, the Predictor Variables were the purchases the customer made, while the Response Variable was whether or not she was pregnant.
For large players who have a lot to gain (or lose!) from their email communication, Regression Modelling can have an incredible impact, although it is not something that can easily be offered as an ‘out of the box’ solution.
In conclusion, we recommend first getting started with basic A/B testing of a single yet notable variable, such as the subject line. As your list grows, consider speeding up your tests with Multivariate Testing. And once your send volume is at enterprise level and business revenue is significant, consider Regression Modelling, which can offer substantial insights and improvements to your email marketing.
By Expert commentator
This is a post we've invited from a digital marketing specialist who has agreed to share their expertise, opinions and case studies. Their details are given at the end of the article.