Email Optimisation: How a Monkey Performs 46% Better than the Average


The three biggest limiting factors for your email marketing are:

  1. Assumptions – when faced with an either/or situation, you simply go for the option you think is best
  2. Lack of attention to detail – small improvements are discarded as irrelevant or inconsequential
  3. Attrition – serve the same type of email (design or content) again and again, and performance slowly deteriorates over time

All of this can be avoided by using simple testing: either by adapting as you go along, or better still by running in-email A/B optimisation.

What this means is that you send a proportion of your list (normally half) one email and the rest of the list a variation on it, then compare the performance: open rate, click-throughs, orders and any other metric you'd like to measure.
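The split-and-compare mechanics described above can be sketched in a few lines. This is a minimal illustration only: the recipient list, metric names and helper functions are all hypothetical, not any particular email platform's API.

```python
import random

def split_list(recipients, seed=42):
    """Randomly split a recipient list into two halves for an A/B test."""
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)  # fixed seed so the split is reproducible
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

def compare(metrics_a, metrics_b):
    """For each tracked metric, record which variant performed better."""
    return {m: ("A" if metrics_a[m] >= metrics_b[m] else "B") for m in metrics_a}

group_a, group_b = split_list(f"user{i}@example.com" for i in range(1000))

# Example (made-up) results after the send:
winners = compare(
    {"open_rate": 0.21, "click_through": 0.034, "orders": 12},
    {"open_rate": 0.24, "click_through": 0.031, "orders": 15},
)
```

The better-performing variant on the metric you care about then becomes the baseline for the next send.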

Then, based on that performance, you develop your next email to perform at least as well as the better of your two test emailshots, with further optimisation and testing.

So, let’s do our own A, B, C test…

A: Average Biz Ltd

They send an emailshot to a house list 12 times a year, and perhaps review their look and feel every 1-2 years.

For every send, they will experience attrition in performance of around 2% (a conservative estimate) as recipients get more familiar with, and bored by, their emails. At the end of the year their performance will be: 100 * 0.98^12 ≈ 78%

22% down year on year



B: A Monkey

Let’s say you employ a monkey to test for you. They have no marketing knowledge, so they cannot predict whether a change will work better or worse than your current newsletter. All they can do is make small changes to the email, perform A/B testing and optimise the next email based on what the stats tell them.

On average the difference in performance between their two variants is 4%, so the first newsletter is likely to perform no differently than without testing. But from the second newsletter onwards, they can carry forward the better-performing variant, a NET improvement of 2% per send as the learning feeds into each newsletter. At the end of the year, their performance will be: 100 * 1.02^11 ≈ 124%

24% up year on year


C: Email Optimisation Specialists

If you took this issue seriously, you would use an email optimisation specialist. Every time they developed an email and testing campaign for you, they would test subject lines, images, layout, background, button colours, copy and messaging.

Most of the time, their recommendations would deliver more than a 10% performance improvement, although occasionally results may flatline or even dip slightly. It would be reasonable to suggest they could sustain about a 4% upward trend month on month for the first year. At the end of year one, the performance would be: 100 * 1.04^12 ≈ 160%

60% up year on year
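All three scenarios rest on the same piece of compound arithmetic, which can be checked in a few lines (the function name is ours, for illustration):

```python
def year_end_performance(monthly_factor, periods, start=100.0):
    """Compound a per-send performance factor over a year, starting at 100."""
    return start * monthly_factor ** periods

# A: 2% attrition on every one of 12 sends
average = year_end_performance(0.98, 12)   # ~78

# B: +2% net gain, carried forward from the second send onwards (11 periods)
monkey = year_end_performance(1.02, 11)    # ~124

# C: +4% sustained upward trend across all 12 sends
expert = year_end_performance(1.04, 12)    # ~160
```

Note how the same small monthly percentage compounds into a very different year-end result: the gap between doing nothing and a modest 2% gain is already 46 percentage points.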


Learning Points…

So, what can we learn from this simple illustration? Here are our top tips:

  • You cannot stay where you are: if you do nothing, your performance will go backwards
  • Small differences matter: optimisation is NOT about giant leaps, it’s about the small steps you need to take to climb a mountain
  • Testing is certainty: you are the worst judge of what will perform for you, particularly if you’re ‘in house’. Don’t assume. Test and know!
  • Read the stats: you should be VERY bothered about seeing how an emailshot performed. The tracking is there – read it, learn from it.
  • Isolate on-site performance: if your performance is tied to sales or leads, remember that open rates and click-throughs do not earn you money. Track the emails right through; sometimes a more precise message leads to a lower click-through rate but much larger sales!
  • Get the monkey off your back: test, measure, optimise!

Optimising emails isn’t hard: even a monkey can do it, so you should too!