You have probably experienced it already: while creating an email, you decide to use a call-to-action button whose color matches the image. It looks good and you are happy with it.
That is a typical example of something that often happens: we do things because we think they are good. With email, this can have a major effect on the results. For example, your click rate can drop simply because you chose a red call-to-action button instead of a green one.
Which elements can you A/B test in email marketing, and in MailChimp?
You can test many different elements of an email. Not every element will match the exact goal you have in mind, but each of them can be tested. Some examples of elements you can test:
- Subject line
The subject line is one of the most important elements of an email. A good subject line can trigger the customer to open the email. Some examples of what can be tested:
- Personalization in the subject (mentioning the first name of the customer)
- Length of the subject (number of characters)
- Using an emoji or not using it
- Sender name
The ‘from’ name, also known as the sender name, can affect the open rate. For example, you can send the email as if it comes from your customer service: Customer Service {company name}, or from a fictitious person: Anne from {company name}.
- Call-to action button
We all know the button below the intro text, but what color should it be? Do you generate more clicks if it is green? Or does your audience click more often on an orange button?
Many people say: green stands for positive, so that must be the right color. But of course this can differ per audience. Always test this button; its color can have a major effect on the number of clicks.
- Emotional images vs top shots of products
The product images you add to your newsletter can also generate more clicks. In some cases it helps to show an emotional/atmospheric image instead of the well-known top shot (the product on a white background).
- Time of sending out the mail
The moment you send the email is important to test. I do not mean only morning versus evening, but also the day on which you send a promotional email. For example, Saturday is often seen as a bad day to send an email: results are often disappointing, possibly because people are out and spending time with friends.
Do not test everything at once and test multiple times
When you test, change only one element at a time. If you vary both the subject line and the call-to-action button in the same test, you cannot tell which change caused the difference in results.
Also, do not draw conclusions after testing an element only once. Send at least two emails with the same A/B test before drawing conclusions. With more results, you can substantiate your conclusion better and more firmly.
How do you start with A/B testing?
If you are creating an A/B test for the first time, it is good to know which steps to take. Below are some points to determine for yourself in advance.
- Determine the purpose of the A/B test in advance
Do not start an A/B test before you have a goal you want to achieve. For example, do you want to increase the average open rate? First calculate your current average open rate, then see whether the variant in your A/B test improves it.
Another example is to formulate a hypothesis: ‘I expect that if I use the customer's first name in the email subject, the customer will open the newsletter sooner.’ Test this and see if you are right.
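To judge whether the variant really improved the open rate (and the difference is not just chance), a simple two-proportion comparison can be sketched. Note that the subscriber and open counts below are made-up illustration numbers, not real campaign data:

```python
import math

def open_rate(opens, sent):
    """Open rate as a fraction of delivered emails."""
    return opens / sent

def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: is the difference in open rates
    between version A and version B likely to be real?"""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    return (p_b - p_a) / se

# Hypothetical results: A = plain subject, B = first name in the subject
print(f"open rate A: {open_rate(180, 1000):.1%}")   # 18.0%
print(f"open rate B: {open_rate(230, 1000):.1%}")   # 23.0%
z = two_proportion_z(opens_a=180, sent_a=1000, opens_b=230, sent_b=1000)
print(f"z-score: {z:.2f}  (|z| > 1.96 is roughly significant at 95%)")
```

With these made-up numbers the z-score is about 2.77, so the improvement would be unlikely to be pure noise; with smaller lists the same percentage difference can easily be coincidence.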
- Determine the distribution of the A/B test
The obvious choice is to send version A to 50% of the customers and version B to the other 50%.
Sometimes you also see an 80%/20% ratio, but that does not make analyzing the results any easier. That is why the first option, 50%/50%, is the most commonly chosen one.
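MailChimp handles the split for you, but the idea behind it can be sketched as a random shuffle of the subscriber list. The subscriber addresses below are hypothetical, and `ratio_a` is just an illustrative parameter name:

```python
import random

def split_audience(subscribers, ratio_a=0.5, seed=42):
    """Randomly assign subscribers to version A and version B.

    ratio_a: fraction of the audience that receives version A
             (0.5 gives the common 50%/50% split; 0.8 gives 80%/20%).
    """
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # fixed seed: reproducible assignment
    cut = round(len(pool) * ratio_a)
    return pool[:cut], pool[cut:]

# Hypothetical subscriber list
subscribers = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b = split_audience(subscribers)
print(len(group_a), len(group_b))  # 500 500
```

The shuffle matters: if you split an alphabetically sorted list in half without shuffling, the two groups may differ systematically, and the test result is no longer trustworthy.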
- Put all your A/B tests in one document (including results)
To avoid forgetting the results of a previous A/B test, document everything. Include the topics below in your document:
- Idea of the test
- Purpose of the test
- Result of the test
- Conclusion
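One lightweight way to keep such a document is a small CSV log whose columns mirror the four topics above. The file name and the example row here are hypothetical:

```python
import csv
import os

FIELDS = ["idea", "purpose", "result", "conclusion"]

def log_ab_test(path, idea, purpose, result, conclusion):
    """Append one A/B test to a CSV log; write the header on first use."""
    write_header = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow({"idea": idea, "purpose": purpose,
                         "result": result, "conclusion": conclusion})

# Hypothetical entry for the subject-line test described above
log_ab_test(
    "ab_tests.csv",
    idea="First name in the subject line",
    purpose="Increase the average open rate",
    result="A: 18.0% open rate, B: 23.0% open rate",
    conclusion="Personalized subject wins; repeat the test to confirm",
)
```

Because every test lands in the same file, you can look back later and see which conclusions were confirmed by a repeated test and which were one-off results.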