A/B testing (also known as split testing) is a method of comparing two or more versions of something: alternative processes, pieces of communication or different website structures.
It’s an experiment where alternative versions are shown to different, randomly selected customers, and their actions and behaviour in response are measured. This testing can help to confirm the best approach to take when rolling out new activity or making changes to existing communications or processes.
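For the technically minded, the random split itself is simple to implement. Here’s a minimal sketch in Python (the function name and experiment label are invented for illustration): hashing a stable customer ID together with the experiment name gives a repeatable, roughly even split, so each person always sees the same version.

```python
import hashlib

def assign_variant(customer_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a customer to a test variant.

    Hashing a stable ID together with the experiment name gives a
    repeatable, roughly even split: the same customer always lands
    in the same variant, and different experiments split independently.
    """
    digest = hashlib.sha256(f"{experiment}:{customer_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Example: splitting visitors between two versions of a landing page
print(assign_variant("customer-1042", "landing-page-headline"))  # "A" or "B"
```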
Why should we do A/B testing?
A/B testing can measure a whole host of different outcomes, from the number of calls generated by an activity to the time spent on a web page or the value of a sale.
Here are some of our favourite A/B tests that give us insight into what works for the audiences we’re trying to connect with, retain and grow.
Web-page titles: By testing different headers and titles on web pages and blogs, we can develop a better understanding of the types of content and styles of writing that work for an audience.
We’ll measure the number of people who click on the links within the page or article, as well as how long the audience stays on a page.
Ecommerce product listings: When we compare different product placements, product imagery or descriptions, we can ascertain what motivates people to progress through the buying process.
We’ll measure the number of times a product is viewed, which products are added to a basket and how often baskets are abandoned.
Creative approach to mailings: Even in a digital-led era, we know that the full marketing mix will still include traditional media like letters and direct mail for many clients. We often test the style of mailings, looking at traditional letter formats versus alternatives like postcards and more innovative mailing styles.
We’ll measure the calls and clicks generated in response as well as the return rate.
Social media formats: Rich media such as videos and GIFs are known to generate greater engagement, but we like to refine our understanding of different audiences by testing subtle changes such as the duration of a video, choice of filter or time of day.
We’ll measure post engagement, click-through rates and changes to the follower base in terms of volume and demographic profile.
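Whatever the channel, the same question follows every one of these tests: is the difference we measured real, or just noise? As an illustrative sketch (the figures are invented for the example), a standard two-proportion z-test gives a quick read on a click-through comparison in a few lines of Python:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(clicks_a, visitors_a, clicks_b, visitors_b):
    """Two-sided z-test for a difference between two click-through rates."""
    p_a, p_b = clicks_a / visitors_a, clicks_b / visitors_b
    pooled = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Invented figures: 120 clicks from 2,000 visitors vs 160 from 2,000
z, p = two_proportion_z_test(120, 2000, 160, 2000)
print(f"z = {z:.2f}, p = {p:.3f}")  # a p-value below 0.05 suggests a real difference
```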
In the drive for ‘quick wins’ and ‘tactical opportunities’ it can be all too tempting to want these tests completed quickly. Whilst that is sometimes possible (often more so with digital activity), it’s important to allow any test the time to run its course. For example, when testing for retention it might even be necessary to run A/B testing over a 12-month period or more.
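To see why patience matters, consider how quickly the required audience grows as the effect you’re looking for shrinks. The sketch below uses a standard two-proportion power calculation (the baseline and uplift figures are invented): detecting a one-point lift on a 5% conversion rate already needs thousands of visitors per variant, and small audiences simply take time to reach that.

```python
from statistics import NormalDist

def sample_size_per_variant(baseline, lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a given uplift.

    A standard two-proportion power calculation: detect a move from
    `baseline` to `baseline + lift` at significance level `alpha`
    with the given statistical `power`.
    """
    p1, p2 = baseline, baseline + lift
    p_bar = (p1 + p2) / 2
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    n = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
         + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2 / lift ** 2
    return int(n) + 1

# Invented example: spotting a one-point lift on a 5% conversion rate
print(sample_size_per_variant(0.05, 0.01))  # just over 8,000 visitors per variant
```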
More than just answering important one-off questions, A/B testing can be used to drive continuous improvement for any given experience, improving a goal like retention rate over time. It’s worth the time and energy to build a proper testing programme.
If you’d like to talk to us about how to create, run and monitor testing, just get in touch.