We’re happy to announce that we now support A/B testing emails on SendRoll. Here, we answer a few common questions to help marketers get the most out of A/B testing their SendRoll campaigns.

What is A/B testing and why should I run an A/B test on my emails?

In an A/B test, marketers send different versions (A and B) of the same campaign to customers. If you have significant email volume, you can split your audience with an A/B test to determine which marketing messages resonate best with your customers.

How does it work on SendRoll?

When you create multiple campaigns that share the same goal (i.e. Starter, Cart, Loyalty, Manual) and target the same audience, SendRoll randomly selects which campaign to serve to each individual in your audience. With enough volume, this splits the audience into roughly equal-sized groups.
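As a rough sketch of that selection step (the function and campaign names here are hypothetical, not SendRoll’s actual internals), each recipient independently gets one of the matching campaigns:

```python
import random

def select_campaign(matching_campaigns):
    """Serve one of the campaigns that share a goal and target the
    same audience, chosen uniformly at random for this recipient."""
    return random.choice(matching_campaigns)

# Two hypothetical Cart campaigns competing for the same recipient.
winner = select_campaign(["cart_variant_a", "cart_variant_b"])
print(winner)
```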

What can I test?

You can create campaigns that test different subject lines, send delays, email content, or a combination of all three.

Can you give me an example?

Suppose you’ve set up two Starter campaigns with the following subject lines:
Version A: “Check out what’s in store”
Version B: “Missing you”

Both Starter campaigns target non-converters, an audience of around 2,000 people. For each of these potential customers, SendRoll randomly selects whether they’ll see version A or version B, letting you test the creative across the two campaigns.
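To see why an audience of this size splits close to evenly, here is a quick simulation. The 50/50 coin flip stands in for SendRoll’s random selection, and exact counts will vary from run to run:

```python
import random

audience_size = 2000
counts = {"A": 0, "B": 0}
for _ in range(audience_size):
    # Each of the 2,000 recipients is independently assigned A or B.
    counts[random.choice(["A", "B"])] += 1

print(counts)  # e.g. {'A': 1013, 'B': 987}: close to an even split
```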

What are some A/B tests I should run in my email?

We recommend starting small: run a test with just two variations and change only one variable at a time so you can cleanly attribute the result. That said, our system allows you to test more than two variations.

For example, if you wanted to see what type of copy entices customers to open your emails, you would create two campaigns with different subject lines and compare open rates (see the sketch after the table below for one way to judge the results). Here are some more variables you can change, along with the metrics they tend to affect:

Variable                   Metric Potentially Impacted
Subject Line               Open Rate
Delay                      Open Rate
Email Content              Click Rate, Conversion Rate
Product Recommendations    Click Rate, Conversion Rate
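Once both versions have gone out, you need a way to decide whether a difference in open rates is real or just noise. One standard approach (a general statistical technique, not a SendRoll feature) is a two-proportion z-test; here is a sketch with made-up numbers:

```python
from math import sqrt

def two_proportion_z(opens_a, sends_a, opens_b, sends_b):
    """z-score for the difference between two open rates."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Hypothetical results: version A opened 230/1000, version B 180/1000.
z = two_proportion_z(230, 1000, 180, 1000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at the 5% level
```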

What are some questions I could answer with A/B testing?

  • How does the send time (delay) impact open rates?
  • Does asking a question or making a statement in the subject line perform better?
  • Does adding product recommendations improve my performance?
  • How does image size affect my click-through rate?
  • … and more

Product validation from our customers

A/B testing emails had been on our roadmap for a while as a commonly requested feature. Sometimes the strongest validation that a feature is wanted comes from seeing customers work around the product to fit their needs. In this case, we saw some customers set up multiple campaigns targeting the same audience with different content, assuming that SendRoll would pick between them at random.

Until recently, however, our system didn’t select at random when a targeted visitor qualified to receive multiple campaigns. During a recent code cleanup, we dug into the internal logic and made the selection random, unlocking a feature that was almost already there.
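In spirit, the change was as small as it sounds. A hypothetical before/after (illustrative only, not SendRoll’s actual code):

```python
import random

def choose_before(qualifying_campaigns):
    # Old behavior: the first matching campaign always won, so
    # overlapping campaigns never split the audience.
    return qualifying_campaigns[0]

def choose_after(qualifying_campaigns):
    # New behavior: a random pick turns overlapping campaigns
    # into an A/B test.
    return random.choice(qualifying_campaigns)
```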

A new feature surfaced by customers “hacking” SendRoll and shipped by changing one line of code adds value almost immediately. We’re big on releasing quickly and often, delivering small wins sooner and in parallel with our larger product releases. Feel free to reach out with any feedback, and stay tuned as we continually improve the product.

Learn more about how you can A/B test your campaigns with SendRoll here.