Using A/B Testing to Optimize Marketing Emails

SmartPush

When sending marketing emails, different subject lines, content, or sender information can produce different open rates, click-through rates, and order counts.

SmartPush provides an A/B testing feature that allows you to create multiple versions of an email and test them with your audience. By analyzing the results, you can identify the best-performing version and optimize your future marketing campaigns.

This article explains the features, setup process, and best practices for A/B testing.

Note: This feature is only available for Free Trial and Paid stores.

A/B Testing in SmartPush

The A/B testing feature in SmartPush allows you to test different versions of your email content and compare their performance.

You can test the following elements:

  • Email subject
  • Preview text
  • Sender name
  • Sender address
  • Reply-to email address
  • Customer email language
  • Email content

A single A/B testing campaign supports up to 5 test versions (i.e., up to 5 test groups, plus 1 control group).

A/B Testing Terms

  • Test group: During the testing period, emails from each test group are first sent to a portion of recipients to collect performance data.
  • Winning group: The winning group refers to the test group that achieves the highest performance based on the selected test metric.
  • Control group: The portion of recipients held back during the testing period. After the testing period ends, the winning group's content is sent to these recipients.

Limitations

  • A/B testing currently supports email campaigns only.
  • The Free plan does not support A/B testing.

Creating an A/B Testing Campaign

Follow the steps below to create an A/B testing campaign.

Step 1: Create an A/B Test

  1. Go to Campaigns > Campaign List, then click Create campaign.
  2. In the A/B Testing Campaign section, click the A/B Testing Campaign button.

Step 2: Create Test Versions

Each A/B testing campaign must include at least two versions. After completing Version A, click Create new version to create Version B. You can:

  • Copy Version A
  • Or create a new blank version

You can choose to test a single element or a combination of multiple elements.

Step 3: Select the Audience Group

Select the audience group you want to send the campaign to. The dropdown list displays all audience segments created in SmartPush or the SHOPLINE Admin.

Notes:

  • Emails can only be sent to subscribed contacts.
  • The selected group must contain at least 50 valid subscribed contacts. To obtain more reliable test results, we recommend selecting at least 1,000 contacts as your testing sample.
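The audience-size rules above can be expressed as a quick check. This is a minimal sketch; the function name and return labels are illustrative assumptions, and only the two thresholds (50 and 1,000) come from this article:

```python
# Sketch of the audience-size rules described above.
# Assumption: `subscribed_count` is the number of valid subscribed
# contacts in the selected segment; SmartPush performs its own checks.

MIN_CONTACTS = 50            # hard minimum to run an A/B test
RECOMMENDED_CONTACTS = 1000  # recommended sample for reliable results

def check_audience(subscribed_count: int) -> str:
    """Classify an audience segment against the A/B testing limits."""
    if subscribed_count < MIN_CONTACTS:
        return "too small"          # the campaign cannot be sent
    if subscribed_count < RECOMMENDED_CONTACTS:
        return "allowed but small"  # results may be unreliable
    return "recommended size"
```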

Step 4: Configure the A/B Testing Strategy

Allocation Ratio

Set the allocation ratio between the test groups and the control group.

  • Test group ratio: 10% – 100%
  • If multiple test groups exist, each group will have the same allocation ratio.
Note: If the test group ratio is set to 100%, no control group will be created.
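The allocation rules above can be sketched as follows. The equal per-group split and the integer rounding are illustrative assumptions, not SmartPush internals:

```python
# Illustrative split of an audience under the ratio rules above.
# Assumption: every test group receives an equal share of the combined
# test ratio, and the control group receives the remainder.

def split_audience(total: int, test_ratio: float, num_versions: int) -> dict:
    """Return recipient counts per group for a given test-group ratio."""
    if not 0.1 <= test_ratio <= 1.0:
        raise ValueError("test group ratio must be between 10% and 100%")
    test_total = int(total * test_ratio)
    per_group = test_total // num_versions  # same ratio for every test group
    groups = {f"version_{chr(65 + i)}": per_group for i in range(num_versions)}
    groups["control"] = total - per_group * num_versions  # remainder held back
    return groups
```

For example, with 1,000 recipients, a 50% test ratio, and 5 versions, each version goes to 100 recipients and 500 are held back as the control group; at a 100% ratio (with an evenly divisible audience) the control group is empty.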

Testing Metric

You can choose one of the following metrics:

  • Open rate
  • Click-through rate
  • Number of orders

During the testing period, the system will automatically determine the winning group based on the selected metric. See the "A/B Testing Best Practices" section below for recommendations.

Notes:

  • If the results are tied, the system compares them in the following order: Open rate > Click-through rate > Number of orders.
  • If the results are still the same, Version A will be selected as the winner.
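The selection rule above can be sketched as a simple comparison. The data layout and function name are illustrative assumptions; SmartPush computes this internally:

```python
# Sketch of winner selection with the documented tie-break order.
# The primary metric is compared first; ties fall back to
# open rate > click-through rate > orders, and a complete tie falls
# back to the first version (Version A), because max() keeps the
# first maximal item in iteration order.

TIE_BREAK_ORDER = ["open_rate", "click_rate", "orders"]

def pick_winner(versions: dict, primary: str) -> str:
    """versions: name -> {'open_rate': x, 'click_rate': y, 'orders': z}."""
    keys = [primary] + [m for m in TIE_BREAK_ORDER if m != primary]
    return max(versions, key=lambda v: tuple(versions[v][m] for m in keys))
```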

Testing Duration

Because opens and clicks take time to accumulate, we recommend setting the testing duration to at least 24 hours.

The maximum testing duration is 10 days (240 hours).
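These limits can be expressed as a quick validation sketch (the names and return strings are illustrative; only the 24-hour recommendation and 240-hour maximum come from this article):

```python
# Sketch of the testing-duration limits described above, in hours.

MAX_HOURS = 240              # 10-day maximum
RECOMMENDED_MIN_HOURS = 24   # recommended minimum to gather enough data

def duration_advice(hours: int) -> str:
    if hours > MAX_HOURS:
        return "invalid"     # exceeds the 240-hour maximum
    if hours < RECOMMENDED_MIN_HOURS:
        return "allowed, but below the recommended 24 hours"
    return "ok"
```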

Step 5: Configure UTM Tracking (Optional)

You can choose to enable UTM tracking and configure the parameters. For detailed instructions, refer to: "Configuring SmartPush UTM Settings."

Step 6: Set the Sending Time

You can choose one of the following options:

  • Send immediately
  • Schedule sending

After completing the settings, click Create.

A/B Testing Campaign Status

An A/B testing campaign may have the following statuses:

  • Draft: The campaign has been saved but not yet submitted for sending.
  • Pending: The test group is scheduled for delayed sending.
  • Sending: The test group emails are currently being sent.
  • Testing: The test group has been sent, and the system is collecting testing data.

    Note: If you manually select the winning group, the system will immediately send the email to the remaining recipients.
  • Paused: A campaign may be paused in the following situations:

    • Manual pause: The campaign is paused manually in the data report.
    • Automatic pause: If the test group fails to send or only partially succeeds, the system automatically pauses the test.
    Note: The campaign can be resumed before the testing period ends. The testing period is calculated as: test group send time + testing duration.
  • Sent: Sending has finished, but some emails in the test group or control group have not yet been confirmed as successfully sent.
  • Successfully sent: Both the test group and the control group were sent successfully.
  • Failed: Both the test group and the control group failed to send.
  • Partially successful: Some recipients received the email successfully.

A/B Testing Best Practices

To obtain more accurate testing results, we recommend the following practices.

Test One Variable at a Time

If testing open rate, test:

  • Email subject
  • Preview text
  • Sender name
  • Sender address

If testing click-through rate, test:

  • Button position
  • Number of buttons
  • Button size

If testing orders, test:

  • Number of products
  • Types of products

Ensure a Sufficient Sample Size

To reduce testing bias and improve reliability, we recommend using at least 1,000 contacts in your test group.
