What you can test
The platform supports two types of A/B tests per campaign:
Subject line
Test different subject lines to maximize open rates. Vary length, personalization, urgency, questions, or emoji use.
Email content
Test different body layouts, images, CTAs, copy length, or section order to maximize clicks or conversions.
You can test either subject lines or email content — not both in the same campaign. Choose the variable most important to your current campaign goal.
Setting up an A/B test
Open or create a campaign
Go to Marketing > Emails > Campaigns and open a drafted campaign (or click + New to create one). Build the base email design in the Email Builder.
Enable A/B testing
On the left panel of the Email Builder, find the A/B Testing tab and toggle Enable A/B Testing to on.
Choose your test type
Select either Email Subject or Email Content. This determines which element you will create variations for.
Set the test duration
Choose how long the test runs before a winner is selected. Options range from 30 minutes to 24 hours. Set this based on your list size and how quickly your audience typically engages — a 4–8 hour window works well for most campaigns.
Set the test audience size
Use the slider to define what percentage of your total list receives the test variations. The remaining contacts receive the winning version after the test concludes. For example, with a test size of 40% and two variations, each variation goes to 20% of the list, and the remaining 60% is reserved for the winner.
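The platform performs this split automatically; as a rough sketch of the arithmetic only (illustrative, not the platform's actual code), the test slice is divided evenly across variations and everyone else is reserved for the winner:

```python
def allocate_audience(total_contacts: int, test_pct: float, num_variations: int):
    """Split a list into per-variation test groups plus a winner pool.
    Illustrative only: the test slice is divided evenly across variations;
    all remaining contacts receive the winning version."""
    test_size = int(total_contacts * test_pct)
    per_variation = test_size // num_variations
    winner_pool = total_contacts - per_variation * num_variations
    return per_variation, winner_pool

# 10,000 contacts, 40% test size, two variations:
per_variation, winner_pool = allocate_audience(10_000, 0.40, 2)
print(per_variation, winner_pool)  # 2000 per variation, 6000 for the winner
```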
Create your variations
Add up to 6 variations. For subject line tests, enter a different subject for each variation — you can use the AI subject line generator to suggest alternatives. For content tests, each variation uses a separate email design.
Choose the winning metric
Select the metric used to determine the winning version:
- Unique Open Rate — best for subject line tests
- Unique Click Rate — best for content tests
Variation limits and audience allocation
| Setting | Range |
|---|---|
| Maximum variations | 6 |
| Minimum test duration | 30 minutes |
| Maximum test duration | 24 hours |
| Test audience size | 10%–50% of total list (varies by variation count) |
Winning criteria
Unique Open Rate counts one open per contact, regardless of how many times they open the email. Use this metric when your goal is improving subject line performance. Unique Click Rate counts one click per contact. Use this metric when your goal is driving link clicks, CTA engagement, or conversions.
The platform automatically identifies the variation with the highest rate after the test duration and delivers it to the remaining contacts without any manual action needed.
Viewing results
After the test concludes, navigate to Marketing > Emails > Campaigns, find the campaign, click the three-dot menu, and select Statistics. Results show:
- Performance of each variation (open rate or click rate)
- The variation declared the winner
- Delivery metrics for the final winner send
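The unique rates shown in results count each contact at most once. A minimal sketch of how such a rate could be computed (illustrative only, not the platform's implementation):

```python
def unique_rate(events: list[tuple[str, str]], delivered: int) -> float:
    """Compute a unique open or click rate: each contact counts at most
    once, no matter how many events (opens or clicks) they generated."""
    unique_contacts = {contact_id for contact_id, _ in events}
    return len(unique_contacts) / delivered

# Three opens from two contacts out of 100 delivered → 2% unique open rate
opens = [("c1", "open"), ("c1", "open"), ("c2", "open")]
print(unique_rate(opens, 100))  # 0.02
```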
Subject line testing best practices
Test one variable at a time to get clear, actionable results. Here are common subject line dimensions to test:
Length
Compare a short subject (under 40 characters) against a longer, more descriptive one. Short subjects often display better on mobile and load faster in the subject preview, while longer subjects can communicate more context.
Personalization
Test a subject with the recipient’s first name (using the {{contact.first_name}} merge tag) against a generic version. Personalization typically lifts open rates, especially in promotional contexts.
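Merge tags like {{contact.first_name}} are resolved at send time by the platform. A minimal sketch of this kind of substitution, with a hypothetical fallback word for contacts missing the field (illustrative only — the platform handles this for you):

```python
import re

def render_subject(template: str, contact: dict, fallback: str = "there") -> str:
    """Replace {{contact.field}} merge tags with the contact's values,
    falling back to a generic word when a field is missing or empty.
    Illustrative sketch, not the platform's actual rendering logic."""
    def replace(match: re.Match) -> str:
        return contact.get(match.group(1)) or fallback
    return re.sub(r"\{\{contact\.(\w+)\}\}", replace, template)

print(render_subject("{{contact.first_name}}, your offer ends tonight",
                     {"first_name": "Ada"}))
# Ada, your offer ends tonight
```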
Urgency and specificity
Compare a deadline-driven subject (“Offer ends tonight”) against an informational one (“Your April summary is here”). Urgency works well for time-sensitive campaigns.
Question vs. statement
Test a subject posed as a question against a direct statement. Questions create curiosity gaps that can lift open rates for educational or nurture content.
Emoji use
Test a subject with a relevant emoji against the same subject without one. Emoji can increase visual standout in a crowded inbox, but performance varies by audience.
Content testing best practices
For content A/B tests, isolate a single change between variations to understand what drives the performance difference:
- CTA text: “Get started” vs. “Claim your offer” vs. “See the demo”
- Button color: High-contrast vs. brand-aligned
- Image vs. no image: Text-only email vs. image-led layout
- Long copy vs. short copy: Detailed explanation vs. brief teaser with a link
- Content order: Lead with the offer vs. lead with the benefit story
Frequently asked questions
What happens if neither variation clearly wins?
If results are inconclusive (the variations perform nearly identically), the platform still selects the variation with the highest metric value at the end of the test duration and sends it to the remaining contacts. Plan a follow-up test with a more distinct difference between variations to get a clearer result.
Can I run an A/B test on a scheduled or batch campaign?
Yes. A/B tests work with all delivery methods including immediate sends, scheduled sends, and batch sends. Set the test duration appropriately — for batch sends, ensure the test duration is shorter than the batch window so a winner can be determined before the remaining batches go out.
How many contacts do I need to run a valid A/B test?
As a general rule, each variation should reach at least 200–500 contacts for results to be statistically meaningful. For very small lists, A/B tests are less reliable, and the declared winner may not reflect a true preference difference.
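As a back-of-the-envelope check on whether an observed difference is likely real, a standard two-proportion z-test can be applied to the variation results (illustrative only — this is not a feature the platform exposes, and the function name is hypothetical):

```python
from math import erf, sqrt

def two_proportion_z(successes_a: int, n_a: int, successes_b: int, n_b: int):
    """Normal-approximation z-test for the difference between two rates.
    Returns (z, two_sided_p). A rough guide, not a substitute for a
    proper statistics tool; assumes reasonably large samples."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2)))/2
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 60 opens of 250 vs. 40 opens of 250: a p-value under 0.05 suggests
# the difference is unlikely to be noise.
z, p = two_proportion_z(60, 250, 40, 250)
print(round(z, 2), round(p, 3))
```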
Can I pause an A/B test after it starts?
Yes. For scheduled and batch campaigns, you can pause the campaign from the campaign list by clicking the three-dot menu and selecting Pause. This stops all variants and the final winner send until you resume.
Does A/B testing work for workflow emails?
A/B testing through the campaign A/B feature applies to broadcast campaigns only. For workflow emails, use the Workflow Split Action to route contacts down different paths and compare performance between the branches over time.
Can I use AI to generate subject line variations?
Yes. The subject line field in the A/B test setup includes an AI generation option. Describe the email content or goal and the AI will suggest alternative subject lines to test.