10 A/B Testing Examples to Boost SMS Performance

Posted in SMS Marketing · Published on Feb 14, 2023 · Written by Kayla Ellman

If you want to optimize the text messages you send, A/B testing is key. But what should you test? Keep reading (and bookmark this post) for ideas and best practices.

What makes people engage with the text messages they get from brands? It depends—not only on your brand voice and target audience, but also on factors like emoji usage, call-to-action language, and send times.

That’s why A/B testing is an essential part of getting the most out of your SMS program. By experimenting with different elements, you can understand what resonates with your audience and where there’s room for improvement in your messaging and strategy.

Some brands’ subscribers may respond well to playful messages that include an image or emojis and feel like they’re texting a friend. Others may prefer more straightforward messages that only include a short amount of text and a link to shop.

To help you figure out the right approach for your brand, we’ve pulled together these A/B testing examples you should try on SMS. These tests will help you determine the best way to structure your campaigns and triggered messages, what type of content to include, and more, to make sure you’re maximizing engagement and conversions.

What is A/B testing?

A/B testing is a randomized experiment that compares multiple versions of a marketing campaign or piece of content (e.g., on email, SMS, or your website) to determine how different variables (e.g., subject line, copy, imagery) impact performance.

For SMS, common metrics to measure the success of an A/B test include click-through rate (CTR), opt-out or unsubscribe rate, conversion rate (CVR), and total revenue.
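As a quick illustration, each of these metrics is just a ratio over messages sent. The counts below are made up, and your SMS platform will typically report these numbers for you—computing them yourself is mainly useful when working with exported raw campaign data:

```python
from dataclasses import dataclass

@dataclass
class VariantResult:
    """Raw counts for one variant of an SMS A/B test (illustrative numbers)."""
    sent: int
    clicks: int
    orders: int
    opt_outs: int

    @property
    def ctr(self) -> float:
        """Click-through rate: unique clicks per message sent."""
        return self.clicks / self.sent

    @property
    def cvr(self) -> float:
        """Conversion rate: orders per message sent."""
        return self.orders / self.sent

    @property
    def opt_out_rate(self) -> float:
        """Unsubscribes per message sent."""
        return self.opt_outs / self.sent

# Hypothetical results for two variants of the same campaign
a = VariantResult(sent=5000, clicks=400, orders=60, opt_outs=25)
b = VariantResult(sent=5000, clicks=475, orders=80, opt_outs=30)

print(f"A: CTR {a.ctr:.1%}, CVR {a.cvr:.1%}, opt-out {a.opt_out_rate:.2%}")
print(f"B: CTR {b.ctr:.1%}, CVR {b.cvr:.1%}, opt-out {b.opt_out_rate:.2%}")
```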

Types of A/B tests to run on SMS

Now that you know what A/B testing is and why it’s important, here are some ideas and examples to help you get started:


SMS vs. MMS

SMS A/B test example from MATE the Label

Incorporating multimedia—like images, GIFs, video, or audio—is a fun way to add color to your text messages. But maybe your customers are more likely to engage with text-only messages (i.e., SMS vs. MMS). This is one of the first and easiest A/B tests you can run to see which type of message works better for your audience.

Remember to keep the copy the same for each variation, so you’re only comparing the effectiveness of including an image vs. not including one.

Image type and content

SMS A/B test example from Farmacy Beauty

If you’ve determined that your subscribers are more likely to engage with messages that include visuals, the next step is to test different types of images to see which ones convert best. For example, you could include a product close-up in one variant, an image of a model in another variant, and a lifestyle image in a third variant—using the same copy in each one.

You can also run A/B tests to see whether static images, GIFs, audio, or video have more of an impact on click-through and conversion rates. Or, test whether adding text over an image (e.g., to reinforce a limited-time sale or offer) performs better than the same image without the text.

Emoji usage

SMS A/B test example from Cotopaxi

Emojis can be another great way to add personality to your text messages. While we typically recommend using emojis sparingly—and only when they add value to the message—every audience is different, so it’s important to test what works best for yours.

To see if emojis are effective in your SMS campaigns, start by running an A/B test comparing messages with and without emojis. If you find that customers respond positively to messages with emojis, then you can run additional tests to figure out the ideal number of emojis to use per message and which emojis get more engagement.

Format and length

SMS A/B test example from Little Sleepies

Keeping your text message copy between 75-115 characters (or 3-4 lines long) is a good rule of thumb, but you can run tests to see if your subscribers prefer shorter or longer messages. Try to keep the copy generally the same for both versions, using the shorter message copy as the foundation for your longer message, so you can accurately compare them.

Experiment with how you structure your messages, too. Do your subscribers prefer when you use no line breaks or many? Does it make a difference when you capitalize certain words (e.g., FREE, SALE) or keep the whole message in sentence case? Test these elements one at a time to fine-tune the format and length of your text messages.

Link placement and destination

Every text message you send should include a tracking link to your website, but where you place the link can affect how many people click it. Run tests to see if placing links near the top of your message, in the middle of your copy, or at the bottom drives more conversions.

You should also experiment with directing subscribers to different pages on your website. For example, when you launch a new collection, does sending people to the full collection page or to a specific product page drive more purchases? You can do a similar test when promoting new arrivals: try creating one variation where you direct people to your homepage and one where you link directly to your new arrivals page.

Call to action

SMS A/B test example from Three Bird Nest

Ending your text messages with a clear call to action can help encourage subscribers to click through and shop immediately. But the language you use might make them more (or less) inclined to take that next step. You can test different short and sweet options (e.g., “shop now,” “click here,” or “ends soon”) and capitalizations (e.g., “Shop now” vs. “SHOP NOW”). Or, see if something more playful or descriptive works better (e.g., “What are you waiting for?” vs. “Shop now”).


Offers and incentives

SMS A/B test example from Our Place

We’ve found that consumers typically prefer dollar-off or percentage-off discounts, but do your customers favor one over the other? Do they want other incentives, like free shipping, loyalty benefits, or first access to new product drops? Run A/B tests across your campaigns and triggered messages to find out what drives the biggest impact for your brand.

You should also A/B test your email and SMS sign-up units to see if you convert more website visitors into subscribers with discount-based incentives or other kinds of offers.

Copy variations

SMS A/B test example from Mejuri

In some cases, you might want to experiment with different copywriting approaches or positioning. For example, highlighting the benefits of specific items vs. using a fun play on words to garner interest, or leaning into FOMO to promote limited-stock items vs. positioning them as “back-in-stock” while supplies last.

Just remember to keep as many elements as possible the same in both versions—like the link and the image—since the copy is the variable you’re testing. 

Tone of voice

For some brands, sending text-only messages that get straight to the point may work better (e.g., “We’re having a sale. Shop now!”). For others, it might be more effective to use a casual tone of voice that reflects the personal nature of text messaging (e.g., “Hi friend! Our new collection has your name written all over it. Treat yourself to something new.”). Play around with different tones and styles until you find something that resonates with your subscribers and aligns with your brand voice.

This is also an area where you can play around with dynamic variables, such as the {firstName} macro, to see if your subscribers respond well to being addressed directly.
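As a rough sketch of what a personalization test looks like under the hood—the `render` helper and the fallback value here are hypothetical, not Attentive's implementation—a {firstName}-style macro is essentially a string substitution with a fallback for subscribers whose names you don't have:

```python
def render(template: str, subscriber: dict, fallback: str = "friend") -> str:
    """Illustrative rendering of a {firstName}-style macro with a fallback.

    The fallback keeps the message grammatical when the subscriber's
    first name is missing from your data.
    """
    return template.replace("{firstName}", subscriber.get("firstName") or fallback)

# Variant A personalizes the greeting; variant B leaves it out entirely.
variant_a = "Hi {firstName}! Our new collection just dropped."
variant_b = "Our new collection just dropped."

print(render(variant_a, {"firstName": "Kayla"}))  # personalized greeting
print(render(variant_a, {}))                      # falls back to "friend"
```

When you run this kind of test, make sure the personalized variant degrades gracefully for subscribers without a stored name—otherwise you're really testing data quality, not personalization.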

Send times or wait times

When’s the best time to send an SMS campaign? Your audience will have its own preferences, but a good place to start is by sending the same message at different times of day (e.g., morning, afternoon, and night) and on different days of the week to see when your subscribers are most active. If you find that more people tend to open and engage with your messages at night, you can get more granular and test the same message at hourly intervals (e.g., 5pm, 6pm, 7pm, 8pm).

Keep in mind: Under the TCPA and related state laws, you can’t send text messages during “quiet hours.” Attentive's default and recommended Quiet Hours are 8pm to 12pm EST. If you use Attentive, we also recommend using our time zone-based message sending feature, which allows you to schedule and send messages based on a subscriber's local time zone.
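A minimal sketch of a quiet-hours check, assuming the federal TCPA window of 9pm–8am local time—the exact hours in the constants below are an assumption for illustration, so confirm the window that actually applies to you, since state laws and platform defaults differ:

```python
from datetime import datetime, time
from zoneinfo import ZoneInfo

# Illustrative quiet-hours window (assumed 9pm-8am local); verify the
# hours required by your own legal and platform compliance settings.
QUIET_START = time(21, 0)  # 9pm local
QUIET_END = time(8, 0)     # 8am local

def is_quiet_hours(send_utc: datetime, subscriber_tz: str) -> bool:
    """True if the send time falls in quiet hours in the subscriber's local zone."""
    local = send_utc.astimezone(ZoneInfo(subscriber_tz)).time()
    # The window wraps past midnight, so test "after start OR before end".
    return local >= QUIET_START or local < QUIET_END

# 3:00 UTC on Feb 14 is 10:00pm Feb 13 in New York (EST): inside quiet hours
print(is_quiet_hours(datetime(2023, 2, 14, 3, 0, tzinfo=ZoneInfo("UTC")),
                     "America/New_York"))
```

Evaluating the window in the subscriber's local time zone, rather than your own, is the whole point—this mirrors what time zone-based sending features do for you automatically.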

You should also test the timing of your triggered messages to see how different wait times impact performance. For example, A/B test your abandoned cart reminders to send after one hour vs. three hours to see which one leads to more completed purchases. Or, experiment with sending post-purchase messages after 14 days vs. 30 days to figure out the best time to nudge recent shoppers to come back and buy again.

A/B testing best practices

Before you start running A/B tests, there are a few best practices to keep in mind to make sure you get the most accurate results:

  • Test one variable at a time. If you want to see how static images perform compared to GIFs, make sure to use the same message copy in each variation. But if you also want to compare the impact of using a casual vs. a professional tone, you should run two separate A/B tests, rather than testing both variables (i.e., image type and tone of voice) at the same time.
  • Sample size is key. We recommend testing each variant in your A/B test with a segment of at least 500 subscribers. However, the larger the test group, the more accurate your results will be, and the better they’ll reflect your SMS audience’s preferences‌.
  • Remember to run multiple A/B tests. It’s important to gather enough data to deliver statistically significant results. For example, if you find that your audience prefers text messages that include images (i.e., MMS vs. SMS), make sure you run the test again, with a few different campaigns, to verify that the results are consistent.
  • Try comparing two or more variations. If you use Attentive, you can run up to 30 different variations in a single A/B test campaign, which allows you to test a wide range of content types or message send times.
  • Boost performance with auto-winner A/B tests. When you create an A/B test campaign in Attentive, you can choose what percentage of your recipients should receive the test (e.g., 20%), the winning criteria (e.g., CTR), and when to send the winning variation to the remaining recipients (e.g., after two hours). Auto-winner tests let you gain insights about your audience while making sure the best-performing variant is sent to the majority of your SMS list.
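On the statistical-significance point above, one common approach—not specific to Attentive or any SMS platform—is a two-proportion z-test on click-through rates. The counts below are hypothetical; a |z| above roughly 1.96 corresponds to significance at the 95% confidence level:

```python
from math import sqrt

def ctr_z_score(clicks_a: int, sent_a: int, clicks_b: int, sent_b: int) -> float:
    """Two-proportion z-score for the difference in CTR between variants A and B."""
    p_a, p_b = clicks_a / sent_a, clicks_b / sent_b
    # Pooled click rate under the null hypothesis that both variants perform equally
    p_pool = (clicks_a + clicks_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    return (p_b - p_a) / se

# Hypothetical campaign: variant B's CTR looks higher, but is the lift significant?
z = ctr_z_score(clicks_a=400, sent_a=5000, clicks_b=475, sent_b=5000)
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

This also shows why sample size matters: with only 500 sends per variant, the same CTR gap would produce a much smaller z-score and an inconclusive result.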

A/B testing should be an ongoing part of your SMS strategy. Even when you find something that works for your brand, continue to test your campaigns and journeys regularly to make sure you’re always sending the most effective messages possible.
