The average conversion rate for ecommerce is in the 2-3% range. What if I told you that you could add a full percentage point by removing some elements from the checkout page?
Without data to back it up and a test on your own pages, you shouldn't trust me. Still, it's worth keeping an eye on conversion rate optimization (CRO) case studies to see what others are doing so you don't fall behind.
Needless to say, no case study should be copied blindly. Examine it, check whether your own data suggests similar trends, and only then consider running similar tests on your own audience.
To help you avoid getting left behind, I’ve highlighted four ecommerce CRO case studies that examine various challenges -- from checkout page design to capturing emails.
Which Is Better - Percent Off or Dollars Off?
Discounts and coupons are generally a great way to increase sales. When running a campaign, you can either offer a fixed $ amount off or offer a percent off. Both of them can work, but which one is better at driving more sales?
This is exactly the question the team over at Evo wanted to answer. Evo is an ecommerce outfit that sells all things sports. In the past, they'd had mixed results with both "$ off" and "% off" offers, but they had never tested the two against each other.
“We had done a lot of percentage-off coupons and not as many from the dollar-off variety. I had a gut theory that one would perform better than the other and wanted to find out,” says Nathan Decker, Ecommerce Director, Evo.
When the time came to send out their next promotion, they decided to test it out. To make sure they got enough data to make the test viable, they chose one of their popular segments, wakeboarding. Based on average order size, they chose $50 off as a reasonably good offer that would still leave room for profit.
Using the same parameters, the competing offer was set at 15% off, simply because it worked out to be more or less equivalent to the $50 off offer.
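A quick back-of-the-envelope check shows why the two offers were roughly equivalent: $50 off matches 15% off when the order comes to about $333. A minimal sketch (the break-even order value is derived from the offer math, not a figure Evo published):

```python
# Find the order size at which "$50 off" and "15% off" are worth the same.
DOLLAR_OFF = 50.0
PERCENT_OFF = 0.15

break_even_order = DOLLAR_OFF / PERCENT_OFF  # ≈ $333.33

def best_offer(order_total: float) -> str:
    """Return which discount saves the customer more on a given order."""
    dollar_saving = DOLLAR_OFF
    percent_saving = order_total * PERCENT_OFF
    if dollar_saving > percent_saving:
        return "$ off"
    if percent_saving > dollar_saving:
        return "% off"
    return "equal"

print(round(break_even_order, 2))  # 333.33
print(best_offer(250))             # $ off — below the break-even point
print(best_offer(500))             # % off — above it
```

Below that break-even point the dollar discount is objectively the better deal, which is worth keeping in mind when you pick "equivalent" offers for your own test.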
The competing offers, 15% off vs. $50 off, were prominently displayed in the subject line as well as in the email copy. After 48 hours, a follow-up email reminded customers of the offer and of the 6-day time limit to redeem it.
When the results came in, the monetary offer was the clear winner: the $50 off offer generated 170% more revenue and had a 72% higher conversion rate.
However, the offer didn't seem to affect open rates much: 20% for $ off versus 19% for % off. The same held for clickthrough rates, at 32.4% and 33% respectively. On average order size, though, 15% off was the clear winner, producing 44% bigger orders.
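The article doesn't publish Evo's sample sizes, so before acting on a lift like that 72% you'd want to check it isn't noise. A minimal two-proportion z-test sketch, with made-up recipient counts purely for illustration:

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via the error function).
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical numbers: 5,000 recipients per arm, 86 vs 50 conversions (~72% lift).
p_value = two_proportion_z_test(86, 5000, 50, 5000)
print(p_value < 0.05)  # True — a lift this size on these samples would be significant
```

With much smaller lists the same relative lift can easily fail to reach significance, which is exactly why Evo picked a large, popular segment for the test.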
“It turned out that the dollar-off coupon was a better way to position offers because the perceived value is higher for our customers. The ‘defined amount’ that you get turned out to be a much more effective incentive. Going forward, we are going to use the dollar-off [tactic] to drive more sales,” says Decker.
Will you have similar results? It’s hard to say without testing it out with your products and your customers.
One other thing to note: whichever offer works best may depend not only on your products but also on the age of your customers, as demonstrated below:
Screenshot via “Consumer Views of Email Marketing 2015”
Email Promotions and Daily Deals
WineExpress, an online wine retailer, features a daily wine deal prominently on its main landing page and also drives traffic to it by sending the deal to a targeted email list.
Naturally, WineExpress wanted to convert more visitors into paying customers and thus turned to A/B testing. The challenge was that the daily wine page was already converting well.
Optimizing the daily deal alone could easily have backfired: the company might have converted more customers on daily deals but, with a lower average order size, ended up losing money.
This is what the page looked like before:
Image via WiderFunnel
The main objective was to do away with the navigation bar altogether while still keeping a way to push additional sales and grow the average order size.
Using help from WiderFunnel, the company developed and tested three different designs aiming mostly to test different layout approaches.
Variation A: “CTA Right”
- Removed top navigation
- Streamlined top banner
- Added countdown offer to top
- Moved product description up
- Moved video above the fold
- Increased prominence of customer reviews
- Decreased prominence of ‘you may also like’ items
- Widened the page
Variation B: “New CTA Right”
This variation was essentially the same as Variation A “CTA Right”, except that the call-to-action area was enlarged, pushing the video down to the fold line.
Variation C: “New CTA Left”
This variation was a mirror image of Variation B “New CTA Right”, moving the CTA and video feature to the left while product info and customer reviews went to the right column.
Image via WiderFunnel
In the end, two of the three variations beat the original on the initial goal: conversions and revenue per visitor. The winner, Variation A (featured above), lifted revenue per visitor by 41%.
This kind of product page design (removing all navigation) is best suited when you’re trying to push individual products and bundles. Offers like daily deals are made for this kind of testing.
How is your daily deal page looking? Are you planning to run experiments to improve it? If not, I urge you to reconsider.
Saving Payment Details
Offering guest checkout is generally considered a cleaner and faster experience for would-be shoppers, and it can help alleviate the perception that you're only after users' data by forcing them to register before making a purchase.
Then again, with guest checkout, the email address you capture can only be used for shipping updates, not marketing -- taking away the opportunity to send newsletters, promotions and more to your customers.
Ideally, you would offer easy and fast checkout while still encouraging customers to sign up for an account. This is exactly the problem Hillary Clinton's tech team ran into: they had an easy and fast donations flow but weren't happy with the number of donors who registered to save account information.
Originally, supporters saw the following screen after making a donation:
Image via Medium
When clicking on “Save my payment details,” the following form appeared prompting the donor to sign-in or register to save payment details:
Image via Medium
Clicking on “Create an account?” opened this form:
Image via Medium
All in all, it was a confusing three-step process: none of the fields were pre-populated with data from previous steps, and there was no way to check whether the donor's email already belonged to a registered user.
The new flow removed a click and confusion from the process by pre-populating the email field with data provided in the donation form, and adding an automatic verification of whether the user already had an account.
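The decision logic behind the new flow is simple: after the donation, reuse the email from the donation form and branch on whether an account already exists. A hypothetical sketch (the function and prompt wording are mine, not the campaign's actual code):

```python
# Hypothetical account lookup: in the real flow this would query the user database.
EXISTING_ACCOUNTS = {"returning@example.com"}

def post_donation_prompt(donor_email: str) -> str:
    """Pick which save-payment prompt to show, with the email pre-populated."""
    if donor_email in EXISTING_ACCOUNTS:
        # Known account without saved payment details: just ask them to sign in.
        return f"Sign in as {donor_email} to save your payment details"
    # No account yet: offer account creation with the email already filled in.
    return f"Create an account for {donor_email} to save your payment details"

print(post_donation_prompt("new@example.com"))
print(post_donation_prompt("returning@example.com"))
```

The point is that the donor never retypes their email or guesses whether they have an account; the system already knows and shows only the one relevant prompt.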
After the donation was complete, new users (no account associated with the email) saw this prompt:
Image via Medium
Those who already had an account but hadn't saved payment information saw the following:
Image via Medium
Following the tests, Kyle Rush, Deputy Chief Technology Officer for the Clinton campaign, described the results:
“I’ve been doing a/b tests for about five years and this is the biggest, most conclusive win I’ve ever seen.”
A common frustration with A/B testing is that a clear winner in testing doesn't always show similar gains when deployed to 100% of visitors. With this change, it did. The large spike after January 1st marks when the winning format was deployed to production:
Image via Medium
The Importance of Heatmaps and Session Replays
Wyldsson is a UK-based company selling healthy snacks, dried seeds and other toppings. During their new website launch, hopes were high. After all, it had taken months of design work, development and planning. The site looked good. Would it produce results to match?
Unfortunately, things didn't go as planned: sales immediately dropped compared to the old design. With a completely redesigned customer experience, some drop was always likely while customers familiarized themselves with how the new design looked and functioned.
“It was very frustrating. We had spent the best part of a year and a ton of money building our shiny new store, and it wasn't delivering,” says Dave McGeady, CEO of Wyldsson.
After giving it more time and seeing no uplift, McGeady decided to take action. The solution was Hotjar, an optimization suite that provides heatmaps, visitor recordings, funnel analysis, form analytics and more.
As there weren't any obvious red flags to fix, they used the software's heatmaps (which show where visitors click) and session recordings to see how visitors were actually interacting with the shop.
It became clear that visitors were running into browser compatibility problems (the shop wasn't functioning correctly on some browsers) and issues with user accounts.
Image via Hotjar
Once identified, the problems were fixed, and a new round of heatmaps and session replays confirmed that everything worked as intended. The result: a 30% increase in sales. 30%!
It's not always about adding or removing a button, changing its color or swapping product images. Sometimes the problem is simply that the site doesn't function at all on some browsers, or that visitors do things the developer never envisioned.
No amount of A/B testing could have revealed the issues Wyldsson had. Sometimes it's a good idea to simply watch how visitors interact with your shop and make improvements from there.
Each and every website is different and caters to a different audience. What worked for one is not guaranteed to work for another, and vice versa. Take these case studies as a look at what worked for your peers and use them as a basis for your own testing.
There are no hard and fast rules that work for everybody. There’s also no escaping your own testing. Good luck!