Sunday, August 12, 2012

A/B testing: The secret to successful conversion

“Anytime you’re not testing is a waste of traffic.” So says Jonathan Isernhagen, Director of Marketing Analysis, Travelocity.com, who joined Sue Chapman, Director, Merchandising Practice, Demandware and Doug Rosenberg, Manager, Online Traffic and Loyalty, Brooks Sports for the “Leveraging the Power of A/B Testing” panel at the recent Shop.org Online Merchandising Workshop.

Testing was a popular theme in general among speakers and panelists at the workshop. And as any online merchandiser knows, A/B testing in particular has proved itself time and again as an invaluable tool for determining page layout, messaging, offers, and much more. The net effect is smoother navigation that helps customers find and learn more about what they’re looking to buy.

Still, A/B testing is not always either well-understood or well-executed – or, for that matter, well-documented and archived for future reference. The panelists had a few overarching words of advice, followed by some detailed case studies to illustrate tactics and results.

Define and answer key questions. Retailers should start any A/B testing exercise with a few key questions, Chapman emphasized, including:

  • Determining what to test.
  • Deciding on your goal – that is, what are you trying to prove? This question is at least as important as – if not more important than – determining what you want to test.
  • Evaluating the results and measuring KPIs – that is, determining which metrics will identify the winner of a particular test (see the sketch after this list).
  • Selecting and deploying the winner.
  • Learning from the challenges, results, and even surprises.
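
To make the “evaluate the results” step concrete, the following is a minimal sketch (in Python, not from the panel) of how a winner might be called when the primary KPI is conversion rate, using a standard two-proportion z-test. The function names and traffic figures are illustrative assumptions, not data from any of the retailers mentioned here.

    from math import sqrt

    def conversion_rate(orders, visitors):
        # Conversion rate: completed orders divided by unique visitors.
        return orders / visitors

    def z_score(orders_a, visitors_a, orders_b, visitors_b):
        # Two-proportion z-test: is variant B's conversion rate
        # significantly different from variant A's?
        cr_a = conversion_rate(orders_a, visitors_a)
        cr_b = conversion_rate(orders_b, visitors_b)
        pooled = (orders_a + orders_b) / (visitors_a + visitors_b)
        se = sqrt(pooled * (1 - pooled) * (1.0 / visitors_a + 1.0 / visitors_b))
        return (cr_b - cr_a) / se

    # Illustrative numbers only: 50,000 visitors per variant.
    z = z_score(orders_a=1000, visitors_a=50000,
                orders_b=1120, visitors_b=50000)
    print("z = %.2f" % z)  # values beyond about +/-1.96 suggest a 95%-confident winner

The larger point is simply that the winning metric and the confidence threshold should be agreed on before the test starts, so everyone reads the results the same way.
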
Anticipate testing challenges. Brooks Sports’ Rosenberg underscored the importance of understanding and planning for testing challenges. For example, A/B tests require resources and “bandwidth” from both the creative team and the business or product owners. Retailers also need to foster a testing mindset and skills within the development team – communication between key cross-functional groups is essential to a successful testing strategy. Additionally, there’s the challenge of deciding what to test and when, so it’s important to define a process for prioritizing tests. All of that said, Rosenberg advised retailers to just keep at it: “Test, learn, and test again!”

Document and archive testing findings. Isernhagen stressed that, since the Travelocity team is always A/B testing something on the site, it’s key to document what is being tested for both current and future reference. Team members come and go over time, but this institutional knowledge archive remains as an ongoing reference for answering questions about past tests and their results. If a team member wants to test something, they can first check the archive, retrieve the lessons already learned, and avoid wasted effort. Of course, for the archive to work, Isernhagen advised that retailers also agree on a common taxonomy upfront, along the lines sketched below.
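
An archive only works if every entry captures the same fields. As a purely hypothetical illustration of such a shared taxonomy (the field names below are assumptions, not Travelocity’s actual schema), a single test record might look like this:

    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    @dataclass
    class TestRecord:
        # One archived A/B test; the field names are hypothetical examples
        # of a common taxonomy the whole team agrees on upfront.
        name: str                     # e.g. a category sort-order test
        hypothesis: str               # what the test was trying to prove
        primary_kpi: str              # the metric that identifies the winner
        variants: list = field(default_factory=list)
        start_date: Optional[date] = None
        end_date: Optional[date] = None
        winner: str = ""
        learnings: str = ""           # surprises and lessons for future team members

Whatever the exact fields, the value comes from consistency: a future team member should be able to search past records and immediately understand what was tested, why, and what won.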

Leverage A/B testing case studies. The following are quick glimpses of A/B tests that Demandware ran for two clients. Be sure to also see the detailed panel presentation to learn more about the A/B tests run by Brooks Sports and Travelocity.

  • House of Fraser. This British high-end department store group wanted to determine the optimal default sort order for a category page. For its handbag category, it tested whether the default sorting rule should be based on number of units sold (A) or on revenue (B). Sorted by units sold, the page appeared cluttered and unfocused, surfacing less expensive handbags and accessories such as leather cleaner (lower price points, higher unit counts). Via A/B testing, the team found that sorting the handbag category page by revenue showcased the quality and selection of handbags available (styles and brands). Not only did this sort represent the House of Fraser brand far more accurately, but the conversion rate on the page rose 12% – clearly a winner on both counts.
  • Living Direct. For its laundry appliance landing page, this niche small-appliance retailer ran a 50/50 split test to determine whether it should stick with its “Guided Shopping Experience” landing page (featuring 7 different laundry appliances that customers could click to refine their search further) or switch to a “Featured Image” landing page highlighting a single product. The featured image page delivered a 2.58% conversion lift, an 8.25% increase in average order value, 10.62% growth in revenue per visit, and an 11% increase in overall revenue. Not necessarily an intuitive direction, but one proven through disciplined A/B testing (the sketch below shows how lifts like these are calculated).
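
For readers who want to see the arithmetic behind lift figures like these, here is a minimal sketch of how per-variant KPIs and lifts in a 50/50 split test can be computed. The traffic and revenue numbers are made up for illustration and are not Living Direct’s actual data.

    def lift(control, variant):
        # Percentage change of the test variant over the control.
        return (variant - control) / control * 100

    def split_metrics(visits, orders, revenue):
        # Per-variant KPIs for one arm of a 50/50 split test.
        return {
            "conversion rate": orders / visits,
            "average order value": revenue / orders,
            "revenue per visit": revenue / visits,
        }

    # Made-up figures for illustration only (not Living Direct's actual data).
    guided = split_metrics(visits=20000, orders=400, revenue=120000)
    featured = split_metrics(visits=20000, orders=420, revenue=133000)

    for kpi in guided:
        print("%s: %+.2f%% lift" % (kpi, lift(guided[kpi], featured[kpi])))
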


By Fiona Swerdlow, Head of Research, Shop.org
http://blog.shop.org/
