E-commerce player Lenskart started A/B (split) testing its product pages across devices around two years ago. It has even run A/B/C tests at various points to try out new ideas. For example, while optimising the 'buy now' page, the company found that the moment customers clicked 'buy now', they saw a pop-up asking what kind of lenses they wanted fitted into their frames. This could be a tough decision with so many choices thrown in. So Lenskart tested two versions of the page and found that the one offering a comparison of the various lens options worked best with customers.
"The most tangible result of A/B testing is the huge jump in conversion. The percentage of customers adding the product to their shopping cart saw a significant increase after we introduced the above-mentioned grid. Testing has also been observed to improve our average ticket value as it enables us to seamlessly introduce and upsell customers on a new features or additional products," says Peyush Bansal, CEO & Founder, Lenskart.
On its part, OYO Rooms has been conducting intensive A/B testing on its mobile app to streamline its products. For example, when OYO Rooms faced the dilemma of whether to let users browse hotels on a map or in a list, it ran an A/B test and found that different customers have different preferences. While maps worked better in some pockets, other users found them intimidating because of the many data points plotted on them. So OYO kept both options, and a consumer who prefers maps sees that view first. It has also run A/B tests to understand which types of hotels to show at the top and their impact on conversions, filtering options and so on. "A/B testing is about first framing a hypothesis and then experimenting around it live. It enables consumers to get better targeted searches. Optimisation around search experiments can lead to a two to 10 per cent jump in conversion rates," says Anurag Gaggar, vice-president, product management, OYO Rooms.
A/B testing is not a new buzzword, though. While it is a mature practice in the West, with most professional businesses using it to ensure they maximise returns from their websites, in India it is relatively new. However, with consumer internet businesses flourishing in the country, a host of Indian start-ups have jumped onto the bandwagon and are testing different elements on their websites to make them as responsive and engaging as possible.
So what is A/B testing? It lets marketers create multiple versions of a web page, expose different versions to different sets of visitors and analyse how they engage with each. Based on the analysis, a company gets clarity on which page engages customers the most and how individual elements on the page can be tweaked to make it more responsive.
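Under the hood, the mechanics are straightforward: each visitor is consistently assigned to one version of the page, and conversions are tallied per version. The Python sketch below is purely illustrative; the variant names, visitor IDs and conversion log are hypothetical, not any company's actual setup.

```python
import hashlib
from collections import Counter

VARIANTS = ["A", "B"]  # hypothetical page versions

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor so they always see the same version."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

def conversion_rates(events):
    """events: iterable of (visitor_id, converted) pairs from a hypothetical purchase log."""
    seen, converted = Counter(), Counter()
    for visitor_id, did_convert in events:
        variant = assign_variant(visitor_id)
        seen[variant] += 1
        converted[variant] += did_convert
    return {v: converted[v] / seen[v] for v in VARIANTS if seen[v]}

# Made-up traffic: visitor u17 bought, u42 did not, and so on.
print(conversion_rates([("u17", 1), ("u42", 0), ("u77", 1), ("u90", 0)]))
```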
Kulzy, an online professional networking platform for the advertising, media and marketing community, has only recently got into A/B tests. "Kulzy is a data-driven site where analytics is crucial. It is important to have adequate data for each control page that accounts for the variables. Once that happened, the analytics team at Kulzy began to measure and record the response across a variety of predefined metrics. We are at the stage of experimentation and documenting our learnings," says Sandeep Vij, co-founder & director, Kulzy. At present, Kulzy conducts A/B tests across its main pages and emailers, checking buttons, positions, subject lines, layout, navigation, page sizes and so on. It also uses the data to evaluate pages for time spent, sign-ups, open rates, bounce rate and so on.
Conducting A/B tests is easier said than done. Creating a hypothesis, setting up the test and exposing it to different sets of users is just one part of the exercise. "A/B testing is good when you have a clear hypothesis and you have to test it. It is not a replacement for customer research. A/B testing can be used to optimise over an existing set of ideas, but it does not throw up new ideas," says Gaggar of OYO Rooms.
Getting it right
Whether or not to conduct A/B tests depends on how big your exposure to digital is. For example, at Panasonic, the focus is more on market research. But for products such as air purifiers, where the target audience is digitally very active, it uses A/B testing to create customer stickiness. "For air purifiers, our target group falls into a category where there is a huge tech focus. Here we are looking at throwing three or four different kinds of creatives to measure customer stickiness. For some of these categories, we have done A/B testing. It depends on what one's exposure to digital is," says Sarthak Seth, head, brand & marketing communication, Panasonic India.
One of the critical questions a marketer has to answer before conducting an A/B test is what actually makes the customer buy and how to convert customer preferences into actual purchases in the marketplace. "The key element of A/B testing, and of using it for the right marketing strategy, is to assimilate the test results into the organisation, its marketing, all possible consumer touch points and the larger framework," says Joginder Chhabra, head, market & consumer insights, LG Electronics India. "The main objective of split tests is to continue with them till they achieve the optimum conversion potential of the website and to take the best insights into offline media campaigns," adds Chhabra.
Split testing, experimental design and multivariate testing are therefore largely used in India by IT and tech giants to study the impact of multiple variables such as offers and incentives. Organisations and businesses dealing directly with a large consumer base, such as banks, airlines, e-commerce firms and telecom companies, can also benefit from such tests.
However, in some cases the cost of experimentation can be quite high as one has to build multiple versions of the same model or design. "It doesn't make sense to experiment without going and talking to consumers first. For instance, building a low-cost prototype and showing it to consumers could be a better way to keep costs in check," says Gaggar.
Three critical areas where A/B testing can be used by businesses to solve problems and improve conversion rate are:
Landing page optimisation
Businesses with a service-based model can use A/B testing to increase sign-ups on their pay-per-click (PPC) landing pages. The purpose of a landing page is to direct the user towards a desired action - largely to sign up or buy. In 2015, search engine marketing is expected to capture the largest share of online spend at 47 per cent, or about 14 per cent of a company's total marketing budget.
To get better ROI out of their PPC marketing spend, marketers could use landing page optimisation techniques. A/B testing allows them to easily test out messaging, visuals, layout and information flow to find out what works best for their audience.
Reducing cart abandonment
One of the biggest pain points for e-commerce companies is the high cart abandonment rate, which varies between 60 per cent and 75 per cent according to different estimates. This means more than two-thirds of shoppers put a product in the cart and then decide against buying at the very last moment. Hidden shipping costs, security concerns and the lack of a guest checkout option are some of the reasons for shopping cart abandonment. Marketers can use these insights to run A/B tests that help reduce the abandonment rate. For example, they can test a security seal against a privacy policy to see which assuages customers' security concerns better.
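To judge such a test, analysts typically compare the two variants' conversion rates with a standard two-proportion z-test. The sketch below is a generic illustration with made-up numbers, not a result from any of the companies quoted here.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Standard two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided
    return z, p_value

# Hypothetical numbers: 5,000 carts per variant; the security-seal page
# converts 1,450 of them, the privacy-policy page converts 1,320.
z, p = two_proportion_z(1320, 5000, 1450, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p-value suggests the lift is unlikely to be chance
```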
Improving engagement
A bounce occurs when a visitor lands on your website, hangs around for a bit and then leaves without visiting another page.
The average bounce rate for the e-commerce industry is 34 per cent, which probably means that visitors are not finding relevant information on the page. One way A/B testing can be used on an e-commerce homepage to reduce bounce is to test whether a pop-up modal box is annoying visitors. These modal boxes can sometimes push the visitor into bouncing off. An A/B test can find out if that is the case and potentially reduce bounce. A/B testing also allows marketers to try out different checkout experiences to find the optimal one.
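Bounce itself is easy to compute from session logs: a session that viewed only one page counts as a bounce, and the rate is then compared across the two homepage variants. The snippet below is a hypothetical illustration; the session data and variant labels are invented.

```python
def bounce_rate(sessions):
    """sessions: list of (variant, pages_viewed) tuples from a hypothetical analytics log.
    A bounce is a session with exactly one page view."""
    totals, bounces = {}, {}
    for variant, pages in sessions:
        totals[variant] = totals.get(variant, 0) + 1
        bounces[variant] = bounces.get(variant, 0) + (pages == 1)
    return {v: bounces[v] / totals[v] for v in totals}

# "modal" = homepage with the pop-up box, "no_modal" = homepage without it (invented data)
sample = [("modal", 1), ("modal", 1), ("modal", 3),
          ("no_modal", 1), ("no_modal", 4), ("no_modal", 2)]
print(bounce_rate(sample))  # the variant with the lower rate keeps more visitors browsing
```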
Sparsh Gupta
Partner & CTO, Wingify