Life is full of A/B tests. People perform A/B tests every single day, often without knowing. From parenting to gardening — and everything in between — we test options to see what works best: Does my child prefer hot dogs or chicken fingers? Do beans or tomatoes grow best in my garden?

For websites, A/B testing is a well-established part of the user experience toolbox. With it, we can home in on an individual page element, for example, and micro-optimize it for performance. This gives us answers to mission-critical what questions like: What element works best in this spot? What version of the page converts more?

Why, Why, Why?

But in website user experience, as in life, knowing the whats is sometimes not enough. We also need to know why. Without answers to the why questions, we can't begin to fix the whats that aren't performing to our liking. Why doesn't my son like hot dogs? Is it because his friend claimed they're made of ground-up lizards? Why won't my tomatoes grow here? Is there something wrong with the soil?

These whys are glaringly absent from traditional website A/B testing methodologies. Today’s website stakeholders demand answers to questions like: Why should I test this element, as opposed to another? Why do more customers convert from this image than from another? Why are users dropping off this form more frequently than before?

Increase ROI with Data-Driven Testing

To better direct the A/B testing process and bring deep customer-experience-based understanding to testing results, a new, data-driven approach to A/B testing is required.

That data-driven approach to A/B testing goes beyond the whats of website user experience to deliver the whys, too. It gives you the power to leverage existing user experience data and tools to create an A/B testing methodology that adds new layers of in-page insight to the testing toolbox. Those layers, in turn, enable more focused pre-test planning, augment testing results with measurable validation and increase ROI from site optimization efforts.

Before-, During- and After-Testing Insights

Before, during and after testing, the data-driven approach to A/B testing allows testers to visualize previously hidden factors that contribute to the success of one element over another and to reveal insights that facilitate faster test iterations.

Before Testing: Experienced A/B testers know that the most effective test is a well-researched one. One of the most challenging parts of the testing process is defining what to test, why those factors are important and how the various potential results can impact page performance.

In the pre-testing stage, data-driven A/B testing helps you focus test investments where they’ll have the greatest impact. Visually examining aggregated data from in-page customer experience metrics can help you discover which page areas require improvement using techniques such as:

Heatmaps:

Desktop and mobile heatmaps illustrate visitor mouse moves, clicks or taps, attention and scroll reach or exposure. These insights give you an aggregate-level visualization of what people are doing on your page and what specific elements could benefit from testing.
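To make the idea concrete, here is a minimal sketch of how raw click coordinates might be aggregated into the grid behind such a visualization. The event format, coordinates and bin size are assumptions for illustration, not the output of any particular heatmap tool.

```python
from collections import Counter

# Hypothetical click events: (x, y) page coordinates in pixels, as a heatmap
# tool might collect them. The values here are made up for illustration.
clicks = [(412, 310), (415, 305), (890, 1200), (410, 312), (60, 40)]

CELL = 50  # bin size in pixels; smaller cells produce a finer-grained heatmap

# Aggregate individual clicks into grid cells so hotspots stand out.
grid = Counter((x // CELL, y // CELL) for x, y in clicks)

# The densest cells point to the page elements most worth testing.
for (cx, cy), count in grid.most_common(3):
    print(f"Cell x={cx*CELL}-{(cx+1)*CELL}px, y={cy*CELL}-{(cy+1)*CELL}px: {count} clicks")
```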

Side-by-Side Comparisons:

Side-by-side heatmap comparisons can also be a great preliminary tool before moving into structured A/B testing. Comparing page interactions side-by-side allows you to better design your A/B test to micro-optimize for each target audience.

Analyzing Links:

Reviewing a graphical representation of every link on the page — including visitor hovers, clicks, hesitation time and visit order — can provide key pre-testing insights into user decision-making processes.
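As a rough illustration of the metrics involved, the sketch below derives hover counts, click counts and average hesitation time from a hypothetical per-link event log. The link names, timestamps and field names are invented for the example.

```python
from statistics import mean

# Hypothetical per-link events: hover timestamp and click timestamp in
# seconds (None if the visitor never clicked). Illustrative data only.
events = [
    {"link": "pricing", "hover_at": 12.0, "click_at": 14.5},
    {"link": "pricing", "hover_at": 8.2,  "click_at": 8.9},
    {"link": "signup",  "hover_at": 30.1, "click_at": None},
    {"link": "signup",  "hover_at": 22.4, "click_at": 27.9},
]

links = {}
for e in events:
    stats = links.setdefault(e["link"], {"hovers": 0, "clicks": 0, "hesitations": []})
    stats["hovers"] += 1
    if e["click_at"] is not None:
        stats["clicks"] += 1
        # Hesitation: how long the visitor lingered between hover and click.
        stats["hesitations"].append(e["click_at"] - e["hover_at"])

for link, s in links.items():
    if s["hesitations"]:
        summary = f"{mean(s['hesitations']):.1f}s avg hesitation"
    else:
        summary = "no clicks"
    print(f"{link}: {s['hovers']} hovers, {s['clicks']} clicks, {summary}")
```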

Drilling Down to Actual Customer Journeys

From the aggregate level, you can then drill down to the individual level to understand customer intent by watching actual customer journeys. This sets the stage for judging how your user experience affects your visitors' ability to achieve their goals, using tools such as:

Session Replays:

Watch replays of user browsing sessions to visualize exactly what visitors are seeing and doing on your desktop or mobile site.

Learning Opportunities

Forms:

Testers can overcome the limitations of traditional form metrics, like completion rates, by analyzing field-level data that reveals real-world form usability issues. That allows testing to focus on micro-optimizing layout, fields and other form-specific elements that can dramatically impact conversions.
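For instance, a simple field-level analysis might look something like the sketch below, which computes the traditional completion rate and then counts where abandoning visitors stalled. The session records and field names are hypothetical; real form-analytics tools capture far richer data.

```python
from collections import Counter

# Hypothetical form-analytics records: whether each visitor who started the
# form completed it, and the last field they touched. Names are illustrative.
sessions = [
    {"last_field": "email", "completed": True},
    {"last_field": "phone", "completed": False},
    {"last_field": "phone", "completed": False},
    {"last_field": "company", "completed": False},
    {"last_field": "email", "completed": True},
]

started = len(sessions)
completed = sum(s["completed"] for s in sessions)
print(f"Completion rate: {completed / started:.0%}")  # the traditional metric

# Go beyond the overall rate: which field do abandoning visitors stall on?
drop_offs = Counter(s["last_field"] for s in sessions if not s["completed"])
for field, count in drop_offs.most_common():
    print(f"Abandoned at '{field}': {count} visitors")
```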

During Testing: While your tests are running, data-driven A/B testing helps speed time to results and increase ROI for time and resources invested in the test planning stage.

For example, while tests are running, you can validate that test variations are working as intended by watching session replays immediately after the first visitors experience your tests. This enables quick corrective action if needed, reducing potential lost testing time and expense. By leveraging visual evidence of what experiences are working best, you can also iterate tests more rapidly and see results more quickly.

After Testing: In post-test assessments, data-driven A/B testing empowers testers to better understand why certain experiences win and others lose. For example, by comparing test variations side-by-side using heatmaps, it’s possible to uncover exactly how changes impacted visitor interactions.

By delving deeper into the why of your A/B testing results, you gain significant insights into the optimization cycle. Data-driven A/B testing helps you quickly and conclusively determine which versions were the true winners of the test cycle and how best to prioritize implementation of necessary changes.
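One common way to check that a winner really is a winner is a two-proportion z-test on conversion rates, sketched below with made-up visitor and conversion counts. Most testing platforms run an equivalent check for you; this only shows the arithmetic behind a "conclusive" result.

```python
from math import sqrt, erf

# Made-up results for two variations; real numbers come from your testing tool.
visitors_a, conversions_a = 5000, 400   # control
visitors_b, conversions_b = 5000, 460   # variation

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b

# Two-proportion z-test: is the observed lift likely to be more than noise?
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided

print(f"Control: {p_a:.1%}, Variation: {p_b:.1%}, lift: {(p_b - p_a) / p_a:.1%}")
print(f"z = {z:.2f}, p-value = {p_value:.3f}")
```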

Getting Buy-In for A/B Testing

Although A/B testing is anything but subjective, it can often be challenging to get organizational buy-in for changes indicated by testing results. Presentations of findings can easily degenerate into unnecessary and distracting questions about methodologies, confidence levels, variants and other technical details.

Data-driven A/B testing helps testers recast the dialogue from technical issues to user experience. By integrating customer intent with A/B testing data, you can markedly change the way your organization looks at web analytics as a whole. Furthermore, showing your internal stakeholders visualizations that highlight that a particular element or content piece isn’t having the desired effect can go a long way toward convincing them that it’s necessary to test.

The ROI of Asking Why

A/B testing platforms show test result data and indicate which version of a given page or page element worked best. Data-driven A/B testing adds the why to the what by revealing the reasons one version performed better than another. It is precisely these whys that decision makers and website stakeholders need to hear when evaluating testing findings and suggested changes.

Ultimately, data-driven A/B testing provides testers with the unique ability to quantify the impact of potential changes. Intelligent use of the data can even enable predictions of how revenues will be impacted when the recommended changes are implemented — and how much your company may lose if they’re not.
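As a back-of-the-envelope example, the sketch below projects a monthly revenue lift from a conversion-rate improvement. Every input is an assumption you would replace with your own traffic, conversion and order-value figures.

```python
# Illustrative projection only; all inputs below are assumptions.
monthly_visitors = 200_000
baseline_conversion = 0.020      # 2.0% before the change
tested_conversion = 0.023        # 2.3% measured for the winning variation
average_order_value = 80.00      # revenue per conversion

baseline_revenue = monthly_visitors * baseline_conversion * average_order_value
projected_revenue = monthly_visitors * tested_conversion * average_order_value

monthly_lift = projected_revenue - baseline_revenue
print(f"Projected monthly revenue lift: ${monthly_lift:,.0f}")
print(f"Estimated cost of not implementing, per year: ${monthly_lift * 12:,.0f}")
```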

Title image "Sorry" (CC BY 2.0) by Tobyotter
