A/B testing has been a meaningful tactic since the earliest days of web analytics.
Just recently, Adobe announced a significant revision to its A/B testing platform, Target, one that reflects how machine learning is influencing analytics features.
Enhanced Personalized Testing
Adobe introduced Automated Personalization, an option in Target that leverages machine learning by applying a personalization engine in a test environment.
By selecting Auto-Target in the setup workflow for a test, marketers can select and test a variety of content as variables according to the customer experience of a website visitor or app user. The content, as in most A/B and multivariate tests, can include different layouts, image choices, alternate wording, or a combination of each.
The result is a tool that can conduct personalization testing to determine which website or app elements resonate with individuals.
Relating Metrics to Personalization
Adobe’s announcement reflects a shift in optimization testing, one based on a need to better understand which kinds of personalization enhance the customer experience.
Metrics were originally conceived as indicators of how well people interacted with a website's structural elements. A conversion rate metric, for example, represents the percentage of people who visited a website or app and acted on a button, reached a page or engaged with another element of interest to the marketer.
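As a simple illustration, a conversion rate is just conversions divided by visitors. The sketch below uses made-up counts and is not tied to any particular analytics product.

```python
# Hypothetical counts, for illustration only
visitors = 12_500      # unique visitors who saw the page
conversions = 430      # visitors who clicked the button of interest

conversion_rate = conversions / visitors
print(f"Conversion rate: {conversion_rate:.2%}")  # Conversion rate: 3.44%
```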
Now metrics must account for more than just a click of a button or a download of a white paper.
Metrics must incorporate a series of digital-related behaviors and how consistently those behaviors occur over time.
Such metrics are meant to mimic the human decisions behind a click of a button, much like a decision tree. Thus, metrics must align in some ways with machine learning processes to be valuable for advanced analytics.
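To make that concrete, here is a minimal, hypothetical sketch of a composite engagement metric that combines several digital behaviors and rewards consistency over time. The behaviors, weights and scoring rule are assumptions for illustration, not part of Target or any Adobe product.

```python
from dataclasses import dataclass

@dataclass
class WeeklyActivity:
    """Hypothetical per-week behavior counts for one visitor."""
    page_views: int
    downloads: int
    video_plays: int

def engagement_score(weeks: list[WeeklyActivity]) -> float:
    """Combine several behaviors into one score, weighted by consistency.

    Each week's raw score is a weighted sum of behaviors (weights are
    illustrative). The consistency factor is the share of weeks with any
    activity, so sporadic bursts score lower than steady engagement.
    """
    if not weeks:
        return 0.0
    weekly = [1.0 * w.page_views + 3.0 * w.downloads + 2.0 * w.video_plays
              for w in weeks]
    consistency = sum(1 for s in weekly if s > 0) / len(weekly)
    return consistency * sum(weekly) / len(weekly)

# Example: a visitor active in 3 of 4 weeks
history = [WeeklyActivity(5, 1, 0), WeeklyActivity(2, 0, 1),
           WeeklyActivity(0, 0, 0), WeeklyActivity(4, 0, 2)]
print(round(engagement_score(history), 2))  # 3.75
```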
Understanding Customer Behavior
The “ah-ha” moment for analytics providers like Adobe is recognizing that A/B testing flourishes when it accounts for a series of digital behaviors.
A/B testing is meant to highlight when one webpage element yields a higher volume of response than another. It is meant to represent a preference for one element over another, say one button or call-to-action phrase over an alternative.
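To ground that idea, the sketch below compares two hypothetical variants with a standard two-proportion z-test. The counts are invented and the test is a generic statistical approach, not a description of how Target itself picks a winner.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical A/B counts: visitors shown each variant and how many clicked
visitors_a, clicks_a = 10_000, 320   # variant A: 3.20%
visitors_b, clicks_b = 10_000, 380   # variant B: 3.80%

p_a = clicks_a / visitors_a
p_b = clicks_b / visitors_b
p_pool = (clicks_a + clicks_b) / (visitors_a + visitors_b)

# Standard error under the null hypothesis that both rates are equal
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}  p = {p_value:.3f}")
```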
Key aspects of machine learning rely on large datasets created from real-time events. Machine learning-influenced decisions usually come from comparing a sample set against a larger dataset. That scenario is a natural match for A/B testing.
Generating large datasets from A/B test results can help ensure the right sample size and increase accuracy, because a large volume of observations is available.
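As a rough illustration of why observation volume matters, the standard normal-approximation formula for comparing two proportions estimates how many visitors each variant needs before a small lift becomes detectable. The baseline rate, lift, significance level and power below are assumptions chosen for the example.

```python
from math import ceil, sqrt
from statistics import NormalDist

def visitors_per_variant(base_rate: float, lift: float,
                         alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate visitors needed per variant to detect an absolute lift.

    Uses the standard normal-approximation sample-size formula for
    comparing two proportions; the inputs are illustrative, not Adobe
    guidance.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p1, p2 = base_rate, base_rate + lift
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / lift ** 2)

# Detecting a 0.5-point lift on a 3% baseline takes ~20,000 visitors per variant
print(visitors_per_variant(0.03, 0.005))
```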
The data ultimately plays into establishing cohesive real-time analysis, a reporting strength enterprise professionals have long attributed to Adobe. The personalization engine in Target will go a long way toward reinforcing that perception.
What Do Customers Want?
Cohesive analysis can also go a long way for ensuring a cohesive customer experience, which customers are increasingly demanding.
Adobe Senior Product Marketing Manager Jason Hickey pointed in a blog post to consumers’ “sheer intolerance for distracting experiences and irrelevant content.”
Hickey also cited data from a Digital Synopsis infographic that summarized the cost of those distracting experiences and irrelevant content: “Due to not-so-user-friendly designs and user experiences (UXs), two in five site visitors don’t complete even the most basic tasks.”
Overall, Adobe has made a significant introduction that keeps it uniquely competitive in the analytics space.
Focusing on A/B testing is a smart development on Adobe’s part, given the growing expectation that customer experience will drive sales in the years to come.