
Faculty Insights

Engineering Improved Performance Through Online Experiments

By Neeraj Arora

November 20, 2019

Neeraj Arora is the Arthur C. Nielsen, Jr. Chair in Marketing Research and Education and a professor in the Department of Marketing at the Wisconsin School of Business. Photo by Paul L. Newby II


Online experimentation, like nearly everything else in the digital world, is experiencing rapid growth. While we shop or consume news online, companies and media outlets like Amazon and ESPN are continually experimenting with different aspects of the user experience, testing how we navigate through their websites and using that information to make their marketing and online presence more impactful. The primary purpose of this research is to design online experiments that perform well across multiple devices, such as laptops and mobile phones.

Experimental design is a fairly specialized area within statistics. It’s an area in which I have tremendous interest and a few published papers. For this study, I was fortunate to pursue that interest again by partnering with co-authors Soheil Sadeghi (PhD ’17), who received his doctorate from the Department of Statistics at the University of Wisconsin–Madison and is now with Microsoft, and Peter Chien of UW–Madison’s Department of Statistics. This paper is based on Soheil’s doctoral dissertation, which Peter and I both supervised. Peter also brought deep experimental design expertise to our study, so it was truly a collaboration across multiple levels and disciplines.

First, as a point of comparison, let’s talk briefly about A/B testing. People who work in digital marketing, analytics, or online retail are quite familiar with A/B testing. You have two versions of a website, A and B. You might swap out one feature of your website, like a call to action button, in Version A, and leave Version B as is. It’s a simple, straightforward method to gauge which version is getting better traffic based on the element you alter.
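To make the comparison concrete, here is a minimal sketch of how an A/B result might be analyzed. The click counts are invented for illustration, not drawn from any real campaign.

```python
from math import sqrt
from statistics import NormalDist

# A minimal sketch with made-up numbers: a two-proportion z-test
# comparing the click-through rates of versions A and B.
def ab_test(clicks_a, n_a, clicks_b, n_b):
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return p_b - p_a, z, p_value

lift, z, p = ab_test(clicks_a=120, n_a=5000, clicks_b=155, n_b=5000)
print(f"lift = {lift:.4f}, z = {z:.2f}, p = {p:.3f}")
```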

Multivariate testing, on the other hand, allows you to change many different elements (e.g., color, font size, type of visual) at the same time. Such testing can be used to enhance consumer engagement with websites, email campaigns, and apps. Instead of simply comparing A and B, you might add C and D as additional factors that you can test in a multivariate online experiment.
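As a rough sketch of what that looks like, the code below enumerates every combination of three hypothetical elements (the element names follow the examples above; the levels are invented). The count of versions grows multiplicatively with each added element, which is exactly why the design question gets interesting.

```python
import itertools

# Illustrative only: the candidate versions in a multivariate test are
# the cross product of all element levels (names and levels hypothetical).
elements = {
    "color": ["blue", "orange"],
    "font_size": ["14px", "18px"],
    "visual": ["photo", "illustration"],
}
versions = [dict(zip(elements, combo))
            for combo in itertools.product(*elements.values())]
print(len(versions), "candidate versions")  # 2 x 2 x 2 = 8
```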

Multivariate experiments in the online space present a new design challenge: the experiment needs to be conducted across multiple platforms, including desktops and smartphones. Such experiments are important because a different set of attribute combinations may be optimal for each platform. For example, multiple images may work best on a desktop while, in contrast, a list of links may be more effective on a smartphone. Although multivariate testing has become quite popular, there is a void in the design literature as to how it can best be applied in a multiplatform context. So my co-authors and I constructed statistical designs that give us the ability to isolate the factors that maximize user engagement for each platform (e.g., laptop, mobile). We introduced a “sliced design” (the term “sliced” originated with Qian and Wu, 2009, in the context of computer experiments) that can perform a multivariate test within each platform.
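The paper develops the formal construction; the snippet below is only a toy illustration of the idea under simplifying assumptions (six two-level factors, two platforms, standard fractional-factorial generators). The overall 16-run design splits into two 8-run slices, and each slice is itself a legitimate fractional factorial, so effects can be estimated within each platform as well as overall.

```python
import itertools

# Toy sliced design, not the paper's construction: 16 runs for six
# two-level factors, split into two 8-run slices (one per platform).
# Each slice is a 2^(6-3) fractional factorial with generators
# D = AB, E = AC, and F = +/-BC, so each platform gets a valid design.
def sliced_design():
    slices = {"desktop": [], "mobile": []}
    for a, b, c in itertools.product([-1, 1], repeat=3):
        d, e = a * b, a * c
        slices["desktop"].append((a, b, c, d, e, b * c))   # slice with F = BC
        slices["mobile"].append((a, b, c, d, e, -b * c))   # slice with F = -BC
    return slices

for platform, runs in sliced_design().items():
    print(platform)
    for run in runs:
        print(" ", run)
```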

Testing Our Model In-House
Once we completed the theoretical work to construct the multivariate, multiplatform statistical designs, we had the opportunity to apply it right here at the Wisconsin School of Business. Thanks to a collaboration with WSB’s alumni relations and web teams, we tested different elements of a mass email distributed to WSB alumni encouraging readership of Update, the School’s print and digital alumni magazine. It was a win-win all around, since the results could help the Update team better serve alumni while also allowing us to test our statistical designs empirically—something unusual in the design field, where many papers remain theoretical.

We identified six design elements of the Update email to test, including full-width banner imagery, a call to action button, and class notes stories (brief updates from alumni). We then created eight different email versions, distributed them equally across eight subsets of the overall audience of 25,000 alumni, and tested them across both laptop and mobile platforms. Working with the web team, we were able to track the traffic and then analyze the data.
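To see why eight versions are enough for six elements, note that a full factorial of six on/off elements would require 2^6 = 64 versions, while a fractional factorial recovers all six main effects from just eight runs. The sketch below is a hedged reconstruction, not necessarily the exact design we used, and the last three element names are placeholders since only three are named above.

```python
import itertools

# A hedged reconstruction (the study's actual design may differ): a
# 2^(6-3) fractional factorial with generators D=AB, E=AC, F=BC packs
# six on/off elements into eight versions (roughly 25,000 / 8, or about
# 3,125 alumni per version). The last three names are placeholders.
ELEMENTS = ["banner", "call_to_action", "class_notes",
            "element_4", "element_5", "element_6"]

for i, (a, b, c) in enumerate(itertools.product([-1, 1], repeat=3), 1):
    row = (a, b, c, a * b, a * c, b * c)
    version = {e: "on" if lvl == 1 else "off" for e, lvl in zip(ELEMENTS, row)}
    print(f"version {i}: {version}")
```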

Study Results
The test yielded several key results for the performance of Update email campaigns. Here are the insights we shared:

Specific features proved to be positive or negative. The class notes and the call to action tested as positives that should be included; other elements tested as negatives that should be avoided.

Be consistent across platforms. Our analysis showed that it’s beneficial to use the same combination of chosen elements for both laptop and mobile platforms.

Implementing all of these changes meant increased traffic. Using our statistical model allowed us to experiment with several “what if” scenarios. Our analysis found that incorporating all of our recommendations would yield a 16 percent increase in traffic on desktop and a 7 percent increase on mobile. As a result, the next issue of the Update magazine implemented the changes to the “class notes story” and “call to action button” that our research identified.
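As an illustration of how such what-if predictions work, the sketch below fits a main-effects model to simulated version-level click data (the coefficients and counts are invented, not the study's estimates) and then predicts traffic when every element is set to its favorable level.

```python
import numpy as np

# Hedged sketch of a "what if" analysis on simulated data (all numbers
# invented, not the study's). Fit a main-effects model of clicks on six
# coded factors, then predict traffic with each factor at its best level.
rng = np.random.default_rng(0)
X = np.array([[a, b, c, a * b, a * c, b * c]
              for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])
y = 100 + X @ np.array([8, -3, 6, 1, -2, 4]) + rng.normal(0, 2, size=8)

design = np.column_stack([np.ones(8), X])          # intercept + 6 factors
beta, *_ = np.linalg.lstsq(design, y, rcond=None)  # least-squares fit
best = np.sign(beta[1:])                           # favorable level per factor
predicted = beta[0] + best @ beta[1:]
print("recommended levels:", best)
print("predicted clicks:", round(float(predicted), 1))
```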

So, what can we learn from these individualized findings? Beyond Update, they tell us that how you design your email is critical for driving traffic.

Our research also suggests a cost savings for firms. In the experimental phase, the versions built for one platform can stay similar to those for other platforms, making the experiment more cost-effective for firms to implement.

Finally, our study contributes to the field of experimental design research. How does one think about the right design for a multi-platform situation? Our study can be viewed as an extension of the existing work in the multivariate design literature for this brave new world that we live in.

Read the paper: “Sliced Designs for Multi-platform Online Experiments,” forthcoming in Technometrics.

Neeraj Arora is the Arthur C. Nielsen, Jr. Chair in Marketing Research and Education and a professor in the Department of Marketing at the Wisconsin School of Business. He is also the executive director of WSB’s A.C. Nielsen Center for Marketing Analytics and Insights.

