My Team’s Insights Led to a Direct Revenue Increase

“Is Results Plus usage low because customers don’t see it, or because they don’t value it?”


CHALLENGE: 

A premium product with great revenue-generation potential was underutilized. Why?

SOLUTION:

I recommended usability testing with eye tracking, which diagnosed the problem and pointed to the best alternative design.

OUTCOME:

A 10% revenue increase, projected to generate an additional $4M in product revenue annually


Case Study:

Westlaw is a legal research tool that brings together over 40,000 databases of case law, statutes, codes, and other resources. At the time, the primary business model was a subscription service with access to databases critical to a customer’s work (e.g., geographic region, area of practice). A typical subscription might include 2,000 databases. Customers could then access materials outside their subscription for an additional fee, often with premium pricing.

To address the fact that customers were reluctant to initiate searches in databases outside their subscription, West developed a very clever product called “Results Plus,” which returned relevant results from premium databases outside a customer’s subscription and displayed them in the right rail alongside their standard search results. Results Plus had tremendous “win-win” potential, as it exposed customers to valuable materials they might otherwise not have seen while also generating additional revenue. Unfortunately, it was underutilized, with fewer than 25% of customers having tried it.

The user experience team was tasked with getting to the bottom of the product’s disappointing numbers. Our first task was to determine whether usage was low because customers didn’t see the product or because they didn’t value it. We knew the content had real value, so our hunch was that it simply wasn’t being seen.

“Is Results Plus usage low because customers don’t see it, or because they don’t value it?”

I suggested that we conduct an eye-tracking study as a way of definitively answering this question. This was 2007, and eye tracking was just beginning to be explored for usability testing. We partnered with the University of Minnesota’s Eye Tracking lab to study Results Plus. It was exciting to be “early adopters” of these tools, but all the more exciting to have such a perfect match between the new technology and our business need.

Our first study was conducted qualitatively with 15 users. The study showed definitively that the vast majority of users simply never saw the Results Plus content appearing on the page, with only 11% of participants’ gaze plots entering the right rail. Armed with confirmation of our hunch, we moved into the really exciting phase of the project—developing and testing a series of designs aimed at making Results Plus more visible.
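To give a flavor of the analysis behind a figure like “11% of gaze in the right rail,” here is a minimal sketch of an area-of-interest (AOI) calculation. The fixation data structure, coordinates, and AOI boundaries are all hypothetical and are not from the actual study:

```python
# Sketch of an area-of-interest (AOI) gaze analysis.
# All data and screen coordinates below are invented for illustration.

from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # screen x-coordinate in pixels
    y: float          # screen y-coordinate in pixels
    duration_ms: float

def share_of_gaze_in_aoi(fixations, left, top, right, bottom):
    """Fraction of total fixation time spent inside a rectangular AOI
    (e.g., the right rail where Results Plus content appeared)."""
    total = sum(f.duration_ms for f in fixations)
    if total == 0:
        return 0.0
    in_aoi = sum(
        f.duration_ms
        for f in fixations
        if left <= f.x <= right and top <= f.y <= bottom
    )
    return in_aoi / total

# Hypothetical session: most fixations stay in the main results column;
# only one brief glance lands in the right rail (x >= 850 in this layout).
session = [
    Fixation(300, 400, 220), Fixation(320, 600, 180),
    Fixation(310, 800, 250), Fixation(900, 450, 90),
]
share = share_of_gaze_in_aoi(session, left=850, top=0, right=1024, bottom=1000)
```

Aggregating this share across participants is one common way to quantify whether an on-page element is being seen at all.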

Our second eye-tracking study evaluated four distinct designs spanning a range of graphic-design intensities. One went so far as to include a horizontal banner with animation intended to subtly draw the eye to the Results Plus results in the right rail. At the other end of the spectrum, one design was exclusively text based. The entire team tended to prefer the more graphically intensive versions of the design, so we were all extremely curious to observe the eye-tracking sessions.

In a case of science trumping intuition, the results of the eye-tracking study were definitive—this time demonstrating that the text-based version attracted and held participants’ gaze more effectively than its graphical counterparts.

We had our “winning” design—now we had to sell it internally to marketing and product development teams that favored the graphically intensive versions. We felt almost sheepish coming back to announce that our proposed changes to the Results Plus interface were so minimal.

The eye-tracking gaze plots and heat maps were persuasive in a way that no words could have been. After following our usability testing with live split testing of the new design against the existing one, we launched the new design.
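A live split test like this is typically evaluated by comparing conversion rates between the two arms. Below is a minimal sketch using a two-proportion z-test; the traffic and click counts are invented for illustration and do not reflect the actual Westlaw experiment:

```python
# Sketch of evaluating a split test with a two-proportion z-test.
# All counts below are hypothetical, not the real experiment's data.

from math import sqrt, erf

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Return (z statistic, two-sided p-value) for H0: p_a == p_b,
    using the pooled-proportion standard error."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical arms: 10,000 sessions each.
# Control (existing design): 3.0% click-through on premium results.
# Variant (text-based design): 3.6% click-through.
z, p = two_proportion_z(clicks_a=300, n_a=10_000,
                        clicks_b=360, n_b=10_000)
```

With these made-up numbers the lift is statistically significant at the conventional 0.05 level; a real launch decision would also weigh effect size and revenue per click.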

Our efforts were richly rewarded with the news that our “tweaks” had driven a 10% revenue increase, projected at four million dollars in additional revenue annually.