From a researcher’s perspective, unsurprisingly, I’ll always encourage companies to invest in detailed user research. There may be an element of bias in that, but I’ve seen first-hand the impact that insights gained from research can have.

This can take the form of direct, measurable increases in engagement, but there are other benefits of research that are harder to quantify.

Measuring the ROI of research

It’s very difficult to tie improvements in website performance directly back to the initial research. By the time a final, improved design is launched, there’s no real way of pinpointing exactly where and how the research shaped it. In a good design process, research insights should underpin almost every design decision, but there’s no accurate way to measure the difference between designs that have been informed by research and designs that have not. You can’t A/B test the impact of research.

If we can’t put a direct financial value on research then how can we prove its value to stakeholders and clients?

Measuring the impact of research-informed design

You can, to some extent, measure the impact of design changes that result from research. My colleague Chris gave some tips for doing just that. As a design transformation agency, our focus is on making impactful change through design.

Some of this can be measured through traditional website analytics, such as improvements in conversion rate. This can be done through A/B testing or by comparing consistent metrics before and after major design changes. It’s a great starting point, as long as you’re careful not to let any bias creep in.
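To illustrate, here’s a minimal sketch (in Python, with entirely hypothetical figures) of the kind of before-and-after comparison you might run on conversion rates, using a simple two-proportion z-test to check whether an observed change is likely to be more than noise:

```python
from math import sqrt
from statistics import NormalDist

def conversion_change(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-proportion z-test: is the change in conversion rate more than noise?"""
    p_a = conversions_a / visitors_a          # rate before (or variant A)
    p_b = conversions_b / visitors_b          # rate after (or variant B)
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-tailed
    return p_a, p_b, p_value

# Hypothetical figures: old design vs research-informed redesign
before, after, p = conversion_change(12000, 540, 11800, 620)
print(f"{before:.2%} -> {after:.2%} (p = {p:.3f})")
```

Even a clearly significant result here only tells you that something changed, not which research insight was responsible for it.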

When it comes to website analytics, we can measure hundreds of different metrics. But these don’t paint the full picture, and they can often be misleading. Without the context behind the numbers, it’s hard to say whether a change in a given metric is positive or negative. Alternative methods of measurement, such as surveys, can be problematic too.

Ultimately though, it’s hard, maybe even impossible, to measure qualitative data in a quantitative way. Most forms of measurement focus on short-term changes, which may have a negative impact in the long run. For example, using deceptive design patterns on your website is likely to increase your conversions in the short term, but it’s also likely to do long-term damage to your reputation. There are ways to measure customer satisfaction, but these tend to be flawed (I’m looking at you, Net Promoter Score). They often only show a negative impact when it’s too late and the damage has already been done.

What we really want to understand is the impact on customer satisfaction. Looking longer term, you may be able to use models like Google’s HEART framework to understand more about the experience your users are having, rather than just the impact on your metrics. This requires research work rather than just analysis of numbers, but it will give a much clearer picture of the user experience.
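To make that concrete, here’s an illustrative sketch of how HEART’s goals–signals–metrics structure might look for a typical e-commerce site. The five dimensions are part of the framework itself; the specific goals, signals and metrics below are hypothetical examples, not a prescribed set:

```python
# Hypothetical HEART mapping for an e-commerce site (examples only)
HEART = {
    "Happiness":    {"goal": "Users find the product satisfying",
                     "signal": "Post-purchase survey responses",
                     "metric": "Average satisfaction score"},
    "Engagement":   {"goal": "Users return and interact regularly",
                     "signal": "Visits per user",
                     "metric": "Weekly sessions per active user"},
    "Adoption":     {"goal": "New users take up key features",
                     "signal": "First-time feature use",
                     "metric": "% of new users completing onboarding"},
    "Retention":    {"goal": "Users keep coming back over time",
                     "signal": "Repeat visits over weeks",
                     "metric": "90-day retention rate"},
    "Task success": {"goal": "Users complete what they came to do",
                     "signal": "Task completion and error rates",
                     "metric": "Checkout completion rate"},
}
```

The point is that the metrics are chosen to reflect user goals, which is exactly the context that raw analytics numbers lack.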

There are other considerations around how people view a brand. Growing awareness of a company’s social and environmental credentials may influence whether people choose to deal with a brand at all, but the influence of decisions like these would be impossible to measure without detailed research.

[Photo: Luke (left) with a client, sitting behind a desk mid-discussion with printed material laid out]

The dangers of forecasting

While it’s hard to assess the impact of a project after it’s launched, it’s even harder to forecast the impact a design change is likely to make. When it comes to potential design changes, companies naturally want to know upfront what return they’re likely to get on their investment. But this goes against the very nature of research.

In the past I’ve been asked, before undertaking research, what the output is likely to be, and even whether it’s worth doing at all. There’s no guarantee of what results research will give you. I’ve never run a study and not found interesting, useful insights, but the whole point of good research is that you don’t know the outcome before you start.

It’s very difficult to measure the financial impact of research after it’s been undertaken, but it’s impossible to forecast that impact ahead of time.

So what is the value of research?

If it’s difficult to measure the impact of design changes, and almost impossible to measure the impact that research has had on those changes, then how do we show the value of research? I’m going to fall back on a quote here, which sums it up:

“If you think good design is expensive, you should see the cost of bad design.”
Dr. Ralf Speth, CEO of Jaguar Land Rover

A lack of user research is an opportunity cost. Designing without a clear idea of who you’re designing for, and what their needs are, won’t lead to a positive outcome. It sounds simple, but research is often overlooked in the rush to push new products live. Research at an early stage of a project may fundamentally shift what ends up being built, or it may just validate some existing assumptions. Either way, there’s no way of knowing the outcome unless you do the research.

User research is also a gift that keeps on giving. With every research project you undertake, you’ll learn more about your users and their needs. As a result, the ongoing design, marketing and business decisions you make will be more likely to land with your users, boosting sales and improving customer satisfaction.

Summary

William Bruce Cameron said, “Not everything that can be counted counts, and not everything that counts can be counted.” Being aware of the limitations of what you can measure is key to avoiding being driven by meaningless metrics that might actually be harming your overall experience. Humans are complex, and we can’t easily be ‘measured’, particularly not with quantitative data alone. Taking time to consider and understand the longer-term impact on behaviour is crucial.

Being informed about your users and their needs is essential for designing the services and products that will give them the best experience. Skipping research leads to expensive guesswork. That’s the value of research.

If you’d like to learn more about measurement in UX, you can listen to our podcast on Measuring Design.