Triangulate for better user research insights

January 4th, 2021 Written by Garrett Stettler

Designing innovative, useful products and services requires understanding people–specifically, what motivates them, what they’re trying to do and whether they can use what you’ve designed. From an agile perspective, continuously learning about users helps you reduce risk and adapt quickly to changes.

This sounds straightforward enough: the more we know about our users, the better we can design products and services to meet their needs. So what’s the catch? Why do so many teams, despite their best efforts, end up building something that doesn’t work for the people who use it? There are quite a few catches, unfortunately.

People are hard to understand

First, people aren’t easy to understand. Behaviours that seem obvious on the surface turn out to be complex and nuanced upon closer examination, hence the many disciplines in the social and behavioural sciences dedicated to figuring out how and why humans behave as we do.

Time and money are limited

For an agile product or service team working under tight constraints, then, adhering to the maxim ‘know your users’ can be daunting. You can never know everything about your users, but you need to know enough to design for them, and you need to do so with limited time and money.

Incomplete information and bias can mislead

Readers familiar with Lean Startup will have heard about ‘getting out of the building and talking to users’. That’s a good step, but given how complex people’s behaviour can be, is it enough to just talk to people? The short answer is no. Taking people’s statements at face value is risky because they usually present only part of the full picture. Plus, assumptions and bias will naturally creep into any attempt to learn about people, threatening to lead you down the wrong path. To avoid these risks, you need a systematic way to learn about people.

A pragmatic approach

At first glance, this seems unachievable: understanding people in an unbiased way requires a systematic approach, which seems to directly conflict with the fact that time and money are limited.

It can be a tightrope walk, but one that user researchers and designers have learnt to navigate. Below I describe an approach based on best practices for data-informed design that we have used with Infinity Works clients. I also discuss the trends in data-informed and data-driven design that we expect to accelerate in the future.

A design process is a series of decisions. Decisions should be informed by evidence.

Erika Hall, author of ‘Just Enough Research’

Data-informed design

At Infinity Works we help organisations to use their data strategically and become data-driven. From a design and user research perspective, that means we advocate data-informed design: seeking out multiple sources of data to inform design decisions, and continuously testing and iterating at each stage of design.

There is, however, always the danger of being misled by data, as in the saying ‘rubbish in, rubbish out’. This is particularly true in resource-constrained environments. Without the proper user research approach, you run the risk of answering your questions only partially, wasting time focusing on the wrong thing or drowning in data that you can’t interpret.

Before looking for data, you need to identify your assumptions and frame research questions. From there you can determine the right combination of data to answer those questions, and the most practical way to get that data.

Ask the right questions with an appropriate research approach

Different projects and problem spaces call for different approaches. Design and research goals fall into a few categories:

  • Explore a problem and whether it is worth addressing
  • Understand the people for whom we are designing, learning about what they are doing and what they need
  • Explore how we can best meet people’s needs
  • Assess how well our design is meeting people’s needs

Often, those goals will map to a particular stage in a project or service lifecycle. For example, early discovery research will call for an exploratory approach to help you ‘design the right thing’, whereas iterations to an existing product or service will involve focus and fine-tuning to ensure you’ve ‘designed the thing right’.

Just beware the temptation to rigidly map research goals to specific stages. Agile ways of working may require moving amongst different research goals as new things are learnt. It’s important to continually revisit your research approach to ensure you’re gathering the right data at the right time.

Collect the right data

‘Data’ is often equated with numbers, but understanding people usually requires uncovering context, needs and motivations that simply can’t be found in aggregate numbers. Some questions are best informed by quantitative data; others by qualitative.

‘Quant’

Quantitative data tends to be good at identifying what is happening. It is often used to evaluate the usability of designs using methods like A/B or multivariate testing, web analytics, and unmoderated usability testing. It can, however, also be used to explore problems and generate design ideas. Along with traditional sources like surveys, we can learn about people through ‘big data’ techniques including text mining and predictive analytics.
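To make the A/B testing example concrete, here is a minimal sketch of how a team might check whether a variant’s conversion rate differs significantly from the original. The experiment, conversion counts and sample sizes are invented for illustration; it uses a standard two-proportion z-test rather than any particular analytics product.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing the conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis that the rates are equal
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical checkout-flow experiment: 120/2400 vs 156/2400 conversions
z, p = two_proportion_z_test(120, 2400, 156, 2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A result like this tells you *that* the new design converts better, but not *why*, which is exactly the gap the qualitative methods below are meant to fill.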

‘Qual’

Qualitative data excels at answering why something is happening. It provides insight into what people are thinking and the needs and motivations behind their behaviour.

Qualitative research is often mistakenly equated only with ‘talking to customers’ through interviews. There are, however, a multitude of methods aside from interviews that provide insight about people, including field studies, moderated usability testing, participatory design exercises, and diary studies.

The power of triangulation

All teams – particularly those trying to move quickly – start by working on assumptions. Smart teams recognise this and test their assumptions, starting with the riskiest ones. The trouble with using data to test and validate assumptions is that constraints and confirmation bias can lead people to mistake ‘data’ for ‘truth’.

In the real world of limited time and money, how can teams be sure they’re not being misled by data? As in the parable of the elephant in the dark, one way is triangulation: approaching assumptions from multiple perspectives using different methods and data. Building knowledge is a cumulative effort. Each research method or set of data will have limitations, but when combined will show something closer to the truth.

Ethnographer Tricia Wang shares the example of Nokia in 2009. Using quantitative models built on millions of data points, they failed to understand that low-income consumers in China were ready to pay for more expensive smartphones, a conclusion Wang had reached from her ethnographic study of 100 people.

Conversely, teams that rely only on rich, subjective insights from small samples miss out on benefits of large-sample quantitative research like statistical power and generalisability.

Wang argues that different types of data are needed to give organisations a “complete context of any given situation”.

Mixed methods in design

For years, academics have aimed for that ‘complete context’ by using a research approach known as mixed methods. The idea is to combine different types of methods and data, taking advantage of the strength of each.

Designers have also learnt the value of a mixed method approach. McKinsey’s report on the Business Value of Design noted that “the best results come from constantly blending user research–quantitative (such as conjoint analysis) and qualitative (such as ethnographic interviews)” and further combining that with data from other parts of the organisation.

At Infinity Works we encourage teams to adopt a regular cadence of research that involves a range of methods. We find that a sequential approach, using one method or type of data followed by another, is often the most practical first step.

For example, in an exploratory approach we start with interviews and small-sample prototype testing of ideas for a mobile app. The richness and nuance help us understand what users are thinking and refine our ideas. We then follow up with larger-scale quantitative validation or watch how the design performs in analytics.

In the reverse explanatory approach, we start with the numbers. For example, statistical analysis of trends in how people engage with customer loyalty programmes might be followed up with small-sample observations or interviews to give us more insight into why we see those trends.
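As an illustration of how the quantitative first step might decide who to talk to, here is a minimal sketch that ranks customer segments by their month-on-month change in loyalty redemptions and shortlists the steepest decliners for follow-up interviews. The segments, figures and months are invented for illustration.

```python
from collections import defaultdict

# Hypothetical loyalty-programme data: (customer_segment, month, redemptions)
events = [
    ("frequent-flyers", "2020-11", 340), ("frequent-flyers", "2020-12", 310),
    ("occasional", "2020-11", 180), ("occasional", "2020-12", 95),
    ("new-members", "2020-11", 60), ("new-members", "2020-12", 66),
]

# Aggregate redemptions per segment per month
by_segment = defaultdict(dict)
for segment, month, redemptions in events:
    by_segment[segment][month] = redemptions

# Relative month-on-month change per segment
changes = {
    seg: (months["2020-12"] - months["2020-11"]) / months["2020-11"]
    for seg, months in by_segment.items()
}

# Segments with the steepest decline come first: these are the ones
# we would prioritise for follow-up interviews or observations
shortlist = sorted(changes, key=changes.get)[:2]
print(shortlist)
```

The numbers alone only flag where engagement is falling; the interviews that follow are what reveal why.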

It’s also possible to collect different types of data at the same time, although this can be more resource intensive. Spotify illustrate this well in a case study on ‘simultaneous triangulation’.

No matter the approach, the key is to take action based on findings–and keep learning! We help Infinity Works clients match research and development cadences and establish feedback loops so research can be a continuous activity even in production. This ensures a steady stream of data and insight to drive design decisions.

The future – close ties between data and design

We find mixed methods approaches are most successful when there is close communication and collaboration between user researchers, designers, business analysts and data scientists, amongst others.

Given the value we have found in collaborating across functions in this way, we expect organisations will increasingly start to structure teams to enable it. Indeed, in 2019 McKinsey found 10 to 30 percent performance improvement among companies that tightly interweave data and design. Spotify and Microsoft, for example, have created organisations that integrate user researchers and data scientists to foster closer collaboration. Designers at IDEO have also found success involving data scientists in all stages of their design thinking process.

Conclusion

Even if your organisation doesn’t have oodles of user researchers and data scientists available to pair up, this does not preclude practising and benefitting from data-informed design.

We find mixed methods help us walk that tightrope of understanding people in light of real world constraints. Time and again we’ve seen that even small steps in this direction have a massive payoff.

Instead of mistaking one source of data for the truth, you’ll have a better, more unbiased understanding of the ‘complete context’ of your product or service in people’s lives. This in turn helps you reduce risk, adapt to change, and create a well-loved user experience.