
Think Like a UX Researcher

How to Observe Users, Influence Design, and Shape Business Strategy

David Travis, Philip Hodgson

About This Book

Think Like a UX Researcher will challenge your preconceptions about user experience (UX) research and encourage you to think beyond the obvious. You'll discover how to plan and conduct UX research, analyze data, persuade teams to take action on the results and build a career in UX. The book will help you take a more strategic view of product design so you can focus on optimizing the user's experience. UX Researchers, Designers, Project Managers, Scrum Masters, Business Analysts and Marketing Managers will find tools, inspiration and ideas to rejuvenate their thinking, inspire their team and improve their craft.

Key Features

  • A dive-in-anywhere book that offers practical advice and topical examples.
  • Thought triggers, exercises and scenarios to test your knowledge of UX research.
  • Workshop ideas to build a development team's UX maturity.
  • War stories from seasoned researchers to show you how UX research methods can be tailored to your own organization.


Information

Publisher: CRC Press
Year: 2019
ISBN: 9780429774003
1
Setting the Stage
The Seven Deadly Sins of UX Research
Most companies would claim to design products and services that are simple to use. But when you ask customers to actually use these products and services, they often find them far from simple. Why is there a disconnect between what organizations think of as “simple” and what users actually experience?
It’s fashionable to blame poor usability on firms not doing enough user research. On the face of it, this seems like the obvious cause of poor usability. If firms only did the research, they would realize their product was a dud. But, like most obvious reasons, it’s wrong.
In reality, there’s never been a better time to be a purveyor of UX research tools. Every organization seems to want to “take the temperature” of their customers. Take a quick look in your email junk folder at the number of times you’ve been asked to complete a survey over the last month. If it’s like ours, it will number in the double digits.
The problem isn’t with the quantity of UX research. It’s with the quality: organizations struggle to distinguish good UX research from bad UX research.
Here are seven examples of poor UX research practice that we’ve come across in our work with clients—along with some ideas on how to fix them.
• Credulity.
• Dogmatism.
• Bias.
• Obscurantism.
• Laziness.
• Vagueness.
• Hubris.
Credulity
The dictionary defines credulity as a state of willingness to believe something without proper proof. The form this takes in UX research is asking users what they want (and believing the answer).
A couple of months ago, David attended a usability study on behalf of a client. He was there because the client thought that the usability tests they were running were not delivering much predictive value. The client was concerned that they weren't recruiting the right kind of people, or that the analysis wasn't right.
As David sat in the observation room, he watched the administrator show three alternative designs of a user interface to the participant and ask: “Which of these three do you prefer? Why?”
Asking people what they want is very tempting. It has obvious face validity. It seems to make sense.
But it’s also wrong.
Here’s why. Over 40 years ago,1 psychologists Richard Nisbett and Timothy Wilson carried out some research outside a bargain store in Ann Arbor, Michigan.
The researchers set up a table outside the store with a sign that read, “Consumer Evaluation Survey—Which is the best quality?” On the table were four pairs of ladies’ stockings, labelled A, B, C and D from left to right.
Passers-by were asked to judge which pair was the best quality. Most people (40%) preferred D, and the fewest (12%) preferred A.
On the face of it, this is just like the usability test David observed.
But there’s a twist. All the pairs of stockings were identical. The reason most people preferred D was simply a position effect: The researchers knew that people show a marked preference for items on the right side of a display.
But when the researchers asked people why they preferred the stockings that they chose, no one pointed to the position effect. People said their chosen pair had a superior denier, or more sheerness or elasticity. The researchers even asked people whether they might have been influenced by the order of the items, but of course people looked at the researchers like they were crazy. Instead, people confabulated: they made up plausible reasons for their choice.
There’s an invisible thread joining the study by Nisbett and Wilson and the usability test we’ve just described. The reason we call the thread “invisible” is because few UX researchers seem to be aware of it—despite the fact that there’s a whole sub-discipline of psychology called Prospect Theory2 devoted to it—and that Daniel Kahneman won a Nobel prize for exploring the effect.
People don’t have reliable insight into their mental processes, so there is no point asking them what they want.
This quotation from Rob Fitzpatrick3 captures it perfectly: “Trying to learn from customer conversations is like excavating a delicate archaeological site. The truth is down there somewhere, but it’s fragile. While each blow with your shovel gets you closer to the truth, you’re liable to smash it into a million little pieces if you use too blunt an instrument.”
How can we overcome this problem?
Our definition of a successful UX research study is one that gives us actionable and testable insights into users’ needs. It’s no good asking people what they like or dislike, asking them to predict what they would do in the future, or asking them to tell us what other people might do.
The best way of gaining actionable and testable insights is not to ask, but to observe. Your aim is to observe for long enough that you can make a decent guess about what’s going on. Asking direct questions will encourage people to make things up, not tell you what is actually going on.
There are two ways to observe. We can observe how people solve the problem now. Or we can teleport people to a possible future and get them using our solution (a prototype) to see where the issues will arise.
The key point is: What people say is not as useful as what people do, because people are unreliable witnesses.
Dogmatism
Dogmatism is the tendency to lay down principles as undeniably true, without consideration of evidence or the opinions of others. The form this takes in UX research is believing there is one “right” way to do research.
We’re sure you’ve worked with people who think that a survey is “the right way” to understand user needs. Perhaps because we hear about surveys every day in the news, people tend to think of them as being more reliable or useful. The notion of using an alternative method, like a field visit or a user interview, doesn’t have the same face validity because the sample size is comparatively small.
But sadly, having a large number of respondents in a survey will never help you if you don’t know the right questions to ask. That’s where field visits and user interviews come in.
Field visits and user interviews are a great way to get insights into your users’ needs, goals and behaviors. But these aren’t the only solution either.
Recently, we worked with a UX researcher who seemed to think there was no room for any research method other than user interviews. To validate personas, run more user interviews. To identify your top tasks, run more user interviews. To compare two alternative landing pages, run more user interviews.
This kind of dogmatism is unhelpful.
Field visits and user interviews give you signposts, not definitive answers. It’s broad-brush stuff, a bit like the weather forecast. There may be some patterns in the data, but these aren’t as useful as the conversation you have with users and the things you observe them do. It’s those conversations that help you identify the gap between what people say and what they do—and that is often a design opportunity.
But there comes a point when you need to validate your findings from field visits and user interviews by triangulation: the combination of methodologies in the study of the same phenomenon. Quantitative data tell us what people are doing. Qualitative data tell us why people are doing it. The best kind of research combines the two kinds of data. For example, you might choose a survey to validate personas you’ve developed through field visits. Or you might choose multivariate A/B testing to fine tune a landing page that you’ve developed by usability testing.
Triangulation is like having different camera angles in a movie. It would be hard to understand the full picture of what is going on in a movie if every frame was shot as a close-up. Similarly, it would be difficult to empathize with the characters if every image was shot as a wide angle view. Like movies, you want your research to show the close-ups but you also want to see the bigger picture.
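As an aside on the quantitative half of that combination, here is a minimal sketch of how the landing-page comparison mentioned above might be checked with an A/B test: a simple two-proportion z-test on conversion counts. The function name and all of the visitor and conversion numbers are invented for illustration and are not from the book; a real study would also plan sample size in advance and watch for multiple comparisons.

```python
# A minimal sketch (not from the book): the quantitative half of triangulation,
# comparing conversion rates for two landing-page variants from an A/B test
# using a two-proportion z-test. All numbers below are invented for illustration.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z, two-sided p-value) for the difference in conversion rate."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))    # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))              # two-sided p-value
    return z, p_value

# Hypothetical data: variant B is the page redesigned after usability testing.
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=168, n_b=2400)
print(f"A: {120/2400:.1%}  B: {168/2400:.1%}  z = {z:.2f}, p = {p:.4f}")
```

The qualitative findings from usability testing tell you why variant B might work better; a check like this tells you whether the observed difference in the numbers is plausibly more than chance.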
Bias
Bias means a special influence that sways one’s thinking, especially in a way considered to be unfair.
UX research is a continual fight against bias. There are a handful of different kinds of bias that matter in UX research, but it’s response bias we want to discuss here. This is caused by the way in which you collect data.
Sometimes the bias is obvious. For example, if you ask poor questions you’re likely to get participants to tell you what you want to hear. You can correct this bias by teaching people to ask the right questions. But there’s an even more pernicious type of response bias that’s much harder to correct. This happens when the development team carries out the research and finds that people don’t really have a need for the product or service. It’s tempting to hide this from senior managers because no one wants to be the purveyor of bad news. But if there’s no need for your product, there’s no point trying to convince senior managers that there is—you’ll be found out in the end. It’s a bad idea to cherry-pick the results to support what a senior manager wants to hear.
You shouldn’t approach interviews with a vested interest: The UX researcher’s job isn’t to convince people to use a service, or to get the results management want; it’s about digging for the truth. This doesn’t mean you shouldn’t have a point of view. You should. Your point of view should be to help the development team understand the data, not just tell the development team what they want to hear.
Obscurantism
Obscurantism is the practice of deliberately preventing the full details of something from becoming known. The form this sin takes in UX research is keeping the findings in the head of one person.
UX research is often assigned to a single person on a team. That person becomes the spokesperson for user needs, the team’s “expert” on users. This approach is a poor way to do research, and not just because the UX researcher doesn’t know all the answers. The reason it fails is because it encourages the development team to delegate all responsibility for understanding users to one person.
One way you can prevent this sin on your own project is to encourage everyone on the team to get their “exposure hours.” Research4 shows that the most effective development teams spend at least two hours every six weeks observing users (for example, in field visits or usability tests).
What you’re aiming for here is building a user-centered culture. You do that by encouraging the whole development team to engage with users. But you also need to design iteratively. And that takes us to our next sin.
Laziness
Laziness is the state of being unwilling to exert oneself. The form this takes in UX research is recycling old research data as if it’s boilerplate that can be cut and pasted into a new project.
Our favorite example of this comes from the world of personas.
We find that clients often approach the process of developing personas as a one-time activity. They will hire an outside firm to do field research with the requisite number of users. That firm will analyze the data and create a set of beautifully presented personas. Now we already know this is a bad idea because of the sin of Obscurantism. We want the development team doing the research, not an external firm.
But let’s ignore that issue for a moment. The reason we’re using personas as an example here is because we are often asked by a client if they can re-use their personas. They are now working on a new project, which has a passing resemblance to one on which they developed personas last year. Since their cus...
