eBook - ePub
Think Like a UX Researcher
How to Observe Users, Influence Design, and Shape Business Strategy
David Travis, Philip Hodgson
- 294 pages
- English
About This Book
Think Like a UX Researcher will challenge your preconceptions about user experience (UX) research and encourage you to think beyond the obvious. You'll discover how to plan and conduct UX research, analyze data, persuade teams to take action on the results and build a career in UX. The book will help you take a more strategic view of product design so you can focus on optimizing the user's experience. UX Researchers, Designers, Project Managers, Scrum Masters, Business Analysts and Marketing Managers will find tools, inspiration and ideas to rejuvenate their thinking, inspire their team and improve their craft.
Key Features
- A dive-in-anywhere book that offers practical advice and topical examples.
- Thought triggers, exercises and scenarios to test your knowledge of UX research.
- Workshop ideas to build a development team's UX maturity.
- War stories from seasoned researchers to show you how UX research methods can be tailored to your own organization.
1
Setting the Stage
The Seven Deadly Sins of UX Research
Most companies would claim to design products and services that are simple to use. But when you ask customers to actually use these products and services, they often find them far from simple. Why is there a disconnect between what organizations think of as "simple" and what users actually experience?
It's fashionable to blame poor usability on firms not doing enough user research. On the face of it, this seems like the obvious cause of poor usability. If firms only did the research, they would realize their product was a dud. But, like most obvious reasons, it's wrong.
In reality, there's never been a better time to be a purveyor of UX research tools. Every organization seems to want to "take the temperature" of their customers. Take a quick look in your email junk folder at the number of times you've been asked to complete a survey over the last month. If it's like ours, it will number in the double digits.
The problem isn't with the quantity of UX research. It's with the quality: organizations struggle to distinguish good UX research from bad UX research.
Here are seven examples of poor UX research practice that we've come across in our work with clients, along with some ideas on how to fix them:
- Credulity
- Dogmatism
- Bias
- Obscurantism
- Laziness
- Vagueness
- Hubris
Credulity
The dictionary defines credulity as a state of willingness to believe something without proper proof. The form this takes in UX research is asking users what they want (and believing the answer).
A couple of months ago, David was attending a usability study on behalf of a client. He was there because the client thought that the usability tests they were running were not delivering much predictive value. The client was concerned they weren't recruiting the right kind of people or maybe the analysis wasn't right.
As David sat in the observation room, he watched the administrator show three alternative designs of a user interface to the participant and ask: "Which of these three do you prefer? Why?"
Asking people what they want is very tempting. It has obvious face validity. It seems to make sense.
But it's also wrong.
Hereâs why. Over 40 years ago,1 psychologists Richard Nisbett and Timothy Wilson carried out some research outside a bargain store in Ann Arbor, Michigan.
The researchers set up a table outside the store with a sign that read, "Consumer Evaluation Survey: Which is the best quality?" On the table were four pairs of ladies' stockings, labelled A, B, C and D from left to right.
Passers-by were asked to judge which pair was the best quality. Most people (40%) preferred D, and fewest people (12%) preferred A.
On the face of it, this is just like the usability test David observed.
But there's a twist. All the pairs of stockings were identical. The reason most people preferred D was simply a position effect: The researchers knew that people show a marked preference for items on the right side of a display.
But when the researchers asked people why they preferred the stockings that they chose, no one pointed to the position effect. People said their chosen pair had a superior denier, or more sheerness or elasticity. The researchers even asked people if they might have been influenced by the order of the items, but of course people looked at the researchers like they were crazy. Instead, people confabulated: they made up plausible reasons for their choice.
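The position effect can be sketched as a quick simulation. The choice weights below are hypothetical, chosen only to mimic the reported pattern (they are not data from the original study): all four items are identical, but the choice rule favours positions further to the right.

```python
import random

random.seed(0)  # make the illustration reproducible

# Four identical pairs of stockings; only their position differs.
positions = ["A", "B", "C", "D"]

# Hypothetical weights mimicking a rightward position bias.
weights = [0.12, 0.17, 0.31, 0.40]

counts = {p: 0 for p in positions}
for _ in range(10_000):
    choice = random.choices(positions, weights=weights)[0]
    counts[choice] += 1

# The rightmost item "wins" even though every item is identical,
# and no simulated shopper could tell you why.
print(counts)
```

Ask each simulated shopper to explain their choice and, like the real ones, they would have nothing true to point to: the bias lives in the procedure, not in the stockings.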
There's an invisible thread joining the study by Nisbett and Wilson and the usability test we've just described. The reason we call the thread "invisible" is because few UX researchers seem to be aware of it, despite the fact that there's a whole sub-discipline of psychology called Prospect Theory2 devoted to it, and that Daniel Kahneman won a Nobel prize for exploring the effect.
People don't have reliable insight into their mental processes, so there is no point asking them what they want.
This quotation from Rob Fitzpatrick3 captures it perfectly: "Trying to learn from customer conversations is like excavating a delicate archaeological site. The truth is down there somewhere, but it's fragile. While each blow with your shovel gets you closer to the truth, you're liable to smash it into a million little pieces if you use too blunt an instrument."
How can we overcome this problem?
Our definition of a successful UX research study is one that gives us actionable and testable insights into users' needs. It's no good asking people what they like or dislike, asking them to predict what they would do in the future, or asking them to tell us what other people might do.
The best way of gaining actionable and testable insights is not to ask, but to observe. Your aim is to observe for long enough that you can make a decent guess about what's going on. Asking direct questions will encourage people to make things up, not tell you what is actually going on.
There are two ways to observe. We can observe how people solve the problem now. Or we can teleport people to a possible future and get them using our solution (a prototype) to see where the issues will arise.
The key point is: What people say is not as useful as what people do, because people are unreliable witnesses.
Dogmatism
Dogmatism is the tendency to lay down principles as undeniably true, without consideration of evidence or the opinions of others. The form this takes in UX research is believing there is one "right" way to do research.
We're sure you've worked with people who think that a survey is "the right way" to understand user needs. Perhaps because we hear about surveys every day in the news, people tend to think of them as being more reliable or useful. The notion of using an alternative method, like a field visit or a user interview, doesn't have the same face validity because the sample size is comparatively small.
But sadly, having a large number of respondents in a survey will never help you if you don't know the right questions to ask. That's where field visits and user interviews come in.
Field visits and user interviews are a great way to get insights into your users' needs, goals and behaviors. But these aren't the only solution either.
Recently, we worked with a UX researcher who seemed to think there was no room for any research method other than user interviews. To validate personas, run more user interviews. To identify your top tasks, run more user interviews. To compare two alternative landing pages, run more user interviews.
This kind of dogmatism is unhelpful.
Field visits and user interviews give you signposts, not definitive answers. It's broad-brush stuff, a bit like the weather forecast. There may be some patterns in the data, but these aren't as useful as the conversation you have with users and the things you observe them do. It's those conversations that help you identify the gap between what people say and what they do, and that is often a design opportunity.
But there comes a point when you need to validate your findings from field visits and user interviews by triangulation: the combination of methodologies in the study of the same phenomenon. Quantitative data tell us what people are doing. Qualitative data tell us why people are doing it. The best kind of research combines the two kinds of data. For example, you might choose a survey to validate personas you've developed through field visits. Or you might choose multivariate A/B testing to fine tune a landing page that you've developed by usability testing.
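As a concrete sketch of the quantitative side of triangulation, here is a minimal two-proportion z-test of the kind you might run on A/B data for a landing page. The function name and the sample figures are our own illustration, not from the book:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of variants A and B.

    Returns the z statistic and a two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF (expressed with erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative numbers: 100/1000 conversions on page A, 150/1000 on page B.
z, p = two_proportion_z(100, 1000, 150, 1000)
```

A small p-value tells you the two pages perform differently, but not why; that "why" is exactly what the qualitative methods in this chapter supply.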
Triangulation is like having different camera angles in a movie. It would be hard to understand the full picture of what is going on in a movie if every frame was shot as a close-up. Similarly, it would be difficult to empathize with the characters if every image was shot as a wide angle view. Like movies, you want your research to show the close-ups but you also want to see the bigger picture.
Bias
Bias means a special influence that sways one's thinking, especially in a way considered to be unfair.
UX research is a continual fight against bias. There are a handful of different kinds of bias that matter in UX research, but it's response bias we want to discuss here. This is caused by the way in which you collect data.
Sometimes the bias is obvious. For example, if you ask poor questions you're likely to get participants to tell you what you want to hear. You can correct this bias by teaching people to ask the right questions. But there's an even more pernicious type of response bias that's much harder to correct. This happens when the development team carries out the research and finds that people don't really have a need for the product or service. It's tempting to hide this from senior managers because no one wants to be the purveyor of bad news. But if there's no need for your product, there's no point trying to convince senior managers that there is; you'll be found out in the end. It's a bad idea to cherry-pick the results to support what a senior manager wants to hear.
You shouldn't approach interviews with a vested interest: The UX researcher's job isn't to convince people to use a service, or to get the results management want; it's about digging for the truth. This doesn't mean you shouldn't have a point of view. You should. Your point of view should be to help the development team understand the data, not just tell the development team what they want to hear.
Obscurantism
Obscurantism is the practice of deliberately preventing the full details of something from becoming known. The form this sin takes in UX research is keeping the findings in the head of one person.
UX research is often assigned to a single person on a team. That person becomes the spokesperson for user needs, the team's "expert" on users. This approach is a poor way to do research, and not just because the UX researcher doesn't know all the answers. The reason it fails is that it encourages the development team to delegate all responsibility for understanding users to one person.
One way you can prevent this sin on your own project is to encourage everyone on the team to get their "exposure hours." Research4 shows that the most effective development teams spend at least two hours every six weeks observing users (for example, in field visits or usability tests).
What you're aiming for here is building a user-centered culture. You do that by encouraging the whole development team to engage with users. But you also need to design iteratively. And that takes us to our next sin.
Laziness
Laziness is the state of being unwilling to exert oneself. The form this takes in UX research is recycling old research data as if it's boilerplate that can be cut and pasted into a new project.
Our favorite example of this comes from the world of personas.
We find that clients often approach the process of developing personas as a one-time activity. They will hire an outside firm to do field research with the requisite number of users. That firm will analyze the data and create a set of beautifully presented personas. Now we already know this is a bad idea because of the sin of Obscurantism. We want the development team doing the research, not an external firm.
But letâs ignore that issue for a moment. The reason weâre using personas as an example here is because we are often asked by a client if they can re-use their personas. They are now working on a new project, which has a passing resemblance to one on which they developed personas last year. Since their cus...