10 Cognitive Biases Anyone Doing UX Research Should Know
Understanding and mitigating biases in user research is crucial. Here's how you can get started.
I had a childhood memory of lightning striking the chimney of our house, setting off the fire alarm and prompting a visit from the local fire department.
I told people this story because it was a crazy memory, one that started a lifelong fear of thunderstorms. Years ago, I was telling it to a friend who had come over for dinner at that same house. Both of my parents turned to me and said, "Nikki, that never happened."
There was, in fact, a thunderstorm, and lightning did strike a neighbor's house. But a fire never happened. A fire alarm never went off. And the fire department never showed up.
What happened? Why did I have such a vivid memory from my childhood, only to find out the event never occurred?
The answer: cognitive bias.
What is cognitive bias?
In the story above, I had created a false memory—a common form of cognitive bias.
A cognitive bias is a systematic error in judgment that impacts the way you make decisions and analyze situations. These biases can be illogical and, occasionally, dangerous.
Thinking is hard. Biases make it even harder by distorting our thoughts and actions. We can invent trends or patterns, misinterpret information, imagine memories that never actually happened, and make any number of other mistakes.
These biases are prevalent in society and in our day-to-day lives. The most intriguing aspect of cognitive biases is that they are generally unconscious.
Now, my false memory ended up being a mind-blowing experience that ended in some laughs. But what happens when bias creeps into your user research? We are always asking people to recall events or to create meaning from something. And there are 175 cognitive biases out there. With those 175 in mind, what can we do?
There are two ways we can mitigate cognitive biases:
- Become robots
- Become acutely aware of them
We need to be super aware of the different biases that can impact our user research, and have tools to combat these in the best way possible. We won't be perfect (again, we aren't robots), but at least we are trying to bring the unconscious into the conscious.
I will focus on the top ten biases we see in user research and give examples of how best to avoid them.
Participant biases
These are biases from a participant's point of view, and they are profoundly unconscious. Participants won't know they are exhibiting these behaviors, which makes the biases harder for researchers to control. However, there are ways to make your participants feel as comfortable as possible and to ask the right questions. For participant biases, we need to pay close attention to how we say things and what actions we take.
Framing bias
Definition: One of the most significant biases in decision-making. People make a decision or form an opinion based on the way information is presented (e.g., positively or negatively), rather than on the facts themselves. The same question, rephrased, can lead to two different outcomes.
Real-life example: Say we have a friend, Suzy. She is very particular about how her apartment and workspace look. You could say she is a neat freak. Or, you could say she is very tidy and pays close attention to detail. Both phrases comment on Suzy's behavior, but the framing is very different.
Biased user research example: "What did you like about Netflix?" or "What did you dislike about Netflix?" These questions focus on either positives or negatives and might cause people to represent their thoughts falsely.
Rewrite: "Walk me through the last time you used Netflix" or "How did you feel when you last used Netflix?"
How to avoid this:
- Start by writing the biased questions (it is okay to get them out of our brains!) and then rewrite them in neutral language
- Use open-ended questions
- Use the participant's exact language
- Summarize what they said to make sure you understand it correctly
- Let the participant fully explain their point of view before asking more questions
Hindsight bias
Definition: When people overestimate how well they could have predicted an outcome once it has already occurred. This bias makes people believe they can confidently predict future events, or how they might behave in the future. It is otherwise known as the "I knew it all along" bias.
Real-life example: You are cleaning your apartment, and you put a glass near the edge of a counter. You bump the glass slightly, and it falls, shattering into a million (annoying) shards of glass on the floor. You quite literally say to yourself: "Why did I put that there? I knew that was going to happen!"
Biased user research example: "How did you know your plant e-commerce store was going to succeed with your users?" The person could never know that something may or may not succeed, so this question puts them in a hindsight bias mindset
Rewrite: "Why did you decide to open a plant e-commerce store?" or "How did you choose your target market?"
How to avoid this:
- Always ask questions about objective past behavior
- Ask about the current process people are going through, instead of having them predict behavior
- Get users to recall recent, concrete memories
Social desirability
Definition: As humans, we want to do and say things in a way that makes us look good around other people, even if that means we are lying. We can over-report "good" behavior (or what we believe are desired responses) and under-report "bad" behavior (what we think are less desired responses).
Real-life example: You bought a gym membership about six months ago to get healthier. The gym recently contacted you to fill out a survey asking, "How often would you like to go to the gym?" Due to social pressures, you may end up answering untruthfully, such as inflating the number of times you went to the gym, or the number of times you "plan to."
Biased user research example: "How often do you binge-watch Netflix episodes?" People may feel like binge-watching is an undesirable trait, and would possibly avoid truthfully answering that question
Rewrite: "Last time you watched Netflix, how many episodes did you watch?" or "When was the last time you heard a friend talking about 'binge-watching' Netflix?" -> "We all do it! When was the last time you watched X number of episodes?"
How to avoid this:
- Make it clear you did not design what you are testing, so no feelings will be hurt by their feedback
- Repeat the need for constructive criticism to improve the product for others
- Reword "less desirable" phrasing
- Give an option for anonymous data collection
- Ask what they would do to improve it for their parents, friends, or other users
- Avoid focus groups
Serial position
Definition: Due to the way our memory works, people tend to remember or choose the items at the beginning or end of a list.
Real-life example: Try to recall a name that starts with the letter "M," "N," or "O," letters from the middle of the alphabet. Unless your name begins with one of these letters, such names are much more difficult to bring to mind. Another example: if someone asks you to remember a list of six words, the middle two will be the hardest to recall.
Biased user research example: In a card sorting exercise meant to show how users group particular information, presenting the cards in the same order to every participant. Participants may ignore, or assign less importance to, the elements in the middle.
Rewrite: Reshuffle the deck of cards before each participant so the cards are never in the same order.
How to avoid this:
- Vary the order of survey question responses for each participant (see the sketch after this list)
- Alternate the order in which you test concepts or prototypes
- Don't test every prototype at once. Focus on 1-2 ideas per participant.
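If you assemble your study materials with a script, this kind of counterbalancing is easy to automate. Below is a minimal Python sketch (the card labels and participant IDs are made up for illustration) that shuffles the stimulus order once per participant, seeded by their ID so each ordering is different but reproducible:

```python
import random

def order_for(participant_id, items):
    """Return a participant-specific ordering of the stimuli.

    Seeding the generator with the participant ID makes each
    shuffle reproducible, so you can reconstruct exactly what
    any participant saw during their session.
    """
    rng = random.Random(participant_id)  # per-participant seed
    shuffled = list(items)               # copy; don't mutate the original
    rng.shuffle(shuffled)
    return shuffled

# Hypothetical card-sort labels and participant IDs
cards = ["Billing", "Profile", "Watchlist", "Downloads", "Settings"]
for pid in ["P01", "P02", "P03"]:
    print(pid, order_for(pid, cards))
```

For small studies, you could also counterbalance systematically (for example, rotating the starting position for each participant) instead of shuffling randomly; the point is simply that no two participants should see an identical order.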
Illusion of transparency
Definition: We overestimate the extent to which others know what we are thinking or trying to convey. We believe people can understand our personal mental state more than they actually can.
Real-life example: You are playing a game of taboo or charades with a group of friends. You believe the way you are describing the word is so apparent that the others must know what you are talking about. Another example is assuming an audience can tell how nervous you are during a presentation. You have an exaggerated sense of how apparent your nervousness is.
Biased user research example: A participant briefly explains, "I couldn't easily find how to reset my password on Netflix." You respond with, "Oh, that must have been annoying, right?"
- You are jumping to a conclusion about what that person felt without allowing them to explain further. The participant might think it is obvious you know that the experience was frustrating, upsetting, or annoying, but give them the space to tell you this on their own.
Rewrite: A participant briefly explains, "I couldn't easily find how to reset my password on Netflix." You respond with, "How did that feel?" The participant says, "It was annoying." You return with, "I'm sure that was annoying. What exactly was annoying about that experience?"
How to avoid this:
- Use the participant's exact language
- Be aware of body language
- Ask them to describe what they mean by a word or phrase
- Reiterate what they said and ask if you are interpreting it correctly
Nikki Anderson-Stanier is the founder of User Research Academy and a qualitative researcher with 9 years in the field. She loves solving human problems and petting all the dogs.
To get even more UXR nuggets, check out her user research membership, follow her on LinkedIn, or subscribe to her Substack.