Bad User Research Advice: How to (Delicately) Deal with Unhelpful Feedback
Here are a few ways to redirect and reassure when stakeholder feedback feels off-base.
User research is a team effort. You need buy-in from others to conduct effective research, which means you need people on your side. Product managers, designers, developers, and even executive-level colleagues need to align on the importance of user research.
With this alignment comes education. You need to teach them the value user research can bring to a team and organization. Often, this means...
- Bringing colleagues through the entire journey of user research
- Showing them how research can have a positive impact
- Including them in defining the problem, talking to users, and synthesizing data into meaningful insights
However, including others in your craft can sometimes lead to challenging scenarios. When someone goes through a process with you several times, they form opinions about how to approach certain situations. At times, those opinions run counter to the best practices you have learned as a user researcher.
When you get bad advice on your craft, how do you redirect colleagues?
What do I mean by bad advice?
People mean well. More often than not, disagreements are due to misalignment, not malice.
When it comes to user research, we constantly teach colleagues about impact, value, reliable and valid studies, and everything in between. We also teach teammates how to conduct interviews and usability tests. But this is where we need to be careful.
Our colleagues are experts in their own roles, such as product management and design, but it can be difficult for them to remember that they are not users or research experts. When they forget that fact, you can receive a lot of bad advice.
In the past, I thought the struggle would be over once I got buy-in, but I came to realize how critical it was to manage expectations, too. These are a few instances in particular where I have struggled with bad advice:
- Colleagues believing they are the user (especially if they are!) and already knowing what users want
- Non-experts advising me on how to do user research
- Others determining who we should recruit and talk to based only on quantitative/demographic data
One of the most difficult journeys to traverse with stakeholders is dealing with this bad advice on doing research. You don't want to alienate them from the research, but there are shortcomings to their suggestions.
A real-world example of bad advice
I was preparing for a study on a new feature the team wanted to launch, and I was writing the interview guide when I ran into an issue. I love to share work in progress with whomever I am working with to get feedback, so I shared the interview guide before a meeting.
During the meeting, my colleague insisted that we ask users, "Would you use this feature?" The person mentioned that this question would be critical to whether or not we launch the feature. If we couldn't answer this question by the end of the study, we would not be able to indicate if this feature would succeed or fail.
I sat back and took a deep breath. I had spoken about the danger of using future-based questions as reliable indicators of usage or purchasing.
I reiterated that predictions about the future can never be a reliable indicator of actual behavior. We can't simply ask people what they will do because, as humans, we are notoriously bad at predicting the future. We don't know whether we will purchase or use something until we do. I explained that we should look at recent past behavior instead.
Despite my efforts, the colleague continued to insist that we had to ask this question. I felt immense pressure to include it, and felt that I had failed to properly educate others in best practices. I knew we shouldn't be asking this question, and I didn't want the team to latch on to the answer, but I couldn't see a tactful way out of the situation.
I ended up asking the question, and the predicted nightmare came true. The team focused so much on the answer to that question that they couldn't see the other evidence refuting that data point.
Three common instances of bad advice
Since then, I have encountered the above problems so often that I no longer even consider them issues. I have learned how to navigate these tricky situations and make sure the best research and data get through.
1. Assumptions versus reality
We can hold assumptions about people that we sincerely believe are correct. These assumptions can blind us to reality. It is tough to tell colleagues they are "wrong" and are assuming way too much. Instead of going that route, I have learned to leverage the assumptions.
I use the assumptions as a starting point by holding an assumptions workshop. During this workshop, colleagues write down all the assumptions and hypotheses they have. Then I ask, "How do you know for sure this is true?" and, "How much money would you bet on this being true?"
After this exercise, we take the list of assumptions that survived those questions and look to validate or disprove them in research, so we know we are making the most thoughtful decisions about our product and strategic roadmap.
2. Non-expert advice
I now have a go-to example whenever I get bad advice about which questions to ask users, specifically the future-based question. I always ask, "Why is the fitness industry (specifically gym memberships) a billion-dollar industry that never seems to fail?" I explain that it is because humans cannot accurately predict their own behavior.
We sign up for the gym and pay a lot of money. We may want to go, but many of us never do, yet we keep paying for the membership. We have NO idea what we will or will not do in the future.
I also point to other instances where companies asked users if they would use a product or feature, spent the time building it, and then saw no revenue or usage. Consider Google Plus. Google asked people if they wanted to connect with others through Google, via a social network similar to LinkedIn or Facebook. The resounding answer was, "Yes, of course!"
Very few people ended up using Google Plus. The team listened too closely to the future-based answer rather than to how people already behaved on existing social networks.
3. Recruiting based only on demographics
We can get stuck using only demographics to recruit people, which means potentially getting the wrong participants. When we focus too much on gender, age, location, income, marital status, education level, and other standard demographics, we lose a critical understanding of the participant.
While it is always good to include these demographics (and sometimes necessary, depending on your product), other information is needed to recruit the right people. To illustrate the importance of having other screener questions, I ask, "When was the last time you opened an app, used a website, or purchased something because of your education level, gender, or age?"
Again, sometimes these demographics are necessary, and people do make decisions because of them, but they are usually just one layer. If you want participants who can speak to specific experiences and give you rich qualitative data, you can't rely on demographics alone.
To combat this, I share a scenario in which I messed up recruitment by focusing too much on demographics. I wanted to speak with people who had purchased a luxury trip in the past, so I focused on people who were currently working and had a high income.
Sure, some of those people fell into the bucket of expensive-trip purchasers, but not all of them. A few chose to spend their money on other things, or disliked travel because they already traveled frequently for work. This example illustrates the importance of asking both demographic and past-behavior-based screener questions.
A final note: Trust your expertise
Remember that you are the user researcher and the expert on user research! Some of these instances are the equivalent of approaching a product manager and telling them how you believe they should run certain Scrum ceremonies or roadmap planning, based on your opinions rather than relevant experience.
You can use your authority on the topic because, ultimately, your recommendation should be the final call. We learn and train ourselves in this science-based approach to user research so we can make that call (while still taking constructive feedback).
Nikki Anderson-Stanier is the founder of User Research Academy and a qualitative researcher with 9 years in the field. She loves solving human problems and petting all the dogs.
To get even more UXR nuggets, check out her user research membership, follow her on LinkedIn, or subscribe to her Substack.