
How to Win Friends and Influence Stakeholders with Qual Research

We gathered four leading UXRs to discuss how they've won hearts, minds, and budgets for qual research in quant-minded orgs.

Words by Tony Ho Tran, Visuals by Emma McKhann

Even as research continues to grow at innovative companies, stakeholders often overvalue quantitative data while diminishing (or ignoring) the impact of qual insights.

Knowing how to champion qual insights is a crucial skill, though that’s easier said than done.

As part of a webinar you can stream on-demand, we assembled a panel of leading UXRs to share how they advocate for qualitative research and ensure that user insights are valued at their companies.

They are:

  • Leah Kaufman, Senior UXR Manager, Lenovo
  • Dan Schantz, UX Research Manager, BB&T Bank
  • Jeanette Fuccella, Principal UX Researcher, LexisNexis
  • Sarah Kennedy, Senior UX Researcher, AnswerLab

Here’s what they had to say:

Marrying two data streams

Our panel knows that to be a good UXR, you need to be a good mixed methodologist. It makes for better design decisions and, ultimately, better business decisions.

But striking a balance between quant and qual can be difficult—especially when you’re on a smaller team (or you are the team). That means working with a more limited budget and resources.

Leah (Lenovo): When I started at Lenovo five years ago, I was the first researcher that the web team had hired. I had a really small team that I worked with that had a couple of designers, a couple project managers, and that was it.

Before six months had gone by, though, they totally revamped the entire team and literally airlifted a big group of managers and designers from the North America merchandising team into the global web team. So all of a sudden we had this big explosion in the amount of work and responsibility the team had, which meant that I went from doing mostly qualitative work to a combination of qualitative voice of the customer and quantitative research.

This is a company that's built by engineers. Numbers absolutely rule and get the respect. So it's been a fascinating journey figuring out how to get data that is qualitative but also supports web analytics, optimization, voice of the customer, and large-scale survey data.

We’ve moved from doing just a single usability study to putting together a description of the issues based upon multiple data sources, both qualitative and quantitative.

Jeanette (LexisNexis): I was fortunate to come into the organization at a time when there was a lot of reorganization going on. So UX was placed in a global strategy segment of the organization.

Our sister group is a data science group, so we actually report up through the same chain of management. We have data analysts and data scientists who are assigned specifically to work with us as UX researchers.

That has just set the tone in the organization that both types of data are necessary for decision making. They sit side-by-side and they work best when used in conjunction with one another.

We really benefited from that, and I would strongly encourage that pairing wherever it's possible. It's a great place to start if you're a less UX-mature organization.

Dan (BB&T Bank): Sitting on our teams, we have UX designers. The UX designers are the primary requesters of research. They'll come to us with different problems and questions. If it's a "why" question, then I'm not necessarily going to design a quant study. I'm going to go get some qualitative feedback. I want those deep, rich insights.

We're constantly looking at whether each question is quant or qual, but also weighing it against their design timeline. So the structure of the team is very agile. We're super fast.

I came in at a time when the company was starting to figure out how to support the speed for this team. They'd say, "Here's the problem, we need an answer in two weeks. Can you go do a qual study?" Then we’d need to figure out how on earth we could do all the recruiting and the mod guide, get the responses back, and try to help them.

What we wound up doing was building their needs into our cost access flow. We’d do all the "upfront stuff": all the recruiting, getting access to the tools we needed. That way, when they come to us with questions, we can say that we're going to get them in front of users and we're going to get those qualitative answers back in time.

Now, before they're done with the development cycle, we can use that qualitative data to tell the story. I actually look at the qual researchers on my team and think about how jealous I am of them, because they're starting to realize how quickly they can get qualitative findings.

And teams are hungry for it; they're starting to ask more for that type of research. They can sit in front of it, see what's going on, and hear what people are saying, rather than sending out a survey and seeing what responses come back.

Takeaway: Lean into multiple data streams—and be smart about the kind of data you’re taking in.

As researchers, one of our key skills is to be able to empathize. We should empathize internally as much as we do with our end users.

Jeanette Fuccella, LexisNexis

Speak their language

Often, the reason many stakeholders are turned off by the idea of qual research is that they're not research-literate. Engineers, scientists, marketers, or even salespeople might fall into this category.

And sometimes qual or mixed methods are the best way to answer their questions. Here’s how our panel approaches qual research to help bring insights to the table.

Leah (Lenovo): A big part of it is education. A qualitative study is going to give you examples of issues that are happening. It's not going to tell you how pervasive any one of those issues is.

That's why you want to marry the quant and the qual. The qual is going to tell you the problems people have, and then you use a site intercept, or a survey, or voice of the customer data, to see how frequently any of those are actually happening.

You have to really explain that approach and say: “We need two kinds of data to really understand this problem, or to understand what our customers are doing. So the first thing we're going to do is get a sense of what the issues are by talking to them and watching them, and then we're going to go out and figure out how often each of those happens.”

That's one really basic way to explain it to somebody who's not familiar with how you marry qual and quant research.

Jeanette (LexisNexis): As researchers, one of our key skills is to be able to empathize. We should empathize internally as much as we do with our end users. Especially if you're in an organization that may be less UX mature, there's a lot of education that has to happen.

People don't know how to use UX generally, let alone UX research specifically. And they certainly don't know how to triangulate data or what mixed methods even means.

So in addition to education, I think it’s important to take a consultative role. That means partnering as much as I can with my product managers.

My role is to help them make the best decisions that they can, and also to get them engaged with the data that they’ll need to make those decisions. If I can do that for them and then explain how we're going to get that data, then they don’t have to decide so much. They learn by doing. They learn the value of different types of methodologies through experience. We've found that to be very successful here.

Dan (BB&T Bank): Even at the beginning of our process, if somebody requests research, we sit down with them and go over what we have. Once I understand what their goals are and what they're trying to learn, I'll let them know what I'm looking to do and the type of information we expect to get out of it.

We level set with them so they know what the expectations are (for example, that small-n studies can't necessarily tell you incidence rates), making sure that we're completely aligned before anything goes out the door.

Takeaway: Empathize with your org and stakeholders as much as you do with your end users. Listen to them and their goals, and learn what they are looking for from the research.

You have to really explain that approach and say: “We need two kinds of data to really understand this problem, or to understand what our customers are doing. So the first thing we’re going to do is get a sense of what the issues are by talking to them and watching them, and then we’re going to go out and figure out how often each of those happens.”

Leah Kaufman, Lenovo

Meeting expectations

Sometimes your data might conflict. Qualitative insights might butt heads with quantitative data, and vice versa.

That’s okay. In fact, it’s to be expected. When that happens, though, you need to know how to bring this up to your quant stakeholders without seeming like you’re attacking their integrity—a feat easier said than done.

Jeanette (LexisNexis): I usually tell folks there’s a disconnect because we just haven't figured out what the fulcrum point is. It usually means we have to dig a bit deeper, that we don't fully understand what's going on well enough.

It's not that one is right and the other is wrong. There’s something we haven't yet discovered and it’s causing a contradiction. My recommendation is to actually move in closer.

Leah (Lenovo): I think one of the most common explanations for qual and quant disconnect is the fact that you’re tapping into different groups of people. So the data that the quant is based on, whether it's voice of the customer or a survey, may simply be people with different characteristics than the people who are in the qualitative study.

You've got to turn around and understand the source of both kinds of data and see if it was something with the participants. Is it something with the task that they were asked to do? There are going to be issues with data cleaning on both sides.

It’s not that uncommon to find someone who says they got certain results, when half of the participants actually didn't follow the instructions, or didn't answer the right question, or had some other characteristic that would have disqualified them.

So, really understanding where the data is coming from and what kinds of tasks elicited those particular responses is your first step. You can actually get some really good “aha!” moments if there's a difference between the two. That's really compelling, because you're learning two different aspects of something that you thought was one thing.

Dan (BB&T Bank): I had this example happen when I was talking with my new counterparts at SunTrust (soon to be Truist). They told me about one study where they talked to small business owners. We realized that we had done a similar study with small business owners but got completely opposite answers from them.

We had to unpack it and see what was going on. We came to the conclusion that we were actually defining small business owners differently. We were looking at small business owners who were more like sole proprietors, and they were looking at small business owners running $11 million to $15 million a year companies.

So we realized that all of our answers made sense. Now we’ve painted a picture around it. So I think really unpacking both of those pieces of information and sharing an understanding is what's going to help in that scenario.

Sarah (AnswerLab): Contradictions can actually get you a richer understanding. It's never that our assumptions were wrong per se; it's just that the definitions may have been slightly different.

When we go through the data and we find different responses than we expected or things that may "disagree" with each other, we say, “Okay, well why is that?” Not, “Oh we're wrong” and move on.

We had an experience where we did quant work and then tried to support it with qual, and guess what? It was slightly different. We found new subsets of each “group” that we didn't realize existed.

The client was actually thrilled because we learned even more than we thought we would. So it's really just telling the story in a deeper sense. It's never just that we missed the mark; it's that we need to go deeper and tell that story in a richer way.

Takeaway: Understand where each piece of data—both qual and quant—is coming from. The context that surrounds the insights can provide a deeper level of understanding into how you got them.

Tony Ho Tran is a freelance journalist based in Chicago. His articles have appeared in HuffPost, Business Insider, GrowthLab, and wherever else fine writing is published.
