
Winning Hearts & Budget for Qual Research (in Quant-Minded Orgs)

Learn from leading UX researchers on ensuring that your qualitative research has an impact. Hear their strategies for engaging stakeholders with qual.


Hear how leading user researchers ensure qualitative research has a place at the table at their companies.

The role of research at innovative and impactful companies continues to grow. But often, stakeholders overvalue (or only value) big data.

Striking the right balance between quant and qual inputs, especially when you're on a small team (or you're the whole team), can be a challenge.

We've assembled a panel of UXRs across industries and company sizes to share knowledge on how they build advocacy for qualitative research and ensure the actual user perspective is valued across their orgs.

Stream the conversation with our panel of UX researchers to:

  • Hear strategies for driving impact and visibility of qual research across your org.
  • Learn how you can develop influence and advocacy across stakeholders to build momentum.
  • See how to apply tried-and-true best practices to your team, via real-world examples and cases.

Our crew of experienced user researchers has made inroads by advocating for qualitative methods—they share advice on how you can do the same.


Transcript:

Ben:

Hello everyone. Welcome to December's dscout webinar. We're really glad that you were able to join us. Thanks for spending some time with us wherever you might find yourself. These are our presenters for today. Oh boy, there it is, folks. We have a really great lineup of folks who have worked at a multitude of places. They presently find themselves at what we at dscout have termed a "quanti org," and that's just a catchall for an organization that might think about insights in a more numerical or quantitative-first manner. This is not to say that the places represented here are against qual; far from it. These are folks who, because they are at these various places, are making inroads and trying to bring human-centeredness and user centricity from a non-quant or mixed-methods approach to the table. So we have Dan from BB&T, Jeannette from LexisNexis, Leah from Lenovo, and Sarah from AnswerLab.

Ben:

And so we are going to go through a couple of questions that I have for them; we've been thinking and talking about this. The hope is that we'll be able to provide you with strategies, examples, and use cases. If you're sitting in an org right now thinking, man, I can't quite make the headway that I want on getting these human-centered principles in, or even doing a diary study, or getting more interviews, or doing more field work (however you're defining qualitative), these folks have experience doing it, they're pros at it, and they're currently doing it. So I hope that they will be able to share some things that you can take away. With that, let's have them introduce themselves in a much better way than I just did and tell you a little bit more about where they are, and in particular the structures. Why don't we move, as I understand it, left to right. Jeannette, do you want to begin?

Jeannette:

Sure. I'm Jeannette from LexisNexis. We build legal software and other news-related software, and I'm a researcher here. I've been here about three years, and I've been in the industry a good while longer than that.

Ben:

Dan, how about you go next.

Leah:

Hey, I'm Leah Kaufman. I manage user experience research for Lenovo's websites. Lenovo makes all kinds of computers, hardware, and a little bit of software for them; we also make servers, as well as networking and storage devices and gaming computers. Just about anything associated with computers, you can probably buy from us, and there is a ton of work to do on the websites to make sure all of the information about all of these products gets communicated clearly. So yeah, there's always a lot of work, and because this is such a big company, it is very hard to do work that can be easily shared and distributed and that supports all the different places within the company that all seem to be working on exactly the same thing. So lots of silos, lots of pulling people in to understand what we've learned about users and helping them understand how to use that information. So it's quite the rodeo.

Ben:

Dan or Sarah?

Dan:

Sure, I can go then. I'm Dan Chance. I am a UX manager at BB&T, which, if you are not a part of the Southeastern region of the United States, is a bank. We have recently merged with another bank, so as of Friday I will no longer be working for BB&T; I'll be working for a bank called Truist. Fun times.

Ben:

Here we go. Here's the announcement. That's great. Yeah.

Dan:

so we're growing. And yeah, we are very much a quanti org. Everyone that has come up through the bank, they're all bankers. They like numbers; that's the thing that they're interested in. I happen to be a little bit more quant than qual myself, but my team does an enormous amount of qual work. So I'm excited to talk more about that.

Sarah:

Hi everybody. I'm Sarah Kennedy, a senior UX researcher at AnswerLab. AnswerLab is a UX research consultancy: we work with clients and create custom quantitative and qualitative research programs for them. I've been at AnswerLab for a little over a year now, and I sit on the qual side there, but I have been in the UX research field for over 10 years, I have to say.

Ben:

That's great. Thanks everybody. I would love to hear from some or all of you on how, when you arrived at where you currently are or another place like it, the structure was set up, either for qual or maybe not for qual. I know that some of us were talking about how you had to take a lay of the land when you got in, whatever position you were at, to think, okay, research is currently defined in these ways and is being done in these ways. Could you talk a little bit about what that structure is, how you took stock of what it was, and how you found ways to make qual or mixed methods happen, back in the early days when you were just starting at wherever you are now?

Leah:

Sure, sorry, let me go ahead. Okay, thank you. When I started at Lenovo five years ago, I was the first researcher that the web team had hired. Before then, they were just using vendors like UserTesting and UserZoom and letting them do all of the work; I was the first person brought in to do that. So I had a really small team that I was working with: a couple of designers, a couple of project managers, and that was it. And before six months had gone by, they totally revamped the entire team and literally airlifted a big group of managers and designers from the North America merchandising team into the global web team. All of a sudden we had this big explosion of scaling up the amount of work and the amount of responsibility the team had, which meant that I was going from doing mostly qualitative stuff to a combination of qualitative, voice-of-the-customer, and quantitative research. And that's really been the pattern over the past five years. Again, this is a company that loves numbers, that's built from engineers; it's a love child of a Chinese company and IBM. So numbers absolutely rule and get the respect. It's been a fascinating journey to try to build out how I get data that is qualitative but still supports web analytics data and optimization data and voice-of-the-customer data and large-scale survey data. So it's been a really good process, actually, of moving from doing just a single usability study to combining usability, voice-of-the-customer data, and all the site intercept data, so that for any issue, we're putting together a description based upon multiple data sources, both qualitative and quantitative [inaudible].

Jeannette:

And so I was fortunate to come into the organization at a time when there was a lot of reorganization going on. They had just moved the technology center down from Dayton to Raleigh, and I can't take credit for any of the good decisions that were made around this, but UX was placed in a global strategy organization, and our sister group is a data science group. So we actually report up through the same chain of management and have data analysts and data scientists who are assigned specifically to work with us as UX researchers. And that has just set the tone in the organization that both types of data are necessary for decision making: they sit side by side and they work best when used in conjunction with one another. Again, I can't take credit for any of that, but we've really benefited, and I would strongly encourage, where it's possible, having that kind of pairing. At least if you're in a less mature UX organization, that's a great place to start. Perhaps as you get more mature you might be able to separate more, but for us that was just a brilliant move.

Ben:

I like that both Jeannette and Leah have these examples of leaning into multiple data streams; it's going to make for better business decisions when we're mixed methodologists. Not just "let's do more qual," but let's get smart about the kinds of data we're taking in, right? What insights might we be able to produce if we're only collecting and thinking about questions in one kind of way? Dan and Sarah, do either of you have experience with this sort of multiple-data-streams approach?

Sarah:

Yeah. I have very similar experience from a previous role, as Leah did, combining those and speaking to both. It's almost like having quant with qual; they complement each other so well. At my previous role that worked really well, getting the word out that user testimonials are awesome. Those soundbites are really useful to speak to what the user is saying, and through qualitative data we can get that more than we can with just statistics and percentages. So moving and migrating to more of that qual way of telling the story, I think, is really powerful, and it really did help in my experience, and does today too.

Ben:

Dan, how about you? You've got a quant background and you live in a very quant world. How have you been thinking about multiple data streams or mixed approaches, especially as you're scaling a team?

Dan:

Yeah, interesting. Well, mixed approaches are all just based on whatever questions the teams need, so I'll talk a little bit about the structure (we're on the team org structure slide). Our research team is very much a support team for the product teams, the development teams. Sitting on those teams we have UX designers, so UX designers are our primary requesters of research. They'll come to us and say, we have these problems. A lot of times they're like, hey, we need to know the answer: how are users doing this? What are users doing? Why do users feel this way? We're going to look at that question and say, okay, well, we need to get the right answer for you. If it's a why question, then I'm not necessarily going to design a quant study, right?

Dan:

I'm going to go get some qualitative feedback; I want those deep, rich insights. So really, we're constantly looking at every question as to whether it is quant or qual, but also putting that up against the timeline. The structure of the teams is very agile. They're super fast, and what was really hard for us before I got there (I'll even say it wasn't me; I came in at kind of a golden age) was that they were just starting to figure out how to support the speed of the teams. The teams are like, hey, here's the problem, we need an answer within two weeks, can you go do a qual study? And we're like, okay, we need to figure out how on earth we could do all the recruiting and develop the mod guide and get the responses back to try and help these teams.

Dan:

So from that respect, what we wound up doing was building into our process where we know what their cycles are. We know that they have these two weeks (in our case right now it's three-week cycles; it's going to be two weeks). So we're like, hey, why don't we just schedule the sessions, schedule a study for three cycles from now, and we'll tell everybody: hey, we're going to have this; if you have questions, come tell us by the Monday before. We already did all the upfront stuff, all the recruiting, getting the right tools we needed. That way, when they came to us and said, hey, we have these questions, we could say, okay, well, we're going to get them in front of users and we're going to get those qualitative answers back to the team.

Dan:

So now, before they're done with the development cycle, we can use that qual data to tell that story. And for us it got to the point where I'm actually looking at the qual researchers on my team like, oh, I'm jealous of you, because the teams are starting to realize how quickly they can get qualitative data back, and they're hungry for it, and they start asking more for that type of research. They can sit in front of it, they can see what's going on, and they can hear what people are saying, rather than putting out a survey and just waiting for the responses to come back.

Ben:

That's awesome. Dan, we just got a question. I know I said I would wait, but I'm looking at your questions (I have this big screen, y'all). Dan speaks to something that I think the audience would like to hear from you all, and specifically it's this: it sounds like many of you are partnering with or supporting a diverse set of teams. How are you doing the business of qual research when many of your stakeholders are scientists, engineers, or folks who might not even be very research literate, like marketers or sales folks? Can you give an example of how you've brought qual or mixed methods to the table for a question that a more quant-minded stakeholder has asked?

Leah:

I think a big part of it is education. There are a couple of core principles in here. A qualitative study is going to give you examples of issues that are happening; it's not going to tell you how pervasive any one of those issues is, but that's why you want to marry the quant and the qual. The qual is going to tell you what problems people have, and then you use a site intercept or a survey or voice-of-the-customer data to see how frequently any of those are actually happening. And you can explain that approach: say, we need two kinds of data to really understand this problem or this issue, or to understand what our customers are doing. So the first thing we're going to do is get a sense of what the issues are by talking to people and watching them, and then we're going to go out and figure out how often each of those happens. That's a really basic way to explain it to somebody who's not familiar with how you marry qual and quant research. So that's one way to do it.

Ben:

Have any of you had experience with this: I have heard from some of my other colleagues that the first thing they like to do is level set on what they, as the researcher or the designer, mean by qual. I think there are misconceptions where, to Dan's point, we only have two, or even three, weeks in this iteration, and a stakeholder might come to me and say, that's not enough time, because they're thinking about recruiting, or maybe in-homes or a deep ethnography. Do you do much of what Leah is talking about, the education, as you're talking through the design of the studies you might use?

Jeannette:

Yeah, I think that it's not just education. That's an excellent point, and we have to be really careful. As researchers, one of our key skills is to be able to empathize, and we should empathize internally as much as we do with our end users. Especially if you're with an organization that may be less UX mature, there's a lot of education that has to happen there. People don't know how to use UX generally, but specifically UX research. And they certainly don't know how to triangulate data, or know what mixed methods even means. Even quant and qual: I've spoken with quant researchers in this organization, and we're an organization full of data scientists and engineers, and they'll ask, so what's a survey? Is it quant or is it qual? Well, a survey is a research device.

Jeannette:

It can be both. Quant or qual is a characteristic of a piece of data. So that comes up even with people who kind of should know. What I find, in addition to the education that Leah talked about, is that it really helps to take a consultative role, partnering as much as I can with my product managers. My role is to help them make the best decisions that they can, and, to the best of my ability, to anticipate the data that they will need to be able to make those decisions. If I can do that for them, and then explain to them how we're going to get that data, it doesn't leave it to them to be the one who has to decide so much, and they learn by doing. We do it, and they learn the value of different types of methodologies through experience. We've found that to be very successful here, and we have a seat at the table right alongside the designers, tech leads, and PMs.

Dan:

Yeah, and I'll jump in and say, even at the beginning of our process, when somebody requests research, we sit down with them in what we call an intake meeting; we're intaking the work. Once I understand what their goals are and what they're trying to learn, I'll pretty much let them know, here's what I'm looking to do and here's the type of information that we'd expect to get out of this, and level set with them so they know what the expectations are. They can't check incidence rates, necessarily, with a small-n study, so it's about making sure that we're completely aligned before anything goes out the door.

Ben:

How are you all doing that level setting in terms of timelines? I know that for some of your stakeholders, sometimes the timeline is the timeline, but are there any tricks or tips or things that you've come upon that help you manage tighter, or agile (the new catchall phrase), timelines when you're doing more intensive qual work? Is there anything you can share that lets you do speedier qual? And I'm not trying to set this up for [inaudible]; of course I would like for you to say there are really great research platforms, but do you have, say, trusted recruitment partners, or do you write the interview guides with the stakeholders? Things like that, that help shorten or help you meet those deadlines.

Sarah:

Okay. Yeah, we have a really nice process in place. We have a research ops team that helps us do recruitment, so the researcher can start doing the research plan almost in tandem, and I think that really helps. But the challenge is always while the qual is running, whether it's a dscout project or another qual project not using dscout: it's really about taking great notes while you're doing it. I've found that that actually is the trick to faster research; it's in the research process, making sure you're taking excellent notes. And as a plug for dscout, when we use dscout I also have an Excel file open, and I use both; it's almost like summarizing the summaries from those research sessions. We do have a really quick timeline turnaround a lot, and so the way we do it, I think, is that while the research is in progress, you're really trying to synthesize that data as it comes in, especially when it's qual and you're getting hours of feedback from interviews or focus groups or in-homes.

Leah:

Yeah, I think Sarah is spot on. One of the things that helps us, my team and my research, the most is to be very, very careful with the scope of any project you agree to do, so that you don't let it get blown out of proportion with more and more questions or issues or ideas. Make it very clear and limited relative to the amount of time, and whenever someone says, oh, but can we also ask this, you say, no, we cannot. If you want it done in the timeframe, we can do another test; we can follow that up. Keeping that scope managed, limiting it to the set of things that are most important to answer that question or issue, is really going to be your best friend in the whole thing, because, number one, it manages expectations from the project team that you're doing the study for; it manages your stakeholder expectations.

Leah:

Secondly, it helps you, just like Sarah was saying: you're going to be able to create tasks that directly address specific things you want to learn, so you know what questions you're going to answer and what tasks are going to give you those answers right from the get-go. And that makes the coding that Sarah's talking about much cleaner, much simpler. It's not exploratory; it's not "let's listen and see what happens." It's "we want to see if they do these five things." I'm not saying there isn't a place for exploratory work, or serendipity and discovery, in research sessions, but managing the scope and defining the tasks in a way that allows you to get straight to the data, to the things you want to observe people doing, is going to be your best friend.

Jeannette:

I would plus-one both of those. Very pragmatically, we've also built an internal user panel that's been just fundamental to our ability to move quickly. Our audience is predominantly legal researchers, so attorneys and other researchers, and it's really hard to reach them. So there's the panel, and having a research ops person: if you don't have a research ops person or people, I cannot speak highly enough about how important that is if you want your researchers spending their time doing research. Then, separately from all of that, try to separate out your short-term, pragmatic, much more narrowly focused research: more of a dual-track approach. There's some research that is very sprint driven and very narrowly focused, and then other research, as Leah said, that's much more discovery, much more serendipitous, where we're not really sure what we don't know and we're looking at unmet needs. Trying not to accomplish both of those things at the same time, or even potentially with the same researcher, really separating those two streams, I think makes a big difference.

Jeannette:

Start with the end in mind; that was the other thing. Know what you're going to report on, and if you know what you're reporting on, then you will collect the data in a way that you almost build the report as you're collecting.

Ben:

Yeah. We at dscout, and me both as a researcher at dscout and as Ben the researcher, I have taken that from my quantitative background, wherein I didn't ask a survey question unless I hoped to analyze it or implement it in some way. And similarly with exploratory and discovery work and the grounded theory approach, where I'm going to just interview and then let the data speak to me and bubble up: you can still do that and know you're going to ask these five questions to give you these three sorts of inputs to answer the one question, right? At the end of the day you do need to deliver on, should we build this product or, you know, crush this product. You all know this: you will need to speak to that.

Ben:

It sounds simple, but as everyone has said, it's so important to ask of your research: what is this question in my guide doing? Or why are we visiting participants in their home versus having them do calls? Asking yourself what you're going to get on the back end will certainly save you time when you go into your analysis and synthesis, and it'll save you time on the front end when you're designing, too. That leads well into the next topic, if there are no other additions. Dan, did you have anything that you wanted to add to what we've been chatting about? I'm curious, because I'm looking at this screen and I feel like I'm Wolf Blitzer. Is he even still on CNN?

Ben:

I'm going to go with it. He's like, well, there are 7,000 pieces of information swirling around. I have really good questions here; I want to turn the screen, but my boss would be really mad at me about that. We have questions about the specific methods that you're using to do this, and you hinted at a little bit of them. Have you found that there are kinds of questions that fit particular methods? It could be as simple as, I do very few fieldwork interviews, I do these remotely. Do you use diary studies? Are you using survey tools? What are some of the tools that you're using, and how has that evolved as you found yourself making headway in these quanti orgs?

Sarah:

This is where we're supposed to say [inaudible], I want that $100. I've been talking a lot; someone else could go. So yes is my answer to your question. Honestly, I feel like there are just so many tools, I'm not sure I could narrow it down. You're always trying to find, whether for our clients or our stakeholders, the most effective way to get to that answer, and also the most efficient, and also just really making sure they get what they need. We're always trying to be, and we are, very agile when it comes to tools, because we want to be able to meet all the needs from all the angles. When we started out in UX research (some of us have been around longer than others), there were a lot fewer options out there, and so, you know, it's a blessing and a curse.

Sarah:

We have lots of options, so we can do a quant and qual mix; we can get both sides. As much as I love dscout, and I use it often, I also think it's really good to be flexible and that we have this great, rich set of tools available to us.

Leah:

I think it's really important to understand [inaudible] any of the biases that you get with any tool you use. It's really good to know what it's good for, the kind of data you can get from it, and, if it's a tool that includes recruiting, what some of the limitations of the participants you might expect are. For example, I've found that most major tools that have a very large subject pool tend to have rather large numbers of retirees and students. So you have to keep that in mind in your recruiting and make sure, if you want [inaudible], that you're using any other screening tools they have available to help you get the right distribution of the particular characteristics that you want. Sarah's right, there are a number of tools available, and you work with them according to which ones are going to give you what you actually need. I've got a range of things in my toolbox, including dscout, that allow me to do moderated and unmoderated remote research, large-scale surveys, site intercepts, stuff that feels more like market research than UX research. It's really a case of being like the plumber who shows up with the box of tools: here's your problem with your sink, or here's your problem with your UI, and based on how much time and how much money you've got, here's what your options are.

Leah:

And you want to be that jack-of-all-trades with those tools, so you can pick the one that's going to get [inaudible] as close to the results as you need within the time and the resources you have. So you want options.

Dan:

Yeah, and I'll piggyback on that and say that I think one of the most important tools we have where I'm at is a strategic partner: somebody who does research that we can look to, who has a local panel we can recruit from, but also has that UX research skillset. So if we have to open the valve and ask them to take on some work, they can. Having that strong partner, that good vendor you can work with, is probably one of the most important things we have.

Ben:

I want to now move to the next topic; we're getting a lot of really good questions about it. Folks want to know how you're working with stakeholders and how you're looping them in. I want to start with this one, which was asked earlier: what happens when a qualitative insight might, I don't want to say disprove, but call into question the veracity of a quantitative insight? I'm tiptoeing around that. What if you find conflicting results? You've got a stakeholder saying, well, here's what the survey says, here's what the back-end data says, here's what our data scientists are saying, go out and see if that's the case, and you find something that conflicts with that. How are you bringing that out? How are you raising that with the quant stakeholders?

Leah:

We've done that already. We've run into that...

Jeannette:

More than once. And I usually tell folks, you know, it's because it depends, and you just haven't figured out what the fulcrum point is. It usually means digging in a bit deeper: you don't fully understand what's going on well enough [inaudible], and that's why there's a disconnect there. It's not that one is right and the other is wrong; it's that there's some open point you haven't yet discovered that explains why there's a contradiction. So that would be my recommendation: actually, you know, move in closer.

Leah:

I think one of the most common explanations for that is literally that we were tapping into different groups of people, so the data that the quant is based on, whether it's voice of customer or a survey, may simply be from people with different characteristics than the people who were in the qualitative study. Or it may be that the qualitative study is representing one particular aspect of reactions or use or interaction with whatever the UI was, whatever you were showing them. Basically, you've got to turn around and understand the source of both kinds of data. Was it something with the participants? Is it something with the task that they were asked to do? And, sorry, there are going to be issues with data cleaning on both sides. It's not that uncommon to find someone who says, oh, we got this result, but it turns out half the [inaudible] people actually didn't follow the instructions, or didn't answer the right question, or had some other characteristic that would have disqualified them or made them really different from the rest of the people who answered.

Leah:

So really understanding where the data is coming from and what kinds of tasks elicited those particular responses is your first step, and you actually can get some really good ahas if there's a difference between the two. That's actually really compelling, because you're learning two different aspects of something that you thought was the same thing.

Jeannette:

I want to add just one more thing on Leah's point there, and I'll name it: there's also bias, right? Every one of us, no matter what, has bias, and we bring our bias to our work, and that's true quant and qual. I say this to people all the time: repeat after me, every act of data interpretation is an interpretive act, quant or qual. So sometimes we've got to take a hard look at ourselves and say, am I the one who's interpreting this incorrectly?

Dan:

Yeah, and I'll say I actually had this example come up when I was talking with my new counterpart at SunTrust. They were like, oh yeah, we had done a study where we talked to small business owners and they said this. And we were like, oh, we did a study where we talked to small business owners and they said the complete opposite. What we really had to do was unpack it and go, okay, why? What is going on? And we came to the conclusion that we were actually defining small business owners differently. We were looking at small business owners that were more like sole proprietors, and they were looking at small business owners that were more like $11 million to $15 million a year companies, and we were like, oh, well, now in that light, all of our answers make sense, and now we have painted the full picture around it. So I think really unpacking both of those pieces of information, and sharing and understanding, is what's going to help in that.

Sarah:

Yeah, and that's exactly what I was talking about: you now have a deeper understanding of both of those groups, as opposed to having one idea that's supposed to apply to all of them.

Ben:

That's great. Sarah, you're in a particular position, because you are at an agency, in a consultative role, so you're working with a variety of different customers with not only different backgrounds and questions, but different understandings of qual. When they might not believe or trust what you're showing, what are some of the strategies you're using to make the case, when you find something that might not align with what they thought?

Sarah:

Yeah, I mean, it definitely happens; I think we've all encountered that in some situation. And again, it's like Leah already said: you're actually getting a richer understanding. It's never that our assumptions were wrong, per se; it's just that, like Dan said, the definitions may have been different, or slightly different, and it's about coming together. So when we go through the data and we find different responses or different answers than we expected, or things that may, quote unquote, disagree with each other, we say, okay, well, why? It's not, oh, we're wrong, we move on. We dig deeper and really drill down into it. We had a similar experience to what Leah and Dan discussed, where we did quant work and then tried to support it with qual, and guess what, it was slightly different. What we found was actually more info: there were subsets of each group that we didn't realize existed. And the client was thrilled, because they were like, oh, we've actually learned even more than we thought we would. So it's really just telling the story in a deeper sense. It's never really that we missed the mark; it's just that we need to go deeper and tell that story in a deeper way.

Ben:

Yeah. It sounds like for all of you it represents an opportunity, either to check the data stream that you're being given or, as you've all said, to dig deeper and unpack more. That can then, as my former stats self would put it, explain more of the variance of whatever the thing is. Why is this happening? Well, we're only explaining this much of the variance, but with qual and this weird finding, we're capturing more of the variance, which means our users will be happier and we'll have higher conversions, or whatever the metric of success is. We're getting a lot of questions like, but how do you make the case? I think the thing that you're all saying is that the decisions will be better: if we really want to be customer centric and reduce frustrations for our customers, or have them engage more, we want to be collecting and capturing more of the variance, and doing qual is a way to capture more of the variance.

Ben:

Because you're asking questions in different ways. I'm curious if there's been a particular deliverable or style of presenting the results, especially qual, which, you know, takes a lot of interpretation. Are you relying on rich media? Are you showing, you know, Pat, who represents this persona, and here's a quote from Pat and here's a photo of Pat? When you're doing your share-outs, how are you communicating the richness of the qual, and what's working?

Sarah:

I mean, not a plug, but I would say dscout videos, or just videos in general of those users, especially if it doesn't necessarily go the direction we expected. Really showing the user's voice, literally from their mouth, whether it's them showing their experience or just reliving it, speaks volumes, because it's no longer a data point or just a line on a PowerPoint slide. It's the actual user's feelings and thoughts conveying that. So I think that's where videos and testimonials are of the greatest use.

Leah:

Yeah, I absolutely agree with Sarah, and I want to go back to the very first thing she said about building empathy, because that's really what our work is. The quantitative data shows where the problem is. The videos and the quotes are the things that demonstrate the actual problem and what it feels like to have the problem. And that's your biggest chunk of evidence when you're trying to explain why you think a UI needs to be different. Getting someone to really empathize is just huge. And watching the video, and this is a real trick: don't show them just one, show them at least two or three or four examples, so they can get a sense of the range of ways that people respond. That's how you get that empathy.


Dan:

Can I jump off Leah's point? At a prior place I used to work, we had what we would call talking head videos. I remember we were trying to make a point that something was not a good decision, so we had talking head after talking head saying how this is bad, and this is bad, and this is bad. And by the third or fourth one, the person was like, alright, I get it.

Ben:

All right, cool. And Dan, that's really good, because one of the repeating questions we're getting from the audience is: okay, this sounds great, you've synthesized some really impactful data, you've got a nice highlight reel or a quote. But what if the stakeholder is like, yeah, but your n was 30 people, or, why do we really care about Pat's poor experience? What happens in those moments of delivering insights? As a researcher, I often forget how little my stakeholders know about the data. When I've been up to my eyeballs in transcribing and the wall behind me is covered in Post-it notes, I'm like, oh, they'll totally get this, because I've been looking at it for two weeks; how could they not see this? And then they're like, I don't see it. What sorts of things are you doing in those moments?

Leah:

Part of it is knowing your stakeholders and knowing what kind of data they respond to best. You pay attention when you're giving a presentation and see where they wake up, where they perk up a little bit. Do they really like graphs? Do they really respond to big numbers, or are they actually deeply interested in individual responses? That's going to help you figure out what you want to talk to or show them in the report that you're delivering. You've got to take the audience into consideration along with the points that you're trying to make. But you can also be really clear: here's the quant data that shows the distribution of the different issues that we are seeing. Here's an example of data that came from customers and the feedback they gave us.

Leah:

Here's an example of quant data that came from session recordings. Here's quant data that came from site analytics. Now you've gotten a good sense that we've got triangulation across three different sources: here are the top five problems, and they're the same across all these sources. Let's look at some examples from people who are experiencing these problems and what that experience is like. So you're storytelling through this, and that's what you want to do. Understand who your audience is, think about the kind of data that's going to work best for them and carry the most weight for them, and make sure you have a story with your data. Don't take too long to tell it; don't let it take an hour for you to get to your point. And above all, be absolutely excited about what you are learning and passionate: oh my God, we saw the coolest thing, you have to watch this video. Look at this. When you watch this guy, you'll see he's going to try this new thing that we created that we thought was so great. Watch what happens. Let your passion for it and your excitement about learning this stuff be a big part of what you deliver.

Jeannette:

Yeah, I'll again just agree with everyone here about the video being so important. Because our audience is dispersed throughout, really, the globe, we do a lot of remote research, a lot of remote one-on-one research, and we have recently made a concerted effort to try and get our participants to be on video. They can't always do it; there are lots of things at play there. But when they are, it's amazing, the difference in impact when you can see someone's face. In one study I was doing, I literally had a grimace tag. It just speaks to the fact that, particularly for some of us, our emotions are completely communicated through our face. I could have a screen capture of everything that they're doing, and you would know what they're doing is frustrating and wrong, but when you see their face, then you really get the full picture. That's where the empathy comes into play: this is a busy professional who's billing out at $500 an hour.

Jeannette:

I've just wasted six minutes of their time. Unacceptable. Okay. So yeah.

Ben:

That's great. Well, I want to get to the last bucket of questions, because it was one that you all raised when we were running through the questions. You all said that you wanted to not only talk about the rosy moments, but also the things that you think are still areas for growth, both for qualitative researchers broadly and for qualitative researchers living and working in more quanti orgs. So I'm hoping that each of you can speak to something that you hope to work on with your teams or yourself, or that you hope changes in the industry; something that you think still needs addressing and improving.

Jeannette:

Yeah. We've got plenty of improvement to do here. I'll speak to that as the one who said I think it's important to talk about the challenges that still face us, because we've accomplished a lot, but we still have a lot of room for growth. One of the places we're really looking toward in 2020 is to better align our insights on a foundation of quant and qual, so that we're much more intentionally marrying up the data and bubbling it up into insights that do have triangulated data to stand on. This isn't just one data point and, aha, we are going to extrapolate an insight from that; no, this is firmly rooted in multiple different data points. And even having an insights library requires a lot of gardening and a lot of tending, so there's a lot of room for growth there, just tracking all of that. We also still have a lot of room for growth in terms of better coding, tagging, and collating, particularly of our video recordings; all of our qualitative data, but in particular the video recordings. How do we do a better job of tracking across studies?


Jeannette:

Having a standard set of codes, but leaving room for flexibility, and being able to pull from that in useful ways. For example, I mentioned we do dual track, so if we're researching a feature, then I can easily pull a highlight reel and throw that into the backlog. Six months from now, or three months from now, or whenever it is that the team starts to work on it, they've got a video, because I may be researching it months in advance of the thing actually being built, and they can then self-serve to get the data that they need to really build it well. I could go on; there are lots of things I want to accomplish next year. Those are a couple.

Leah:

I have two things on my plate. One is doing a better job of getting the reports that I've got organized in a way that they're fully accessible and searchable by anybody in the company, so that if someone has a request for stuff that's on our configurator, I don't have to go dig it out and send it to them; I can point them to a place where they can go find that research. That's number one. Number two is that I have done a good job getting my team addicted to unmoderated remote research, which we can do and turn around very quickly. But I believe we are missing a lot of opportunities to get a deeper sense of empathy by watching and listening to live people. We've started doing that this past year, and I want to do a lot more of it in the coming years, so that my team, the designers and the project managers and the data analysts, are coming to me saying, when can we watch people, as opposed to, when can that study be done? I want them to get invested in what it feels like to connect with someone who's having a problem. [inaudible]

Sarah:

I would say, and I don't know if it needs improvement per se, but what I would love to do more of is multi-phase work: watching something evolve as it goes, and seeing how things change after user feedback and after iterations. I'd love to see more of that, to see multiple phases within a project; even with short timelines, we can do spurts of data collection over time and gather feedback. And then also, I think Jeannette already said this, and I'm echoing her: there isn't really one or the other. Having quant and qual together to support each other is another thing that I really want to see continue and grow, having both research programs work together to really get that rich result that we've been getting when we do both.

Ben:

Yeah.

Dan:

Yeah. And don't mind me, I'm just fighting the sun, so if you notice my face going in and out of light, that's why. I've always noticed the struggle with UX teams and UX researchers, especially qualitative researchers, in marrying their data back to analytics. There are a lot of computer scientists out there that say, hey, we've got all the behavioral data, so we know what's happening. And when you actually dive in, or rather unpack it, you find out that, no, we actually don't know. Just because they landed on this one page doesn't mean you wanted them to, and you don't know why they did it. So I kind of wish that in the future people get a lot better about mining the questions that come from the analytics data and using that to source qualitative research.

Ben:

Yeah, and I have heard many of the things that you all said from a lot of the folks that we serve and that I work with. One: repositories. So many qualitative researchers are like, but we did that already, and it already lives here. To Leah's point, how do I get you to go where it already is? And Jeannette, there's your idea of, let's just create a highlight reel and append it to, I don't know, a Confluence ticket or a customer message, or drop it in the Slack channel. We are thinking a lot about how we can get UX researchers, or qualitative thinkers, to have their research insights live somewhere, because it can be time consuming and it is intensive to interpret and synthesize, and those living documents, those outputs, should be living somewhere. So I'm hearing a lot about repositories, and then the other thing I'm hearing a lot about is self-service, DIY.

Ben:

Can those repositories serve to educate our stakeholders on what the heck qual even is, so that when they come to my team, their question is sharper, more focused, and they know enough about what we already did to say, hey, I saw this in the video, can we go into that? Then the researcher's time is spent on questions that have more immediate impact, and the business has more awareness about what the heck qual research is in the first place. So I'm hearing a lot about repositories, and then about getting stakeholders or designers or even product managers or engineers to self-serve and do some research on their own, which is a topic for another day: who gets to own the researcher mantle and who gets to play with it. That's something I've been hearing a lot about, so that's really cool.

Ben:

Well, I think with that, we have answered many of your questions, which is why I'm really glad I put the big purple slide at the end. For those of you whose questions I didn't get to, I will holler via email. I want to put up the contact information for these smart, brilliant people and thank them again for their time, especially as we get close to the holidays, just coming off Thanksgiving. Thank you so much, Dan, Jeannette, Leah, and Sarah. It's always a pleasure to get to hang out with you all, even if I get to do so remotely. Folks, thanks so much for tuning in.
