
Mixed Methods for Modern Orgs

Free webinar: Join researchers from dscout's The Studio as they share how they design and execute fast, accurate and user-centric mixed-methods research.

Featuring Casey Oberlin, Rosalind Koff

Today's innovative organizations are rethinking the traditional approaches to "collecting data."

Stakeholders want insights, ASAP...and the digital research ecosystem offers tools for speed and scale.

A mixed-methods approach delivers in-depth insights for any question or problem.

Join researchers from dscout's in-house research org, The Studio, who have plenty of mixed-methods experience.

They'll share how they design and execute fast, accurate and user-centric research—and how you can, too.

This webinar:

  • Highlights some client-tested research method combinations
  • Describes ideal use cases and question types for mixed-methods approaches
  • Outlines high-level analysis and synthesis strategies for eye-opening outputs
  • Answers attendee-submitted questions to help fuel your mixed-methods research

Transcript:

Benjamin Wiedmaier:
We are People Nerds. Thanks for joining us. If you haven't joined or thought about joining People Nerds, we have an event every year. We're really excited about it. It's later this fall in September in San Fran. If you are interested in joining us, we will link some more information about that. You didn't miss much, just the most important part, that is the illustrious guests that are our speakers here. We've got Roz and Casey. They're from the studio, the Innovation Lab here at dscout. I am the bumbling fool who gets to ask them smart questions and they'll respond with smarter answers. Roz, Casey, would you mind introducing yourself?

Rosalind Koff:
Sure. Yeah. My name's Rosalind Koff. I've been at dscout for about a year as a research lead on the studio team. Prior to my experience here, I spent a while at a large, 80-year-old research organization where I ran a lot of public opinion, government, and academic work. I fielded a ton of quantitative, qualitative, and mixed methods research, and I worked on an innovation team there that was trying to move traditional methods a little bit forward. So super excited to be here to talk mixed methods with y'all today. Yeah.

Casey Oberlin:
I'm Casey Oberlin. I'm coming from academia. I've been at dscout now for a little under six months, so it's still kind of new to the industry, but I'm excited. I have a PhD in sociology, and I was an assistant professor at a college before where I taught then a variety of mixed methods courses and conducted mixed methods research, particularly on topics related to healthcare.

Rosalind Koff:
So she's always keeping me on my toes when I'm like, "Hey, I ran this project this way." She'll rein me in and correct me on the real way to do it.

Casey Oberlin:
And Roz gives me all of the applied, realistic, "This is what we need to do in this context."

Rosalind Koff:
Yeah. Cool. So I think, first, we'll probably start off talking about what mixed methods are. Casey, want to give us a little bit of an overview?

Casey Oberlin:
I'm ready for it. So the key part is that we identify mixed methods as something that's qualitative and quantitative. This is different than when you're using multiple types of data, which could be all qualitative or all quantitative, in one project. It's an area with a lot of disciplinary input. As we talk about it today, we'll be joining together the strengths of qualitative and quantitative approaches. The main goal is really to think about how I can triangulate these types of data. Triangulation just means that you're bringing together different types of data to be able to address a question you have more effectively.

Rosalind Koff:
Yeah, and in our day-to-day, we'll have customers come to us with burning questions, but they're not always sure of the method to go about that work. So it's super fun to really prioritize their key questions and think about, "All right, what is the best method or methods that can pair with this?" Versus saying, "I'm a quant person, and I'm definitely going to run these through the quantitative way, regardless of what the best fit might be."

Casey Oberlin:
Right. It's always about what are the best tools for the job rather than just being fixed in one lane. It's definitely a way we both approach our work.

Rosalind Koff:
Yeah. For sure.

Benjamin Wiedmaier:
Again, sorry for the echo and everything. Hopefully it sounds fine now. Casey and I were talking a few weeks ago about mixed methods, and one of the things that I always liked about them ... I didn't get to use them a lot in my background. I was more stats, quantitative experimental. But it always helped me see more of the picture when I could launch a great big survey, get some incidence and frequencies of whatever I was looking at, and then even if I did 10 interviews, I would get so much more richness, nuance, which is part of the clarion call for dscout and what you all do at the studio. Triangulation is the phrase that I hear a lot. We're trying to use various different data points to see the best view of things. So I think that is what I'm hearing from you all here.

Rosalind Koff:
It paints a bigger picture of the insights that you're getting out of your work. It gives you more confidence in what you're seeing in your data. But it also makes for more compelling storytelling when you're actually sharing those insights with people who don't wear your researcher hat. Sometimes that means multimedia. Sometimes that just means painting a fuller picture of what you're hearing from users. Just like you said, Ben, that coupling of qual and quant to show the full picture of what's happening is super helpful. I also think that mixed methods ... So the triangulation definitely happens in the data. You're hitting it from multiple perspectives.

Rosalind Koff:
But that inherently involves a variety of people and their training and their perspectives on your research questions too. So the triangulation is both in storyline and insights as well as getting the right stakeholders, getting the right perspectives in the room to really target your research.

Casey Oberlin:
Which I think is a great point, because you don't need to be able to do it all as one researcher. It's really about who's on your team or who you can partner with who can bring their skillset to the table. I think the thing I enjoy the most about mixed methods research is that it's always, kind of by nature, collaborative.

Benjamin Wiedmaier:
Yeah. A brief thing that I didn't get to mention, again, this is on me, y'all. Ben from People Nerds really dropped the ball here, so thanks for sticking with us. If you have a question, please use the Q&A box. Roz and Casey and I are going to try to answer as many as we can. We won't be able to get to them all. We'll have follow-up information for y'all. If there's something that we aren't able to answer on air, Casey and Roz will be able to holler at you offline. So please use that Q&A button. And then you can up vote questions if there's something in particular that you like. Thank you, Louis, for example, for just posting yours in there. I really do appreciate that.

Benjamin Wiedmaier:
Okay. It feels like mixed methods is one of those tautological concepts where you're like, "Well, it's when you mix your methods." So framing it in terms of qual and quant, or different sorts of training, is really useful because a lot of folks who join People Nerds on these webinars are from various backgrounds. Let's go to when you might use a mixed methods approach. Let's just say broadly. Can we start with the questions or the cases or the outcomes that maybe are best suited for mixed methods? When might you use something like that approach?

Casey Oberlin:
Sure. I kind of like to think about it as two general approaches. You're either trying to confirm something: so you've done, let's say, a large-scale survey, but you're not sure about the validity, whether it's really holding up. Is that true, or is it an artifact of the large sample that you have? Or you can think about it the other way, with qualitative, where you've interviewed maybe 20 people but you don't know if the pattern will hold up at all. Say our finding is that people really are optimistic about the role of tech. You want to know that, from a different vantage point, you can confirm something you already have data on.

Casey Oberlin:
The other approach tends to be more complementary. So you're extending existing work, but you have no way to really address it with your current data. That's where you're trying to address a gap. This tends to come up more, I think, when going from quantitative to qualitative: you can't get at the bigger why and how with the quantitative data alone. That's another way you can really target it. We don't know much about this issue and we want to be able to expand on it more. That's when you can really think about who else could help us with this mixed methods approach. So I think confirming or complementing are the two general approaches.

Rosalind Koff:
Yeah. In an ideal world, we set out from the beginning of our research and we're like, "Hey, we really want to do this thing," and we see exactly, cleanly, and clearly how we'll set a mixed methods study up. In practice, I've often found that I'll go out and I'll do a large quantitative effort or a large qualitative effort, and I'll come back and I'll be like, "Man, I have more questions than what I started with," or, "I'm seeing this particular group of users act in a way that I did not anticipate. Why is that happening?" That's often how that extra method will get tied in. It's like I got more questions than I started with and I really want to understand why I'm seeing this pattern of behavior.

Casey Oberlin:
It's so fun to be able to. That's the thing with the messy part, right, of digging into it, knowing you have tools. You feel more validity. You feel like you trust your data more, but it is never a clean and neat process in my experience.

Benjamin Wiedmaier:
Yeah. We've got Janet asking, "So is mixed methods identical in meaning to hybrid methods or multi-methods?"

Casey Oberlin:
Those are often used as the same language, I mean-

Rosalind Koff:
Yeah.

Casey Oberlin:
Yeah.

Rosalind Koff:
I would say yes, as more of a practitioner, less academically grounded, without my PhD. For sure, if we're talking about a mixed methods approach, I would also say multi-methods or hybrid approach. I feel like if you went to a textbook and looked at the differences, there is a difference between mixed and multiple methods. Is that right?

Casey Oberlin:
Yeah. But I think how it's used varies by discipline, so it would even vary across different textbooks.

Rosalind Koff:
Super fair.

Casey Oberlin:
I think what I've seen most commonly, though, is that multi-method might mean, again, that you're using different types of qualitative approaches. So you could be doing different types of interviewing, or different types of content or textual analysis combined with interviews. It doesn't necessarily always mean you're going across this old, really outdated divide of qualitative versus quantitative. That's definitely not what we're about, and that's, I think, definitely not what's captured by mixed methods.

Benjamin Wiedmaier:
Yeah. I think from the folks I get to talk to at the UXR and Design Thinking Conferences, it's largely qual and quant. Folks can become very [inaudible 00:09:21] and, Roz, we're going to talk a little bit more about bridging those gaps. But I think a lot of folks feel like they're "doing mixed methods" if they're partnering with someone in data science, which is very valuable. Okay, I did all this great ethnographic work and I'm seeing this thing. Can I confirm it or support it in the big data backend work? I think it can be more nuanced, as you said, Casey, where you're doing maybe two different kinds of interviewing and then you follow up with a diary study.

Benjamin Wiedmaier:
They're all "qualitative", but you're asking things in slightly different ways and you're shaking the prism. I don't know. As to when you would use them? Whenever I've used them, historically, I did just what you said, Roz. You go in thinking, "Okay. Hopefully, this great big survey or observational study will do this thing." And then I get something I had no idea would come up. I'm like, "Well, I can't keep going with a survey. I need to visit a home or observe someone do a thing or bring someone into a lab." So when I use it is when I need it. I can never say that a given question is definitely a mixed methods question.

Benjamin Wiedmaier:
I'm curious to know, best case scenario, unlimited resources: would you do interviews and mixed methods for every question that a customer brings through your door at the studio? They have unlimited resourcing and time, which we know is rarely the case. Would you typically say, "Well, we're going to go mixed methods"?

Casey Oberlin:
No, because you want to know what the question is. This is immediately, why do we need it? It's great. It makes it really interesting and it's powerful. But if you have a really straightforward question that you just need a large survey sample for, then what else is going on there? I think, again, maybe it happens later on. You might think you don't need mixed methods right in the beginning and that could change. But I would never just say we have to do a mixed methods approach just to do it.

Rosalind Koff:
I mean I think part of the gift of sitting on a studio team is that we work with clients who are at various parts of their process. So depending on where you are in the process, what work you've done already, and what work you haven't done, it helps to evaluate if it's necessary, and often the answer is no. Dscout has a diary tool, but that's not always 100% of the time the right method. You might only want to do a quantitative study. You can be making things more complicated if the data won't necessarily align beautifully at the end. You want to make sure that when you're mixing methods it's with intention, and it's data that complements each other rather than contradicts each other or complicates the picture more than the questions you're already starting out with.

Casey Oberlin:
Right. Or be prepared to do more if you have contradictions, right? That could be a goal, but then you're going to have ... It's definitely extending and making it more complicated.

Rosalind Koff:
That's why the endless resources is hard because I could spin my wheels all day.

Casey Oberlin:
To keep moving, right? You could spend your money. Yeah, that's not a problem.

Rosalind Koff:
It's hard. What a dream.

Benjamin Wiedmaier:
Into another iteration. So I wonder, we're getting some questions about not so much the specific method, like a survey or usability test, but the level up from that: you want to evaluate a thing, you want to explore the space or potential for a new product line, or you want to discover more about your users. Is there a fit with mixed methods in those buckets that you've found historically? Like, "I really like using a mixed methods or multi-method approach when I'm exploring users more broadly, trying to understand what it means to be a family in 2020, or understanding one's gender identity, or how I exercise my right to blank."

Benjamin Wiedmaier:
You have these stickier, more ambiguous concepts that might not be fully "answered" in a survey. Are there buckets of broad types of questions, exploratory discovery, evaluative that are better suited you've found for working with mixed methods?

Rosalind Koff:
Sure. Want to go?

Casey Oberlin:
I often think about whether you're starting with a what or a when, or a pattern. Are you looking, then, to go from quantitative to a more qualitative approach, which I think tends to be where a lot of folks in the field are? You're looking at, what are the patterns that I want to then be able to dig into further? So there's, what are the levels of trust people might have, or what do different types of groups think about this? Then you want to move into, why do people follow this order or this search process, or what causes them to turn off notifications at that moment? Or how does feeling lonely affect your use of the app?

Casey Oberlin:
So you can imagine a larger question, like what's the role of tech in people's lives, which is a huge question. You're going to have research questions that target different aspects of that. That goes back, I think, to Roz's point of really finding a more satisfying answer, with the methods not necessarily contradicting each other.

Rosalind Koff:
Yeah. I'll lead with qual when I ... If I put pen to paper and I don't necessarily know all the quantitative items I would even want to put in front of people that would give me helpful information, I'm like, "Man, I need to lead with some qual, understand this landscape, get an idea for what's out there and where people's heads are at. And then I can help them validate those things with a larger sample." The flip side is if I have those measures, I run a big quant study, and I'm like, "Wow, I didn't expect this outcome." Then I'll flip back to qual and say, "Where did this come from? Why did you answer this way?" To get some more nuance on it.

Benjamin Wiedmaier:
To your point about unlimited resources, it sounds like then you would go back to a survey to then check those. That's your and my background. We're never done.

Casey Oberlin:
It's never done.

Benjamin Wiedmaier:
You run a 10-year study, and it just gives you 10 more years' worth of studies and questions, which is both frustrating and part of why you're in it. We have a lot of folks who are asking about specific things that you're doing. Folks, we have a case later that we're going to be running through that I think will answer some of your questions around examples. And then we have a lot of questions about, I'm a small team. How do I get started? So don't worry, folks. We will get to those questions. I want to first, however, talk through some of what I'm calling modern mixes.

Benjamin Wiedmaier:
These are best practices or wins that you've had. You've alluded to some of them where you're starting with a broad survey and you see things, and then you're going in to interview. Can you talk about some of the nice mixes that you've got that have combined well?

Rosalind Koff:
Yeah. Go for it.

Casey Oberlin:
Our client was a free healthcare clinic. They wanted to know who was using their services, but, more importantly, they wanted to get at non-users, which for any researcher is really tough. How do you capture who's not using your product or your service, or coming through your door? So we did a large survey of every single patient they had. But we also wanted to make sure we were connecting what we learned from current patients to who might be in their networks but not currently using the service. We then did cognitive interviewing, following up on that survey, going back to some really eloquent patients to ask them, "What were you thinking about when you were taking this survey? How did you think about stigma, or what was its role for you in seeking out free healthcare services?"

Casey Oberlin:
That helped us realize that we really needed more interviews, which was expensive, and we did not initially have that on our research agenda. We were working with a small non-profit agency, and we were a team of three researchers. So we were like, "Okay, how are we going to determine who to target?" We got a lot of information from that cognitive interviewing, asking patients what they thought we should really be asking people in the community. Then we knew who to target very specifically, doing focus groups with the most in-need population, so that we were capturing that gap, or who they were missing. That was a really effective use of a mixed methods approach that we didn't plan on initially.

Benjamin Wiedmaier:
And by cognitive interviewing, briefly, you mean ...

Casey Oberlin:
Typically it's used when you have some kind of survey and you want to understand the actual participants' understanding or interpretation of the questions you have. So you are interviewing them about the survey. It's a very meta moment. I think researchers love it, and you kind of explain to folks, "I just want to know more and make sure that I'm getting it right," which is really the way I usually frame it to the people that I'm talking to. I think most people are frustrated when they're taking a survey. They're kind of in between two categories or they're not quite sure what you mean by a certain question. So I think it's a very effective way, and I've found a lot of success with it in getting at tricky or sticky things.

Benjamin Wiedmaier:
I know this is a challenging question because I know that it is so question-dependent. Your best mix is the one you just used to answer the question.

Casey Oberlin:
Exactly.

Benjamin Wiedmaier:
We had an event here this week with Women in Product, and we had some brilliant product managers up on stage talking about how, again, product managers aren't typically doing research themselves, but they're working with and maybe collaborating with researchers. Someone in the crowd asked, "Well, how do you make the space for research when you don't know what research you might need? You might not know the questions that you're going to get from building this thing and shipping, if this has got a value of ship and learn." I think many of you out there have similar situations: just get some product or something in the hands of the users. And then you find, "We need to actually learn more about ..." One of the product managers said, though, that she builds in some "oh shit" research time.

Casey Oberlin:
Yeah. I love that.

Benjamin Wiedmaier:
It's like I don't know what method we're going to use, but I know we're likely going to need something. It sounds like cognitive interviewing is part of that. We know we want to do a great big survey, however let's build in some time. Again, think about if you're on a small team. You can advocate for this method with your client saying, "it will give you a better ..." Again, they're probably focused on a method they're bringing and like, "We want a great big survey, tons and tons of data from non-users." You go, "Yeah, we're going to definitely do that, however it's going to be better," or, "Yes, and it's going to be much better if we use cognitive interviewing, do some observational studies, use a focus group."

Benjamin Wiedmaier:
If you're someone out there who's saying, "Yeah, I've advocated for mixed methods and I'm banging my head against a wall," use some cognitive judo to say you know what your client wants. They've likely said it in pretty plain terms. How can you use, not just a qualitative method, but how can you use a method they're not thinking about to support the method they really want?

Rosalind Koff:
Right. You lead with kind of the end result. What are the insights they'll get out of this and then walk it backwards in terms of the best way of getting there.

Casey Oberlin:
Yeah.

Benjamin Wiedmaier:
Go ahead, please.

Casey Oberlin:
You're going to just be able to leverage it more then, as well. It's going to be more satisfying in the end, I think, for them.

Benjamin Wiedmaier:
Is there a ... innovative is the worst adjective that I can think of, but it's the only one. When you think about the things you've thrown together, can you give me a creative or unexpected mix that really worked well? For those of you out there, the studio is ... I mean they are our skunkworks. We give them the time and the freedom. You've shipped foam products to people for them to build when you didn't have something, or you've asked them to draw things.

Rosalind Koff:
Yep. We've recently shipped a bunch of Lucite blocks out to scouts to get their sense for where in their world it would be helpful to have a tool that could help them with day-to-day activities. So it was this kind of random form. I think it was actually a picture frame from Amazon that served as this Lucite block. It was just to get them thinking creatively, in their own worlds, about pain points that this fake tool might be able to help with.

Benjamin Wiedmaier:
Yeah. Just for our listeners out there who might be pressing against stakeholders, externally or internally, who don't quite know what qual is, who think of it as interviewing ... I mean the three of us are advocating for these sorts of ... Obviously, we're at dscout, so remote methodologies are always top-of-mind for us. But think about how much you could get if we built a wearable, or gave scouts something so they could build their own wearable, because the customer didn't know exactly if they wanted one or what that thing might look like. We'd have scouts making necklaces and glasses and watches and bracelets.

Benjamin Wiedmaier:
Our team wouldn't have thought of those. I mean I guess we could have spun our wheels, but it's much better to have the would-be users figure that stuff out. So if you're out there, challenge yourself and your team to think beyond interview. Interviews and focus groups are no doubt imperative, and the studio goes out and visits folks plenty, and observational studies are wonderful. There are ways you can challenge your participants, your would-be users, to think more creatively. I don't know if that's bubbling up into anything else that you've done.

Rosalind Koff:
No, it's true. I mean we'll do a lot of journey mapping. My favorite part of every project is the moment where we have the participant actually map their own journey out and take a picture of it and share it with us and the things that they're able to communicate. No skillset of artistry is needed. The things that they're able to communicate with us in that snapshot could not be measured in a quantitative study. It's special. Participants are excited to share their viewpoints. So speaking to that stakeholder who maybe doesn't know where they're trying to get, sometimes doing the work and showing them the value that they'll get out of it is the best way to get their buy-in for future studies.

Casey Oberlin:
Yeah. It's another kind of data too that's not necessarily tied to being verbal. It's some kind of spatial representation that matters to them.

Benjamin Wiedmaier:
Yes. That's a great point.

Casey Oberlin:
So I think that's a way for different scouts or participants to engage who might not be as expressive verbally, but then they can make very detailed, intricate maps or use software to produce things we never thought they would to formalize what they're thinking about.

Benjamin Wiedmaier:
Yeah. To that point, some of our customers report to and work with engineers who, again, have a different worldview. They're working on different questions and problems day-to-day. We ask our customers, "Well, why don't we just ask for screenshots? We'll just get hundreds and hundreds of screenshots of what's going on in moments of bliss and dread." We have stakeholders who then just share them out with the engineering team: "Here are 25 images of the worst moment of the last week using this app or service."

Benjamin Wiedmaier:
It's great for engineers and designers to go, "It's that page that we thought was working really well. Our backend data wasn't showing us any latencies or any error messages." Because users are kind of resilient, and so they're just forging through. But as researchers, it's our job to say, "No, no. Let's pull on that. Let's pull on why this particular screen or this part of the experience isn't working." To your point, Casey, the visual data speaks volumes when you can put it in front of a builder and say, "Well, it's this." Most people are saying, "This is really not good," and here are the words they're using to describe that particular screen's experience.

Rosalind Koff:
Yeah. I ran a workshop earlier this week with one of our customers. It was a room of engineers that we were working with, which is always such a fun group. They all were so bought-in and proud of what they had built, which is awesome. It was a great foundation. But there were some real pain points for users as they navigated through it. Our ability to kind of create materials, get them on the platform, have them watching videos, and seeing from the user's perspective what their reactions were, number-minded people seeing that qualitative work, it really flipped the coin, I think. They walked out excited and motivated to make the changes that they needed.

Benjamin Wiedmaier:
Yeah. And we're hearing-

Rosalind Koff:
Very cool.

Benjamin Wiedmaier:
Sorry about that, Roz. Yeah. We're hearing so much from folks who are, again, I'm seeing ... We have 24 questions, which-

Rosalind Koff:
Dang. Sorry.

Benjamin Wiedmaier:
No. It's awesome.

Rosalind Koff:
We're talking too much.

Benjamin Wiedmaier:
Thanks so much, y'all, for the engagement. I love it when y'all are motivated and engaged and interested in the topic. I wish we had many, many more hours. Many of you are working on small teams, and we've been hearing and writing about democratization and finding cross-team collaborations. I promise we will get to it. I keep teasing, like on ESPN, "Next, we'll talk about why."

Rosalind Koff:
Oh my gosh.

Benjamin Wiedmaier:
Before we get to it, I wanted to talk about a project we did internally, and not just this one, but we can talk about others. But this one in particular leveraged the experience and the expertise of both Roz's background and Casey's to make for what we thought was a really compelling piece. So Zo City, that's right, another nickname. Zo City. Zoe, my colleague and compatriot, will drop the link for this. Tech & Us was a big project we did. We work with a lot of digital companies, folks who are interested in media broadly, and I was hoping we could use this to walk through how we at dscout, our studio specifically, answered some questions in a mixed methods way. So I don't know if you want to give a quick rundown.

Rosalind Koff:
Yeah. Let's do it. So we ran this work last summer. We knew that we had an interest in studying big technology companies and how people felt about them, but we didn't go into it totally sure about exactly what areas we wanted to dig into. So we ran a diary study and learned from participants that they're super interested in how trustworthy big tech companies are and how they're regulated. So we built a quantitative instrument out of that to run against a large non-probability-based sample to validate what we were seeing in the qualitative work. We got that data back and we were like, "Man, what are we seeing? People are super positive."

Benjamin Wiedmaier:
About social media companies-

Rosalind Koff:
About big tech companies.

Benjamin Wiedmaier:
Got it. Okay.

Rosalind Koff:
We were like, "Didn't expect this." We expected some positivity, but not to the extent at which we saw it. Coming from my really annoying, super quant-heavy background where I worked with probability samples, I'm like, "Guys, let's think about the sample composition. We surveyed Internet-only households. Obviously, they're going to be more excited about tech. We've got to fix this. Go to a probability-based sample with non-Internet households. See what those rural, non-Internet users are doing and let's see how it impacts our findings." Reran the work quantitatively once again and our findings barely changed.

Rosalind Koff:
We really saw that people overwhelmingly are fairly positive about big tech companies. While they want some things to change in terms of how they're regulated and the different social responsibilities that might be associated with them, they were really excited. So we went back to qualitative, ran another diary study, and we said, "Hey, this is what we found from this work that we've run. We want to run a few concepts past you to understand how you feel about our perspective on what we're getting from this data." We had some participants kind of poke holes in that framework.

Rosalind Koff:
I think that a study is successful whether or not you have big, revolutionary findings. This is definitely an example where we didn't necessarily go into it planning to have so many stages of data collection, but what came out of it was a lot of learning for us, definitely that ebb and flow, kind of keeping you on your toes. We had that initial qualitative-quantitative pair planned. We did not have the subsequent add-on quantitative and qualitative work planned and, yeah, it was pretty cool. It was a lot of fun.

Benjamin Wiedmaier:
Yeah. You were doing this fairly quickly.

Rosalind Koff:
Oh yeah. This was over probably two or three weeks. We had this-

Benjamin Wiedmaier:
And you were running through-

Rosalind Koff:
... really fun guy, Ben from People Nerds, who really wanted us to get our findings together.

Benjamin Wiedmaier:
Oh boy. Yeah. I do remember, yeah. Similar background to Roz; actually, the three of us all have similar backgrounds. The confirmation, or rather the non-significant results: just, well, there's nothing. I mean there is something, but not the differences or changes that we thought we'd see. Reflecting on it now, for me it was a smooshed design thinking innovation sprint when, just like you said, we had the qual-quant pairing that we really thought was going to tell us, "Okay, we'll see a bunch of negative attitudes in the quant. And then we'll dive deeper with some folks on both sides of the aisle, so to speak, and we'll jump in to see what are those moments and what really resonates."

Benjamin Wiedmaier:
Didn't see anything or, rather, saw something we weren't expecting. Ran another survey and then another qual, and we were sharper. You all were much sharper in what you could ask and when.

Rosalind Koff:
For sure. The picture was much clearer.

Benjamin Wiedmaier:
Exactly. Triangulation.

Rosalind Koff:
Yeah. For sure. This happens with our customers all the time too. We'll set out to do a specific diary project and we'll realize, "Man, we really need to get some IDIs in here. There's something more to be said in this effort." So if I'm talking unlimited resources, I think it's the fun, unexpected stuff that makes those unlimited resources worth it.

Benjamin Wiedmaier:
Certainly.

Casey Oberlin:
But it highlights how quickly you can also do that. There's this sense that you always have to do long, drawn-out processes. I mean it's a lot of human power in that time period, but I think it is something you can see: it's much more robust in a short period of time.

Benjamin Wiedmaier:
Yeah. This is a nice lead into what we'll finish with, folks. I've promised it to you all day. We're going to now talk about how you can start some of the stuff that we've been talking about. We're going to talk about leveraging your partners internally and externally, setting expectations for your stakeholders, and how, again, for those of you who are more qualitatively-minded, you know that you don't want to do 100 IDIs. Depending on whether you're aligning them with a persona or with a product line, or doing them geographically in certain households, you may have a specific number in mind. But five to 10 IDIs with a big dataset, man.

Rosalind Koff:
The things you can do.

Benjamin Wiedmaier:
After the first three, I'm like, "Oh my gosh. I want to run another two surveys that focus in on something I had no idea about." Can each of you start talking a bit about how folks can get started? Let's start with internally. You're at a big or small place and you need to start building some bridges. How can you do this a bit more?

Rosalind Koff:
I mean real talk?

Benjamin Wiedmaier:
Real talk. [crosstalk 00:30:56].

Casey Oberlin:
That's real time.

Rosalind Koff:
Sorry. I was initially a qualitative researcher. I remember the moment in grad school where they were like, "Yeah, but you got to put some numbers to that." It hurt my soul. After learning quantitative work, I'm kind of like, "Okay, I thought I was such a qualitative diehard." Now I appreciate them both. I've kind of just become a champion of the idea that anyone can learn how to do these things. I think researchers, and I'm a researcher myself, view ourselves pretty highly. While we do have lots to contribute to an organization, when you don't have a big team and you want to share the love of research, there are lots of people who can learn some fundamentals of qualitative and quantitative work and get their hands dirty.

Rosalind Koff:
The more people who are thinking with a researcher mindset, even if they're not actually running work, the more they're infusing those things into their day-to-day. In organizations that silo people into "you are only quantitative, you are only qualitative," there's no triangulation going on in the work they're approaching or the way they explore research. So there can be some really big holes in what you're doing. So I feel like organizations, even with a research team of one or none, can really empower people to start thinking about experimentation, bringing creativity to the work they're doing, to better understand users. In the end, it'll give you more reliable data because you're thinking about it with a bigger lens.

Casey Oberlin:
Right. I think there's such an important point about not being precious. It's not only, you have to be an expert, you have to know every single part of one methodology, you only do qual, you don't do quant. I mean there's also something that I've found: people have searched what you have on LinkedIn, or what you have on a random website you forgot even existed anymore, and seen that you did some project 10 years ago. So you have some inkling of what that might look like, and you can quickly refresh some of those skills. I think a lot of the time we feel like you have to have all this experience before you can even say you're interested in participating in one project together.

Casey Oberlin:
I think that's where I've been found by folks, where I'm like, "I don't remember that. I published that article five years ago, so it was really done seven or eight years ago." But this is something that you have experience with. So you could go back to doing deep, ethnographic work even after all of your quantitative expertise. I think that's where being willing to be more vulnerable and saying, "It's been a little while, but I don't need to be precious. I don't need to be perfect," comes in. We can really engage in these questions together and teach each other while we're doing the project. Let's be honest, that's how you learn.

Casey Oberlin:
Every kind of project brings with it new issues that come up, new methodologies you have to kind of tweak. I think that's what's really powerful about just getting started: there's not a perfect starting point.

Rosalind Koff:
Totally. Those of us that are in this conversation right now, we are truly People Nerds.

Casey Oberlin:
Truly.

Rosalind Koff:
We are excessively socially aware most of the time.

Casey Oberlin:
We try.

Rosalind Koff:
We have imposter syndrome. If you are a trained quantitative researcher, it can be really scary to dip your toes into the qualitative pool. I read this awesome article about an organization that actually has all of the qualitative and quantitative researchers sitting on the same team with the same titles, without big distinctions between who they are. That's a world in which you get to collaborate, you get to experiment. It's okay to fail and learn from it and try out different methods. So that's our pedestal.

Benjamin Wiedmaier:
Yeah. That is really important. We're going to be talking next on People Nerds about iterative rolling research, so stay tuned about information on that. How you could build these practices where it's like, "Let's just get out and do some ... Let's just do some research. Just ask some questions and get some answers." We have a lot of folks asking questions about convincing or persuading stakeholders. The two of you have worked with many customers, again, in different parts of organizations, with different levels of expertise and knowledge about research. I'm using air quotes here, what "research" means.

Benjamin Wiedmaier:
How do you persuade or convince them to allow within the work something that's qualitative or quantitative or just mixed in its approach?

Casey Oberlin:
I mean, some success I've had comes from asking, what are you really concerned about? Is it getting an increased budget line? Think about who your stakeholder is: who are they reporting to, or what's the larger context in which they're operating? Because thinking about how that research can help them get to that bottom line faster or more effectively really helps make it concrete in terms of their problems, not coming in with my own "Here's the perfect research agenda that I would do." As a recovering former academic, that is a tricky kind of balance.

Casey Oberlin:
But it is something I've really learned a lot about from Roz and members of the studio: thinking about how we need to really meet our stakeholders where they're at and help them address things and look good in front of their own community, the broader stakeholders that they're reporting back to.

Rosalind Koff:
Yeah. In identifying that bottom line, you're definitely aiming to accomplish whatever goal they have set out, and maybe push the line a little bit more, share a little bit extra that they didn't know they were looking for. But with your researcher brain you can read between the lines and add that offering. And then it sounds silly, but also the delivery of the insights makes a huge difference, I think. Having a compelling storyline, including multimedia as best you can, getting the actual voice of the customer behind it really can have more impact than you would expect and can open the door for continuing work like that in the future.

Benjamin Wiedmaier:
Yeah. There's this great product manager group, Women in Product. They're meetup-based in San Fran but they have chapters all over the US. Check them out. They're doing some really killer work. Another product manager was saying how she gets a lot of pushback. They're like, "Let's just ship, ship, ship. Just get everything in users' hands." She is a design thinker and researcher, so she doesn't call it research. This is Autumn Schultz. Shoutout to Autumn. She calls it early optimization.

Casey Oberlin:
I love that.

Rosalind Koff:
That's so awesome.

Benjamin Wiedmaier:
We have a lot of folks who are saying that they're reporting to, again, PMs, engineers, folks who are more technically-minded. They just want to, "Okay, there's a problem. Let me build it. Let me ship and learn." She makes the case that, "Well, yes. You don't want to do all these engineering hours in this sprint only to have to redo it again two sprints later. Why not take four days or a week? Let me do some light-touch work that involves an interview or focus group." I mean you can do this with a Google form or ask people to email you screenshots. I mean you can be really scrappy with it. And then that can make for such a much more informed roadmap.

Benjamin Wiedmaier:
You might have to call it something else is what I'm saying. You might not be able to say mixed, hybrid, or multi-method research. You may have to get some business idioms out there. Early optimization is one-

Rosalind Koff:
That's a great one.

Casey Oberlin:
Yeah.

Benjamin Wiedmaier:
... that folks are thinking about. Well, we have lots of other questions related to, again, stakeholder pushback, convincing data scientists to work with you. I think one of the theses that we've got here, if I can be so bold, is that there are a lot of misconceptions about ... It sounds like most of you out there are qualitatively-minded, working with folks who may be more quantitatively-minded. So let's take that case specifically. You have someone who says, "Just give me the numbers. Just give me the frequencies on ... Just tell me what is."

Benjamin Wiedmaier:
What are some of the phrases or I mean, again, we touched on them. But what are some of the things that you've done historically or have learned yourself in bolstering qual's role at the quant table?

Rosalind Koff:
Yeah. I ask a lot of questions about what was unexpected in their findings. So you ran a bunch of these frequencies and whatever regressions you're putting together. What popped that you wouldn't have anticipated? Interesting. Why do you think that is? Well, have you considered these angles of it? Wouldn't it be a cool add-on to go to users in this particular group and get a better understanding of why that's happening? I think sometimes in quantitative work it can be really easy to overlook that small subset of people who have something different going on.

Rosalind Koff:
But that can actually be a really impactful part of the people that you're serving. So yeah. I'm a question asker.

Casey Oberlin:
Yes. I think it's a lot of questions and figuring out how you can maximize on that group and really leverage it: how else are you going to explain what's going on here? It's not going to be very satisfying to just present, "These are certain patterns." Everyone inevitably asks, why is that happening? How can we make that better? What is it going to look like two years from now? So I think there's a sense of, how do we really blow that up into a further narrative? You're often left short when you just have a close-ended survey item, even if it's from thousands of people.

Rosalind Koff:
Yeah. But to our point earlier that mixed methods isn't always the best approach: if the answer to that question of what was unexpected is, "This thing, but that's not a group that we're looking to include in this future work," then that's a time to back off. I feel like pushing your agenda on whatever research is to come doesn't give you that continued legitimacy as an expert consulting on it.

Casey Oberlin:
Right. Yeah. There are boundaries around it; it's not always going to be the best tool.

Benjamin Wiedmaier:
There are a lot of other questions about speed, which is another misconception. Doing research "right" very often bumps up against, well, we have this roadmap, or we have this timeline, or the executive team is meeting on this day and that's when they're meeting. So shameless plug, that's when a tool like dscout, which is smartphone-based, can help you. But again, leverage the media. Leverage the tools out there, Facebook polls, or jump into a Slack community. You can be scrappy if, again, it's just-in-time insights. You've got your quant, let's say, but you need some qual.

Benjamin Wiedmaier:
I don't know. Leave your office. I studied flirting as part of my academic research. I was in bars and restaurants and I was literally that guy, "Hey, I just saw you talk to someone. Will you ..." You may, again, only be able to talk to three people, but it will hone you in on some things that will allow you to say to the leadership group or the executive team, "But actually what I'm hearing is this." Yes, they're going to ask you sample size. But drive customer centricity, which is good business, and making fans of your users, which is good business. Be scrappy.

Rosalind Koff:
Yeah. Neither quant nor qual has an insights button from which your findings magically emerge, as much as I would be interested to see one. Picking the right tool, dscout or otherwise, is super helpful; I mean not just the vehicle in which you're collecting your data, but the different menu options you have available to help sum things up, or word clouds of that sort. I think also being thoughtful about your sample, so if I know that I don't have a ton of time to look at my data, I'm not going to do a qualitative study of 200 people. I will only feel bad. I probably won't get that much out of it.

Rosalind Koff:
I won't make it through all the data. I would focus in, as you said, Ben. Pick a sample of 30 and really dig into what I'm seeing with them. Get a sense for the landscape to at least have that initial conversation and maybe pursue more research after that.

Benjamin Wiedmaier:
Yeah. You all do very rigorous research briefs with timelines as to what you're going to do when. I would encourage you to do that with your clients and stakeholders, both internally and externally. Advocate for the methods you want to do. Attach KPIs or discuss how that work will help inform further methods. There have been some questions about lists of mixed methods. My colleague, Zoe, just chatted the IDEO Design Kit, which is this great big collection of methods that are both qualitative and quantitative. They lean more qualitative because ... Roz and I will have to get on another subsequent webinar. We'll just talk correlation, regression, z-tests and t-tests. We'll do structural equation modeling.

Rosalind Koff:
Let's do it.

Benjamin Wiedmaier:
For the 2% of you out there who are like, "Yes, do that, yes," we'll look forward to having you. But yeah. Check out that IDEO Design Kit. You're looking to answer questions. Don't be too prescriptive about which method is best. Just like Roz and Casey have been saying, start with the question, or start with what you're hoping to do. What's the action? Are you trying to iterate on a feature? Are you trying to understand whether or not you need to turn something off? Are you trying to get into wearables? Are you trying to become a brand? Start with that question, and then your methods should follow.

Benjamin Wiedmaier:
As they've said, partner with folks. Lean into the folks that you have in the office who, again, may not have research in their title but are folks who are trained in asking questions and seeking solutions. Thank you again. We're running low on time. I know we've got a lot of questions left. We'll have to follow up with Roz and Casey. But is there any lasting words on mixed methods that we didn't get to, something you want to leave our audience with?

Rosalind Koff:
I would say not to be scared to try them out. Again, speaking to that idea of imposter syndrome, truly experiment even if you don't initially have findings. You'll learn from the opportunity and eventually be able to find the synergy between methods that really works for you and your business questions. That's my biggest thing. I think there's a line in the sand for actually no reason. We could do a much better job of having open conversations about blending our approaches.

Casey Oberlin:
Yeah. I think there's something about not just the team that you're on right now, but the teams you've been on in the past, who you know, other researchers in the broader community: not being precious, being vulnerable, and saying, "This is something I really want to know more about. I've seen you've done a lot of work in this area. Can we chat about it?" Having more of a one-on-one with them about their process can really be a helpful resource. I'm constantly asking folks, "I know you've published this. You did this work last year. You have this big project. How can I learn from that?" The general insights, not the content, obviously.

Casey Oberlin:
But thinking about what are the best ways to move forward. You have a broader team than just the one you're actually employed by.

Benjamin Wiedmaier:
Right. Last year I had an opportunity to speak with the head of research at Airbnb, Judd Antin, and I asked him, "Are you hiring more qualitative people or are you hiring more quantitative people?" He said, "I eschew that entire ..." He's a person who can say something that's rather out there because he's leading a team. But he's leading a team of smart Airbnbers working on a really thorny problem and he said, "I'm not hiring qualitative and I'm not hiring quantitative. In fact, I don't want to hire either of those people. I want people who are flexible and are interested in end goals, are collaborators."

Benjamin Wiedmaier:
He kept stressing how at Airbnb, "We can't hire silos. We need to hire people who are going to break through the lanes." Granted, Airbnb is a place where he might be able to do that. But if you're out there thinking, "Everything's really siloed," just reach out. Challenge yourself to do both. There are a lot of methods that you can quickly spin up and learn. But again, as we've said, you don't need to be precious and perfect. But if you do a few interviews and you pair them with a small survey, man, there you go. You're the next methodologist. Welcome to the club.

Benjamin Wiedmaier:
I know that we can be really ... I like that I have a PhD. Roz has a master's. Casey has a PhD. We enjoy these titles that we worked hard to earn, and along the way I wish I hadn't had to pick a lane, at least methodologically, because whenever I've met with somebody who, at least in my case, was doing qual, I was like, "That's so cool. I wish I would have interviewed people about flirting and hookups instead of just surveying." So press yourself to see the question or the potential solution from different angles. I think that will go a long way.

Benjamin Wiedmaier:
Thank you so much for joining us. This is Roz and Casey's contact info. They have been generous enough to offer their inbox time. If there's something that you want to holler at them about, please do so. I'm ben@dscout. I'm always happy to receive emails. I know that I email many of you each week. Today you'll actually be getting another one. So thanks for subscribing to People Nerds. Subscribe if you haven't already. Tell your friends about People Nerds. Roz, Casey, thank you so very much.

Casey Oberlin:
Thank you. This was just fun.

Benjamin Wiedmaier:
It was a great pleasure to chat for a little bit.

Rosalind Koff:
It's been real, guys. Thanks for having us.

Benjamin Wiedmaier:
Be well, y'all. We'll talk soon.
