How Relish Works Used dscout to Address the Rapidly Changing Restaurant Industry
The restaurant industry is complex and continually changing. Here's how Relish Works and dscout teamed up to better understand the food service employee experience—through 400 rapid-fire interviews.
Relish Works is an innovation and strategy studio helping restaurants meet the moment presented by an ever-changing landscape.
Given the restaurant industry's unprecedented and ongoing changes—from pay equity and schedule flexibility to critical safety and health concerns—the Relish team wanted to know what opportunities and challenges operators should prepare for.
Innovation Studio Director Kate Micheels and Design Director Fernando de Buen Lopez conducted research to understand the food service employee experience, capturing moments of life from the folks who make the industry possible.
Conducting accurate research for those goals was no small task. The Relish Works team strives to match methods, data, and insight-type with their stakeholders, which impacts everything from their recruitment strategy to secondary research. And the restaurant industry is diverse and nuanced, ranging from hyperlocal, neighborhood-only brands all the way to multinational conglomerates.
In the following case study, Kate and Fernando describe how their team translated an opportunity facing the entire industry—the crisis of labor—into a quick-turn mixed-methods study.
The process involved 400 two-minute interviews that snowballed into in-depth insights, plus the opportunity to cross-pollinate information with different teams.
Read on to learn about their approach, findings, and next steps.
dscout: What was the impetus to launch this research?
Fernando: The industry was and still is facing a complex set of issues around staffing. Recruitment, retention, and development were on the minds of operators before the pandemic, which turned the heat up on all of this.
Most immediately, the restaurant industry's unemployment rate was double the national average, and operators were asking what needed to change so that they were ready for whatever the new normal of food service was.
Our team had access to more information about the operator/owner side of things: the operational, financial, and strategic sides of running restaurants, and how those were impacted.
Less was known about the experience of the people serving, cooking, delivering, and taking orders—the folks making it all possible. We wanted to provide that POV and narrative before we made any recommendations.
Kate: I want to mention representation and context here, too. Our team felt that surfacing the perspectives of the worker would clarify and add nuance to the splashy numbers and narratives floating around the industry.
A stat like "50% of food service labor has left the industry" can't be interpreted fully; it lacks comparisons and a complete picture. We wanted to help our partners understand the "why" behind the "how" and "what" data they were using to make decisions about their business.
How did dscout fit into your research plan? Why choose it?
Kate: The scout panel is full of consumers who really represent the folks comprising this industry, especially from the worker side. We wanted to quickly recruit a national sample of industry workers across restaurant type, work mode, and experience. dscout's panel made it easy to choose, as it augmented our secondary work we were already doing to unpack the operator side of things.
Fernando: To that I would add the ability to collect quant and qual data—at relative scale—in the same study. This industry moves fast and is always looking for quick wins. dscout's ability to give us moderated interviews and a survey without a separate recruit was a nice fit for our needs here. The tool gave us the chance to conduct a relatively quick, mixed-methods study with a national sample. When we added up what we needed and the timeline, dscout just fit really well.
We used the Express mission to capture several hundred responses, branching folks by a few key questions:
- How long had they worked in the industry?
- Were they still working in the industry?
- If they'd left, how recently did they leave?
We also dug into the motivations for remaining or leaving, and the context around those motivations. Was it pay, schedule, health, family reasons? The Express mission let us start wide and get a big sample to etch the boundaries of what might be going on.
And even though I knew the video responses would be powerful to our team, I didn't quite grasp the combination of authenticity and scale. We collected over 800 minutes of media from that survey alone!
That became the core of our analysis—we started tagging, thematizing, and converting what we could into quant to determine the stickiness of what we were seeing. We could use the open-ends and the closed-ends to add framing and context. It was a trove of data to get us started in a short amount of time.
In a way, the Express mission was like having over 400 two-minute interviews. Our team approached the analysis of these shorter "interviews" much the same way we would longer interviews: pulling quotes, tagging, and using post-it notes to affinity map themes.
That focus—the two-minute time limit—helped participants laser in on the most important or top-of-mind element of their decision to stay in or leave the industry, which really helped our team during analysis and synthesis. It was a different dynamic compared to fewer, longer interviews, but we got the data we needed—we got a lot of depth within a single, specific area. It was very interesting—a different way to do a similar type of research project.
Kate: And as this tagging and top-level analysis was happening, we were flagging scouts who we wanted to follow up with in a longer interview—folks who had particular stories, worked in specific segments of the industry, or who were sharing new insight we thought operators should hear. Yes, we were analyzing data, but we were also recruiting for the next phase.
Did other teams hop in or check out the responses?
Kate: Some consultancies hop around from industry-to-industry, whatever the client is in a particular week. We're different because we're so focused on a single industry. What that means here is that other teams, not working directly on our project, can also find value in the data we were collecting.
So other teams might have peeked at some of our analysis boards, grown curious about what data we were using, and checked out the videos for themselves. And even if the videos and open-ends didn't directly address their project goals, they could inform, shape, or sharpen their designs because they had something to build on.
It helped them narrow in on what they should or should not focus on for their work. For some teams, the work was interesting, but not immediately useful, so they'll bookmark it and come back to it.
The cool thing about the scale and mixed-methods nature of the data we collected was the repository element: as we socialized our findings internally, other teams wanted to check it out, which likely impacted their own projects and work.
As a company it's important that we stay connected in some ways to other work going on, because it's all in the same industry, so there's a benefit to cross-project collaboration and visibility.
Fernando: Yeah, other teams were looking at our tagging and thematic analysis structures to inspire their own. And I don't think they used the same tags—they had different objectives—but that cross-pollination was inspiring and helped other teams in their analyses.
Having all of this data on a single platform is a huge value-add for our team. We were regularly saying things like, ‘Hey go take a look at the video response. This is your data as much as ours.’
The data aren't stuck in the platform—we can move them in and out. That ended up accelerating a lot of processes for us, because it gave us different data to inform and feed into ongoing pieces we knew we wanted to create. It's collaborative and flexible.
Kate: I was just in a meeting with another leader from one of our other ventures who had a question and it had some overlap with this work, so I shared the link. Even in those early, developmental stages of the work, being able to share a key user insight goes a long way.
So after the survey, you went deeper with a few specific workers?
Fernando: Yeah, this project was different from typical interview studies, where we might not know very much about a participant before sitting down with them—here, we had a lot more context going in.
As a result, our interview guides could be much more tailored and focused, and we were able to maximize their time and ours, getting to some core elements of their experience in this industry.
It made for a richer, deeper conversation and I felt like the participants had a better experience, too. They knew more about the general goals of the study and why we were curious to sit down with them.
Overall, the process of launching the Express mission and then the Live ones helped us cater our interviews more to the specific person and their story.
We had a few other team members on the interview at once, but we wanted to limit it for the sake of the interviewer. The platform worked really well—the observers could invisibly message the interviewer if there was an area they wanted to hear more about. It was easy to coordinate with a few teammates, share a link, and co-experience the sessions together.
Kate: And during that part of this study, I was reflecting on ethnography's evolutions and what it looks like in today's hybrid and remote world. I remember doing two plus hour interviews in someone's home, setting up equipment while trying not to disturb their living spaces, then sitting down and trying to build rapport, and I just think there's a different expectation in a remote setting.
I think the scouts in this study appreciated how to the point we were—we allowed them to meander and wander if the moment called for it, but we were able to hone in on the two or three things we wanted to unpack that they had said in the Express mission. In person, we would have spent a lot more time warming them up, building trust, and then hoping to get to the heart of the matter.
Fernando: I think that's spot on because actually, at that point, they all know why they're there. They've already talked to you in some ways. They're like, ‘Oh, you're talking to me about this thing that I told you.’ And so it's like that context set up has already happened.
Finally, any advice for someone looking to replicate this approach?
Fernando: Consider the critical information you need. For example, how much demographic data can you tune on the backend and what can that do to the questions you have scouts answer? Does it make sense to filter folks or just program another mission for a specific subset of folks?
Those questions will go a long way toward maximizing the scout's time and getting your team quality data that matches your goals.
Kate: Even if you've used a tool several times, there's always a surprise or two in store. You just have to get into the platform and test things out—determine what might replace an activity and what might complement one.
My advice is to not over complicate things. Don't try to do too much all at once. As someone in the agency world, I know the value of maximizing research, but don't sacrifice ease of use for expediency. Research is iterative and is often better when one project informs the next.
Fernando: One other thing I'd add is to sign up for dscout as a scout. Try this as a participant. Viewing research questions and activities from the participant side is a great way to understand the ways it can work best for your needs.
Experiencing the study from that side gave me empathy that I used when programming questions and activities. I think our research was better because I knew what it felt like from that side of it.