Field Reports

How T-Mobile Used dscout to Harness Qual Data on Their Team’s (Lean) Timeline

T-Mobile's Andrea Lindeman takes us through two transformative use cases for longitudinal, in-context research.

Words by Mac Hasley, Visuals by Emma McKhann

Describing research at T-Mobile requires a lot of adjectives.

The user data they need most comes from studies that are mobile, agile, in-context, longitudinal, proactive, and challenging to do efficiently.

But as their UXRs work to incorporate research into fast-moving, fast-building, ready-to-test design sprints, collecting data efficiently becomes imperative for product success.

We talked with Principal User Experience Researcher Andrea Lindeman about how they’ve worked to do thorough research, with highly specific recruits, on a timeline that makes sense for their stakeholders.

Andrea is…

…a Principal User Experience Researcher who has been shaping the course of UXR at T-Mobile for nearly a decade. She ended up in UX after “falling in love with the human side of engineering” and has always been devoted to engaging her stakeholders more directly with their end-users.

User research at T-Mobile is…

…in a word, “scrappy.” They’re invested in building quickly, shipping frequently, seeing “what sticks,” and noting what needs to be repaired. It’s essential they go from “concept” to “data collection” fast. UXRs work to influence designs before they come out (which happens frequently), or to quickly follow up and test once new designs have launched.

Everyone wants to get things out quickly and address problems they’re seeing. But on the research side, that hasn’t always translated well to getting ahead of projects, and being able to influence things before they go out.

Andrea Lindeman

The challenge:

At a company like T-Mobile, mobile research is a necessity, and it often needs to be longitudinal and in-context. But executing mobile, longitudinal, in-context research can be a resource drain.

“Oftentimes, we're interested in very concrete moments that happen intermittently,” says Andrea. “For example, understanding the context and experience of someone paying their bill.”

“Shadowing someone in real life until the moment they decide to pay isn't an effective use of our time. And it's way more useful to see the actual moment than to tell users in a lab 'Let's assume you just got your bill...'"

Time and resource constraints become even more of a roadblock when your company moves fast and is hungry for insights. Product teams work at the speed of agile, and the research team had tried a plethora of tactics to keep up: Google design sprints, pre-scheduled in-person usability sessions on a regular cadence, and longitudinal studies that combined manual diary entries over time with wrap-up focus groups.

“Everyone wants to get things out quickly and address problems they’re seeing,” says Andrea. “But on the research side, that hasn’t always translated well to getting ahead of projects, and being able to influence things before they go out.

“It put us at a disadvantage; because of the pace projects move at, we were mainly coming at things in a way that’s reactionary, rather than proactive. We ended up testing super granular features, and we weren’t able to think at a higher level about why we’re doing what we’re doing.”

“Since we've discovered dscout, we’ve been able to reach a larger number of participants and get more in-context mobile data—and we still get that data super quick. We can also reach out to the same participants over time and follow up post-study as needed. That’s been really valuable for us; ‘how much data can we get before a design comes out’ is a measure of success.”

With dscout, designers and product managers can watch the videos at their leisure, when they have free time. And so they do.

Andrea Lindeman

The solution:

For UXRs at T-Mobile to be successful, they need to recruit a large pool of specific users, gather their feedback quickly, engage their stakeholders successfully, and give “qual” data “quant-like” credibility.

“When we do in-person lab studies, we encourage stakeholders to observe the sessions. There's nothing more powerful than watching a person totally fail your design. But people don't often do it—schedules don’t often permit,” says Andrea.

“With dscout, designers and product managers can watch the videos at their leisure, when they have free time. And so they do. A three-minute video they can watch at their desk is a lower barrier to entry than observing a full session in a lab.”

Beyond accessibility, teams also started to regard research as having more “weight.”

“When you’re testing the usability of a few different designs and asking ‘which design do you prefer,’ it’s hard to make the case based on a lab session with five to eight people,” Andrea says. “You can’t go back to your stakeholders and say, ‘Well, three of five people liked this one, so let’s go with this one.’”

“When you see that fifteen of twenty preferred one or the other and have rich video data to understand why, that feels more reliable.”

With dscout, we can target really specific audiences. I’m not necessarily going to find somebody that’s local in Seattle that uses the T-Mobile Tuesdays app once a week—and I certainly won’t find enough of them to recruit a whole study. But I can screen for that in dscout and get hundreds of participants that are willing to participate tomorrow. Reaching those targeted audiences is really powerful.

Andrea Lindeman

Case 1: Out of the app and in context

When we launched T-Mobile Tuesdays (our rewards program), we had a lot of behavioral analytics data within the associated app. But once people save offers and go to redeem them—they’re out of the application. We lose track of what they’re doing, what happens to them, and what their experience is.

dscout allowed us to really effectively track where our users go and helped us to better understand their experience outside the app. They're able to record their screen and upload what they do—versus being constrained to a specific prototype or application—which was so powerful for us. We could also capture users when redeeming offers in a physical retail store to understand the in-store experience as well.

It was the first full picture that we had of customers redeeming various offers. And naturally, we found the experience wasn’t always delightful. So we were able to then go back to our partners and say, "Hey, the experience redeeming your offer isn't as good as it could be." And we were able to then make some changes on our end as well.

dscout allowed us to really effectively track where our users go and helped us to better understand their experience outside the app.

Andrea Lindeman

Case 2: More effective beta bug zapping

While the T-Mobile app was in its beta period—before a major redesign—we did a dscout study to test it. Participants signed up to interact with the beta over the course of four weeks. We asked them to do some specific tasks, but we also asked them to record their natural usage whenever they went into the app.

If they were going in to check their data allowance, pay their bill, or whatnot, we asked them to upload those moments as well. In this way, we could capture key interactions with the app in users’ own contexts and on their own timing, resulting in much richer and more true-to-life insights than we get in the lab.

Anytime a user submitted a video with a serious issue, I was able to upload it to our project Slack channel and say: “This is a bug.” Watching the videos back, we were able to replicate how the user encountered the problem—and we resolved countless issues that way, in real time, before the new app was launched.

dscout enables us to see users’ experiences once they leave the app or website and interact with another channel, such as in-store. For example, we could see a customer’s experience saving an offer in our T-Mobile Tuesdays app and then trying to redeem that offer in a retail setting.

Andrea Lindeman

The impact:

Relying on remote research at T-Mobile has been instrumental in expanding scope and capacity. With greater access to user data, stakeholders can move on research more independently and effectively.

"Once, I had finished a study and collected the data right before a week’s vacation,” Andrea says “I shared those videos with the team before I left, and by the time I got back, they had already ingested much of the information and started to act on it without me."

As stakeholders’ access to data has improved and their capacity for data collection has expanded, the team’s scope has had more room to evolve. The path forward at T-Mobile has begun to look a lot less “reactive” and a lot more “holistic.”

“Our team is getting more involved at the strategic level and thinking bigger picture—looking at the user experience end-to-end,” Andrea says. “In the past, things have been developed in silos. I think our role now is really stepping up a level and saying, ‘Okay. We have to think about the entire user experience as a whole and see where the pain points are and how we can address them holistically.’”

Before dscout

  • Stakeholders are too busy to engage with lab sessions (and are skeptical of small-sample results)
  • Difficult to test in-app and multi-channel experiences in context
  • Unable to follow up or engage with participants after a study is complete

After dscout

  • Stakeholders can watch short videos with visceral feedback and begin to implement findings immediately
  • More easily test across channels to more fully understand the whole customer journey (e.g., website or app to store)
  • Can source hyper-specific recruits within days and follow up with participants as needed

Mac Hasley is a writer and content strategist at dscout. She likes writing words about words, making marketing less like “marketing,” and unashamedly monopolizing the office’s Clif Bar supply.
