
Getting Introspective with Usability Testing: Using dscout Express to Improve dscout Live

Sometimes there's no better way to test new features than to use the very product itself. Lead Product Designer Alfo Medeiros shares his approach.

Words by Alfo Medeiros, Visuals by Jarred Kolar

When the time came for dscout's team to test new product features, Lead Product Designer Alfo Medeiros knew exactly what to do. It started with the very meta decision of using dscout’s fast-feedback tool, Express, to research dscout’s moderated interview tool, Live.

In particular, he wanted to validate the new placement of several UI controls and in-call features on the Live product. He had several hypotheses on how to improve the discoverability of Live's in-call controls. But the first step, as usual, was to conduct some research. And as Alfo put it, "What better way to do it than with our own tools?"

Alfo used Live and Express Usability Testing together to improve the Live in-call experience for dscout's customers. The combination of tools seemed like a great balance between speed and accuracy for this particular case.

Below, Alfo dives into what the experience looked like and what kind of impact it ended up having.

The approach

Why Express? Why was this tool a great fit for this project?

As a designer, I find usability testing is usually the best approach when it comes to validating concepts, understanding user behavior, and identifying issues that stand in the way of user satisfaction and efficiency. Now that Express allows me to do it, I try to use it as often as possible!

Who did you choose to recruit?

Before we jumped into validating these UI updates through customer feedback, we wanted to test them internally first. I was particularly keen on ensuring that our internal testing panel was as diverse as possible, covering a wide range of experience with the tool.

It was important to me that we included both “Live-pros” (seasoned experts on our platform) and folks who had minimal interaction with the tool and could provide fresh perspectives.

I pinged our main dscout Slack channels asking for volunteers and found a number of folks who were interested!

As a result, we assembled a group of nine from three different departments:

  1. User Experience Research (UXR)

  2. Product

  3. Sales

This panel allowed us to gather a great spectrum of insights and feedback, ensuring that the UI updates were seen from multiple angles before being presented to our customers. Our next step is to test with users to understand their adoption of the new UI!

What kinds of questions did you ask?

This is where things get very meta. Initially, we asked the participants to step into the shoes of a researcher. Their task was to conduct a mock Live call where I played the role of a Scout (dscout research participant). Though it was a mock Live call, each task we asked the "researcher" to complete was delivered through an Express mission.

This additional layer was designed to help us understand the experience from the other side of the interface. The usability mission was structured to gather detailed feedback by asking very specific questions aimed at assessing the effectiveness of my decisions to reposition critical controls within the interface.

These controls included functionalities essential for the Live experience, such as screen sharing, note-taking, and using the chat. The goal was to gain a nuanced understanding of the impact these changes had on the overall experience, identifying both the strengths and weaknesses of these modifications in the context of real-world application.

We asked questions like the following in Express, for folks to carry out during the Live call:

  • "During the call, you want to confirm you're using the correct microphone and video inputs. Please show how you would find this information."

  • "Now, ask the Scout to share their screen. How does this experience feel? Is there anything unexpected about it?"

The execution

How was your experience managing the participants?

Like every other time we invite folks to test our own products, everyone was genuinely excited to participate and signed up right away to help improve Live.

This project was a special case because I was actually "the Scout" in their Live call while they were completing the Express mission, so participants were reading the prompts in the moment and I could see their reactions in real time.

So before even jumping into analysis, I already had a very clear idea of how [participants] were feeling about the UI changes.

How was the analysis experience? What were your key learnings?

We had SO much rich information after these sessions! Not only could I create great playlists with moments of the Live calls, but I also had a straightforward and simple analysis with Single Ease Question (SEQ) thanks to the Usability mission.

It was super rewarding to find out that the majority of our hypotheses held true! The repositioned actions proved to be significantly more effective than their predecessors, enhancing the overall experience.

We observed a notable improvement in both the discoverability and user-friendliness of nearly all the features within the interface. This validation of our design choices underscored the success of our approach, confirming that the adjustments directly contributed to a more intuitive and efficient interaction for our users.

The impact

How did you share out your findings?

We held very quick and smooth meetings with stakeholders to show them the conclusions and highlights of my playlist, and the insights were received extremely well! The data was there and presented very clearly.

In terms of the shareout itself, I like to put together a presentation that takes no more than 20-30 minutes and split it into four or five blocks that look like:

  1. Hypotheses

  2. Final insights

  3. Small clip supporting them

As a result of that, we had a clear understanding of next steps and how to apply and deliver the updates to the UI.

How did your learnings affect the Live tool?

Every single new feature or big change applied to the platform was an informed decision. Thanks to this and other studies conducted over the course of four months, we were able to take feedback and make a direct impact on our product.

What’s next for this project?

We’re developing what we internally call “Live 3.0.” We are working on the biggest changes the Live UI has ever seen, and we are currently studying them with both internal folks and customers.

Anything else you’d like to share? Any advice for readers running a project like this?

For me, this project really emphasized the importance of dogfooding and empathy building. This meta approach was a fun way to dig deeper, really get into the weeds of our tools, and focus on making better products for our users. The true power behind the set of tools dscout offers is the ability to mix them in any way you can imagine. For me, this time it was Usability Testing and Live, but I highly encourage everyone out there conducting research to get creative and mix methods as much as they can!



Alfo is a Lead Product Designer on dscout's Product Team.

Kris is a content creator and editor based in Chicago.
