Train Your Team to Usability Test in Just an Afternoon
Teaching your team to do tactical research can be time-saving and career-changing. Here's an outline for an interactive workshop you can run today.
We talk a lot as researchers about evangelizing the role of research—but teaching others how to conduct it can be an even more important tactic.
Often, we’re asked to juggle multiple projects at once and have to deliver reports quickly. When you’re on a small team, a resource-strapped team, or a research team of one, this can be tricky. The last thing we want is to compromise our results because we were rushed.
As companies come to rely more heavily on user research, we have to keep up with the growing demand. The best way I’ve found to do this is to train the most relevant people on my team to conduct research sessions independently. I’ll host training sessions for designers, product managers, and, if they’re interested, developers or data scientists.
These sessions break down how to conduct more tactical research, such as usability tests. That way, instead of fielding every usability test the team needs, I can pour more of my focus into strategic research.
I never force colleagues to come to these sessions; they’re based solely on interest. Surprisingly, I always have a good turnout and a significant number of people excited to learn.
I have a tried-and-true workshop approach when teaching others how to conduct more tactical research sessions. In general, I always try to have the following agenda and goals in mind:
- Educate your audience
- Explain the craft
- Practice a session
Part 1: Educate your audience
Be sure to cover...
- What is user research?
- What is usability testing?
- Why is usability testing important?
I start each of my sessions by explaining what user research is. Even if I have more experienced colleagues in the room, I always act as though I am explaining user research to people who have never heard of the concept before. Giving context on the discipline is key to ensuring everyone understands the methodology at hand.
For reference, here’s my “layman’s user research” definition:
User research is the ability to connect with your users on a deep level. It goes beyond your product and leads to a true understanding of, and empathy for, how a person thinks, feels, and interacts with the world.
Of course, showing is always better than telling. At this point in the workshop, I include video clips of our users sharing a surprising perspective or thought process. Since I currently work in the travel/mobility space, I’ll include tidbits on why people travel, and the stressful moments they encounter along their journey. When possible, I walk the group through visuals such as personas, journey maps, or Jobs to be Done timelines. These documents help the group gain a perspective on who our users are and what knowledge we currently have.
After I give this brief introduction, I take the time to dive into the purpose of the meeting: usability testing. As was the case above, the best way to explain usability testing is by showing examples of these sessions. I’ll pull clips of users who are frustrated, or failing to accomplish a seemingly simple goal. I show different hacks users employ to get to a desired outcome. I then give a simple outline of what someone could expect from usability testing.
Usability testing will tell us:
- Whether or not a user can use a product for its intended function
- Whether or not a product allows a user to reach their goals
- How a user uses a product, separate from how we think a user should use a product
- How a product functions when placed in front of a human
- Where bugs and complicated user experiences lie within a product
Usability testing will NOT tell us:
- The emotions a user is feeling outside of their immediate actions
- Statistically significant quantitative data on usage patterns or trends
- Preferences between two versions of a design (e.g. A/B testing)
- The desirability of a product
- Complete market demand and value
The core thing to clarify is: a usability test is not used to see how people feel about design, whether they like it, want to use it, or would share it with others. The point is to understand where the interface causes frustration and barriers that impede users from achieving their goals.
Once people understand what a usability test is, grounded in real examples, I turn to the most critical part of the education process: why are usability tests essential to run?
Only when people see the value of doing something will they put time and effort into doing it. I try to relate this portion to whichever roles I am speaking to, but in general, I highlight the following advantages usability tests offer:
- Receive tactical feedback directly from your target audience that can be used to make immediate improvements
- Solve internal debates on "what should we focus on next" with direct feedback on pain points
- Highlight issues and problems before a product or feature launches
- Increase your likelihood of acquisition and retention (and, thus, revenue)
- Minimize the risk of product failure
- Increase the probability of user satisfaction
Part 2: Explain the craft
Be sure to cover...
- How to run a usability test
- Writing a script
- Writing tasks
By this point, your audience has bought into the value of user research and usability testing. After setting the initial foundation, I take the group through how to put together a usability test from start to finish.
First, I ask them for an idea of something they would like to test in the future. A ubiquitous example I get, regardless of the company, is testing the checkout funnel of a product. This is actually a great, relatable, concrete example.
With our aim established, I bring the group through my process when faced with a usability testing project:
- Putting together a basic research plan
- Writing tasks and questions
A research plan explains the research initiative taking place. It serves as a kick-off document for the project and boils everything down to the goal of the research. It’s a great way to teach people how to think through a usability test. When explaining research plans, I cover:
- Project background: a few sentences on what the research is about and why we are doing it
- Objectives: what we’d like to learn during the project
- Participants: who are the participants we’re recruiting, and how we’re screening them
- Methods: which research methods we’re using
- Interview guide: a cheat-sheet containing topics/questions to follow during the interview
Instead of merely giving my examples of each, I take the group through the definitions of each section and give them 3–5 minutes per area to brainstorm their own. This allows them to learn by doing and makes the presentation much more engaging.
The areas I take the most care to teach are creating tasks and framing unbiased questions, since those are essential skills and common sources of error for non-researchers.
A task is essentially asking a user to complete an action on your product. Each task should stand alone while still feeding into a longer flow. To figure out the most critical tasks, we have to ask ourselves:
- What are the necessary actions every user must be able to accomplish on this app/website/product?
- What are the main goals of most of the users visiting this app/website/product?
- What tasks do they need to do to accomplish those goals?
- What are their motivations for visiting the app/website/product?
Equipped with this information, we can start writing them out. There are three tips I give to help with writing tasks:
- Make the task as realistic as possible
- Make the task actionable
- Avoid giving clues and over-describing tasks
Instead of immediately having the group write tasks for the problem at hand, I prepare some "bad" tasks. Together, we turn these into better tasks and discuss why the initial statements aren't ideal. Here’s an example exercise I use:
- Poor task: Rent a jeep.
- Better task: _________
- Poor task: You are attending a conference out of state. Go to www.airbnb.com and show me what you would do next.
- Better task: _________
- Poor task: You were charged incorrectly for an Uber ride. Open the Uber app, tap help and show me how you would request a refund.
- Better task: _________
After this exercise, I give the group 5–10 minutes to brainstorm individually some tasks they would ask.
Part 3: Practice a session
Finally, the fun begins! After all of this learning and brainstorming, I show a quick 10-minute clip of me conducting a usability test with a user (usually an internal recording specific to this training). I then list out some of my moderating best practices:
- Spend much more time listening than talking
- WATCH the user and observe what they are doing
- Never interrupt!
- If the user is struggling with a task, wait a few minutes or until the user is very frustrated before moving on to the next one
- Don't lead the user to the answer (be careful with word choice)
- If the user asks you a question, repeat it back to them
Again, doing is learning, so I ask the group to conduct a practice session. Each person pairs up with another, and the pair chooses one moderator and one participant. The moderator brings the participant through the usability test they brainstormed earlier for about 15 minutes. After that, the participant gives the moderator feedback for about 3–5 minutes. Then the pair switches roles. After each person has had a chance to be both moderator and participant, I take time to discuss the activity and answer any questions.
Overall, by completing these activities and brainstorms, colleagues are much further along in learning how to conduct usability tests. After I host these sessions, I always give attendees the option to practice one-on-one with me before branching out to conduct sessions on their own.
Nikki Anderson-Stanier is the founder of User Research Academy and a qualitative researcher with 9 years in the field. She loves solving human problems and petting all the dogs.
To get even more UXR nuggets, check out her user research membership, follow her on LinkedIn, or subscribe to her Substack.