
A Never-Fail Usability Testing Checklist

Make sure every usability test goes off without a hitch. We’ve built a robust checklist to ensure you’re prepared for everything before, during, and after the test.

Words by Nikki Anderson, Visuals by Kate Degman

Usability testing is an essential user research skill, but one that's often taken for granted. These tests may seem simple and quick, but I sometimes find myself tripping over the details.

There have been times when I've needed to quickly put together a usability test only to forget a critical component or skip a step entirely and need to backtrack. It can be frustrating to get something as "easy" as usability testing wrong and wind up wasting a lot of time we don't have.

For example, a few years ago, I was planning a usability test for a team and overlooked a small but crucial part of the setup. I had whipped together the recruitment, screener survey, and a basic script. Since I was manually recruiting, I had sent calendar invitations to participants and colleagues. I always separate these invitations so that my colleagues don't see the participant's email (I play it safe with GDPR).

Unfortunately, I forgot to put links in the calendar invites for both, meaning there was a last-minute rush to get the links to everyone. I also hadn't reminded the designer to make the prototype link accessible to those outside our organization. With these (seemingly minor) mishaps, we lost a few participants and had to extend the deadline, which crashed into another project I was working on.

Since then, I have tried to save myself from these situations by making lists and templates. I created a usability testing checklist for myself and my colleagues as a fool-proof way to ensure every usability test goes smoothly and that no steps are left out of the process (unless intentionally so).

The checklist (+ a template)

After many scribbled-down notes and ad-hoc checklists, I managed to wrangle everything into one master checklist. This list takes you through the ideal planning stages of a usability test, from a few weeks out to the day of, and also covers post-test ideas.

Keep in mind that you may not include every step in this list, depending on your organization and how you run tests. If I’m missing a necessary step for you, please feel free to email [email protected] and send feedback!

I broke the steps down into different timeframes to be as proactive as possible while planning. Additionally, this checklist targets remote moderated studies, but you can easily adapt it to unmoderated or in-person tests. I'll leave some hints about where the list may diverge for in-person tests.

Before the usability test

You know what they say: proper prior planning prevents poor performance. The alliteration may be a bit old school, but the sentiment holds a lot of truth when it comes to planning a usability test.

It may seem excessive, but taking the time to really plan out each stage weeks, days, and hours before the test can be a huge time-saver for your future self. Here's how I recommend you prepare for each checkpoint:

Two to three weeks prior

Before the test, you will be doing a lot of pre-work. This work includes a huge chunk of the tasks that set your usability test up for success, so I definitely recommend trying to get it done three weeks before the test.

Define the goals and objectives

Use a research plan or brief to define the goal and objectives for the study. This step ensures the study's goals are clear and that they align with usability testing as a methodology. Be sure to take some time to align the team on the research questions and expected outcomes.

Consider the type of test

Decide on the type of test you need to run to get the best results. For instance, will you be running an unmoderated or moderated study? Will the test be remote or in-person? At this step, make sure that usability testing is the right fit.

Determine the logistics of the test

How long will each test be? How many participants do you need? If you are doing a remote study, decide on the technology you will use for the session and the debrief (ex: Zoom and Miro). If you are conducting an in-person session, start thinking about directions to your office and booking rooms. Finally, if you are running an unmoderated test, decide on the software you will use to run the study.

Decide on the participants

Decide which participants you want to speak to during the test, and determine whether there are specific criteria they need to meet for you to get the best information. For example, if you are looking for participants who are serious about purchasing a car, you should screen for this behavior.

Create a screener survey

Build a screener survey that allows you to recruit your specific participants. After you write the survey, share it with your colleagues to make sure you didn't miss anything.

Begin recruitment

Start recruiting about two weeks before the test begins using your screener survey. Just as a note, you won't start recruitment for unmoderated tests until you set up the tasks in your software. For moderated tests, you can break this step down into four smaller tasks:

  1. Figure out how and where you will recruit. Will you use an agency or email participants individually? Or will you post on LinkedIn, Facebook, Reddit, etc.?
  2. Think about how you will incentivize participants. Will you be able to offer them a gift card, discount, or raffle entry?
  3. Start sending out or advertising your recruitment—don't forget to include your screener survey. If you aren't using a tool like Calendly, always make sure you give date and time options (including time zones).
  4. Always plan a backup participant (or two) to ensure all the slots will get filled, even if someone drops out.

Review the prototype or concepts

To get a general idea of what you will be testing, review the prototype/concepts. If the prototype is not ready yet, have the designer sketch some ideas or a general user flow. This review will get your brain working on the discussion guide and flow of the session.

Start the discussion guide

If you have enough information to do so, begin writing up the introduction, warm-up, tasks, wrap-up, and thinking about the flow of the session.

Put blocks in everyone's calendars

Set calendar blocks for when the tests will be occurring, even if you don't have exact dates and times. I always encourage colleagues to be accessible on the days we run the tests and tell them I will have more specifics as we get closer.

If in-person, start to book rooms

Do this to make sure you will have space for yourself, a notetaker, and an optional observer. I usually encourage only two people besides the participant in the room; others can observe from another room via online recording software.

One week before

The week before the test is all about confirmation and double-checking. Looking at previous steps and confirming with colleagues is essential to ensure everything goes smoothly, so take the time to do some of these smaller tasks, even if they feel mundane. I always take one or two afternoons before the test to make sure I have everything ready.

Review the prototype

Go over the prototype with the designer and the product manager. Ideally, the designer finishes the final prototype the week before the tests begin; this gives you the space to create a great discussion guide. If the prototype isn't done two days before the tests, I will cancel the study or try to use the participants for a different test.

Write the tasks

Jot everything down in the discussion guide, including the flow of the study. For instance, if you have several prototypes to test, make sure you vary the order. I always write which participant is getting what order right in the guide. For example:

  • Participant 1: A, B, C
  • Participant 2: B, C, A
  • Participant 3: C, A, B
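The rotation above is a simple Latin square: each prototype appears in every position exactly once across participants. As a quick sketch (Python purely for illustration, not part of the original workflow), you could generate the orders programmatically instead of writing them out by hand:

```python
def rotated_orders(prototypes):
    """Return one task order per participant, rotating the list by one
    position each time so every prototype appears in every slot."""
    n = len(prototypes)
    return [prototypes[i:] + prototypes[:i] for i in range(n)]

# Reproduces the example above:
for participant, order in enumerate(rotated_orders(["A", "B", "C"]), start=1):
    print(f"Participant {participant}: {', '.join(order)}")
# Participant 1: A, B, C
# Participant 2: B, C, A
# Participant 3: C, A, B
```

With more participants than prototypes, you can simply cycle through the same orders again so each ordering gets roughly equal coverage.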

Start scheduling participants

Schedule everyone into the calendar. Send them a calendar invitation that includes:

  1. The date and time of the test
  2. The location (in-person) or link to the session (remote)
  3. If in-person, detailed directions to your office
  4. Your phone number (if you feel comfortable)
  5. An easy way to cancel or reschedule (if possible, such as through Calendly)

Create a debrief and synthesis board

Do this to ensure you are ready to start debriefing right after the first session.

Invite colleagues to the test and debrief

As mentioned, I always use separate calendar invites for colleagues. I also "bake" the debrief time into the session, so if we have an hour session with the participant, I send an hour and a half calendar invite to include the debrief. In the calendar invite, I include:

  • The session location (room) or link to the session (remote)
  • A reminder to mute and turn off their video
  • A link to the debrief session (ex: Zoom), and the debrief board (ex: Miro)
  • A basic breakdown of the session

Review and finalize the tasks

Collaborate with colleagues and get feedback. Make any necessary changes to the tasks and finish the discussion guide.

Create a sign-up sheet to determine roles for each session

This sheet will be the master record of the line-up, the roles everyone will assume (moderator, notetaker, observers), and links to everything. It can also be helpful to send a note-taking template to the notetaker and review best practices.

Send out best practices to colleagues

This is especially important for colleagues who haven't done usability tests before. The list can include when to sign into a session (hint: 5 minutes early), reminders to mute, and how to send questions to the moderator.

Make sure you aren't conducting too many tests in one day!

One to two days prior

Depending on whether you're doing a remote or in-person test, this work can vary. If you are conducting in-person usability tests, there may be a little more work involved:

If relevant, send out confirmation emails

Send these out to participants. Certain cultures aren't used to receiving confirmations, so think about that before sending. You can also email out any directions detailing how to run the software you’ll be using.

Get passwords or enable access

Make sure you give participants access to the prototype.

Do a dry run

Test everything out the day before to make sure everything goes smoothly and, if necessary, you can make changes.

Set up, send out, or print the NDA and/or consent form

Depending on whether the test is in-person or remote, and on how your organization works, you can send this out beforehand or have it ready on the day.

Make sure incentives are ready

This might mean ordering gift cards or creating discount codes ahead of time.

Double-check the calendar invites

Ensure all the necessary information is in there and all links work.

Determine how you will greet the participant

If you are doing an in-person session, prepare so you start it off on the right foot.

Day of the test

Finally, the day comes and, fingers crossed, the internet works! This list walks you step by step through what to do before the participant arrives, while you are testing, and directly after. This stage varies a lot depending on whether you are doing an in-person or remote study.

Remind everyone of the session

Check in with your colleagues, especially the notetaker, and confirm all the roles.

Sign in to the session early

If you are conducting a remote session, sign in early and make sure everything is working, especially your internet connection. Test your screen sharing, microphone, and video.

Get snacks and water

Prep the room with snacks and water if you are doing an in-person test. The snacks are primarily for your team's debrief.

Turn off all distractions

Before the test starts, make sure the room or your computer is distraction-free: mute messaging channels, close software, and put your phone away. Remember to give colleagues a way to send you questions.

Walk them through signing the NDA/consent form

Before you start recording the session, make sure everyone understands and signs the NDA/consent form.

Start the screen recording!

It is funny how many times I've skipped this step by mistake.

Get testing!

Go through the test and have fun!

One day after

The test is over, but there are a few more tasks left:

Send incentives to the participant

Try to do this within 24 hours, or ideally right after the session (if you have time).

Send a thank-you email

Send a thanks to each participant, regardless of whether they were a good or bad participant.

Ask participants if they'd be interested in another session

Do this if you felt some or all of the participants were excellent. Keeping track of good participants is a great way to start building a panel!

Check out the checklist template here.

Set yourself up for success

Although it can be easy to leave some planning for the last minute, especially if you are juggling multiple projects, completing this checklist will set you up for success. Taking the time to go through these tasks will help ensure your team gets what they need, participants are comfortable, and you, ultimately, are less stressed.

Nikki Anderson is the founder of User Research Academy and a qualitative researcher with 8 years in the field. She loves solving human problems and petting all the dogs. Explore her research courses here or read more of her work on Medium.
