July 30, 2023
Usability testing is an essential user research skill, but one that's often taken for granted. Usability tests may seem simple and quick, but I sometimes find myself tripping over the details.
There have been times when I've needed to quickly put together a usability test, only to forget a critical component or skip a step entirely and have to backtrack. It can be frustrating to get something as "easy" as usability testing wrong and wind up wasting a lot of time we don't have.
For example, a few years ago, I was planning a usability test for a team and overlooked a small but crucial part of the setup. I had whipped together the recruitment, screener survey, and a basic script. Since I was manually recruiting, I had sent calendar invitations to participants and colleagues. I always separate these invitations so that my colleagues don't see the participant's email (I play it safe with GDPR).
Unfortunately, I forgot to put links in the calendar invites for both, meaning there was a last-minute rush to get the links to everyone. I also hadn't reminded the designer to make the prototype link accessible to those outside our organization. With these (seemingly minor) mishaps, we lost a few participants and had to extend the deadline, which crashed into another project I was working on.
Since then, I have tried to save myself from these situations by making lists and templates. I created a usability testing checklist for myself and my colleagues as a fool-proof way to ensure every usability test goes smoothly and that no steps are left out of the process (unless intentionally so).
After many scribbled-down notes and ad-hoc checklists, I managed to wrangle everything into one overarching checklist. This list takes you through the ideal planning stages of a usability test, from a few weeks out to the day of, and also covers post-test ideas.
Keep in mind that you may not include every step in this list, depending on your organization and how you run tests. If I’m missing a necessary step for you, please feel free to email peoplenerds@dscout.com and send feedback!
I broke the steps down into different timeframes to be as proactive as possible while planning. Additionally, this checklist targets remote moderated studies, but you can easily adapt it to unmoderated or in-person tests. I'll leave some hints of where the list may diverge for in-person tests.
You know what they say: "Prior proper planning prevents poor performance." Though the alliteration may be a bit old school, the sentiment holds a lot of truth when it comes to planning for a usability test.
It may seem excessive, but taking the time to really plan out each stage weeks, days, and hours before the test can be a huge time-saver for your future self. Here's how I recommend you prepare for each checkpoint:
Before the test, you will be doing a lot of pre-work. This work includes a huge chunk of the tasks that set your usability test up for success, so I definitely recommend trying to get it done three weeks before the test.
Use a research plan or brief to define the goal and objectives for the study. This step ensures the study's goals are clear and that they align with usability testing as a methodology. Be sure to take some time to align the team on the research questions and expected outcomes.
Decide on the type of test you need to run to get the best results. For instance, will you be running an unmoderated or moderated study? Will the test be remote or in-person? At this step, make sure that usability testing is the right fit.
How long will each test be? How many participants do you need? If you are doing a remote study, decide on the technology you will use for the session and the debrief (ex: Zoom and Miro). If you are conducting an in-person session, start thinking about directions to your office and booking rooms. Finally, if you are running an unmoderated test, decide on the software you will use to run the study.
Identify the participants you want to speak to during the test, and figure out whether there are specific criteria participants need to meet to give you the best information. For example, if you are looking for participants who are serious about purchasing a car, you should screen for this behavior.
Build a screener survey that allows you to recruit your specific participants. After you write the survey, share it with your colleagues to make sure you didn't miss anything.
Start recruiting about two weeks before the test begins using your screener survey. Just as a note, you won't start recruitment for unmoderated tests until you set up the tasks in your software.
For moderated tests, you can break this step down into four smaller tasks:
To get a general idea of what you will be testing, review the prototype/concepts. If the prototype is not ready yet, have the designer sketch some ideas or a general user flow. This session will get your brain working on the discussion guide and flow of the session.
If you have enough information to do so, begin writing up the introduction, warm-up, tasks, wrap-up, and thinking about the flow of the session.
Set calendar blocks for when the tests will be occurring, even if you don't have exact dates and times. I always encourage colleagues to be accessible on the days we run the tests and tell them I will have more specifics as we get closer.
Book rooms to make sure you will have space for yourself, a notetaker, and an optional observer. I usually encourage only two people besides the participant in the room; others can sit in another room (or separately) to observe via online recording software.
The week before the test is all about confirmation and double-checking. Looking at previous steps and confirming with colleagues is essential to ensure everything goes smoothly, so take the time to do some of these smaller tasks, even if they feel mundane. I always take one or two afternoons before the test to make sure I have everything ready.
Go over the prototype with the designer and the product manager. Ideally, the designer should finish the final prototype the week before the tests begin. I know this is the ideal scenario, but it gives you the space to create a great discussion guide. If the prototype isn't done two days before the tests, I will cancel the study or try to use the participants for a different test.
Jot everything down in the discussion guide, including the flow of the study. For instance, if you have several prototypes to test, make sure you vary the order. I always write which participant is getting which order right in the guide. For example: Participant 1 sees Prototype A then B, while Participant 2 sees Prototype B then A.
Schedule everyone into the calendar. Send them a calendar invitation that includes:
Schedule the debrief so you are ready to start debriefing right after the first session.
As mentioned, I always use separate calendar invites for colleagues. I also "bake" the debrief time into the session, so if we have an hour session with the participant, I send an hour and a half calendar invite to include the debrief. In the calendar invite, I include:
Collaborate with colleagues and get feedback. Make any necessary changes to the tasks and finish the discussion guide.
This sheet will be the primary reference for the line-up: who the participants are, what roles everyone will assume (moderator, notetaker, observers), and links to everything. It can also be helpful to send a note-taking template to the notetaker and review best practices.
This is especially important for those who haven't done usability tests before. This list can include when to sign into a session (hint: five minutes early), reminders to mute, and how to send questions to the moderator.
Make sure you aren't conducting too many tests in one day!
Depending on if you’re doing a remote or in-person test, this work can vary. If you are conducting in-person usability tests, there may be a little more work involved:
Send these out to participants. Certain cultures aren't used to receiving confirmations, so think about that before sending. You can also email out any directions detailing how to run the software you’ll be using.
Make sure you give participants access to the prototype.
Test everything out the day before so that the sessions go smoothly and, if necessary, you can make changes.
Depending on whether the test is in-person or remote, and on how your organization works, you can send out the NDA/consent form beforehand or have it ready on the day of the test.
Prepare the incentives, whether that means ordering gift cards or creating discount codes.
Ensure all the necessary information is in there and all links work.
If you are doing an in-person session, prep the space so you start the session off on the right foot.
Finally, the day comes, and fingers crossed, the internet works! This list will walk you step by step through what to do before the participant arrives, while you are testing, and directly after. This stage varies a lot depending on whether you are doing an in-person or remote study.
Check in with your colleagues, especially the notetaker, and confirm all the roles.
If you are conducting a remote session, sign in early and make sure everything is working, especially your internet connection. Test your screen sharing, microphone, and video.
Prep the room with snacks and water if you are doing an in-person test. The snacks are primarily for your team's debrief.
Before the test starts, make sure the room or your computer is distraction-free: mute messaging channels, close unneeded software, and put your phone away. Remember to give colleagues a way to send you questions.
Before you start recording the session, make sure everyone understands and signs the NDA/consent form.
It is funny how many times I've skipped this step by mistake.
Go through the test and have fun!
The test is over, but there are a few more tasks left:
Try to do this within 24 hours, or ideally right after the session (if you have time).
Send a thank-you to each participant, regardless of whether they were a good or bad participant.
Note down any participants you felt were excellent. Keeping track of good participants is a great way to start building a panel!
Although it can be easy to leave some planning for the last minute, especially if you are juggling multiple projects, completing this checklist will set you up for success. Taking the time to go through these tasks will help ensure your team gets what they need, participants are comfortable, and you, ultimately, are less stressed.
Nikki Anderson-Stanier is the founder of User Research Academy and a qualitative researcher with 9 years in the field. She loves solving human problems and petting all the dogs.
To get even more UXR nuggets follow her on LinkedIn, or subscribe to her Substack.