Before the usability test
You know what they say: prior proper planning prevents poor performance. Though the alliteration may be a bit old school, the sentiment holds a lot of truth when it comes to planning a usability test.
It may seem excessive, but taking the time to really plan out each stage weeks, days, and hours before the test can be a huge time-saver for your future self. Here's how I recommend you prepare for each checkpoint:
Two to three weeks prior
Before the test, you will be doing a lot of pre-work. This work includes a huge chunk of the tasks that set your usability test up for success, so I recommend starting it two to three weeks before the test.
Define the goals and objectives
Use a research plan or brief to define the goal and objectives for the study. This step ensures the study's goals are clear and that they align with usability testing as a methodology. Be sure to take some time to align the team on the research questions and expected outcomes.
Decide on the type of test you need to run to get the best results. For instance, will you be running an unmoderated or moderated study? Will the test be remote or in-person? At this step, make sure that usability testing is the right fit.
Determine the logistics of the test
How long will each test be? How many participants do you need? If you are doing a remote study, decide on the technology you will use for the session and the debrief (ex: Zoom and Miro). If you are conducting an in-person session, start thinking about directions to your office and booking rooms. Finally, if you are running an unmoderated test, decide on the software you will use to run the study.
Determine which participants you want to speak to during the test, and whether there are specific criteria they need to meet for you to get the best information. For example, if you are looking for participants who are serious about purchasing a car, you should screen for this behavior.
Build a screener survey that allows you to recruit your specific participants. After you write the survey, share it with your colleagues to make sure you didn't miss anything.
Start recruiting about two weeks before the test begins using your screener survey. Just as a note, you won't start recruitment for unmoderated tests until you set up the tasks in your software. For moderated tests, you can break this step down into four smaller tasks:
- Figure out how and where you will recruit. Will you use an agency or email participants individually? Or will you post on LinkedIn, Facebook, Reddit, etc.?
- Think about how you will incentivize participants. Will you be able to offer them a gift card, discount, or raffle entry?
- Start sending out or advertising your recruitment—don't forget to include your screener survey. If you aren't using a tool like Calendly, always make sure you give date and time options (including time zones).
- Always plan a backup participant (or two) to ensure all the slots will get filled, even if someone drops out.
Review the prototype or concepts
To get a general idea of what you will be testing, review the prototype/concepts. If the prototype is not ready yet, have the designer sketch some ideas or a general user flow. This review will get your brain working on the discussion guide and the flow of the session.
Start the discussion guide
If you have enough information to do so, begin writing up the introduction, warm-up, tasks, wrap-up, and thinking about the flow of the session.
Put blocks in everyone's calendars
Set calendar blocks for when the tests will be occurring, even if you don't have exact dates and times. I always encourage colleagues to be accessible on the days we run the tests and tell them I will have more specifics as we get closer.
If in-person, start to book rooms
Do this to make sure you will have space for yourself, a notetaker, and an optional observer. I usually encourage only two people besides the participant in the room; others can sit in another room (or separately) and observe via online recording software.
One week before
The week before the test is all about confirmation and double-checking. Looking at previous steps and confirming with colleagues is essential to ensure everything goes smoothly, so take the time to do some of these smaller tasks, even if they feel mundane. I always take one or two afternoons before the test to make sure I have everything ready.
Review the prototype
Go over the prototype with the designer and the product manager. Ideally, the designer finishes the final prototype the week before the tests begin; I know that's the best-case scenario, but it gives you the space to create a great discussion guide. If the prototype isn't done two days before the tests, I will cancel the study or try to use the participants for a different test.
Jot everything down in the discussion guide, including the flow of the study. For instance, if you have several prototypes to test, make sure you vary the order. I always write which participant is getting what order right in the guide. For example:
- Participant 1: A, B, C
- Participant 2: B, C, A
- Participant 3: C, A, B
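If you want to generate these rotations automatically instead of writing them out by hand, a minimal Python sketch could look like this (the labels "A", "B", "C" are placeholders for your prototypes; this is a simple rotation, one of several valid counterbalancing schemes):

```python
def rotation_order(prototypes, participant):
    """Return the prototype order for a 1-indexed participant,
    rotating the list so each prototype appears in every position."""
    offset = (participant - 1) % len(prototypes)
    return prototypes[offset:] + prototypes[:offset]

prototypes = ["A", "B", "C"]
for p in range(1, len(prototypes) + 1):
    # e.g. Participant 1: A, B, C / Participant 2: B, C, A / ...
    print(f"Participant {p}: {', '.join(rotation_order(prototypes, p))}")
```

With more participants than prototypes, the orders simply repeat, which keeps each prototype evenly distributed across positions.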
Start scheduling participants
Schedule everyone into the calendar. Send them a calendar invitation that includes:
- The date and time of the test
- The location (in-person) or link to the session (remote)
- If in-person, detailed directions to your office
- Your phone number (if you feel comfortable)
- An easy way to cancel or reschedule (if possible, such as through Calendly)
Do this to ensure you are ready to start debriefing right after the first session.
Invite colleagues to the test and debrief
As mentioned, I always use separate calendar invites for colleagues. I also "bake" the debrief time into the session, so if we have an hour session with the participant, I send an hour and a half calendar invite to include the debrief. In the calendar invite, I include:
- The session location (room) or link to the session (remote)
- A reminder to mute and turn off their video
- A link to the debrief session (ex: Zoom), and the debrief board (ex: Miro)
- A basic breakdown of the session
Review and finalize the tasks
Collaborate with colleagues and get feedback. Make any necessary changes to the tasks and finish the discussion guide.
Create a sign-up sheet to determine roles for each session
This sheet is the master record of the session line-up, the roles everyone will assume (moderator, notetaker, observers), and links to everything. It can also be helpful to send the notetaker a note-taking template and review best practices with them.
Send out best practices to colleagues
This is especially important for colleagues who haven't done usability tests before. This list can include when to sign into a session (hint: 5 minutes early), reminders to mute, and how to send questions to the moderator.
Make sure you aren't conducting too many tests in one day!
One to two days prior
Depending on whether you’re doing a remote or in-person test, this work can vary. If you are conducting in-person usability tests, there may be a little more work involved:
If relevant, send out confirmation emails
Send these out to participants. Certain cultures aren't used to receiving confirmations, so think about that before sending. You can also email out any directions detailing how to run the software you’ll be using.
Get passwords or enable access
Make sure you give participants access to the prototype.
Do a dry run
Test everything out the day before to make sure everything goes smoothly and, if necessary, you can make changes.
Set up, send out, or print the NDA and/or consent form
Depending on whether the session is in-person or remote, and on how your organization works, you can send this out beforehand or have it ready on the day.
Make sure incentives are ready
Order any gift cards or create any discount codes ahead of time.
Double-check the calendar invites
Ensure all the necessary information is in there and all links work.
Determine how you will greet the participant
If you are doing an in-person session, plan your greeting so the session starts off on the right foot.
Day of the test
Finally, the day comes, and fingers crossed, the internet works! This list walks you step by step through what to do before the participant arrives, while you are testing, and directly after. This stage varies a lot depending on whether you are doing an in-person or remote study.
Remind everyone of the session
Check in with your colleagues, especially the notetaker, and confirm all the roles.
Sign in to the session early
If you are conducting a remote session, sign in early and make sure everything is working, especially your internet connection. Test your screen sharing, microphone, and video.
Get snacks and water
Prep the room with snacks and water if you are doing an in-person test. The snacks are primarily for your team's debrief.
Turn off all distractions
Before the test starts, make sure the room or your computer is distraction-free: mute messaging channels, close software, and put your phone away. Remember to leave colleagues a way to send you questions.
Walk them through signing the NDA/consent form
Before you start recording the session, make sure everyone understands and signs the NDA/consent form.
Start the screen recording!
It is funny how many times I've skipped this step by mistake.
Go through the test and have fun!
One day after
The test is over, but there are a few more tasks left:
Send incentives to the participant
Try to do this within 24 hours, or ideally right after the session (if you have time).
Send a thank-you email
Send a thanks to each participant, regardless of whether they were a good or bad participant.
Ask participants if they'd be interested in another session
If you felt some or all of the participants were excellent, ask if they'd be open to future sessions. Keeping track of good participants is a great way to start building a panel!
Check out the checklist template here.