Audition Research Participants First with Screener Surveys
Not all participants are created equal. Here's how to screen out the wrong ones before it’s too late.
I get excited about 1:1 interviews. I love them. I could conduct 1:1 interviews for the rest of my life and never get tired. However, nothing is worse than settling into a 90-minute conversation only to find out the participant has absolutely no input on the topic you planned to discuss.
I remember sitting in one session, knowing I had taken two steps backward with my stakeholders. I had spent months proving user research's value and developing a generative research program.
Finally, we secured the budget, and I got started. In my excitement, I jumped straight into recruiting as quickly as possible and ended up with a few participants who could have been more helpful.
From that moment on, I promised myself that I would make sure my recruitment was on point and that I wouldn't rush through this process. The one thing I changed that made all the difference in my recruitment process was ensuring I had an excellent screener survey.
What are screener surveys, and why are they important?
Screener surveys are an excellent tool for beginning research projects. You use short surveys to qualify participants and ensure you get the best fit for the information you need.
One big mistake I used to make was only asking for demographic information in my screener surveys. Since I came from an academic background, demographic information was a must in many projects. I didn't focus on behavior or habits in those projects.
Unfortunately, when I only asked for demographic information (ex: gender, age, location), I landed in the same scenario I described above. The participants might have fit a specific demographic, but they couldn't give me the information I needed.
Only including demographic information had a knock-on effect: not only were my sessions less than ideal, but my deliverables were superficial too. Because I lacked in-depth knowledge about people's characteristics and had little consistency across participants, my personas were shallow and scattered.
Instead of these personas having consistent needs, goals, and pain points across participants, I could only guess at similar groups and mainly bucket them by demographics. Unfortunately, with this superficial information, I couldn't empower my teams to make better decisions, so my personas gathered dust in a Google Drive folder.
Once this happened (a few times), I realized that it was important to write good screener surveys for a few different reasons:
- Finding and talking to the most relevant participants who have the characteristics, habits, and behaviors you need to understand better.
- Hitting the correct sample size by segmenting your participants into different buckets, so that you can later create meaningful deliverables that lead to action.
- Ensuring the return on investment for your research will be as high as possible, and avoiding wasted time and money on suboptimal participants.
- Avoiding burning out your participant list by asking all users to participate constantly.
7 steps to writing an excellent screener survey
As essential as screener surveys are, they can sometimes take time to write properly. It took me some practice to understand how to approach screener surveys optimally. Now, I have a process I love to go through every time I write a screener survey.
Ideally, your screener survey is about 5-10 questions. Once it gets longer and more complex, you can experience drop-offs.
1. List the criteria of your ideal participants
Before you even think about recruiting, list out all of the different criteria an exemplary participant would have. During this stage, think about the questions you want to ask them to achieve your research goals and the types of information you need them to give.
Whenever I am brainstorming different criteria, I always ask:
- What are the goals for your research?
- What questions do participants need to answer to give you meaningful information?
- What gaps in knowledge do you have that you need your participants to fill in?
- What behaviors do you need to understand more?
- What habits are you trying to target?
- What are the goals the user is trying to accomplish?
2. Write one question for each criterion identified
Once you have brainstormed the requirements, write at least one question per criterion you identified as important. This way, you can accurately target the best participants for the information you need.
3. Ensure your questions are well-written
Writing a screener survey question well can take some time, but it’s worth it when you get stellar participants.
My favorite types of screener questions are multiple-choice because I can get precise with the question and gather important information about the participant.
Here are some question types to avoid:
✖ Imprecise questions
You want your screener survey questions to be as precise and straightforward as possible, with no room for interpretation. Vague wording makes it difficult for people to respond accurately, so define your terms and remove any ambiguity.
- Imprecise criteria: People who have wanted to move recently
- Precise criteria: People who have visited 3+ apartments in the past 60 days
✖ Future-based questions
People have a hard time talking reliably about future-based behavior, so focus on past behavior as much as possible.
- Future-based question: Will you look at apartments in the next 60 days?
- Past-based question: Have you looked at 3+ apartments in the past 60 days?
✖ Leading questions
Leading questions will influence people to answer in particular ways and can result in participants who aren't able to answer your questions well.
- Leading question: Do you agree that apartments without online pictures are not worth looking at?
- Non-leading question: How do you feel when apartment listings don't have any photos available online?
✖ Inaccurate questions
A quick note (mainly because it is a pet peeve of mine!): take a look at any numbers you use in your multiple-choice questions. So often, I see overlapping ranges that make it impossible for a participant to answer accurately. For example:
"How many people currently work at your company?"
- 1-100
- 100 - 150
- 150 - 200
- 200 - 250
- 250 - 300
- Over 300
Within this question, people could be forced to choose multiple answers. If 150 people work at my company, do I choose option one or two? Instead, reformat the numbers so it looks like this:
✔ "How many people currently work at your company?"
- 1-100
- 101 - 150
- 151 - 200
- 201 - 250
- 251 - 300
- Over 300
- Other (open text field)
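If you build numeric buckets like these often, a quick programmatic sanity check can catch overlaps and gaps before the survey goes out. Here's a minimal sketch (the function name and bucket format are my own, not from any survey tool):

```python
def validate_buckets(buckets):
    """Check that numeric answer buckets are contiguous and non-overlapping.

    Each bucket is an (inclusive_low, inclusive_high) pair, sorted ascending.
    Returns a list of human-readable problems; an empty list means the
    buckets are safe to use in a multiple-choice question.
    """
    problems = []
    for (lo1, hi1), (lo2, hi2) in zip(buckets, buckets[1:]):
        if lo2 <= hi1:
            # A respondent on the boundary fits two answers at once.
            problems.append(f"{lo1}-{hi1} overlaps {lo2}-{hi2} (e.g. {lo2} fits both)")
        elif lo2 > hi1 + 1:
            # A respondent between the buckets fits no answer at all.
            problems.append(f"gap between {hi1} and {lo2}: {hi1 + 1} fits neither")
    return problems

# The overlapping buckets from the bad example flag two problems:
print(validate_buckets([(1, 100), (100, 150), (150, 200)]))
# The corrected buckets come back clean:
print(validate_buckets([(1, 100), (101, 150), (151, 200)]))  # → []
```

The "Over 300" and "Other" options don't have fixed upper bounds, so they stay outside the check.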
4. Use logic to order your questions
One of the most frustrating experiences as a participant is getting halfway through a screener and then being screened out. Ideally, you ask for the broadest qualifying information at the start of the screener and then narrow it down to more specific information.
5. Mix up questions to avoid obvious answers
Using a combination of open and closed questions helps ensure you get the best participants. For example, a 60/40 mix is safe, where 60% of questions are closed and 40% are open-ended.
Open-ended questions help you see how a participant would answer in their own words without priming them to respond in a specific way. These questions are also helpful if you aren't sure what information to include in a multiple-choice question.
Adding an open-ended question to your screener survey can give clues on how much insight a participant will provide in your study. One-word responses, illogical rambling, or cagey answers can all indicate that a participant may provide a low return on investment.
6. Always include an open text field or N/A
I have seen many screener surveys that don't allow me to say something that is not applicable, so I am forced to answer a question incorrectly.
To avoid this, make sure you include fields like:
- Not applicable
- I don't know
- None of the above
Including an "other" option accompanied by an open-text field is also great practice. This allows participants to answer in their own words and from their own experience.
7. Conduct a dry run
The last but most crucial step! I have been overly excited about recruitment (or rushed) and sent out a screener survey without first getting feedback. As a result, it had several typos, and some questions were worded confusingly.
After that mistake, I started testing every screener with colleagues to ensure it was as clear, simple, and straightforward as possible.
Let's go through an example
We own a plant shop in Brooklyn, New York, selling a variety of exotic plants. We have been posting our plants on social media, and people from around the United States are contacting us and asking if we have an online shop.
We want to consider this option, but we are still unsure what people want or expect. We want to conduct user research to:
- Understand how people currently purchase plants online and what their end-to-end experience is like
- Uncover the pain points and frustrations people encounter when buying plants online
- Identify other plant stores they have purchased from and what the experience was like with those stores
- Evaluate how they interact with our online plant store prototype
Looking at these goals, I would then ask myself:
What gaps in knowledge do you have that you need your participants to fill in?
- We need to understand how people currently purchase plants online, so we need participants who have recently gone through that particular experience.
- Why would people purchase plants online versus going into a store?
What behaviors do you need to understand more?
- We need to understand the end-to-end experience of how people have purchased a plant online, from searching to unboxing the plant.
What habits are you trying to target?
- People who buy plants online regularly can give us a good run-through of the experience.
I would create a screener survey with this information to ensure I target the right people:
- Where are you currently located?
- Open text field (screen out anyone located within 50 miles of Brooklyn)
- How often have you visited an online plant store in the past three months?
- I haven't visited any in the past three months (screen out)
- 1-3 times
- 4-6 times
- Over 6 times
- Other (open text field)
- How many plants have you purchased online in the past three months?
- I haven't purchased any plants online in the past three months (screen out)
- 1-3 plants (screen out)
- 4-6 plants
- Over 6 plants
- Other (open text field)
- How often have you bought plants in-store in the past three months?
- I haven't purchased any plants in-store in the past three months (screen out)
- 1-3 plants
- 4-6 plants
- Over 6 plants (screen out)
- Other (open text field)
- From which store did you most recently purchase a plant online?
- (Short open text)
- Describe your most recent experience purchasing a plant online.
- (Long open text)
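Most survey tools let you export responses as rows of question/answer pairs. If yours does, the screen-out rules above can be encoded as a small filter so qualification stays consistent across reviewers. A sketch, where the field names and the distance check are my own assumptions, not part of any particular tool:

```python
# Disqualifying answers for each closed question, keyed by hypothetical
# field names (your survey tool will have its own identifiers).
SCREEN_OUT_ANSWERS = {
    "online_store_visits": {
        "I haven't visited any in the past three months",
    },
    "plants_bought_online": {
        "I haven't purchased any plants online in the past three months",
        "1-3 plants",
    },
    "plants_bought_in_store": {
        "I haven't purchased any plants in-store in the past three months",
        "Over 6 plants",
    },
}

def qualifies(response, miles_from_brooklyn):
    """Return True if a screener response passes every screen-out rule."""
    if miles_from_brooklyn <= 50:  # close enough to shop in person
        return False
    for question, disqualifying in SCREEN_OUT_ANSWERS.items():
        if response.get(question) in disqualifying:
            return False
    return True

candidate = {
    "online_store_visits": "4-6 times",
    "plants_bought_online": "4-6 plants",
    "plants_bought_in_store": "1-3 plants",
}
print(qualifies(candidate, miles_from_brooklyn=300))  # → True
```

The two open-text questions still need a human read: they are there to judge how articulate and relevant a participant's answers are, which no filter can do.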
Preparing a recruiting screener prevents poorly matched participants from ending up in your study and increases the ROI of your research sessions.
When in doubt, screen! It always pays off.
Ready to hold auditions and recruit engaged, expressive participants for your next project? Learn more about dscout Recruit.
Nikki Anderson-Stanier is the founder of User Research Academy and a qualitative researcher with 9 years in the field. She loves solving human problems and petting all the dogs.
To get even more UXR nuggets, check out her user research membership, follow her on LinkedIn, or subscribe to her Substack.