Writing effective screening survey questions
Use screening surveys to find the right participants for your research activities, whether those are surveys, forums, video discussions, or unmoderated usability tests. This article provides best practices and tips to consider as you author your screening survey.
- You already have a lot of information you can leverage for screening: profile variables, data from past activities, and information from third-party systems that has been added to the Alida platform. Having an insight community means you are not starting from scratch. Review what you already have, decide what you can leverage, and use your screening survey to fill in the gaps.
- When you are screening for your usability testing pool, consider what combination of answer choices will include, rather than exclude, contributors. Erring on the side of inclusion helps ensure that your participants represent all of your key user groups.
You can include screening questions:
- In a separate screener survey
- At the end of an outgoing survey
- At the beginning of a usability activity
Use screening questions to ask about:
- Demographic information that's not already captured
- Jobs and occupations
- Experiences with similar products
- Experiences with usability testing
- Current behavior around the brand and with existing interfaces
- Any other data that's pertinent to the usability test
Use a Video Feedback question as a screener task.
In a separate screener survey prior to the unmoderated usability test, add a Video Feedback question that records participants speaking on camera. You are looking for participants who are comfortable appearing on camera, thinking out loud, and articulating their thought processes. Once you have identified these participants, you can flag them with a profile variable as part of your usability testing pool.
Alternatively, add a Recording action before a small task in your screener survey.
The Recording action records participants' screens and, if desired, audio and video. Again, you are looking for participants who are comfortable appearing on camera, thinking out loud, and articulating their thought processes; once you have identified them, flag them with a profile variable as part of your usability testing pool.
Provide a "None of the above," "I don't know," "I prefer not to answer," or "Other" option.
This is a best practice for any research question. Answer options should be mutually exclusive and collectively exhaustive: the options should not overlap, and there should be one for every possible response. In the context of screener surveys, this ensures participants aren't forced to choose an inaccurate answer and don't accidentally qualify for the activity.
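As a rough sketch of this rule, screener logic can treat catch-all options as explicitly non-qualifying. The option wording and variable names below are illustrative placeholders, not Alida platform behavior:

```python
# Illustrative sketch: catch-all options never qualify a participant,
# so no one is forced into an inaccurate, qualifying answer.
# All answer wording here is hypothetical.
QUALIFYING_ANSWERS = {"I shop online weekly", "I shop online monthly"}
CATCH_ALL_ANSWERS = {"None of the above", "I don't know",
                     "I prefer not to answer", "Other"}

def qualifies(answer: str) -> bool:
    """Qualify only on a genuine screening answer, never on a catch-all."""
    if answer in CATCH_ALL_ANSWERS:
        return False  # an accurate opt-out, never an accidental qualification
    return answer in QUALIFYING_ANSWERS
```

Keeping the catch-all set separate from the qualifying set makes it harder to accidentally route opt-out answers into the activity.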
Provide clear, non-overlapping answers.
This prevents participant confusion about which answer to choose.
Avoid asking leading questions.
You want participants to give the answer that applies to them, rather than the answer they think you want.
Consider using choice questions instead of yes/no questions.
Yes/no questions are sometimes appropriate, but they can also be leading questions. If you provide a list of options to choose from, you can hide the response you're looking for so participants don't know which one you're screening on.
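One way to picture this: present the answer you are screening on alongside distractors, and qualify only on the hidden target. The brand names below are placeholders:

```python
# Illustrative sketch: the target is hidden in a longer list of
# distractors, so participants can't tell which answer qualifies.
# All brand names here are placeholders.
TARGET_BRAND = "Brand A"
ANSWER_OPTIONS = ["Brand A", "Brand B", "Brand C", "Brand D",
                  "None of the above"]

def qualifies(selected_options: set) -> bool:
    """Participants see every option; only the hidden target qualifies."""
    return TARGET_BRAND in selected_options
```

Because every option looks equally plausible, participants have no cue about which selection the screener rewards.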
Avoid compound questions.
Compound questions can make participants confused about which part of the question they're answering.
Example

Poor example: How satisfied or dissatisfied are you with the features and usability of this product?

Better example: Split the question into two, one asking about the product's features and one asking about its usability.
Ask participants to indicate their product familiarity using a scale instead of a yes/no response.
This makes things clearer for participants and allows you to collect more useful data.
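A familiarity question of this kind is often a standard five-point scale rather than a yes/no. The labels below are one common phrasing, not Alida-specific wording:

```python
# Illustrative 5-point familiarity scale (labels are one common phrasing,
# not taken from the Alida platform).
FAMILIARITY_SCALE = [
    "Not at all familiar",
    "Slightly familiar",
    "Moderately familiar",
    "Very familiar",
    "Extremely familiar",
]

def meets_threshold(response: str, minimum: str = "Moderately familiar") -> bool:
    """Screen on a minimum familiarity level instead of a bare yes/no."""
    return FAMILIARITY_SCALE.index(response) >= FAMILIARITY_SCALE.index(minimum)
```

A scale also lets you adjust the screening threshold later without rewriting the question.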
Use specific timeframes when asking about the frequency of use.
This makes things clearer for participants and allows you to collect more useful data.
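For instance, vague frequency labels can be replaced with concrete timeframes. The wording below is illustrative, not from the source article:

```python
# Illustrative contrast: vague frequency labels vs. specific timeframes.
# All option wording here is hypothetical.
VAGUE_OPTIONS = ["Often", "Sometimes", "Rarely", "Never"]  # reader-dependent
TIMEFRAME_OPTIONS = [
    "Daily",
    "A few times a week",
    "Once a week",
    "One to three times a month",
    "Less than once a month",
    "Never",
]
```

"Often" means different things to different people; "a few times a week" does not, so responses are comparable across participants.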
Screen for industry first, then ask about occupation in a follow-up question.
Start with a question that lists broad categories. Use survey logic to show participants a follow-up question that lists occupations based on the category they chose.
Example

Initial question: Which industry do you work in?

If someone chooses Health, show them a follow-up question that lists different health occupations.
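The branching described above can be sketched as a simple lookup. The industry and occupation names are placeholders, and in practice the routing would be built with your survey tool's logic rather than code:

```python
# Illustrative sketch: a broad industry choice selects which occupation
# list the follow-up question shows. All names are placeholders.
OCCUPATIONS_BY_INDUSTRY = {
    "Health": ["Nurse", "Physician", "Pharmacist", "Other health occupation"],
    "Education": ["Teacher", "Professor", "Other education occupation"],
    "Technology": ["Software developer", "IT support", "Other technology occupation"],
}

def follow_up_options(industry: str) -> list:
    """Return the occupation list to show for the chosen industry."""
    return OCCUPATIONS_BY_INDUSTRY.get(industry, ["Other occupation"])
```

Starting broad keeps the first question short, while the follow-up list stays relevant to each participant.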