Writing effective unmoderated usability test questions
In an unmoderated usability test, there is no moderator to answer clarifying questions or set the discussion tone or agenda. Therefore, writing questions that give participants clear guidance is crucial to getting the insights you want. Keep these tips and best practices in mind when you're authoring the unmoderated usability activity.
Usability Task questions
Usability Task questions ask participants to complete actions and activities. Based on how the participants fare, you can better understand user experience and how to optimize it.
In an unmoderated test, it's important that tasks are as clear as possible. Participants have no one to turn to if they are confused and have clarifying questions, or if they misunderstand the assignment and need someone to get them back on track.
Since participants are completing the task in a new browser tab, you need to give them a clear end point to indicate when they should switch back to the survey tab (for example, "Once you're done exploring or you've been able to sign up to join, come back to this tab and click Next").
The question text should also cover a scenario in which participants are unsuccessful (for example, "If you haven't been able to sign up to join within five minutes, please return to the survey and click Next"). You want participants to return to the survey and let you know they were unsuccessful instead of just closing their browser tabs and being marked as incomplete.
Usability Task questions can fall into two broad categories:
- Open-ended tasks
- Specific tasks
The following table outlines the differences between the two.
| | Open-ended tasks | Specific tasks |
|---|---|---|
| Description | | |
| Use cases | | |
| Potential pitfalls | | |
| Examples | | |
| Question writing tips | | |
If your usability test features multiple tasks and you want to have a mix of both question types, we recommend structuring your Usability Task questions like a funnel, working from general to more precise. Start with open-ended questions and drill down to more specific tasks.
Use piping to include participants' prior responses in the text of a Usability Task question.
For example, "You indicated in the screener questions that you subscribe to 2 or more streaming TV services. Please describe how you decide on which streaming service to watch on any given evening, how you look for content that interests you, which movies or shows you're most interested in, etc."
Follow-up questions
In an unmoderated test, you don't have a moderator to dig deeper in the moment and follow up on interesting tangents that participants' comments may present. Good follow-up questions that probe a little deeper and prompt participants to reflect on the tasks they just completed can be a viable substitute. Use follow-up questions to collect reflections or any further data points you'll need in your analysis later. It's a good idea to follow up with both closed- and open-ended questions for the best understanding and ease of analysis.
Here are a few examples to get you started.
Example: Closed-ended questions
- How easy was it to find what you were looking for?
- How likely are you to recommend this site to a friend or colleague? Tip: You can use a Net Promoter Score℠ question for this. For more information, see Create a Net Promoter Score question.
- Did you notice if there was another way to [complete a specific step/task]?
- Was there anything about [task] that surprised you?
Example: Open-ended questions
- What are your thoughts on the layout of [section]?
- What frustrated you most about this task?
- If you had a magic wand, how would you improve this site?
- What did you like about the site?
- What came to mind when you had to [task]?
- What would you expect to happen once you've [task]?
- What would enable you to accomplish [task] more effectively?
- What are your overall impressions of the product or the session?
- What worked well/poorly during [task]?
- What difficulties did you have with [task]?
- What comments did you want to add during the test but didn't?
- Do you have any final thoughts on the task or on today's session?