Unmoderated usability testing
Learn more about what unmoderated usability testing is, and how you can use it to develop and optimize the user experience for your intended audience.
Need to get started with unmoderated usability testing fast? Check out our quick start guides:
What is unmoderated usability testing?
An unmoderated usability test allows you to evaluate the user's experience while interacting with a product, website, app, feature, or prototype. It helps your designers, product teams, and key stakeholders assess how intuitive and easy to use products are. It also allows you to test assumptions and identify potential problems and areas for improvement.
The goal is to identify issues you might have missed by asking real users to complete a series of usability tasks. Then, you analyze the results, success rate, and paths taken to completion. Ultimately, you can use this data to improve your designs, help users achieve their objectives, and enable users to have a positive product experience.
- Unmoderated: The participant is completing the test on their own.
- Remote: The session is conducted virtually via desktop, camera, and microphone. The session is recorded.
- Self-paced: Participants complete the usability test in their own time, in their own environment, while the activity is open.
Alida's unmoderated usability testing solution lets you collect quantitative and qualitative data to assess your products' usability.
Quantitative data | Qualitative data
---|---
Metrics such as task success rates and task completion times | Session recordings and participants' open-ended feedback
Differences between unmoderated and moderated usability tests
| | Unmoderated usability test | Moderated usability test |
|---|---|---|
| Moderation | Participants complete the test on their own, with no facilitator present. | A facilitator guides each participant through the session in real time. |
| Location | Remote; participants work in their own environment on their own devices. | In person or remote, depending on the study. |
| Scheduling | Self-paced; participants complete the test whenever suits them while the activity is open. | Each session must be scheduled with a participant and a moderator. |
| Participants | Many participants can complete the test in parallel, making larger samples practical. | One participant at a time, so samples tend to be smaller. |
| Budget | Generally lower cost per participant, since no moderator time is required. | Higher cost per session, since a moderator must run each one. |
| Benefits | Fast to run at scale, and captures behavior in the participant's natural environment. | The moderator can ask follow-up questions, probe unexpected behavior, and clarify tasks on the spot. |
| Limitations and considerations | No one is present to answer questions or probe further, so tasks must be unambiguous. | Time-intensive to schedule and run, and the moderator's presence can influence participant behavior. |

Tip: Consider budgeting for incentives.
Usability tests are more time-intensive than typical surveys, and incentives are a nice way to motivate participants and thank them for their participation. For more information about incentive options in the Alida platform, see Incentives.
Best practices for unmoderated usability testing
In general, keep these considerations in mind as you conduct unmoderated usability tests:
- Challenge your assumptions.
You may think you know how users will navigate through a design and where the pitfalls might be, but will your users think the same? This is your chance to test your hypotheses.
- Decide on the metrics you want to measure.
Out of the box, you can report on the following metrics:
  - Task success question: Were participants successful in completing the task, yes or no?
    Tip: The task success question is customizable. "Were you successful in completing the task?" is a common example, but it does not have to be the question you pose to participants.
  - Task duration: What was the average time participants needed to complete the task, in minutes?
- The metrics you need may influence your overall activity design. For example, if you want to filter results based on whether participants have prior experience using your product, you may need to include a "Have you used this product before?" question before the usability tasks.
- Get consent.
Unmoderated usability testing involves recording participants through their devices, so it's crucial that participants give their explicit consent before they begin.
Luckily, consent collection is built into Alida's unmoderated usability testing solution, and participants cannot proceed without granting it.
- Run a pilot test of your activity.
We highly recommend asking internal stakeholders to do a dry run of your unmoderated usability test, or doing a soft launch before distributing the activity to your full list of participants. This way, you validate the clarity of the exercise and ensure you're capturing the metrics you need, so you don't "waste" completes on the full distribution.
- Be inclusive and recruit participants who represent your entire customer base.
You want your designs to be intuitive for all potential users, across different backgrounds, demographics, and abilities.
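Once results come in, the two out-of-the-box metrics described above (task success and task duration) reduce to simple summary statistics. The sketch below shows one way to compute them from exported task results; the record layout and field names here are assumptions for illustration, not Alida's actual export format.

```python
# Hypothetical sketch: summarizing exported usability-task results.
# The field names ("task_success", "duration_min") are assumptions,
# not Alida's actual export schema.
from statistics import mean

# Each record: did the participant report success, and how many
# minutes did the task take?
results = [
    {"task_success": True,  "duration_min": 2.5},
    {"task_success": True,  "duration_min": 4.0},
    {"task_success": False, "duration_min": 6.5},
    {"task_success": True,  "duration_min": 3.0},
]

# Task success rate: share of participants who completed the task.
success_rate = sum(r["task_success"] for r in results) / len(results)

# Task duration: average time to complete the task, in minutes.
avg_duration = mean(r["duration_min"] for r in results)

print(f"Task success rate: {success_rate:.0%}")        # 75%
print(f"Average task duration: {avg_duration:.1f} min")  # 4.0 min
```

A screening question such as "Have you used this product before?" can be captured as one more field per record, letting you filter the list before computing the same statistics for each segment.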