The InterAccess approach to testing is backed by many years of usability experience working directly with people with disabilities. Our main goal is to provide an environment in which our panel can comfortably test your website, app or service, in a way that gives the best results.
Because the tests are remote, they may be directly or indirectly facilitated, depending on each panel member’s personal circumstances and situation.
The test process steps are outlined broadly as follows (with some more detail below):
- A test script is designed, containing representative tasks and goals
- A test facilitator introduces the test participant to the service to be tested
- The facilitator asks the test participant to follow a set of representative flows or tasks
- The participant is video recorded performing the test (where possible)
- Post-test questions are asked
- The videos from all tests are then analysed by the test facilitator
- A usability test report is compiled (with video highlights and recommendations) and provided to the client
- Any actionable outcomes are discussed
InterAccess Usability Testing Panel
Our panel of usability testers are people with a range of disabilities who use assistive technologies such as screen readers, screen magnifiers, keyboard-only navigation, switch devices or voice access. Some use combinations of technologies, while others have accessibility-related needs but do not use assistive technology at all. Our test participants have varying levels of AT proficiency, from beginners to power users.
InterAccess remote usability testing – the protocol
In a usability test, each user carries out a set of realistic tasks that have been agreed beforehand with our client. These will usually include the most common tasks for which the product is used, as well as the most critical tasks and any tasks that the test facilitator feels may cause problems. Tests are carefully designed to yield the most realistic user behaviour and the best results.
InterAccess mainly use a ‘think aloud’ protocol.
InterAccess remote usability testing – the process
The testing process is as follows:
- Design the test and write a test script: Firstly, how many people do we need? Usually 5–8 participants are enough. We ask the test participants to complete the tasks, giving instructions as needed. For some tests there may be minimal instructions, as the test may be designed to assess what initial affordances exist in the site being tested, and whether they translate to a modality the test participant can understand.
- Recruit participants: We find suitable test participants from our panel. These are people with a range of disabilities who use assistive technologies such as screen readers or screen magnifiers, or who rely on keyboard-only navigation, switch devices or voice access. Some panel members have other accessibility-related needs and may not use assistive technology at all. Our test participants have varying levels of AT proficiency, from beginners to power users.
- Conduct think-aloud protocol: We first ask participants for consent to take part, then give any suitable instructions, along with relevant ‘pre’ and ‘post’ test questions. During a facilitated test we may ask open-ended questions; in a self-directed test, the panellist follows the supplied test script, and the facilitator asks follow-up questions once the tasks are complete.
- Analyse findings and present insights: Where possible, the tests are recorded and the results analysed. An experienced test facilitator can present useful insights, find common problems, and report on any usability-related aspects of the findings. Finally, based on the test results, we can advise on changes to the design or code that will improve the user experience for people with disabilities.
When to do usability testing?
There are two main types of test that we can facilitate.
- Usability testing as an iterative part of accessible development sprints: For example, if you have some widgets that need testing with real users, this can be a great way of running light tests that focus just on the component in question, giving you useful validation feedback.
- Usability testing at the end of an accessibility review: For example, suppose you have completed an accessibility review with InterAccess: we have audited your site, app or service and provided you with fixes, which you have then successfully implemented. We can then run a usability test designed to validate the good accessibility work you have already done, and to confirm the quality of the user experience.
Is remote usability testing as good as ‘real world’ usability testing?
Our service uses well-established techniques, such as working with experienced usability analysts as test facilitators and applying the ‘think aloud’ protocol, within the practical limitations of our COVID-era society, to perform useful remote tests. Formal usability testing is closely associated with the ‘scientific method’, and while that approach is certainly valid and useful, it is not what InterAccess is primarily interested in. We therefore use the remote nature of our interactions to our advantage, as more people are now comfortable using Zoom and other tools to communicate, test and relay findings to our clients.
Some potential advantages of remote testing are:
- Easier (and faster) recruitment: Remote testing allows us to gather a panel of participants in a short timeframe
- Participants can work through test tasks simultaneously in their own natural environment where they are most comfortable
- Reduced cost
- Initiate high-value, high-impact discussions about usability in your organisation
There is also some evidence to suggest that remote testing is as valuable, and as results-orientated, as ‘in the room’ testing, while also being quicker to perform, with less logistical overhead and lower cost.
Want to know more? Contact us for a quote at hello(Replace this parenthesis with the @ sign)interaccess.ie, or read more about expert usability analysis.