Accessibility Study: Assistive Technologies
At Sky, accessibility is more than compliance — it’s about creating digital experiences that everyone can use.
As we prepared to launch a new customer satisfaction survey through Qualtrics, we wanted to be sure that people using assistive technologies could complete it independently.
Our question was simple:
Can customers using tools such as screen readers or voice control complete our survey without assistance?
The study
To find out, we ran a short observational study with two participants who use different assistive technologies.
Each participant was asked to imagine they were dissatisfied with the Sky.com website and wanted to leave feedback through the embedded Qualtrics survey.
We observed their experiences as they navigated the process, capturing what worked well and where barriers appeared.
Participant 1 introduction
Screen reader
- Screen reader user (JAWS and iPhone VoiceOver).
- Has severely limited vision and relies on high contrast and audio feedback.
Participant 2 introduction
Voice control
- Voice control user (Dragon NaturallySpeaking, Alexa, Siri).
- Has no finger dexterity and uses voice and alternative input methods to navigate.
Screen reader challenges
What we observed
Unclear labelling
Radio button graphics were read as “quote quote”, and rating scales lacked meaningful context (“five stars” without description).
Distracting images
Inline graphics broke the reading flow and confused the screen reader.
Ambiguous navigation
A “Done” button between questions caused uncertainty about progress.
Tab order issues
When tabbing, options were skipped and focus jumped directly to buttons.
No end feedback
The “Next” button did not clearly indicate completion, and the final “Thank you” message was not read automatically.
Cookie modal overload
The cookie consent dialogue was read out in full with no ability to skip or focus on action buttons.
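Several of these issues trace back to form controls that lack programmatic names. As an illustrative sketch (not Qualtrics' actual markup, and with an invented question for the example), a radio group wrapped in a fieldset with explicit labels gives a screen reader the context that was missing:

```html
<!-- Illustrative only: Qualtrics generates its own markup.
     The legend gives the question context; each label names its
     option, so nothing is announced as "quote quote". -->
<fieldset>
  <legend>How easy was it to find what you needed?</legend>
  <label><input type="radio" name="ease" value="1"> 1 – Very easy</label>
  <label><input type="radio" name="ease" value="2"> 2 – Easy</label>
  <label><input type="radio" name="ease" value="3"> 3 – Neutral</label>
  <label><input type="radio" name="ease" value="4"> 4 – Difficult</label>
  <label><input type="radio" name="ease" value="5"> 5 – Very difficult</label>
</fieldset>
```

Because each option has a visible, programmatic name, a screen reader announces "1 – Very easy, radio button" rather than an empty graphic, and the same names make the options targetable by voice commands.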
Voice control challenges
What we observed
“Show links” not working
Radio buttons were not exposed as clickable elements, so his preferred “show links” command could not select them.
Workarounds required
He had to use “show grid” or mouse commands — slower and less intuitive.
Partial voice access
Text fields and submit buttons responded to commands, but inconsistently, so he had to fall back on alternatives such as the “move mouse” command to complete interactions.
Accessible usability scores
Participant 1: screen reader
Participant 2: voice control
The scores are based on the following statements:
- I would like to use this website frequently, if I had a reason to.
- I found the website unnecessarily complex.
- I thought the website was easy to use.
- I think that I would need the support of another person to use all of the features of this website.
- I found the various functions of the website made sense and were compatible with my technology.
- I thought there was too much inconsistency in how this website worked.
- I would imagine that most people with my assistive technology would learn to use this website quickly.
- I found the website very cumbersome or awkward to use.
- I felt very confident using the website.
- I needed to familiarize myself with the website before I could use it effectively.
Our recommendations
Based on the observations, we identified a number of improvements to make the survey more inclusive:
- Add clear, contextual labels to all form controls (e.g. “1 – Very easy”, “5 – Very difficult”).
- Remove meaningless alt text such as “quote quote”.
- Replace the “Done” button with “Next question” for clarity.
- Update the final call to action from “Next” to “Submit” to signal survey completion.
- Automatically move screen reader focus to success or confirmation messages.
- Ensure voice control commands (“show links”, “click”) work consistently across elements.
- Simplify or restructure cookie modals for easier navigation.
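The focus recommendation can be sketched in a few lines. This is an illustrative pattern rather than Qualtrics code: after submission, revealing the confirmation message and moving keyboard focus to it (made focusable with tabindex="-1") prompts screen readers to announce completion without the user having to hunt for it.

```html
<!-- Illustrative sketch, not actual Qualtrics markup. -->
<div id="thanks" tabindex="-1" role="status" hidden>
  Thank you – your feedback has been submitted.
</div>
<script>
  // On successful submission, reveal the message and move focus to it
  // so screen readers announce completion automatically. role="status"
  // also makes it a polite live region for users who are not focused on it.
  function showConfirmation() {
    const msg = document.getElementById("thanks");
    msg.hidden = false;
    msg.focus();
  }
</script>
```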
What we learned
Despite persistence and adaptability, both users experienced unnecessary friction completing what should have been a simple survey. Running the study reinforced several key lessons for designing accessible surveys and interfaces:
- Keep it short and simple. Tasks take longer for assistive technology users — clarity is everything.
- Expect hybrid setups. Many users combine multiple tools — screen readers, magnifiers, and voice controls — depending on the situation.
- Diverse participants mean stronger insights. Real experiences differ widely; include users across devices and technologies.
- Small changes, big difference. Fixing alt text or button labels can dramatically improve usability for everyone.
Testing with real people brought empathy and nuance to the process — reminding us that accessibility isn’t a checklist;
it’s an ongoing conversation between technology, design, and the people who rely on it most.
Final thoughts
Accessibility testing with platforms such as Fable made it possible to learn directly from users and identify meaningful improvements.
The findings didn’t just highlight barriers — they revealed opportunities to make every interaction smoother, clearer, and more human.
Good accessibility is good experience design.