Smartphones
This multifaceted and multifunctional invention is the portal into the digital world.
Unfortunately, a subset of people cannot use them to their full extent – People with Visual Impairments (PVI).
PVIs have the same right as sighted people to use smartphones and everything that comes with them.
Yet they lack the necessary means to do so.
Accessibility
Current accessibility tools, like VoiceOver, TalkBack, and other screen readers, work by layering on top of the device's existing UI.
What these tools make 'accessible' is still the Information Architecture and 'Graphical' User Interface designed for sighted people.
These tools instruct users with visual impairments on how to use a GUI via audio feedback.
They are slow, limiting, inflexible, unintuitive, and cognitively demanding to use. Users get a subpar interaction quality.
Solution
Sighted people use hearing, touch, and vision to operate smartphones, with vision being the primary mode of interaction. The challenge here is to create a design that removes the dependency on vision.
The answer: redesign the IA and UI from the ground up to natively incorporate accessibility.
A redesign that focuses on priority and minimalism; on reduced complexity, time, and effort; and on increased comprehension, intuitiveness, autonomy, and usability.
Goals
To proficiently use her smartphone. To get the most out of it in as little time as possible.
Needs
No nesting in options and settings
Fewer, un-sequenced gestures and commands
Prioritization of information
Stripping of unnecessary attributes (visual elements or irrelevant information)
Optimized audio feedback
Reduced cognitive load
Customizability
Pain Points
The system requires too many gestures.
Repetitive and sequenced gestures complicate things for her (too much swiping through notifications, too much scrolling).
The constant audio feedback recites every element on the screen over and over and breaks her focus.
She must hear everything, even when not all attributes are important.
Scrolling through lists is a nightmare: she must swipe for every single entry.
The system is not flexible and lacks alternative ways to do things.
Behaviour
She does it anyway because she has practice. Sometimes, when she is lost, she starts again from square one. She has no choice, so she deals with it.
Goals
To learn a new way of interacting with his smartphone.
Needs
Less cognitive load
Optimized audio feedback
Better communication
Stripping of unnecessary attributes
Fewer fingers on the screen
No need to learn the system
Behaviour
Asks someone where he is on the smartphone or what is on the screen. Peeks through, removes the blindfold, or gives up.
Pain Points
The audio feedback gives him a headache.
The words get jumbled, and there is no time to absorb what is happening.
The controls are too complicated, and the system is hard to navigate.
The system does not properly communicate its state changes.
It unnecessarily reads everything from top to bottom.
Using multiple fingers is confusing.
It is hard to keep switching the number of fingers on the screen, as different gestures require different numbers.
Many times, the system is unresponsive.
Elements on the screen are too close to each other.
Why are secondary users required?
The experience of these users represents that of people who lost their sight in adulthood and had to learn a new way of interacting with their smartphones. They also offer an added benefit in user testing: if a sighted person who has never used assistive tools can quickly learn and use the new system while blindfolded, it suggests that the project is going in the right direction.
Says
I can do it.
It is not that hard.
I am used to it.
It is the only way I know.
Thinks
It is the only way I can do it.
This is annoying.
Where was I?
Where did the options go?
What does that mean?
Does
Start again from square one.
Does it anyway.
Drops it.
Feels
Frustrated
Impatient
Confused
Says
How hard can it be?
Thinks
What was the gesture to do this?
Did it work?
Where am I?
Does
Peeks through or removes the blindfold.
Drops it.
Feels
Annoyed
Impatient
I do not want to do this.
Non-Visual
Minimalist
Practical
Priority
Intuitive
Inclusive
Visual Information Architecture
Sighted users perceive the visual information architecture all at once. All the information and options are readily available to them; they can see where everything is at first glance and make an informed decision quickly.
A user who cannot see the screen, on the other hand, has to traverse all the information and options one by one to perceive it. Understanding where everything is takes time, and making an informed decision is considerably slower.
Non-Visual Information Architecture
To solve the above-mentioned problem, I filtered the information based on what was immediately required and stripped the UI of all secondary information and metadata.
In the new non-visual information architecture, the screen is divided into two columns. The left column is for apps and features, whereas the right is for action buttons.
When users select an app, they are presented with its sub-sections/features on the right side. They can then choose what data to access. Finding a needle in a smaller haystack is easier than finding it in a bigger one.
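As a rough illustration, a minimal Flutter sketch of this two-column structure might look like the following. The widget name, the app list, and the use of a plain tap to switch apps are simplifying assumptions made for brevity; in the actual design, selection happens through the drag-and-release interaction described below.

```dart
import 'package:flutter/material.dart';

// Minimal sketch of the two-column non-visual IA: apps on the left,
// the selected app's sub-sections/actions on the right. The app names
// and the tap-to-select behaviour are simplifying assumptions.
class TwoColumnHome extends StatefulWidget {
  const TwoColumnHome({super.key});

  @override
  State<TwoColumnHome> createState() => _TwoColumnHomeState();
}

class _TwoColumnHomeState extends State<TwoColumnHome> {
  // Left-column entries mapped to their right-column sub-sections.
  static const sections = {
    'Phone': ['Dial Pad', 'Recents', 'Contacts'],
    'Messages': ['New Message', 'Inbox'],
    'Media': ['Music', 'Recordings'],
  };
  String _selected = 'Phone';

  @override
  Widget build(BuildContext context) {
    return Row(children: [
      // Left column: apps and features.
      Expanded(
        child: Column(children: [
          for (final app in sections.keys)
            Expanded(
              child: GestureDetector(
                behavior: HitTestBehavior.opaque,
                onTap: () => setState(() => _selected = app),
                child: Center(child: Text(app)),
              ),
            ),
        ]),
      ),
      // Right column: the selected app's sub-sections / action buttons.
      Expanded(
        child: Column(children: [
          for (final sub in sections[_selected]!)
            Expanded(child: Center(child: Text(sub))),
        ]),
      ),
    ]);
  }
}
```

Large, evenly divided touch targets also address the pain point of on-screen elements sitting too close to each other.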
Low-fidelity Wireframes
These wireframes depict the Phone, Messages, Media, Control Center, and Notifications screens. All the apps are displayed in the left column, and their respective sub-sections/options are displayed in the right one.
This is the stage where I decided that taps would only tell what is under the finger, while drags would perform actions (inspired by Moon Type), as sketched below.
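In Flutter terms, that rule could look roughly like this minimal sketch. The tile name `NvTile`, its `onActivate` callback, and the assumption that the drag starts and ends on the tile itself are illustrative simplifications, not the prototype's actual code.

```dart
import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

// Simplified sketch of the tap-vs-drag rule: a tap only speaks the
// label of the element under the finger; a drag released on the
// element triggers its action.
class NvTile extends StatelessWidget {
  const NvTile({super.key, required this.label, required this.onActivate});

  final String label;
  final VoidCallback onActivate;

  @override
  Widget build(BuildContext context) {
    return GestureDetector(
      behavior: HitTestBehavior.opaque,
      // Tap: read-only. Announce the label through the platform's
      // accessibility channel; nothing on screen changes.
      onTap: () => SemanticsService.announce(label, TextDirection.ltr),
      // Drag released on the tile: perform the action.
      onPanEnd: (_) => onActivate(),
      child: SizedBox(
        height: 96,
        child: Center(child: Text(label)),
      ),
    );
  }
}
```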
The Dial Pad screen, and its feature of dragging to select and releasing the touch to perform the action (inspired by rotary telephones), were also solidified here.
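Here is a sketch of how such a drag-to-select, release-to-act dial pad could be wired up; the 3x4 grid arithmetic, the widget name, and the `onDigit` callback are illustrative assumptions, not the prototype's actual code.

```dart
import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

// Rotary-inspired dial pad sketch: drag across the grid to hear the
// digit under the finger; release the touch to enter that digit.
class DialPadSketch extends StatefulWidget {
  const DialPadSketch({super.key, required this.onDigit});

  final ValueChanged<String> onDigit;

  @override
  State<DialPadSketch> createState() => _DialPadSketchState();
}

class _DialPadSketchState extends State<DialPadSketch> {
  static const digits = [
    '1', '2', '3', '4', '5', '6', '7', '8', '9', '*', '0', '#',
  ];
  String? _underFinger;

  // Map a touch position to the digit in a simple 3x4 grid.
  String _digitAt(Offset pos, Size size) {
    final col = (pos.dx ~/ (size.width / 3)).clamp(0, 2).toInt();
    final row = (pos.dy ~/ (size.height / 4)).clamp(0, 3).toInt();
    return digits[row * 3 + col];
  }

  // Announce a digit only when the finger moves onto a new one.
  void _speak(String digit) {
    if (digit != _underFinger) {
      _underFinger = digit;
      SemanticsService.announce(digit, TextDirection.ltr);
    }
  }

  @override
  Widget build(BuildContext context) {
    return LayoutBuilder(builder: (context, constraints) {
      final size = constraints.biggest;
      return GestureDetector(
        behavior: HitTestBehavior.opaque,
        onPanDown: (d) => _speak(_digitAt(d.localPosition, size)),
        onPanUpdate: (d) => _speak(_digitAt(d.localPosition, size)),
        // Releasing the touch commits the last digit that was spoken.
        onPanEnd: (_) {
          if (_underFinger != null) widget.onDigit(_underFinger!);
        },
        child: Column(children: [
          for (var row = 0; row < 4; row++)
            Expanded(
              child: Row(children: [
                for (var col = 0; col < 3; col++)
                  Expanded(
                    child: Center(child: Text(digits[row * 3 + col])),
                  ),
              ]),
            ),
        ]),
      );
    });
  }
}
```

Announcing a digit only when the finger crosses onto a new one keeps the feedback from reciting the same element repeatedly.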
This is also the stage where I created Sticky Scroll (inspired by MS Word's middle-button scroll).
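A plausible sketch of Sticky Scroll, assuming the list scrolls continuously at a speed proportional to how far the finger has moved from its press point, much like Word's middle-button autoscroll; the widget name and the speed factor are assumptions.

```dart
import 'package:flutter/material.dart';
import 'package:flutter/scheduler.dart';

// Hypothetical Sticky Scroll: hold and move the finger once; the list
// then keeps scrolling by itself, faster the further the finger sits
// from where it first touched down.
class StickyScrollList extends StatefulWidget {
  const StickyScrollList({super.key, required this.items});

  final List<String> items;

  @override
  State<StickyScrollList> createState() => _StickyScrollListState();
}

class _StickyScrollListState extends State<StickyScrollList>
    with SingleTickerProviderStateMixin {
  final _scroll = ScrollController();
  late final Ticker _ticker;
  double _dragOffset = 0; // finger's vertical distance from the anchor

  @override
  void initState() {
    super.initState();
    // Each frame, advance the list by a fraction of the finger offset,
    // so one sustained touch replaces dozens of swipes.
    _ticker = createTicker((_) {
      final next = (_scroll.offset + _dragOffset * 0.05)
          .clamp(0.0, _scroll.position.maxScrollExtent)
          .toDouble();
      _scroll.jumpTo(next);
    });
  }

  @override
  void dispose() {
    _ticker.dispose();
    _scroll.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return GestureDetector(
      onVerticalDragStart: (_) {
        _dragOffset = 0; // the touch-down point becomes the anchor
        _ticker.start();
      },
      onVerticalDragUpdate: (d) => _dragOffset += d.delta.dy,
      onVerticalDragEnd: (_) => _ticker.stop(),
      onVerticalDragCancel: () => _ticker.stop(),
      // Scrolling is driven by the ticker, so the list itself is inert.
      child: ListView.builder(
        controller: _scroll,
        physics: const NeverScrollableScrollPhysics(),
        itemCount: widget.items.length,
        itemBuilder: (_, i) => ListTile(title: Text(widget.items[i])),
      ),
    );
  }
}
```

This directly targets the pain point of having to swipe for every single entry in a long list.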
High-fidelity Prototype
To create a functional prototype, I tried various UI/UX tools, but none of them had the features necessary to prototype the system I had designed. One option was to use a game engine, but that would have made the prototype far heavier than it needed to be.
Hence, I chose to code the prototype from scratch using Flutter, Google's open-source, cross-platform UI software development kit.
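For reference, a sketch like the earlier `TwoColumnHome` could be run as a bare Flutter app with an entry point along these lines; the real prototype is, of course, structured in far more detail.

```dart
import 'package:flutter/material.dart';

// Minimal entry point wiring the earlier TwoColumnHome sketch into a
// runnable Flutter app (illustrative, not the prototype's actual code).
void main() {
  runApp(
    const MaterialApp(
      home: Scaffold(body: SafeArea(child: TwoColumnHome())),
    ),
  );
}
```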
You can download the Android app to test it for yourself, and head over to the How It Works page to see the prototype in action, with instructions on how to use it.
Download Android App