Think about the last time you did something seemingly simple on your phone, like booking a rideshare. To do this, you had to unlock your phone, find the right app, and type in your pickup location. The process required you to read and write, remember your selections, and focus for several minutes at a time. For the 630 million people in the world with some form of cognitive disability, it’s not that easy. So we’ve been experimenting with how the Assistant and Android can work together to reduce the complexity of these tasks for people with cognitive disabilities.
Back at I/O, we shared how Googler Lorenzo Caggioni used the Assistant to build a device called DIVA for his brother Giovanni, who is legally blind, deaf and has Down syndrome. DIVA gives people with disabilities more autonomy by letting them interact with the Assistant in a nonverbal way. With DIVA, Giovanni can watch his favorite shows and listen to his music on his own.
DIVA was the starting point for Action Blocks, which uses the Google Assistant to make it easier for people who have a cognitive disability to use Android phones and tablets. With Action Blocks, you add Assistant commands to your home screen with a custom image, which acts as a visual cue.
The Action Block icon—for example, a photograph of a cab—triggers the corresponding Assistant command, like ordering a rideshare. Action Blocks can be configured to do anything the Assistant can do, in just one tap: call a loved one, share your location, watch your favorite show, control the lights and more.
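Action Blocks doesn't expose a public API, but the core idea it describes, a one-tap home-screen icon with a custom image that fires a single action, can be sketched with standard Android APIs. The snippet below is a minimal illustration in Kotlin, assuming the androidx ShortcutManagerCompat library and a placeholder photo resource; it pins a shortcut that opens the device's voice assistant, and it is not Action Blocks' own mechanism, which isn't documented.

import android.content.Context
import android.content.Intent
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat
import androidx.core.graphics.drawable.IconCompat

// Pins a one-tap home-screen shortcut: a custom photo that, when tapped,
// launches the device's voice assistant. "R.drawable.family_photo" is a
// placeholder image resource, not part of any real Action Blocks API.
fun pinAssistantShortcut(context: Context) {
    val assistIntent = Intent(Intent.ACTION_VOICE_COMMAND).apply {
        addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    }

    val shortcut = ShortcutInfoCompat.Builder(context, "call_family")
        .setShortLabel("Call Mom")
        .setIcon(IconCompat.createWithResource(context, R.drawable.family_photo))
        .setIntent(assistIntent)
        .build()

    // Pinning requires launcher support (API 26+) and a user confirmation dialog.
    if (ShortcutManagerCompat.isRequestPinShortcutSupported(context)) {
        ShortcutManagerCompat.requestPinShortcut(context, shortcut, null)
    }
}

The design point is the same as the one described above: the person tapping the icon never has to read a menu, type, or hold a multi-step task in memory; the image is the entire interface.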
Action Blocks is the first of many efforts to empower people with cognitive disabilities to gain independence, connect with loved ones and engage with the world as they are.
The product is still in the testing phase. If you're the caregiver or family member of someone with a cognitive disability who could benefit, please join our trusted tester program, and follow us @googleaccess to learn more.