Speech Accessibility Project
Reading Time: < 1 minute

Apple and the University of Illinois collaborate with Meta, Google, and others on the Speech Accessibility Project.

The goal of this initiative is to improve how AI algorithms can be tuned to help people with speech disabilities, including those caused by conditions such as ALS and Down syndrome. The team will study and evaluate the effectiveness of its work.

Engadget earlier reported on the Speech Accessibility Project, which had yet to go online at the time of writing. Tech companies including Amazon, Apple, Google, Meta, and Microsoft, along with nonprofits such as Team Gleason, which empowers those living with ALS, and the Davis Phinney Foundation for Parkinson’s, have partnered with the University of Illinois on the project.

According to the National Institutes of Health, tens of millions of people in the United States alone have conditions that affect speech. Over the years, many tech companies have innovated in the voice assistant space with tools like Siri, Amazon Alexa, Google Assistant, and others.

Apple’s accessibility features include Voice Control and VoiceOver, which serve users with low vision or limited mobility.

Voice-driven features are only as useful as the speech recognition behind them: they succeed only to the degree that their algorithms accurately understand the speaker. That accuracy is especially critical for users whose speech is affected by conditions such as Lou Gehrig’s disease (ALS) or cerebral palsy.

Updates will follow once the Speech Accessibility Project launches.
