Apple unveils new accessibility features for iOS 17 to benefit users with disabilities
Apple has announced a range of new software features for iOS 17 aimed at improving accessibility for users with cognitive, vision, hearing, and mobility disabilities. The company has also developed innovative tools for individuals who are nonspeaking or at risk of losing their ability to speak.
These updates draw on advances in both hardware and software, including on-device machine learning that keeps user data private. Apple has a longstanding commitment to making its products accessible to everyone, and the company works closely with community groups representing a broad spectrum of users with disabilities to develop these features.
The new accessibility features, which include Assistive Access, Live Speech, and more, will be available later this year. Assistive Access distills apps and experiences down to their essential features, offering a simplified interface for users with cognitive disabilities, while Live Speech lets users type what they want to say and have it spoken aloud during phone and FaceTime calls as well as in-person conversations.
“Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”
Apple's efforts to improve accessibility are commendable, and the company's close collaboration with disability communities is central to the success of these features. By grounding design decisions in direct feedback from the people who will use them, the new features stand to make a tangible difference in how users with disabilities connect with the world.
