Highlights:
• New Apple accessibility features for users with disabilities
• Door Detection for users who are blind or have low vision
• Live Captions for users who are deaf or hard of hearing
Apple today previewed innovative software features that introduce new ways for users with disabilities to navigate, connect, and get the most out of Apple products.
These updates combine the company’s latest technologies to deliver unique and customizable tools for users, and build on Apple’s long-standing commitment to making products that work for everyone.
Apple introduced several advanced features that will help people with disabilities. These features will be available later this year with software updates across Apple platforms.
Door Detection for Users Who Are Blind or Low Vision
Apple is introducing Door Detection, a cutting-edge navigation feature for users who are blind or low vision. Door Detection can help users locate a door upon arriving at a new destination, understand how far they are from it, and describe door attributes — including if it is open or closed, and when it’s closed, whether it can be opened by pushing, turning a knob, or pulling a handle. Door Detection can also read signs and symbols around the door, like the room number at an office, or the presence of an accessible entrance symbol. This new feature combines the power of LiDAR, camera, and on-device machine learning, and will be available on iPhone and iPad models with the LiDAR Scanner.
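Apple has not published how Door Detection is built, but the ingredients it names, LiDAR depth, the camera, and on-device machine learning, are all reachable through public APIs. The sketch below is a hypothetical illustration, not Apple's implementation: it combines an ARKit LiDAR depth frame with on-device Vision text recognition on a LiDAR-equipped iPhone or iPad. The class name DoorwayScanner and the center-of-frame distance heuristic are assumptions made for the example.

```swift
import ARKit
import Vision

// Hypothetical sketch, not Apple's Door Detection implementation. It only shows how
// public ARKit and Vision APIs expose similar ingredients (LiDAR depth, the camera
// image, and on-device text recognition) on a device with the LiDAR Scanner.
final class DoorwayScanner: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // sceneDepth is only supported on devices with the LiDAR Scanner.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // 1. Rough distance to whatever is straight ahead, read from the center of
        //    the LiDAR depth map (Float32 values in meters).
        if let depthMap = frame.sceneDepth?.depthMap {
            CVPixelBufferLockBaseAddress(depthMap, .readOnly)
            defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
            if let base = CVPixelBufferGetBaseAddress(depthMap) {
                let width = CVPixelBufferGetWidth(depthMap)
                let height = CVPixelBufferGetHeight(depthMap)
                let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
                let centerRow = base.advanced(by: (height / 2) * rowBytes)
                    .assumingMemoryBound(to: Float32.self)
                print("Approx. distance ahead: \(centerRow[width / 2]) m")
            }
        }

        // 2. On-device text recognition on the camera image, e.g. a room number or an
        //    accessible-entrance sign near the door. A real app would throttle this
        //    rather than run it on every frame.
        let textRequest = VNRecognizeTextRequest { request, _ in
            let strings = (request.results as? [VNRecognizedTextObservation])?
                .compactMap { $0.topCandidates(1).first?.string } ?? []
            if !strings.isEmpty { print("Signage: \(strings.joined(separator: ", "))") }
        }
        textRequest.recognitionLevel = .accurate
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
        try? handler.perform([textRequest])
    }
}
```

A production feature would fuse many frames and detect the door itself with a trained model; the sketch only makes the data flow concrete.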
Advancing Physical and Motor Accessibility for Apple Watch
Apple Watch will also help users with physical and motor disabilities through Apple Watch Mirroring, which lets them control Apple Watch using iPhone’s assistive features like Voice Control and Switch Control, and use inputs including voice commands, sound actions, head tracking, or external Made for iPhone switches as alternatives to tapping the Apple Watch display. Apple Watch Mirroring uses hardware and software integration, including advances built on AirPlay, to help ensure users who rely on these mobility features can benefit from unique Apple Watch apps like Blood Oxygen, Heart Rate, Mindfulness, and more.
Live Captions Come to iPhone, iPad, and Mac for Users Who Are Deaf or Hard of Hearing
For the Deaf and hard of hearing community, Apple is introducing Live Captions on iPhone, iPad, and Mac. Users can follow along more easily with any audio content — whether they are on a phone or FaceTime call, using a video conferencing or social media app, streaming media content, or having a conversation with someone next to them. Users can also adjust font size for ease of reading. Live Captions in FaceTime attribute auto-transcribed dialogue to call participants, so group video calls become even more convenient for users with hearing disabilities. When Live Captions are used for calls on Mac, users have the option to type a response and have it spoken aloud in real time to others who are part of the conversation. And because Live Captions are generated on device, user information stays private and secure.
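Live Captions is a system-level feature, so there is no API an app calls to turn it on. The on-device, privacy-preserving transcription it relies on can still be illustrated with the public Speech framework; the following is a minimal sketch under that assumption, where the class name OnDeviceTranscriber, the en-US locale, and the omission of microphone and speech-recognition authorization handling are simplifications for the example.

```swift
import Speech
import AVFoundation

// Hypothetical sketch, not the Live Captions implementation: it only shows how an app
// can keep speech transcription entirely on device with the public Speech framework.
final class OnDeviceTranscriber {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start() throws {
        guard let recognizer = self.recognizer,
              recognizer.supportsOnDeviceRecognition else { return }
        // Keep audio and transcripts on the device, mirroring the privacy point above.
        request.requiresOnDeviceRecognition = true
        request.shouldReportPartialResults = true

        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer.recognitionTask(with: request) { result, error in
            if let result = result {
                // Show the caption text as it is refined.
                print(result.bestTranscription.formattedString)
            }
            if error != nil || (result?.isFinal ?? false) {
                self.audioEngine.stop()
                input.removeTap(onBus: 0)
            }
        }
    }
}
```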
Additional Features
• With Buddy Controller, users can ask a care provider or friend to help them play a game; Buddy Controller combines any two game controllers into one, so multiple controllers can drive the input for a single player.
• With Siri Pause Time, users with speech disabilities can adjust how long Siri waits before responding to a request.
• Voice Control Spelling Mode gives users the option to dictate custom spellings using letter-by-letter input.
• Sound Recognition can be customized to recognize sounds that are specific to a person’s environment, like their home’s unique alarm, doorbell, or appliances (a brief sketch of this kind of on-device sound classification follows this list).
• The Apple Books app will offer new themes, and introduce customization options such as bolding text and adjusting line, character, and word spacing for an even more accessible reading experience.
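Apple has not said how the Sound Recognition customization works internally. As a rough illustration only, the sketch below streams microphone audio through the public SoundAnalysis framework using the built-in classifier; recognizing a home's unique doorbell or appliance, as described above, would instead use a custom-trained Core ML sound classifier passed to SNClassifySoundRequest(mlModel:). The class name SoundListener and the 0.8 confidence threshold are arbitrary choices for the example.

```swift
import AVFoundation
import SoundAnalysis

// Hypothetical sketch, not Apple's Sound Recognition feature: it classifies live
// microphone audio on device with the public SoundAnalysis framework.
final class SoundListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        // Built-in classifier here; a custom Core ML sound classifier would use
        // SNClassifySoundRequest(mlModel:) with the same analyzer and observer.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        engine.prepare()
        try engine.start()
    }

    // Called whenever the analyzer produces a classification for a window of audio.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first, top.confidence > 0.8 else { return }
        print("Heard: \(top.identifier) (confidence \(top.confidence))")
    }
}
```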