A number of new accessibility features for iPhone, iPad, and Apple Watch, including additional navigation, health, and communication capabilities for people with disabilities, will be available later this year.
A new feature revealed by Apple today is designed specifically for blind and low-vision users. When you arrive at a new location, you'll want to know how far away the nearest door is and whether it's open or closed. Door Detection can help: it tells you how far away you are from a door and how to open it. On compatible iPhone and iPad models with the LiDAR Scanner, Door Detection can also scan signs and symbols around the door, such as the room number at an office or the presence of an accessible-entrance symbol.
Apple Watch Mirroring, another feature revealed today, will let users control their Apple Watch remotely from their paired iPhone. With Apple Watch Mirroring, users can operate their Apple Watch using iPhone assistive features such as Voice Control and Switch Control, along with inputs like voice commands, sound actions, and head tracking.
Apple is also expected to introduce Live Captions for iPhone, iPad, and Mac later this year. The feature will make any audio content easier to follow, whether it's a phone call, a video-conferencing app, streaming media, or a conversation with someone sitting nearby.
Users can also adjust the font size to make the captions easier to read. In FaceTime, Live Captions attribute the automatically transcribed dialogue to each call participant, making group video calls even more accessible for users with hearing difficulties. When Live Captions are enabled for calls on a Mac, users can type a response and have it spoken aloud to the other participants in real time. And because Live Captions are generated locally on the device, user information stays private and secure.