Apple Previews New iPhone Accessibility Features: Live Captions, Door Detection, and Apple Watch Mirroring

Apple has always prioritized accessibility, working to break down barriers so everyone can use their devices and benefit from innovative technologies. This commitment to inclusion took center stage at WWDC 2022, where Apple previewed impressive new accessibility features arriving soon for iPhone, Apple Watch, and more.

Headlining are Live Captions on iPhone, Apple Watch Mirroring, and Door Detection, a Magnifier feature that leverages LiDAR. These powerful new tools continue Apple’s purposeful push to extend its life-changing devices to all who wish to use them. Read on for an in-depth look at how Apple’s latest accessibility innovations open more possibilities for people with disabilities to connect and communicate.

Live Captions on iPhone

Apple is bringing Live Captions to iPhone, iPad, and Mac to automatically transcribe conversations and media in real time. Live Captions uses on-device processing to generate subtitles without an internet connection.

When enabled, speech is transcribed and displayed at the top of the screen in a continuous scroll. Captions work across iOS, including FaceTime calls, videos, and voice messages, transforming audio conversations into accessible text.
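
For readers curious about the plumbing, the short Swift sketch below shows the kind of on-device transcription described here, using the public Speech framework with on-device recognition forced on. Apple has not said Live Captions is built on this exact API, so treat the `CaptionEngine` class and its callback as illustrative only.

```swift
import AVFoundation
import Speech

// A minimal sketch of on-device live transcription with the public Speech framework.
// Live Captions is a system feature; this only illustrates the underlying capability.
final class CaptionEngine {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private var task: SFSpeechRecognitionTask?

    // Assumes speech-recognition and microphone permission have already been granted.
    func start(onCaption: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.requiresOnDeviceRecognition = true   // keep audio on the device, no network
        request.shouldReportPartialResults = true    // stream captions as words arrive

        // Feed microphone audio into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Hand each partial transcription to the caller for display.
        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let result = result {
                onCaption(result.bestTranscription.formattedString)
            }
        }
    }

    func stop() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        task?.cancel()
    }
}
```

In a real app, the speech-recognition and microphone authorization prompts would have to be requested and granted before calling `start`.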

Live Captions benefits people with hearing challenges by converting everything they hear into easy-to-read text. But it also assists anyone in loud environments where audio is unclear. And presenting words visually reinforces meaning for different learning styles.

Powered by advanced machine learning, Live Captions distinguishes between multiple speakers and can highlight key audio cues like laughter, music, and sound effects. Users control caption size, color, and transparency.

Live Captions will be available across Apple devices later this year, joining iOS 15’s real-time transcription features for FaceTime. Apple continues leading mobile accessibility by applying AI to open more communication avenues.

Apple Watch Mirroring

Those unable to physically interact with Apple Watch controls will soon gain independence through Apple Watch Mirroring. This clever accessibility feature wirelessly mirrors the entire Apple Watch interface to iPhone. That mirrored screen becomes a controller.

Using iPhone’s touchscreen, voice commands, or supported switches and assistive devices, users can trigger actions on the mirrored interface that exactly mimic on-watch manipulations. Everything from taps and swipes to Apple Pay and notifications can be controlled remotely.
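
There is no public API for Apple Watch Mirroring itself; it runs on system-level interfaces. The sketch below only illustrates the general shape of paired-device messaging using the public WatchConnectivity framework, with an invented `mirroredTap` message standing in for whatever Apple’s private protocol actually sends.

```swift
import WatchConnectivity

// A minimal sketch of iPhone-to-Watch messaging with the public WatchConnectivity
// framework. Apple Watch Mirroring itself uses private system interfaces; the
// "mirroredTap" message name and coordinate payload below are invented for this example.
final class WatchRemote: NSObject, WCSessionDelegate {
    private let session = WCSession.default

    func activate() {
        guard WCSession.isSupported() else { return }
        session.delegate = self
        session.activate()
    }

    /// Forward a touch made on a mirrored watch image to the paired Apple Watch.
    func sendTap(x: Double, y: Double) {
        guard session.isReachable else { return }
        session.sendMessage(["mirroredTap": ["x": x, "y": y]],
                            replyHandler: nil,
                            errorHandler: { error in print("Send failed: \(error)") })
    }

    // MARK: WCSessionDelegate requirements on iOS
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}
    func sessionDidBecomeInactive(_ session: WCSession) {}
    func sessionDidDeactivate(_ session: WCSession) {}
}
```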

Apple Watch Mirroring removes physical and motor barriers to accessing the Watch’s health insights, communication, personalization and more. Users unable to raise their wrist or tap the small display can leverage iPhone’s larger screen, voice input, and adaptive accessories to fully command Apple Watch without assistance.

Mirroring preserves privacy and security since all controls remain between the paired devices. To prevent misuse, the mirroring session can be instantly disconnected or deactivated through iPhone. By mirroring the interface wirelessly, Apple Watch Mirroring grants interaction independence.

Door Detection via LiDAR on iPhone

iPhone’s LiDAR scanner gains a new accessibility use case with Door Detection, a Magnifier feature that helps blind and low-vision users navigate unfamiliar indoor spaces. Door Detection uses LiDAR, the camera, and on-device machine learning to identify doors, describe their attributes, and provide guidance.

As users move with iPhone in hand, the feature scans the surroundings, audibly announces when a door is detected, and estimates its distance. Door descriptions include details like location, labels, handles, and nearby signs. It then guides users from door to door with directional wayfinding and instructions.
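
While Door Detection ships inside Magnifier rather than as developer API, ARKit’s public scene-depth support shows how LiDAR ranging of this kind works. In the sketch below, the center of the frame stands in for where a door detector (hypothetical here) would report a bounding box, and the measured depth is simply printed.

```swift
import ARKit
import CoreVideo

// A minimal sketch of LiDAR ranging with ARKit's public scene-depth API. Apple's
// Door Detection is not exposed as an API; here the frame's center pixel stands in
// for where a hypothetical door detector would place the door's bounding box.
final class DoorRangeFinder: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Scene depth requires a LiDAR-equipped iPhone or iPad.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let meters = centerDepth(of: depthMap)
        // A shipping app would speak this via AVSpeechSynthesizer or post it to VoiceOver.
        print(String(format: "Nearest surface ahead: %.1f m", meters))
    }

    /// Read the Float32 depth value (in meters) at the center of the depth map.
    private func centerDepth(of depthMap: CVPixelBuffer) -> Float {
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
        let x = CVPixelBufferGetWidth(depthMap) / 2
        let y = CVPixelBufferGetHeight(depthMap) / 2
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        let row = CVPixelBufferGetBaseAddress(depthMap)!.advanced(by: y * rowBytes)
        return row.assumingMemoryBound(to: Float32.self)[x]
    }
}
```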

Door Detection provides key cues for safely navigating indoor areas where doors stand between a user and their destination. LiDAR’s depth-sensing capability, the camera, and on-device machine learning combine to convey actionable spatial awareness. Paired with Apple Maps for navigation and Magnifier to read signs, Door Detection fills in awareness gaps.

These new additions continue Apple’s admirable legacy of accessibility advancements spanning hardware, software, and services. But designing for inclusiveness requires understanding diverse needs. Apple maintains an open dialogue with the disability community to shape tools that truly empower.

Nothing About Us Without Us – Apple’s Panel of Experts

To build accessibility features that truly resonate with real user needs, Apple collaborates closely with its disability advisory panel, which spans visual, hearing, mobility, and cognitive focus areas.

Launched in 2017, the advisory panel meets multiple times annually to provide Apple with regular feedback on designing for disability. Apple engineers demonstrate upcoming features and solicit candid input to guide development and education.

Panel members openly share frustrations about accessibility shortcomings but also acknowledge when Apple gets it right. Ongoing participation keeps Apple accountable to matching technological possibilities with actual community needs.

Apple also partners with accessibility nonprofits on developing guides and resources to boost adoption. And Apple’s dedicated Accessibility team continually evangelizes accessibility internally. Together these efforts reinforce inclusive design across everything Apple creates.

User response guides future refinements until accessibility feels seamless rather than specialized. The advisory panel fulfills the “nothing about us without us” ethos Apple has embraced, humanizing technology to serve everyone inclusively.

How Apple Built a Culture of Accessibility

Apple’s empathy-driven approach to accessibility took root under Steve Jobs, who saw technology’s potential for empowering those historically overlooked. That perspective evolved into systematic inclusion under CEO Tim Cook, himself open about hearing loss challenges.

Key moments in Apple’s accessibility journey include:

– 2001: Mac OS X debuts universal keyboard shortcuts benefiting motor-impaired users

– 2005: Closed captions added to iTunes video downloads

– 2009: VoiceOver screen reader comes to iPhone, enabling visually impaired users to operate the device

– 2016: Switch Control adapts devices for those with motor limitations

– 2018: Group FaceTime debuts, automatically bringing the active speaker’s video to prominence

– 2020: Apple Watch gains Blood Oxygen app measuring oxygen saturation

– 2022 and beyond: Apple continues aggressive accessibility expansion with AI-driven features

Apple now employs dedicated engineers and designers strictly focused on accessibility. Making accessibility intrinsic across everything Apple does created a culture of inclusion that permeates the company.

iOS 16 Adds New Accessibility Customizations

Beyond the headline accessibility features previewed at WWDC 2022, Apple continues enhancing iOS 16 with deeper customizations for individual needs:

– Apple Books adds new font choices and spacing/bold text controls to tailor reading.

– Magnifier can detect doors, read door signs using Live Text, and describe their size and position.

– Sound Recognition identifies custom sounds like home alarms and notifies users when they are detected (see the sketch after this list).

– Dictation commands can run shortcuts for voice-activated automation.

– Siri Pause Time lets users adjust response waiting periods.

– Guided Access limits iPhone access for focus without distraction.

– Support for BRAILLEX braille displays expands tactile access.
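
Sound Recognition is likewise a system setting rather than an API, but the public SoundAnalysis framework exposes the same kind of on-device sound classification. The sketch below uses Apple’s built-in classifier; the `SoundListener` name and the 0.8 confidence threshold are arbitrary choices for illustration.

```swift
import AVFoundation
import SoundAnalysis

// A minimal sketch of on-device sound classification with the public SoundAnalysis
// framework and Apple's built-in classifier. Supporting custom sounds, as the iOS 16
// setting does, would mean swapping in a Create ML sound-classifier model instead.
final class SoundListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    // Assumes microphone permission has already been granted.
    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        // Stream microphone buffers into the analyzer.
        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        engine.prepare()
        try engine.start()
    }

    // Called for each analyzed window of audio; report only confident matches.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        print("Heard: \(top.identifier)")   // e.g. "smoke_detector" or "dog_bark"
    }

    func request(_ request: SNRequest, didFailWithError error: Error) {
        print("Sound analysis failed: \(error)")
    }
}
```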

Small touches add up to big improvements. Apple strives for not just accessibility but accessibility with personalization.

The Future Is Accessible Technology for All

As Apple proves, thoughtful inclusion uplifts innovation while expanding who participates. Accessibility features create ripple effects, touching lives beyond their intended users: Live Captions help in noisy environments, Door Detection offers everyday navigation aid, and Apple Watch Mirroring gives anyone a convenient way to control the watch remotely.

Though more work remains, Apple moves accessibility forward as an expectation, not an exception. As technology increasingly weaves into everyday environments, ensuring equal access, agency, and dignity regardless of ability becomes ever more imperative.

Apple understands accessibility as a basic human right and social responsibility, not just a business vertical. By listening to and uplifting disabled voices, Apple leads in empowering technology consumers too often overlooked or underserved. We all move forward when tech opens opportunity to vulnerable communities.

Together with disability advocates, Apple continues pushing technology to reveal its best humanity. Their journey teaches how inclusion fuels impact. When designers broaden perspectives beyond personal experiences, understanding expands and accessibility elevates all.
