As our lifespans stretch well beyond the traditional retirement horizon, the future of work is undergoing a profound transformation. With some individuals living well into their 110s—and babies born today potentially reaching 150—we're not just facing longer lives, but longer careers. Many of us may be working into our 80s or 90s. The implication is clear: technology must evolve to support not just the young and able but also aging workers, who will increasingly make up a significant portion of the global workforce.
Apple’s upcoming announcements at WWDC (June 9–13, 2025) are expected to mark a major shift in this direction: a comprehensive set of accessibility features launching later this year across iOS, macOS, visionOS, and more. These features reflect a broad commitment to inclusive design, one that will shape how users interact with enterprise software, productivity tools, and digital platforms at every stage of life.
Key Innovations Supporting Aging and Accessibility
- Accessibility Nutrition Labels for Apps
A major development is the introduction of Accessibility Nutrition Labels on the App Store. These will allow users to view detailed accessibility features available in each app—such as VoiceOver, Voice Control, Larger Text, Reduced Motion, Captions, and Sufficient Contrast—before downloading.
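For developers, earning a strong label means actually adopting these features in code. As a minimal sketch, the hypothetical SwiftUI view below shows what several of the labeled capabilities look like in practice: a VoiceOver label, Dynamic Type (Larger Text), system colors for contrast, and honoring Reduced Motion.

```swift
import SwiftUI

// Hypothetical status row illustrating features an Accessibility
// Nutrition Label might report for an app.
struct StatusRow: View {
    @Environment(\.accessibilityReduceMotion) private var reduceMotion
    let isOnline: Bool

    var body: some View {
        HStack {
            Circle()
                .fill(isOnline ? .green : .gray)
                .frame(width: 12, height: 12)
            Text(isOnline ? "Online" : "Offline")
                .font(.body)                 // Larger Text: scales with Dynamic Type
                .foregroundStyle(.primary)   // Sufficient Contrast: system colors adapt
        }
        // VoiceOver reads one concise label for the whole row
        .accessibilityElement(children: .combine)
        .accessibilityLabel(isOnline ? "Status: online" : "Status: offline")
        // Reduced Motion: skip the animation when the user has opted out
        .animation(reduceMotion ? nil : .default, value: isOnline)
    }
}
```

The view and its labels are illustrative; the point is that each line maps directly to a checkbox users will now see before they download.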
- Magnifier for Mac
Previously available on iPhone and iPad, the Magnifier app is coming to macOS. It allows users with low vision to zoom in on physical materials such as documents, screens, and whiteboards using their device's camera. It works with Continuity Camera, USB webcams, and Desk View for document reading.
- Accessibility Reader
The new Accessibility Reader is a systemwide reading mode available on iOS, iPadOS, macOS, and visionOS. It lets users adjust font size, spacing, colors, and spoken content to match their reading preferences. It can also launch directly from apps or be used via Magnifier to read physical text including menus or labels.
This benefits users with dyslexia, vision impairment, or reading fatigue, and supports inclusive documentation, UI text, and in-app content presentation.
- Braille Access
Braille Access turns Apple devices into a full-featured braille note-taker. It supports Braille Screen Input, connected braille displays, and direct editing of BRF (Braille Ready Format) files.
Notably, the system includes Nemeth Braille support for math and science, and real-time transcription of conversations with Live Captions on braille devices. This functionality improves participation in meetings, training sessions, and written communication for blind professionals.
- Live Captions Across Devices
Live Captions are now supported across more Apple devices, including Apple Watch. The iPhone can act as a microphone to stream audio and display captions in real time, even at a distance. This enables users with hearing loss to follow meetings, presentations, or conversations more easily in both in-person and remote environments.
- Updates to Apple Vision Pro
The Apple Vision Pro headset is receiving powerful accessibility updates. Its advanced camera system now supports Passthrough Zoom, enabling users to magnify their environment. It also supports Live Recognition, using on-device AI to describe objects, scenes, and documents aloud.
A new API will allow approved apps to access Vision Pro's camera for live interpretation and guidance, expanding use cases in fields such as remote assistance, training, or field operations.
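Details of the new accessibility-focused camera API had not shipped at the time of writing, so the sketch below is an assumption based on the existing visionOS camera-access pattern (ARKit's `ARKitSession` with `CameraFrameProvider`, which requires an entitlement and user authorization); the final API for approved accessibility apps may differ.

```swift
import ARKit

// Sketch only: streams main-camera frames on visionOS using the
// existing enterprise camera pattern. The new accessibility API
// described above may expose a different, approval-gated interface.
func streamCameraFrames() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    // Requires the camera-access entitlement and user authorization.
    try await session.run([provider])

    let formats = CameraVideoFormat.supportedVideoFormats(for: .main,
                                                          cameraPositions: [.left])
    guard let format = formats.first,
          let updates = provider.cameraFrameUpdates(for: format) else { return }

    for await frame in updates {
        if let sample = frame.sample(for: .left) {
            // sample.pixelBuffer could feed an on-device model for
            // live interpretation or remote guidance.
            _ = sample.pixelBuffer
        }
    }
}
```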
- Brain-Computer Interface Support
Apple is adding a protocol to support Brain-Computer Interfaces (BCIs) through Switch Control on iOS, iPadOS, and visionOS. This enables users with severe motor disabilities to operate Apple devices without physical movement, using brain signals and external assistive hardware.
- Personal Voice Enhancements
For users at risk of losing their ability to speak, Personal Voice creates a custom voice model from just 10 recorded phrases. The new version makes setup faster and the result more natural sounding. It now includes support for Spanish (Mexico), with more languages expected.
This helps users maintain communication, especially in customer service, caregiving, or remote work settings where voice is essential.
- Eye and Head Tracking Enhancements
Eye Tracking on iPhone and iPad now includes dwell and switch selection, making navigation more flexible. QuickPath and dwell timers improve keyboard use for non-touch input. Head Tracking expands similar capabilities to users who rely on subtle head movements for interaction.
- Sound and Name Recognition
Apple’s Sound Recognition feature now supports Name Recognition, alerting users when their name is spoken. This is useful in shared spaces or noisy environments, especially for users who are deaf or hard of hearing.
- Vehicle Motion Cues on Mac
To reduce motion sickness when using devices in moving vehicles, Vehicle Motion Cues are coming to macOS, joining iOS and iPadOS. This helps users stay productive in transit without discomfort.
- Assistive Access for Apple TV
Assistive Access introduces a custom Apple TV app with a simplified media interface, improving usability for individuals with intellectual and developmental disabilities. Developers can now create tailored experiences using the Assistive Access API.
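For developers, adopting the Assistive Access API means declaring a simplified scene alongside the app's full interface. The sketch below assumes the SwiftUI scene shape previewed in Apple's session materials; treat the exact names as assumptions until final documentation ships.

```swift
import SwiftUI

// Sketch only: a stripped-down, single-purpose view that suits
// Assistive Access, with large controls and minimal choices.
struct SimplifiedPlayerView: View {
    var body: some View {
        Button("Play") { /* start playback */ }
            .font(.largeTitle)
    }
}

@main
struct MediaApp: App {
    var body: some Scene {
        // The full interface, shown in normal operation.
        WindowGroup {
            Text("Full interface")
        }
        // Assumed scene name: shown instead of the main window
        // when the device is in Assistive Access.
        AssistiveAccess {
            SimplifiedPlayerView()
        }
    }
}
```

The design intent is the same either way: rather than shrinking the full UI, the app offers a distinct experience built around a few clear actions.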
- Accessibility Sharing and Shortcuts
A new feature, Share Accessibility Settings, lets users temporarily transfer their settings to another Apple device—ideal for public kiosks, borrowed devices, or testing across environments.
Shortcuts now include Accessibility Assistant and tools such as Hold That Thought, making accessibility automation and personal productivity more seamless.
Why This Matters to the Future of Work
As more of the workforce experiences age-related changes in hearing, vision, dexterity, or cognition, applications must meet them where they are. Apple’s 2025 accessibility roadmap is a clarion call: If your digital experience isn’t accessible, it isn’t future-ready.