This Thursday is Global Accessibility Awareness Day (GAAD), and as has been its custom for the last few years, Apple's accessibility team is taking this time to share some new assistive features that will be coming to its ecosystem of products. In addition to bringing "Accessibility Nutrition Labels" to the App Store, it's announcing the new Magnifier for Mac, an Accessibility Reader and enhanced Braille Access, as well as a veritable cornucopia of other updates to existing tools.
According to the company's press release, this year in particular marks "40 years of accessibility innovation at Apple." It's also 20 years since the company first launched its screen reader, and a significant number of this year's updates are designed to help those with vision impairments.
Magnifier for Mac
One of the most noteworthy is the arrival of Magnifier on Macs. The camera-based assistive feature has been available on iPhones and iPads since 2016, letting people point their phones at things around them and get auditory readouts of what's in the scene. Magnifier can also make hard-to-read things easier to see by giving you the option to increase brightness, zoom in, add color filters and adjust the perspective.
With Magnifier for Mac, you can use any USB-connected camera or your iPhone (via Continuity Camera) to get feedback on things around you. In a video, Apple showed how a student in a large lecture hall was able to use their iPhone, attached to the top of their MacBook, to make out what was written on a distant whiteboard. Magnifier for Mac also works with Desk View, so you can use it to more easily read documents in front of you. Multiple live session windows will be available, so you can keep up with a presentation through your webcam while using Desk View to, say, read a textbook at the same time.
Accessibility Reader
Magnifier for Mac also works with another new tool Apple is unveiling today — Accessibility Reader. It's a "new systemwide reading mode designed to make text easier to read for users with a wide range of disabilities, such as dyslexia or low vision." Accessibility Reader will be available on iPhones, iPads, Macs and the Apple Vision Pro, and it's pretty much the part of Magnifier that lets you customize your text, with "extensive options for font, color and spacing." It can help minimize distractions by getting rid of clutter, for instance.
Accessibility Reader also supports Spoken Content, and as it's built into the Magnifier app, can be used to make real-world text like signs or menus easier to read as well. You can also launch it from any app, as it's a mode available at the OS level.
Braille Access
For people who are most comfortable writing in Braille, Apple has supported Braille input for years, and more recently started working with Braille displays. This year, the company is bringing Braille Access to iPhones, iPads, Macs and the Vision Pro, and it's designed to make taking notes in Braille easier. It will come with a dedicated app launcher that allows people to "open any app by typing with Braille Screen Input or a connected braille device." Braille Access also lets users take notes in Braille and use Nemeth code for their math and science calculations. It can open files in Braille Ready Format (BRF), so you can return to your existing documents from other devices. Finally, "an integrated form of Live Captions allows users to transcribe conversations in real time directly on braille displays."
Apple Watch gets Live Captions; Vision Pro gets Live Recognition
Wrapping up the vision-related updates is an expansion of such accessibility features in visionOS. The Zoom function, for instance, is getting enhanced to allow wearers to magnify what they see in both virtual reality and, well, actual reality. This uses the Vision Pro's cameras to see what's in your surroundings, and Apple will make a new API available that will "enable approved apps to access the main camera to provide live, person-to-person assistance for visual interpretation in apps like Be My Eyes." Finally, Live Recognition is coming to VoiceOver in the Vision Pro, using on-device machine learning to identify and describe things in your surroundings. It can also read flyers or invitations, for example, and tell you what's on them.
For those who have hearing loss, the Live Listen feature that's already on iPhones will be complemented by controls on the Apple Watch, plus some bonus features. When you start a Live Listen session on your iPhone (which streams what its microphone picks up to your connected AirPods, Beats headphones or compatible hearing aids), you'll soon be able to see Live Captions on your paired Apple Watch. You'll also get controls on your wrist, so you can start, stop or rewind a session. That means you can stay on the couch, leave your iPhone in the kitchen and start a Live Listen session from your wrist to hear what your partner is saying while they cook. Live Listen also works with the hearing health and hearing aid features introduced on the AirPods Pro 2.
Background Sounds, Personal Voice, Vehicle Motion Cues and Eye Tracking get updates
While we're on the topic of sound, Apple is updating its Background Sounds feature that can help those with tinnitus by playing white noise (or other types of audio) to combat symptoms. Later this year, Background Sounds will offer automatic timers to stop after a set amount of time, automation actions in Shortcuts and a new EQ settings option to personalize the sounds.
Personal Voice, which helps those who are at risk of losing their voice preserve their vocal identity, is also getting a major improvement. When I tested the feature to write a tutorial on how to create your personal voice on your iPhone, I was shocked that it required the user to read out 150 phrases. Not only that, the system needed to percolate overnight to create the personal voice. With the upcoming update, a Personal Voice can be generated in under a minute, with only 10 phrases needing to be recorded. The resulting voice also sounds smoother, with less clipping and fewer artifacts. Apple is adding Spanish-language support for the US and Mexico, too.
Last year, Apple introduced Eye Tracking built into iPhones and iPads, as well as Vehicle Motion Cues to alleviate car sickness. This year, it continues to improve those features by bringing the motion cues to Macs and adding new ways to customize the onscreen dots. Meanwhile, Eye Tracking is getting an option that lets users dwell or use a switch to confirm selections, among other keyboard typing updates.
More across Apple TV, CarPlay, Head Tracking and Settings
Apple's ecosystem is so vast that it's almost impossible to list all the individual accessibility-related changes coming to all of its products. I'll quickly shout out Head Tracking, which Apple says will enable people to more easily control their iPhones and iPads by moving their heads, "similar to Eye Tracking." Not much else was shared about this, though currently head tracking on iPhones and iPads is supported through connected devices. The suggestion that it would be "similar to Eye Tracking" seems to imply integrated support, but we don't know if that is true yet. I've asked Apple for more info and will update this piece with what I find out.
Speaking of connected devices, Apple is also adding a new protocol to Switch Control that would enable support for Brain Computer Interfaces (BCIs). Theoretically, that would mean brainwave-based control of your devices, and Apple lists iOS, iPadOS and visionOS as those on deck to support this new protocol. Again, it's uncertain whether we can go as far as to say brainwave-based control is coming, and I've also asked Apple for more information on this.
For those who use Apple TV, Assistive Access is getting a new custom Apple TV app featuring a "simplified media player," while Music Haptics on the iPhone will offer the option to turn on haptics for an entire track or just the vocals, as well as general settings to fine-tune the intensity of taps, textures and vibrations.
The Sound Recognition feature that alerts those who are deaf or hard of hearing to concerning sounds (like alarms or crying babies) will add Name Recognition to let users know when their name is being called. Sound Recognition for CarPlay, in particular, will inform users when it identifies a crying child (in addition to the existing support for external noises like horns and sirens). CarPlay will also get support for large text, which should make glanceable information easier to read.
Other updates include greater language support in Live Captions and Voice Control, as well as the ability to share accessibility settings quickly and temporarily across iPads and iPhones so you can use a friend's device without having to painstakingly customize it to your needs.
There are plenty more accessibility rollouts from Apple across its retail locations, Music playlists, Books, Podcasts, TV, News, Fitness+ and the App Store, mostly around greater representation and inclusion. Apple hasn't given exact release windows for most of the new features and updates I've covered here, though such additions have usually shown up in the next release of iOS, iPadOS, macOS and visionOS.
We'll probably have to wait until the public rollout of iOS 19, iPadOS 19 and more to try these features ourselves, but for now, most of them seem potentially very helpful. And as always, it's good to see companies design inclusively and consider a wider range of needs.