In typical Apple fashion, accessibility isn't just an afterthought; it's front and center. And today, things just got a whole lot cooler (and a little sci-fi), with the company previewing even more accessibility features ahead of iOS 19, macOS 16, visionOS 3, and the rest of its upcoming software lineup.
Alongside the usual suspects, like Accessibility Nutrition Labels for apps and a Mac version of the Magnifier, Apple is working on something that feels straight out of a Black Mirror episode (but in a good way): brain control for your iPhone. Yes, you read that right.
Controlling Your iPhone With Your Brain? That’s Not Just a Concept Anymore
A new Wall Street Journal report takes a deep dive into Apple's brain-computer interface (BCI) work, specifically highlighting a partnership with a company called Synchron. Synchron is building a device known as the Stentrode, and it's kind of amazing.
The Stentrode is a stent-like implant that sits in a vein near your brain’s motor cortex and reads brain signals. That data is then used to control Apple devices — like the iPhone, iPad, and even the Apple Vision Pro.
And yes, it’s already being tested in the real world. Here’s the most incredible part from the WSJ story:
Mark Jackson, an early tester of the Stentrode implant, was able to peer over the ledge of a mountain in the Swiss Alps and feel his legs shake. Jackson can't stand up, and he wasn't in Switzerland. He was wearing an Apple virtual-reality headset, which was connected to his implant.
Jackson can’t travel from his home outside Pittsburgh because he has ALS. Still, he is learning how to control his iPhone, iPad and Vision Pro headset thanks to a connection between his Stentrode implant and Apple’s various operating systems.
Even though the tech is still in its early days and doesn't yet support cursor movement, it's a massive step toward giving people with conditions like ALS greater digital freedom.
What Else Is Coming in iOS 19?
Apple’s not stopping there. With iOS 19, a new Switch Control protocol for BCIs will officially roll out. That means folks using brain-computer interfaces will be able to navigate their devices without any physical movement at all.
And if you've heard of Apple's Personal Voice feature from iOS 17 (the one that lets you clone your voice by reading 150 phrases), there's good news: iOS 19 cuts that down to just 10 phrases. Plus, your AI voice now gets generated in under a minute and sounds way more natural. That's a huge win for users with ALS or other speech-affecting conditions.
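For the developers in the audience: apps can already tap into a user's Personal Voice through Apple's public AVSpeechSynthesizer API, which shipped with iOS 17. Here's a minimal Swift sketch of that flow. Note that the phrase count and generation time Apple is improving all happen during system-level voice setup in Settings; none of it lives in app code.

```swift
import AVFoundation

/// A minimal sketch of speaking with the user's Personal Voice
/// using the public AVSpeechSynthesizer API (iOS 17 and later).
final class PersonalVoiceSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        // Apps must ask permission before accessing a Personal Voice.
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard status == .authorized else { return }

            // Look for a voice the user trained with Personal Voice, if any.
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }

            let utterance = AVSpeechUtterance(string: text)
            // Fall back to a stock system voice if no Personal Voice exists.
            utterance.voice = personalVoice ?? AVSpeechSynthesisVoice(language: "en-US")
            self?.synthesizer.speak(utterance)
        }
    }
}

// Usage: PersonalVoiceSpeaker().speak("Hello from my own voice.")
```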
The Bottom Line
Brain-controlled iPhones. Smarter, faster AI voice tools. Apple’s accessibility vision is getting more futuristic by the minute, and thankfully, it’s rooted in real-world impact.
We’ll hear even more about these features during WWDC 2025 on June 9, and if this is just the teaser, we can’t wait to see what’s next.
What would you do first if you could control your phone with your mind?