Apple has just given us our first official look at iOS 16. Of course, Apple didn't call it that. As it did last year, it simply previewed a few planned accessibility enhancements to its operating systems, adding that they'll be available "later this year with software updates across Apple platforms." In practice, that means iOS 16, iPadOS 16, watchOS 9, and macOS 13.
While the features Apple announced are aimed at people with visual, speech, or motor impairments, they also signal broader advances, particularly in AI and machine learning, that we are likely to see across Apple's operating systems. Based on the announcements, here are a few of the significant enhancements we can expect in iOS 16:
Better speech recognition with Live Captions
Android has had live captioning since version 10, and Apple is getting it three years later. When you enable the setting, your iPhone or Mac (provided it runs on Apple silicon) will automatically generate captions for nearly any audio content, including videos, FaceTime calls, and phone conversations. It's a natural extension of the on-device speech processing introduced in iOS 15 last year, but it significantly raises the bar for what that processing has to handle.
We hope this means Siri will better understand your commands and dictation, but comparable enhancements may appear elsewhere. Imagine the Notes app gaining a "transcribe" feature that produces text from any audio or video clip. If Live Captions is to work as an accessibility feature, its transcription must be close to faultless, and that level of accuracy opens the door to a world of possibilities for the rest of iOS 16.
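For a sense of what that on-device pipeline already looks like to developers, here is a minimal sketch using the Speech framework's existing on-device recognition (available since iOS 13 and expanded in iOS 15). The function name and file URL are placeholders; this is illustrative, not how Apple actually implements Live Captions.

```swift
import Speech

// Placeholder helper: a minimal sketch of on-device speech-to-text
// using Apple's Speech framework.
func transcribeOnDevice(fileAt url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(),
              recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        // Keep all processing on the device; no audio leaves the phone.
        request.requiresOnDeviceRecognition = true

        _ = recognizer.recognitionTask(with: request) { result, _ in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```

The `requiresOnDeviceRecognition` flag is the key detail: it forces the same privacy-preserving, network-free processing that a system-wide captioning feature would demand.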
AirPlay improvements courtesy of Apple Watch mirroring
Another accessibility feature coming later this year is the ability to mirror your Apple Watch on your iPhone and operate it from the iPhone's display. It's designed to help people with motor impairments, letting them control the watch through all of the iPhone's more capable accessibility features.
If Apple can mirror your Apple Watch to your iPhone and let you interact with it fully, it can probably mirror your Mac or iPad the same way. That feature alone would be revolutionary.
Door Detection feature
Apple has announced that the Magnifier app will gain the ability to identify doors in real time, estimate their distance, and read any text on them. The feature is only available on LiDAR-equipped devices (which is how it measures range), but it represents a notable step forward in on-device object detection.
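To illustrate the "read any text on them" part, here's a minimal sketch of on-device text recognition with Apple's Vision framework. The function name and input image are placeholders, and Apple hasn't said Magnifier uses exactly this API; it's simply the building block developers can reach for today.

```swift
import Vision

// Placeholder helper: a minimal sketch of on-device text recognition
// with Apple's Vision framework, applied to a photo of a door sign.
func readDoorText(in image: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        for observation in observations {
            // Print the most confident transcription for each text line.
            if let best = observation.topCandidates(1).first {
                print(best.string)
            }
        }
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```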
It's reasonable to infer that Door Detection is a natural outgrowth of Apple's earlier work on augmented reality and object detection, so don't be shocked if new ARKit developer features are revealed at WWDC. It may start with new AR applications in iOS 16, but it will surely surface in far larger endeavours as Apple's plans progress.
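For context on what ARKit already exposes, here's a hedged sketch of estimating the distance to a surface by raycasting from the screen center in a RealityKit ARView. The `arView` parameter and helper name are assumptions for illustration, and this is a simplified stand-in for whatever LiDAR-based ranging Door Detection actually performs.

```swift
import ARKit
import RealityKit
import simd

// Assumed helper: estimate the distance from the camera to whatever
// surface sits at the center of the screen in a running AR session.
func distanceToSurface(in arView: ARView) -> Float? {
    let center = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)

    // Raycast toward any estimated plane (walls, doors, floors).
    guard let hit = arView.raycast(from: center,
                                   allowing: .estimatedPlane,
                                   alignment: .any).first,
          let frame = arView.session.currentFrame else { return nil }

    // Distance is the length of the camera-to-hit-point vector.
    let cam = frame.camera.transform.columns.3
    let point = hit.worldTransform.columns.3
    return simd_distance(SIMD3(cam.x, cam.y, cam.z),
                         SIMD3(point.x, point.y, point.z))
}
```

On LiDAR-equipped devices these raycasts snap to real scene geometry rather than estimated planes, which is exactly the kind of accuracy a feature like Door Detection depends on.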