Apple puts a lot of work into making its products accessible, and its latest feature will help users who are blind or have low vision navigate crowds of people.
We spotted this news at TechCrunch, which reports that the latest iOS beta includes a feature that takes advantage of the LiDAR (or just lidar) sensor in the iPhone 12 Pro and Pro Max models.
If you’re wondering what the heck lidar is, the term stands for light detection and ranging: the sensor bounces laser pulses off nearby objects and times how long they take to return, which yields precise distance measurements.
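To make that concrete, here is a minimal sketch of the time-of-flight principle behind lidar ranging. This is purely illustrative physics, not Apple's implementation; the function name and the example pulse timing are our own.

```python
# Illustrative sketch of lidar's time-of-flight principle (not Apple's code).
# A lidar sensor times how long a laser pulse takes to bounce back; the
# one-way distance is half the round trip multiplied by the speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a laser pulse's round-trip time into a one-way distance in metres."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse returning after roughly 10 nanoseconds puts the object about 1.5 m away.
print(round(distance_from_round_trip(10e-9), 2))  # → 1.5
```

The takeaway is just how fine-grained the timing has to be: a 1.5 m difference in distance corresponds to about ten billionths of a second of round-trip time.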
When the new feature rolls out for everyone, it will be part of the Magnifier app, telling users whether anyone is within range of the phone’s wide-angle camera, and how many people there are. Feedback comes through various options, including stereo VoiceOver and tones, plus haptics, depending on the user’s preferences and accessibility needs.
It seems like there’s a decent amount of customisation available, depending on how you want to receive feedback or whether you want to be alerted when someone comes within a specific distance, such as the recommended 1.5m social distancing here in Australia.
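As a rough sketch of how distance-based feedback like this could work, the logic below picks a more urgent cue when a detected person is inside a chosen threshold. The threshold default and the feedback labels are our own assumptions for illustration, not Apple's API or settings.

```python
# Hypothetical sketch of distance-threshold feedback selection.
# The 1.5 m default and the feedback labels are assumptions for illustration,
# not Apple's actual Magnifier/People Detection behaviour.

SOCIAL_DISTANCE_M = 1.5  # e.g. Australia's recommended social distancing

def feedback_for(distance_m: float, threshold_m: float = SOCIAL_DISTANCE_M) -> str:
    """Return a feedback label: a more urgent cue the closer a person is."""
    if distance_m < threshold_m:
        return "haptic + tone"   # person inside the chosen distance
    return "spoken distance"     # person detected, but outside the threshold

print(feedback_for(1.0))  # → haptic + tone
print(feedback_for(3.0))  # → spoken distance
```

The appeal of a design like this is that the user sets the threshold once and then gets hands-free, non-visual alerts as people move in and out of that range.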
Lidar in the iPhone 12 Pro and Pro Max has previously been marketed for its use in AR and assisting the camera in autofocusing faster in low light. However, this accessibility feature is arguably its best use yet.
iPhone 12 Pro and Pro Max users on the iOS 14.2 beta can check out the people detection feature now. If you don’t fall into this category, it’s worth checking out Apple’s vision accessibility page to find out what other tools might come in handy.