
Is LIDAR the Missing Link for Apple Glasses?

There was a very interesting new technology built into Apple's recently released iPad Pro. Something that could pave the way for the much-rumoured Apple Glasses. Something that could prove very useful not just for augmented reality, but for accessibility, and specifically mobility.

Say hello to LIDAR.

What Is LIDAR?

LIDAR stands for Light Detection and Ranging, and it's a technology that has been around for quite a while in one form or another. It's used in many different fields, from geography and archaeology to accident and crime scene mapping and ocean exploration. You may have most recently heard of it in the context of self-driving cars, as it's one of the core sensors that make autonomous driving possible.

In basic terms, it sends out pulses of light and measures how long each pulse takes to bounce back, using those timings to map the surrounding environment into a very accurate 3D model of the real world. Obviously, in terms of self-driving cars, it's vital that the car knows exactly what objects and possible dangers are around it, and LIDAR can provide this environmental data many times a second. So just why is it used in an iPad?
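To make that idea concrete, here's a minimal Swift sketch of the time-of-flight principle behind LIDAR: a pulse goes out, bounces off an object and comes back, and because we know the speed of light, the round-trip time tells us the distance. This is purely illustrative of the maths, not how Apple's sensor is actually implemented.

```swift
import Foundation

// Speed of light in metres per second.
let speedOfLight = 299_792_458.0

/// Converts the measured round-trip time of a light pulse into a distance.
/// The pulse travels out to the object and back, so we halve the total path.
func distance(fromRoundTripTime seconds: Double) -> Double {
    (speedOfLight * seconds) / 2.0
}

// Example: a pulse that returns after about 20 nanoseconds
// corresponds to an object roughly 3 metres away.
let roundTrip = 20e-9 // seconds
print(String(format: "%.2f m", distance(fromRoundTripTime: roundTrip)))
```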

Why Should We Care?

The ability to 3D map your real-world surroundings is something that is also very important when it comes to augmented reality.

So far Apple has used clever software algorithms and cameras to identify various surfaces, such as tables and walls, so that it can overlay digital content that seems to interact with real-world objects in a convincing manner. And it does work well, but it's not perfect.

Using a LIDAR sensor means that a much more accurate and convincing mix of the digital and real world is possible, with game characters that can walk around and interact with objects in the room you're standing in. Of course, there are other applications for this feature, but augmented reality is definitely Apple's focus for this technology. With Apple now using a LIDAR sensor on the new iPad Pro, it would seem reasonable to suggest that LIDAR will be an essential part of Apple Glasses.
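For developers, the practical difference shows up in ARKit. The sketch below is a minimal, illustrative setup (not Apple's own code) showing how an app might keep using camera-based plane detection while opting into the LiDAR-driven scene mesh when the device supports it:

```swift
import ARKit

// Standard world tracking with camera-based plane detection.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]

// On LiDAR-equipped devices, ARKit can also build a live 3D mesh
// of the surroundings, which is what makes occlusion and physics
// against real-world objects so much more convincing.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

// In a real app the session would belong to an ARView or ARSCNView.
let session = ARSession()
session.run(configuration)
```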

The whole point of wearable glasses is augmented reality: being able to recognise and map the real world and offer relevant, contextual information from that data.

Now, this is all great, but what does it mean for us as visually impaired people? Well, having a real-time, precise map of our surroundings could obviously be incredibly useful for mobility, object recognition and more. Using a camera alone, software might detect a black circle on the ground in front of you, but it wouldn't know if it's a hole, a shadow or just a puddle. LIDAR, on the other hand, would know its exact shape and dimensions, and therefore whether it was something to be avoided.
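As a purely hypothetical sketch of why depth matters here, imagine we already have depth readings in metres, sampled from a LiDAR depth map, both inside the dark patch and on the ground around it. A shadow or puddle sits at roughly the same depth as the surrounding pavement, while a hole reads noticeably farther from the sensor. None of this is a real accessibility feature; the function and threshold below are made up for illustration.

```swift
import Foundation

/// Possible interpretations of a dark patch on the ground ahead.
enum GroundPatch {
    case flat   // shadow or puddle: level with the surrounding ground
    case drop   // hole or step down: measurably deeper than its surroundings
}

/// Compares depth samples (in metres) taken inside the patch against samples
/// from the ground immediately around it. `threshold` is an assumed tolerance
/// for sensor noise, not a value from any real device.
func classify(patchDepths: [Double],
              surroundingDepths: [Double],
              threshold: Double = 0.1) -> GroundPatch {
    let patchAverage = patchDepths.reduce(0, +) / Double(patchDepths.count)
    let groundAverage = surroundingDepths.reduce(0, +) / Double(surroundingDepths.count)
    // A patch that reads noticeably farther from the sensor than the ground
    // around it is a depression, not just a flat discolouration.
    return (patchAverage - groundAverage) > threshold ? .drop : .flat
}

// Example: the patch reads about 0.4 m deeper than the surrounding pavement.
print(classify(patchDepths: [2.10, 2.20, 2.15], surroundingDepths: [1.70, 1.75, 1.72]))
```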

Of course, how fast Apple's LIDAR sensor can capture and process data is probably nowhere near real-time at the moment, but, as we all know, technology never stands still and is always improving. Any technology that can give us more information about what's around us is definitely something we should all be interested in.

In episode 127 of the Double Tap Canada radio show, Steven, Shaun & Tim discuss the new LIDAR sensor and what it could mean for us, and for people's acceptance of Apple Glasses.

Listen To The Audio Below:

Shaun Preece

Co-host & audio producer on the Double Tap Canada radio show. Occasional contributor to Double Tap TV, full-time shed resident.
