What is LiDAR?
In essence, a LiDAR scanner measures the distance between itself and objects in its environment by firing laser pulses and timing how long the light takes to bounce back. It works on the same principle as radar, but swaps long-range radio waves for infrared light. Because the light is infrared, you won't actually see the lasers at work while you're using the scanner, though certain cameras can pick them up.
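The arithmetic behind this is simple: light travels at a known, constant speed, so distance is just speed multiplied by the round-trip time, halved (the pulse travels out and back). Here's a minimal sketch in Python; the function name and the 33-nanosecond example figure are illustrative, not Apple's actual numbers:

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second, in a vacuum

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a pulse's round-trip time into a distance.

    The pulse travels to the object and back, so the one-way
    distance is half the total path the light covered.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A round trip of roughly 33 nanoseconds corresponds to an object
# about 5 metres away - the quoted range of the iPhone's scanner.
print(distance_from_round_trip(33e-9))  # ≈ 4.95 m
```

The tiny timescales involved are why LiDAR needs specialised hardware: resolving centimetres of distance means resolving fractions of a nanosecond.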
By firing thousands of laser pulses per second, LiDAR scanners can provide not only real-time distance measurements, but object size measurements too. That data can be used to construct 3D models, improve camera autofocus, provide more spatially accurate augmented reality experiences and more.
It's used in self-driving cars, surveying and construction, and 3D modelling, and companies are finding more innovative uses for the tech every day – the racetracks in the racing sim Project CARS were LiDAR-scanned to produce accurate digital recreations, for example.
LiDAR generally works on a much smaller scale than radar, although the exact capabilities depend on the scanner in question. While the LiDAR scanner of the iPhone 12 Pro and iPad Pro ranges can detect objects up to 5 metres away, the scanners employed in self-driving cars have a much longer range – they have to be able to detect oncoming traffic, pedestrians, traffic signs and potential hazards on the road, after all.
What does LiDAR do on the iPhone 12 Pro range?
The main reason Apple decided to include a LiDAR scanner on the iPhone 12 Pro was to improve augmented reality performance, providing apps with more information about the environment for a smoother, more spatially accurate AR experience. That means Pokémon in Pokémon Go will stay in place while you try to catch them, and it'll also allow you to preview products from IKEA in your own home to give you an idea of how they'd look – and whether they'll fit.
The AR performance is great across the entire iPhone 12 range, but the LiDAR scanner of the Pro range does a notably better job at recognising the environment and anchoring virtual objects in the real world.
It’s not limited to improved AR performance either; the nature of the LiDAR scanner means that the Measure app is much more accurate on the iPhone 12 Pro range, and for the first time, you’ll be able to point your camera at somebody and get an accurate(ish) height measurement.
There are also improvements to the overall camera experience, especially when it comes to low-light photography. The rest of the iPhone 12 range relies on Apple's "focus pixel" technology, essentially Apple's version of the phase-detect autofocus (PDAF) found on Android rivals. The problem is that, like much of the competition, the tech relies on available light to find the focal point, which is why photos taken in very dim conditions can sometimes come out blurry.
But by using the LiDAR scanner to measure the distance to the subject of the photo, the iPhone doesn't need to rely on ambient light to find an accurate focal point. This should make it much easier to get a great night-time shot on the first try, especially with the phone mounted on a tripod and using the dedicated Night mode.
Can you 3D scan with the iPhone 12 Pro?
While LiDAR does a great job at scanning buildings, environments and other objects, it’s not quite accurate enough to scan objects precisely – that’s according to the brains behind popular iPhone camera app Halide, Sebastiaan de With.
While developing a proof-of-concept app that uses the LiDAR scanner to 3D scan objects and print them on a 3D printer, de With discovered that "the mesh output by the system right now isn't accurate enough". The testing was done with the iPad Pro rather than the iPhone 12 Pro, but the two products share the same LiDAR system, so results are expected to be identical.
But don’t be totally disheartened: in a blog post on the Halide website, de With claims that “it’s a great starting point for a 3D model, since all the proportions will be very accurate”. You’ll just have to clean them up before getting them printed.
So, while you won’t get high-end 3D models by using the LiDAR scanner on the iPhone, it could give you a base model to work with.
Is LiDAR used for Face ID?
No – Face ID is a separate system. The TrueDepth camera projects a pattern of infrared dots to map your facial proportions and confirm your identity, and it's designed to work in extremely close proximity with impressive detail, while the rear-facing LiDAR scanner is designed to identify and measure objects up to 5m away.
Will LiDAR come to other iPhones in the future?
In a word – yes. While it’s a feature exclusive to the Pro-level iPhones and iPads for now, it’s more than likely that Apple has long-term plans to bring the sensor to the rest of the iPhone range. When that might be, however, is a little more uncertain.
It's likely that Apple will want to keep LiDAR exclusive to its Pro-level devices for at least a year or two, but with all the functionality the scanner offers – including accurate object measurement and improved low-light autofocus – it would make the standard iPhone a much more desirable purchase compared with Android alternatives, which generally rely on comparatively basic ToF systems.
Tech has a way of trickling down from the ultra-high-end to the mass market within a few years of release, and it’s likely going to be the same with the LiDAR scanner.