
What Is the LiDAR Sensor in the iPhone 12 Pro, and What Can It Do?

20.11.2020

To begin with, LiDAR is not new. It has been around for a while and was mainly used by scientists, often in the form of LiDAR-equipped drones, to examine the surface of the Earth. It has also been used in conjunction with other sensors on autonomous cars to help them get a better sense of the objects around them.

LiDAR stands for Light Detection and Ranging, and it is similar to radar (Radio Detection and Ranging) but uses light instead of radio waves. A LiDAR sensor fires out rapid pulses of highly focused infrared laser light in quick succession. The light bounces off nearby objects and returns to the sensor, and the time each round trip takes is used to build a 3D model of the scene, from which the distance to an object, its height, its shape, and so on can be measured. Radar does the same thing with radio waves: it sends them out, they bounce off objects, and the time they take to return reveals the distance and position of those objects.
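To make the round-trip arithmetic concrete, here is a minimal sketch in Swift (an illustration of the principle only, not how Apple's firmware actually works): the distance is simply the speed of light multiplied by the measured echo time, halved because the pulse travels there and back.

```swift
import Foundation

/// Speed of light in a vacuum, in meters per second.
let speedOfLight = 299_792_458.0

/// Distance to an object given the measured round-trip time of a light pulse.
/// The pulse travels out and back, so the one-way distance is half the total path.
func distance(roundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// An echo that returns after ~33 nanoseconds corresponds to roughly 5 meters.
print(distance(roundTripTime: 33.3e-9)) // ≈ 4.99 m
```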

One of the main differences between LiDAR and radar is that LiDAR is far more accurate and can create more detailed 3D models, whereas radar sees objects as rough, block-like shapes. Because of this, LiDAR can more easily determine things such as which way a person is facing, distinguish a hand from a face, or count the branches and leaves on a tree, tasks radar would struggle with. Radar, on the other hand, has a much longer range, is much cheaper, and is less affected by rain, fog, or snow, since those conditions can scatter LiDAR's light. A LiDAR sensor does emit its own light, however, so it can measure distance even in near darkness.

As for photography, image quality is still limited by how good the image sensor and image processing software are. The LiDAR sensor can help with portrait mode photos, but it will not improve the overall image quality. It does improve autofocus speed and accuracy in both photos and videos, though: thanks to the LiDAR sensor, the iPhone 12 Pro and iPhone 12 Pro Max achieve up to six times faster autofocus in low light.

Another big use of the LiDAR sensor is accurately measuring the size of objects with the Measure app that comes pre-installed on the phone. The sensor has a limited range of about 5 meters, so you need to be fairly close to an object to take advantage of this technology, but it appears far more accurate than the Measure app on previous iPhones without LiDAR.
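For developers who want to read those distances directly, ARKit 4 exposes the LiDAR readings as a per-frame depth map. The following is a minimal sketch assuming an iOS 14+ device with a LiDAR sensor, such as the iPhone 12 Pro:

```swift
import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only offered on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Each frame carries a CVPixelBuffer of per-pixel distances in meters.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Depth map: \(width) x \(height) pixels")
    }
}
```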

In autonomous cars, as well as in LiDAR drones and planes, LiDAR and radar are used in conjunction with GPS, ultrasonic sensors, and cameras to round out the system.

So why are we seeing it in consumer electronics these days? Time-of-flight (ToF) systems have been used in phones for a while to help with AR (augmented reality) and features such as portrait mode focusing. Technically speaking, time of flight is the measurement method: timing how long light takes to travel to an object and back in order to determine distance. LiDAR is the sensor that performs that measurement.

If you are familiar with Android phones from Samsung or Motorola, which have carried time-of-flight sensors for a while, you may wonder how Apple's LiDAR differs. Both the Android time-of-flight camera systems and Apple's time-of-flight LiDAR sensor use infrared light that bounces off the objects in front of them to determine distance and shape. The difference is that Apple uses multiple discrete points of light for reference, whereas the Android systems essentially flood the scene with a single blanket of infrared light. In both cases the infrared light is invisible to the human eye, although a camera with a night vision mode can see it. Regardless of the method, the software has to examine all of the reflections, determine which are indirect or off-angle, remove them from the calculations, and then combine the valid reflections into distance measurements. This is harder to do with a single flood of light than with multiple discrete points. The downside of LiDAR in this application is the same as with most new technologies: it is more expensive, and so far the only real use for it is 3D effects such as portrait mode.
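To illustrate the reflection-filtering step described above in the simplest possible terms (a toy sketch of the general idea, not Apple's or any vendor's actual pipeline), one naive approach is to reject echo times that stray too far from the median before averaging the rest:

```swift
import Foundation

/// Naively separates direct echoes from stray ones: samples that deviate
/// too far from the median are treated as indirect or off-angle reflections.
func filterReflections(_ samples: [Double], tolerance: Double = 0.1) -> [Double] {
    guard !samples.isEmpty else { return [] }
    let median = samples.sorted()[samples.count / 2]
    return samples.filter { abs($0 - median) <= median * tolerance }
}

let echoes = [33.1e-9, 33.4e-9, 48.0e-9, 33.2e-9] // one off-angle outlier
let direct = filterReflections(echoes)
let average = direct.reduce(0, +) / Double(direct.count)
print("Estimated round trip: \(average) s") // the 48 ns outlier is discarded
```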

One big advantage of the LiDAR sensor, though, is that portrait mode photos should be much more accurate, and it is the reason Apple lets us take portrait mode photos with night mode on the 12 Pro. The LiDAR sensor is very useful for certain things, but it is unlikely to be an essential everyday tool just yet. However, as these new technologies become more prevalent in our devices, we may start to see new use cases that we can benefit from over time.

Many people anticipate that the data Apple collects from the LiDAR sensors in the iPad Pro and iPhone Pro models will help create a better AR experience for the Apple glasses that are rumored for release in the next few years. Stay tuned and discover more with us as we continue reviewing the latest technologies the industry has to offer.

Here at Magnise, we have successfully applied LiDAR technology in developing robotics software for determining spatial orientation. If you have LiDAR ideas but are not sure how to implement them, we would be delighted to develop LiDAR processing software for you.

Tell Us About Your Project.
