These days it is very difficult to buy a smartphone with a camera that could be described as bad. Cameras are one of the areas manufacturers take the greatest care over when designing a new device, because they are usually one of the features consumers look at most when buying a new phone.
This has led many of us to roll our eyes every time we read or hear that the next generation of the iPhone will once again include a better camera system, given that the current cameras of the iPhone 11 and 11 Pro already work very well.
This improved camera system, however, is about more than image quality: if the rumors are true, the next iPhone 12 Pro will incorporate a LiDAR sensor, like the 2020 iPad Pro, in its set of lenses.
Since LiDAR may still sound like Greek to many, let's start at the beginning and see why an iPhone with a LiDAR sensor could be key both for this technology and for keeping the iPhone a step ahead of its competitors.
What is a LiDAR sensor?
Let's start with the name: LiDAR is short for "light detection and ranging", which describes very well what this sensor does.
Its operation is fairly simple to understand: the sensor fires laser beams and receives their reflections to locate objects and structures, composing complete scenes by calculating the distance to everything it scans. The sensor emits a beam, which hits an object and bounces back to the sensor.
All of this happens at incredible speed, so we don't even notice it; what we get in return is a scan of the scene on our screen.
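The round-trip principle described above can be sketched in a few lines. This is an illustrative toy, not Apple's implementation; the return time used in the example is made up for the sake of the demonstration:

```python
# Illustrative sketch: how a LiDAR-style sensor turns a laser beam's
# round-trip time into a distance to the object it bounced off.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """The beam travels to the object and back, so the distance
    to the object is half the total path the light covered."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A beam that returns after roughly 13.3 nanoseconds hit something
# about two meters away.
print(round(distance_from_round_trip(13.34e-9), 2))  # → 2.0
```

The tiny time scales involved (nanoseconds for objects a few meters away) are why the whole scan feels instantaneous to us.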
This technology has long been used in other fields, such as aviation, surveying from the ground and from space, and military guidance applications. More recently, it has found its way into many homes thanks to the well-known robot vacuums, which use these sensors to map our floors so they can do their job, and into self-driving cars, where it helps the vehicle understand the elements around it so it can drive safely on the road.
It was only a matter of time before this technology reached our smartphones. With ever more processing power available, it is no surprise that manufacturers now want to take advantage of it by integrating LiDAR sensors into their camera systems.
Why a LiDAR sensor on a smartphone?
Although we are talking about LiDAR sensors here, a slightly simpler technology has been used in smartphones for a few years now: ToF, or "Time of Flight", sensors. These work with infrared light in a way similar to LiDAR, but with a single light emission, and estimate the depth of a scene by measuring the time it takes the beam to travel to the object being scanned and back.
So why use LiDAR on the iPad Pro and iPhone 12 Pro? LiDAR sensors can scan much faster than ToF sensors because they use multiple lasers to compose the scene, so they can recognize more objects and collect more information. In practice, this means better depth detection when, for example, one object sits in front of another, and it makes it easier for the LiDAR scanner to build richer, more detailed environments.
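The advantage of sampling many points can be illustrated with a toy depth map. The grid of return times below is invented for the example, and the comparison follows the article's description of a single-emission ToF reading versus a multi-beam scan:

```python
# Illustrative sketch: a multi-beam scan yields one depth per point,
# which lets us tell a nearby object apart from the background.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth(round_trip_seconds: float) -> float:
    # Half the round-trip path is the distance to the object.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# Hypothetical return times (seconds) for a 3x3 grid of laser points:
# the centre beam hits a chair ~1 m away, the rest hit a wall ~3 m away.
wall, chair = 2e-8, 6.67e-9
returns = [
    [wall, wall, wall],
    [wall, chair, wall],
    [wall, wall, wall],
]

depth_map = [[round(depth(t), 1) for t in row] for row in returns]
for row in depth_map:
    print(row)
# → [3.0, 3.0, 3.0]
#   [3.0, 1.0, 3.0]
#   [3.0, 3.0, 3.0]
```

With per-point depths like these, the foreground chair remains clearly separable from the wall behind it, which is exactly the kind of occlusion case where a denser scan pays off.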
On top of all that, this technology can of course be used alongside our smartphone cameras to take photos with a sense of depth and a bokeh effect. Apple, however, chose not to use the LiDAR sensor this way on the only device that has one, since the iPad Pro does not allow portrait mode with its rear camera. Instead, it has focused the sensor on augmented reality applications, a segment the Cupertino company is betting on heavily. Thanks to this, there are apps the sensor works very well with, such as IKEA Place, which lets us place furniture from the famous store in our homes to preview how it will fit.
Unfortunately, the iPad Pro's LiDAR camera has a few limitations. The most sorely missed feature is an API for extracting the information LiDAR captures, for example to export it as 3D objects.
Apple does have ARKit, which provides an augmented reality development environment for its devices, although for now the resulting applications do not seem entirely compelling.
Youtube / EverythingApplePro
LiDAR on iPhone 12 Pro
Installing a LiDAR sensor on the iPhone 12 Pro is a bet on the future. Today's augmented reality apps deliver a mediocre experience that is anything but charming, and that could change completely with the improved experience this sensor would offer. But how will Apple motivate the development of more AR applications?
By selling millions and millions of iPhones a year: what better way to encourage developers to create augmented reality apps than to put a sensor that brings out the best of this technology in our iPhones?
Apple is betting that, since the iPhone will have this technology, developers will eventually build for it. If millions of users have a LiDAR scanner in their pockets, why wouldn't they use it?
In the short term, the take-off of this technology will bring many more apps like IKEA Place for measuring rooms and simulating environments, and perhaps a better experience in mobile AR games. In the long run, however, it could yield many creative applications and services that finally do justice to what this scanner has to offer.
And the competition?
The next flagship to hit the shelves will probably be the Samsung Galaxy Note 20, which is expected to have a camera system very similar to (if not the same as) the one already fitted to the current Galaxy S20 Ultra. If so, it would keep a ToF sensor instead of a LiDAR scanner, leaving Apple as the only manufacturer with a LiDAR sensor in a smartphone on the market, and therefore the only one able to offer the best possible augmented reality experience, with a system of much greater resolution and potential.
Of course, this is all speculation and rumor, and in the end the iPhone 12 Pro could mount a fourth lens in its rear camera system and leave this sensor aside. However, Apple tends to be a pioneer in adopting technologies like this, so a LiDAR scanner fits perfectly in an iPhone 12 Pro, which would serve as a battering ram to make this technology part of our daily lives.