LIDAR AND YOUR DEVICES
Lidar is a sensor incorporated into devices primarily to capture more detailed images of scenes on the fly. It determines ranges by targeting an object or a surface with a laser and measuring the time the reflected light takes to return to the receiver. Lidar is commonly used to make high-resolution images, with far-ranging applications. Built into devices like cameras and robotic vacuum cleaners, it can capture images not only in three dimensions, but also at night, in exceptional detail. It’s a kind of night-vision camera with tremendous static and dynamic data-capture capability.
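The ranging math behind that "measure the time for the reflected light to return" idea is simple enough to sketch. The following is purely illustrative (not any device's actual firmware): the one-way distance is half the round-trip time multiplied by the speed of light, since the pulse travels out and back.

```python
# Illustrative time-of-flight range calculation (not real device firmware).
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to a target from a lidar pulse's round-trip time.

    The pulse travels to the target and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A pulse returning after about 33 nanoseconds corresponds to a
# target roughly 5 metres away.
print(round(tof_distance_m(33e-9), 2))
```

Note how short these intervals are: across an ordinary room, the round trip takes tens of nanoseconds, which is why lidar sensors need very fast timing electronics.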
A basic time-of-flight sensor measures depth with a single light pulse, whereas a smartphone with lidar sends waves of light pulses out in a spray of infrared dots and measures each one with its sensor, creating a field of points that maps out distances and can “mesh” the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night-vision camera.
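That "field of points" can be pictured as a small grid of depth readings, one per infrared dot, back-projected into 3D. A minimal sketch, assuming a simple pinhole-camera model with made-up focal length and depth values (real devices use calibrated optics and much denser grids):

```python
# Sketch: turning a grid of lidar depth samples into 3D points
# using a pinhole-camera model. Focal length and depths are
# invented for illustration only.
def depth_grid_to_points(depths, focal_px, cx, cy):
    """Back-project a 2D grid of depth readings (metres) into
    (x, y, z) points in camera space."""
    points = []
    for row, line in enumerate(depths):
        for col, z in enumerate(line):
            if z is None:        # no return for this infrared dot
                continue
            x = (col - cx) * z / focal_px
            y = (row - cy) * z / focal_px
            points.append((x, y, z))
    return points

# A tiny 2x2 "field of points": each dot's measured depth in metres.
cloud = depth_grid_to_points([[1.0, 1.2], [None, 2.0]],
                             focal_px=100.0, cx=0.5, cy=0.5)
print(len(cloud))  # 3 valid points; the missing return is skipped
```

Meshing software then connects neighbouring points into surfaces, which is how a spray of dots becomes a 3D model of a room.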
Lidar is a technology that’s sprouting up everywhere. It’s used for self-driving cars, or assisted driving. It’s used for robotics and drones. Augmented reality headsets like the HoloLens 2 have similar tech, mapping out room spaces before layering 3D virtual objects into them. There’s even a VR headset with lidar. But it also has a pretty long history.
Microsoft’s old depth-sensing Xbox accessory, the Kinect, was a camera that had infrared depth-scanning, too. In fact, PrimeSense, the company that helped make the Kinect tech, was acquired by Apple in 2013. Now, we have Apple’s face-scanning TrueDepth and rear lidar camera sensors.
Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better low-light focus, up to six times faster in dim conditions. The lidar depth sensing is also used to improve night portrait mode effects. So far, it makes an impact: read our review of the iPhone 12 Pro Max for more. With the iPhone 13 Pro and other similarly equipped smartphones, it’s a similar story: the lidar technology is the same, even if the camera technology is improved.
Better focus is a plus, and there’s also a chance the iPhone 12 Pro or other similarly equipped lidar smartphones could add more 3D photo data to images, too. Although that element hasn’t been laid out yet, Apple’s front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps, and third-party developers could dive in and develop some wild ideas. It’s already happening.
Lidar allows the iPhone and iPad Pros and other similarly equipped smartphones to start augmented reality (AR) applications a lot more quickly, and to build a fast map of a room to add more detail. A lot of Apple’s core augmented reality (AR) technology takes advantage of lidar to hide virtual objects behind real ones (called occlusion), and to place virtual objects within more complicated room mappings, like on a table or chair.
It’s been tested out on an Apple Arcade game, Hot Lava, which already uses lidar to scan a room and all its obstacles. It’s able to place virtual objects on stairs and have things hide behind real-life objects in the room. Expect a lot more AR applications to start adding lidar support like this for richer experiences.
But there’s extra potential beyond that, with a longer tail. Many companies are dreaming of headsets that will blend virtual objects and real ones: Augmented reality (AR) glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple and others, will rely on having advanced 3D maps of the world to layer virtual objects onto.
Those 3D maps are being built now with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there’s a possibility that people’s own devices could eventually help crowdsource that info, or add extra on-the-fly data. Again, augmented reality (AR) headsets like Magic Leap and HoloLens already prescan your environment before layering things into it, and Apple’s lidar-equipped augmented reality (AR) technology works the same way. In that sense, the iPhone 12 and 13 Pro and iPad Pro are like AR headsets without the headset part and by implication could pave the way for Apple’s first VR/AR headset in the future. For an example of how this would work, look to the high-end Varjo XR-3 headset, which uses lidar for mixed reality.
Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share it with others could turn these lidar-equipped phones and tablets into 3D-content capture tools. Lidar can also be used without the camera element to acquire measurements for objects and spaces.
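The camera-free measurement idea is easy to see in miniature: once lidar has given you 3D coordinates for two spots in a room, the measurement between them is just the straight-line distance. A hedged sketch with invented coordinates:

```python
import math

# Illustrative: measuring between two lidar-derived 3D points (metres).
def point_distance(p, q):
    """Straight-line distance between two (x, y, z) points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# e.g. two corners of a table detected 3 metres apart along one axis
print(point_distance((0.0, 0.0, 2.0), (3.0, 0.0, 2.0)))  # 3.0
```

Measuring apps essentially do this at scale, with the added work of letting you pick the endpoints on screen.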
We have tried a few early lidar-enabled 3D scanning apps on the iPhone 12 Pro with mixed success (3D Scanner App, Lidar Scanner and Record3D), but they can be used to scan objects or map out rooms with surprising speed. The 16-foot effective range of lidar’s scanning is enough to reach across most rooms in a house, but bigger outdoor spaces take more moving around. Again, Apple’s front-facing TrueDepth camera already does similar things at closer range. Over time, it’ll be interesting to see if Apple and other makers of similarly equipped smartphones end up putting 3D scanning features into their own camera apps, putting the technology more front and center. For now, 3D scanning is getting better, but remains a niche feature for most people.
Google had this same idea in mind with Project Tango, an early augmented reality (AR) platform that shipped on two phones it created. The advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google’s Tango-equipped phones were short-lived, replaced by computer vision algorithms that estimate depth on cameras without needing the same hardware. This time, however, lidar is already finding its way into cars, AR headsets, robotics, and much more.
CULLED FROM CNET AND WIKIPEDIA, AND EDITED WITH ADDITIONAL MATERIAL FROM ETECHNOW