What does iPhone 12 Pro’s lidar feature actually do? Here it is in action

See what lidar can do on the iPhone 12 with this 3D-scanning app

The Canvas app scans homes in 3D using the iPhone 12 Pro's lidar. It's expected to do much more.


The iPhone 12 Pro's depth-scanning lidar sensor is set to open up a lot of possibilities for 3D-scanning apps on phones. A new one designed for home scanning, called Canvas, uses lidar for extra accuracy and detail. But the app will also work with non-Pro iPhones going back to the iPhone 8.

The approach taken by Canvas shows how lidar apps could evolve on the iPhone 12 Pro: adding greater accuracy and detail to processes that are already possible on non-lidar-equipped phones and tablets via other methods.

Read more: The lidar technology of the iPhone 12 does more than improve photos. Check out this cool party trick

Canvas, created by Boulder-based company Occipital, originally launched earlier this year for the iPad Pro to take advantage of its lidar scanning. When I saw a demo of its capabilities back then, I took it as a sign of how Apple's depth-sensing technology could be applied to home-improvement and measurement apps. The updated app promises scans that are clearer and crisper.

Since lidar-equipped iPhones launched, a handful of apps have emerged offering 3D scanning of objects, large-scale space scanning (photogrammetry) and augmented reality that can mix mapped-out rooms with virtual objects. But the sample scan from Occipital's Canvas app on the iPhone 12 Pro, embedded below, looks faster than any 3D-scanning app I've played with so far.

Apple's iOS 14 gives developers more raw access to the iPhone's lidar data, according to Occipital's Alex Schiff and Anton Yakubenko. That has allowed Occipital to build its own algorithm to make the best use of Apple's lidar depth map. It could also allow Occipital to apply depth-mapping data to future improvements of its app for non-lidar phones.

It's possible to scan 3D space without a dedicated depth-mapping lidar or time-of-flight sensor, and companies like 6d.ai (acquired by Niantic) have already done it. But Schiff and Yakubenko say lidar still provides a faster and more accurate upgrade to that technology. The iPhone 12 Pro version of Canvas takes more detailed scans than the first version on the iPad Pro, mostly thanks to iOS 14's deeper access to lidar information, according to Occipital. The latest lidar-enabled version is accurate to within 1%, while non-lidar scans are accurate to within 5% (making the iPhone 12 Pro a real pro upgrade for those who need the boost).
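To put those accuracy figures in perspective, here's some quick illustrative arithmetic. The 1% and 5% tolerances come from the article; the 5-meter wall is an assumed example, not something Occipital quoted:

```python
def scan_error_cm(true_length_m: float, tolerance: float) -> float:
    """Worst-case measurement error, in centimeters, for a scan
    accurate to within the given fractional tolerance."""
    return true_length_m * tolerance * 100

# A hypothetical 5-meter living-room wall:
wall_m = 5.0
lidar_err = scan_error_cm(wall_m, 0.01)     # lidar-enabled: within 1%
no_lidar_err = scan_error_cm(wall_m, 0.05)  # non-lidar: within 5%

print(f"lidar:     about {lidar_err:.0f} cm")      # about 5 cm
print(f"non-lidar: about {no_lidar_err:.0f} cm")   # about 25 cm
```

For a contractor ordering materials, the difference between being off by 5 cm and 25 cm on a single wall is the difference between a usable floor plan and a remeasure.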

Apple's iPad Pro lidar provides 574 depth points per frame in a scan, according to Yakubenko's earlier measurements, but in iOS 14 the depth maps available to developers jump up to 256x192 points. The extra detail is created through AI and camera data.
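The size of that jump is easy to quantify from the article's own figures (the comparison below is my back-of-the-envelope framing, not Occipital's):

```python
native_points = 574       # raw lidar depth points per frame, per Yakubenko
depth_map_points = 256 * 192  # depth map exposed to developers in iOS 14

print(depth_map_points)   # 49152 points per frame
print(round(depth_map_points / native_points))  # roughly 86x more per frame
```

So the hardware still measures only a few hundred true depth samples; the dense 256x192 map is largely inferred, which is why AI and camera data do so much of the work.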

Canvas room scans can be converted into a workable CAD model, a process that takes about 48 hours, but Occipital has also been working on making the conversion quicker and on adding semantic data (such as identifying doors, windows and other room details) with its AI.

As 3D scans and 3D data begin to live on iPhones and iPads, common formats for sharing and editing those files will make sense, too. While iOS 14 uses the USDZ file format for 3D files, Occipital has its own format for in-app scans, and once a scan is converted to a CAD model it can output .rvt, .ifc, .dwg, .skp and .plan formats. At some point, 3D scans may become as standardized as PDFs. We're not there yet, but we may need to get there soon.
