At its Worldwide Developers Conference, Apple announced a major update to RealityKit, its suite of technologies for building augmented reality (AR) experiences. RealityKit 2 gives developers more control over images, sounds, and animations, but the most striking part of the update is the new Object Capture API, which lets developers create 3D models in minutes using just an iPhone.
Apple noted during the presentation that one of the hardest parts of creating a great AR app is building the 3D models, which can take hours and cost thousands of dollars.
With Apple’s new tools, developers will be able to take a series of photos using just an iPhone (or an iPad, a DSLR, or even a drone, if they prefer) to capture 2D images of an object from every angle, including the bottom.
Apple demonstrated how the Object Capture API on macOS Monterey can turn those photos into 3D models with just a few lines of code. The developer starts a new photogrammetry session in RealityKit, pointing it at the folder where the images were captured, then calls the process function to generate a 3D model at the desired level of detail. Object Capture outputs USDZ files optimized for AR Quick Look, the system that lets developers add virtual 3D objects to apps or websites on iPhone and iPad. The models can also be added to AR scenes in Reality Composer in Xcode.
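The workflow described above can be sketched in Swift roughly as follows. This is a minimal illustration of RealityKit's PhotogrammetrySession on macOS Monterey; the input and output paths are placeholders, and a real app would also configure the session and handle errors more carefully.

```swift
import Foundation
import RealityKit

// Placeholder paths for illustration only.
let inputFolder = URL(fileURLWithPath: "/path/to/captured/photos")
let outputFile = URL(fileURLWithPath: "/path/to/model.usdz")

do {
    // Start a new photogrammetry session pointing at the folder
    // where the photos were captured.
    let session = try PhotogrammetrySession(input: inputFolder)

    // Request a USDZ model at the desired level of detail
    // (.preview, .reduced, .medium, .full, or .raw).
    try session.process(requests: [
        .modelFile(url: outputFile, detail: .reduced)
    ])

    // Observe the session's async output stream for progress
    // and completion messages.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fractionComplete):
                print("Progress: \(fractionComplete)")
            case .requestComplete(_, let result):
                print("Model ready: \(result)")
            case .processingComplete:
                print("All requests finished")
            default:
                break
            }
        }
    }
} catch {
    print("Object Capture failed: \(error)")
}
```

The resulting USDZ file can then be dropped into Reality Composer or served on the web for AR Quick Look.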
Apple says developers like Wayfair and Etsy are already using Object Capture to create 3D models of real-world objects, an indication that online shopping is about to get a big AR upgrade.
Wayfair, for example, is using Object Capture to develop tools that let its manufacturers create virtual trade shows, which will allow Wayfair customers to preview more products in AR than they can today.
Apple also notes that developers including Maxon and Unity are using Object Capture to bring 3D content into content-creation apps like Cinema 4D and Unity MARS.
Other updates in RealityKit 2 include custom shaders, which give developers more control over the rendering pipeline to fine-tune the look and feel of AR objects; dynamic asset loading; the ability to build a custom entity component system to organize the assets in an AR scene; and the ability to create player-controlled characters, so users can jump, scale, and explore AR worlds in games built with RealityKit.
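The custom entity component system mentioned above can be sketched as below. This is a hypothetical example using RealityKit 2's Component and System protocols; the SpinComponent and SpinSystem names are made up for illustration.

```swift
import RealityKit

// Hypothetical component: marks an entity as spinning at a given speed.
struct SpinComponent: Component {
    var radiansPerSecond: Float = 1.0
}

// A custom system that rotates every entity carrying a SpinComponent.
class SpinSystem: System {
    // Query matching only entities that have our component.
    static let query = EntityQuery(where: .has(SpinComponent.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            // Rotate around the y-axis, scaled by the frame's delta time.
            let angle = spin.radiansPerSecond * Float(context.deltaTime)
            entity.transform.rotation *= simd_quatf(angle: angle,
                                                    axis: [0, 1, 0])
        }
    }
}

// Registration, typically done once at app launch:
// SpinComponent.registerComponent()
// SpinSystem.registerSystem()
```

Once registered, RealityKit calls the system's update method every frame for the whole scene, which keeps per-entity behavior out of view-controller code.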
Mikko Haapoja, a developer at Shopify, has been experimenting with the new technology and shared real-world tests on Twitter in which he photographed objects using an iPhone 12 Max.
Developers who want to test it themselves can use Apple’s sample app and install Monterey on their Mac to try it out. They can use the Qlone camera app, or any other capture app from the App Store, to take the photos they need, Apple said. This fall, Qlone’s Mac companion app will take advantage of the Object Capture API as well.
Apple says there are more than 14,000 ARKit apps in the App Store today, created by more than 9,000 developers. With more than 1 billion AR-enabled iPhones and iPads in use worldwide, Apple notes that it offers the largest AR platform in the world.