AR Quick Look, meet Object Capture

Discuss the WWDC21 session AR Quick Look, meet Object Capture.


Posts under wwdc21-10078 tag

17 Posts
Post not yet marked as solved
1 Reply
187 Views
Hello, I'm struggling to integrate the code below into my ContentView. Can someone help me?

```swift
import UIKit
import QuickLook
import ARKit

class ViewController: UIViewController, QLPreviewControllerDataSource {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true, completion: nil)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { return 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        guard let path = Bundle.main.path(forResource: "myScene", ofType: "reality") else {
            fatalError("Couldn't find the supported input file.")
        }
        let url = URL(fileURLWithPath: path)
        return url as QLPreviewItem
    }
}
```
Posted by iRIG.
Post not yet marked as solved
2 Replies
149 Views
Hi, I can't find the app that used Object Capture to create a 3D model in this video: https://developer.apple.com/videos/play/wwdc2021/10078/ I think the presenter said it was "Clone", but there is no such app available in the App Store. Can anybody tell me if the app has changed its name since then?
Posted by jnsu.
Post not yet marked as solved
0 Replies
167 Views
Hello, I want to implement AR Quick Look in my app. I don't want to use UIKit, but SwiftUI. In one of my views, I want to press a button named "Preview" and have it open AR Quick Look. Does anyone have an idea how I can do that? Sorry for the badly formatted code below.

```swift
import UIKit
import QuickLook
import ARKit

class ViewController: UIViewController, QLPreviewControllerDataSource {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true, completion: nil)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { return 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        guard let path = Bundle.main.path(forResource: "myScene", ofType: "reality") else {
            fatalError("Couldn't find the supported input file.")
        }
        let url = URL(fileURLWithPath: path)
        return url as QLPreviewItem
    }
}
```

https://developer.apple.com/documentation/arkit/previewing_a_model_with_ar_quick_look
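One common approach is to wrap QLPreviewController in a UIViewControllerRepresentable and present it from a SwiftUI sheet. A minimal sketch, assuming a bundled myScene.reality file (the ARQuickLookView type name and the file name are illustrative, not from the session):

```swift
import SwiftUI
import QuickLook

// Hypothetical wrapper name: bridges QLPreviewController into SwiftUI.
struct ARQuickLookView: UIViewControllerRepresentable {
    let fileURL: URL

    func makeCoordinator() -> Coordinator { Coordinator(fileURL: fileURL) }

    func makeUIViewController(context: Context) -> QLPreviewController {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
        return controller
    }

    func updateUIViewController(_ uiViewController: QLPreviewController, context: Context) {}

    // The coordinator plays the data-source role the UIKit view controller played.
    class Coordinator: NSObject, QLPreviewControllerDataSource {
        let fileURL: URL
        init(fileURL: URL) { self.fileURL = fileURL }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

        func previewController(_ controller: QLPreviewController,
                               previewItemAt index: Int) -> QLPreviewItem {
            fileURL as QLPreviewItem
        }
    }
}

struct ContentView: View {
    @State private var showPreview = false

    var body: some View {
        Button("Preview") { showPreview = true }
            .sheet(isPresented: $showPreview) {
                // Assumes myScene.reality is bundled with the app.
                if let url = Bundle.main.url(forResource: "myScene",
                                             withExtension: "reality") {
                    ARQuickLookView(fileURL: url)
                }
            }
    }
}
```

Presenting via `.sheet` avoids reaching for a UIKit view controller hierarchy from SwiftUI; a `.fullScreenCover` works the same way if an edge-to-edge presentation is preferred.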
Posted by iRIG.
Post marked as solved
1 Reply
312 Views
Can I take photos with an Android device or any camera, instead of an iPad or iPhone, while using the Object Capture API? What negative side effects can this have? Note: I'm asking because I noticed that the size of the USDZ model has grown too much in my experiments (for example, 200 MB instead of 20 MB).
Posted by mudur.
Post not yet marked as solved
0 Replies
238 Views
This happens for scans where I flip the object on its side to get the bottom. I know I can fix it afterwards in Xcode, but the images Apple uses for their scans come out fine. Here is a screenshot of a scan of a basket; any ideas why it does this?
Posted by Ross17.
Post not yet marked as solved
0 Replies
282 Views
I'm making an app that captures data using ARKit and will ultimately send the images + depth + gravity to an Object Capture Photogrammetry agent. I need to use the depth data and produce a model with correct scale, so from what I understand I need to send the depth file and set the proper EXIF data in the image. Since I'm getting the images and depth from ARKit, I'll need to set the EXIF data manually before saving the images. Unfortunately, the documentation on this is a bit light, so would you be able to let me know what EXIF data needs to be set in order for the photogrammetry to create the model with proper scale? If I set up my PhotogrammetrySample with manual metadata like this:

```swift
var sample = PhotogrammetrySample(id: id, image: image)

var dict: [String: Any] = [:]
dict["FocalLength"] = 23.551325
dict["PixelWidth"] = 1920
dict["PixelHeight"] = 1440
sample.metadata = dict
```

I get the following error in the output, and depth is ignored:

[Photogrammetry] Can't use FocalLenIn35mmFilm to produce FocalLengthInPixel! Punting...
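Since the exact metadata keys the photogrammetry pipeline expects are not documented, one avenue worth trying is to skip file-level EXIF entirely and attach the depth and gravity through PhotogrammetrySample's typed properties. A hedged sketch of the macOS-side sample construction (the function name and the shape of the inputs are assumptions; it presumes the image and depth buffers were captured on device with ARKit and transferred over):

```swift
import RealityKit
import CoreMotion

// Sketch: attach depth and gravity via PhotogrammetrySample's typed
// properties instead of image-file EXIF, sidestepping the undocumented
// metadata dictionary keys.
func makeSample(id: Int,
                image: CVPixelBuffer,
                depth: CVPixelBuffer?,
                gravity: CMAcceleration?) -> PhotogrammetrySample {
    var sample = PhotogrammetrySample(id: id, image: image)
    sample.depthDataMap = depth   // depth map captured with ARKit's sceneDepth
    sample.gravity = gravity      // device gravity vector at capture time
    return sample
}
```

Samples built this way can be fed to a PhotogrammetrySession through a sequence-based input instead of an image folder, which keeps the depth association explicit.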
Post not yet marked as solved
0 Replies
261 Views
Hello, I know that we can capture an object using RealityKit and create a 3D model of a real-world object. Can we also use RealityKit to scan a human and create a 3D model of the human along with the skeletal structure? Then we could use the same scanned human in motion capture and display the motion animation with the real person. Thanks & regards, Rakesh
Posted by RPiOS.
Post not yet marked as solved
1 Reply
1.2k Views
In preview, the 3D model is visible in the background, but once the camera button is pressed, the image in the screenshot is just the 3D model with a black background, which is also what gets saved. QLPreviewController implementation:

```swift
func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
    thumbnailIndex = indexPath.item
    let previewController = QLPreviewController()
    previewController.dataSource = self
    previewController.delegate = self
    present(previewController, animated: true)
}

func numberOfPreviewItems(in controller: QLPreviewController) -> Int { return 1 }

func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    let url = Bundle.main.url(forResource: models[thumbnailIndex], withExtension: "usdz")!
    return url as QLPreviewItem
}
```

The console output is:

```
2021-10-07 21:26:30.567439+0200 CPARTest[522:28543] Writing analzed variants.
2021-10-07 21:26:30.620474+0200 CPARTest[522:28738] Metal GPU Frame Capture Enabled
2021-10-07 21:26:30.621447+0200 CPARTest[522:28543] Writing analzed variants.
2021-10-07 21:26:30.623724+0200 CPARTest[522:28738] Metal API Validation Enabled
<SCNNode: 0x280e04d00 | no child> -> <ARImageAnchor: 0x2809085b0 identifier="64834AA5-7912-1102-CCC2-77992D9D8FE0" transform=<translation=(0.239411 -0.028021 0.027675) rotation=(84.68° -98.90° -2.02°)> referenceImage=<ARReferenceImage: 0x281600a80 name="cake" physicalSize=(0.050, 0.050)> tracked=YES>
2021-10-07 21:26:34.972808+0200 CPARTest[522:28772] [Session] ARSession <0x104c1ff70>: ARSessionDelegate is retaining 11 ARFrames. This can lead to future camera frames being dropped.
2021-10-07 21:26:34.989003+0200 CPARTest[522:28741] [Session] ARSession <0x104c1ff70>: ARSessionDelegate is retaining 12 ARFrames. This can lead to future camera frames being dropped.
2021-10-07 21:26:35.006186+0200 CPARTest[522:28772] [Session] ARSession <0x104c1ff70>: ARSessionDelegate is retaining 13 ARFrames. This can lead to future camera frames being dropped.
2021-10-07 21:26:35.023000+0200 CPARTest[522:28750] [Session] ARSession <0x104c1ff70>: ARSessionDelegate is retaining 14 ARFrames. This can lead to future camera frames being dropped.
2021-10-07 21:26:35.039644+0200 CPARTest[522:28739] [Session] ARSession <0x104c1ff70>: ARSessionDelegate is retaining 15 ARFrames. This can lead to future camera frames being dropped.
2021-10-07 21:26:35.537424+0200 CPARTest[522:28741] [default] [self.extensionContext conformsToProtocol:auxHostProtocol.protocol] - /Library/Caches/com.apple.xbs/Sources/ExtensionFoundation/ExtensionKit-49.1/ExtensionFoundation/Source/NSExtension/NSExtensionSupport/EXExtensionContext.m:332: Class QLPreviewExtensionHostContext does not conform to aux host protocol: QLRemotePreviewHost
```
Posted by DaliborK.
Post not yet marked as solved
1 Reply
652 Views
Hey. Does anyone have experience building an AR Quick Look experience from Reality Composer? How can I set the object to always face the camera during the AR experience? For now I can only use a scene-start trigger, which only plays once. Is there any way to keep it always facing the camera? Thanks.
Posted by DDHTOM.
Post not yet marked as solved
2 Replies
2.2k Views
In the keynote session #10076, it was mentioned at the 3:00 mark that USDZ, USDA and OBJ are supported, but I've not been able to find details on how to make the sample command-line app export .obj files, only .usdz. Does anyone have any information on that? Or does anyone have tips on how to convert a .usdz to a .obj? It doesn't seem to be very easy to do.
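One thing worth trying, a hedged sketch rather than a confirmed behavior of the sample app: PhotogrammetrySession requests take an output URL, and if the session infers the output format from the file extension (as it does for .usdz), then pointing the request at a .obj path may be all that is needed. The folder and file paths below are placeholders:

```swift
import RealityKit

do {
    // Placeholder paths; substitute your own image folder and output location.
    let inputFolder = URL(fileURLWithPath: "/path/to/images", isDirectory: true)
    let outputURL = URL(fileURLWithPath: "/path/to/model.obj")

    let session = try PhotogrammetrySession(input: inputFolder)

    // Sketch: request a model file with a .obj extension, on the assumption
    // that the extension selects the export format.
    try session.process(requests: [
        .modelFile(url: outputURL, detail: .medium)
    ])
} catch {
    print("Photogrammetry request failed: \(error)")
}
```

If that assumption does not hold, a separate conversion step through a DCC tool that reads USD (for example, via a USD-aware exporter) is the usual fallback.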
Posted by Staus.
Post not yet marked as solved
1 Reply
659 Views
My name is Daria. I represent a student team from Omsk, Russia. After WWDC21, we decided to experiment with the Object Capture technology to reconstruct historical museum objects and place them as an art exhibition near the museum. We talked with different museums, and our idea was supported by the Vrubel museum (http://vrubel.ru). They provided us access to their historical sculptures (dated to the 19th century). The following are the reconstructed models that we created with the Object Capture technology:
Young Woman
Psyche
Psyche with a butterfly
Cupid's head
Silvio
Deer with a branch
All together, we created a unique experience that is available through an iOS app to any person walking around the museum. Video recording of the experience. We would be glad to hear any feedback from Apple and to scale our experiment to other museums!
Posted by melamory.
Post not yet marked as solved
3 Replies
2.1k Views
From my understanding, you capture images on an iOS device and send them to macOS, which uses photogrammetry with the Object Capture API to process them into a 3D model… Is it possible to exclude macOS and pull the API into the app itself so it does the processing all within the app, from scanning to processing? I see there are already scanner apps on the App Store, so I know it is possible to create 3D models on the iPhone within an app, but can this API do that? If not, any resources to point me in the right direction? (I'm working on a 3D food app that scans food items and turns them into 3D models for restaurant owners… I'd like the restaurant owner to be able to scan their food item all within the app itself.)
Post not yet marked as solved
0 Replies
497 Views
Hey! Does AR Quick Look support a mask material? I have seen that face tracking has a face mask, where the face creates occlusion, so an object like a helmet won't look like it's just floating over the head. Now I want to take that mask material and apply it to my other projects, like world tracking and image tracking. Is this possible? Thanks.
Posted by DDHTOM.