Post not yet marked as solved
Hello, I'm struggling to integrate the code below into my ContentView. Can someone help me?
import UIKit
import QuickLook
import ARKit

class ViewController: UIViewController, QLPreviewControllerDataSource {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true, completion: nil)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { return 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        guard let url = Bundle.main.url(forResource: "myScene", withExtension: "reality") else {
            fatalError("Couldn't find the supported input file.")
        }
        return url as QLPreviewItem
    }
}
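One way to integrate a `QLPreviewController` like the one above into a SwiftUI `ContentView` is to wrap it in a `UIViewControllerRepresentable`. This is a minimal sketch (the `ARQuickLookView` name and the assumption of a local file URL are mine, not from the post):

```swift
import SwiftUI
import QuickLook

// Wraps the UIKit QLPreviewController so it can be used from SwiftUI.
struct ARQuickLookView: UIViewControllerRepresentable {
    let url: URL  // local file URL of a .reality or .usdz file

    func makeUIViewController(context: Context) -> QLPreviewController {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
        return controller
    }

    func updateUIViewController(_ uiViewController: QLPreviewController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(url: url) }

    // The coordinator plays the QLPreviewControllerDataSource role
    // that the UIKit view controller played above.
    final class Coordinator: NSObject, QLPreviewControllerDataSource {
        let url: URL
        init(url: URL) { self.url = url }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

        func previewController(_ controller: QLPreviewController,
                               previewItemAt index: Int) -> QLPreviewItem {
            url as QLPreviewItem
        }
    }
}
```

You can then present `ARQuickLookView(url:)` from `ContentView` with `.sheet` or `.fullScreenCover`.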
Post not yet marked as solved
Hi,
I can't find the app that used Object Capture to create a 3D model in this video.
https://developer.apple.com/videos/play/wwdc2021/10078/
I think the presenter said it was "Clone", but there is no such app available in the App Store.
Can anybody tell me if the app has changed its name since then?
Post not yet marked as solved
Hello, I want to add AR Quick Look to my app. I don't want to use UIKit but SwiftUI. In one of my views, I want to press a button named "preview" and then it should open AR Quick Look. Does anyone have an idea how I can do that?
Sorry for that badly formatted code below.
import UIKit
import QuickLook
import ARKit

class ViewController: UIViewController, QLPreviewControllerDataSource {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true, completion: nil)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { return 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        guard let url = Bundle.main.url(forResource: "myScene", withExtension: "reality") else {
            fatalError("Couldn't find the supported input file.")
        }
        return url as QLPreviewItem
    }
}
https://developer.apple.com/documentation/arkit/previewing_a_model_with_ar_quick_look
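For a button-driven, SwiftUI-only approach, QuickLook also ships a `quickLookPreview(_:)` view modifier (iOS 14+) that presents the preview whenever a bound URL becomes non-nil. A minimal sketch, reusing the `myScene.reality` file name from the code above:

```swift
import SwiftUI
import QuickLook

struct ContentView: View {
    // When this becomes non-nil, the Quick Look preview is presented.
    @State private var previewURL: URL? = nil

    var body: some View {
        Button("Preview") {
            // "myScene.reality" is the bundled file from the example above.
            previewURL = Bundle.main.url(forResource: "myScene",
                                         withExtension: "reality")
        }
        .quickLookPreview($previewURL)
    }
}
```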
Can I take photos with an Android device, or any other camera, instead of an iPad or iPhone while using the Object Capture API? What negative side effects can that have?
Note: I'm asking because I noticed that the size of the USDZ model has grown too much in my experiments (for example, 200 MB instead of 20 MB).
Post not yet marked as solved
Is it possible to have multiple photogrammetry sessions running in parallel? I would like to process multiple sets of photos at the same time.
Thank you for your help.
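Whether parallel sessions are officially supported isn't documented, but the attempt is easy to sketch: one `PhotogrammetrySession` per image folder, each driven by its own task (the folder and output URLs below are hypothetical, assuming macOS 12's RealityKit API):

```swift
import Foundation
import RealityKit

// Hypothetical input folders and output files.
let jobs: [(input: URL, output: URL)] = [
    (URL(fileURLWithPath: "/tmp/photosA", isDirectory: true),
     URL(fileURLWithPath: "/tmp/modelA.usdz")),
    (URL(fileURLWithPath: "/tmp/photosB", isDirectory: true),
     URL(fileURLWithPath: "/tmp/modelB.usdz")),
]

// One independent session per photo set, each observed from its own task.
for job in jobs {
    Task {
        do {
            let session = try PhotogrammetrySession(input: job.input)
            try session.process(requests: [
                .modelFile(url: job.output, detail: .medium)
            ])
            for try await output in session.outputs {
                if case .processingComplete = output {
                    print("Finished \(job.output.lastPathComponent)")
                }
            }
        } catch {
            print("Session for \(job.input.lastPathComponent) failed: \(error)")
        }
    }
}
```

Note that the sessions will compete for GPU and memory, so they may effectively serialize or fail under memory pressure even if the API allows them.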
Post not yet marked as solved
This happens for scans where I flip the object on its side to get the bottom. I know I can fix it afterwards in Xcode, but the images Apple uses for their scans come out fine.
Here is a screenshot of a scan of a basket. Any ideas why it does this?
Post not yet marked as solved
I'm making an app that captures data using ARKit and will ultimately send the images+depth+gravity to an Object Capture Photogrammetry agent. I need to use the depth data and produce a model with correct scale, so from what I understand I need to send the depth file + set proper exif data in the image. Since I'm getting the images+depth from ARKit I'll need to set the exif data manually before saving the images. Unfortunately the documentation on this is a bit light, so would you be able to let me know what exif data needs to be set in order for the Photogrammetry to be able to create the model with proper scale?
If I try and set my Photogrammetry agent with manual metadata like this:
var sample = PhotogrammetrySample(id: id, image: image)
var dict: [String: Any] = [:]
dict["FocalLength"] = 23.551325
dict["PixelWidth"] = 1920
dict["PixelHeight"] = 1440
sample.metadata = dict
I get the following error in the output and depth is ignored:
[Photogrammetry] Can't use FocalLenIn35mmFilm to produce FocalLengthInPixel! Punting...
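The exact keys `PhotogrammetrySample.metadata` expects aren't documented. One assumption worth testing is that it mirrors the `CGImageSource` properties layout, with EXIF values nested under the `kCGImagePropertyExifDictionary` (`"{Exif}"`) key rather than placed at the top level as in the snippet above. A sketch under that assumption (the string literals are the raw values of the corresponding `CGImageProperty` constants):

```swift
import Foundation

// Build a CGImageProperties-style metadata dictionary.
func makePhotogrammetryMetadata(focalLengthMM: Double,
                                pixelWidth: Int,
                                pixelHeight: Int) -> [String: Any] {
    var exif: [String: Any] = [:]
    exif["FocalLength"] = focalLengthMM        // kCGImagePropertyExifFocalLength
    exif["PixelXDimension"] = pixelWidth       // kCGImagePropertyExifPixelXDimension
    exif["PixelYDimension"] = pixelHeight      // kCGImagePropertyExifPixelYDimension

    var metadata: [String: Any] = [:]
    metadata["{Exif}"] = exif                  // kCGImagePropertyExifDictionary
    metadata["PixelWidth"] = pixelWidth        // kCGImagePropertyPixelWidth
    metadata["PixelHeight"] = pixelHeight      // kCGImagePropertyPixelHeight
    return metadata
}

let meta = makePhotogrammetryMetadata(focalLengthMM: 23.551325,
                                      pixelWidth: 1920,
                                      pixelHeight: 1440)
// sample.metadata = meta
```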
Post not yet marked as solved
Hello,
I know that we can capture an object using RealityKit and create a 3D model of the real-world object. Can we also use RealityKit to scan a human and create a 3D model of the human along with the skeletal structure? Then we could use the same scanned human in motion capture and display the motion animation with the real person.
Thanks & Regards,
Rakesh
Post not yet marked as solved
In the preview, the 3D model is visible against the camera background, but once the camera button is pressed the screenshot is just the 3D model on a black background, which is also what gets saved.
QLPreviewController implementation:
func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
    thumbnailIndex = indexPath.item
    let previewController = QLPreviewController()
    previewController.dataSource = self
    previewController.delegate = self
    present(previewController, animated: true)
}

func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
    return 1
}

func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    let url = Bundle.main.url(forResource: models[thumbnailIndex], withExtension: "usdz")!
    return url as QLPreviewItem
}
console output is:
2021-10-07 21:26:30.567439+0200 CPARTest[522:28543] Writing analzed variants.
2021-10-07 21:26:30.620474+0200 CPARTest[522:28738] Metal GPU Frame Capture Enabled
2021-10-07 21:26:30.621447+0200 CPARTest[522:28543] Writing analzed variants.
2021-10-07 21:26:30.623724+0200 CPARTest[522:28738] Metal API Validation Enabled
<SCNNode: 0x280e04d00 | no child> -> <ARImageAnchor: 0x2809085b0 identifier="64834AA5-7912-1102-CCC2-77992D9D8FE0" transform=<translation=(0.239411 -0.028021 0.027675) rotation=(84.68° -98.90° -2.02°)> referenceImage=<ARReferenceImage: 0x281600a80 name="cake" physicalSize=(0.050, 0.050)> tracked=YES>
2021-10-07 21:26:34.972808+0200 CPARTest[522:28772] [Session] ARSession <0x104c1ff70>: ARSessionDelegate is retaining 11 ARFrames. This can lead to future camera frames being dropped.
2021-10-07 21:26:34.989003+0200 CPARTest[522:28741] [Session] ARSession <0x104c1ff70>: ARSessionDelegate is retaining 12 ARFrames. This can lead to future camera frames being dropped.
2021-10-07 21:26:35.006186+0200 CPARTest[522:28772] [Session] ARSession <0x104c1ff70>: ARSessionDelegate is retaining 13 ARFrames. This can lead to future camera frames being dropped.
2021-10-07 21:26:35.023000+0200 CPARTest[522:28750] [Session] ARSession <0x104c1ff70>: ARSessionDelegate is retaining 14 ARFrames. This can lead to future camera frames being dropped.
2021-10-07 21:26:35.039644+0200 CPARTest[522:28739] [Session] ARSession <0x104c1ff70>: ARSessionDelegate is retaining 15 ARFrames. This can lead to future camera frames being dropped.
2021-10-07 21:26:35.537424+0200 CPARTest[522:28741] [default] [self.extensionContext conformsToProtocol:auxHostProtocol.protocol] - /Library/Caches/com.apple.xbs/Sources/ExtensionFoundation/ExtensionKit-49.1/ExtensionFoundation/Source/NSExtension/NSExtensionSupport/EXExtensionContext.m:332: Class QLPreviewExtensionHostContext does not conform to aux host protocol: QLRemotePreviewHost
I have multiple requests in one session. How can I stop or cancel a request that is running without stopping or cancelling the whole PhotogrammetrySession?
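As far as I can tell, `PhotogrammetrySession` only exposes a session-wide `cancel()`, not per-request cancellation, so one workaround is to give each request its own session over the same input folder. A sketch with hypothetical URLs:

```swift
import Foundation
import RealityKit

// Hypothetical input folder and output files.
let input = URL(fileURLWithPath: "/tmp/photos", isDirectory: true)
let requests: [(url: URL, detail: PhotogrammetrySession.Request.Detail)] = [
    (URL(fileURLWithPath: "/tmp/preview.usdz"), .preview),
    (URL(fileURLWithPath: "/tmp/full.usdz"), .full),
]

// One session per request, so each can be cancelled independently.
var sessions: [PhotogrammetrySession] = []
for request in requests {
    let session = try PhotogrammetrySession(input: input)
    try session.process(requests: [
        .modelFile(url: request.url, detail: request.detail)
    ])
    sessions.append(session)
}

// Cancelling one session cancels only its request; the others keep running.
sessions[0].cancel()
```

The trade-off is that each session redoes shared work (image loading, feature matching), so this costs time and memory compared with batching requests in one session.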
Post not yet marked as solved
Hey. Does anyone have experience building AR Quick Look experiences from Reality Composer?
How can I make the object always face the camera during the AR experience? For now I can only use a scene-start trigger, which only plays once. Is there any way to keep it always facing the camera? Thanks.
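Reality Composer's behaviors can't re-run every frame, but if you can drop down to RealityKit code instead of AR Quick Look, a common billboard trick is to re-orient the entity toward the camera on every scene update. A sketch, assuming you have an `ARView` and the `Entity` you want to billboard:

```swift
import RealityKit
import Combine

// Keep a strong reference to the subscription, e.g. in your view controller,
// or the per-frame callback stops firing.
var updateSubscription: Cancellable?

func startBillboarding(entity: Entity, in arView: ARView) {
    updateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
        // Re-aim the entity at the current camera position each frame.
        let cameraPosition = arView.cameraTransform.translation
        entity.look(at: cameraPosition,
                    from: entity.position(relativeTo: nil),
                    relativeTo: nil)
    }
}
```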
Post not yet marked as solved
In session #10076 it was mentioned at the 3:00 mark that USDZ, USDA, and OBJ are supported, but I've not been able to find details on how to make the sample command-line app export .obj files. Only .usdz.
Does anyone have any information on that?
Or does anyone have tips on how to convert a .usdz to a .obj? It doesn't seem to be very easy to do.
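One avenue that may work on macOS is Model I/O, which can import USD files and export OBJ. A sketch with hypothetical paths; whether every USDZ feature survives the round trip is not guaranteed, and materials/textures are commonly lost in the conversion:

```swift
import Foundation
import ModelIO

// Hypothetical input/output paths.
let inputURL = URL(fileURLWithPath: "/tmp/scan.usdz")
let outputURL = URL(fileURLWithPath: "/tmp/scan.obj")

// Load the USDZ scan.
let asset = MDLAsset(url: inputURL)

// Model I/O advertises which extensions it can export to.
guard MDLAsset.canExportFileExtension("obj") else {
    fatalError("Model I/O cannot export .obj on this OS version.")
}

do {
    try asset.export(to: outputURL)
    print("Wrote \(outputURL.path)")
} catch {
    print("Export failed: \(error)")
}
```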
Post not yet marked as solved
My name is Daria. I represent a students team from Omsk, Russia.
After WWDC21 we decided to experiment with the Object Capture technology to reconstruct historical museum objects and place them as an art exhibition near the museum.
We've talked with different museums. Our idea was supported by the Vrubel museum (http://vrubel.ru). They gave us access to their historical sculptures (dating from the 19th century).
The following are the reconstructed models that we created with the Object Capture technology:
Young Woman
Psyche
Psyche with a butterfly
Cupid's head
Silvio
Deer with a branch
Together, we created a unique experience that is available through an iOS app to any person walking around the museum.
Video recording of the experience
We would be glad to hear any feedback from Apple and scale our experiment to other museums!
Post not yet marked as solved
Is there some code to create the example GUI app you used in the demo?
Post not yet marked as solved
From my understanding you capture images on an iOS device and send it to macOS which uses photogrammetry with Object Capture API to process it to a 3D model…
Is it possible to exclude macOS and pull the API into the app itself so it does the processing all within the app, from scanning to processing? I see on the App Store there are scanner apps already, so I know it is possible to create 3D models on the iPhone within an app. But can this API do that? If not, any resources to point me in the right direction?
(I’m working on creating a 3D food app, that scans food items and turns them into 3D models for restaurant owners… I’d like the restaurant owner to be able to scan their food item all within the app itself)
Post not yet marked as solved
I do not know how to run it.
Post not yet marked as solved
Hey! Does AR Quick Look support a mask material? I have seen that face tracking has a face mask that creates occlusion for the face, so an object like a helmet won't look like it's just floating over the head. Now I want to take that mask material and apply it to my other projects, like world tracking and image tracking. Is this possible?
thanks.
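In RealityKit code (outside AR Quick Look), the equivalent of that face mask is `OcclusionMaterial`: an invisible material that hides any virtual content behind it while letting the camera feed show through. A sketch for a world-tracking scene (the box size and plane anchoring are hypothetical placeholders for the shape of the real object you want to occlude against):

```swift
import RealityKit

// An invisible occluder: virtual content behind this box is hidden,
// while the camera feed still shows through where the box is.
let occluderMesh = MeshResource.generateBox(size: [0.2, 0.2, 0.2])
let occluder = ModelEntity(mesh: occluderMesh,
                           materials: [OcclusionMaterial()])

// Place the occluder where the real object is, e.g. on a detected plane.
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(occluder)
// arView.scene.addAnchor(anchor)  // arView is your ARView
```

Whether AR Quick Look itself honors occlusion materials baked into a .reality or .usdz file is a separate question I can't confirm.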