AR Quick Look


Allow users to see incredibly detailed object renderings in their real-world surroundings, with support for audio playback, using AR Quick Look.

AR Quick Look Documentation

Posts under the AR Quick Look tag

12 Posts
Post not yet marked as solved
0 Replies
480 Views
Hi, I'm new to iOS development and am building an app with an AR feature. I found a few tutorials about AR Quick Look, but they all use storyboards. Is there a way to present AR Quick Look from SwiftUI?

ContentView.swift:

```swift
import SwiftUI
//import QuickLook
//import ARKit

struct ContentView: View {
    @State private var isPresented = false

    var body: some View {
        VStack {
            Button {
                isPresented = true
                print("click")
            } label: {
                Text("Click to AR")
                    .font(.title)
                    .fontWeight(.bold)
                    .padding()
                    .background()
                    .cornerRadius(16)
            }
            .sheet(isPresented: $isPresented) {
                ARView()
            }
            .padding()
        }
    }
}

#Preview {
    ContentView()
}
```

ARView.swift:

```swift
import SwiftUI

struct ARView: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> QuickViewController {
        QuickViewController()
    }

    func updateUIViewController(_ uiViewController: QuickViewController, context: Context) {
        uiViewController.presentARQuickLook()
    }

    typealias UIViewControllerType = QuickViewController
}
```

QuickViewController.swift:

```swift
import UIKit
import QuickLook
import ARKit

class QuickViewController: UIViewController, QLPreviewControllerDelegate, QLPreviewControllerDataSource {
    // How many models to present
    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    // Provide the model to display
    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        let url = Bundle.main.url(forResource: "bear", withExtension: "usdz")! // Load the file URL
        let preview = ARQuickLookPreviewItem(fileAt: url)
        return preview
    }

    func presentARQuickLook() {
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true)
        print("Open AR model!")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }
}
```
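For reference, a common alternative pattern is to wrap QLPreviewController itself in UIViewControllerRepresentable, with a Coordinator acting as the data source, rather than presenting from inside updateUIViewController(_:context:) (which runs on every SwiftUI update and can trigger repeated presentations). A minimal sketch, assuming the same bear.usdz in the app bundle as the post above:

```swift
import SwiftUI
import QuickLook
import ARKit

// Wraps QLPreviewController directly; the Coordinator serves the USDZ file.
struct ARQuickLookView: UIViewControllerRepresentable {
    let fileName: String  // base name of a .usdz in the bundle, e.g. "bear"

    func makeCoordinator() -> Coordinator {
        Coordinator(fileName: fileName)
    }

    func makeUIViewController(context: Context) -> QLPreviewController {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
        return controller
    }

    func updateUIViewController(_ uiViewController: QLPreviewController, context: Context) {}

    class Coordinator: NSObject, QLPreviewControllerDataSource {
        let fileName: String
        init(fileName: String) { self.fileName = fileName }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

        func previewController(_ controller: QLPreviewController,
                               previewItemAt index: Int) -> QLPreviewItem {
            // Force-unwrap is tolerable here only because the asset ships in the bundle.
            let url = Bundle.main.url(forResource: fileName, withExtension: "usdz")!
            return ARQuickLookPreviewItem(fileAt: url)
        }
    }
}
```

With this, the sheet becomes .sheet(isPresented: $isPresented) { ARQuickLookView(fileName: "bear") } and no intermediate view controller is needed.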
Post not yet marked as solved
2 Replies
626 Views
Hello Apple community,

I hope this message finds you well. I'm writing to report an issue that I've encountered after upgrading my iPad to iPadOS 17. The problem seems to be related to the Quick Look AR application, which I use extensively for 3D modeling and visualization. Prior to the upgrade, everything was working perfectly fine. I create 3D models in Reality Composer and export them as USDZ files for use with Quick Look AR. However, after the upgrade to iPadOS 17, I've noticed a rather troubling issue.

Problem Description: When I view my 3D models using Quick Look AR on iPadOS 17, some of the models exhibit a peculiar problem. Instead of displaying the correct textures, they show a bright pink texture in their place. This issue occurs only when I have subsequent scenes added to the initial scene. Strangely, the very first scene in the sequence displays its textures correctly.

Steps to Reproduce:
1. Create a 3D model in Reality Composer.
2. Export the model as a USDZ file.
3. Open the USDZ file using Quick Look AR.
4. Observe that the textures appear correctly on the initial scene.
5. Add additional scenes to the model.
6. Navigate to the subsequent scenes.
7. Notice that some of the 3D models display a pink texture instead of the correct textures (see picture).

Expected Behavior: The 3D models should consistently display their textures, even when multiple scenes are added to the scene sequence.

Workaround: As of now, there doesn't seem to be a viable workaround for this issue, which is quite problematic for my work in 3D modeling and visualization.

I would greatly appreciate any insights, solutions, or workarounds the community might have for this problem. I would also like to know whether others are experiencing the same issue after upgrading to iPadOS 17; that information could be helpful for both users and Apple in addressing it.

Thank you for your attention to this matter, and I look forward to hearing from the community and hopefully finding a resolution to this Quick Look AR issue.

Best regards
Posted by delcasda.
Post not yet marked as solved
0 Replies
548 Views
Hi there,

Hosting on my server a no-doubt-well-formed AR file, such as "CosmonautSuit_en.reality" from Apple's examples (https://developer.apple.com/augmented-reality/quick-look/), the infamous and annoying "Object requires a newer version of iOS." message appears, even though my iPad is running iOS 17.1, the very latest available version. Everything works flawlessly on iOS 16 and below. Of course, my markup follows the required format, namely:

```html
<a rel="ar" href="https://artest.myhost.com/CosmonautSuit_en.reality">
  <img class="image-model" src="https://artest.myhost.com/cosmonaut.png">
</a>
```

Accessing this same .reality file from the aforementioned Apple page works fine. Why is it not working from my hosting server?

For your information, when I host a USDZ instead, also from Apple's examples page (the toy_drummer_idle.usdz file), everything works flawlessly. Again, I'm using the same markup schema:

```html
<a rel="ar" href="https://artest.myhost.com/toy_drummer_idle.usdz">
  <img class="image-model" src="https://artest.myhost.com/toy_drummer.png">
</a>
```

Also, when I remove the rel="ar" attribute, the AR experience is still launched, but with an extra step through an ugly poster (generated by AR Quick Look on the fly) that ruins the UX/UI of my web app. This, by the way, is the same behavior you get when accessing the .reality file directly by typing its URL into Safari's address bar.

Any tip on this? Thanks for your time.
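A plausible cause worth checking (an assumption, not something confirmed in this post) is the MIME type the server returns: Safari decides how to handle a linked asset from its declared Content-Type, and an unrecognized type for .reality files can surface as the "Object requires a newer version of iOS." message. Apple's Quick Look web material documents model/vnd.usdz+zip for USDZ; the type commonly used for .reality files is model/vnd.reality. A minimal Apache sketch, assuming .htaccess overrides are enabled on the host:

```
# Serve AR Quick Look assets with the MIME types Safari expects.
AddType model/vnd.usdz+zip .usdz
AddType model/vnd.reality .reality
```

Comparing the Content-Type headers returned by developer.apple.com and by artest.myhost.com (for example with curl -I) would confirm or rule this out.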
Posted by paleRider.
Post not yet marked as solved
0 Replies
435 Views
Using the face anchor feature in Reality Composer, I'm exploring the potential for generating content movement based on facial expressions and head movement. In my current project, I've positioned a horizontal wood plane on the user's face, and I've added some dynamic physics-enabled balls on the wood surface. While I've successfully anchored the wood plane to the user's head movements, I'm facing a challenge with the balls. I'm aiming to have these balls respond to the user's head tilts, effectively rolling in the direction of the head movement. For instance, a tilt to the right should trigger the balls to roll right, and likewise for leftward tilts. However, my attempts thus far have not yielded the expected results, as the balls seem to be unresponsive to the user's head movements. The wood plane, on the other hand, follows the head's motion seamlessly. I'd greatly appreciate any insights, guidance, or possible solutions you may have regarding this matter. Are there specific settings or techniques I should be implementing to enable the balls to respond to the user's head movement as desired? Thank you in advance for your assistance.
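For anyone comparing with a programmatic setup, here is how the same scene could be assembled in RealityKit; this is a sketch under assumptions (entity names and sizes are illustrative), not a Reality Composer fix. The detail that usually matters is the physics mode: the head-anchored plane should be kinematic so its motion pushes the balls, while the balls stay dynamic so world-space gravity rolls them when the plane tilts:

```swift
import ARKit
import RealityKit

// Sketch: a face-anchored plane with a dynamic ball rolling on it.
func makeFaceScene() -> AnchorEntity {
    let faceAnchor = AnchorEntity(.face)

    // The "wood plane": kinematic, so it follows the head and pushes the ball.
    let plane = ModelEntity(
        mesh: .generatePlane(width: 0.3, depth: 0.3),
        materials: [SimpleMaterial(color: .brown, isMetallic: false)]
    )
    plane.collision = CollisionComponent(shapes: [.generateBox(size: [0.3, 0.01, 0.3])])
    plane.physicsBody = PhysicsBodyComponent(
        massProperties: .default, material: .default, mode: .kinematic
    )
    faceAnchor.addChild(plane)

    // A ball: dynamic, so gravity makes it roll as the plane tilts.
    let ball = ModelEntity(
        mesh: .generateSphere(radius: 0.02),
        materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    ball.position = [0, 0.05, 0]
    ball.collision = CollisionComponent(shapes: [.generateSphere(radius: 0.02)])
    ball.physicsBody = PhysicsBodyComponent(
        massProperties: .default, material: .default, mode: .dynamic
    )
    faceAnchor.addChild(ball)

    return faceAnchor
}
```

The returned anchor would be added to an ARView scene running an ARFaceTrackingConfiguration session. If Reality Composer exports the plane as a static or dynamic body rather than a kinematic one, the balls would not be pushed the same way, which may be worth checking in the behavior settings.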
Posted by ropims.
Post not yet marked as solved
2 Replies
1.1k Views
Is there support for using multiple UV channels in AR Quick Look on iOS 17? One important use case would be to map a tiling texture to an overlapping, tiling UV set while mapping ambient occlusion to a separate, unwrapped, non-overlapping UV set. This is very important for authoring 3D content that combines high-resolution surface detail with high-quality ambient occlusion data while keeping file size to a minimum.
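For context, here is how a two-UV-set material is typically authored at the USD level with UsdPreviewSurface: each texture connects to its own UsdPrimvarReader_float2, one reading the overlapping tiling set and one reading the unwrapped set. This is a sketch with illustrative primvar names (st, st1) and file names; whether AR Quick Look on iOS 17 honors the second set is exactly the open question above:

```
def Material "TilingPlusAO"
{
    token outputs:surface.connect = </TilingPlusAO/PreviewSurface.outputs:surface>

    def Shader "PreviewSurface"
    {
        uniform token info:id = "UsdPreviewSurface"
        color3f inputs:diffuseColor.connect = </TilingPlusAO/DetailTexture.outputs:rgb>
        float inputs:occlusion.connect = </TilingPlusAO/AOTexture.outputs:r>
        token outputs:surface
    }

    # UV set 1: overlapping, tiling UVs for the repeating surface detail.
    def Shader "ReaderSt"
    {
        uniform token info:id = "UsdPrimvarReader_float2"
        token inputs:varname = "st"
        float2 outputs:result
    }

    # UV set 2: non-overlapping UVs for the baked ambient occlusion.
    def Shader "ReaderSt1"
    {
        uniform token info:id = "UsdPrimvarReader_float2"
        token inputs:varname = "st1"
        float2 outputs:result
    }

    def Shader "DetailTexture"
    {
        uniform token info:id = "UsdUVTexture"
        asset inputs:file = @detail_tiling.png@
        float2 inputs:st.connect = </TilingPlusAO/ReaderSt.outputs:result>
        token inputs:wrapS = "repeat"
        token inputs:wrapT = "repeat"
        float3 outputs:rgb
    }

    def Shader "AOTexture"
    {
        uniform token info:id = "UsdUVTexture"
        asset inputs:file = @baked_ao.png@
        float2 inputs:st.connect = </TilingPlusAO/ReaderSt1.outputs:result>
        float outputs:r
    }
}
```

The mesh would carry both texcoord2f[] primvars (primvars:st and primvars:st1) for this to resolve.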
Posted by psqv.
Post not yet marked as solved
2 Replies
490 Views
Hi! I manually created a USDZ with one cube, anchored to a wall (vertical plane):

```
#usda 1.0
(
    defaultPrim = "Root"
    metersPerUnit = 1
    upAxis = "Y"
)

def Xform "Root" (
    assetInfo = {
        string name = "Root"
    }
    kind = "component"
)
{
    def Xform "Geom" (
        prepend apiSchemas = ["Preliminary_AnchoringAPI"]
    )
    {
        # token preliminary:anchoring:type = "plane"
        # token preliminary:planeAnchoring:alignment = "vertical"
        matrix4d xformOp:transform = ( (0, 0, 0, 0), (0, 0, 0, 0), (0, 0, 0, 0), (0, 0, 0.2, 1) )

        def Xform "Group"
        {
            def Cube "cube_0"
            {
                float3[] extent = [(-1, -1, -1), (1, 1, 1)]
                uniform bool doubleSided = 1
                rel material:binding = </Root/Materials/material_0>
                matrix4d xformOp:transform = ( (0.1, 0, 0, 0), (0, 0.1, 0, 0), (0, 0, 0.01, 0), (0, 0, 0, 1) )
                uniform token[] xformOpOrder = ["xformOp:transform"]
            }
        }
    }
}
```

It displays correctly in AR Quick Look. When I add a second cube to the USD(Z):

```
#usda 1.0
(
    defaultPrim = "Root"
    metersPerUnit = 1
    upAxis = "Y"
)

def Xform "Root" (
    assetInfo = {
        string name = "Root"
    }
    kind = "component"
)
{
    def Xform "Geom" (
        prepend apiSchemas = ["Preliminary_AnchoringAPI"]
    )
    {
        # token preliminary:anchoring:type = "plane"
        # token preliminary:planeAnchoring:alignment = "vertical"
        matrix4d xformOp:transform = ( (0, 0, 0, 0), (0, 0, 0, 0), (0, 0, 0, 0), (0, 0, 0.2, 1) )

        def Xform "Group"
        {
            def Cube "cube_0"
            {
                float3[] extent = [(-1, -1, -1), (1, 1, 1)]
                uniform bool doubleSided = 1
                rel material:binding = </Root/Materials/material_0>
                matrix4d xformOp:transform = ( (0.1, 0, 0, 0), (0, 0.1, 0, 0), (0, 0, 0.01, 0), (0, 0, 0, 1) )
                uniform token[] xformOpOrder = ["xformOp:transform"]
            }

            def Cube "cube_1"
            {
                float3[] extent = [(-1, -1, -1), (1, 1, 1)]
                uniform bool doubleSided = 1
                rel material:binding = </Root/Materials/material_0>
                matrix4d xformOp:transform = ( (0.1, 0, 0, 0), (0, 0.1, 0, 0), (0, 0, 0.01, 0), (0.3, 0, 0, 1) )
                uniform token[] xformOpOrder = ["xformOp:transform"]
            }
        }
    }
}
```

AR Quick Look displays the scene roughly 10 cm away from the wall, and the more cubes the scene contains, the greater the distance from the wall becomes. The above shows it for two cubes.

I also tried recreating the scene in Reality Composer on iPhone. Everything is fine with one cube, and also fine for two cubes when previewing in the app (ARKit?), but when I export the scene from Reality Composer on macOS to USDZ, I again see the wrong distance for two or more cubes.

For these tests I used an iPhone 13 Pro Max with iOS 16.3.1.
Posted by olegl.
Post not yet marked as solved
0 Replies
551 Views
I've pored over the session "Discover Quick Look for spatial computing" and was really impressed by Windowed Quick Look and how items can be opened in their own volume and stay open even after the app or website they were opened from is closed. I have an additional question: how long do items opened in Windowed Quick Look remain in the Shared Space after the originating app or web page is closed? I'm imagining something like a how-to document or diagram that users consult visually without interacting with it. Will visionOS purge it from memory at some point, or will it persist indefinitely until the user manually closes it? In the same example, if I were using a how-to document or diagram alongside the app or site I was working or learning in, it would be more convenient if it stayed open so I could continue right where I left off.

Therefore, if a user were to take off the Vision Pro for the evening and then put it back on in the morning, would the item they opened in Windowed Quick Look persist alongside the other apps, windows, and volumes they had open in the Shared Space?
Post not yet marked as solved
0 Replies
669 Views
Hello, I've been working on an AR project where I need to display a stepped animation in AR Quick Look. I've created the animation in my 3D modeling software and exported it as a USDZ file. The animation is designed to move in distinct steps, but when I preview it in AR Quick Look, it appears to be interpolating between frames, resulting in a smooth animation instead of the stepped one I created. Has anyone else encountered this issue? Is there a way to control the animation playback in AR Quick Look to make it stepped rather than smooth? Any tips, workarounds, or guidance would be greatly appreciated. Thanks in advance for your help!
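One possible workaround, offered as an assumption based on USD's default linear interpolation of time samples rather than any documented AR Quick Look playback control: bake the stepped curve into the file by duplicating each value just before the next key, so that the linear segments between identical values hold the pose. A small usda sketch with illustrative times and values:

```
#usda 1.0
(
    metersPerUnit = 1
    timeCodesPerSecond = 24
    upAxis = "Y"
)

def Xform "SteppedCube"
{
    # Each value is repeated just before the next key, so linear
    # interpolation holds it and the motion reads as stepped.
    double3 xformOp:translate.timeSamples = {
        0: (0, 0, 0),
        23.999: (0, 0, 0),
        24: (0.1, 0, 0),
        47.999: (0.1, 0, 0),
        48: (0.2, 0, 0),
    }
    uniform token[] xformOpOrder = ["xformOp:translate"]

    def Cube "Geometry"
    {
    }
}
```

Most 3D packages can produce this automatically by baking/sampling the animation on every frame before USDZ export.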
Post not yet marked as solved
0 Replies
667 Views
Hi, I have one question. When creating a web page, is there a way to determine that it is being accessed from Safari on visionOS? I would also like to know the user agent string for Safari on visionOS. If there is more than one way to determine this, such as in JavaScript or on the web server, please share them all. Use cases include changing the page layout for Safari on visionOS, changing the processing when dynamically generating HTML pages on a web server, and detecting Quick Look support.

Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
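On the Quick Look part of the question: independent of the user agent string (which Apple does not document as a stable detection mechanism), AR Quick Look availability can be feature-detected the way Apple's AR Quick Look web documentation describes for iOS Safari, via the rel list of an anchor element. It is a reasonable assumption, though not confirmed here, that the same check applies in Safari on visionOS:

```html
<script>
  // Feature-detect AR Quick Look support instead of parsing the UA string.
  const anchor = document.createElement("a");
  if (anchor.relList.supports("ar")) {
    // AR Quick Look is available: show <a rel="ar"> links to USDZ content.
  } else {
    // Fall back to a plain image or an in-page 3D viewer.
  }
</script>
```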
Post not yet marked as solved
2 Replies
726 Views
If the anchor image is out of the camera view, the AR experience disappears. This happens both with USDZ files created in Reality Composer and opened directly on iPhone or iPad (with AR Quick Look), and with Adobe Aero, so I suppose the bug lies in ARKit.
Posted by EMD24Fr.