Discuss augmented reality and virtual reality app capabilities.

Posts under AR / VR tag

108 Posts
Sort by: Post | Replies | Boosts | Views | Activity
What to return from func focusItems(in rect: CGRect) -> [any UIFocusItem]
My AR app Virtual Tags has been showing a warning since July, and no longer displays its camera information: "ARCL.SceneLocationView implements focusItemsInRect: - caching for linear focus movement is limited as long as this view is on screen." I tried implementing the function, but the documentation does not explain what it should return or how to generate it. Is someone able to help me? Thanks!
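For what it's worth, a minimal sketch of the kind of override UIKit seems to expect; the subclass name here is hypothetical and stands in for ARCL's SceneLocationView:

import UIKit

// Hypothetical subclass standing in for ARCL.SceneLocationView.
class FocusReportingView: UIView {
    // UIKit asks a view for the focusable items intersecting `rect`.
    // Reporting only the focusable subviews inside the rect (or an empty
    // array, if the view hosts no focusable content) should satisfy it.
    override func focusItems(in rect: CGRect) -> [any UIFocusItem] {
        subviews.filter { $0.canBecomeFocused && $0.frame.intersects(rect) }
    }
}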
Replies: 0 | Boosts: 0 | Views: 111 | Activity: 1w
Entity HoverEffect Fired When Inside Another Entity's Collider
Hi folks, I’m new to the Vision Pro stack and still trying to learn all the nuances. Here is a problem I can’t seem to find an answer to. I placed entity A (a small sphere, radius 0.02) inside entity B (a box, size 0.1). Both entities have a HoverEffectComponent, and both input components are set to .direct. Entity A is NOT a child of entity B. When I direct-touch entity B, I notice that entity A’s hover effect fires as well. This only happens when entity A’s position is inside entity B. A gesture targeted only at entity A doesn’t work either. I double-checked that entity A’s collider sits inside entity B’s collider; my direct touch shouldn’t have triggered its hover effect. Does having one collider inside another produce unpredictable behavior? Thanks in advance 🙏🙏🙏 Context: I’m trying to create an invisible bound around entity A, so that when my hand approaches the bound to grab entity A, a nice spotlight hover effect fires on the bound before my hand reaches entity A.
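For reproduction, a condensed sketch of the setup as described (sizes taken from the post, everything else assumed):

import RealityKit

// Entity A: small sphere with hover effect and direct input.
let inner = ModelEntity(mesh: .generateSphere(radius: 0.02))
inner.components.set(InputTargetComponent(allowedInputTypes: .direct))
inner.components.set(HoverEffectComponent())
inner.generateCollisionShapes(recursive: false)

// Entity B: box with the same components; A is a sibling, not a child.
let outer = ModelEntity(mesh: .generateBox(size: 0.1))
outer.components.set(InputTargetComponent(allowedInputTypes: .direct))
outer.components.set(HoverEffectComponent())
outer.generateCollisionShapes(recursive: false)

// A's collider ends up fully inside B's collider volume.
inner.position = outer.position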
Replies: 2 | Boosts: 0 | Views: 142 | Activity: 1w
VisionOS Main Camera Enterprise API: Development license into distribution for Business Store
Hello, We've been working for months now on an app for the Vision Pro (it's been great, btw!). We already have an app in the App Store for iOS and have been migrating our platform from the Microsoft HoloLens 2 to the AVP: https://apps.microsoft.com/detail/9NPPP031VHD1 We require Main Camera access and have already obtained the Enterprise.license for development purposes. Unfortunately, we cannot publish our business app (which uses an Enterprise API) under the same name/bundle ID as our iOS app, because that would conflict with our current distribution method. We've concluded that we need a new Enterprise.license under a different bundle ID to create a new app for the Business Store. Has anyone been in the same boat and tried to publish to the Business Store while already having an app in the public App Store under the same name? We applied for another license for distribution under another name (with "Pro" at the end), but it's been stuck in limbo for over a month now (probably because the new bundle ID doesn't have any track record). Anyhow, thanks for any help; we're open to suggestions as to how to proceed!
Replies: 0 | Boosts: 0 | Views: 209 | Activity: 2w
SceneKit Transparent Material Self-Overlapping Issue (Front Face Overlapping)
Description: I'm developing an AR effect using SceneKit and applying a transparent material to a face mesh. However, I'm facing an issue where the front faces of the mesh overlap each other, causing incorrect rendering.

Problem: The front faces of the mesh overlap with each other when transparency is applied. This causes areas like the cheeks to be visible through the nose, even though they should be occluded.

Expected behavior: The material should behave as if it were opaque to itself: overlapping front faces should be occluded properly, while still allowing transparency for background elements.

Actual behavior: The mesh renders its own front faces incorrectly, making parts of the face visible through others when they should be blocked.

What I have tried:

testMaterial.writesToDepthBuffer = true
testMaterial.readsFromDepthBuffer = true

Questions:
👉 How can I prevent SceneKit's transparent material from rendering overlapping front faces?
👉 Is there a way to force SceneKit to treat its own mesh as opaque to itself while still being transparent to the background?
👉 Does SceneKit support a proper depth pre-pass, or an equivalent of Unity's ZWrite shaders, to solve this issue?

Attached screenshots demonstrate the problem visually. Any help would be greatly appreciated! 🚀
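One direction worth experimenting with (a sketch, not a confirmed remedy): emulate a depth pre-pass by drawing the same geometry twice, first writing depth only, then drawing the transparent color on top. Here `faceNode` is a placeholder for the face-mesh node:

// Pass 1: depth only, no color output.
let depthOnly = SCNMaterial()
depthOnly.colorBufferWriteMask = []      // writes depth, draws nothing visible
depthOnly.writesToDepthBuffer = true

let prepass = SCNNode(geometry: faceNode.geometry?.copy() as? SCNGeometry)
prepass.geometry?.firstMaterial = depthOnly
prepass.renderingOrder = faceNode.renderingOrder - 1   // render before the real mesh
faceNode.addChildNode(prepass)

// Pass 2: the transparent material reads the pre-pass depth, so its own
// hidden front faces are rejected, while the background still shows through.
faceNode.geometry?.firstMaterial?.readsFromDepthBuffer = true
faceNode.geometry?.firstMaterial?.writesToDepthBuffer = false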
Replies: 0 | Boosts: 0 | Views: 225 | Activity: 3w
Automatic Plane Measurements like in the Apple Measure App
I’m working on an iOS app that needs to measure the area of planes or surfaces, like the length and width of objects, just as the Apple Measure app does. I’ve been exploring ARKit, but I’m curious whether there are any APIs or techniques that can help automate detecting and measuring planes. Specifically, I’m looking for a way to automatically detect and measure planes (e.g., from a top-down view), for example measuring a box’s width and length. I have attached a screenshot and a video of the Apple Measure app doing it. Does Apple provide any tools or APIs for this, or are there best practices I should know about? I’d love to hear from anyone who’s tackled something similar. Video: https://drive.google.com/file/d/1BxM7fIbFxsCsYwY7w8ZxIeq_4WTGkkwA/view?usp=drive_link
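The Measure app's automatic box measurement itself isn't exposed as API, but ARKit's plane detection does report each detected plane's size. A minimal sketch, assuming a standard ARSCNView setup with `sceneView` and its session delegate wired up:

import ARKit

// Run plane detection.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
sceneView.session.run(configuration)

// In the ARSessionDelegate: each ARPlaneAnchor carries its measured extent.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let plane as ARPlaneAnchor in anchors {
        // planeExtent is iOS 16+; older systems used the deprecated `extent`.
        let width = plane.planeExtent.width
        let height = plane.planeExtent.height
        print("Plane \(plane.identifier): \(width) x \(height) meters")
    }
}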
Replies: 5 | Boosts: 0 | Views: 406 | Activity: 4w
Request for gaze data in fully immersive Metal apps
Hi, We are trying to port our Unity app from other XR devices to the Vision Pro, so it's much easier for us to use the Metal rendering layer, fully immersive. And to stay true to the platform, we want to keep the gaze/pinch interaction system. But we just noticed that, unlike PolySpatial XR apps, visionOS XR in Metal does not provide gaze info unless the user is actively pinching, which prevents any attempt to give visual feedback on what they are looking at (buttons, etc.). Is this on Apple's roadmap? Thanks
Replies: 3 | Boosts: 0 | Views: 323 | Activity: 2w
Slow Autofocus on iPhone 16 Pro with ARKit Camera
I have recently started testing ARKit on an iPhone 16 Pro, and I have noticed that autofocus on this device reacts much more slowly than on other devices. For example, if I point the camera at a close object, autofocus takes 4-5 seconds to stabilize; the focal length is adjusted very slowly. In some (rare) cases autofocus seems almost stuck and requires a bit of device movement to trigger. This is quite problematic when using ARKit features like image and object detection, as the detection algorithms struggle with out-of-focus images. The problem is limited to ARKit: autofocus is significantly more responsive when the standard AVFoundation camera API is used. The behavior is easy to reproduce with any of the ARKit samples, such as https://developer.apple.com/documentation/arkit/arkit_in_ios/content_anchors/tracking_and_visualizing_planes Is anybody else experiencing this problem?
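For context, ARKit only exposes a coarse focus switch; there is no public control over focus speed. A sketch of the one relevant flag (`session` assumed from a standard ARKit setup):

import ARKit

let configuration = ARWorldTrackingConfiguration()
configuration.isAutoFocusEnabled = true   // the default; no finer focus control is public
session.run(configuration)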
Replies: 3 | Boosts: 0 | Views: 391 | Activity: Jan ’25
AudioPlaybackController stops playing when a .plain window is closed
Suppose there is an immersiveSpace, with an Entity() added to the space as a child entity of the content. This entity is responsible for playing background music: it calls prepareAudio, gets back a controller, and plays the music (see the basic code below). While the music is playing, a .plain window and the immersiveSpace are both presented. I believe the immersiveSpace holds the handle of the controller, so as long as the immersiveSpace is open, the music shouldn't stop. However, if I close the .plain window (using the system-level close button), the music just stops, even though the immersiveSpace is still open. If I then check the value of controller.isPlaying, it is still true, but you cannot hear the music anymore. To reproduce, open a visionOS template app project, select volume and full immersive, and replace some code in ImmersiveView.swift with the code below. Then drag in any .mp3 file and replace the AudioFileResource's name; that reproduces the bug.

RealityView { content in
    // Add the initial RealityKit content
    if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
        content.add(immersiveContentEntity)

        // Put skybox here. See example in World project available at
        // https://developer.apple.com/
        if let audioResource = try? await AudioFileResource(named: "anyMP3file.mp3") {
            let ent = Entity()
            immersiveContentEntity.addChild(ent)
            let controller = ent.prepareAudio(audioResource)
            controller.play()
        }
    }
}

I wonder why this happens? How should I keep the music playing when I close the .plain window? Thanks!
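One thing worth trying (a guess, not a confirmed fix): hold the AudioPlaybackController in state that outlives the .plain window, rather than in a local inside the closure, so nothing tied to the closed window can drop it:

// In the view (or an app-level model):
@State private var musicController: AudioPlaybackController?

// ...then inside the RealityView closure, instead of a local `controller`:
musicController = ent.prepareAudio(audioResource)
musicController?.play()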
Replies: 1 | Boosts: 1 | Views: 302 | Activity: Jan ’25
Persisting Anchors in RealityView with ARMode on iOS
Platform: iOS 18
Tech: RealityView
Hi! I was wondering whether RealityView now provides a way for its session to persist anchor data in a world, such that anchor locations saved in one session can be loaded in another session at exactly the same positions. I know that ARWorldMap in ARKit does that, but I was not able to find a way to use it with RealityView. I think that's because RealityView has ARKit under the hood but does not expose the ARKit session info publicly to client code. So I was wondering if there's a SwiftUI + RealityView approach that can help me achieve a similar goal: come back to the same location and see the object in exactly the same place. Thanks!
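For reference, the classic ARKit route needs a handle on the ARSession (e.g. via ARView rather than RealityView). A sketch of saving and restoring an ARWorldMap, with `arView` and `mapFileURL` assumed:

import ARKit

// Saving: serialize the current world map to disk.
arView.session.getCurrentWorldMap { worldMap, _ in
    guard let worldMap,
          let data = try? NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                       requiringSecureCoding: true)
    else { return }
    try? data.write(to: mapFileURL)
}

// Restoring: hand the archived map back before running the session.
if let data = try? Data(contentsOf: mapFileURL),
   let worldMap = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = worldMap
    arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}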
Replies: 0 | Boosts: 1 | Views: 331 | Activity: Jan ’25
Billboard Entity with AttachmentView
Hey everyone, happy New Year! I wanted to see if you have seen this before. I have added an attachment to the RealityView as a child of an entity that has a Billboard component set on it. I wanted to create the effect that the attachment is offset by 0.5 meters from center and follows the device as you move around it. It works great until you try to click a button: the attachment moves with the billboard, but the collision box around the attachment does not follow it. If I position myself perfectly, it works. Video example: https://youtu.be/4d9Vx7K8MmU

//
//  ImmersiveView.swift
//  Billboard Attachment
//
//  Created by Justin Leger on 1/3/25.
//

import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {

    var rootEntity = Entity()

    var body: some View {
        RealityView { content, attachments in
            // Add the initial RealityKit content
            let sphereEntity = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .red, roughness: 1, isMetallic: false)]
            )
            sphereEntity.position = [0.0, 1.0, -2.0]

            let controlsPivotEntity = Entity()
            controlsPivotEntity.components[BillboardComponent.self] = .init()

            // Extract the attachment entity and disable it before it's used.
            if let controlsViewAttachmentEntity = attachments.entity(for: PlacedThingControls.attachmentId) {
                controlsViewAttachmentEntity.position.z = 0.5
                controlsPivotEntity.addChild(controlsViewAttachmentEntity)
                sphereEntity.addChild(controlsPivotEntity)
            }

            content.add(sphereEntity)
        } attachments: {
            Attachment(id: PlacedThingControls.attachmentId) {
                PlacedThingControls()
            }
        }
    }
}

#Preview(immersionStyle: .mixed) {
    ImmersiveView()
        .environment(AppModel())
}

struct PlacedThingControls: View {

    static let attachmentId = "placed-thing-3D-controls"

    var body: some View {
        VStack {
            HStack(spacing: 0) {
                Button {
                    print("🗺️🗺️🗺️ Map selected pieces")
                } label: {
                    Text("\(Image(systemName: "plus.square.dashed")) Manage Mesh Maps")
                        .fontWeight(.semibold)
                        .frame(maxWidth: .infinity)
                }
                .padding(.leading, 20)

                Spacer()

                Button(role: .destructive) {
                    print("🗑️🗑️🗑️ Delete selected pieces")
                } label: {
                    Label {
                        Text("Delete")
                    } icon: {
                        Image(systemName: "trash")
                    }
                    .labelStyle(.iconOnly)
                }
                .padding(.trailing, 20)
            }
            .padding(.vertical)
            .frame(minWidth: 320, maxWidth: 480)
        }
        .glassBackgroundEffect()
    }
}
Replies: 5 | Boosts: 0 | Views: 536 | Activity: Jan ’25
Scanning Smaller Objects with RoomPlan (Light Switches or Sockets)
Hi everyone, I’m currently developing an app using Apple’s RoomPlan framework, and so far, everything is working great! However, I’d like to extend the functionality to include scanning smaller objects, such as light switches or power outlets, in addition to the walls and larger furniture that RoomPlan already supports. From what I understand, based on the documentation, RoomPlan doesn’t natively support the detection or measurement of smaller objects like these. Is that correct? If that’s the case, does anyone have suggestions or ideas on how this could be achieved? Perhaps by integrating another framework or technology alongside RoomPlan? I’d appreciate any insights or advice from those who have worked on similar use cases. Thanks in advance!
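One avenue to explore (a sketch under assumptions, not something RoomPlan supports natively): RoomCaptureSession exposes the ARSession it drives, so camera frames can be inspected during the scan and fed to your own detector, e.g. a Core ML model trained on switches and outlets. That detector is the part you would have to supply:

import RoomPlan
import ARKit

let captureSession = RoomCaptureSession()
let arSession: ARSession = captureSession.arSession   // the session RoomPlan drives

// In an ARSessionDelegate (RoomPlan also observes this session, so keep
// any per-frame work lightweight and off the main thread):
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let pixelBuffer = frame.capturedImage   // hand this to a custom detector
    _ = pixelBuffer
}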
Replies: 1 | Boosts: 0 | Views: 329 | Activity: Jan ’25
RealityKit and macOS - How to load a Reality Composer scene? Documentation Incomplete/Inaccurate
"Although Xcode generates loading methods for all Reality Composer files in your Xcode project" I do not find this to be true, sadly. Does anyone have any luck or insight on how one can build just a simple MacOS app that will import a scene from a Reality File? The documentation suggests that the simple act of bringing a .Reality File in (What about .realitycomposerpro?) will generate code, but that doesn't seem to happen. The sample code (Spaceship) does not compile for MacOS. I'd really love just the most generic template of an Xcode Project that compiles with a button that pops open a scene., Like the VisionOS default immersive project.
Replies: 2 | Boosts: 0 | Views: 553 | Activity: Dec ’24
BillboardComponent causing Model Entity tap recognition issues on iOS 18
Hi, When I attach a BillboardComponent to anchor entities, I am no longer able to retrieve the tapped entity, because the entity's collision shapes are thrown off by the component constantly orienting it towards the camera. And the collision shapes are not updated: if I tap anywhere that is not my model entity, I get a hit out of nowhere. I tried updating the collision shapes of the entity every frame:

for child in existingPassport.mainEntity!.children {
    child.generateCollisionShapes(recursive: true)
}

However, nothing came of it, and it is not a smart solution in the first place, because recreating the shapes every frame is too heavy. I am using the usual AR view controller setup, which works fine when I comment out the BillboardComponent line:

private func setupTapRecognizer() {
    let tapRecognizer = UITapGestureRecognizer(target: self, action: #selector(handleTap))
    arView.addGestureRecognizer(tapRecognizer)
}

@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    print("handle tap URL 1")
    let location = recognizer.location(in: arView)
    if let entity = arView.entity(at: location) {
        print("handle tap URL 2")
        // Assuming each entity has a URL stored in a component
        if let urlComponent = entity.components[URLComponent.self] {
            webViewPresenter?.presentFullScreenWebView(url: urlComponent.url)
            print("handle tap URL: \(urlComponent.url)")
        }
    }
}

How should we tackle this issue on iOS 18? Thanks!
Replies: 1 | Boosts: 0 | Views: 500 | Activity: Dec ’24
Disable Object Occlusion on iOS
I’m developing an app using RealityKit and RealityView. On newer iPhones, such as the iPhone 15 Pro, Object Occlusion appears to be enabled by default, which causes 3D entities to be hidden behind real-world objects in the scene. However, I need to disable this behavior to ensure proper rendering of my 3D content. This issue does not occur on older devices like the iPhone 13, where the app works as intended. I haven’t been able to find a solution to explicitly disable object occlusion on the newer devices for RealityView. Any guidance or suggestions to resolve this issue would be greatly appreciated! Thanks!
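With ARView, occlusion can be switched off through the environment's scene-understanding options, as sketched below; whether RealityView on iOS exposes an equivalent is exactly the open question here:

import RealityKit

// On ARView, removing the occlusion option disables scene-understanding occlusion:
arView.environment.sceneUnderstanding.options.remove(.occlusion)

// Or configure it up front, never enabling occlusion:
// arView.environment.sceneUnderstanding.options = []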
Replies: 1 | Boosts: 0 | Views: 399 | Activity: Dec ’24