Discuss augmented reality and virtual reality app capabilities.

Posts under AR / VR tag

120 Posts
Post not yet marked as solved
1 Reply
1.4k Views
Object tracking is not working from the Unity GitHub repo https://github.com/Unity-Technologies/arfoundation-samples. It seems ARTrackedObjectManager is not getting properly instantiated (I added a debug statement to check). Tested with Unity 2019.3 and AR Foundation 4.1.1, and with 2021.10a2 and AR Foundation 4.1.1, using Xcode 12.1 on an iPhone 7 running iOS 14.2. Plane detection and image tracking work fine from the same GitHub sample repo. For the reference object, I was able to 3D scan and generate a reference object using the AR Foundation code; the scanning experience is good and gives good results when testing with the scanning application. The reference object, along with the other sample reference objects, is included in the reference object library as advised in the manuals. I now suspect that only the right combination of Unity version, AR Foundation version, phone model, iOS version, and potentially Xcode version works. Any help or pointers would be highly appreciated. Has anybody been able to implement this successfully?
Posted by
Post not yet marked as solved
30 Replies
25k Views
I'm very excited about the new AirTag product and am wondering if there will be any new APIs introduced in iOS 14.5+ to allow developers to build apps around them outside the context of the Find My network? The contexts in which I am most excited about using AirTags are:
- Gaming
- Health / fitness-focused apps
- Accessibility features
- Musical and other creative interactions within apps
I haven't been able to find any mention of APIs. Thanks in advance for any information that is shared here. Alexander
Post not yet marked as solved
3 Replies
4.9k Views
Hello, I am new to this amazing AR development world. I wanted to know: if I want to develop an app for the AR glasses that are expected to launch in the future, should I use ARKit?
Post not yet marked as solved
2 Replies
1.5k Views
I am attempting to build an AR app using Storyboard and SceneKit. When I ran an existing app I had already used, it built and launched but nothing would happen. That behavior seemed odd, so I decided to start from scratch on a new project. I started with the default AR project for Storyboard and SceneKit, and on launch it immediately fails with an unwrapping-nil error on the scene, even though the scene file is obviously there. I am also given four build-time warnings:
- Could not find bundle inside /Library/Developer/CommandLineTools
- failed to convert file with failure reason: *** -[__NSPlaceholderDictionary initWithObjects:forKeys:count:]: attempt to insert nil object from objects[0]
- Conversion failed, will simply copy input to output.
- Copy failed file:///Users/kruegerwilliams/Library/Developer/Xcode/DerivedData/ARtest-bjuwvdjoflchdaagofedfxpravsc/Build/Products/Debug-iphoneos/ARtest.app/art.scnassets/ship.scn -> file:///Users/kruegerwilliams/Library/Developer/Xcode/DerivedData/ARtest-bjuwvdjoflchdaagofedfxpravsc/Build/Products/Debug-iphoneos/ARtest.app/art.scnassets/ship.scn error:Error Domain=NSCocoaErrorDomain Code=516 "“ship.scn” couldn’t be copied to “art.scnassets” because an item with the same name already exists." UserInfo={NSSourceFilePathErrorKey=/Users/kruegerwilliams/Library/Developer/Xcode/DerivedData/ARtest-bjuwvdjoflchdaagofedfxpravsc/Build/Products/Debug-iphoneos/ARtest.app/art.scnassets/ship.scn, NSUserStringVariant=(
I am currently unsure how to fix these errors. They appear to originate in the command-line tools, because after moving the device support files back to a stable version of Xcode the same issue is present. Is anyone else having these issues?
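Not a fix for the CommandLineTools warnings themselves, but the crash can at least be narrowed down. The default template force-unwraps the scene; a defensive version (a sketch, assuming the stock art.scnassets/ship.scn asset from the template) distinguishes "asset missing from the bundle" from "asset present but unreadable":

```swift
import SceneKit

// Sketch: load the template scene without force-unwrapping, so a failed
// asset copy (as the build warnings suggest) prints a diagnosis instead
// of crashing with "unexpectedly found nil".
func loadShipScene() -> SCNScene? {
    guard Bundle.main.url(forResource: "art.scnassets/ship", withExtension: "scn") != nil else {
        print("ship.scn never made it into the app bundle; check the art.scnassets copy step")
        return nil
    }
    guard let scene = SCNScene(named: "art.scnassets/ship.scn") else {
        print("ship.scn is in the bundle but could not be loaded")
        return nil
    }
    return scene
}
```

If the first guard fires, the problem is the failed copy step reported at build time rather than anything in the app code.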
Post not yet marked as solved
1 Reply
2k Views
SPECIFIC ISSUE ENCOUNTERED
I'm playing VR videos through my app using the Metal graphics API with the Cardboard XR Plugin for Unity. After the recent iOS 16 update (and the Xcode 14 update too), videos in stereoscopic mode were flipped upside down and backwards. After trying to change the sides manually in code, I only managed to show the correct sides (it's not all upside down anymore), but when I turn the phone UP, the view moves DOWN toward the ground, and vice versa. The same issue occurs for left-right phone movement. Also, a Unity-made popup question is shown on the wrong side (the backside, as shown in the video attachment). Here is a video of the inverted (upside-down) view: https://www.dropbox.com/s/wacnknu5wf4cif1/Everything%20upside%20down.mp4?dl=0 Here is a video of the inverted movement: https://www.dropbox.com/s/7re3u1a5q81flfj/Inverted%20moving.mp4?dl=0 IMPORTANT: I did manage to fix it a few times in local builds, but when I build for TestFlight, it is always inverted.
WHAT I SUSPECT
I found that numerous other developers encountered this issue when they were using Metal. Back when OpenGL ES 2 and 3 were still supported by Apple, switching to one of those fixed the issue. But now that only Metal is supported with new Unity versions there is no workaround, and I would also like to use Metal.
DEVICE
Multiple iPhones running multiple iOS 16 versions have this issue. The specific OS version is 16.1.
EXPECTED BEHAVIOR
VR videos should show the right side up (not an upside-down image), and moving the phone up should show the upper part of the video, and vice versa; the same goes for moving left and right. Currently everything is flipped, but not always with the same kind of flip. In rare cases it is even shown correctly.
VERSIONS USED
What version of Google Cardboard are you using? Cardboard XR Plugin 1.18.1
What version of Unity are you using? 2022.1.13f1
Post not yet marked as solved
2 Replies
1.3k Views
Hi Apple community, I'm currently developing an iOS application using the RoomPlan API, and I'd like to remove real-world objects detected by RoomPlan from my ARView. I already tried the following code, but it deletes only the anchored entities (custom text, UI instructions, etc.) attached to the anchor: arView.scene.removeAnchor(anchor) My aim is to visually delete real-world object content from my ARView, as in this example (I get an error when uploading files like png, jpg, or pdf, so here is a link): https://ibb.co/yR8CRVy Is there any way to do that using the RoomPlan API and ARKit? Thanks in advance, Goat
Post not yet marked as solved
1 Reply
955 Views
Pre-planning a project that uses multiple 360° cameras set up in a grid to generate an immersive experience, hoping to use photogrammetry to generate 3D images of objects inside the grid. beeconcern.ca wants to expand their bee gardens, and theconcern.ca wants to use it to create a live immersive apiary experience. Still working out the best method for compiling, editing, and rendering; I have been leaning towards UE5, but am still seeking advice.
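For the photogrammetry half of the plan, Apple's Object Capture API on macOS may be worth evaluating alongside UE5. A minimal sketch, assuming still frames are first extracted from the 360° footage into a folder (the paths and detail level below are placeholders, not from the post):

```swift
import Foundation
import RealityKit

// Sketch: reconstruct a textured 3D model from a folder of still images
// using RealityKit Object Capture (macOS 12+). Paths are hypothetical.
func reconstruct() async throws {
    let inputFolder = URL(fileURLWithPath: "/path/to/frames", isDirectory: true)
    let outputURL = URL(fileURLWithPath: "/path/to/model.usdz")

    var config = PhotogrammetrySession.Configuration()
    config.featureSensitivity = .normal   // try .high for low-texture subjects

    let session = try PhotogrammetrySession(input: inputFolder, configuration: config)
    try session.process(requests: [.modelFile(url: outputURL, detail: .medium)])

    // Outputs arrive asynchronously; wait for completion or an error.
    for try await output in session.outputs {
        switch output {
        case .requestComplete(_, .modelFile(let url)):
            print("Model written to \(url)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break
        }
    }
}
```

Whether frames grabbed from 360° video have enough overlap and sharpness for good reconstruction would need testing; the API was designed around deliberate still captures.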
Post not yet marked as solved
0 Replies
560 Views
Hi. I'm going to hold an interactive exhibition built around AR content. I want to make an animation in Unreal Engine 5 and let users experience AR through the Reality Composer app, so I wonder whether animations and nodes made in Unreal Engine can be brought into Reality Composer. (The reason I use Unreal Engine 5 is to make an animation that changes through people's interaction.)
Post not yet marked as solved
0 Replies
725 Views
I'm working on an iOS project that is almost entirely SwiftUI, save for the UIViewRepresentable ARView that I'm using. I'm using RealityKit. When calling MeshResource.generateText(), fairly often some or all entities will fail to generate their mesh with the proper font and will instead render with SF Pro Regular. The font size is not lost, nor are any other attributes of the entity. The same data model that generates the entity also provides a 2D representation of the model, in which the font is never lost. If an entity is generated during makeUIView() of the ARView or during onAppear() of its parent view, the font is never lost; the font is only lost when the entity is generated in response to user input. Results are unchanged by using .ttf vs .otf for the font files. Very often (maybe always?), once one entity fails to render with its provided font, subsequently generated entities will also fail to render with theirs. I have successfully rendered an entity, then re-rendered it, and it has lost its font. Possibly related: generating text entities always throws the error

CoreText performance note: Client called CTFontCreateWithName() using name "CUSTOM FONT NAME" and got font with PostScript name "CUSTOMFONT-NAME". For best performance, only use PostScript names when calling this API.

This error is not thrown if the first text entity (and therefore all subsequent entities) fails to render properly. If a single text entity is generated successfully, then subsequent failing entities are more likely to also throw the error. I also always get the warning "using linearization / solving fallback" when starting the AR session, but it doesn't seem to be related. I also often (always?) get:

2023-03-27 15:10:30.938146-0700 appName[38594:50629667] [Technique] ARWorldTrackingTechnique <0x15b8d0570>: World tracking performance is being affected by resource constraints [25]

2023-03-27 15:10:30.060163-0700 appName[38594:50629490] [TraitCollection] Class CKBrowserSwitcherViewController overrides the -traitCollection getter, which is not supported. If you're trying to override traits, you must use the appropriate API.

I don't know if any of those other errors are related, but I figured I should include them. This has been happening intermittently for a long time and with a number of different fonts. I created a very simple version of this in a separate project that will eventually reproduce the error (if you have enough patience). You'll just need to add a non-Apple font to the project. It should render in the provided font sometimes, but when you rerun the project, or if you add the entity enough times, it should fail. It's more likely to fail with the first entity than with subsequent entities, so running the project repeatedly is the most efficient way to reproduce the bug.

```swift
import SwiftUI
import RealityKit

class MockRenderQueue: ObservableObject {
    @Published var renderActions: [(_ arView: ARView, _ cameraAnchor: AnchorEntity) -> Void] = []
}

struct ContentView: View {
    @StateObject var mockRenderQueue = MockRenderQueue()

    var body: some View {
        ZStack(alignment: .bottom) {
            ARViewContainer(renderQueue: mockRenderQueue).edgesIgnoringSafeArea(.all)
            Button(action: {
                mockRenderQueue.renderActions.append(addTextEntity)
            }) {
                ZStack {
                    RoundedRectangle(cornerRadius: 20)
                        .frame(height: 48)
                    Text("Add Text Entity")
                        .bold()
                        .foregroundColor(.white)
                }
            }.padding()
        }
    }

    func addTextEntity(to arView: ARView, cameraAnchor: AnchorEntity) {
        let fontName = "Font-Name" // Put your font name here
        guard let customFont = UIFont(name: fontName, size: 0.1) else {
            print("Error: Could not find font")
            return
        }

        // Make text entity
        let textMesh = MeshResource.generateText("hello world", extrusionDepth: 0.001, font: customFont)
        let textEntity = ModelEntity(mesh: textMesh, materials: [UnlitMaterial(color: .black)])

        // Make an anchor and position it 1 m back and centered, facing the user
        let anchorEntity = AnchorEntity()
        anchorEntity.look(at: [0, 0, 0], from: [textMesh.bounds.extents.x / -2, 0, -1], relativeTo: cameraAnchor)
        anchorEntity.transform.rotation *= simd_quatf(angle: .pi, axis: [0, 1, 0])

        // Add text entity to anchor
        anchorEntity.addChild(textEntity)

        // Add the anchor to the camera anchor
        cameraAnchor.addChild(anchorEntity)
    }
}

struct ARViewContainer: UIViewRepresentable {
    @ObservedObject var renderQueue: MockRenderQueue
    @State private var cameraAnchor = AnchorEntity(.camera)

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.scene.addAnchor(cameraAnchor)
        return arView
    }

    func updateUIView(_ arView: ARView, context: Context) {
        for renderAction in renderQueue.renderActions {
            renderAction(arView, cameraAnchor)
        }
    }
}
```

I've tried a handful of different implementations, exposing the ARView through a Coordinator, as a singleton, and other things, and I just can't break this bug. Does anyone else have this problem? It is a significant detraction from the user experience and I really need to find out how to fix this. Thank you in advance.
Post not yet marked as solved
1 Reply
849 Views
I am working on an augmented reality application and I am sort of overwhelmed about which AWS database setup to use for my app. Yes, I HAVE to use AWS: I have an interview with them soon and I am sure they will ask me about it. It's not a job interview; it's a grant program, and the requirement is that I have an AWS database. It doesn't have to be the most complex, and I don't need to be an expert, which obviously I am not, LOL. I just need a very basic database. In the past I used Google's Firebase, which is A LOT simpler. Well, AWS is NOT simple; there are just way too many options. The issue is, I am making a multipeer app, and users will have 3D objects connected to their IDs, plus text and images. I am looking at Amplify (the 'simplified' command-line option) or DynamoDB. I don't love using command-line tools or Node.js, but I can deal with it for now; I am leaning towards Amplify because it appears to be the simplest. Do you happen to use an AWS database for your AR app? If so, what is your setup? Alternatively, I have not seen sample-code AR apps with databases connected; if you happen to know of any, please share the link and I can figure it out from there. THANK YOU.
Post not yet marked as solved
1 Reply
1.5k Views
I'm starting to develop an AR app for my iPhone through Unity. When I build in Unity there are no errors, but opening the project in Xcode, the build fails with several errors of this sort: double-quoted include *** in framework header, expected angle-bracketed instead. I'm on Unity 2022.2, with the XR Foundation and ARKit plugins up to date (5.0), and Xcode Version 14.3 (14E222b). I have already crawled the internet, but was not very successful. Changing the double quotes to angle brackets doesn't change anything, and turning off the warnings under Build Settings didn't solve the problem either. Any clue?
Post not yet marked as solved
0 Replies
575 Views
I got a Mac Studio (M1 Ultra chip) with hopes that it would dramatically cut down on ObjectCapture rendering time. I am using a benchmark to compare the same rendering process on the M1 Ultra and an M2 Pro, and the times are the same almost to the second. (By the way, the render quality lately is getting better and better and it's amazing; thanks for the work.) For speed/performance, I am wondering: with a 10-core processor and all of the extra resources on the Studio/M1 Ultra, should I be modifying my code to make use of all 10 cores? And/or is there a config setting or something? GPUs? Could an external graphics processor be part of the solution? I'm not a seasoned Apple developer, so my apologies if this has an obvious answer.
Post not yet marked as solved
0 Replies
427 Views
I'm creating a RealityKit game that requires an object to be spawned in midair, fired in a certain direction using an impulse, and then bounce around the cubic container that surrounds it without any other forces being applied to it. Think the DVD-logo screensaver, but in 3D. However, when I spawn the object with this code, it falls straight down due to the force of gravity being applied:

```swift
// Create the object entity
let object = ModelEntity(mesh: MeshResource.generateBox(size: [0.07, 0.01, 0.14]),
                         materials: [SimpleMaterial(color: .white, isMetallic: false)])

// Define the object shape
let objectShape = [ShapeResource.generateBox(size: [0.07, 0.01, 0.14])]

// Create the object's physics body
let objectPhysicsBody = PhysicsBodyComponent(shapes: objectShape, mass: 1, material: .default, mode: .dynamic)

// Create the object's collision component
let objectCollision = CollisionComponent(shapes: objectShape)

// Set the components on the object
object.components.set(objectPhysicsBody)
object.components.set(objectCollision)

// Attach the object to the anchor
anchor.addChild(object)
```

I've tried applying a constant negating force upwards (quite messy, and it didn't work), and also changing the mode to kinematic and manually handling the collisions with the container (same again). If I could remove the gravitational force from the object, the game would function as intended, but I've not been able to find any solutions or documentation. My project has been stuck here for some time now and I really don't know how to resolve this. If anybody has any suggestions or insights I'd be extremely grateful. Cheers.
Post not yet marked as solved
0 Replies
512 Views
I'm looking to make an app that uses AR technologies, and I need to know whether it's possible to gather point-cloud information on an iPhone that doesn't have LiDAR, ideally with depth/confidence information too. I know that ARCore supports this with the Depth and Raw Depth APIs for Android phones without depth sensors, but can ARKit also achieve this? In fact, given that ARCore supports iOS, is it possible to use the ARCore Depth API on iOS? Also, does anyone know how accurate ARKit point clouds are compared to ARCore point clouds? Thanks to everyone who is able to respond!
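On the ARKit side: any ARKit-capable iPhone, LiDAR or not, produces a sparse point cloud during world tracking, exposed as ARFrame.rawFeaturePoints. Unlike ARCore's Raw Depth API there is no per-point confidence value, and the dense sceneDepth / smoothedSceneDepth buffers do require LiDAR. A minimal sketch of reading the sparse cloud:

```swift
import ARKit

// Sketch: read the sparse feature-point cloud that world tracking
// produces on any ARKit device (no LiDAR required).
class PointCloudDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let cloud = frame.rawFeaturePoints else { return }
        // `points` are world-space positions in metres; `identifiers` are
        // stable IDs you can use to accumulate points across frames.
        for (point, id) in zip(cloud.points, cloud.identifiers) {
            print("feature \(id): \(point)")
        }
    }
}
```

These points are a tracking by-product rather than a depth measurement, so they are sparser and noisier than LiDAR or ARCore raw-depth output; accumulating them across frames by identifier helps.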