Discuss augmented reality and virtual reality app capabilities.

Posts under AR / VR tag

93 Posts
Post not yet marked as solved
0 Replies
166 Views
It seems odd to me that OcclusionMaterial hides objects from points of view on both sides, while other materials like SimpleMaterial are one-sided: by default, only faces with a counterclockwise winding order are visible. This is inconvenient when you create two polygons with the same geometry and position but opposite facing directions, and one of these layers uses OcclusionMaterial. As a result, the OcclusionMaterial gets stuck occluding the second layer. Here is my example, an animated piston: one version with the OcclusionMaterial replaced by a red material, and one version with the OcclusionMaterial. Is it possible to make OcclusionMaterial visible from one side only? Was this behavior intentional? Thanks!
Post not yet marked as solved
1 Reply
266 Views
Hi, I'm using the provided CaptureSample together with HelloPhotogrammetry to create a 3D model. Although I've made sure that depth images (TIF) are present in the folder, the resulting model is tiny after building. How can I make sure that the model is built at real-world size? Or, alternatively, how can I resize the model to real-world size? Thanks.
Post not yet marked as solved
1 Reply
267 Views
We need to develop an iOS app that will make heavy use of ARSession with ARFaceTrackingConfiguration, on more or less 100% of its screens. Our main concern is battery usage. We're in the discovery phase, so I'm trying to research all the possible risks involved. What I've found so far: ARKit sessions don't "drain" the battery (I did some testing on my phone); they just consume more than common apps, roughly the same amount as keeping the iPhone camera open for a long period, plus the cost of the ARSession continually producing frames to track the face (real-time video processing). I'm aware of the run(_:options:) and pause() methods on ARSession for pausing tracking when it isn't needed. I'm also aware of isLowPowerModeEnabled and that we can react to changes in that property. I searched Apple Developer for an official article with general considerations on battery usage in ARKit sessions, but I couldn't find anything dedicated to the topic. I'm wondering if there is something really important that I should be aware of and am missing when implementing the feature. It's the first time I'll work with ARKit, and it's critical for the project. Thanks in advance!
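The run/pause pattern the poster mentions is usually tied to the view lifecycle; a minimal sketch, assuming an ARSCNView-based screen (the class and property names here are illustrative, not from the original post):

```swift
import ARKit
import UIKit

// Sketch: pause face tracking whenever this screen is not visible,
// and resume it when it reappears, to avoid burning battery in the
// background. FaceTrackingViewController is a hypothetical name.
class FaceTrackingViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard ARFaceTrackingConfiguration.isSupported else { return }
        // A plain run() resumes the paused session; pass reset options
        // only when stale tracking state must be discarded.
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Stops camera capture and per-frame processing.
        sceneView.session.pause()
    }
}
```

Note that pause() stops camera capture entirely, so tracking state is re-acquired on the next run(); for brief interruptions that trade-off is normally worth the power savings.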
Post not yet marked as solved
0 Replies
170 Views
Hi, I am a Unity programmer who is new to AR. I would like to create points (like those ARKit already provides on the face) and place them on the face. I have tried a technique using x, y, z coordinates, but the problem is that, since I am creating the new points in relation to the AR face points, all I have to do is move my head and my points are no longer in the same place (they are not fixed to my face the way the built-in points are). How can I make sure that my points stay fixed on the face and follow the movement of the head?
Post not yet marked as solved
0 Replies
184 Views
Hi, I am developing an AR application on iPhone (12 Pro Max as test device, with Xcode 13.3 (13E113)), using an ARSession as the camera source and a Metal view as the rendering target. Since I am trying to render a 3D model with a high polygon count, I want to set up a remote server to handle the actual rendering and reduce the load on the iPhone. Are there any examples of this approach for ARKit? I am looking for something like Microsoft's Holographic Remoting (described here: https://docs.microsoft.com/en-us/windows/mixed-reality/develop/native/holographic-remoting-player). Any similar example for iOS, utilizing reprojection like HoloLens 2 (if the API allows it), would be great.
Post marked as solved
5 Replies
427 Views
I am fairly new to iOS development, and I have a rather simple question (I guess): I exported an FBX model from Blender with attached child elements (some empty axes). I need those transforms for a certain purpose in my app. However, they don't show up when I place my model entity in the AR scene and try to access its child elements, even though they do show up in the preview view in Xcode when I click on the model file. Can someone please explain why this is and how I can access them? That would be lovely. Thanks in advance for your help!
Post not yet marked as solved
0 Replies
215 Views
Hi Community, I am using Google's model-viewer (https://modelviewer.dev/) to display 3D models in the web browser. It also lets the user open the 3D models in AR applications; on iOS, Quick Look is opened. It all works fine, except when the web space directory is protected by HTTP Basic Auth (protecting the USDZ model files is a requirement of my client). As I understand it, Quick Look loses (or never has access to) the HTTP session, and is therefore not authenticated when it tries to fetch the USDZ file from the protected web space directory. The error message in Quick Look is: "Object requires a newer version of iOS". Is there any way to open a USDZ file that sits in a protected web space directory with Quick Look?
Post not yet marked as solved
1 Reply
262 Views
Hello, I am trying to implement a game in ARKit where, when the user taps the screen, a bullet is fired from the touched position. I know how to get the 2D position on the screen:

    void HandleTapGesture(UITapGestureRecognizer sender)
    {
        SCNView areaPanned = sender.View as SCNView;
        CGPoint point = sender.LocationInView(areaPanned);
    }

but I don't know how to convert this position into a world position, so I can use it in the scenario below:

    let bullet = SCNNode(geometry: SCNSphere(radius: 0.08))
    bullet.position = ? // open var position: SCNVector3

Can anyone share any advice? Many thanks.
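One common approach to this kind of screen-to-world conversion is SceneKit's unprojection API; a sketch (the `sceneView` parameter and the 0.99 depth value are illustrative assumptions, not from the original post):

```swift
import SceneKit
import UIKit

// Sketch: convert a 2D screen tap into a 3D world position by
// unprojecting it at a chosen depth along the tap ray.
// `sceneView` is assumed to be the SCNView that received the tap.
func worldPosition(for point: CGPoint, in sceneView: SCNView) -> SCNVector3 {
    // The z component maps between the near (0.0) and far (1.0)
    // clipping planes; a value near 1 places the point far away,
    // which suits a bullet's target direction.
    let screenPoint = SCNVector3(Float(point.x), Float(point.y), 0.99)
    return sceneView.unprojectPoint(screenPoint)
}
```

If the bullet should instead land on real-world geometry, a hit test or raycast against detected planes would give a world position on a surface rather than a point at an arbitrary depth.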
Post not yet marked as solved
1 Reply
350 Views
I have a plane AnchorEntity with a child raycast AnchorEntity, and my object entities (USDZ & mesh) as children of that. On devices up to iOS 14 I get grounding shadows, but on iOS 15 these don't appear. My testing shows that grounding shadows only appear on iOS 15 if I add the object entities directly as children of the plane. Is there some way in iOS 15 to get grounding shadows to pass through the intermediate raycast AnchorEntity to the plane underneath, as they do in iOS 14? Thanks. Here's my test code:

    let query = arView.makeRaycastQuery(from: arView.center, allowing: .estimatedPlane, alignment: .horizontal)
    let result = arView.session.raycast(query).first
    let raycastAnchor = AnchorEntity(raycastResult: result)
    let planeAnchor = AnchorEntity(plane: .horizontal, classification: .any, minimumBounds: [0.2, 0.2])
    let robot = (try? AnchorEntity.loadModel(named: "robot", in: nil))

    // ** This gives a grounding shadow on iOS 14 but not iOS 15
    planeAnchor.addChild(raycastAnchor)
    raycastAnchor.addChild(robot!)
    arView.scene.anchors.append(planeAnchor)

    // ** This gives a grounding shadow on both iOS 14 and 15
    raycastAnchor.addChild(planeAnchor)
    planeAnchor.addChild(robot!)
    arView.scene.anchors.append(raycastAnchor)
Post not yet marked as solved
0 Replies
252 Views
In our AR app and App Clip made with SceneKit, we experience z-fighting between particles and another transparent object (a transparent plane faking a bloom effect around one of our models). We tried changing the sorting mode without success. Moreover, the z-fighting appears only on iPhone (tested with an iPhone X) and not on iPad (tested with an iPad Pro 2021). Here is a video showing the problem and a screenshot of our rendering settings for the particle system: https://drive.google.com/drive/folders/1fvyHRVfprw606qnQSRaLuDjtY1hN8_f2?usp=sharing
Post not yet marked as solved
2 Replies
293 Views
Hello everyone, I created a 3D character animation in Blender and I would like to import it into Reality Composer. However, when I export the animation from Blender, only the static object shows up in Reality Composer, not the animation. My USDZ scene has the character parented to an animated armature. Is there any way to import a 3D character animation (made in 3D software) into Reality Composer? Thanks.
Post marked as solved
6 Replies
1.2k Views
Does Apple have any documentation on using Reality Converter to convert FBX to USDZ on an M1 Mac? I'm trying to convert an .fbx file to USDZ with Apple's Reality Converter on an M1 Mac (macOS 12.3 Beta), but everything I've tried so far has failed. When I convert the same .fbx files on my Intel-based iMac Pro, it succeeds. Following some advice on these forums, I tried installing all the packages from Autodesk (https://www.autodesk.com/developer-network/platform-technologies/fbx-sdk-2020-0): FBX SDK 2020.0.1 Clang, FBX Python SDK Mac, FBX SDK 2020.0.1 Python Mac, and FBX Extensions SDK 2020.0.1 Mac. Still no joy. I have a workaround, since I still have my Intel-based iMac, but I'd like to switch over to my M1 Mac for all my development. Any pointers? Note: I couldn't get the usdzconvert command-line tool to work on my M1 Mac either; /usr/bin/python isn't there.
Post not yet marked as solved
2 Replies
485 Views
In our AR app and App Clip made with SceneKit, we experience very significant framerate drops when our 3D content appears at different steps of the experience. For now, all of our 3D objects are in our main scene. Those that are supposed to appear at some point in the experience have their opacity set to 0.01 at the beginning and then fade in with an SCNAction (we set their opacity to 0.01 rather than 0 to make sure these objects are rendered from the start of the experience). However, if all the objects have their opacity set to 1 from the start, we do not experience any fps drop. It is worth noting that the fps drops only happen the first time the app is opened; if I close it and re-open it, everything runs without any freeze. What would be the best way to load (or preload) these 3D elements to avoid these freezes? We have tested on an iPhone X (iOS 15.2.1), an iPhone 12 Pro (iOS 14), and an iPad Pro 2020 (iPadOS 14.8.1).
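SceneKit has an explicit preloading API for exactly this first-render hitch: SCNSceneRenderer's prepare(_:completionHandler:) compiles shaders and uploads resources ahead of time. A sketch, assuming an SCNView named `sceneView` and an array of the initially hidden nodes (both names are illustrative):

```swift
import SceneKit

// Sketch: ask SceneKit to prepare (compile shaders, upload geometry
// and textures for) the given nodes up front, instead of lazily on
// the frame where they first become visible.
func preload(_ hiddenNodes: [SCNNode], in sceneView: SCNView) {
    sceneView.prepare(hiddenNodes) { success in
        // The completion handler runs once preparation finishes;
        // fading the nodes in after this avoids the first-use stall.
        print("Preload finished:", success)
    }
}
```

This would also remove the need for the opacity-0.01 workaround, since the resources are resident before the fade-in starts.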
Post not yet marked as solved
4 Replies
510 Views
We create .reality files with Reality Composer on an iPad Pro 2020 (iPadOS 14.8.1). They open correctly in AR on iOS 14 devices (our iPad and an iPhone 12 Pro). However, on our iPhone X running iOS 15.2.1, AR Quick Look opens but displays an error message saying "the object cannot be opened". I also tried creating a .reality file in Reality Composer directly on the iOS 15 iPhone X (I just exported the template project for horizontal plane tracking), but the export fails to open on the very same device. Oddly, the file generated on the iOS 15 iPhone X opens fine on the iPad Pro (iPadOS 14.8.1). USDZ files work fine on iOS, though. Any lead on how to solve this issue?
Post not yet marked as solved
1 Reply
690 Views
Hello, newbie here. I am getting this error while trying to build a Unity ARKit project from Xcode:

    error: Cannot initialize a parameter of type 'id _Nonnull' with an rvalue of type 'Class'

Here is the line of code the error points to:

    [nativeSession->_classToCallbackMap setObject:anchorCallbacks forKey:[ARPlaneAnchor class]];
Post not yet marked as solved
2 Replies
443 Views
Hello, I have added a .usdz file to my website to show a 3D model on iOS, but it keeps giving me the error "Object requires a newer version of iOS" when I click on the AR button. I have tried multiple Apple devices, through Safari as well as Chrome. It works great on Android. Any help would be appreciated. Thanks.
Post not yet marked as solved
1 Reply
356 Views
Will iPad ever receive these tools for Object Capture? Or, at the very least, Xcode, so we could use the command-line apps for it? I have an M1 iPad Pro that should be able to do everything the M1 Macs can, but it's being held back by software limitations.