Discuss augmented reality and virtual reality app capabilities.

Posts under AR / VR tag

93 Posts
Post not yet marked as solved
0 Replies
191 Views
Dear all, In "Explore advanced rendering with RealityKit 2," Courtland presents how one can efficiently leverage dynamic meshes in RealityKit and update them at runtime. My question is quite practical: say I have a model of fixed topology and a set of animations (coordinates of each vertex per frame, finite duration) that I can only generate at runtime. How do I drive the mesh updates at 60 FPS? Can I define a reusable AnimationResource for every animation once at startup and then schedule their playback like simple transform animations? Any helpful reply pointing me in the right direction is appreciated. Thank you. ~ Alexander
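A minimal sketch of one approach (not a confirmed answer): as far as I know AnimationResource cannot carry per-vertex data, so the pattern from the session is to rebuild the mesh contents each frame from a scene-update subscription. Here `bakedFrames` is a hypothetical array holding one set of vertex positions per animation frame:

```swift
import RealityKit
import Combine

// Sketch: drive a fixed-topology mesh from precomputed per-frame vertex positions.
// `bakedFrames` is a hypothetical name: bakedFrames[i] = positions for frame i.
final class VertexAnimationPlayer {
    private var updateSub: Cancellable?
    private var frameIndex = 0

    func play(on entity: ModelEntity,
              frames bakedFrames: [[SIMD3<Float>]],
              in scene: RealityKit.Scene) {
        updateSub = scene.subscribe(to: SceneEvents.Update.self) { [weak self] _ in
            guard let self = self,
                  self.frameIndex < bakedFrames.count,
                  let model = entity.model else { return }
            // Rewrite only the position buffers; topology stays fixed.
            var contents = model.mesh.contents
            contents.models = .init(contents.models.map { m in
                var m = m
                m.parts = .init(m.parts.map { part in
                    var part = part
                    part.positions = MeshBuffers.Positions(bakedFrames[self.frameIndex])
                    return part
                })
                return m
            })
            // MeshResource is a reference type, so this updates the live mesh in place.
            try? model.mesh.replace(with: contents)
            self.frameIndex += 1
        }
    }
}
```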
Posted by AlexLike. Last updated.
Post not yet marked as solved
1 Reply
1k Views
It seems that on iOS 13 & 14 environment probes are notably too dark, and never reach an acceptable brightness that matches the surrounding environment. Is there any way to get "reasonable" IBL lighting in QuickLook that is not like 50% gray all the time, without resorting to hacks such as using emissive colors/textures? Clearly there must be something wrong with the IBL estimation, as the same scene in Google SceneViewer is very bright and nice under the same circumstances. The issue reproduces for example with the QuickLook gallery; the ceramics piece there is nearly 100% white as per the USD file but renders dull and gray next to a physical ceramics piece. Video of the issue: drive.google.com/file/d/14mVQFTNe6pO_4tYNIvpZa9eAS2YzoVPO/view?usp=sharing More pictures: drive.google.com/drive/folders/1ej6g-gpBAu53z2Zn08eQFNZAkTmDA_XJ?usp=sharing Note that in those pictures, all spheres in that grid are purely white, with varying degrees of metallic and roughness being the only difference. My expectation would be that the diffuse ones would appear "white" and not dark grey; seems impossible to get a "realistic" picture. Happens here as well: https://developer.apple.com/augmented-reality/quick-look/models/cupandsaucer/cup_saucer_set.usdz
Posted by herbst. Last updated.
Post not yet marked as solved
2 Replies
244 Views
I wrote a simple ARKit app that has a "virtual" hat hardcoded into the Experience.rcproject. The hat is a .usdz file. I want to let users of the app import their own .usdz hats, so that they are not pre-hardcoded into the app. There would be an "Upload your hat" button; the user would tap it, import the .usdz file, and the hat would end up on the user's head. From the code perspective, the Experience model is strongly typed, so I'm not sure how that could work:

func updateUIView(_ uiView: ARView, context: Context) {
    let arConfiguration = ARFaceTrackingConfiguration()
    uiView.session.run(arConfiguration, options: [.resetTracking, .removeExistingAnchors])
    let arAnchor = try! Experience.loadHat() // I want this to happen dynamically depending on the imported file from the UI
    uiView.scene.anchors.append(arAnchor)
}

Is adding a .usdz file dynamically to Experience somehow possible?
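One possible approach, sketched below without the generated Experience type: load the user's file with Entity.loadModel(contentsOf:) and attach it to a face anchor yourself. `loadUserHat` and `fileURL` are illustrative names; the file would come from something like a document picker:

```swift
import RealityKit

// Sketch: bypass the strongly typed Experience and anchor a user-supplied
// .usdz to the face, the same way the Reality Composer scene anchors the hat.
func loadUserHat(from fileURL: URL, into arView: ARView) {
    do {
        let hat = try Entity.loadModel(contentsOf: fileURL) // synchronous variant
        let anchor = AnchorEntity(.face)                    // face-tracking anchor
        anchor.addChild(hat)
        arView.scene.anchors.append(anchor)
    } catch {
        print("Could not load hat: \(error)")
    }
}
```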
Posted by wissil. Last updated.
Post not yet marked as solved
8 Replies
1.4k Views
Hello, in our app we are downloading some user-generated content (.reality files and USDZs) and displaying it within the app. This worked without issues in iOS 14, but with iOS 15 (release version) there have been a lot of issues with certain .reality files. As far as I can see, USDZ files still work. I've created a little test project, and the error message log is not really helpful:

2021-10-01 19:42:30.207645+0100 RealityKitAssetTest-iOS15[3239:827718] [Assets] Failed to load asset of type 'RealityFileAsset', error:Could not find archive entry named assets/Scéna17_9dfa3d0.compiledscene.
2021-10-01 19:42:30.208097+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] Failed to load asset path '#18094855536753608259'
2021-10-01 19:42:30.208117+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] AssetLoadRequest failed because asset failed to load '#18094855536753608259'
2021-10-01 19:42:30.307040+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
(the "throwing -10878" line repeats nine times)
▿ Failed to load loadRequest.
  - generic: "Failed to load loadRequest."
Basic code structure that is used for loading:

cancellable = Entity.loadAsync(named: entityName, in: .main)
    .sink { completion in
        switch completion {
        case .failure(let error):
            dump(error)
            print("Done")
        case .finished:
            print("Finished loading")
        }
    } receiveValue: { entity in
        print("Entity: \(entity)")
    }

Is there any way to force it to load in a mode that enforces compatibility? As mentioned, this only happens on iOS 15. Even ARQuickLook can't display the files anymore (no issues on iOS 14). Thanks for any help!
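A sketch of one thing worth trying while isolating the failure (illustrative code, not a confirmed fix): load the downloaded .reality file directly from its URL with loadAnchorAsync(contentsOf:), bypassing the bundle-name lookup, to see whether the error comes from the archive itself or from asset path resolution:

```swift
import RealityKit
import Combine

// Sketch: load a downloaded .reality file straight from its file URL.
var cancellable: Cancellable?

func loadRealityFile(at url: URL, into arView: ARView) {
    cancellable = Entity.loadAnchorAsync(contentsOf: url)
        .sink { completion in
            if case .failure(let error) = completion {
                dump(error) // same diagnostics as the bundle-based path
            }
        } receiveValue: { anchor in
            arView.scene.anchors.append(anchor)
        }
}
```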
Posted. Last updated.
Post not yet marked as solved
0 Replies
165 Views
It seems weird to me that OcclusionMaterial hides objects from points of view on both sides, while other materials like SimpleMaterial are unidirectional: by default, only faces with counter-clockwise vertex winding are visible. This is inconvenient when you create two polygons of the same geometry and position but with opposite top faces, and one of the layers is OcclusionMaterial: the OcclusionMaterial then occludes the second layer as well. Here is my example, an animated piston. Version with OcclusionMaterial replaced with red material. Version with OcclusionMaterial. Is it possible to make OcclusionMaterial visible from one side only? Was this behavior made on purpose? Thanks!
Posted by hexwhyzet. Last updated.
Post not yet marked as solved
1 Reply
266 Views
Hi, I'm using the provided CaptureSample together with HelloPhotogrammetry to create a 3D model. Although I am making sure that there are depth images (TIF) within the folder, after building the model, the model size is tiny. How can I make sure that the model is built at real-world size? Or, how can I resize the model to real-world size? Thanks.
Posted by iddog. Last updated.
Post not yet marked as solved
1 Reply
1.3k Views
Hello, I am new to this amazing AR developing world. I wanted to know: if I want to develop an app for the AR glasses that are about to be launched in the future, should I use ARKit?
Posted. Last updated.
Post not yet marked as solved
1 Reply
690 Views
Hello, newbie here. I am getting these errors while trying to build a Unity ARKit project from Xcode:

error: Cannot initialize a parameter of type 'id _Nonnull' with an rvalue of type 'Class'

The following is the code the error points to:

[nativeSession->_classToCallbackMap setObject:anchorCallbacks forKey:[ARPlaneAnchor class]];
Posted. Last updated.
Post not yet marked as solved
1 Reply
267 Views
We need to develop an iOS app that will make heavy use of ARSessions with ARFaceTrackingConfiguration, in more or less 100% of the screens. Our main concern is battery usage. We're in the discovery phase, so I'm trying to find/research all possible risks related to it. For now, what I found is:
- ARKit sessions don't "drain" the battery (I did some testing with my phone); they just consume more than common apps (more or less the same amount of battery as having the iPhone camera open for a long period of time, with the addition that ARSession is continually creating frames in order to track the face: real-time video processing).
- I'm aware of the run(_:options:) and pause() methods on ARSession, to pause the tracking functionality when it's not necessary.
- I'm aware of lowPowerModeEnabled and that we can react to changes in that property.
I searched Apple Developer for an official article that provides general considerations about battery usage in an ARKit session, but I couldn't find anything dedicated to the topic. I'm wondering if there is a really important thing that I should be aware of and am missing when implementing the feature. It's the first time I'll work with ARKit, and it's critical for the project. Thanks in advance!
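A minimal sketch of the run/pause pattern mentioned above, tied to view lifecycle (illustrative class and property names):

```swift
import ARKit
import UIKit

// Sketch: keep the session alive only while the AR screen is visible.
// Pausing stops the camera feed and tracking, which is the main battery cost.
class FaceTrackingViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause() // no frames are produced while paused
    }
}
```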
Posted by jmgentili. Last updated.
Post not yet marked as solved
0 Replies
170 Views
Hi, I am a Unity programmer who is new to AR. I would like to create points (like those already present on the AR face) and place them on the face. I have tried a technique using x, y, z coordinates, but the problem is that since I am creating new points in relation to the AR face points, all I have to do is move my head and my points are no longer in the same place (so they are not fixed on my face like the built-in points are). How can I make sure that my points are fixed on the face and follow the movement of the head?
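For reference, the native ARKit version of the fix is to parent the custom geometry to the face anchor's node, so the point's position is expressed in face-local coordinates and automatically follows head motion; the same idea (parenting to the face anchor's transform) applies in Unity's AR Foundation. A sketch, with made-up offset values:

```swift
import ARKit
import SceneKit

// Sketch (ARSCNViewDelegate): when the face anchor's node appears, attach a
// custom point as its child. Child positions are in face-local space, so the
// point stays fixed on the face as the head moves.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard anchor is ARFaceAnchor else { return }
    let point = SCNNode(geometry: SCNSphere(radius: 0.002))
    point.position = SCNVector3(0.03, 0.02, 0.06) // assumption: near the cheek, meters
    node.addChildNode(point)                      // parented to the face node
}
```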
Posted by Austine. Last updated.
Post not yet marked as solved
0 Replies
183 Views
Hi, I am developing an AR application on iPhone (12 Pro Max as test device, Xcode 13.3 (13E113)), using an ARSession as the camera source and a Metal view as the rendering target. Since I am trying to render a 3D model with a high polygon count, I want to set up a remote server that handles the actual rendering, to reduce the load on the iPhone. Are there any examples out there of this approach for ARKit? I am looking for something like Microsoft's remoting method (which can be found here: https://docs.microsoft.com/en-us/windows/mixed-reality/develop/native/holographic-remoting-player). Any example similar to this for iOS, utilizing (if allowed by the API) re-projection like HoloLens 2, would be great.
Posted. Last updated.
Post not yet marked as solved
6 Replies
1.5k Views
Hello there! After the iOS 15 update, Quick Look does not work properly with SafariServices, which we use in our native app. For example, if you open this link on Twitter https://developer.apple.com/augmented-reality/quick-look/ and just tap a model to use Quick Look, you will see the AR part is not working.
Posted. Last updated.
Post marked as solved
5 Replies
426 Views
I am fairly new to iOS development, and I have a rather simple question (I guess): I exported an FBX model from Blender with attached child elements (some empty axes). I need those transforms for a certain purpose in my app. However, they won't show up when I'm placing my model entity in the AR scene and try to access its child elements. They do show up in the preview view in Xcode when I click on the model file. Can someone please explain why this is and how I can access them? Would be lovely. Thanks for your help in advance!
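One likely cause, sketched below as a guess: the ModelEntity-oriented loaders flatten the hierarchy into a single entity, whereas Entity.load(contentsOf:) preserves it, after which named children can be found with findEntity(named:). "AxisEmpty" is a placeholder for whatever the empty was called in Blender:

```swift
import RealityKit

// Sketch: keep the child hierarchy when loading so empty transforms survive.
// Entity.loadModel(contentsOf:) flattens children away; Entity.load(contentsOf:)
// returns the full entity tree from the converted .usdz.
func attachmentTransform(in modelURL: URL) throws -> Transform? {
    let root = try Entity.load(contentsOf: modelURL)
    return root.findEntity(named: "AxisEmpty")?.transform // placeholder name
}
```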
Posted by MiriamJo. Last updated.
Post not yet marked as solved
2 Replies
662 Views
So, I've modified the CaptureSample iOS app to take photos using the TrueDepth front camera. It worked perfectly, and I have TIF depth maps together with the gravity vector and the photos I took. Using the HelloPhotogrammetry command line tool, I created the meshes without any problems. I notice the meshes have a consistent size between them; for example, creating a mesh of my face and a mesh of my nose, the nose mesh fits perfectly on top of the nose on the face mesh! Great! BUT, when I open the meshes in Maya, for example, they are really, really tiny! I was expecting to see the objects at the proper scale, and hopefully be able to even take measurements in Maya to see if they would match the real measurements of the scanned object, but they don't seem to come out at the right size at all. I tried setting Maya to meters, centimetres and millimetres, but it always imports the meshes really tiny. I have to apply a scale of 100 to be able to see the meshes, but then they don't measure correctly. By trial and error, I found that scaling the meshes by 86 makes them match real-world scale in centimetres. Is there a proper space conversion that needs to be applied to the mesh to convert it to real-world scale? Could the problem be that I'm using the TrueDepth camera instead of the back camera, and the depth map values come in a different scale than what HelloPhotogrammetry expects?
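A minimal unit-bookkeeping sketch (pure arithmetic, no API): if, as I believe, Object Capture output is expressed in meters while the DCC interprets file coordinates in centimeters, everything appears 100x too small; the factor-of-100 part is unit conversion, and a leftover 86-vs-100 discrepancy would then point at the depth data itself rather than units:

```swift
// Sketch: factor to multiply coordinates by when re-expressing them from the
// file's unit into the target application's working unit (both given in meters).
func importScale(fileUnitInMeters: Double, targetUnitInMeters: Double) -> Double {
    fileUnitInMeters / targetUnitInMeters
}

// e.g. meters → centimeters: importScale(fileUnitInMeters: 1.0, targetUnitInMeters: 0.01)
// yields 100, matching the "scale by 100 to see anything" observation above.
```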
Posted by rhradec. Last updated.
Post not yet marked as solved
0 Replies
215 Views
Hi Community, I am using Google's model-viewer (https://modelviewer.dev/) to display 3D models in the web browser. It also enables the user to open the 3D models in AR applications; on iOS, Quick Look will be opened. It all works fine and well, except when the web space directory is protected by HTTP Basic Auth (protecting the usdz model files is a requirement of my client). As I understand it, Quick Look loses (or does not have any access to) the HTTP session, and is therefore not authenticated when trying to access the usdz file in the protected web space directory. The error message in Quick Look is: "Object requires a newer version of iOS". Is there any way to open a usdz file that is in a protected web space directory with Quick Look?
Posted. Last updated.
Post not yet marked as solved
2 Replies
867 Views
Hi, in SceneKit I pass custom parameters to a Metal shader using an SCNAnimation, for example:

let revealAnimation = CABasicAnimation(keyPath: "revealage")
revealAnimation.duration = duration
revealAnimation.toValue = toValue
let scnRevealAnimation = SCNAnimation(caAnimation: revealAnimation)
material.addAnimation(scnRevealAnimation, forKey: "Reveal")

How would I do something similar with a Metal shader in RealityKit? I saw in the octopus example:

//int(params.uniforms().custom_parameter()[0])

But it's commented out, and there is no example of how to set the custom variable and animate it (unless I missed it). Great session BTW. Thanks
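A sketch of how this could be driven in RealityKit 2, assuming the material is a CustomMaterial: its single float4, custom.value, is what the surface shader reads via custom_parameter(). Since there is no CAAnimation-style driver for it, one way is to rewrite the value each frame from a scene-update subscription (illustrative class and field layout; revealage is packed into the x component):

```swift
import RealityKit
import Combine

// Sketch: ramp a CustomMaterial's custom.value.x from 0 to 1 over `duration` seconds.
final class RevealAnimator {
    private var sub: Cancellable?
    private var revealage: Float = 0

    func start(on entity: ModelEntity, duration: Float, in scene: RealityKit.Scene) {
        sub = scene.subscribe(to: SceneEvents.Update.self) { [weak self] event in
            guard let self = self, var model = entity.model else { return }
            self.revealage = min(1, self.revealage + Float(event.deltaTime) / duration)
            model.materials = model.materials.map { (material) -> Material in
                guard var custom = material as? CustomMaterial else { return material }
                // The shader sees this as params.uniforms().custom_parameter()
                custom.custom.value = SIMD4<Float>(self.revealage, 0, 0, 0)
                return custom
            }
            entity.model = model
            if self.revealage >= 1 { self.sub?.cancel() } // done animating
        }
    }
}
```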
Posted by IainA. Last updated.
Post not yet marked as solved
1 Reply
262 Views
Hello, I tried to implement a game in ARKit: when the user taps the screen, a bullet is fired from the touched position. I know how to get the 2D position on the screen, as below:

void HandleTapGesture(UITapGestureRecognizer sender)
{
    SCNView areaPanned = sender.View as SCNView;
    CGPoint point = sender.LocationInView(areaPanned);
}

but I don't know how to transform this position into a world position, so that I can use it in the scenario below:

let bullet = SCNNode(geometry: SCNSphere(radius: 0.08))
bullet.position = ? // open var position: SCNVector3

Can anyone share any opinion? Many thanks.
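A sketch in Swift of the usual conversion: SCNSceneRenderer's unprojectPoint(_:) maps a screen point plus a normalized depth back into world space. The depth value chosen here is an assumption; pick it (or unproject at two depths to get a ray) depending on where the bullet should spawn:

```swift
import SceneKit

// Sketch: convert a 2D tap location into a 3D world position.
// `depth` is normalized: 0 = near clipping plane, 1 = far clipping plane.
func worldPosition(of point: CGPoint, in view: SCNView, depth: Float = 0.99) -> SCNVector3 {
    view.unprojectPoint(SCNVector3(Float(point.x), Float(point.y), depth))
}
```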
Posted by htcsharp. Last updated.