I'm currently trying to project a transparent .png file onto a flat surface (table/paper), but the shadow is giving me a gray box.
I'm a huge fan of the shadows in Reality Composer, but I'm looking for a way to adjust the shadow's opacity or turn it off entirely.
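If the Reality Composer scene is presented through RealityKit's ARView (an assumption; I'm not aware of a public per-shadow opacity control), grounding shadows can at least be switched off wholesale with a render option. A minimal sketch:

```swift
import RealityKit

// When a Reality Composer scene is shown through an ARView,
// grounding shadows can be disabled via ARView.RenderOptions.
func disableGroundingShadows(in arView: ARView) {
    arView.renderOptions.insert(.disableGroundingShadows)
}
```

This is all-or-nothing, though; it doesn't help if you only want to reduce the shadow's opacity.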
Hey, it's showing me "There was a problem running this page" after I add more than 2 layers to my AR project. How can I solve this so that I can have more layers?
My code:
I have a plane AnchorEntity with a child raycast AnchorEntity and my ObjectEntities (USDZ & Mesh) as children of that.
On devices up to iOS 14 I get grounding shadows, but on iOS 15 these don't appear.
My testing shows that grounding shadows only appear on iOS 15 if I add ObjectEntities directly as children of the plane.
Is there some way in iOS 15 to get grounding shadows to pass through the intermediate raycast AnchorEntity to the plane underneath as they do in iOS14?
Thanks.
Here's my test code:
guard let query = arView.makeRaycastQuery(from: arView.center, allowing: .estimatedPlane, alignment: .horizontal),
      let result = arView.session.raycast(query).first else { return }
let raycastAnchor = AnchorEntity(raycastResult: result)
let planeAnchor = AnchorEntity(plane: .horizontal, classification: .any, minimumBounds: [0.2, 0.2])
guard let robot = try? Entity.loadModel(named: "robot") else { return }
// ** This gives a grounding shadow on iOS 14 but not on iOS 15
planeAnchor.addChild(raycastAnchor)
raycastAnchor.addChild(robot)
arView.scene.anchors.append(planeAnchor)
// ** This gives a grounding shadow on both iOS 14 and iOS 15
raycastAnchor.addChild(planeAnchor)
planeAnchor.addChild(robot)
arView.scene.anchors.append(raycastAnchor)
How can I transition my career from iOS development to working in AR/VR?
In our AR app and App Clip made with SceneKit, we experience z-fighting between particles and another transparent object (a transparent plane faking a bloom effect around one of our models).
We tried changing the sorting mode without success. Moreover, the z-fighting appears only on iPhone (tested on an iPhone X) and not on iPad (tested on an iPad Pro 2021).
Here is a video showing the problem and a screenshot of our rendering settings for the particle system: https://drive.google.com/drive/folders/1fvyHRVfprw606qnQSRaLuDjtY1hN8_f2?usp=sharing
In our AR app and App Clip made with SceneKit, we experience very significant framerate drops when our 3D content appears at different steps of the experience.
For now, all of our 3D objects are in our main scene. Those that are supposed to appear at some point in the experience have their opacity set to 0.01 at the beginning and then fade in with an SCNAction (we set their opacity to 0.01 at the start to make sure these objects are rendered from the beginning of the experience).
However, if the objects all have their opacity set to 1 from the start, we don't see any fps drop.
It is worth noting that the fps drops only happen the first time the app is opened; if I close it and re-open it, the experience unfolds without any freeze.
What would be the best way to load (or pre-load) these 3D elements to avoid these freezes?
We have conducted our tests on an iPhone X (iOS 15.2.1), an iPhone 12 Pro (iOS 14), and an iPad Pro 2020 (iPadOS 14.8.1).
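One thing worth trying here is SceneKit's `prepare(_:completionHandler:)`, which compiles shaders and uploads GPU resources for nodes before they are first rendered. A minimal sketch (the `sceneView` and node parameters are assumptions about your setup):

```swift
import SceneKit

// Pre-load the hidden nodes' GPU resources up front so the first
// fade-in doesn't stall the render thread on shader compilation
// and texture uploads.
func preloadHiddenContent(in sceneView: SCNView, nodes: [SCNNode]) {
    sceneView.prepare(nodes) { success in
        // success is false if preparation was cancelled
        print("Pre-loading finished: \(success)")
    }
}
```

Whether this removes the first-launch-only freeze (which sounds like one-time shader compilation being cached) would need to be verified on device.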
Hello everyone,
I created a 3D character animation in Blender and I would like to import it into Reality Composer.
However, when I export the animation from Blender, it won't show in Composer; only the static object appears.
My USDZ scene has the character parented to an animated armature.
Is there any way to import a 3D character animation (made in a 3D package) into Reality Composer?
Thanks!
We create .reality files with Reality Composer on an iPad Pro 2020 (iPadOS 14.8.1).
They open correctly in AR on iOS 14 devices (our iPad and an iPhone 12 Pro).
However, on our iPhone X (iOS 15.2.1), AR Quick Look opens but displays an error message saying "the object cannot be opened".
I have tried creating a .reality file in Reality Composer directly on the iOS 15 iPhone X (I just exported the template project for horizontal plane tracking), but the export fails to open on the very same device.
However, the file generated on the iOS 15 iPhone X opens fine on my iPad Pro (iPadOS 14.8.1).
USDZ files work fine on iOS, though.
Any lead on how to solve this issue?
I'm not sure what happened; I'm pretty sure I wasn't having issues importing these file types a few weeks ago. Just to be sure, I checked the webpage for Reality Converter, and yes, glTF is listed:
Simply drag-and-drop common 3D file formats, such as .obj, .gltf and .usd
I haven't updated anything on my computer, so I don't know how anything could have changed. Does anyone know what the problem could be?
Hi,
Is the LiDAR scanner on the new iPad Pro and iPhone 12 series a good device for making a 3D scan of an object? How high-resolution would the scan be? And what is the ideal object size?
Also: can the camera system and the LiDAR sensor work together to produce a 3D model with texture?
Any help is much appreciated.
Kind regards, Sybren
Hi there,
I have an idea to develop an iOS VR app providing virtual tours of a few attractions in my town. I've purchased a 360° camera to capture panoramic views of the sites. The next step is to build an iOS VR app to host and present those 360-degree images/videos. I did some research online for a starter guide but only found resources like ARCore and Google Cardboard Unity for iOS. Could you please let me know if there is any official, or non-official but well-written, tutorial on how to make an iOS VR app? Alternatively, a list of to-do items or instructions would be equally welcome. Thank you!
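For 360° still images, one native option needing no third-party SDK is to map the equirectangular panorama onto the inside of a large sphere in SceneKit and put the camera at its center. A minimal sketch (the image name is an assumption; head tracking via CoreMotion is left out):

```swift
import SceneKit
import UIKit

// Build a scene that shows an equirectangular 360° image from the inside
// of a sphere, with the camera at the sphere's center.
func makePanoramaScene(imageNamed name: String) -> SCNScene {
    let scene = SCNScene()

    let sphere = SCNSphere(radius: 10)
    sphere.firstMaterial?.diffuse.contents = UIImage(named: name)
    // Render the inside of the sphere instead of the outside.
    sphere.firstMaterial?.cullMode = .front
    // Un-mirror the texture as seen from inside.
    sphere.firstMaterial?.diffuse.contentsTransform = SCNMatrix4MakeScale(-1, 1, 1)
    sphere.firstMaterial?.diffuse.wrapS = .repeat
    scene.rootNode.addChildNode(SCNNode(geometry: sphere))

    // Camera at the origin; drive its orientation from device
    // motion for a look-around experience.
    let cameraNode = SCNNode()
    cameraNode.camera = SCNCamera()
    scene.rootNode.addChildNode(cameraNode)
    return scene
}
```

For a stereoscopic Cardboard-style view you would render the scene twice, once per eye, which is where the third-party SDKs mentioned below come in.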
Hello,
We are having an issue with an app we developed for our client, which combines AR and VR experiences in one iPhone app.
The app works flawlessly on iPhones up to iOS 13, but VR is broken on iOS 14: after launching the VR experience on an iPhone running iOS 14, the view keeps spinning rapidly.
We have noticed many people facing the same issue with their VR apps after updating their iPhones to iOS 14.
We used the Google VR SDK for VR development and the EasyAR SDK for AR development, and built the app with the Unity 2019.3 game engine.
After going through Google VR's website, we noticed that they point to a new open-source SDK/plugin, Cardboard XR, which works with iPhones on iOS 14. We tried the open-source plugin with their own sample scene and observed that, although VR works on iOS 14, the plugin has other bugs/glitches that prevent us from releasing the app.
This incompatibility between iOS 14 and the Google VR SDK surfaced at the last moment, when we were producing final distribution builds. We have waited since the end of September to see if the issue gets resolved in a subsequent iOS 14 update, but even the current iOS 14.2.1 has the same issue.
So could you please let us know when, or in which update, this issue will be resolved for Google VR SDK apps developed with Unity?
If you have any alternative solution, please let us know.
I'd appreciate a quick response, as our client is awaiting the app release.
Thanks!
Hello,
I have added a .usdz file to my website to show a 3D model on iOS, but when I tap the AR button it keeps giving me the error "object requires a new iOS version". I have tried multiple Apple devices, through Safari as well as Chrome. It works great on Android. Any help would be appreciated. Thanks.
I don't know if this is an issue with Apple's Reality Converter app or with Blender (I'm using 3.0 on the Mac), but when I export a model as .obj and import it into Reality Converter, the scale is off by a factor of 100.
That is, the following workflow creates tiny (1/100-scale) entities:
Blender > [.obj] > Reality Converter > [USDZ]
But this workflow is OK:
Blender > [.glb] > Reality Converter > [USDZ]
Two workarounds are:
export as .glb/.gltf, or
set the scale factor to 100 in Blender when exporting the .obj.
Is this a known issue, or am I doing something wrong?
If it is an issue, should I file a bug report?
Will iPad ever receive these tools for Object Capture? Or, at the very least, Xcode, so we can use the command-line apps for it? I have an M1 iPad Pro that should be able to do everything the M1 Macs can, but it's being held back by software limitations.
I am finding some unexpected behavior with lights I've been adding to a RealityKit scene.
For example, I created 14 PointLights, but only 8 appeared to be used to illuminate the scene.
In another example, I created 7 PointLights and 7 SpotLights, and the frame rate dropped quite a bit.
Are lights computationally expensive, causing some adaptive behavior by RealityKit?
Should I be judicious in my use of lights for a scene?
(Note: I set arView.environment.lighting.resource to a skybox with a black image; my goal was to completely control the lighting. I don't know if that added to the computational load.)
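For reference, the kind of light setup described above looks roughly like this in RealityKit (a sketch; the intensity and radius values are assumptions):

```swift
import RealityKit

// Add a point light to the scene. RealityKit may cap how many
// dynamic lights actually contribute to shading in a given frame,
// which would explain only 8 of 14 lights appearing to take effect.
func addPointLight(to arView: ARView, at position: SIMD3<Float>) {
    let light = PointLight()
    light.light.intensity = 20_000          // in lumens
    light.light.attenuationRadius = 2.0     // in meters
    light.position = position

    let anchor = AnchorEntity(world: .zero)
    anchor.addChild(light)
    arView.scene.anchors.append(anchor)
}
```

Dynamic lights are per-pixel costs that scale with the number of lit fragments, so being judicious with them is generally sensible regardless of any hard cap.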
I am developing a multiplayer AR game based on this repo https://github.com/Unity-Technologies/arfoundation-samples/tree/main/Assets/Scenes/ARKit/ARCollaborationData and this demo https://developer.apple.com/documentation/arkit/creating_a_collaborative_session?language=objc
When I had two devices in the game, everything was good. But when I increased the number of devices to four, the devices overheated quickly and the network got jammed very often. Since it is a peer-to-peer network, each device sends its ARCollaborationData to all other devices every frame, and I don't know if MultipeerConnectivity can handle that amount of data in real time with low energy consumption.
I tested the round-trip time of MultipeerConnectivity in this question https://developer.apple.com/forums/thread/694882, and found it is not very stable for a real-time game.
My question is: is MultipeerConnectivity a good choice for a real-time AR game with more than 4 devices? I know AirDrop is implemented using this technology, so I guess it may be designed for high throughput rather than low latency and stability. Should I use another networking solution like Photon Realtime?
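For context, one mitigation before switching stacks is to send only the critical collaboration data reliably and let the high-frequency per-frame stream go unreliably, which reduces retransmission-induced congestion. A sketch following the pattern in Apple's collaborative-session sample (whether unreliable delivery is acceptable for your game state is an assumption):

```swift
import ARKit
import MultipeerConnectivity

// Broadcast ARKit collaboration data to all connected peers.
// Critical data (e.g. new anchors) goes .reliable; the per-frame
// tracking stream is sent .unreliable to reduce network congestion.
func send(_ collaborationData: ARSession.CollaborationData,
          over session: MCSession) {
    guard !session.connectedPeers.isEmpty,
          let data = try? NSKeyedArchiver.archivedData(
              withRootObject: collaborationData,
              requiringSecureCoding: true) else { return }

    let mode: MCSessionSendDataMode =
        collaborationData.priority == .critical ? .reliable : .unreliable
    try? session.send(data, toPeers: session.connectedPeers, with: mode)
}
```

With four peers this is still O(n²) traffic per frame, so throttling the optional-priority stream may matter as much as the delivery mode.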
We are able to generate a 3D mesh model, but it appears white, as we didn't get texture files with the .mtl file. We found ways to generate a textured model from a set of images at the links below:
https://developer.apple.com/documentation/realitykit/creating_3d_objects_from_photographs
LiDAR and RealityKit – Capture a Real World Texture for a Scanned Model
However, photogrammetry (the Object Capture API) works only on Mac; we want to achieve this on iPhone and iPad.
We can see this working in the "3D Scanner App" and "Polycam" apps.
Please suggest how we can resolve this.
Thanks in advance.
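For reference, the Mac-only pipeline behind the first link boils down to something like the following (a sketch; the input folder and output paths are assumptions, and this requires macOS 12+):

```swift
import RealityKit  // PhotogrammetrySession is macOS-only as of this writing

// Reconstruct a textured USDZ from a folder of photos captured on device.
func reconstruct(imagesAt inputFolder: URL, outputTo outputURL: URL) throws {
    let session = try PhotogrammetrySession(input: inputFolder)
    try session.process(requests: [
        .modelFile(url: outputURL, detail: .medium)
    ])
    Task {
        for try await output in session.outputs {
            if case .processingComplete = output {
                print("Textured model written to \(outputURL.path)")
            }
        }
    }
}
```

Apps like the ones mentioned either upload photos to a server for reconstruction or implement their own on-device texturing (e.g. projecting camera frames onto the LiDAR mesh, as in the second link), since this API itself is not available on iOS.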
In a previous post I asked if 100,000 polygons is still the recommended size for USDZ Quick Look models on the web. (The answer is yes)
But I realize my polygons are 4-sided and not planar, so each has to be broken into 2 triangles when rendered.
Given that, should I shoot for 50,000 polygons (i.e., 100,000 triangles)?
Or does the 100,000 polygon statistic already assume polygons will be subdivided into triangles?
(The models are generated from digital terrain (GeoTIFF) data, not a 3D modeling tool)
Hi
I'm trying to become familiar with RealityKit 2. I'm building the code from the session, but I'm getting compile errors.
Any advice?
Link to the sample code below
https://developer.apple.com/documentation/realitykit/building_an_immersive_experience_with_realitykit