3D Graphics

Discuss integrating three-dimensional graphics into your app.

Posts under 3D Graphics tag

32 Posts

Post

Replies

Boosts

Views

Activity

GroundingShadowComponent tanks performance even when set to false
Working on a visionOS app, I've noticed that even when castsShadow is false, performance goes down the drain once more than a few dozen entities have a GroundingShadowComponent. I managed to hard-crash the Vision Pro with about 200 entities that each had two ModelEntities with GroundingShadowComponent attached but castsShadow set to false. My workaround is to add and remove the GroundingShadowComponent from entities as needed, but I thought someone at Apple might want to look into this. I don't expect great performance with that many entities actually casting shadows, but I'd expect turning the shadow off to effectively disable the component rather than incur a performance penalty.
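A minimal sketch of the workaround described above, using standard RealityKit component APIs: attach the grounding shadow only while an entity actually needs it, rather than leaving the component in place with castsShadow = false.

```swift
import RealityKit

// Sketch of the workaround: attach the component only when needed,
// remove it entirely otherwise.
func setGroundingShadow(_ enabled: Bool, on entity: Entity) {
    if enabled {
        entity.components.set(GroundingShadowComponent(castsShadow: true))
    } else {
        entity.components.remove(GroundingShadowComponent.self)
    }
}
```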
0
0
253
1w
Loading a .scnz file in Xcode / Displaying it in a view using Swift
Hello! I need to display a .scnz 3D model in an iOS app. I tried converting the file to a .scn file so I could use it with SCNScene, but the file became corrupted. I also tried to instantiate an SCNScene directly with the .scnz file, but that didn't work either (it crashes on instantiation). Given that converting or exporting it to a .scn file with scntool hasn't worked, what would be the best way to use this file? Thank you!
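One avenue worth trying, as a hedged sketch rather than a confirmed fix: loading through SCNSceneSource, which returns nil on failure instead of crashing. The file name and the sceneView variable are placeholders.

```swift
import SceneKit

// Hedged sketch: SCNSceneSource surfaces load failures as nil rather
// than trapping. "model.scnz" is a placeholder name.
if let url = Bundle.main.url(forResource: "model", withExtension: "scnz"),
   let source = SCNSceneSource(url: url, options: nil),
   let scene = source.scene(options: nil) {
    sceneView.scene = scene   // sceneView: an SCNView, assumed to exist
} else {
    print("Failed to load .scnz archive")
}
```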
0
0
174
3w
Applying post-processing to SceneKit's Scene and saving it to a USDZ file
I am fairly new to 3D model rendering and do not know where to start. I am trying, ideally with ARKit & RealityKit or SceneKit, to do a scan of an environment. This includes: applying realistic textures to the model; being able to save it as a .usdz file (to be able to open it within the app itself); and, once it is saved, doing post-processing measurements within the model. I would prefer to accomplish this using a mesh instead of the point cloud used in Apple's sample project. Would this be doable using Apple's APIs on a mobile device, or would it be necessary to use a third-party program? I have managed to create a USDZ file using SceneKit's scene.write(to:options:delegate:progressHandler:) method. However, the saved file is a single object, and it is not possible to use raycasting to do post-processing measurements in the model.
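For reference, a minimal sketch of the export step mentioned above; the scene and the destination file name are assumptions.

```swift
import SceneKit

// Hedged sketch of the USDZ export; `scene` is assumed to exist.
let destination = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("scan.usdz")
let success = scene.write(to: destination, options: nil,
                          delegate: nil, progressHandler: nil)
print(success ? "Exported to \(destination.path)" : "Export failed")
```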
0
0
192
3w
GLB files failing to load texture
I have a .glb model that loads absolutely fine, repeatedly, in Safari or Chrome. There is only one texture, at 8192×8192, and it never has a problem loading in the browser. When we embed the URL into an app, the model loads the first few times (exiting the model, going back to the main menu, and then reloading it), but after a few attempts the texture fails to load. The model and all its data are visible, but the texture itself is black. Why could this be happening? Is there something in the iOS code that is breaking it? Is the iOS code trying to automatically cache the texture and running out of memory? Any help would be much appreciated. Thank you in advance.
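If the viewer is a WKWebView, here is one way to test the out-of-memory theory (a diagnostic sketch under that assumption, not a fix): WebKit kills a page's content process under memory pressure, and that event is observable through the navigation delegate.

```swift
import WebKit

// Hedged diagnostic sketch: this callback fires when WebKit terminates
// the page's content process, typically under memory pressure. Seeing
// it coincide with the black texture would support the memory theory.
class ViewerNavigationDelegate: NSObject, WKNavigationDelegate {
    func webViewWebContentProcessDidTerminate(_ webView: WKWebView) {
        print("Web content process terminated - possible memory pressure")
        webView.reload()   // common mitigation: reload the page
    }
}
```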
0
0
166
Apr ’24
Developing a cosmology app for Vision Pro
Greetings, I'm a new developer and would like to understand exactly how Xcode, SwiftUI, RealityKit, ARKit, Reality Composer Pro, and Unity work together to create a 3D cosmology app. I have created a working solar system using JavaScript, HTML, and WebGL for the 3D rendering. I would now like to carry that over to the Apple Vision Pro. Can someone tell me which frameworks and APIs in the Apple ecosystem I can use to code that? Many thanks!
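To give a flavour of how the pieces fit together on visionOS (a minimal sketch, not a complete answer): SwiftUI hosts the app's windows and volumes, and a RealityView inside them hosts RealityKit entities. Assets authored in Reality Composer Pro would replace the generated sphere used here as a stand-in.

```swift
import SwiftUI
import RealityKit

// Minimal sketch: a SwiftUI view hosting RealityKit content on visionOS.
struct SolarSystemView: View {
    var body: some View {
        RealityView { content in
            // A stand-in "planet"; real assets would come from
            // Reality Composer Pro or USDZ files.
            let planet = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)])
            content.add(planet)
        }
    }
}
```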
0
0
329
Mar ’24
How to Specify Pixel-Specific Depths of Views in xrOS?
With the advent of the third dimension, I wanted to know whether it's currently possible to display flat SwiftUI views with some thickness in xrOS. While .frame(depth: CGFloat?) does the job for views in general, I am eager for a more granular, pixel-specific level of control. I was hoping there are lower-level APIs to achieve this, and I've looked into the fairly new layerEffect shader API, yet it seems incapable of setting the depth of individual pixels...
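For context, a sketch of the coarse-grained option the post refers to — the existing view-level API, not the per-pixel control being asked for:

```swift
import SwiftUI

// The view-level API mentioned above: gives the whole view a uniform
// thickness on visionOS; it does not allow per-pixel depth.
struct ThickCard: View {
    var body: some View {
        Text("Hello, depth")
            .padding()
            .frame(depth: 20)   // 20 points of depth for the entire view
    }
}
```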
0
0
380
Mar ’24
PortalComponent – allow world content to peek out
Hello, I've been tinkering with PortalComponent on visionOS a bit but noticed that the content of the WorldComponent is always clipped to the mesh geometry of whatever entities have the PortalComponent applied. Now I'm wondering if there is any way or trick to allow contents of the portal to peek out – similar to the Encounter Dinosaurs experience on Vision Pro (I assume it also uses PortalComponent?). I saw that PortalComponent has a clippingPlane property (https://developer.apple.com/documentation/realitykit/portalcomponent/clippingplane-swift.property). But so far I haven't been able to achieve a perceptible visual difference with it. If possible I would like to avoid hacky tricks using duplicate meshes or similar to achieve this. Thanks for any hints!
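For readers trying to reproduce the setup, a baseline portal sketch using standard RealityKit APIs; this exhibits the clipping behaviour described above, it is not a fix for it.

```swift
import RealityKit

// Baseline portal: world content is clipped to the portal mesh, which
// is exactly the behaviour the post wants to relax.
func makePortal(into world: Entity) -> Entity {
    world.components.set(WorldComponent())
    let portal = Entity()
    portal.components.set(ModelComponent(
        mesh: .generatePlane(width: 1, height: 1, cornerRadius: 0.5),
        materials: [PortalMaterial()]))
    portal.components.set(PortalComponent(target: world))
    return portal
}
```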
4
0
600
Feb ’24
SceneKit rotation - rotating around the X and Y axes only, causing Z rotation
I am trying to control the orientation of a box in SceneKit (iOS) using gestures. I am using the translation in x and y to update the x and y rotation of the SCNNode. After a long search I have realised that x and y rotation will always lead to z rotation, thanks to this excellent post: https://gamedev.stackexchange.com/questions/136174/im-rotating-an-object-on-two-axes-so-why-does-it-keep-twisting-around-the-thir?newreg=130c66c673f848a7be2873bf675573a9 So I am trying to work out the z rotation this causes and then remove it from my object by applying the inverse quaternion. However, when I rotate the object 90 degrees around X and then 90 degrees around Y, it behaves very strangely. It almost behaves as if it is in gimbal lock, but I did not think that using quaternions in the way that I am would cause gimbal lock like this. I am sure it is something I am missing, or perhaps I am not able to remove the z rotation in this way. Thanks! I have added a video of the strange behaviour here: https://github.com/marcusraty/RotationExample/blob/main/Example.MP4 and the code example is here: https://github.com/marcusraty/RotationExample
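A common way around the twist (a sketch, with the node and the gesture deltas assumed): compose each frame's pitch and yaw as world-space quaternion rotations on top of the current orientation, instead of accumulating separate Euler angles. Because each delta is applied about the fixed screen axes, no z roll accumulates.

```swift
import SceneKit
import simd

// Sketch: apply pan deltas as world-axis rotations composed onto the
// current orientation, avoiding the accumulated z twist.
func applyPan(dx: Float, dy: Float, to node: SCNNode) {
    let sensitivity: Float = 0.01
    let yaw   = simd_quatf(angle: dx * sensitivity, axis: SIMD3<Float>(0, 1, 0))
    let pitch = simd_quatf(angle: dy * sensitivity, axis: SIMD3<Float>(1, 0, 0))
    node.simdWorldOrientation = yaw * pitch * node.simdWorldOrientation
}
```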
0
0
692
Dec ’23
Browser memory limit on iOS
Hi everyone, I hope this is the right place to ask questions like this. I have an app which uses a WebGL scene implemented with three.js. At some point during loading, the app crashes (the page reloads), which usually indicates that the device ran out of memory reserved for the tab. This WebGL scene, however, is fairly light compared to other scenes that load without any issues. How do I debug this? Is it possible to allocate more memory before the page is loaded, or is there a simple way to reduce memory consumption? I have very limited control over the 3D scene, and it doesn't use heavy assets (mostly simple geometry with textures).
0
0
793
Nov ’23
Leaking WebGLRenderer when rerendering on iOS
I have created a react-three-fiber web app which uses a WebGL canvas. When the canvas is forced to rerender due to an external change, a new canvas context is created and the previous one is not released. This leads to Safari refreshing and then crashing, as the number of active canvas contexts goes beyond the maximum limit, along with the memory. This issue is specific to Safari (and to Chrome only on iOS). Does Safari have a different garbage-collection mechanism that does not automatically clear WebGL contexts? If it doesn't, is there an API to invoke the same?
0
0
652
Nov ’23
Rotating SceneKit IBL lighting environment
I have a spherical HDR image that is being used for environment lighting in a SceneKit scene, and I want to rotate the environment image. To set the environment lighting, I use the lightingEnvironment SCNMaterialProperty. This works fine, and my scene is lit using the IBL. As with any SCNMaterialProperty, I expected to be able to use the contentsTransform property to rotate or transform the HDR, so I set it as follows: lightingEnvironment.contentsTransform = SCNMatrix4MakeRotation((45.0).degreesAsRadians, 0.0, 1.0, 0.0). My expectation is that the lighting environment would rotate 45 degrees around Y, but it doesn't change at all. Even if I throw in a completely random transform on all axes, there is no apparent change. To test for a change, I added a chrome ball and a diffuse ball to my scene, and I'm comparing reflections on the chrome ball and lighting on the diffuse ball. There is no change on either. It doesn't matter where I set the contentsTransform; it doesn't work. I had intended to set it from the renderer(_:updateAtTime:) method of SCNSceneRendererDelegate, so that I can rotate the IBL to match the point of view of the scene, but even if I transform the environment immediately after it is set, there is never a change. Is this a bug? Or am I doing something entirely wrong? Has anyone here ever managed to get this to work?
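For anyone reproducing this, a condensed sketch of the setup described above; the scene and hdrImage are assumed, and the degrees-to-radians conversion is written out in place of the post's custom degreesAsRadians extension. The post reports that the final line has no visible effect.

```swift
import SceneKit

// Condensed repro sketch; `scene` and `hdrImage` are assumed to exist.
scene.lightingEnvironment.contents = hdrImage
scene.lightingEnvironment.intensity = 1.0

// Reported as having no effect: rotating the IBL 45 degrees around Y.
scene.lightingEnvironment.contentsTransform =
    SCNMatrix4MakeRotation(45 * .pi / 180, 0, 1, 0)
```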
0
0
582
Nov ’23
AR viewer is bugging out on iPhone 12 and above (for models with glass in the 3D model)
While experimenting with the AR view for different products, we came across an issue with Apple's AR viewer where glass (PBR opacity) causes a black patch to appear behind the object (maybe the shadow). https://sketchfab.com/3d-models/welcome-5ba96662ba8d4774951f33fead4bf9db https://sketchfab.com/3d-models/candel-91b2059634e0478eb93777b0b2a726e9 We tried to find a workaround, but after multiple tests we came to the conclusion that Apple's AR viewer is not able to recognize the glass material and adjust the ground shadow as required.
0
0
596
Nov ’23
iOS 17 SceneKit normal map & morph target causes lighting/shading issue
After the iOS 17 update, objects rendered in SceneKit that have both a normal map and morph targets do not render correctly. The shading and lighting appear dark and without reflections. Using a normal map without morph targets, or having morph targets on an object without a normal map, works fine. However, the combination of the two breaks the rendering. Screenshots (not reproduced here) compare the two cases: diffuse, normal map, and a morpher; diffuse and normal, no morpher.
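A minimal repro sketch of the combination described above; the texture names and geometries are placeholders, and the final weight assignment is what reportedly triggers the dark shading on iOS 17.

```swift
import SceneKit
import UIKit

// Minimal repro sketch: a normal map plus a morpher on the same node.
let node = SCNNode(geometry: SCNSphere(radius: 1))
node.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "diffuse")
node.geometry?.firstMaterial?.normal.contents = UIImage(named: "normal")

let morpher = SCNMorpher()
morpher.targets = [SCNSphere(radius: 1.2)]   // any morph target geometry
node.morpher = morpher
morpher.setWeight(0.5, forTargetAt: 0)       // combination reportedly breaks shading
```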
5
1
1.5k
Oct ’23
Transitioning from SceneKit to RealityKit - shadows and custom shaders
We have a content creation application that uses SceneKit for rendering. In our application, we have a 3D view (non-AR) and an AR "mode" the user can go into. Currently we use an SCNView and an ARSCNView to achieve this. Our application targets iOS and macOS (with AR only on iOS). With visionOS on the horizon, we're trying to bring the tech stack up to date, as SceneKit no longer seems to be actively developed and isn't available at all on visionOS. We'd like to use RealityKit for 3D rendering on all platforms - macOS, iOS, and visionOS - in non-AR and AR mode where appropriate. So far this hasn't been too difficult. The greatest challenge has been adding gesture support to replace the allowsCameraControl option on the SCNView, as no such option exists on ARView. However, now that we get to shading, we're hitting a roadblock. When viewing the scene in non-AR mode, we would like to add a ground plane underneath the object that displays only a shadow - in other words, its opacity would be determined by light contribution. I've had a dig through the CustomMaterial API and it seems extremely primitive - there doesn't seem to be any way to get light information for a particular fragment, unless I'm missing something? Additionally, we support custom shaders that we can apply as materials. These custom shaders allow the properties of the material to vary depending on light contribution, light incidence angle, etc. Looking at CustomMaterial, the API seems to be about defining a material's inputs, whereas I guess we want to customise the BRDF calculation itself. We achieve this in SceneKit using a series of shader modifiers hooked into the various SCNShaderModifierEntryPoint stages. On visionOS, the lack of support for CustomMaterial is of course a shame, but I would hope something similar can be achieved with Reality Composer Pro? We can live without custom materials, but the shadow catcher is a killer for adoption for us. I'd even accept a more limited feature set on visionOS, as long as we can match our existing feature set on the existing platforms. What am I missing?
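On the shadow-catcher point, one avenue worth checking, as a heavily hedged sketch: a plane with OcclusionMaterial receives grounding shadows in AR sessions on iOS. Whether this helps with a non-AR virtual camera is exactly the open question, so this is a starting point for experimentation rather than an answer.

```swift
import RealityKit

// Hedged sketch: an invisible "shadow catcher" plane. In AR sessions,
// OcclusionMaterial surfaces punch out the background and can receive
// grounding shadows; non-AR behaviour would need to be verified.
let ground = ModelEntity(
    mesh: .generatePlane(width: 2, depth: 2),
    materials: [OcclusionMaterial()])
```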
1
1
694
Oct ’23
MapKit Custom 3D Models
Dear Apple team, and everyone who has experience with MapKit: I am building an app where I need to hide some of the map's 3D models and replace them with my own custom 3D meshes using SceneKit. Until now I was using Mapbox, which allows you to get raw mesh data to reconstruct all of the map's 3D content. Is something like this possible with MapKit? Use cases: Say you navigated to Kennedy Space Center Launch Complex 39 and there is no 3D model of the actual building. I would like to be able to hide the simple massing and replace it with my own model. In the 3D satellite view, some areas have detailed meshes - say London's The Queen's Walk. I would like to make a specific area flat so I can place my 3D model on top of the satellite 3D view to illustrate a new structure or building. Last one: is it possible to change the colours of existing buildings? I know transparency is possible. Thank you @apple
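For what it's worth, the closest public switch I'm aware of is all-or-nothing (a sketch; as far as I know, MapKit exposes no per-building mesh access comparable to Mapbox):

```swift
import MapKit

// All-or-nothing: hides MapKit's generic extruded 3D buildings on the
// standard map; there is no public per-building replacement API.
let mapView = MKMapView()
mapView.showsBuildings = false
```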
2
1
1.9k
Oct ’23
Spatial framework, Rotation3D eye target up (look at), Quaternion flip
Hi, I want to begin by saying thank you, Apple, for making the Spatial framework! Please add a million more features ;-) I'm using the following code to make an object "look at" another point, but at a particular rotation the object "flips" its rotation. See a video here: https://www.dropbox.com/s/5irxt0gxou4c2j6/QuaternionFlip.mov?dl=0 (I shake the mouse cursor when it happens to make it obvious.)

```swift
import Spatial

let lookAtRotation = Rotation3D(
    eye: Point3D(position),
    target: Point3D(x: 0, y: 0, z: 0),
    up: Vector3D(x: 0, y: 1, z: 0))
myObj.quaternion = lookAtRotation.quaternion
```

So my question is: why is this happening, and how can I fix it? thx
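One possible cause and fix, as a sketch under the assumption that the flip comes from quaternion double cover: q and -q encode the same rotation, and crossing the sign boundary between frames reads as a visual flip. The fix is to keep successive quaternions on the same hemisphere. (If instead the flip happens when the eye direction crosses the up vector, the look-at basis is degenerating there, and the fix would be choosing a different up vector near that pole.)

```swift
import Spatial
import simd

// Hedged sketch: negate any look-at quaternion that points "away"
// from the previous frame's orientation, keeping successive
// quaternions on the same hemisphere.
var previousOrientation = simd_quatd(ix: 0, iy: 0, iz: 0, r: 1)

func stableLookAt(from position: SIMD3<Double>) -> simd_quatd {
    let rotation = Rotation3D(
        eye: Point3D(position),
        target: Point3D(x: 0, y: 0, z: 0),
        up: Vector3D(x: 0, y: 1, z: 0))
    var q = rotation.quaternion
    if simd_dot(q.vector, previousOrientation.vector) < 0 {
        q = simd_quatd(vector: -q.vector)   // same rotation, same hemisphere
    }
    previousOrientation = q
    return q
}
```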
1
0
1.3k
Oct ’23