Is there support for using multiple UV channels in AR QuickLook on iOS 17?
One important use case would be to put a tiling texture on an overlapping, tiling UV set while mapping Ambient Occlusion to a separate unwrapped, non-overlapping UV set. This is essential for authoring 3D content that combines high-resolution surface detail with high-quality Ambient Occlusion data while keeping file size to a minimum.
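A minimal sketch of what such a material might look like in USDA, assuming QuickLook honored a second UV set (the prim names and texture paths here are hypothetical; `st` holds the tiling UVs and `st1` the unwrapped UVs):

```usda
def Material "TiledWithAO"
{
    def Shader "Surface"
    {
        uniform token info:id = "UsdPreviewSurface"
        color3f inputs:diffuseColor.connect = </TiledWithAO/DetailTex.outputs:rgb>
        float inputs:occlusion.connect = </TiledWithAO/AOTex.outputs:r>
    }
    def Shader "DetailTex"
    {
        uniform token info:id = "UsdUVTexture"
        asset inputs:file = @detail.png@
        float2 inputs:st.connect = </TiledWithAO/StReader.outputs:result>
    }
    def Shader "AOTex"
    {
        uniform token info:id = "UsdUVTexture"
        asset inputs:file = @ao.png@
        float2 inputs:st.connect = </TiledWithAO/St1Reader.outputs:result>
    }
    def Shader "StReader"
    {
        uniform token info:id = "UsdPrimvarReader_float2"
        string inputs:varname = "st"    # tiling, overlapping UV set
        float2 outputs:result
    }
    def Shader "St1Reader"
    {
        uniform token info:id = "UsdPrimvarReader_float2"
        string inputs:varname = "st1"   # unwrapped, non-overlapping UV set for AO
        float2 outputs:result
    }
}
```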
I am trying to follow the WWDC 2022 video "Explore USD tools and rendering": https://developer.apple.com/videos/play/wwdc2022/10141/
I followed the steps here to create an Xcode project that uses OpenUSD to load scenes: https://developer.apple.com/documentation/metal/metal_sample_code_library/creating_a_3d_application_with_hydra_rendering?language=objc
After installing OpenUSD and generating an Xcode project, I opened Xcode, set the scheme to hydraplayer, and clicked the build button. The code compiles but fails to link with a number of undefined-symbol errors like this one:
    Undefined symbol: pxrInternal_v0_23__pxrReserved__::GfMatrix4d::SetDiagonal(double)
I tried to tag this post wwdc2022-10141, but the tag was not found so I tagged a related session.
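The `pxrInternal_v0_23` namespace in the symbol indicates the headers come from USD v23.x, so one common cause is a mismatch between the headers Xcode compiles against and the USD libraries it actually links. A quick way to check which installed library, if any, defines the missing symbol — the `/opt/local/USD` prefix is only an assumption, so adjust it to your install location:

```shell
#!/bin/bash
# Search installed USD dynamic libraries for GfMatrix4d::SetDiagonal.
# USD_PREFIX is an assumption; point it at your actual install location.
USD_PREFIX="${USD_PREFIX:-/opt/local/USD}"

for lib in "$USD_PREFIX"/lib/*.dylib; do
  [ -e "$lib" ] || continue   # skip if the glob matched nothing
  # nm lists exported symbols; c++filt demangles them for grep.
  if nm -gU "$lib" 2>/dev/null | c++filt | grep -q 'GfMatrix4d::SetDiagonal(double)'; then
    echo "defined in: $lib"
  fi
done
```

If no library reports the symbol, the linked USD build is likely a different version than the one whose headers are on the include path.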
I made a scene in Reality Composer Pro and used the "Light Blue Denim Fabric" material.
Saving the scene and exporting to USDZ resulted in that material using this line:
    asset inputs:file = @0/LightBlueDenimFabric_basecolor.png@ (
        colorSpace = "Input - Texture - sRGB - sRGB"
    )
Question 1:
Was this colorSpace string wrongfully carried over from a different tool?
The updated Apple page https://developer.apple.com/documentation/realitykit/validating-usd-files
lists these newly documented colorSpace values: "srgb_texture", "lin_srgb", "srgb_displayp3", and "lin_displayp3".
To be honest, this confused me even more.
My understanding is that the prefix "lin_" means linear (i.e. no gamma) and the established "srgb_" prefix means gamma corrected.
But since "srgb" also appears as a suffix, it could equally describe the color primaries (sRGB vs. Display P3) rather than the encoding.
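If only the four documented values are recognized, the exported line would presumably need to be rewritten along these lines — a sketch assuming "srgb_texture" is the correct choice for a gamma-encoded base color map:

```usda
asset inputs:file = @0/LightBlueDenimFabric_basecolor.png@ (
    colorSpace = "srgb_texture"
)
```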
Hello,
I'm a novice iOS developer using SwiftUI. I want to let users of my app upload 3D models to their account and dynamically load one into a scene when they select it from their library of saved 3D models. Is there a way to do this?
I'm aware of the pass-through method, but that requires the models to be bundled into the app build before launch. Can someone help or point me in the right direction?
Thank you!
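One possible approach, sketched below: download the user's USDZ file to a local URL and load it with RealityKit at runtime. This is only a sketch under assumptions — `modelsEndpoint` is a hypothetical URL standing in for your backend, and error handling is minimal.

```swift
import Foundation
import RealityKit

// Hypothetical backend endpoint; replace with your own storage service.
let modelsEndpoint = URL(string: "https://example.com/models/chair.usdz")!

/// Downloads a USDZ file and loads it as a ModelEntity.
/// RealityKit loads models from local file URLs, so remote content
/// must be downloaded first.
func loadRemoteModel(from remoteURL: URL) async throws -> ModelEntity {
    let (tempURL, _) = try await URLSession.shared.download(from: remoteURL)

    // Keep the .usdz extension so RealityKit recognizes the format.
    let localURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(remoteURL.lastPathComponent)
    try? FileManager.default.removeItem(at: localURL)
    try FileManager.default.moveItem(at: tempURL, to: localURL)

    return try ModelEntity.loadModel(contentsOf: localURL)
}
```

The returned entity can then be added to an anchor in your ARView (or RealityView) scene; caching downloaded files locally would avoid re-fetching models the user has already selected.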