Post not yet marked as solved
I have Sonoma and Xcode 15 with the macOS and iOS SDKs installed. I launched Xcode, but I can't find Reality Composer Pro in the Open Developer Tool menu or anywhere else.
Do I understand correctly that to use Unity (Universal Render Pipeline) for Vision Pro's fully immersive apps, we can use Unity's Shader Graph to make custom shaders as usual, but for immersive mixed-reality apps we can no longer use Shader Graph and instead have to create shaders for Unity in Reality Composer?
How does bringing a Reality Composer shader into Unity work? Does it simply work in Unity, or does it require special adaptation?
Are there cases where we should avoid Reality Composer and use Unity's Shader Graph instead for immersive Vision apps? For instance, we might lose real-time lighting adaptation for virtual objects, but on the other hand we would be able to use Shader Graph.
I saw in the WWDC23 session "Meet Object Capture for iOS" that the new tool released today alongside Xcode 15 beta 2, called "Reality Composer Pro", would be capable of creating 3D models with Apple's PhotogrammetrySession. However, I do not see any such feature in the tool. Has anyone managed to find the model-creation feature shown in the session?
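In the meantime, the same PhotogrammetrySession API the session refers to is callable directly from Swift on a capable Mac. A minimal sketch follows; the input and output paths are hypothetical, and whether the session can be created depends on hardware support:

```swift
import Foundation
import RealityKit

// Hypothetical paths: a folder of captured images and a destination .usdz file.
let input = URL(fileURLWithPath: "/Users/me/Captures/Mug", isDirectory: true)
let output = URL(fileURLWithPath: "/Users/me/Models/mug.usdz")

do {
    // Create a session over the image folder with the default configuration.
    let session = try PhotogrammetrySession(input: input,
                                            configuration: PhotogrammetrySession.Configuration())

    // Observe progress and completion messages from the session's output stream.
    Task {
        for try await message in session.outputs {
            switch message {
            case .requestProgress(_, let fraction):
                print("Progress: \(fraction)")
            case .requestComplete(_, .modelFile(let url)):
                print("Model written to \(url)")
            case .processingComplete:
                exit(0)
            default:
                break
            }
        }
    }

    // Request a medium-detail USDZ model and start processing.
    try session.process(requests: [.modelFile(url: output, detail: .medium)])
    RunLoop.main.run()
} catch {
    print("Photogrammetry failed: \(error)")
}
```

This only runs on macOS with RealityKit's Object Capture support, so it is a workaround for scripted reconstruction rather than a substitute for the in-app feature shown in the session.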
Will the Diorama scene, assets, and final Xcode project be made available for developers to use and examine?
The version of Reality Composer Pro is 1.0, from Xcode 15 beta 2. Every time I click the 'Particle Emitter' button, it crashes. I also can't open Diorama, the demo project from the documentation.
I'm currently testing photogrammetry by capturing photos with the sample project
https://developer.apple.com/documentation/realitykit/taking_pictures_for_3d_object_capture
and then processing them on my laptop with https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app
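For reference, that command-line sample builds to a single binary. Assuming the binary name from that sample (HelloPhotogrammetry) and hypothetical paths, an invocation looks roughly like:

```shell
# Run the binary built from the command-line sample project.
# First argument: folder of HEIC/PNG captures; second: the model file to write.
# -d selects the detail level (preview | reduced | medium | full | raw).
./HelloPhotogrammetry ~/Captures/Mug ~/Models/mug.usdz -d medium
```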
It worked perfectly until the latest Sonoma beta updates.
It started with warning logs in the console saying my samples lacked a depthMap, and now it simply refuses to create samples from my HEIC files.
I tried creating HEIC files with and without depth data to check whether badly formatted depth data was the cause, but it seems the HEIC format itself is no longer accepted. I've also imported HEIC files captured with the standard iOS Camera app and transferred via the Photos app, and they don't work either, so it's not an issue of poorly formatted files.
If I convert the files to PNG, it works again, but of course, as announced during WWDC 2023, I expect the photogrammetry pipeline to leverage the LiDAR data!
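As a stopgap, the PNG conversion can be scripted with macOS's built-in sips tool (the folder names here are assumptions; note that the conversion discards the embedded depth data):

```shell
# Convert every HEIC capture in the current folder to PNG.
mkdir -p png
for f in *.heic; do
  sips -s format png "$f" --out "png/${f%.heic}.png"
done
```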
I check every beta update hoping for an improvement. I can see the photogrammetry logs are never the same, so I guess the Apple teams are working on it.
Of course, Object Capture in Reality Composer Pro also no longer accepts HEIC files.
If there are any workarounds, please advise!