Reality Composer Pro


Leverage the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps.

Posts under Reality Composer Pro tag

200 Posts

Apple's Choice: USDZ over Other 3D File Formats like GLTF
Hello Dev Community, I've been thinking about Apple's preference for USDZ for AR and 3D content, especially when GLTF is so widely used. I'm keen to discuss and hear your insights on this choice. USDZ, backed by Apple, has seen a surge in the AR community. It boasts advantages like compactness, animation support, and ARKit compatibility. GLTF is also a popular format with its own merits, such as being an open standard and offering flexibility. Here are some of my questions about the use of USDZ:
- Why did Apple choose USDZ over other 3D file formats like GLTF?
- What benefits does USDZ bring to Apple's AR and 3D content ecosystem?
- Are there any limitations of USDZ compared to other file formats?
- Could factors like compatibility, security, or ease of integration have influenced Apple's decision?
I would love to hear your thoughts on this. Feel free to share any experiences with USDZ or other 3D file formats within Apple's ecosystem!
2 replies · 0 boosts · 2.4k views · Jun ’23
Confusing colorSpace naming in Reality Composer Pro
I made a scene in Reality Composer Pro and used the "Light Blue Denim Fabric" material. Saving the scene and exporting to USDZ resulted in that material using this line:

asset inputs:file = @0/LightBlueDenimFabric_basecolor.png@ ( colorSpace = "Input - Texture - sRGB - sRGB" )

Question 1: Was this colorSpace wrongly exported from a different tool? The updated Apple page https://developer.apple.com/documentation/realitykit/validating-usd-files mentions these new known colorSpace values: "srgb_texture", "lin_srgb", "srgb_displayp3", or "lin_displayp3". To be honest, this confused me even more. My understanding is that the prefix "lin_" means linear (i.e., no gamma) and the established "srgb_" prefix means gamma-corrected, but the word "srgb" could also describe the encoding (sRGB vs. Display P3).
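For reference, a minimal sketch of what the texture reference could look like rewritten with one of the tokens from the validation page (assuming the same asset path as in the post; "srgb_texture" is the documented value for gamma-encoded color textures, "lin_srgb" the linear variant):

```usda
asset inputs:file = @0/LightBlueDenimFabric_basecolor.png@ (
    colorSpace = "srgb_texture"
)
```

The "Input - Texture - sRGB - sRGB" string looks like an OpenColorIO-style color space name from a DCC tool rather than one of the tokens the RealityKit validator lists.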
1 reply · 0 boosts · 760 views · Jun ’23
Manual for Shader Graph in Reality Composer Pro
Hi, I would like to learn how to create custom materials using Shader Graph in Reality Composer Pro. I would like to know more about Shader Graph in general, including node descriptions and how the material's appearance changes when nodes are connected. However, I cannot find a manual for Shader Graph in Reality Composer Pro, which leaves me totally clueless about how to create custom materials. Thanks. Sadao Tokuyama https://1planet.co.jp/ https://twitter.com/tokufxug
7 replies · 1 boost · 2.7k views · Mar ’24
Runtime texture input for ShaderGraphMaterial?
For the MaterialX shader graph, the given example hard-codes two textures for blending at runtime (https://developer.apple.com/documentation/visionos/designing-realitykit-content-with-reality-composer-pro#Build-materials-in-Shader-Graph). Can I instead generate textures at runtime and set them as dynamic inputs for the material, or must all textures be known when the material is created? If procedural texture-setting is possible, how is it done, since the example shows a material with those hard-coded textures? EDIT: It looks like the answer is "yes", since setParameter accepts texture resources: https://developer.apple.com/documentation/realitykit/materialparameters/value/textureresource(_:)?changes=l_7 However, how do you turn an MTLTexture into a TextureResource?
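One route that should work (a sketch, not a confirmed best practice): read the MTLTexture back through Core Image into a CGImage, build a TextureResource from that, and hand it to setParameter. The parameter name "DynamicTexture" is a placeholder for whatever the Shader Graph input is actually called; note that CIImage-wrapped Metal textures may come back vertically flipped.

```swift
import RealityKit
import CoreImage
import Metal

/// Sketch: wrap an MTLTexture in a TextureResource via Core Image.
/// Assumes a color (sRGB) texture; "DynamicTexture" is a hypothetical
/// shader-graph input name, not one from the sample project.
func applyDynamicTexture(_ mtlTexture: MTLTexture,
                         to material: inout ShaderGraphMaterial) throws {
    // MTLTexture -> CIImage (may be vertically flipped relative to UIKit).
    guard let ciImage = CIImage(mtlTexture: mtlTexture, options: nil) else {
        fatalError("Texture is not CIImage-compatible")
    }
    // CIImage -> CGImage via a throwaway context (a CPU/GPU round-trip).
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        fatalError("Could not render CGImage")
    }
    // CGImage -> TextureResource, tagged as color data.
    let resource = try TextureResource.generate(from: cgImage,
                                                options: .init(semantic: .color))
    // Feed it to the material's texture input.
    try material.setParameter(name: "DynamicTexture",
                              value: .textureResource(resource))
}
```

For per-frame updates this round-trip is expensive; TextureResource.DrawableQueue is the usual way to stream Metal-rendered content into a texture continuously.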
1 reply · 1 boost · 878 views · Jul ’23
Augmented Reality App crash, Apple Vision
I have a newly installed Xcode 15 beta 2 on two machines: an Apple Studio with 13.4.1, and an M1 Mini with 14.0 beta. On both machines, when I start a new project as an iOS Augmented Reality App and, without doing anything else, try to run with the Apple Vision Pro simulator, the simulation crashes. A short section of the crash report is below. I've sent these to Apple from both machines. I'm at a loss. I can't imagine that this is the case for everyone, but I'm seeing it on both of my machines. (It also crashes when aimed at an iPadOS 17.0 beta simulator as well.) Does anyone have suggestions for how to get past this starting point? Thanks in advance.

Translated Report (Full Report Below)
Incident Identifier: B19370C3-21F5-4AA8-A977-BFF69FD9732A
CrashReporter Key: 3BA15700-B5BA-5C3B-0F30-39509BCFDE58
Hardware Model: Macmini9,1
Process: Tet5 [66050]
Path: /Users/USER/Library/Developer/Xcode/UserData/Previews/Simulator Devices/3CC9EFB1-EFDA-4622-A04F-067FE72BDF40/data/Containers/Bundle/Application/54BB34F8-0D07-4ED5-9F2E-6FA61E0990B6/Tet5.app/Tet5
Identifier: com.apple-bbbbbb.Tet5
Version: 1.0 (1)
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd_sim [65662]
Coalition: com.apple.CoreSimulator.SimDevice.3CC9EFB1-EFDA-4622-A04F-067FE72BDF40 [84827]
Responsible Process: SimulatorTrampoline [14070]
Date/Time: 2023-06-30 11:39:35.9221 -0400
Launch Time: 2023-06-30 11:39:33.7148 -0400
OS Version: macOS 14.0 (23A5276g)
Release Type: User
Report Version: 104
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: DYLD 4 Symbol missing
Symbol not found: _$s21DeveloperToolsSupport15PreviewRegistryPAAE04makeD0AA0D0VyKFZ
Referenced from: <9E12C915-D6B1-3DD3-83B8-6921502C7F73> /Users/USER/Library/Developer/Xcode/UserData/Previews/Simulator Devices/3CC9EFB1-EFDA-4622-A04F-067FE72BDF40/data/Containers/Bundle/Application/54BB34F8-0D07-4ED5-9F2E-6FA61E0990B6/Tet5.app/Tet5
Expected in: <31FB64EE-D651-3287-9607-1ED38855E80F> /Library/Developer/CoreSimulator/Volumes/xrOS_21N5165g/Library/Developer/CoreSimulator/Profiles/Runtimes/xrOS 1.0.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/DeveloperToolsSupport.framework/DeveloperToolsSupport
(terminated at launch; ignore backtrace)
1 reply · 0 boosts · 781 views · Jun ’23
Original Reality Composer (non-Pro) in visionOS
When I create a USDZ file from the original Reality Composer (non-Pro) and view it in the visionOS simulator, the transforms and rotations don't match. For example, a simple Tap and Flip behaviour does not rotate the same way in visionOS. Should we regard Reality Composer as discontinued software and only work with Reality Composer Pro? Hopefully Apple will combine the features from the original RC into the new RC Pro!
1 reply · 1 boost · 708 views · Aug ’23
Play Audio on window change
I've been playing around with visionOS and am currently trying to play audio, but somehow following the steps in the WWDC session isn't working for me. What I'm trying to do is play audio when you navigate to a certain view. In the WWDC session, they set up audio programmatically like this:

let audioSource = Entity()
audioSource.spatialAudio = SpatialAudioComponent(directivity: .beam(focus: 0.75))
audioSource.orientation = .init(angle: .pi, axis: [0, 1, 0])
if let audio = try? await AudioFileResource(named: "audioFile",
                                            configuration: .init(shouldLoop: true)) {
}

But I'm getting this error and cannot proceed:

'AudioFileResource' cannot be constructed because it has no accessible initializers

I couldn't quite follow how to do it from Reality Composer Pro either, so I'm stuck. Any help is really appreciated.
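A sketch of one way the loading step might be completed, assuming a visionOS SDK where the async AudioFileResource initializer is available ("no accessible initializers" often means the target or SDK predates it). The piece missing from the quoted code is actually playing the loaded resource, e.g. via Entity.playAudio(_:); the "audioFile" name assumes a matching resource in the app bundle:

```swift
import RealityKit

// Sketch: load a looping audio file and attach it to a spatial source.
// Assumes "audioFile" exists in the bundle and a visionOS 1+ SDK.
func makeAudioSource() async -> Entity {
    let audioSource = Entity()
    audioSource.spatialAudio = SpatialAudioComponent(directivity: .beam(focus: 0.75))
    audioSource.orientation = .init(angle: .pi, axis: [0, 1, 0])
    do {
        let resource = try await AudioFileResource(named: "audioFile",
                                                   configuration: .init(shouldLoop: true))
        // playAudio(_:) returns a controller you can keep to stop or pause later.
        _ = audioSource.playAudio(resource)
    } catch {
        print("Audio load failed:", error)
    }
    return audioSource
}
```

Using `try?` as in the quoted code silently swallows the load error; `do/catch` at least surfaces why the resource failed to load.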
0 replies · 1 boost · 478 views · Jul ’23
Sample Code / "Work with Reality Composer Pro content in Xcode"
@Apple Really excited by the updates on the Reality Composer and Object Capture side. Is it possible to publish the sample apps and project material referenced in the Reality Composer Pro sessions as well as the "Meet Object Capture for iOS" session? It looks like they haven't yet been posted to the Sample Code site. Thank you! https://developer.apple.com/videos/play/wwdc2023/10273/ https://developer.apple.com/videos/play/wwdc2023/10191/
1 reply · 0 boosts · 642 views · Jul ’23
Vision Pro simulator
Hi folks, I have two questions, if you can help me. First: as far as I can tell, none of the environment trackers work in the simulator. I know RealityKit needs data from LiDAR or the camera to detect the environment and depth. So: is the environment in the Xcode simulator just an HDR image? Does that mean we cannot make any app for Vision Pro until we get a device? In all the Apple videos, they are using a real device; none of them use the Xcode simulator when tracking the world. Second question: are we able to add lights, or remove the default light from a scene? Whatever I did had no effect on my scene. I'm also a 3D model maker with Blender, so I can understand the graphs in Composer Pro, but many of them don't have any visible effect. Thank you so much!
0 replies · 0 boosts · 644 views · Jul ’23
RealityKit cannot load Entity
Very often when I try to load an Entity in RealityView, the system gives me an error: The operation couldn’t be completed. (Swift.CancellationError error 1.)

RealityView { content in
    do {
        let scene = try await Entity(named: "Test_Scene", in: realityKitContentBundle)
        content.add(scene)
    } catch {
        debugPrint("error loading scene", error.localizedDescription.debugDescription)
    }
}

The scene I created contains a basic square. This is the main file:

struct first_app_by_appleApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }.windowStyle(.volumetric)
    }
}

and inside the Info.plist file I set UIApplicationPreferredDefaultSceneSessionRole to UIWindowSceneSessionRoleVolumetricApplication. I think this could be a bug in the system, but I'm not surprised since it's the first beta. If so, do you know any workarounds?
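One possible workaround, sketched under the assumption that the CancellationError is transient (e.g. the view's setup task being torn down and recreated during a beta's layout churn) rather than a cancellation of the whole enclosing task: retry the load a few times. The attempt count and delay are arbitrary placeholders; if the enclosing task itself is being cancelled, retrying inside it won't help, and moving the load into an unstructured Task that adds the entity once it finishes would be the next thing to try.

```swift
import RealityKit
import RealityKitContent

// Sketch: retry Entity loading when it fails with CancellationError.
// Attempt count and sleep duration are arbitrary placeholders.
func loadSceneWithRetry(named name: String, attempts: Int = 3) async throws -> Entity {
    var lastError: Error = CancellationError()
    for _ in 0..<attempts {
        do {
            return try await Entity(named: name, in: realityKitContentBundle)
        } catch is CancellationError {
            lastError = CancellationError()
            // Brief pause before retrying.
            try? await Task.sleep(nanoseconds: 100_000_000)
        }
    }
    throw lastError
}
```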
2 replies · 0 boosts · 829 views · Jul ’23
Exported .usdz scenes are not compatible with common tools
If you have a scene with a simple custom .usda material applied to a primitive like a cube, the exported (.usdz) material definition is unrecognized by tools like Reality Converter Version 1.0 (53) or Blender Version 3.6.1. Reality Converter shows warnings such as "Missing references in USD file" and "Invalid USD shader node in USD file". Even Reality Composer Pro is unable to recreate the material correctly from its own exported .usdz files. Feedback: FB12699421
3 replies · 0 boosts · 1.1k views · Aug ’23
The text "%%USE_I18N_STRING_ERROR%%" is displayed on the application page for the Apple Vision Pro Developer Kit.
Hello, I am writing to you today to inquire about my application for the Apple Vision Pro Developer Kit. I submitted my application three times, but each time the application page displayed the text "%%USE_I18N_STRING_ERROR%%". After I submitted my application, the page I was redirected to said "We'll be back soon". I am not sure if my application was successful. I have not received any email confirmation. Would you be able to check the status of my application and let me know if there is anything else I need to do? Thank you for your time and consideration. Sincerely, Sadao Tokuyama https://twitter.com/tokufxug https://www.linkedin.com/in/sadao-tokuyama/
4 replies · 0 boosts · 860 views · Jul ’23
How Does Freeform on visionOS Get A Larger Size Initial Window?
Hello community, I'm trying to open a window larger than the default initial dimensions, because the native Freeform app does exactly that, opening with wider-than-usual window dimensions. But I found the documentation saying that "In visionOS, all windows open with the same initial dimensions." I tried to set a greater width on the window frame, but there does seem to be a hard-coded maximum, since I can make the window smaller but not bigger programmatically. Am I doing something wrong or missing anything? What are your thoughts on how Freeform's larger window is even implemented? It might not be a window at all; for example, it might be layered on a large reality view. Any input would be appreciated!
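One thing worth trying (a sketch; whether visionOS honors it at your SDK version is an assumption on my part): the defaultSize(_:) scene modifier requests a window's initial size at creation time, which is a different mechanism from resizing an already-open window's frame:

```swift
import SwiftUI

@main
struct WideWindowApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        // Requested initial size in points. This only affects the window
        // when it is first created, not after the user resizes it, and the
        // system may still clamp it to its own limits.
        .defaultSize(width: 1400, height: 700)
    }
}

struct ContentView: View {
    var body: some View { Text("Wide window") }
}
```

If the system clamps even the default size, that would support the theory that Freeform's canvas is not an oversized window but content layered behind a normal one.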
0 replies · 0 boosts · 521 views · Jul ’23