Reality Composer Pro

Leverage the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps.

Posts under Reality Composer Pro tag

200 Posts
Post · Replies · Boosts · Views · Activity

Entity not reacting to light and not casting shadow in visionOS Simulator
Hi guys, I've been trying to get my model to react to light in the visionOS Simulator by editing the component in Reality Composer Pro and also by modifying it in code, but the shadow only appears if I load the model as a USDZ file, and it's not as reflective as it looks in Reality Converter or Reality Composer Pro. Does anyone else have this problem?

```swift
RealityView { content in
    if let bigDonut = try? await ModelEntity(named: "bigdonut", in: realityKitContentBundle) {
        print("LOADED")
        // Create anchor for horizontal placement on a table
        let anchor = AnchorEntity(.plane(.horizontal, classification: .table, minimumBounds: [0, 0]))
        // Configure scale and position
        bigDonut.setScale([1, 1, 1], relativeTo: anchor)
        bigDonut.setPosition([0, 0.2, 0], relativeTo: anchor)
        // Add the anchor
        content.add(anchor)
        // Enable shadow casting, but this does not work
        bigDonut.components.set(GroundingShadowComponent(castsShadow: true))
    }
}
```
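One detail worth checking in snippets like the one above: the loaded entity is never parented to the anchor, so it is never part of the rendered scene graph at all, and a shadow component on it would have nothing to cast from. A minimal sketch of that fix, assuming the same "bigdonut" asset name:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct DonutView: View {
    var body: some View {
        RealityView { content in
            if let bigDonut = try? await ModelEntity(named: "bigdonut", in: realityKitContentBundle) {
                let anchor = AnchorEntity(.plane(.horizontal, classification: .table, minimumBounds: [0, 0]))
                // Parent the model to the anchor so it is actually rendered
                anchor.addChild(bigDonut)
                bigDonut.position = [0, 0.2, 0]
                // Grounding shadow goes on the model entity itself
                bigDonut.components.set(GroundingShadowComponent(castsShadow: true))
                content.add(anchor)
            }
        }
    }
}
```
This is a sketch only; whether the Simulator's lighting matches Reality Composer Pro's preview is a separate question.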
1
0
447
Jan ’24
Can't play Audio in RealityComposerPro
I want to display a USDA model from Reality Composer Pro and play its Spatial Audio. I used RealityView to implement this:

```swift
RealityView { content in
    do {
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        content.add(entity)
        guard let entity = entity.findEntity(named: "SpatialAudio"),
              let resource = try? await AudioFileResource(
                  named: "/Root/isWateringBasinAudio_m4a",
                  from: "isWateringBasin.usda",
                  in: RealityKitContent.realityKitContentBundle
              )
        else { return }
        let audioPlaybackController = entity.prepareAudio(resource)
        audioPlaybackController.play()
    } catch {
        print("Entity encountered an error while loading the model.")
        return
    }
}
```

When I ran it, the model displayed normally, but the Spatial Audio failed to play. I hope to get guidance, thank you!
1
0
389
Jan ’24
How to add .hoverEffect to a single entity in a RealityView with multiple entities.
How can I add .hoverEffect to a single entity in a RealityView with multiple entities? I want selectable objects to highlight themselves when looked at or hovered over. How can I easily make this happen? There is a working example of this in the Swift Splash demo, but I don't know what part of the code creates that feature. So far I've been able to get .hoverEffect to work on Model3D(), but I want multiple entities with only a few that are selectable.
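For reference, a common pattern on visionOS (a sketch, not taken from the Swift Splash source) is to attach a HoverEffectComponent to just the entities that should highlight; each such entity also needs an InputTargetComponent and collision shapes so the system can hit-test it:

```swift
import RealityKit

// Mark a single entity as hover-highlightable; sibling entities
// without these components are unaffected.
func makeHoverable(_ entity: ModelEntity) {
    entity.components.set(HoverEffectComponent())   // system highlight on look/hover
    entity.components.set(InputTargetComponent())   // makes it a spatial-input target
    entity.generateCollisionShapes(recursive: true) // shapes used for hit-testing
}
```
Call it once per selectable entity after loading, leaving the non-selectable entities untouched.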
1
0
385
Jan ’24
Play Spatial Audio
I want to play the Spatial Audio of a RealityKitContent USDA model. I use this code:

```swift
RealityView { content in
    do {
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        let audio = entity.spatialAudio
        entity.playAudio(audio)
        content.add(entity)
    } catch {
        print("Entity encountered an error while loading the model.")
        return
    }
}
```

The entity.playAudio(audio) call needs an AudioResource as its argument. What should the AudioResource be?
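For what it's worth, entity.spatialAudio returns the entity's SpatialAudioComponent, not something playable; Entity.playAudio(_:) takes an AudioResource, and an AudioFileResource loaded from the .usda is one concrete choice. A sketch, where the prim path and file name are assumptions based on the scene in this thread:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct BasinAudioView: View {
    var body: some View {
        RealityView { content in
            guard let entity = try? await Entity(
                named: "isWateringBasin",
                in: RealityKitContent.realityKitContentBundle
            ) else { return }
            content.add(entity)
            // AudioFileResource conforms to AudioResource; the prim path
            // "/Root/isWateringBasinAudio_m4a" is an assumption here
            if let resource = try? await AudioFileResource(
                named: "/Root/isWateringBasinAudio_m4a",
                from: "isWateringBasin.usda",
                in: RealityKitContent.realityKitContentBundle
            ) {
                entity.playAudio(resource) // prepareAudio + play in one call
            }
        }
    }
}
```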
0
0
324
Dec ’23
Model3D with deep stretching
I used Model3D to display a model:

```swift
Model3D(named: "Model", bundle: realityKitContentBundle) { phase in
    switch phase {
    case .empty:
        ProgressView()
    case .failure(let error):
        Text("Error \(error.localizedDescription)")
    case .success(let model):
        model.resizable()
    }
}
```

However, when I ran it, the width and length were not stretched, but viewed from the side the depth was seriously stretched. What should I do? Note: for some reason, I can't use the frame modifier. (Attached image: width and length look correct; the depth is stretched.)
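A resizable Model3D stretches to fill the proposed 3D frame, so one thing worth trying (a sketch, not a confirmed fix for this case) is chaining an aspect-preserving modifier on the resolved model:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct FittedModelView: View {
    var body: some View {
        Model3D(named: "Model", bundle: realityKitContentBundle) { phase in
            switch phase {
            case .success(let model):
                model
                    .resizable()
                    .scaledToFit() // preserve the model's proportions instead of filling the frame
            default:
                ProgressView()
            }
        }
    }
}
```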
0
0
318
Dec ’23
Can’t find Reality Composer Pro
I just updated to Xcode 15.2 and I want to try Reality Composer Pro. I saw in the Apple developer video that it should be under Xcode -> Developer Tool -> Reality Composer Pro, but when I open that menu I don't have Composer. On the Apple webpage for Reality Composer it says "Reality Composer for macOS is bundled with Xcode, which is available on the Mac App Store." Where can I find Composer Pro? Thanks
0
0
572
Dec ’23
Model3D and Reality Composer Pro
I created a RealityKitContent package in the Packages folder of my visionOS app project. At first I added a USDA model directly to its .rkassets, and Model3D(named: "ModelName", bundle: realityKitContentBundle) displayed the model normally. But when I create a folder inside .rkassets and put the USDA model in that folder, the same Model3D(named: "ModelName", bundle: realityKitContentBundle) call can no longer display the model. What should I do? If you know how to solve that, please also check whether you know how to solve the following problem, and if so, please tell me the answer too. Thank you! The USDA model mentioned above contains an animation, but when I used Model3D(named: "ModelName", bundle: realityKitContentBundle), I found that the animation was not played by default; it seems to need additional code. Is there any documentation, video, or sample code on this?
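Two hedged sketches for the questions above. Assets moved into a subfolder of .rkassets are typically addressed by their path, and Model3D exposes no animation API, so baked-in animations are usually started by loading the asset as an Entity inside a RealityView. The folder name "Models" here is hypothetical:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct AnimatedModelView: View {
    var body: some View {
        RealityView { content in
            // "Models/ModelName" assumes the asset was moved into a
            // "Models" folder inside .rkassets (folder name is hypothetical)
            if let entity = try? await Entity(named: "Models/ModelName", in: realityKitContentBundle) {
                content.add(entity)
                // Animations baked into the USDA are not played automatically
                for animation in entity.availableAnimations {
                    entity.playAnimation(animation.repeat())
                }
            }
        }
    }
}
```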
0
0
428
Dec ’23
Multiple root level objects Error [USDZ/Reality Composer Pro]
Hey everyone, I'm running into an issue where my USDZ model doesn't show up in Reality Composer Pro. It was exported from Blender as a USD and converted in Reality Converter. See the attached image. It's strange, because the USDZ model appears fine in previews, but once it is brought into RCP, I receive this pop-up and the model does not appear. I'm not sure how to resolve this multiple-root-level issue. If anyone can point me in the right direction or offer any feedback, it's much appreciated! Thank you!
1
0
844
Dec ’23
Convert ARKit App (UIKit) to VisionOS RealityKit app
Is there a way of integrating RealityKitContent into an app created with Xcode 12 using UIKit? The non-AR parts work fine on visionOS; the AR parts need to be rewritten in SwiftUI. To do so, I need to access the RealityKit content and work with it seamlessly in Reality Composer Pro, but I'm unsure how to integrate RealityKitContent into such a pre-SwiftUI/visionOS project. I am using Xcode 15. Thank you.
0
0
359
Dec ’23
Seeking Clarification on UsdPrimvarReader Node Functionality
Hello fellow developers, I am currently exploring the UsdPrimvarReader node in the Shader Graph Editor and would appreciate some clarification on its operating principles. Specifically, I would appreciate insights into how the node should ideally operate, the data that should be specified in its Varname field, and which primvars can be extracted from a USD file. I am also curious about the correct way to declare a primvar in a USD file so that it can be read successfully. If anyone could share their expertise or point me to relevant documentation, I would be immensely grateful. Thank you in advance for your time and consideration; I look forward to any insights or recommendations you may have.
1
0
590
Dec ’23
Materials are not importing into Reality Composer Pro as expected from USDA files
When creating a USDA file in a DCC, I want RCP to import it as expected, with materials assigned. However, I'm finding that the material is not imported correctly, despite it rendering correctly in the preview pane and the textures being pulled in. The workaround is to recreate the material in the shader tree, but then any material changes I make upstream on the original USDA get overridden. Please advise on what I need to do here to correctly import materials into RCP. Using USDZ files is not ideal, as I want to make sure changes can easily be made upstream. Sorry about the link, but I can't seem to upload the image to the post. https://pasteboard.co/bmhl3t004APu.png Any guidance here is much appreciated!
1
1
469
Dec ’23
RealityView fit in volumetric window
Hey guys, how can I fit RealityView content inside a volumetric window? I have this simple example:

```swift
WindowGroup(id: "preview") {
    RealityView { content in
        if let entity = try? await Entity(named: "name") {
            content.add(entity)
            entity.setPosition(.zero, relativeTo: entity.parent)
        }
    }
}
.defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
.windowStyle(.volumetric)
```

I understand that we can resize a Model3D view automatically using .resizable() and .scaledToFit() after the model loads. Can we achieve the same result using a RealityView? Cheers
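RealityView has no .scaledToFit(), but a similar effect can be approximated by scaling the entity to the volume based on its visual bounds. A sketch, assuming the 0.6 m cubic volume above:

```swift
import RealityKit

// Pure helper: uniform factor so the largest extent fits the target size
func fitFactor(extents: SIMD3<Float>, target: Float) -> Float {
    let maxExtent = max(extents.x, extents.y, extents.z)
    return maxExtent > 0 ? target / maxExtent : 1
}

// Apply after the entity has been loaded and added to the volume
func scaleToFit(_ entity: Entity, targetSize: Float) {
    let extents = entity.visualBounds(relativeTo: nil).extents
    entity.scale *= SIMD3<Float>(repeating: fitFactor(extents: extents, target: targetSize))
}
```
Calling scaleToFit(entity, targetSize: 0.6) right after content.add(entity) would be one way to use it; note that visualBounds reflects the entity's current transform, so apply it only once.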
1
1
646
Dec ’23
Detecting Tap on an entity that has a videoMaterial
Hello everyone, I'm currently facing a challenge detecting taps on an entity that uses a video material. Based on what I found online, the recommended approach to enable touch is to clone the entity, add an InputTargetComponent, and enable collision shapes. Here's a snippet of my code:

```swift
RealityView { content, attachments in
    // The following code doesn't trigger the tap gesture
    let videoEntity = ImmersivePlayerEntity(configuration: configuration)
    content.add(videoEntity)

    if let attachment = attachments.entity(for: "player-controls") {
        anchorEntity.addChild(attachment)
        content.add(anchorEntity)
    }

    /* This code triggers the tap gesture
    let boxResource = MeshResource.generateBox(size: 2)
    let itemMaterial = SimpleMaterial(color: .red, roughness: 0, isMetallic: false)
    let entity = ModelEntity(mesh: boxResource, materials: [itemMaterial]).addTappable()
    content.add(entity)
    */
} update: { _, _ in
} attachments: {
    Attachment(id: "player-controls") {
        ImmersivePlayerControlsView(coordinator: coordinator)
            .frame(width: 1280)
            .opacity(areControlsVisible ? 1 : 0)
            .animation(.easeInOut, value: areControlsVisible)
    }
}
.gesture(
    SpatialTapGesture()
        .targetedToAnyEntity()
        .onEnded { value in
            areControlsVisible.toggle()
        }
)

extension Entity {
    func addTappable() -> Entity {
        let newModelEntity = self.clone(recursive: true)
        newModelEntity.components.set(InputTargetComponent())
        newModelEntity.generateCollisionShapes(recursive: true)
        return newModelEntity
    }
}
```

I'm seeking guidance on how to enable touch functionality on the video entity. Your insights and suggestions would be greatly appreciated. Thank you in advance for your help!
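One possible explanation for snippets like the one above: addTappable() returns a configured clone, but if the original entity is what gets added to the scene, the input components never reach the rendered entity. A sketch that configures the entity in place instead; the 16:9 collision box dimensions are assumptions, since an entity with a video material often has no mesh that generateCollisionShapes can use:

```swift
import RealityKit

// Configure tap input directly on the entity that is added to the scene,
// rather than on a clone that may never be inserted
func makeTappableInPlace(_ entity: Entity) {
    entity.components.set(InputTargetComponent())
    // Video planes may lack a ModelComponent on the root, so
    // generateCollisionShapes can produce nothing; supply a shape manually.
    // 1.6 x 0.9 m is a hypothetical 16:9 plane size.
    let shape = ShapeResource.generateBox(width: 1.6, height: 0.9, depth: 0.01)
    entity.components.set(CollisionComponent(shapes: [shape]))
}
```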
0
0
260
Dec ’23
Vision Pro Dev Kit question
Hi guys, has any individual developer received a Vision Pro dev kit, or is it just aimed at big companies?

Basically, I would like to start with one or two of my apps that I already removed from the store, just to get familiar with the visionOS platform and gain knowledge and skills on a small but real project. After that I would like to use the dev kit on another project. I work on contract for a multinational communication company on a pilot project in a small country, and extending that project to visionOS might be a very interesting introduction of this new platform and could excite users utilizing their services. However, I cannot quite reveal the details to Apple for reasons of confidentiality. After completing that contract (or during it, if I manage) I would like to start working on a great idea I have for Vision Pro (as many of you do).

Is it worth applying for a dev kit as an individual dev? I have read some posts where people were rejected. Is it better to start in the simulator and just wait for the actual hardware to show up in the App Store? I would prefer to just get the device, rather than start working with a device that I may need to return in the middle of an unfinished project. Any info on when pre-orders might be possible?

Any idea what Mac specs are needed for developing for visionOS, especially for 3D scenes? I just got a MacBook Pro M3 Max with 96GB RAM, and I'm wondering whether I should have maxed out the config. Anybody using that config with the Vision Pro dev kit? Thanks.
0
0
944
Dec ’23