As you can see, it is a transparent spherical shell model with a ball inside. Everything looks normal from the front, but strange mesh triangles appear in the side and back views. I don't know whether this is expected, or what I need to do to remove these artifacts.
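In case the artifacts come from transparency sorting or visible back faces, one thing worth trying is enabling back-face culling and explicit transparent blending on the shell's material. This is only a sketch, assuming a recent RealityKit where the shell is a ModelEntity using PhysicallyBasedMaterial and the material exposes faceCulling; the function name and opacity value are illustrative.

import RealityKit

// Sketch: cull the shell's back faces and blend it as transparent, which often
// removes stray triangles on transparent geometry.
func makeShellTransparent(_ shell: ModelEntity, opacity: Float = 0.3) {
    guard var model = shell.model else { return }
    model.materials = model.materials.map { material in
        guard var pbr = material as? PhysicallyBasedMaterial else { return material }
        pbr.blending = .transparent(opacity: .init(floatLiteral: opacity))
        pbr.faceCulling = .back   // don't draw the inside of the shell
        return pbr
    }
    shell.model = model
}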
Reality Composer
Prototype and produce content for AR experiences using Reality Composer.
Posts under Reality Composer tag
After upgrading to Xcode 16, my app, which uses project files imported from my iPad's Reality Composer app, now has two issues that I have found so far. I am using an ARView as a UIViewRepresentable with SwiftUI. (Prior to upgrading to Xcode 16 everything worked well.)
First, there are now several duplicate rcp_export.usdz resources in the "Copy Bundle Resources" build phase. Even though each file is in a separate folder with a unique UUID, this was causing a compile error about duplicate files. I was able to open the RC project folder and delete the older rcp_project versions, which now allows the app to compile. I mention it because it may or may not be related to the second issue.
Second, Xcode isn't generating the project code for the rcproject, so when I call the RCProject.loadSceneAsync function I get an error that says "Cannot find 'RCProject' in scope".
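For reference, while the generated RCProject code is missing, the exported scene file itself can usually still be loaded straight from the bundle. A minimal sketch, assuming the resource ships in the app bundle as rcp_export.usdz:

import RealityKit

// Sketch: load the exported Reality Composer scene without the generated
// RCProject code. The resource name "rcp_export" is an assumption.
func loadExportedScene() async throws -> Entity {
    try await Entity(named: "rcp_export", in: Bundle.main)
}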
In RealityKit, I know that an HDR image can be pre-computed and, through the settings of the ImageBasedLight component, a specified specular object can reflect the content of that HDR image.
But if the mirror-like object is very large, such as a continuous wall of glass doors, the image reflected by the mirror is obviously deformed when it moves in space, because an IBL image is a picture of the surrounding environment at a single point, while the glass door is an extended surface.
Is there a truly real-time specular reflection calculation setup in RealityKit that can reflect the model on the opposite side of the glass door?
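For reference, this is roughly the static (pre-baked) IBL setup being described; a minimal sketch assuming visionOS-style RealityKit APIs, where "studio" is a placeholder environment image in the app bundle. It does not provide true real-time reflections.

import RealityKit

// Sketch: give an entity a pre-baked image-based light and make it a receiver.
func applyImageBasedLight(to glassDoor: Entity) async throws {
    let environment = try await EnvironmentResource(named: "studio")
    glassDoor.components.set(
        ImageBasedLightComponent(source: .single(environment), intensityExponent: 1)
    )
    // The receiver component tells the entity which IBL to sample for reflections.
    glassDoor.components.set(
        ImageBasedLightReceiverComponent(imageBasedLight: glassDoor)
    )
}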
Using Xcode 15.4, I successfully built and ran my app using my Reality Composer Pro Version 1.0 package, and then successfully submitted that app version for release. Now, using Xcode 16 Beta 6, I've created a new branch in my repository for updating the app for iOS/iPadOS 18 and visionOS 2. However, once I created and switched to the new branch and did a build, I get build errors. They seem to concern the package manifest of the Reality Composer Pro package that is part of my app. When I go to the package file in my project navigator and click the Open in Reality Composer Pro button, my package opens in Reality Composer Pro 2.0, which makes sense since it is the version for Xcode 16. However, I don't know how to address/get rid of the build errors.
I've attached an image of my build errors.
Hello,
I downloaded the most recent Xcode 16.0 beta 6 along with the example project located here
Currently I am experiencing the following build failures:
RealityAssetsCompile
...
error: [xrsimulator] Component Compatibility: BlendShapeWeights not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Component Compatibility: EnvironmentLightingConfiguration not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Component Compatibility: AudioLibrary not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Exception thrown during compile: compileFailedBecause(reason: "compatibility faults")
error: Tool exited with code 1
I saw that there is a similar issue reported. As a test I downloaded that project, and it compiled as expected.
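For context, these compatibility errors point at the platforms declaration in the Reality Composer Pro content package's Package.swift. A sketch of what a raised deployment target might look like; the package and product names below are placeholders from the default template, and the real manifest will differ:

// swift-tools-version:6.0
import PackageDescription

let package = Package(
    name: "RealityKitContent",
    platforms: [
        .visionOS(.v2)   // was .v1 ("xros 1.0"), which lacks BlendShapeWeights etc.
    ],
    products: [
        .library(name: "RealityKitContent", targets: ["RealityKitContent"])
    ],
    targets: [
        .target(name: "RealityKitContent")
    ]
)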
I have designed a 3D object and exported it as a USDZ. I also 3D printed said object. I want to use the object as a 3D trigger for an AR experience I am building. My question is: is there a process that would let me take the 3D .usdz file and convert it to a .arobject or a .objcap medium/low-density point cloud to use as an AR trigger? Because I do have the 3D print of the object, I did use the "scan" option when setting up my scene, but the resolution/fidelity seems really low and the results I get are just mediocre.
I would love to take the 3D USDZ that I already have and use it to generate a file that can be used as a 3D trigger. Is this possible, or is there a process to do this? I am able to take the 3D scan I make in Reality Composer (which is exported as a .objcap file), send it to Reality Converter on my Mac, and make a USDZ from it. I am looking for a way to go the other way: .usdz > .objcap or .arobject.
I am trying to make an experience that mimics projection mapping, but in AR. I have a 3D object that I built and textured in Substance Painter. I also printed this object in a base gray color. I want to use the 3D print of the object as an AR trigger that would start a scene placing/overlaying/projection-mapping the textured 3D model over the gray 3D-printed model. Ideally the mapped 3D model would be spatially attached to the 3D print and move with it when the object is handled.
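For context, the piece that usually glues this together on iOS is an object anchor. A sketch, assuming the scan has already been turned into an .arobject inside an AR resource group named "AR Resources" containing an object named "PrintedPart" (both names are placeholders), and that the session is configured for object detection:

import RealityKit

// Sketch: parent the textured model to an anchor that tracks the scanned
// physical object, so the overlay follows the 3D print as it moves.
func overlayTexturedModel(_ texturedModel: Entity, in arView: ARView) {
    let objectAnchor = AnchorEntity(.object(group: "AR Resources", name: "PrintedPart"))
    objectAnchor.addChild(texturedModel)
    arView.scene.addAnchor(objectAnchor)
}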
I'm following the WWDC session on interactive 3D content in Reality Composer Pro and Apple's documentation:
https://developer.apple.com/wwdc24/10102
https://developer.apple.com/documentation/realitykit/implementing-systems-for-entities-in-a-scene#Retrieve-entities-with-an-entity-query
However, this simple code declaring a dummy Component and System produces a compile error:
/Users/Workspaces/repository/Packages/RealityKitContent/Sources/RealityKitContent/RobotComponent.swift:18:24 Static property 'query' is not concurrency-safe because non-'Sendable' type 'EntityQuery' may have shared mutable state
class MySystem: System {
    // Define a query to return all entities with a MyComponent.
    private static let query = EntityQuery(where: .has(MyComponent.self))

    // Initializer is required. Use an empty implementation if there's no setup needed.
    required init(scene: Scene) { }

    // Iterate through all entities containing a MyComponent.
    func update(context: SceneUpdateContext) {
        for entity in context.entities(
            matching: Self.query,
            updatingSystemWhen: .rendering
        ) {
            // Make per-update changes to each entity here.
        }
    }
}
I'm using Xcode beta 3 and the project targets visionOS 2.
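For reference, here is a sketch of two ways this diagnostic is commonly silenced under the Swift 6 concurrency checks; both are assumptions about what works in this SDK rather than an official fix: mark the immutable static query nonisolated(unsafe), or build the query locally inside update so there is no shared static state.

import RealityKit

struct MyComponent: Component {}

class MySystem: System {
    // Option 1: assert that the immutable query is safe to share across concurrency domains.
    private nonisolated(unsafe) static let query = EntityQuery(where: .has(MyComponent.self))

    required init(scene: Scene) { }

    func update(context: SceneUpdateContext) {
        // Option 2: build the query locally instead of using the static property.
        let query = EntityQuery(where: .has(MyComponent.self))
        for entity in context.entities(matching: query, updatingSystemWhen: .rendering) {
            _ = entity // per-update changes go here
        }
    }
}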
The object capture feature in the Reality Composer app is only available on iOS and iPadOS at the moment. Will this feature be available for visionOS in the near future?
Reality Composer App Store
https://apps.apple.com/us/app/reality-composer/id1462358802
I can't figure out how to get my Earth entity to rotate on its axis. This is a follow-up to a previous Apple Developer Forums post.
How would I have the earth (parent) entity rotate CCW underneath the orbiting starship child?
I tried adding the following code block to the RealityView but it is not working:
if let rotatingEarth = starshipEntity.findEntity(named: "Earth") {
    rotatingEarth.transform.rotation = simd_quatf.init(angle: 360, axis: SIMD3(x: 0, y: 1, z: 0))
    if let animation = try? AnimationResource.generate(with: rotatingEarth as! AnimationDefinition) {
        rotatingEarth.playAnimation(animation)
    }
}
Any advice on getting the earth to rotate?
I tried reviewing the Hello World WWDC23 project code, but I was unable to understand the complexity and how that sample project got the earth to rotate.
I want to do this for visionOS 1.2. I realize there are new animation capabilities (and possibly others) coming in visionOS 2.0, but I want to try to address this issue in the currently released visionOS version.
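For what it's worth, one pattern that should work on visionOS 1.x is to skip the animation API and instead increment the entity's orientation every frame from a scene-update subscription. A minimal sketch, assuming a RealityView whose content and "Earth" entity are already available; store the returned subscription (e.g. in a property) so it stays alive:

import RealityKit

// Sketch: spin an entity counterclockwise around its Y axis a little each frame.
func spin(_ earth: Entity, in content: RealityViewContent) -> EventSubscription {
    let radiansPerSecond: Float = 2 * .pi / 60   // one full turn per minute
    return content.subscribe(to: SceneEvents.Update.self) { event in
        let delta = Float(event.deltaTime) * radiansPerSecond
        earth.orientation *= simd_quatf(angle: delta, axis: [0, 1, 0])
    }
}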
Hey, is there a way to create a good ground shadow shader? I'm using a ground plane with an unlit material and I can't get the ground shadow to work properly. If I use a PBR texture it works better, but I can barely see it, and I want to control the intensity more.
For all the AVP devs out there, what cloud service are you using to load content in your app that has extremely low latency? I tried using CloudKit and it did not work well at all. Latency was super bad :/
Firebase looks like the most promising at this point??
Wish Apple would create an ultra low latency cloud service for streaming high quality content such as USDZ files and scenes made in Reality Composer Pro.
The 3D object capture feature doesn't seem to work on my iPhone 12 Pro. The circle that is supposed to show up when you begin to move around the object doesn't appear, so object capture never even begins; it just says "more light..." or "move closer". This doesn't happen on my iPhone 14 Pro, which works perfectly fine even with the same lighting. How can this be fixed?
Hi
I installed Xcode. In Reality Composer there was an option to anchor content to a plane, but I can't find it in the Pro version. Where are those options?
thanks
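In case it helps while looking for the UI, the same plane anchoring can be applied in code with RealityKit's AnchoringComponent. A minimal sketch for a horizontal-plane anchor; the bounds are arbitrary:

import RealityKit

// Sketch: anchor an entity to a horizontal plane programmatically instead of
// choosing an anchor type in the Reality Composer UI.
func anchorToHorizontalPlane(_ model: Entity) -> AnchorEntity {
    let anchor = AnchorEntity(.plane(.horizontal,
                                     classification: .any,
                                     minimumBounds: [0.2, 0.2]))
    anchor.addChild(model)
    return anchor
}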
Hello, I'm getting this error whenever I try to run any new project; this error keeps coming up.
Thanks
Zipzy Games
I created an app for visionOS, using Reality Composer Pro. Now I want to turn this app into a multi-platform app for iOS as well.
RCP files are not supported on iOS, however. So I tried to use the "old" Reality Composer instead, but that doesn't seem to work either: Xcode 15 no longer includes it, and I read online that files created with Xcode 14's Reality Composer cannot be included in Xcode 15 projects. Also, Xcode 14 does not run on my M3 Mac with Sonoma.
That's a bummer. What is the recommended way to include 3D content in apps that support visionOS AND iOS?!
(I also read that a solution might be to use USDZ for both. But what would that workflow look like? Are there samples out there that support both platforms? Please note that I want to set up the anchors myself, in code. I just need the composing tool to create the 3D content that will be placed on those anchors.)
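For what it's worth, a USDZ-based workflow can be as small as the sketch below: the asset is authored in any composing tool, bundled with the app, and then loaded and anchored entirely in code that compiles for both visionOS and iOS. "Robot" is a placeholder asset name:

import RealityKit

// Sketch: load a bundled USDZ and anchor it programmatically; this shared code
// can feed a RealityView (visionOS) or an ARView's scene (iOS).
func makeAnchoredModel() async throws -> AnchorEntity {
    let model = try await Entity(named: "Robot")   // USDZ in the app bundle
    let anchor = AnchorEntity(.plane(.horizontal,
                                     classification: .any,
                                     minimumBounds: [0.2, 0.2]))
    anchor.addChild(model)
    return anchor
}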
Hello guys, I have a virtual environment containing a mesh. I want the mesh to be mirrored on a glass surface that is very close by.
I can't just duplicate the mesh, because the reflection varies depending on the position you are looking from.
Is there a possibility to mirror a mesh via reflections? It shouldn't reflect real world objects - just a virtual mesh.
Thank you guys
I created a prototype app with Reality Composer on an iPad. Now I would like to import the project to a Mac for further development using Xcode/Swift. How can I do this?
I am able to export a .reality or .usdz file. If I open the .reality file in Xcode I just get the playback app/scene but it does not appear to be a project that I can edit.
I'm trying to take an object capture and scale it. What I did so far: I created a Reality Composer project, inserted the .objcap file into the project, and then scaled it from 100% to 200%. I then exported it as a USDZ. It just won't show up in the Xcode preview now, and I'm not sure why. Is there any way to fix this? I'm going crazy trying to get this to work.
I have an AVPlayer() that loads a video and places it on a screen ModelEntity in the immersive view using a VideoMaterial. This also makes the video untappable, since it is a VideoMaterial.
Here's the code for the same:
let screenModelEntity = model.garageScreenEntity as! ModelEntity
let modelEntityMesh = screenModelEntity.model!.mesh
let url = Bundle.main.url(forResource: "<URL>",
withExtension: "mp4")!
let asset = AVURLAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer()
let material = VideoMaterial(avPlayer: player)
screenModelEntity.components[ModelComponent.self] = .init(mesh: modelEntityMesh, materials: [material])
player.replaceCurrentItem(with: playerItem)
return player
I was able to load and play the video. However, I cannot figure out how to show the player controls (AVPlayerViewController) to the user, similar to the DestinationVideo sample app.
How can I add the video player controls in this case?
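One option (a sketch only, not necessarily the DestinationVideo approach verbatim) is to reuse the same AVPlayer instance in an AVPlayerViewController hosted in a SwiftUI window, so the standard controls drive the video shown on the VideoMaterial:

import SwiftUI
import AVKit

// Sketch: wrap AVPlayerViewController so standard playback controls appear in a
// SwiftUI view, while the same AVPlayer also backs the VideoMaterial entity.
struct PlayerControlsView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player   // reuse the player from the immersive view
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
}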
I'm a newbie to Reality Composer and have constructed a couple of simple projects.
I added text, and the "A a" placeholder is displayed, but as soon as I select it to edit its Properties, Xcode crashes.
Running Ventura 13.3.1, and Reality Composer 1.5, Xcode 14.3.1.
I would appreciate guidance on how to fix this.
Thanks!