@DTS Engineer
Yes, I have seen x and z change when only setting y. I also tried setting x and z to stored values, and the logged position looked correct; visually, though, the entity would drift on x and z the farther y moved.
My use case was dragging an entity up and down and attaching children to it to create an infinite scroll. If you scroll for a while, the parent entity's position on x and z appears to change even though the correct values are logged. The workaround was to move the children out of the scroll entity and position them manually using only y (sketched below).
I could imagine you would see the same issue if you just created an entity that was 50 or 100 meters long and tried moving it along one axis.
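For anyone hitting the same thing, here is a minimal sketch of the workaround. The names rows, rowHeight, and scrollOffset are illustrative, not from my actual project:

import RealityKit

// Keep each row a direct child of the root instead of a tall "scroll"
// parent, and offset only y, so x and z are never touched.
func updateScroll(rows: [Entity], rowHeight: Float, scrollOffset: Float) {
    for (index, row) in rows.enumerated() {
        var position = row.position
        position.y = Float(index) * rowHeight - scrollOffset
        row.position = position
    }
}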
Maybe I am not explaining the problem correctly.
X and Z are not changing, only Y.
So I could see how X and Z would lose precision as they got larger, but not if they remain at the same value.
Are you saying the implementation of SIMD3 is not a collection or struct of three separate values?
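For what it's worth, Float spacing really does grow with magnitude, which is why I would only expect drift on an axis whose stored value is actually getting large:

// ulp is the gap to the next representable Float; it grows with magnitude.
print(Float(1).ulp)       // ~1.2e-07
print(Float(100).ulp)     // ~7.6e-06
print(Float(10_000).ulp)  // ~9.8e-04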
@Vision Pro Engineer I submitted FB16424173, thank you.
Actually I did sometimes have the issue even when stopping SpatialTrackingSession.
Using a SpatialTrackingSession in the same project as an ARKitSession + WorldTrackingProvider does not work properly. I switched to using only a PlaneDetectionProvider for the floor and ceiling heights instead, along the lines of the sketch below.
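A sketch of that approach, assuming the ARKitSession and WorldTrackingProvider from my project already exist; everything else here is illustrative:

// Run plane detection on the same ARKitSession rather than a
// separate SpatialTrackingSession.
let planeProvider = PlaneDetectionProvider(alignments: [.horizontal])
try await arKitSession.run([worldTrackingProvider, planeProvider])

for await update in planeProvider.anchorUpdates {
    switch update.anchor.classification {
    case .floor, .ceiling:
        // The y component of the transform's translation gives the height.
        print(update.anchor.classification, update.anchor.originFromAnchorTransform.columns.3.y)
    default:
        break
    }
}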
If I stop the SpatialTrackingSession before starting the ARKitSession, I do not have the issue.
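In code, the ordering that avoided it looked like this sketch, assuming both sessions are stored properties (SpatialTrackingSession.stop() is async):

// Stop the SpatialTrackingSession before the ARKitSession starts.
await spatialTrackingSession.stop()

do {
    try await arKitSession.run([worldTrackingProvider])
}
catch {
    print("Error starting ARKit session", error)
}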
Hello @Vision Pro Engineer
If I add a SpatialTrackingSession to your example, I can replicate the problem. The world anchor will be added/updated, then immediately removed without removeAnchor() ever being called.
import SwiftUI
import RealityKit
import ARKit

struct TestAnchor: View {
    @State var spatialTrackingSession = SpatialTrackingSession()
    @State var worldTrackingProvider = WorldTrackingProvider()
    @State var arKitSession = ARKitSession()
    @State var firstWorldAnchorTransform: simd_float4x4?

    var body: some View {
        RealityView { content in
        } update: { content in
            if let firstWorldAnchorTransform {
                let entity = ModelEntity(mesh: .generateSphere(radius: 0.2),
                                         materials: [SimpleMaterial(color: .red, isMetallic: false)])
                entity.setTransformMatrix(firstWorldAnchorTransform, relativeTo: nil)
                content.add(entity)
            }
            else {
                print("Waiting for world anchor.")
            }
        }
        .task {
            do {
                try await arKitSession.run([worldTrackingProvider])
            }
            catch {
                print("Error starting session", error)
            }

            let configuration = SpatialTrackingSession.Configuration(tracking: [.plane])
            if let unavailableCapabilities = await spatialTrackingSession.run(configuration) {
                if unavailableCapabilities.anchor.contains(.plane) {
                    print("plane tracking is unavailable.")
                }
            }

            Task {
                await processWorldAnchorUpdates()
            }

            // For demo purposes only. Don't do this in prod :)
            // Wait 5 seconds for a world anchor. If one isn't added create one.
            // Add anchor should only be called the first time you run this.
            // Delete the app and reinstall it to clear the anchor.
            Task {
                try? await Task.sleep(for: .seconds(5))
                if firstWorldAnchorTransform == nil {
                    let anchor = WorldAnchor(originFromAnchorTransform: matrix_identity_float4x4)
                    do {
                        try await worldTrackingProvider.addAnchor(anchor)
                    }
                    catch {
                        print("Error adding anchor", error)
                    }
                }
            }
        }
    }

    func processWorldAnchorUpdates() async {
        print("Tracking the world")
        for await update in worldTrackingProvider.anchorUpdates {
            print(update)
            await processWorldAnchorUpdate(update: update)
        }
    }

    func processWorldAnchorUpdate(update: AnchorUpdate<WorldAnchor>) async {
        print("world anchor updated", update.event, update.anchor.id)
        if update.event == .added && firstWorldAnchorTransform == nil {
            firstWorldAnchorTransform = update.anchor.originFromAnchorTransform
        }
    }
}
Thank you. Your code works exactly as you would expect. The other full example app from Apple for room tracking also works properly.
.removeAnchor() is not used in my project at all.
It must be something I am doing. I am also running a SpatialTrackingSession to detect the floor and ceiling anchors. I will try to simplify my code.
From the documentation comments for WorldTrackingProvider.anchorUpdates:
World anchors persist across device restarts until they are explicitly removed.
Removed by the user? Or removed by ARKit under some unknown condition?
Update: a set of fake portals is much easier to set up than trying to make it work using a crossingMode.
@Media Engineer That's nice to know. Is there a maximum known-good bitrate for those media services?
@DTS Engineer I think RealityKit may not be suited to this, and the ability to address a texture to each eye with a Metal layer would be needed instead. What do you think?
One of the main reasons I have to use a ShaderGraphMaterial is that there doesn't seem to be a way to render a texture to a specific eye outside of RealityKit's Camera Index Switch node.
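For context, my setup is roughly the following sketch. The material path, file name, and the LeftTexture/RightTexture parameter names are placeholders for whatever the .usda defines (the Camera Index Switch node lives inside the shader graph itself), realityKitContentBundle comes from a RealityKitContent package, and leftTexture/rightTexture are assumed TextureResource values:

// Load a shader graph whose Camera Index Switch node selects between
// per-eye textures, then bind a texture for each eye.
var material = try await ShaderGraphMaterial(named: "/Root/StereoMaterial",
                                             from: "Stereo.usda",
                                             in: realityKitContentBundle)
try material.setParameter(name: "LeftTexture", value: .textureResource(leftTexture))
try material.setParameter(name: "RightTexture", value: .textureResource(rightTexture))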
You also probably have to get the display to match the content's dynamic range using AVDisplayCriteria(refreshRate:videoDynamicRange:).
So these are all things that AVPlayer handles, which you have to account for when presenting these textures yourself, correct?
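What I mean is roughly this sketch; the refresh rate and HDR mode values are illustrative, and applying the criteria through an AVDisplayManager's preferredDisplayCriteria is my assumption:

import AVFoundation

// Ask the display to switch to a mode matching the content.
let criteria = AVDisplayCriteria(refreshRate: 29.97, videoDynamicRange: .hdr10)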
Oh hello there
When you render HDR video on a Metal layer, you have to set the .colorspace, .edrMetadata and .wantsExtendedDynamicRangeContent
How does that work with a VideoMaterial, where we can't set any of that when using a TextureResource.DrawableQueue?
I can't use AVPlayerViewController or VideoPlayerComponent because I need to use a ShaderGraphMaterial to process the video.
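Concretely, the Metal-layer HDR setup I'm referring to looks like this sketch (the HDR10 metadata numbers are illustrative):

import Metal
import QuartzCore

// HDR configuration that is possible on a CAMetalLayer but not on the
// texture behind a VideoMaterial or TextureResource.DrawableQueue.
let layer = CAMetalLayer()
layer.pixelFormat = .rgba16Float
layer.colorspace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)
layer.wantsExtendedDynamicRangeContent = true
layer.edrMetadata = .hdr10(minLuminance: 0.005,
                           maxLuminance: 1_000,
                           opticalOutputScale: 100)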
Through trial and error I confirmed that specifying "mp4a.a6" instead of "ec-3" in the manifest causes AVPlayer to fail. Did this work in visionOS 1.x?
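For reference, the variant declaration in question looks roughly like this, with "mp4a.a6" swapped in where "ec-3" works (the bandwidth and video codec strings are illustrative):

#EXT-X-STREAM-INF:BANDWIDTH=8000000,CODECS="hvc1.2.4.L150.B0,ec-3"
video_atmos.m3u8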