Anchor detection issues

Hello,

I have an augmented reality app with a simple Reality Composer project. It works fine on an iPad running iOS 14.4, but I'm having problems on newer versions (14.7 and 15).

Anchor detection is much more sensitive, which causes my scenes to restart with each new detection. In addition, the scenes are interrupted as soon as the anchor image is no longer visible to the camera.

I am using Xcode 13.1.

I use this simple code:


import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Load the "Debut" scene from the Reality Composer project.
        guard let anchor2 = try? Enigme1.loadDebut() else { return }

        arView.scene.anchors.append(anchor2)
    }

}

Thank you very much for the help you could give me.

Image anchoring in Reality Composer is designed so that the object disappears as soon as the image is no longer in view. With RealityKit alone, the behavior you describe cannot be achieved. To keep the object permanently visible after the first detection, you need to use the ARKit API and set maximumNumberOfTrackedImages to 0 on your ARConfiguration. You can retrieve the configuration from the ARView's session object; after modifying it, re-run the session with the updated configuration.
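A minimal sketch of those steps, assuming a world-tracking configuration and an `arView` outlet like the one in your question (the function name and the place you call it from are up to you; you may want to do this after the session has started, e.g. in viewDidAppear):

```swift
import ARKit
import RealityKit

// Sketch: stop tracking detected images so their anchors persist.
func keepAnchorsAfterDetection(in arView: ARView) {
    // Retrieve the session's current configuration. RealityKit normally
    // runs an ARWorldTrackingConfiguration for an ARView; bail out if
    // the session has a different configuration.
    guard let configuration = arView.session.configuration
            as? ARWorldTrackingConfiguration else { return }

    // With 0 tracked images, reference images are still *detected* once,
    // but no longer continuously tracked, so their anchors are not
    // removed when the image leaves the camera frame.
    configuration.maximumNumberOfTrackedImages = 0

    // Re-run the session with the updated configuration. Passing no
    // reset options keeps the existing anchors in place.
    arView.session.run(configuration, options: [])
}
```

This is a sketch rather than a drop-in fix: depending on how your Reality Composer scene configures the session, you may instead need to build the configuration yourself before the first run.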
