Video Materials After Image Detection Not Working After iOS 15.0 Update

About a month ago I made a demo app that detects an image, creates a floating AR TV screen above it, and plays a local video. After the iOS 15 update, the video screen is still created and loads the expected video material, but the screen appears glitched: the video flashes on for tenths of a second and then disappears, cycling between visible and not visible at all. This demo worked perfectly until the iOS 15 update, and I wanted to know if anyone else has experienced this issue or can reproduce it.
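For context, the video screen is built roughly like this. This is a minimal sketch, not the exact code from the gist: the helper name makeVideoScreen and the file name "tvVideo.mp4" are placeholders of my own.

import AVFoundation
import RealityKit

// Sketch: a "TV screen" plane that plays a bundled local video through
// RealityKit's VideoMaterial (available since iOS 14).
func makeVideoScreen(width: Float, height: Float) -> ModelEntity {
    // Load the local video from the app bundle ("tvVideo.mp4" is a placeholder name).
    let url = Bundle.main.url(forResource: "tvVideo", withExtension: "mp4")!
    let player = AVPlayer(url: url)

    // VideoMaterial renders the AVPlayer's output onto the plane mesh.
    let material = VideoMaterial(avPlayer: player)
    let mesh = MeshResource.generatePlane(width: width, height: height)
    let screen = ModelEntity(mesh: mesh, materials: [material])

    player.play()
    return screen
}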

I suspect the culprit is the ARSessionDelegate method func session(_ session: ARSession, didAdd anchors: [ARAnchor]), because I tested a separate demo that creates video materials outside of this method and it works correctly. I was hoping the issue had nothing to do with code and was instead related to Xcode build settings and configurations, but I tested this by updating the deployment target to iOS 15.0 as well as creating a brand new duplicate project, all within Xcode 13, and I still ran into the issue.
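For reference, the pattern in question looks roughly like this; a sketch of my original (problematic) approach, assuming an ARView property named arView, a ViewController class, and the makeVideoScreen helper sketched above.

import ARKit
import RealityKit

extension ViewController: ARSessionDelegate {
    // Called when ARKit adds new anchors, including detected image anchors.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let imageAnchor = anchor as? ARImageAnchor else { continue }

            // Tie the entity to the image anchor itself. This turned out to be
            // the source of the flashing; see the update below.
            let anchorEntity = AnchorEntity(anchor: imageAnchor)
            let screen = makeVideoScreen(width: 0.5, height: 0.3)
            screen.position.y = 0.2 // float the screen above the image
            anchorEntity.addChild(screen)
            arView.scene.addAnchor(anchorEntity)
        }
    }
}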

I understand that there have been some changes to ARKit and RealityKit since WWDC21, but has the ARSessionDelegate protocol been deprecated? Is there a newer/better way to know when an ARAnchor has been added to an AR scene?

Enclosed is a GitHub gist with the code for my demo app. I am asking the Apple AR dev community and/or the Apple ARKit team to guide me through this issue. Any feedback or confirmation would be much appreciated, as I would like to continue building on my knowledge of ARKit but found this issue to be a blocker.

enclosed: https://gist.github.com/Innovoeb/1711719b94ba2df17b0558475ed6203b

Hi, this sounds like a regression. What happens if you create an entity with a video material without needing to recognize an image first? This could be an issue with the image detection algorithms or with video materials. Please file this on Feedback Assistant and respond with the feedback ID.
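A minimal sketch of that experiment, assuming the same arView and the makeVideoScreen helper sketched above: attach the video screen to a plain world anchor so no image detection is involved.

// Place the screen 1 m in front of the session origin; no image detection needed.
let worldAnchor = AnchorEntity(world: [0, 0, -1])
worldAnchor.addChild(makeVideoScreen(width: 0.5, height: 0.3))
arView.scene.addAnchor(worldAnchor)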

Update:

After filing this issue on Apple's Feedback Assistant, an ARKit dev responded and was able to resolve it for me quickly. The issue was that I was not using the ARKit API correctly. Specifically, I was experiencing the glitched screen because I was tying the newly created anchor entity holding my video plane to the detected AR image anchor, and the video would only play correctly while the image anchor's isTracked property was true. The behavior I was aiming for was for the video to play after the initial detection of an image, untied from the image anchor's current tracking state. I was able to get this behavior by adding .maximumNumberOfTrackedImages = 1 to my AR session's configuration and by changing one line in my previously posted code; see below.

/*
    AnchorEntity(world:) creates an anchor entity with a target fixed at the
    given position in the scene.

    transform: a matrix encoding the position, orientation, and scale of the
    anchor relative to the world.
*/
let anchor = AnchorEntity(world: imageAnchor.transform)
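For completeness, the matching configuration change looks roughly like this; the resource group name "AR Resources" and the arView property are assumptions, so substitute your own asset catalog group name.

import ARKit

let configuration = ARWorldTrackingConfiguration()
configuration.detectionImages = ARReferenceImage.referenceImages(
    inGroupNamed: "AR Resources", bundle: nil)
// Promote detection images to tracked images so ARKit keeps updating the
// image anchor; the world-anchored entity plays regardless of tracking state.
configuration.maximumNumberOfTrackedImages = 1
arView.session.run(configuration)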

Many thanks to the Apple graphics and games engineer who instructed me to file a ticket in Feedback Assistant. You pointed me in the right direction, which led to me breaking through a barrier. For reference, I have posted the updated GitHub gist for the AR image detection and video materials demo app.

https://gist.github.com/Innovoeb/1711719b94ba2df17b0558475ed6203b
