About a month ago I made a demo app that detects an image, creates a floating AR TV screen above it, and plays a local video. After the iOS 15 update, the video screen is still created and loads the intended video material, but the screen appears glitched: the video shows for only a fraction of a second at a time, flickering between visible and not visible. This demo worked perfectly before the iOS 15 update, and I want to know whether anyone else has experienced this issue or can reproduce it.
I suspect the culprit is the ARSessionDelegate method func session(_ session: ARSession, didAdd anchors: [ARAnchor]), because a separate demo I tested that creates the video material outside of this method works correctly. I was hoping the issue had nothing to do with the code and was instead related to Xcode build settings and configuration, but I tested that theory by raising the deployment target to iOS 15.0 and by recreating the project from scratch, all in Xcode 13, and still ran into the same problem.
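For context, here is a minimal sketch of the pattern in question. It is not the exact code from my gist; the file name tv.mp4, the plane dimensions, and the 0.1 m offset are placeholders.

```swift
import ARKit
import RealityKit
import AVFoundation

// Minimal sketch of the suspect pattern: a VideoMaterial screen is built
// inside session(_:didAdd:). File name and sizes are placeholders.
final class VideoScreenDelegate: NSObject, ARSessionDelegate {
    weak var arView: ARView?

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            // Only react to a detected reference image with a bundled video.
            guard let imageAnchor = anchor as? ARImageAnchor,
                  let url = Bundle.main.url(forResource: "tv", withExtension: "mp4")
            else { continue }

            // Build a VideoMaterial from an AVPlayer wrapping the local file.
            let player = AVPlayer(url: url)
            let material = VideoMaterial(avPlayer: player)

            // Size the screen to match the detected reference image.
            let size = imageAnchor.referenceImage.physicalSize
            let screen = ModelEntity(
                mesh: .generatePlane(width: Float(size.width), height: Float(size.height)),
                materials: [material]
            )
            screen.position.y = 0.1 // float the "TV" above the image

            // Attach the screen to the detected image anchor and start playback.
            let anchorEntity = AnchorEntity(anchor: imageAnchor)
            anchorEntity.addChild(screen)
            arView?.scene.addAnchor(anchorEntity)
            player.play()
        }
    }
}
```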
I understand that there have been some changes to ARKit and RealityKit since WWDC21, but has the ARSessionDelegate protocol been deprecated? Is there a newer/better way to know when an ARAnchor has been added to an AR scene?
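For comparison, this is the shape of the delegate-free variant that still renders correctly for me: RealityKit tracks the image itself through AnchorEntity(.image(...)), so session(_:didAdd:) is never involved. The asset names "AR Resources" and "tvImage" are placeholders for whatever is in the asset catalog.

```swift
import RealityKit
import AVFoundation

// Delegate-free sketch: RealityKit anchors the screen to the image
// directly, so no ARSessionDelegate callback is needed.
func addVideoScreen(to arView: ARView) {
    guard let url = Bundle.main.url(forResource: "tv", withExtension: "mp4") else { return }
    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)

    // Fixed plane size here, since the reference image's physical size
    // is not available until tracking begins; adjust as needed.
    let screen = ModelEntity(mesh: .generatePlane(width: 0.3, height: 0.2),
                             materials: [material])
    screen.position.y = 0.1 // float above the detected image

    // RealityKit handles the image detection and anchoring itself.
    let anchorEntity = AnchorEntity(.image(group: "AR Resources", name: "tvImage"))
    anchorEntity.addChild(screen)
    arView.scene.addAnchor(anchorEntity)
    player.play()
}
```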
Enclosed is a GitHub gist with the code for my demo app. I am asking the Apple AR dev community and/or the Apple ARKit team to help guide me through this issue. Any feedback or confirmation would be much appreciated, as I would like to keep building on my knowledge of ARKit but have found this issue to be a blocker.
Gist: https://gist.github.com/Innovoeb/1711719b94ba2df17b0558475ed6203b