Difference in ARKit plane detection from iPhone 8 to iPhone 15
I am developing an ARKit-based application that requires plane detection of the tabletop at which the user is seated. Early testing was done with an iPhone 8 and iPhone 8 Plus; with those devices, ARKit rapidly detected the plane of the tabletop when the phone was only 8 to 10 inches away. Running the same code on an iPhone 15, I have to move the phone roughly 15 to 16 inches away before the table's plane is detected, which is an awkward motion for a user seated at a table. To confirm this is not specific to my code, I verified that Apple's sample AR Interaction application shows the same behavior. Has anyone else experienced this, and if so, do you have suggestions to improve the situation?
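For reference, a minimal sketch of the kind of plane-detection setup involved, assuming a standard ARSCNView-based view controller (the class and outlet names are placeholders, not from the original post):

    import UIKit
    import SceneKit
    import ARKit

    // Minimal plane-detection setup; `sceneView` and the class name are placeholders.
    class TabletopViewController: UIViewController, ARSCNViewDelegate {
        @IBOutlet var sceneView: ARSCNView!

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            sceneView.delegate = self

            // Look for horizontal surfaces such as a tabletop.
            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = [.horizontal]
            sceneView.session.run(configuration)
        }

        // ARKit calls this once it has gathered enough feature points to anchor a plane.
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
            guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
            print("Detected plane at \(planeAnchor.center)")
        }
    }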
Replies: 2 · Boosts: 0 · Views: 303 · Activity: 1w
Minimum device requirements for ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing
Device: iPhone 8 Plus
Xcode: Version 14.2 (14C18)

I am attempting to set up an AR session that allows background high-resolution frame capture using the new ARKit feature in iOS 16.2. My test code (following the recommendations in the WWDC video on this subject) is:

    if #available(iOS 16.0, *) {
        if let videoCapFormat = ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing {
            print("Format = \(videoCapFormat)")
        } else {
            print("Format not found")
        }
    }

In this case, recommendedVideoFormatForHighResolutionFrameCapturing returns nil, so the else branch runs. Should I assume this means that no video format on my device supports this kind of capture? If so, as a follow-on question, which devices do support this type of background high-resolution capture?
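For context, this is a sketch of how the recommended format is meant to be combined with the rest of the iOS 16 high-resolution capture API when it is non-nil; the session parameter and function names are illustrative, not taken from the original post:

    import ARKit

    // Sketch only: adopts the recommended format when available, then requests
    // a single high-resolution frame. `session` stands in for the app's ARSession.
    func runSessionWithHighResCapture(on session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()

        if #available(iOS 16.0, *),
           let videoCapFormat = ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing {
            // A non-nil value means the device offers a format suitable for
            // captureHighResolutionFrame(completion:).
            configuration.videoFormat = videoCapFormat
            print("Format = \(videoCapFormat)")
        } else {
            // nil (or pre-iOS 16) indicates this capture path is unavailable on the device.
            print("Format not found")
        }

        session.run(configuration)
    }

    @available(iOS 16.0, *)
    func captureBackgroundStill(from session: ARSession) {
        session.captureHighResolutionFrame { frame, error in
            if let frame = frame {
                print("Captured frame at \(frame.camera.imageResolution)")
            } else if let error = error {
                print("High-res capture failed: \(error.localizedDescription)")
            }
        }
    }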
Replies: 2 · Boosts: 0 · Views: 679 · Activity: Feb ’23