ARKit 6 high resolution frame capture issues

M1 iPad Pro with iPadOS 16 Beta 3

Xcode 14.0 beta 3

In a freshly created Xcode 14 beta 2 app using the Augmented Reality App template with Content Technology set to Metal, ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing returns a 60 fps, 1920 x 1440 video format.

And session.captureHighResolutionFrame fails to deliver a high-resolution frame.

Accepted Reply

Hi, be careful not to confuse two different APIs, although both of them are supported on the iPad Pro with M1 chip:

  • recommendedVideoFormatFor4KResolution returns a video format that delivers frames with 4K resolution at a continuous rate of 30 Hz.
  • recommendedVideoFormatForHighResolutionFrameCapturing returns a video format for high-resolution background image capture. It is expected that regular frames have a lower resolution, such as 1920 x 1440 pixels at 60 Hz, as you correctly observed. However, session.captureHighResolutionFrame should return a 12-megapixel frame. You can verify the resolution of the captured high-resolution frame with the code snippet below.
session.captureHighResolutionFrame { frame, error in
   guard let frame = frame else {
      // The guard body must exit scope; also unwrap the optional error.
      print(error ?? "captureHighResolutionFrame failed without an error")
      return
   }

   let width = CVPixelBufferGetWidth(frame.capturedImage)
   let height = CVPixelBufferGetHeight(frame.capturedImage)
   print("Received frame with dimensions: \(width) x \(height)")
}
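If you want continuous 4K frames rather than one-off high-resolution captures, the other API is the one to use. A minimal sketch, assuming a standard ARSession setup elsewhere in the app (the function name runWith4K is illustrative):

```swift
import ARKit

// Opt the session into continuous 4K video where supported (sketch).
func runWith4K(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // recommendedVideoFormatFor4KResolution returns nil on devices
    // that do not support 4K capture, so the default format is kept there.
    if let fourKFormat = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
        configuration.videoFormat = fourKFormat
    }
    session.run(configuration)
}
```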

Replies

Have a look to see if your camera supports 4K with print(ARWorldTrackingConfiguration.supportedVideoFormats)

On an iPhone 13 Pro Max, its only 4K format is listed as <ARVideoFormat: 0x2834f2e90 imageResolution=(3840, 2160) pixelFormat=(420f) framesPerSecond=(30) captureDeviceType=AVCaptureDeviceTypeBuiltInWideAngleCamera captureDevicePosition=(1)>

So, in your configuration, switch to that format:

 configuration.videoFormat = ARWorldTrackingConfiguration.supportedVideoFormats[12]

That’s on an iPhone 13 Pro Max 128 GB, which does have the limited ProRes recording duration, so perhaps it works on the bigger-storage models, or just future products.

  • I can't edit my above comment, but DO NOT use the above method in a production app; this is just how I have been able to get it to work in testing. Don't hard-code your .videoFormat index, as it changes between iOS releases and across devices.
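Rather than hard-coding index 12, you can search supportedVideoFormats for the resolution you want. The helper below is a sketch; VideoFormatInfo and firstFourKIndex are illustrative names, not ARKit API, with the equivalent ARKit lookup shown in comments:

```swift
// Illustrative value type standing in for the ARVideoFormat fields
// that matter here (not an ARKit type).
struct VideoFormatInfo: Equatable {
    let width: Int
    let height: Int
    let framesPerSecond: Int
}

// Returns the index of the first 4K (3840 x 2160) format, or nil if none exists.
func firstFourKIndex(in formats: [VideoFormatInfo]) -> Int? {
    formats.firstIndex { $0.width == 3840 && $0.height == 2160 }
}

// With ARKit, the same device-independent lookup would be (sketch):
//
//     let formats = ARWorldTrackingConfiguration.supportedVideoFormats
//     if let fourK = formats.first(where: {
//         $0.imageResolution == CGSize(width: 3840, height: 2160)
//     }) {
//         configuration.videoFormat = fourK
//     }
```

Searching by resolution keeps the code working even when the position of the 4K entry in supportedVideoFormats shifts between devices or iOS releases.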


Interesting. My previous test did all the things you mentioned, but the captureHighResolutionFrame call failed because it was made immediately after calling session.run. It works when moved to a tap handler.
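For anyone hitting the same timing issue: the pattern that worked here is to request the high-resolution frame only once the session is already running, e.g. from a gesture handler. A sketch, assuming a UIKit view controller that owns the session (the names ViewController and handleTap are illustrative):

```swift
import ARKit
import UIKit

class ViewController: UIViewController {
    let session = ARSession()

    // Wired to a UITapGestureRecognizer; by the time the user taps,
    // the session has been running long enough for capture to succeed.
    @objc func handleTap(_ sender: UITapGestureRecognizer) {
        session.captureHighResolutionFrame { frame, error in
            guard let frame = frame else {
                print(error ?? "capture failed")
                return
            }
            let width = CVPixelBufferGetWidth(frame.capturedImage)
            let height = CVPixelBufferGetHeight(frame.capturedImage)
            print("High-res frame: \(width) x \(height)")
        }
    }
}
```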

Thanks