Can I setup an AVCaptureSession exclusively for use with the new Camera Control APIs?

I have a third-party app for controlling Sony mirrorless cameras over WiFi. I’m really excited to integrate the new camera controls on the iPhone 16 Pro with the app. I’ve found the documentation around this, and it seems I need an AVCaptureSession set up in order to utilise them.

func configureControls(_ controls: [AVCaptureControl]) {
    
    // Verify the host system supports controls; otherwise, return early.
    guard captureSession.supportsControls else { return }
    
    // Begin configuring the capture session.
    captureSession.beginConfiguration()
    
    // Remove previously configured controls, if any.
    for control in captureSession.controls {
        captureSession.removeControl(control)
    }
    
    // Iterate over the passed in controls.
    for control in controls {
        // Add the control to the capture session if possible.
        if captureSession.canAddControl(control) {
            captureSession.addControl(control)
        } else {
            print("Unable to add control \(control).")
        }
    }
    
    // Commit the capture session configuration.
    captureSession.commitConfiguration()
}

Can I just use a freshly initialised capture session for this, or does it need to be configured in any other way? Are there any downsides to creating a session (CPU usage, etc.) that I may experience from this?

Also, the scope of the controls is quite narrow. Something like shutter speed or aperture has quite a number of possible values, requires custom labels, and uses a non-linear scale, so the AVCaptureIndexPicker seems to be the way to go. Will that picker support enough values to represent something like shutter speed or aperture? Is there any chance we may get non-linear float-based controls in the future, which may feel more natural from a UX perspective than index-based ones?
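For context, this is roughly what I have in mind — a sketch only, with illustrative shutter-speed values (a real app would use the values reported by the connected camera), and assuming the iOS 18 AVCaptureIndexPicker initialiser:

```swift
import AVFoundation

// Illustrative, non-linearly spaced shutter-speed stops.
let shutterSpeeds = ["1/8000", "1/4000", "1/2000", "1/1000", "1/500",
                     "1/250", "1/125", "1/60", "1/30", "1/15"]

// An index picker presents a mutually exclusive set of labelled options.
let shutterPicker = AVCaptureIndexPicker("Shutter",
                                         symbolName: "camera.aperture",
                                         localizedIndexTitles: shutterSpeeds)

// Selection changes are delivered on a queue of your choosing.
let controlQueue = DispatchQueue(label: "camera.controls")
shutterPicker.setActionQueue(controlQueue) { index in
    // Forward shutterSpeeds[index] to the connected camera here.
    print("Selected shutter speed: \(shutterSpeeds[index])")
}
```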

Apologies, lots of edits going on here as I think about this more.

Is there any way (or would one be considered) of putting these controls in a disabled state, as with other UI elements in iOS? There are times (during capture, for example) when many of these settings cannot be changed by the user (as communicated by the Sony camera), and managing a queue of changes while a setting is unavailable is going to be a challenge. If there won’t be, how will controls behave if they are removed whilst being interacted with? Presumably they will disappear entirely from the UI?

Thanks!

Answered by Frameworks Engineer in 803322022

Hi Simon,

Replying to a few of your points here:

For something like shutter speed or aperture that has quite a number of possible values but requires custom labels, and a non-linear scale (so the AVCaptureIndexPicker seems to be the way to go).

You'll want to check out AVCaptureSlider's localizedValueFormat property that can be used to specify a custom label for a slider so that you could display "ƒ1.2" for the slider's value. See its documentation on how to craft the value format string: https://docs.devpubs.apple.com/releases/geode-hw-draft/documentation/avfoundation/avcaptureslider/4479606-localizedvalueformat.
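As a minimal sketch of that approach (the format string, symbol name, and range here are illustrative assumptions — see the localizedValueFormat documentation for the exact format-string syntax):

```swift
import AVFoundation

// A continuous slider for exposure compensation, -2 EV to +2 EV.
let evSlider = AVCaptureSlider("Exposure",
                               symbolName: "plusminus.circle",
                               in: -2.0...2.0)

// Customise how the slider's current value is displayed
// (illustrative format string).
evSlider.localizedValueFormat = "%@ EV"
```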

Is there any chance we may get non-linear float based controls in the future, which may feel more natural from a UX perspective than index-based?

Could you submit an enhancement request for this? I see you've filed FB15100641 but it doesn't include these details.

And regarding using the AVCaptureControl API with external cameras over WiFi: the AVCaptureControl classes are designed for cameras added to an AVCaptureSession. On iPhone, this means AVCaptureControl is restricted to built-in cameras. While iPadOS does support external USB cameras with AVCaptureSession, it sounds like you are connecting to the external camera over WiFi, outside of an AVCaptureSession. So the AVCaptureControl API could not be used for your use case of controlling an external WiFi camera on iPhone, as it requires a built-in camera there.

Thanks for your post about the new API for the camera button. I believe you already dove into the documentation at: https://developer.apple.com/documentation/avfoundation/capture_setup/enhancing_your_app_experience_with_the_camera_control

You’ll see the camera button is currently designed to control the built-in camera session. It is not intended to support external cameras. I’m curious, what creative ways are you planning to leverage it in your app with a connected camera?

The addControl API allows you to add up to maxControlsCount custom controls. Among the available options, you can use these two general-purpose control types to create custom controls:

  1. AVCaptureSlider: This is a continuous slider that enables users to select a floating-point value from a bounded range. It's great for controlling transitions, exposure compensation, or any other continuous settings.
  2. AVCaptureIndexPicker: This control allows users to select a value by index from a mutually exclusive set. Perfect for implementing presets or different camera modes.
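As a sketch of how these two control types might be wired up (the titles, symbol names, ranges, and queue label below are illustrative assumptions, not prescribed values):

```swift
import AVFoundation

let controlQueue = DispatchQueue(label: "camera.control.actions")

// 1. A slider for a continuous setting, e.g. zoom.
let zoomSlider = AVCaptureSlider("Zoom",
                                 symbolName: "magnifyingglass",
                                 in: 1.0...5.0)
zoomSlider.setActionQueue(controlQueue) { zoomFactor in
    // Apply the new zoom factor to the capture device here.
    print("Zoom changed to \(zoomFactor)")
}

// 2. A picker for a mutually exclusive set, e.g. capture presets.
let presetPicker = AVCaptureIndexPicker("Preset",
                                        symbolName: "list.bullet",
                                        localizedIndexTitles: ["Portrait", "Landscape", "Action"])
presetPicker.setActionQueue(controlQueue) { index in
    print("Preset \(index) selected")
}

// Both can then be added with the configureControls(_:) pattern shown
// earlier, subject to the session's maxControlsCount.
```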

To answer your question about the controls: at this time, controls can only be added and removed, not disabled. If you'd like us to consider supporting external cameras during a capture session, or changing the behavior of the controls, please file an enhancement request using Feedback Assistant. Once you file the request, please post the FB number here.

If you're not familiar with how to file enhancement requests, take a look at Bug Reporting: How and Why?

If you're exploring creative ways to utilize these controls beyond what's described here, I'd love to hear more! They add a lot of flexibility to the camera control experience.

Albert

Hi Albert,

Thanks for your quick reply! I have to say I was half expecting this to be the answer, but a small part of me hoped it would not be the case.

It makes so much sense to have these controls for external cameras too. In particular, I have a "full screen" mode in my app, and being able to use the camera control button whilst in that mode to configure things such as shutter speed, ISO, and aperture, and to actuate the shutter, would make for a much better user experience! Instead, users have to jump in and out of full screen mode to adjust any of these shooting settings.

They are not particularly creative ways as far as I can see, they will just allow a more seamless interaction with the external camera over WiFi, make the shooting experience less cluttered, and allow users to use more of the screen real-estate for composition!

I have submitted an enhancement under FB15100641 requesting this.

I do have a further question: do you know how much of the AVCaptureSession setup I would need to do in order to open up this functionality, and how much draw that would have on CPU and memory usage? I am half tempted to set up an AVCaptureDevice but just not render it in a preview layer in order to open up this functionality. Perhaps you could also speak to whether that would lead to rejection at the review process stage? Perhaps there's someone I could speak to at Apple regarding this, to get it pre-approved or at least have a discussion around it?
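To make the idea concrete, here's a sketch of the minimal setup I'm describing — a built-in camera input with no preview layer, used only to host the controls. This is purely illustrative of the approach in question, not something I know to be permitted:

```swift
import AVFoundation

// Hypothetical minimal session: built-in camera input, no preview layer.
let session = AVCaptureSession()
session.beginConfiguration()

if let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                        for: .video,
                                        position: .back),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
    session.addInput(input)
}

// Controls would be added here, as in configureControls(_:) above.

session.commitConfiguration()
session.startRunning()  // running the session is what incurs CPU/power cost
```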

Simon
