AVCaptureDevice

An AVCaptureDevice object represents a physical capture device and the properties associated with that device. You use a capture device to configure the properties of the underlying hardware. A capture device also provides input data (such as audio or video) to an AVCaptureSession object.

You use the methods of the AVCaptureDevice class to enumerate the available devices, query their capabilities, and be informed when devices come and go. Before you attempt to set properties of a capture device (its focus mode, exposure mode, and so on), you must first acquire a lock on the device using the lockForConfiguration: method. You can then set the properties and release the lock using the unlockForConfiguration method. You may hold the lock if you want all settable device properties to remain unchanged. However, holding the device lock unnecessarily may degrade capture quality in other applications sharing the device and is not recommended.

Most common configurations of capture settings are available through the AVCaptureSession object and its available presets; however, some specialized options (such as high frame rate) require directly setting a capture format on an AVCaptureDevice instance. The following code example illustrates how to select a device’s highest possible frame rate:

- (void)configureCameraForHighestFrameRate:(AVCaptureDevice *)device
{
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestFrameRateRange = nil;
    // Find the format and frame rate range with the highest maximum frame rate.
    for ( AVCaptureDeviceFormat *format in [device formats] ) {
        for ( AVFrameRateRange *range in format.videoSupportedFrameRateRanges ) {
            if ( range.maxFrameRate > bestFrameRateRange.maxFrameRate ) {
                bestFormat = format;
                bestFrameRateRange = range;
            }
        }
    }
    if ( bestFormat ) {
        if ( [device lockForConfiguration:NULL] == YES ) {
            device.activeFormat = bestFormat;
            // Fix the frame rate at the range's maximum by setting both the minimum
            // and maximum frame durations to the range's minimum frame duration.
            device.activeVideoMinFrameDuration = bestFrameRateRange.minFrameDuration;
            device.activeVideoMaxFrameDuration = bestFrameRateRange.minFrameDuration;
            [device unlockForConfiguration];
        }
    }
}

Inheritance

NSObject

Conforms To

NSObjectProtocol

Import Statement


Swift

import AVFoundation

Objective-C

@import AVFoundation;

Availability


Available in iOS 4.0 and later.
  • Returns an array of the available capture devices on the system.

    Declaration

    Swift

    class func devices() -> [AnyObject]!

    Objective-C

    + (NSArray *)devices

    Return Value

    An array containing the available capture devices on the system.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Returns the device with a given ID.

    Declaration

    Swift

    init!(uniqueID deviceUniqueID: String!)

    Objective-C

    + (AVCaptureDevice *)deviceWithUniqueID:(NSString *)deviceUniqueID

    Parameters

    deviceUniqueID

    The ID of a capture device.

    Return Value

    The device with ID deviceUniqueID.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.
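
    As an illustration only (not part of the original reference), the following sketch recalls a camera whose uniqueID was stored earlier; the helper name and the NSUserDefaults key are hypothetical:

    - (AVCaptureDevice *)preferredCamera
    {
        // "PreferredCameraUniqueID" is an app-defined key, assumed to have been
        // written earlier with the value of a device's uniqueID property.
        NSString *savedID = [[NSUserDefaults standardUserDefaults]
                                stringForKey:@"PreferredCameraUniqueID"];
        AVCaptureDevice *device = savedID ? [AVCaptureDevice deviceWithUniqueID:savedID] : nil;
        // Fall back to the default video device if the saved device is no longer present.
        return device ?: [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }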

  • Returns the default device used to capture data of a given media type.

    Declaration

    Swift

    class func defaultDeviceWithMediaType(_ mediaType: String!) -> AVCaptureDevice!

    Objective-C

    + (AVCaptureDevice *)defaultDeviceWithMediaType:(NSString *)mediaType

    Parameters

    mediaType

    A media type identifier.

    For possible values, see AV Foundation Constants Reference.

    Return Value

    The default device used to capture data of the type indicated by mediaType.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Returns an array of the devices able to capture data of a given media type.

    Declaration

    Swift

    class func devicesWithMediaType(_ mediaType: String!) -> [AnyObject]!

    Objective-C

    + (NSArray *)devicesWithMediaType:(NSString *)mediaType

    Parameters

    mediaType

    A media type identifier.

    For possible values, see AV Foundation Constants Reference.

    Return Value

    An array containing the devices able to capture data of the type indicated by mediaType.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.
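
    For illustration (not part of the original reference), a sketch of a hypothetical helper that combines devicesWithMediaType: with the position property to locate the back-facing camera:

    - (AVCaptureDevice *)backFacingCamera
    {
        // Enumerate every device capable of capturing video.
        for ( AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] ) {
            if ( device.position == AVCaptureDevicePositionBack ) {
                return device;
            }
        }
        // Fall back to the system default video device (may be nil if no camera exists).
        return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }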

  • Requests the user’s permission, if needed, for recording a specified media type.

    Declaration

    Swift

    class func requestAccessForMediaType(_ mediaType: String!, completionHandler handler: ((Bool) -> Void)!)

    Objective-C

    + (void)requestAccessForMediaType:(NSString *)mediaType completionHandler:(void (^)(BOOL granted))handler

    Parameters

    mediaType

    A media type constant, either AVMediaTypeVideo or AVMediaTypeAudio. If any other media type is supplied, an NSInvalidArgumentException is thrown.

    handler

    A block to be called once permission is granted or denied.

    The completion handler is called on an arbitrary dispatch queue. It is the client's responsibility to ensure that any resulting UIKit-related updates are performed on the main queue or main thread.

    The block receives the following parameter:

    granted

    YES if the user granted permission to use the hardware; otherwise, NO.

    Discussion

    Recording audio always requires explicit permission from the user; recording video also requires user permission on devices sold in certain regions. The first time you create any AVCaptureDeviceInput objects for a media type that requires permission, the system automatically displays an alert to request recording permission. Alternatively, you can call this method to prompt the user at a time of your choosing.

    This call will not block while the user is being asked for access, allowing the client to continue running. Until access has been granted, any AVCaptureDevices for the media type will vend silent audio samples or black video frames. The user is only asked for permission the first time the client requests access. Later calls use the permission granted by the user.

    After the user grants recording permission, the system remembers the choice for future use in the same app, but the user can change this choice at any time using the Settings app. If the user has denied your app recording permission or has not yet responded to the permission prompt, any audio recordings will contain only silence and any video recordings will contain only black frames.

    The handler parameter is a block whose sole parameter indicates whether the user granted or denied permission to record. This method always returns immediately. If the user has previously granted or denied recording permission, it executes the block when called; otherwise, it displays an alert and executes the block only after the user has responded to the alert.

    Invoking this method with AVMediaTypeAudio is equivalent to calling the AVAudioSession method requestRecordPermission:.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.
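
    A minimal sketch (not from the original reference) showing the dispatch pattern described above; the method name is illustrative:

    - (void)prepareCameraUI
    {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                                 completionHandler:^(BOOL granted) {
            // The handler runs on an arbitrary queue; hop to the main queue for UIKit work.
            dispatch_async(dispatch_get_main_queue(), ^{
                if ( granted ) {
                    // Build the capture session and preview layer here.
                } else {
                    // Explain that camera access was denied and point the user to Settings.
                }
            });
        }];
    }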

  • Returns a constant indicating whether the app has permission for recording a specified media type.

    Declaration

    Swift

    class func authorizationStatusForMediaType(_ mediaType: String!) -> AVAuthorizationStatus

    Objective-C

    + (AVAuthorizationStatus)authorizationStatusForMediaType:(NSString *)mediaType

    Parameters

    mediaType

    A media type constant, either AVMediaTypeVideo or AVMediaTypeAudio.

    Return Value

    A constant indicating authorization status.

    Discussion

    Recording audio always requires explicit permission from the user; recording video also requires user permission on devices sold in certain regions. The first time you create any AVCaptureDeviceInput objects for a media type that requires permission, the system automatically displays an alert to request recording permission.

    After the user grants recording permission, the system remembers the choice for future use in the same app, but the user can change this choice at any time using the Settings app. If the user has denied your app recording permission or has not yet responded to the permission prompt, any audio recordings will contain only silence and any video recordings will contain only black frames.

    If this method returns AVAuthorizationStatusNotDetermined, you can call requestAccessForMediaType:completionHandler: to prompt the user for recording permission.

    Calling this method with any media type other than AVMediaTypeVideo or AVMediaTypeAudio raises an exception.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.
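
    For example, a sketch (not from the original reference) that branches on the returned status and prompts only when the status is not yet determined:

    - (void)checkCameraAuthorization
    {
        AVAuthorizationStatus status =
            [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
        switch ( status ) {
            case AVAuthorizationStatusAuthorized:
                // Recording permission has already been granted.
                break;
            case AVAuthorizationStatusNotDetermined:
                // Prompt the user now.
                [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                                         completionHandler:^(BOOL granted) { /* ... */ }];
                break;
            case AVAuthorizationStatusDenied:
            case AVAuthorizationStatusRestricted:
                // Capture would produce silence or black frames; adjust the UI accordingly.
                break;
        }
    }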

  • Requests exclusive access to the device’s hardware properties.

    Declaration

    Swift

    func lockForConfiguration(_ outError: NSErrorPointer) -> Bool

    Objective-C

    - (BOOL)lockForConfiguration:(NSError **)outError

    Parameters

    outError

    On input, specify a pointer to an error object. If a lock cannot be acquired, this pointer contains an NSError object that describes the problem. You may specify nil for this parameter.

    Return Value

    YES if a lock was acquired or NO if it was not.

    Discussion

    You must call this method before attempting to configure the hardware-related properties of the device. This method returns YES when it successfully locks the device for configuration by your code. After configuring the device properties, call unlockForConfiguration to release the configuration lock and allow other apps to make changes.

    You may hold onto a lock (instead of releasing it) if you require the device properties to remain unchanged. However, holding the device lock unnecessarily may degrade capture quality in other apps sharing the device.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.
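
    The lock/configure/unlock pattern looks like the following sketch (illustrative only; the chosen focus mode is just an example):

    - (void)setContinuousFocusOnDevice:(AVCaptureDevice *)device
    {
        NSError *error = nil;
        if ( [device lockForConfiguration:&error] ) {
            if ( [device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] ) {
                device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
            }
            // Release the lock as soon as configuration is complete.
            [device unlockForConfiguration];
        }
        else {
            NSLog(@"Could not lock device for configuration: %@", error);
        }
    }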

  • Relinquishes exclusive control over the device’s configuration.

    Declaration

    Swift

    func unlockForConfiguration()

    Objective-C

    - (void)unlockForConfiguration

    Discussion

    Call this method to release the lock acquired using the lockForConfiguration: method when you are done configuring the device.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • formats Property

    An array of AVCaptureDeviceFormat objects representing the formats supported by the device. (read-only)

    Declaration

    Swift

    var formats: [AnyObject]! { get }

    Objective-C

    @property(nonatomic, readonly) NSArray *formats

    Discussion

    You can use this property to enumerate the formats natively supported by the receiver.

    You can set activeFormat to any of the formats in this array.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

    See Also

    activeFormat

  • The currently active format of the receiver.

    Declaration

    Swift

    var activeFormat: AVCaptureDeviceFormat!

    Objective-C

    @property(nonatomic, retain) AVCaptureDeviceFormat *activeFormat

    Discussion

    You use this property to get or set the currently active device format.

    On iOS, you should generally set the session preset on an AVCaptureSession object to configure image or video capture and use the shared AVAudioSession object to configure audio capture. When using a session preset, the session automatically controls the capture device’s active format. However, some specialized capture options (such as high frame rate) are not available in session presets. For these options, you can set the capture device’s active format instead. Doing so changes the associated capture session’s preset to AVCaptureSessionPresetInputPriority.

    Attempting to set the active format to one not present in the formats array throws an NSInvalidArgumentException.

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings. You must also call lockForConfiguration: before calling the AVCaptureSession method startRunning, or the session's preset will override the selected active format on the capture device.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

  • focusMode Property

    The device’s focus mode.

    Declaration

    Swift

    var focusMode: AVCaptureFocusMode

    Objective-C

    @property(nonatomic) AVCaptureFocusMode focusMode

    Discussion

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    You can observe changes to the value of this property using key-value observing.

    See AVCaptureFocusMode for possible values.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Returns a Boolean value that indicates whether the given focus mode is supported.

    Declaration

    Swift

    func isFocusModeSupported(_ focusMode: AVCaptureFocusMode) -> Bool

    Objective-C

    - (BOOL)isFocusModeSupported:(AVCaptureFocusMode)focusMode

    Parameters

    focusMode

    A focus mode. See AVCaptureFocusMode for possible values.

    Return Value

    YES if focusMode is supported; otherwise, NO.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • The point of interest for focusing.

    Declaration

    Swift

    var focusPointOfInterest: CGPoint

    Objective-C

    @property(nonatomic) CGPoint focusPointOfInterest

    Discussion

    This property represents a CGPoint where {0,0} corresponds to the top left of the picture area, and {1,1} corresponds to the bottom right in landscape mode with the home button on the right—this applies even if the device is in portrait mode.

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.
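
    A tap-to-focus sketch (not from the original reference); the caller is assumed to have already converted the tap into the {0,0}–{1,1} coordinate space described above:

    - (void)focusDevice:(AVCaptureDevice *)device atDevicePoint:(CGPoint)devicePoint
    {
        NSError *error = nil;
        if ( [device lockForConfiguration:&error] ) {
            if ( device.focusPointOfInterestSupported &&
                 [device isFocusModeSupported:AVCaptureFocusModeAutoFocus] ) {
                // Set the point of interest first, then trigger a focus operation.
                device.focusPointOfInterest = devicePoint;
                device.focusMode = AVCaptureFocusModeAutoFocus;
            }
            [device unlockForConfiguration];
        }
    }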

  • Indicates whether the device supports a point of interest for focus. (read-only)

    Declaration

    Swift

    var focusPointOfInterestSupported: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isFocusPointOfInterestSupported) BOOL focusPointOfInterestSupported

    Discussion

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Indicates whether the device is currently adjusting its focus setting. (read-only)

    Declaration

    Swift

    var adjustingFocus: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isAdjustingFocus) BOOL adjustingFocus

    Discussion

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • A Boolean value that determines whether smooth autofocus is enabled.

    Declaration

    Swift

    var smoothAutoFocusEnabled: Bool

    Objective-C

    @property(nonatomic, getter=isSmoothAutoFocusEnabled) BOOL smoothAutoFocusEnabled

    Discussion

    On capable devices, you can enable a “smooth” focusing mode in which lens movements are made more slowly. This mode makes focus transitions less visually intrusive, a behavior that you may want for video capture.

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

  • A Boolean value that indicates whether the device supports smooth autofocus. (read-only)

    Declaration

    Swift

    var smoothAutoFocusSupported: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isSmoothAutoFocusSupported) BOOL smoothAutoFocusSupported

    Discussion

    The smooth focusing mode is available only on compatible devices. If this property’s value is NO, setting the value of smoothAutoFocusEnabled to YES raises an exception.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

  • A value controlling the allowable range for automatic focusing.

    Declaration

    Swift

    var autoFocusRangeRestriction: AVCaptureAutoFocusRangeRestriction

    Objective-C

    @property(nonatomic) AVCaptureAutoFocusRangeRestriction autoFocusRangeRestriction

    Discussion

    By default, a device capable of hardware focusing attempts to focus on objects at any distance. If you expect to focus primarily on near or far objects, set a range restriction to increase the speed and reduce the power consumption of automatic focusing, and to reduce the chance of focusing ambiguities.

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

  • A Boolean value that indicates whether the device supports focus range restrictions. (read-only)

    Declaration

    Swift

    var autoFocusRangeRestrictionSupported: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isAutoFocusRangeRestrictionSupported) BOOL autoFocusRangeRestrictionSupported

    Discussion

    Focus range restriction is available only on compatible devices. If this property’s value is NO, setting the value of autoFocusRangeRestriction raises an exception.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

  • A value that controls the cropping and enlargement of images captured by the device.

    Declaration

    Swift

    var videoZoomFactor: CGFloat

    Objective-C

    @property(nonatomic) CGFloat videoZoomFactor

    Discussion

    This value is a multiplier. For example, a value of 2.0 doubles the size of an image’s subject (and halves the field of view). Allowed values range from 1.0 (full field of view) to the value of the active format’s videoMaxZoomFactor property. Setting the value of this property jumps immediately to the new zoom factor. For a smooth transition, use the rampToVideoZoomFactor:withRate: method.

    The device achieves a zoom effect by cropping around the center of the image captured by the sensor. At low zoom factors, the cropped image is equal to or larger than the output size. At higher zoom factors, the device must scale the cropped image up to the output size, resulting in a loss of image quality. The active format’s videoZoomFactorUpscaleThreshold property indicates the factors at which upscaling will occur.

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

    See Also

    activeFormat
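
    A sketch (illustrative only) that clamps a requested zoom factor to the range supported by the active format before applying it:

    - (void)setZoomFactor:(CGFloat)factor onDevice:(AVCaptureDevice *)device
    {
        CGFloat maxFactor = device.activeFormat.videoMaxZoomFactor;
        CGFloat clampedFactor = MAX(1.0, MIN(factor, maxFactor));
        if ( [device lockForConfiguration:NULL] ) {
            device.videoZoomFactor = clampedFactor;   // jumps immediately to the new factor
            [device unlockForConfiguration];
        }
    }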

  • Begins a smooth transition from the current zoom factor to another.

    Declaration

    Swift

    func rampToVideoZoomFactor(_ factor: CGFloat, withRate rate: Float)

    Objective-C

    - (void)rampToVideoZoomFactor:(CGFloat)factor withRate:(float)rate

    Parameters

    factor

    The new magnification factor.

    rate

    The rate at which to transition to the new magnification factor, specified in powers of two per second.

    Discussion

    Allowed values for factor range from 1.0 (full field of view) to the videoMaxZoomFactor specified by the active capture format.

    During a ramp, the zoom factor changes at an exponential rate, but this yields a visually linear transition. The rate parameter controls the speed of this transition independent of direction; for example, a value of 1.0 causes the zoom factor to double every second if zooming in (that is, if the specified factor is greater than the current videoZoomFactor) or halve every second if zooming out.

    Before calling this method, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. If you do not, calling this method raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.
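
    For example (a sketch, not from the original reference), ramping at a rate of 1.0 doubles or halves the zoom factor once per second:

    - (void)zoomDevice:(AVCaptureDevice *)device toFactor:(CGFloat)factor
    {
        if ( [device lockForConfiguration:NULL] ) {
            // The ramp continues after this method returns; observe rampingVideoZoom
            // (or call cancelVideoZoomRamp) to track or end it.
            [device rampToVideoZoomFactor:factor withRate:1.0f];
            [device unlockForConfiguration];
        }
    }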

  • Smoothly ends a zoom transition in progress.

    Declaration

    Swift

    func cancelVideoZoomRamp()

    Objective-C

    - (void)cancelVideoZoomRamp

    Discussion

    Calling this method is equivalent to calling rampToVideoZoomFactor:withRate: with a rate of zero. If a zoom transition is in progress, the transition slows to a stop (instead of stopping abruptly).

    Before calling this method, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. If you do not, calling this method raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

  • A Boolean value that indicates whether a zoom transition is in progress. (read-only)

    Declaration

    Swift

    var rampingVideoZoom: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isRampingVideoZoom) BOOL rampingVideoZoom

    Discussion

    You can observe changes to the value of this property using key-value observing to be notified when a zoom transition begins or ends.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

  • hasFlash Property

    Indicates whether the capture device has a flash. (read-only)

    Declaration

    Swift

    var hasFlash: Bool { get }

    Objective-C

    @property(nonatomic, readonly) BOOL hasFlash

    Discussion

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • flashMode Property

    The current flash mode.

    Declaration

    Swift

    var flashMode: AVCaptureFlashMode

    Objective-C

    @property(nonatomic) AVCaptureFlashMode flashMode

    Discussion

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    You can observe changes to the value of this property using key-value observing.

    See AVCaptureFlashMode for possible values.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Returns a Boolean value that indicates whether the given flash mode is supported.

    Declaration

    Swift

    func isFlashModeSupported(_ flashMode: AVCaptureFlashMode) -> Bool

    Objective-C

    - (BOOL)isFlashModeSupported:(AVCaptureFlashMode)flashMode

    Parameters

    flashMode

    A flash mode. See AVCaptureFlashMode for possible values.

    Return Value

    YES if flashMode is supported; otherwise, NO.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.
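
    A short sketch (illustrative only) combining hasFlash, isFlashModeSupported:, and flashMode:

    - (void)enableAutoFlashOnDevice:(AVCaptureDevice *)device
    {
        if ( device.hasFlash && [device isFlashModeSupported:AVCaptureFlashModeAuto] ) {
            if ( [device lockForConfiguration:NULL] ) {
                device.flashMode = AVCaptureFlashModeAuto;
                [device unlockForConfiguration];
            }
        }
    }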

  • Indicates whether the flash is currently active. (read-only)

    Declaration

    Swift

    var flashActive: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isFlashActive) BOOL flashActive

    Discussion

    When the flash is active, it will flash if a still image is captured.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 5.0 and later.

  • Indicates whether the flash is currently available for use. (read-only)

    Declaration

    Swift

    var flashAvailable: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isFlashAvailable) BOOL flashAvailable

    Discussion

    The flash may become unavailable if, for example, the device overheats and needs to cool off.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 5.0 and later.

  • hasTorch Property

    A Boolean value that specifies whether the capture device has a torch. (read-only)

    Declaration

    Swift

    var hasTorch: Bool { get }

    Objective-C

    @property(nonatomic, readonly) BOOL hasTorch

    Discussion

    A torch is a light source, such as an LED flash, that is available on the device and used for illuminating captured content or providing general illumination. This property reflects whether the current device has such illumination hardware built-in.

    Even if the device has a torch, that torch might not be available for use. Thus, you should also check the value of the torchAvailable property before using it.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Indicates whether the torch is currently available for use. (read-only)

    Declaration

    Swift

    var torchAvailable: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isTorchAvailable) BOOL torchAvailable

    Discussion

    The torch may become unavailable if, for example, the device overheats and needs to cool off.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 5.0 and later.

  • A Boolean value indicating whether the device’s torch is currently active. (read-only)

    Declaration

    Swift

    var torchActive: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isTorchActive) BOOL torchActive

    Discussion

    A torch must be present on the device and currently available before it can be active.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 6.0 and later.

  • The current torch brightness level. (read-only)

    Declaration

    Swift

    var torchLevel: Float { get }

    Objective-C

    @property(nonatomic, readonly) float torchLevel

    Discussion

    The value of this property is a floating-point number whose value is in the range 0.0 to 1.0. A torch level of 0.0 indicates that the torch is off. A torch level of 1.0 represents the theoretical maximum value, although the actual maximum value may be lower if the device is currently overheated.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 5.0 and later.

  • Returns a Boolean value that indicates whether the device supports the specified torch mode.

    Declaration

    Swift

    func isTorchModeSupported(_ torchMode: AVCaptureTorchMode) -> Bool

    Objective-C

    - (BOOL)isTorchModeSupported:(AVCaptureTorchMode)torchMode

    Parameters

    torchMode

    The desired torch mode. For a list of possible values, see AVCaptureTorchMode.

    Return Value

    YES if torchMode is supported; otherwise, NO.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

    See Also

    torchMode

  • torchMode Property

    The current torch mode.

    Declaration

    Swift

    var torchMode: AVCaptureTorchMode

    Objective-C

    @property(nonatomic) AVCaptureTorchMode torchMode

    Discussion

    Setting the value of this property also sets the torch level to its maximum current value.

    Before setting the value of this property, call the isTorchModeSupported: method to make sure the device supports the desired mode. Setting the device to an unsupported torch mode results in the raising of an exception. For a list of possible values for this property, see AVCaptureTorchMode.

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Sets the illumination level when in torch mode.

    Declaration

    Swift

    func setTorchModeOnWithLevel(_ torchLevel: Float, error outError: NSErrorPointer) -> Bool

    Objective-C

    - (BOOL)setTorchModeOnWithLevel:(float)torchLevel error:(NSError **)outError

    Parameters

    torchLevel

    The new torch mode level. This value must be a floating-point number between 0.0 and 1.0. To set the torch mode level to the currently available maximum, specify the constant AVCaptureMaxAvailableTorchLevel for this parameter.

    outError

    On input, a pointer to an error object. If an error occurs, this method assigns an error object to the pointer with information about what happened.

    Return Value

    YES if the torch mode level was set or NO if it was not.

    Discussion

    This method sets the torch mode to AVCaptureTorchModeOn and sets the level to the specified value. If the device does not support the AVCaptureTorchModeOn torch mode or if you specify a value for torchLevel that is outside the accepted range, this method raises an exception. If the torch level is within the accepted range but greater than the currently supported maximum—perhaps because the device is overheating—this method simply returns NO.

    Before calling this method, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, calling this method raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 6.0 and later.
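
    A sketch (not from the original reference) that checks torch availability before setting a level; the helper name is illustrative:

    - (void)turnTorchOnAtLevel:(float)level forDevice:(AVCaptureDevice *)device
    {
        if ( !device.hasTorch || !device.torchAvailable ) {
            return;
        }
        if ( [device lockForConfiguration:NULL] ) {
            NSError *error = nil;
            // Pass AVCaptureMaxAvailableTorchLevel to request the brightest level
            // the device can currently sustain.
            if ( ![device setTorchModeOnWithLevel:level error:&error] ) {
                NSLog(@"Could not set torch level: %@", error);
            }
            [device unlockForConfiguration];
        }
    }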

  • A Boolean value that indicates whether the capture device supports boosting images in low light conditions. (read-only)

    Declaration

    Swift

    var lowLightBoostSupported: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isLowLightBoostSupported) BOOL lowLightBoostSupported

    Discussion

    The AVCaptureDevice object’s automaticallyEnablesLowLightBoostWhenAvailable property can only be set if this property is YES.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 6.0 and later.

  • A Boolean value that indicates whether the capture device’s low light boost feature is enabled. (read-only)

    Declaration

    Swift

    var lowLightBoostEnabled: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isLowLightBoostEnabled) BOOL lowLightBoostEnabled

    Discussion

    The value of this property indicates whether the AVCaptureDevice object is currently enhancing images to improve quality due to low light conditions. When this property is YES, the capture device has switched into a special mode in which more light can be perceived in images.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 6.0 and later.

  • A Boolean value that indicates whether the capture device should automatically switch to low light boost mode when necessary.

    Declaration

    Swift

    var automaticallyEnablesLowLightBoostWhenAvailable: Bool

    Objective-C

    @property(nonatomic) BOOL automaticallyEnablesLowLightBoostWhenAvailable

    Discussion

    On an AVCaptureDevice object where lowLightBoostSupported is YES, a special low light boost mode may be engaged to improve image quality. When the automaticallyEnablesLowLightBoostWhenAvailable property is set to YES, the capture device switches at its discretion to a special boost mode under low light. When the scene becomes sufficiently lit, the device switches back to normal operation. An AVCaptureDevice that supports this feature may only engage boost mode for certain source formats or resolutions.

    The default value of this property is NO. Setting this property throws an NSInvalidArgumentException if lowLightBoostSupported is NO. The AVCaptureDevice object must be locked for configuration using lockForConfiguration: before clients can set this property; otherwise, an NSGenericException is thrown.

    Clients may observe changes to the lowLightBoostEnabled property using key-value observing to know when the boost mode engages. The switch between normal operation and low light boost mode may drop one or more video frames.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 6.0 and later.

  • The currently active minimum frame duration.

    Declaration

    Swift

    var activeVideoMinFrameDuration: CMTime

    Objective-C

    @property(nonatomic) CMTime activeVideoMinFrameDuration

    Discussion

    A device’s minimum frame duration is the reciprocal of its maximum frame rate. You can set the value of this property to limit the maximum frame rate during a capture session. The capture device automatically chooses a default minimum frame duration based on its active format. After changing the value of this property, you can return to the default minimum frame duration by setting this property’s value to kCMTimeInvalid. Choosing a new preset for the capture session also resets this property to its default value.

    Attempting to set this property to a value not found in the active format’s videoSupportedFrameRateRanges array raises an exception (NSInvalidArgumentException).

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

    See Also

    activeFormat

  • The currently active maximum frame duration.

    Declaration

    Swift

    var activeVideoMaxFrameDuration: CMTime

    Objective-C

    @property(nonatomic) CMTime activeVideoMaxFrameDuration

    Discussion

    A device’s maximum frame duration is the reciprocal of its minimum frame rate. You can set the value of this property to limit the minimum frame rate during a capture session. The capture device automatically chooses a default maximum frame duration based on its active format. After changing the value of this property, you can return to the default maximum frame duration by setting this property’s value to kCMTimeInvalid. Choosing a new preset for the capture session also resets this property to its default value.

    Attempting to set this property to a value not found in the active format’s videoSupportedFrameRateRanges array raises an exception (NSInvalidArgumentException).

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.

    See Also

    activeFormat
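
    For example, a sketch (illustrative only) that fixes the frame rate by giving both properties the same duration, after verifying the rate against the active format's supported ranges:

    - (void)lockDevice:(AVCaptureDevice *)device toFrameRate:(int32_t)fps
    {
        CMTime frameDuration = CMTimeMake(1, fps);   // e.g. 1/30 second for 30 fps
        for ( AVFrameRateRange *range in device.activeFormat.videoSupportedFrameRateRanges ) {
            if ( fps >= range.minFrameRate && fps <= range.maxFrameRate ) {
                if ( [device lockForConfiguration:NULL] ) {
                    // Equal min and max durations pin the capture to a single frame rate.
                    device.activeVideoMinFrameDuration = frameDuration;
                    device.activeVideoMaxFrameDuration = frameDuration;
                    [device unlockForConfiguration];
                }
                break;
            }
        }
    }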

  • Indicates whether the device should monitor the subject area for changes.

    Declaration

    Swift

    var subjectAreaChangeMonitoringEnabled: Bool

    Objective-C

    @property(nonatomic, getter=isSubjectAreaChangeMonitoringEnabled) BOOL subjectAreaChangeMonitoringEnabled

    Discussion

    The value of this property indicates whether the receiver should monitor the video subject area for changes, such as lighting changes, substantial movement, and so on. If subject area change monitoring is enabled, the capture device object sends an AVCaptureDeviceSubjectAreaDidChangeNotification whenever it detects a change to the subject area, at which time an interested client may wish to re-focus, adjust exposure, white balance, etc.

    Before changing the value of this property, you must call lockForConfiguration: to acquire exclusive access to the device’s configuration properties. Otherwise, setting the value of this property raises an exception. When you are done configuring the device, call unlockForConfiguration to release the lock and allow other devices to configure the settings.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 5.0 and later.

    See Also

    – focusMode
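
    A sketch (not from the original reference) that enables monitoring and observes the notification; keep the returned observer token so you can remove it later:

    - (void)enableSubjectMonitoringForDevice:(AVCaptureDevice *)device
    {
        if ( [device lockForConfiguration:NULL] ) {
            device.subjectAreaChangeMonitoringEnabled = YES;
            [device unlockForConfiguration];
        }
        // Retain the returned token (for example, in an instance variable) and pass it
        // to removeObserver: when monitoring is no longer needed.
        id token = [[NSNotificationCenter defaultCenter]
            addObserverForName:AVCaptureDeviceSubjectAreaDidChangeNotification
                        object:device
                         queue:[NSOperationQueue mainQueue]
                    usingBlock:^(NSNotification *note) {
                // The subject area changed; consider resetting focus and exposure here.
            }];
        (void)token;
    }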

  • connected Property

    Indicates whether the device is currently connected. (read-only)

    Declaration

    Swift

    var connected: Bool { get }

    Objective-C

    @property(nonatomic, readonly, getter=isConnected) BOOL connected

    Discussion

    The value of this property indicates whether the device represented by the receiver is connected and available for use as a capture device. Once the value of this property becomes NO for a given instance, it will not become YES again. If the same physical device again becomes available to the system, it will be represented by a new instance of AVCaptureDevice.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • position Property

    Indicates the physical position of the device hardware on the system. (read-only)

    Declaration

    Swift

    var position: AVCaptureDevicePosition { get }

    Objective-C

    @property(nonatomic, readonly) AVCaptureDevicePosition position

    Discussion

    You can observe changes to the value of this property using key-value observing.

    See AVCaptureDevicePosition for possible values.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Returns a Boolean value that indicates whether the device provides media with a given type.

    Declaration

    Swift

    func hasMediaType(_ mediaType: String!) -> Bool

    Objective-C

    - (BOOL)hasMediaType:(NSString *)mediaType

    Parameters

    mediaType

    A media type, such as AVMediaTypeVideo, AVMediaTypeAudio, or AVMediaTypeMuxed. For a complete list of supported media type constants, see AV Foundation Constants Reference.

    Return Value

    YES if the device provides media of type mediaType; otherwise, NO.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • modelID Property

    The model ID of the device. (read-only)

    Declaration

    Swift

    var modelID: String! { get }

    Objective-C

    @property(nonatomic, readonly) NSString *modelID

    Discussion

    The value of this property is an identifier unique to all devices of the same model. The value is persistent across device connections and disconnections, and across different systems. For example, the model ID of the camera built in to two identical iPhone models will be the same even though they are different physical devices.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • A localized human-readable name for the receiver. (read-only)

    Declaration

    Swift

    var localizedName: String! { get }

    Objective-C

    @property(nonatomic, readonly) NSString *localizedName

    Discussion

    You can use this property to display the name of a capture device in a user interface.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • uniqueID Property

    An ID unique to the device represented by the receiver. (read-only)

    Declaration

    Swift

    var uniqueID: String! { get }

    Objective-C

    @property(nonatomic, readonly) NSString *uniqueID

    Discussion

    Every available capture device has a unique ID that persists on one system across device connections and disconnections, application restarts, and reboots of the system itself. You can store the value returned by this property to recall or track the status of a specific device in the future.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Returns a Boolean value that indicates whether the receiver can be used in a capture session configured with the given preset.

    Declaration

    Swift

    func supportsAVCaptureSessionPreset(_ preset: String!) -> Bool

    Objective-C

    - (BOOL)supportsAVCaptureSessionPreset:(NSString *)preset

    Parameters

    preset

    A capture session preset.

    Return Value

    YES if the receiver can be used with preset; otherwise, NO.

    Discussion

    An AVCaptureSession instance can be associated with a preset that configures its inputs and outputs to fulfill common use cases. You can use this method to determine if the receiver can be used in a capture session with the given preset. For a list of preset constants, see AVCaptureSession Class Reference.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.
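
    For example, a sketch (illustrative only) that filters the video devices down to those usable with a 720p preset:

    - (NSArray *)videoDevicesSupporting720p
    {
        NSMutableArray *matches = [NSMutableArray array];
        for ( AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] ) {
            if ( [device supportsAVCaptureSessionPreset:AVCaptureSessionPreset1280x720] ) {
                [matches addObject:device];
            }
        }
        return matches;
    }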

  • The size of the lens diaphragm. (read-only)

    Declaration

    Swift

    var lensAperture: Float { get }

    Objective-C

    @property(nonatomic, readonly) float lensAperture

    Discussion

    The value of this property is a float indicating the size (the f-number) of the lens diaphragm.

    This value does not change.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • Indicates the focus position of the lens. (read-only)

    Declaration

    Swift

    var lensPosition: Float { get }

    Objective-C

    @property(nonatomic, readonly) float lensPosition

    Discussion

    A given lens position value does not correspond to an exact physical distance, nor does it represent a consistent focus distance from device to device.

    The range of possible positions is 0.0 to 1.0, with 0.0 being the shortest distance at which the lens can focus and 1.0 the furthest. Note that 1.0 does not represent focus at infinity. The default value is 1.0.

    The value can be read at any time, regardless of focus mode, but can only be set via setFocusModeLockedWithLensPosition:completionHandler:.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • Locks the lens position at the specified value.

    Declaration

    Swift

    func setFocusModeLockedWithLensPosition(_ lensPosition: Float, completionHandler handler: ((CMTime) -> Void)!)

    Objective-C

    - (void)setFocusModeLockedWithLensPosition:(float)lensPosition completionHandler:(void (^)(CMTime syncTime))handler

    Parameters

    lensPosition

    The lens position. A value of AVCaptureLensPositionCurrent can be used to indicate that the caller does not wish to specify a value for lensPosition.

    handler

    A block that is called when lensPosition has been set to the value specified and focusMode is AVCaptureFocusModeLocked. The block receives a timestamp which matches that of the first buffer to which all settings have been applied.

    Note that the timestamp is synchronized to the device clock, and thus must be converted to the master clock prior to comparison with the timestamps of buffers delivered by an AVCaptureVideoDataOutput instance.

    The client may pass nil for the handler parameter if knowledge of the operation's completion is not required.

    Discussion

    This method is the only means of setting the lensPosition property.

    This method throws an NSInvalidArgumentException exception if lensPosition is set to an unsupported level. An NSGenericException exception is thrown if the method is called without first obtaining exclusive access to the receiver using lockForConfiguration:.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.
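
    A sketch (not from the original reference) that locks focus at a caller-supplied lens position; the helper name is illustrative:

    - (void)lockLensPosition:(float)position onDevice:(AVCaptureDevice *)device
    {
        if ( ![device isFocusModeSupported:AVCaptureFocusModeLocked] ) {
            return;
        }
        if ( [device lockForConfiguration:NULL] ) {
            // Pass AVCaptureLensPositionCurrent instead of position to lock focus
            // without moving the lens.
            [device setFocusModeLockedWithLensPosition:position
                                     completionHandler:^(CMTime syncTime) {
                // syncTime identifies the first frame captured with the new lens position.
            }];
            [device unlockForConfiguration];
        }
    }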

  • The length of time over which exposure takes place. (read-only)

    Declaration

    Swift

    var exposureDuration: CMTime { get }

    Objective-C

    @property(nonatomic, readonly) CMTime exposureDuration

    Discussion

    Only exposure duration values between minExposureDuration and maxExposureDuration are supported.

    The exposure duration can be read at any time, regardless of exposure mode, but can only be set by the setExposureModeCustomWithDuration:ISO:completionHandler: method.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • Locks the exposure duration and ISO at the specified values.

    Declaration

    Swift

    func setExposureModeCustomWithDuration(_ duration: CMTime, ISO ISO: Float, completionHandler handler: ((CMTime) -> Void)!)

    Objective-C

    - (void)setExposureModeCustomWithDuration:(CMTime)duration ISO:(float)ISO completionHandler:(void (^)(CMTime syncTime))handler

    Parameters

    duration

    The exposure duration.

    A value of AVCaptureExposureDurationCurrent can be used to indicate that the caller does not wish to specify a value for exposureDuration. Note that changes to this property may result in changes to activeVideoMinFrameDuration and/or activeVideoMaxFrameDuration.

    ISO

    The exposure ISO value.

    A value of AVCaptureISOCurrent can be used to indicate that the caller does not wish to specify a value for ISO.

    handler

    A block to be called when both exposureDuration and ISO have been set to the values specified and exposureMode is AVCaptureExposureModeCustom.

    The block receives a timestamp which matches that of the first buffer to which all settings have been applied. The timestamp is synchronized to the device clock, and thus must be converted to the master clock prior to comparison with the timestamps of buffers delivered via an AVCaptureVideoDataOutput instance.

    Pass nil for the handler parameter if knowledge of the operation's completion is not required.

    Discussion

    This method is the only way of setting exposureDuration and ISO.

    This method throws an NSInvalidArgumentException exception if either exposureDuration or ISO is set to an unsupported level. An NSGenericException exception is thrown if this method is invoked without first obtaining exclusive access to the receiver using lockForConfiguration:.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.
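
    A sketch (illustrative only) that clamps the requested duration and ISO to the active format's supported ranges before applying them:

    - (void)setCustomExposureOnDevice:(AVCaptureDevice *)device
                             duration:(CMTime)duration
                                  ISO:(float)ISO
    {
        if ( ![device isExposureModeSupported:AVCaptureExposureModeCustom] ) {
            return;
        }
        AVCaptureDeviceFormat *format = device.activeFormat;
        CMTime clampedDuration = CMTimeMaximum(format.minExposureDuration,
                                               CMTimeMinimum(duration, format.maxExposureDuration));
        float clampedISO = MAX(format.minISO, MIN(ISO, format.maxISO));
        if ( [device lockForConfiguration:NULL] ) {
            [device setExposureModeCustomWithDuration:clampedDuration
                                                  ISO:clampedISO
                                    completionHandler:^(CMTime syncTime) {
                // Both settings are fully applied beginning with the frame at syncTime.
            }];
            [device unlockForConfiguration];
        }
    }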

  • The metered exposure level's offset from the target exposure value, in EV units. (read-only)

    Declaration

    Swift

    var exposureTargetOffset: Float { get }

    Objective-C

    @property(nonatomic, readonly) float exposureTargetOffset

    Discussion

    The value of this property indicates the difference between the metered exposure level of the current scene and the target exposure value.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • Bias applied to the target exposure value, in EV units. (read-only)

    Declaration

    Swift

    var exposureTargetBias: Float { get }

    Objective-C

    @property(nonatomic, readonly) float exposureTargetBias

    Discussion

    When exposureMode is AVCaptureExposureModeAutoExpose or AVCaptureExposureModeLocked, the bias will affect both metering (exposureTargetOffset), and the actual exposure level (exposureDuration and ISO). When the exposure mode is AVCaptureExposureModeCustom, it will only affect metering.

    This property can be read at any time, but can only be set using the setExposureTargetBias:completionHandler: method.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • The maximum supported exposure bias, in EV units. (read-only)

    Declaration

    Swift

    var maxExposureTargetBias: Float { get }

    Objective-C

    @property(nonatomic, readonly) float maxExposureTargetBias

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • The minimum supported exposure bias, in EV units. (read-only)

    Declaration

    Swift

    var minExposureTargetBias: Float { get }

    Objective-C

    @property(nonatomic, readonly) float minExposureTargetBias

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • Sets the bias to be applied to the target exposure value.

    Declaration

    Swift

    func setExposureTargetBias(_ bias: Float, completionHandler handler: ((CMTime) -> Void)!)

    Objective-C

    - (void)setExposureTargetBias:(float)bias completionHandler:(void (^)(CMTime syncTime))handler

    Parameters

    bias

    The bias to be applied to the exposure target value.

    handler

    A block to be called when the exposureTargetBias property has been set to the value specified.

    The block receives a timestamp which matches that of the first buffer to which the setting has been applied. The timestamp is synchronized to the device clock, and thus must be converted to the master clock prior to comparison with the timestamps of buffers delivered via an AVCaptureVideoDataOutput instance.

    The client may pass nil for the handler parameter if knowledge of the operation's completion is not required.

    Discussion

    This method is the only way of setting the exposureTargetBias property.

    This method throws an NSInvalidArgumentException exception if exposureTargetBias is set to an unsupported level. An NSGenericException exception is thrown if this method is invoked without first obtaining exclusive access to the receiver using lockForConfiguration:.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.
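
    For example (a sketch, not from the original reference), keeping the bias within the supported range before applying it:

    - (void)applyExposureBias:(float)bias toDevice:(AVCaptureDevice *)device
    {
        float clampedBias = MAX(device.minExposureTargetBias,
                                MIN(bias, device.maxExposureTargetBias));
        if ( [device lockForConfiguration:NULL] ) {
            [device setExposureTargetBias:clampedBias completionHandler:^(CMTime syncTime) {
                // The bias takes effect starting with the frame at syncTime.
            }];
            [device unlockForConfiguration];
        }
    }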

  • Converts device-specific white balance RGB gain values to device-independent chromaticity values.

    Declaration

    Swift

    func chromaticityValuesForDeviceWhiteBalanceGains(_ whiteBalanceGains: AVCaptureWhiteBalanceGains) -> AVCaptureWhiteBalanceChromaticityValues

    Objective-C

    - (AVCaptureWhiteBalanceChromaticityValues)chromaticityValuesForDeviceWhiteBalanceGains:(AVCaptureWhiteBalanceGains)whiteBalanceGains

    Parameters

    whiteBalanceGains

    The white balance gain values. A value of AVCaptureWhiteBalanceGainsCurrent may not be used.

    Return Value

    A fully populated AVCaptureWhiteBalanceChromaticityValues structure containing device-independent values.

    Discussion

    This method may be called on the receiver to convert device-specific white balance RGB gain values to device-independent chromaticity (little x, little y) values.

    For each channel in the whiteBalanceGains struct, only values between 1.0 and maxWhiteBalanceGain are supported. This method throws an NSInvalidArgumentException exception if any of the whiteBalanceGains fields are set to unsupported values.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.
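
    For example, the following sketch converts the device's current gains to chromaticity values for logging (device is assumed to be a previously obtained AVCaptureDevice; deviceWhiteBalanceGains should already lie within the supported range, so it can be passed directly):

    AVCaptureWhiteBalanceGains gains = device.deviceWhiteBalanceGains;
    AVCaptureWhiteBalanceChromaticityValues chromaticity =
        [device chromaticityValuesForDeviceWhiteBalanceGains:gains];
    // x and y are device-independent chromaticity coordinates.
    NSLog(@"Current white point: x = %.3f, y = %.3f", chromaticity.x, chromaticity.y);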

  • The current device-specific RGB white balance gain values. (read-only)

    Declaration

    Swift

    var deviceWhiteBalanceGains: AVCaptureWhiteBalanceGains { get }

    Objective-C

    @property(nonatomic, readonly) AVCaptureWhiteBalanceGains deviceWhiteBalanceGains

    Discussion

    This property specifies the current red, green, and blue gain values used for white balance. The values can be used to adjust color casts for a given scene. For each channel, only values between 1.0 and maxWhiteBalanceGain are supported.

    The value can be read at any time, regardless of white balance mode, but can only be set using the setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler: method.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • The maximum supported value to which a channel in the AVCaptureWhiteBalanceGains may be set. (read-only)

    Declaration

    Swift

    var maxWhiteBalanceGain: Float { get }

    Objective-C

    @property(nonatomic, readonly) float maxWhiteBalanceGain

    Discussion

    This property does not change for the life of the object.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • Converts device-independent chromaticity values to device-specific white balance RGB gain values.

    Declaration

    Swift

    func deviceWhiteBalanceGainsForChromaticityValues(_ chromaticityValues: AVCaptureWhiteBalanceChromaticityValues) -> AVCaptureWhiteBalanceGains

    Objective-C

    - (AVCaptureWhiteBalanceGains)deviceWhiteBalanceGainsForChromaticityValues:(AVCaptureWhiteBalanceChromaticityValues)chromaticityValues

    Parameters

    chromaticityValues

    The chromaticity values.

    Return Value

    A fully populated AVCaptureWhiteBalanceGains structure containing device-specific RGB gain values.

    Discussion

    This method may be called on the receiver to convert device-independent chromaticity values to device-specific RGB white balance gain values.

    This method throws an NSInvalidArgumentException exception if any of the chromaticityValues fields are set outside the range [0,1].

    Some chromaticityValues field combinations yield out-of-range device RGB values that will cause an exception to be thrown if passed directly to setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler:. Be sure to check that red, green, and blue gain values are within the range of [1.0 - maxWhiteBalanceGain].

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • Converts device-independent temperature and tint values to device-specific white balance RGB gain values.

    Declaration

    Swift

    func deviceWhiteBalanceGainsForTemperatureAndTintValues(_ tempAndTintValues: AVCaptureWhiteBalanceTemperatureAndTintValues) -> AVCaptureWhiteBalanceGains

    Objective-C

    - (AVCaptureWhiteBalanceGains)deviceWhiteBalanceGainsForTemperatureAndTintValues:(AVCaptureWhiteBalanceTemperatureAndTintValues)tempAndTintValues

    Parameters

    tempAndTintValues

    An AVCaptureWhiteBalanceTemperatureAndTintValues struct containing the temperature and tint values.

    Return Value

    A fully populated AVCaptureWhiteBalanceGains struct containing device-specific RGB gain values.

    Discussion

    This method is invoked to convert device-independent temperature and tint values to device-specific RGB white balance gain values.

    You may pass any temperature and tint values and corresponding white balance gains will be produced. Note though that some temperature and tint combinations yield out-of-range device RGB values that will cause an exception to be thrown if passed directly to setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler:. Be sure to check that the red, green, and blue gain values are within the range of [1.0 - maxWhiteBalanceGain].

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.
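
    As a sketch of one common use (the helper name and the clamping step are illustrative), a client might convert a chosen temperature and tint to gains, clamp each channel, and lock white balance:

    // Hypothetical helper: apply a temperature/tint pair as a locked white balance.
    - (void)setWhiteBalanceTemperature:(float)temperature tint:(float)tint onDevice:(AVCaptureDevice *)device
    {
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTint = {
            .temperature = temperature,
            .tint = tint,
        };
        AVCaptureWhiteBalanceGains gains =
            [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTint];

        // Some temperature/tint combinations produce out-of-range gains; clamp each channel
        // to [1.0, maxWhiteBalanceGain] before locking white balance.
        float maxGain = device.maxWhiteBalanceGain;
        gains.redGain   = MAX(1.0f, MIN(gains.redGain,   maxGain));
        gains.greenGain = MAX(1.0f, MIN(gains.greenGain, maxGain));
        gains.blueGain  = MAX(1.0f, MIN(gains.blueGain,  maxGain));

        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:gains completionHandler:nil];
            [device unlockForConfiguration];
        }
    }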

  • The current device-specific gray world RGB white balance gain values. (read-only)

    Declaration

    Swift

    var grayWorldDeviceWhiteBalanceGains: AVCaptureWhiteBalanceGains { get }

    Objective-C

    @property(nonatomic, readonly) AVCaptureWhiteBalanceGains grayWorldDeviceWhiteBalanceGains

    Discussion

    This property specifies the current red, green, and blue gain values derived from the current scene to deliver a neutral (or Gray world) white point for white balance.

    Gray world values assume a neutral subject (e.g. a gray card) has been placed in the middle of the subject area and fills the center 50% of the frame. Clients can read these values and apply them to the device using setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler:.

    For each channel, only values between 1.0 and maxWhiteBalanceGain are supported. The value can be read at any time, regardless of white balance mode.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • Sets white balance to locked mode with the specified deviceWhiteBalanceGains values.

    Declaration

    Swift

    func setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains(_ whiteBalanceGains: AVCaptureWhiteBalanceGains, completionHandler handler: ((CMTime) -> Void)!)

    Objective-C

    - (void)setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:(AVCaptureWhiteBalanceGains)whiteBalanceGains completionHandler:(void (^)(CMTime syncTime))handler

    Parameters

    whiteBalanceGains

    The white balance gains.

    A value of AVCaptureWhiteBalanceGainsCurrent can be used to indicate that the caller does not wish to specify a value for deviceWhiteBalanceGains.

    handler

    A block to be called when white balance gains have been set to the values specified and whiteBalanceMode property is AVCaptureWhiteBalanceModeLocked. The block receives a timestamp which matches that of the first buffer to which all settings have been applied. The timestamp is synchronized to the device clock, and thus must be converted to the master clock prior to comparison with the timestamps of buffers delivered via an AVCaptureVideoDataOutput instance.

    This parameter may be nil if synchronization is not required.

    Discussion

    For each channel in the whiteBalanceGains struct, only values between 1.0 and maxWhiteBalanceGain are supported. Gain values are normalized to the minimum channel value to avoid brightness changes (for example, R:2 G:2 B:4 will be normalized to R:1 G:1 B:2).

    This method throws an NSInvalidArgumentException exception if any of the whiteBalanceGains fields are set to an unsupported level. An NSGenericException exception is thrown if this method is invoked without first obtaining exclusive access to the receiver using lockForConfiguration:.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.
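
    A minimal sketch of a "gray card" style white balance lock, using the grayWorldDeviceWhiteBalanceGains property described above (the helper name is illustrative):

    // Hypothetical helper: lock white balance to the current gray world gains.
    - (void)lockWhiteBalanceToGrayWorldForDevice:(AVCaptureDevice *)device
    {
        if (![device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
            return;
        }
        AVCaptureWhiteBalanceGains grayWorldGains = device.grayWorldDeviceWhiteBalanceGains;

        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:grayWorldGains
                                                       completionHandler:^(CMTime syncTime) {
                // White balance is locked as of the buffer with this timestamp.
                NSLog(@"White balance locked at %lld/%d", syncTime.value, syncTime.timescale);
            }];
            [device unlockForConfiguration];
        }
    }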

  • Converts device-specific white balance RGB gain values to device-independent temperature and tint values.

    Declaration

    Swift

    func temperatureAndTintValuesForDeviceWhiteBalanceGains(_ whiteBalanceGains: AVCaptureWhiteBalanceGains) -> AVCaptureWhiteBalanceTemperatureAndTintValues

    Objective-C

    - (AVCaptureWhiteBalanceTemperatureAndTintValues)temperatureAndTintValuesForDeviceWhiteBalanceGains:(AVCaptureWhiteBalanceGains)whiteBalanceGains

    Parameters

    whiteBalanceGains

    The white balance gain values.

    A value of AVCaptureWhiteBalanceGainsCurrent may not be used.

    Return Value

    A fully populated AVCaptureWhiteBalanceTemperatureAndTintValues struct containing device-independent values.

    Discussion

    This method is used to convert device-specific white balance RGB gain values to device-independent temperature (in kelvin) and tint values.

    For each channel in the whiteBalanceGains struct, only values between 1.0 and maxWhiteBalanceGain are supported.

    This method throws an NSInvalidArgumentException exception if any of the whiteBalanceGains struct fields are set to unsupported values.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • The current exposure ISO value. (read-only)

    Declaration

    Swift

    var ISO: Float { get }

    Objective-C

    @property(nonatomic, readonly) float ISO

    Discussion

    This property returns the sensor's sensitivity to light by means of a gain value applied to the signal.

    Only ISO values between minISO and maxISO are supported. Higher values will result in noisier images.

    The property value can be read at any time, regardless of exposure mode, but can only be set using the setExposureModeCustomWithDuration:ISO:completionHandler: method.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.
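
    For example, a sketch that changes only the ISO while leaving the exposure duration at its current value might look like the following; the helper name and the clamping to the active format's minISO and maxISO are illustrative:

    // Hypothetical helper: apply a custom ISO without changing the exposure duration.
    - (void)setISO:(float)requestedISO onDevice:(AVCaptureDevice *)device
    {
        if (![device isExposureModeSupported:AVCaptureExposureModeCustom]) {
            return;
        }
        // ISO must lie between the active format's minISO and maxISO.
        float iso = MAX(device.activeFormat.minISO, MIN(requestedISO, device.activeFormat.maxISO));

        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            [device setExposureModeCustomWithDuration:AVCaptureExposureDurationCurrent
                                                  ISO:iso
                                    completionHandler:nil];
            [device unlockForConfiguration];
        }
    }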

  • A Boolean value that indicates whether the capture device is allowed to turn high dynamic range streaming on or off.

    Declaration

    Swift

    var automaticallyAdjustsVideoHDREnabled: Bool

    Objective-C

    @property(nonatomic) BOOL automaticallyAdjustsVideoHDREnabled

    Discussion

    The default value is YES. By default, AVCaptureDevice always sets videoHDREnabled to NO when a client sets a new format using the activeFormat property.

    When the client uses the AVCaptureSession property sessionPreset instead, AVCaptureDevice turns video HDR on automatically if it's a good fit for the preset. An NSGenericException exception is thrown if this property is set without first obtaining exclusive access to the receiver using lockForConfiguration:. Clients can use key-value observing of the videoHDREnabled property to know when the receiver has automatically changed the value.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.

  • A Boolean value that indicates whether high dynamic range video streaming is enabled.

    Declaration

    Swift

    var videoHDREnabled: Bool

    Objective-C

    @property(nonatomic, getter=isVideoHDREnabled) BOOL videoHDREnabled

    Discussion

    The value of this property indicates whether the device streams high dynamic range video buffers. Setting this property throws an NSInvalidArgumentException exception if the automaticallyAdjustsVideoHDREnabled property is set to YES. An NSGenericException exception is thrown if this property is set without first obtaining exclusive access to the receiver using lockForConfiguration:.

    You can observe changes to the value of this property using key-value observing.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 8.0 and later.
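
    A minimal sketch of taking manual control of video HDR, assuming the active format reports videoHDRSupported (the helper name is illustrative):

    // Hypothetical helper: disable automatic HDR adjustment and enable HDR streaming directly.
    - (void)enableVideoHDRManuallyOnDevice:(AVCaptureDevice *)device
    {
        if (!device.activeFormat.videoHDRSupported) {
            return; // The current format cannot stream HDR video.
        }
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            // Turn off automatic adjustment first so the manual setting is not overridden.
            device.automaticallyAdjustsVideoHDREnabled = NO;
            device.videoHDREnabled = YES;
            [device unlockForConfiguration];
        }
    }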

  • Constants to specify the position of a capture device.

    Declaration

    Swift

    enum AVCaptureDevicePosition : Int {
        case Unspecified
        case Back
        case Front
    }

    Objective-C

    enum {
        AVCaptureDevicePositionUnspecified = 0,
        AVCaptureDevicePositionBack        = 1,
        AVCaptureDevicePositionFront       = 2
    };
    typedef NSInteger AVCaptureDevicePosition;

    Constants

    • Unspecified

      AVCaptureDevicePositionUnspecified

      The capture device’s position relative to the system hardware is unspecified.

      Available in iOS 4.0 and later.

    • Back

      AVCaptureDevicePositionBack

      The capture device is on the back of the unit.

      Available in iOS 4.0 and later.

    • Front

      AVCaptureDevicePositionFront

      The capture device is on the front of the unit.

      Available in iOS 4.0 and later.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.
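
    For illustration, a short sketch that locates the front-facing camera by its position:

    AVCaptureDevice *frontCamera = nil;
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionFront) {
            frontCamera = device;
            break;
        }
    }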

  • Constants to specify the flash mode of a capture device.

    Declaration

    Swift

    enum AVCaptureFlashMode : Int {
        case Off
        case On
        case Auto
    }

    Objective-C

    typedef enum : NSInteger {
        AVCaptureFlashModeOff  = 0,
        AVCaptureFlashModeOn   = 1,
        AVCaptureFlashModeAuto = 2
    } AVCaptureFlashMode;

    Constants

    • Off

      AVCaptureFlashModeOff

      The capture device flash is always off.

      Available in iOS 4.0 and later.

    • On

      AVCaptureFlashModeOn

      The capture device flash is always on.

      Available in iOS 4.0 and later.

    • Auto

      AVCaptureFlashModeAuto

      The capture device continuously monitors light levels and uses the flash when necessary.

      Available in iOS 4.0 and later.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.
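
    As a sketch, automatic flash can be enabled only after confirming that the device has a flash and supports the mode (device is assumed to be a previously obtained AVCaptureDevice):

    if (device.hasFlash && [device isFlashModeSupported:AVCaptureFlashModeAuto]) {
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            device.flashMode = AVCaptureFlashModeAuto;
            [device unlockForConfiguration];
        }
    }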

  • Constants to specify the capture device’s torch mode.

    Declaration

    Swift

    enum AVCaptureTorchMode : Int {
        case Off
        case On
        case Auto
    }

    Objective-C

    typedef enum : NSInteger {
        AVCaptureTorchModeOff  = 0,
        AVCaptureTorchModeOn   = 1,
        AVCaptureTorchModeAuto = 2
    } AVCaptureTorchMode;

    Constants

    • Off

      AVCaptureTorchModeOff

      The capture device torch is always off.

      Available in iOS 4.0 and later.

    • On

      AVCaptureTorchModeOn

      The capture device torch is always on.

      Available in iOS 4.0 and later.

    • Auto

      AVCaptureTorchModeAuto

      The capture device continuously monitors light levels and uses the torch when necessary.

      Available in iOS 4.0 and later.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • The maximum torch level.

    Declaration

    Swift

    let AVCaptureMaxAvailableTorchLevel: Float

    Objective-C

    const float AVCaptureMaxAvailableTorchLevel;

    Constants

    • AVCaptureMaxAvailableTorchLevel

      AVCaptureMaxAvailableTorchLevel

      This constant always represents the maximum available torch level, independent of the actual maximum value currently supported by the device. Thus, pass this constant to the setTorchModeOnWithLevel:error: method in situations where you want to specify the maximum torch level without having to worry about whether the device is overheating and might not accept a value of 1.0 as the maximum.

      Available in iOS 6.0 and later.
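
    For example, a sketch that turns the torch on at the highest level the device will currently accept (device is assumed to be a previously obtained AVCaptureDevice):

    if (device.hasTorch && [device isTorchModeSupported:AVCaptureTorchModeOn]) {
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            // Passing AVCaptureMaxAvailableTorchLevel avoids guessing whether 1.0 is currently acceptable.
            if (![device setTorchModeOnWithLevel:AVCaptureMaxAvailableTorchLevel error:&error]) {
                NSLog(@"Could not turn the torch on: %@", error);
            }
            [device unlockForConfiguration];
        }
    }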

  • Constants to specify the focus mode of a capture device.

    Declaration

    Swift

    enum AVCaptureFocusMode : Int {
        case Locked
        case AutoFocus
        case ContinuousAutoFocus
    }

    Objective-C

    typedef enum : NSInteger {
        AVCaptureFocusModeLocked              = 0,
        AVCaptureFocusModeAutoFocus           = 1,
        AVCaptureFocusModeContinuousAutoFocus = 2,
    } AVCaptureFocusMode;

    Constants

    • Locked

      AVCaptureFocusModeLocked

      The focus is locked.

      Available in iOS 4.0 and later.

    • AutoFocus

      AVCaptureFocusModeAutoFocus

      The capture device performs an autofocus operation now.

      Available in iOS 4.0 and later.

    • ContinuousAutoFocus

      AVCaptureFocusModeContinuousAutoFocus

      The capture device continuously monitors focus and auto focuses when necessary.

      Available in iOS 4.0 and later.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.
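
    As a sketch, continuous autofocus centered on a point of interest could be configured as follows (device is assumed to be a previously obtained AVCaptureDevice, and the center point is an illustrative choice):

    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            if (device.focusPointOfInterestSupported) {
                // focusPointOfInterest uses a normalized coordinate space from (0,0) to (1,1).
                device.focusPointOfInterest = CGPointMake(0.5, 0.5);
            }
            device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
            [device unlockForConfiguration];
        }
    }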

  • Constants to specify the exposure mode of a capture device.

    Declaration

    Swift

    enum AVCaptureExposureMode : Int {
        case Locked
        case AutoExpose
        case ContinuousAutoExposure
        case Custom
    }

    Objective-C

    typedef enum : NSInteger {
        AVCaptureExposureModeLocked                 = 0,
        AVCaptureExposureModeAutoExpose             = 1,
        AVCaptureExposureModeContinuousAutoExposure = 2,
        AVCaptureExposureModeCustom                 = 3,
    } AVCaptureExposureMode;

    Constants

    • Locked

      AVCaptureExposureModeLocked

      The exposure setting is locked.

      Available in iOS 4.0 and later.

    • AutoExpose

      AVCaptureExposureModeAutoExpose

      The device automatically adjusts the exposure once and then changes the exposure mode to AVCaptureExposureModeLocked.

      Available in iOS 4.0 and later.

    • ContinuousAutoExposure

      AVCaptureExposureModeContinuousAutoExposure

      The device continuously monitors exposure levels and auto exposes when necessary.

      Available in iOS 4.0 and later.

    • Custom

      AVCaptureExposureModeCustom

      The device should only adjust exposure according to user provided ISO and exposureDuration property values.

      Available in iOS 8.0 and later.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Constants to specify the white balance mode of a capture device.

    Declaration

    Swift

    enum AVCaptureWhiteBalanceMode : Int {
        case Locked
        case AutoWhiteBalance
        case ContinuousAutoWhiteBalance
    }

    Objective-C

    typedef enum : NSInteger {
        AVCaptureWhiteBalanceModeLocked                     = 0,
        AVCaptureWhiteBalanceModeAutoWhiteBalance           = 1,
        AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance = 2,
    } AVCaptureWhiteBalanceMode;

    Constants

    • Locked

      AVCaptureWhiteBalanceModeLocked

      The white balance setting is locked.

      Available in iOS 4.0 and later.

    • AutoWhiteBalance

      AVCaptureWhiteBalanceModeAutoWhiteBalance

      The device performs an auto white balance operation now.

      Available in iOS 4.0 and later.

    • ContinuousAutoWhiteBalance

      AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance

      The device continuously monitors white balance and adjusts when necessary.

      Available in iOS 4.0 and later.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 4.0 and later.

  • Constants that provide information regarding permission to use media capture devices.

    Declaration

    Swift

    enum AVAuthorizationStatus : Int {
        case NotDetermined
        case Restricted
        case Denied
        case Authorized
    }

    Objective-C

    typedef enum : NSInteger {
        AVAuthorizationStatusNotDetermined = 0,
        AVAuthorizationStatusRestricted,
        AVAuthorizationStatusDenied,
        AVAuthorizationStatusAuthorized
    } AVAuthorizationStatus;

    Constants

    • NotDetermined

      AVAuthorizationStatusNotDetermined

      Explicit user permission is required for media capture, but the user has not yet granted or denied such permission.

      Call requestAccessForMediaType:completionHandler: to prompt the user for permission, or create an AVCaptureDeviceInput for the media type and the user will be automatically prompted.

      Available in iOS 7.0 and later.

    • Restricted

      AVAuthorizationStatusRestricted

      The user is not allowed to access media capture devices.

      This status is normally not visible—the AVCaptureDevice class methods for discovering devices do not return devices the user is restricted from accessing.

      Available in iOS 7.0 and later.

    • Denied

      AVAuthorizationStatusDenied

      The user has explicitly denied permission for media capture.

      Available in iOS 7.0 and later.

    • Authorized

      AVAuthorizationStatusAuthorized

      The user has explicitly granted permission for media capture, or explicit user permission is not necessary for the media type in question.

      Available in iOS 7.0 and later.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.
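
    A minimal sketch of checking the status before configuring capture, and requesting access when it has not yet been determined:

    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    switch (status) {
        case AVAuthorizationStatusAuthorized:
            // Proceed with capture setup.
            break;
        case AVAuthorizationStatusNotDetermined:
            [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                                     completionHandler:^(BOOL granted) {
                // Called on an arbitrary queue; dispatch to the main queue before updating UI.
                NSLog(@"Camera access %@", granted ? @"granted" : @"denied");
            }];
            break;
        case AVAuthorizationStatusDenied:
        case AVAuthorizationStatusRestricted:
            // Capture is not possible; explain the situation to the user.
            break;
    }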

  • Constants to specify the autofocus range of a capture device.

    Declaration

    Swift

    enum AVCaptureAutoFocusRangeRestriction : Int {
        case None
        case Near
        case Far
    }

    Objective-C

    enum {
        AVCaptureAutoFocusRangeRestrictionNone = 0,
        AVCaptureAutoFocusRangeRestrictionNear = 1,
        AVCaptureAutoFocusRangeRestrictionFar  = 2,
    };
    typedef NSInteger AVCaptureAutoFocusRangeRestriction;

    Constants

    • None

      AVCaptureAutoFocusRangeRestrictionNone

      The device attempts to focus on objects at any range.

      This value is the default, and the only value allowed on devices that do not support focus range restriction.

      Available in iOS 7.0 and later.

    • Near

      AVCaptureAutoFocusRangeRestrictionNear

      The device primarily attempts to focus on subjects near the camera.

      This value is recommended for applications that use AVCaptureMetadataOutput to recognize machine-readable codes.

      Available in iOS 7.0 and later.

    • Far

      AVCaptureAutoFocusRangeRestrictionFar

      The device primarily attempts to focus on subjects far away from the camera.

      Available in iOS 7.0 and later.

    Discussion

    If you expect to focus primarily on near or far objects, you can use the autoFocusRangeRestriction property to provide a hint to the focusing system. This approach makes autofocus faster, more power efficient, and less error prone. A restriction prioritizes focusing at distances in the specified range, but does not prevent focusing elsewhere if the device finds no focus point within that range.

    Import Statement

    Objective-C

    @import AVFoundation;

    Swift

    import AVFoundation

    Availability

    Available in iOS 7.0 and later.
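
    For example, a machine-readable-code scanner might restrict autofocus to the near range when the device supports it (device is assumed to be a previously obtained AVCaptureDevice):

    if (device.autoFocusRangeRestrictionSupported) {
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            // Hint that subjects of interest (for example, barcodes) are near the camera.
            device.autoFocusRangeRestriction = AVCaptureAutoFocusRangeRestrictionNear;
            [device unlockForConfiguration];
        }
    }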

  • Indicates that no value should be specified for the deviceWhiteBalanceGains parameter of setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler:.

    Declaration

    Swift

    let AVCaptureWhiteBalanceGainsCurrent: AVCaptureWhiteBalanceGains

    Objective-C

    const AVCaptureWhiteBalanceGains AVCaptureWhiteBalanceGainsCurrent;

    Constants

    • AVCaptureWhiteBalanceGainsCurrent

      AVCaptureWhiteBalanceGainsCurrent

      This value may be passed as the whiteBalanceGains parameter of setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler: to indicate that the caller does not wish to specify new gain values, and that the gains should remain at their current values.

      Available in iOS 8.0 and later.

  • Indicates that no value should be specified for the lensPosition parameter of setFocusModeLockedWithLensPosition:completionHandler:.

    Declaration

    Swift

    let AVCaptureLensPositionCurrent: Float

    Objective-C

    const float AVCaptureLensPositionCurrent;

    Constants

    • AVCaptureLensPositionCurrent

      AVCaptureLensPositionCurrent

      This value may be passed as the lensPosition parameter of setFocusModeLockedWithLensPosition:completionHandler: to indicate that no value should be set for the lensPosition property, and that it should instead be set to its current value.

      Available in iOS 8.0 and later.

    Discussion

    Note that the device may be adjusting lensPosition at the time of the invocation of setFocusModeLockedWithLensPosition:completionHandler:, in which case the value at which lensPosition is locked may differ from the value obtained by querying the lensPosition property.

  • Indicates that no value should be specified for the ISO parameter of setExposureModeCustomWithDuration:ISO:completionHandler:.

    Declaration

    Swift

    let AVCaptureISOCurrent: Float

    Objective-C

    const float AVCaptureISOCurrent;

    Constants

    • AVCaptureISOCurrent

      AVCaptureISOCurrent

      This value indicates that the caller does not wish to specify a value for the ISO property, and that it should instead be set to its current value.

      Available in iOS 8.0 and later.

    Discussion

    Note that the device may be adjusting ISO at the time of the call, in which case the value to which ISO is set may differ from the value obtained by querying the ISO property.

  • Indicates that no value should be specified for the exposureTargetBias parameter of setExposureTargetBias:completionHandler:.

    Declaration

    Swift

    let AVCaptureExposureTargetBiasCurrent: Float

    Objective-C

    const float AVCaptureExposureTargetBiasCurrent;

    Constants

    • AVCaptureExposureTargetBiasCurrent

      AVCaptureExposureTargetBiasCurrent

      This value indicates that the caller does not wish to specify a value for the exposureTargetBias property, and that it should instead be set to its current value.

      Available in iOS 8.0 and later.

  • Indicates that no value should be specified for the duration parameter of setExposureModeCustomWithDuration:ISO:completionHandler:.

    Declaration

    Swift

    let AVCaptureExposureDurationCurrent: CMTime

    Objective-C

    const CMTime AVCaptureExposureDurationCurrent;

    Constants

    • AVCaptureExposureDurationCurrent

      AVCaptureExposureDurationCurrent

      This value indicates that the caller does not wish to specify a value for the exposureDuration property, and that it should instead be set to its current value.

      Available in iOS 8.0 and later.

    Discussion

    Note that the device may be adjusting the exposure duration at the time of the call, in which case the value to which exposureDuration is set may differ from the value obtained by querying the exposureDuration property.
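
    As a final sketch, the ...Current constants let a client change one custom-exposure parameter while leaving the other untouched; here only the duration is changed and the ISO is left alone (device is assumed to be a previously obtained AVCaptureDevice, and the 1/60 second value is illustrative and must lie between the active format's minExposureDuration and maxExposureDuration):

    if ([device isExposureModeSupported:AVCaptureExposureModeCustom]) {
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            CMTime duration = CMTimeMake(1, 60); // 1/60 second, for illustration
            [device setExposureModeCustomWithDuration:duration
                                                  ISO:AVCaptureISOCurrent
                                    completionHandler:nil];
            [device unlockForConfiguration];
        }
    }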