I prefer to use the album fetched from the library instead of the catalog, since this is faster. If I do so, how can I check whether all tracks of an album have been added to the library? If they aren't all there, I'd like to fetch the catalog version or throw an error (for example when offline).
Using .with(.tracks) on the library album fetches the tracks added to the library.
The trackCount property refers to the tracks that can be fetched from the library.
The isComplete property is always nil when fetching from the library.
One possible approach is checking the trackNumber and discCount properties. However, this only detects that not all tracks of an album are in the library when a song that was not added comes ahead of one that was; tracks missing after the last added one go unnoticed. I'd like to be able to handle this edge case as well (the sketch below illustrates the limitation).
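A minimal sketch of that heuristic, assuming a single-disc album and that the library album's tracks and trackNumber values behave as described above:

import MusicKit

// Hedged sketch of the gap-check heuristic described above.
// Assumes a single-disc album; multi-disc albums would also need disc numbers.
func libraryAlbumLooksIncomplete(_ album: Album) async throws -> Bool {
    let detailed = try await album.with(.tracks)
    guard let tracks = detailed.tracks, !tracks.isEmpty else { return true }

    // Sort the library track numbers and look for gaps.
    let numbers = tracks.compactMap { $0.trackNumber }.sorted()
    for (index, number) in numbers.enumerated() where number != index + 1 {
        return true // a track ahead of this one is missing from the library
    }

    // Tracks missing *after* the highest library track number are not
    // detectable this way, which is exactly the remaining edge case.
    return false
}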
Is there currently a way to do this? I'd prefer not to rely on the Apple Music catalog for this, since it is supposed to work offline as well. Fetching and storing all track IDs while connected and later comparing against them would work, but this could mean storing tens of thousands of track IDs.
Thank you
It seems that 5G Network Slicing is currently only available for URLRequest / URLSession / low-level networking and not for AVPlayer. Is it possible to have the app use the default slice when it starts and only enable Network Slicing when streaming video via AVPlayer?
Issue Description
I'm implementing a system audio capture feature using AudioHardwareCreateProcessTap and AudioHardwareCreateAggregateDevice. The app successfully creates the tap and aggregate device, but when starting the IO procedure with AudioDeviceStart, it sometimes fails with OSStatus error 1852797029. (The operation couldn’t be completed. (OSStatus error 1852797029.)) The error occurs inconsistently, which makes it particularly difficult to debug and reproduce.
Questions
Has anyone encountered this intermittent "nope" error code (0x6e6f7065) when working with system audio capture?
Are there specific conditions or system states that might trigger this error sporadically?
Are there any known workarounds for handling this intermittent failure case?
Any insights or guidance would be greatly appreciated.
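For reference, 1852797029 is a four-character code; a small self-contained sketch for decoding such OSStatus values:

import Foundation

// Decode a FourCC-style OSStatus such as 1852797029 into its character form.
func fourCharCode(from status: OSStatus) -> String {
    let value = UInt32(bitPattern: status)
    let bytes = [24, 16, 8, 0].map { UInt8((value >> $0) & 0xFF) }
    return String(bytes: bytes, encoding: .ascii) ?? "\(status)"
}

// fourCharCode(from: 1852797029) == "nope"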
I have added some custom views to my PiP (Picture in Picture) window. These controls disappear after opening the camera when building with Xcode 16 on an iOS 18 system; I found that the custom views are not removed but seem to be obscured. They were displayed normally when building with Xcode 15.4. I would like to ask how to make my custom views display normally.
Hi there,
Can anyone tell me how to possibly get approved as an Apple News Publisher in 2025?
We attempted in 2024, but received this message from Apple support:
"Thank you for your interest in Apple News. At this time, we're not accepting new applications."
When I inquired further, I got this second response:
"Apple News is no longer accepting unsolicited applications. To learn more about Apple News requirements, visit the Apple News support page. If you have any feedback, please use this form to send us your comments. Keep in mind that while we read all feedback, we are unable to respond to each submission individually."
My questions are:
Is this still the case? (Especially when you are a legit local news outlet)
Is there a link to apply as a news publisher? I don't seem to have that option at all.
Thanks for any feedback.
I'm trying to load MusicKit on the server with SolidJS. I can confirm that my implementation is sufficient to return authentication tokens and for MusicKit.isAuthorized to return true. My issue is that if I reload the page, it only succeeds intermittently (perhaps 25% of the time?). My question is: what is wrong with my implementation? Removing the async keyword ensures it loads every time, but playing and queuing music no longer works. I'm currently assuming this is an SSR issue, but the docs haven't explicitly stated whether this is possible.
I have the following boilerplate:
export default createHandler(() => (
  <StartServer
    document={({ assets, children, scripts }) => {
      return (
        <html lang="en">
          <head>
            <meta name="apple-music-developer-token" content={authResult.token} />
            <meta name="apple-music-app-name" content="app name" />
            <meta name="apple-music-app-build" content="1978.4.1" />
            {assets}
            <script
              src="https://js-cdn.music.apple.com/musickit/v3/musickit.js"
              async
            />
          </head>
          <body>
            <div id="app">{children}</div>
            {scripts}
          </body>
        </html>
      )
    }}
  />
))
When I first load my app, I'll encounter:
musickit.js:13 Uncaught TypeError: Cannot read properties of undefined (reading 'node')
at musickit.js:13:10194
at musickit.js:13:140
at musickit.js:13:209
The intermittency suggests an issue related to the async keyword. An expansion on this issue can be found here.
Hi,
I am trying to enable the default MIDINetworkSession in a Catalyst app on macOS like this:
MIDINetworkSession.default().isEnabled = true
MIDINetworkSession.default().connectionPolicy = .anyone
In the AppSandbox I have both incoming and outgoing network connections enabled. And I also added the NSLocalNetworkUsageDescription key to the info.plist. Bonjour services are also added to the info.plist:
NSBonjourServices
_apple-midi._udp.
Nevertheless, the session stays disabled. Running the same code works just fine on iOS.
Is there any special setup I need to do on macOS to enable the MIDINetworkSession?
Thanks!
How can I use my RGB Curve points:
let redCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.235, y: 0.152), CIVector(x: 0.5, y: 0.5), CIVector(x: 1, y: 1)]
let greenCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.247, y: 0.196), CIVector(x: 0.5, y: 0.5), CIVector(x: 1, y: 1)]
let blueCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.235, y: 0.184), CIVector(x: 0.466, y: 0.466), CIVector(x: 1, y: 1)]
in the colorCurves filter, which I found in the Apple docs:
func colorCurves(inputImage: CIImage) -> CIImage {
    let colorCurvesEffect = CIFilter.colorCurves()
    colorCurvesEffect.inputImage = inputImage
    colorCurvesEffect.curvesDomain = CIVector(x: 0, y: 1)
    colorCurvesEffect.curvesData = Data(
        bytes: [Float32]([
            0.0, 0.0, 0.0,
            0.8, 0.8, 0.8,
            1.0, 1.0, 1.0
        ]), count: 36)
    colorCurvesEffect.colorSpace = CGColorSpaceCreateDeviceRGB()
    return colorCurvesEffect.outputImage!
}
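To feed the control points above into curvesData, one possible approach (a hedged sketch, not the only way) is to resample each channel with linear interpolation and pack the results as interleaved R, G, B floats; the sample count of 32 is an arbitrary choice, not something the filter requires:

import CoreImage
import CoreImage.CIFilterBuiltins

// Hedged sketch: curvesData is a table of interleaved R,G,B values sampled
// evenly across curvesDomain, so the (x, y) control points are resampled here
// with linear interpolation before being packed into Data.
func curvesData(red: [CIVector], green: [CIVector], blue: [CIVector], samples: Int = 32) -> Data {
    // Linear interpolation through (x, y) control points, assumed sorted by x.
    func value(at x: CGFloat, in points: [CIVector]) -> Float {
        guard let upper = points.firstIndex(where: { $0.x >= x }) else {
            return Float(points.last?.y ?? 1)
        }
        guard upper > 0 else { return Float(points[0].y) }
        let p0 = points[upper - 1], p1 = points[upper]
        let dx = p1.x - p0.x
        guard dx > 0 else { return Float(p1.y) }
        let t = (x - p0.x) / dx
        return Float(p0.y + t * (p1.y - p0.y))
    }

    var table = [Float32]()
    for i in 0..<samples {
        let x = CGFloat(i) / CGFloat(samples - 1)
        table.append(value(at: x, in: red))
        table.append(value(at: x, in: green))
        table.append(value(at: x, in: blue))
    }
    return table.withUnsafeBufferPointer { Data(buffer: $0) }
}

The resulting Data would then replace the hard-coded table above, keeping curvesDomain at CIVector(x: 0, y: 1); the byte count is samples * 3 * MemoryLayout<Float32>.size.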
Hi, I'm working on a project that requires video frame PTS to be consistent between the original video and a transcoded one. It works fairly well for regular MP4; however, if I set preferredOutputSegmentInterval to generate fMP4 output, even though I specified initialSegmentStartTime as 0, it always adds a one-frame PTS offset to all frames.
For example, if I use the code sample provided by Apple (https://developer.apple.com/videos/play/wwdc2020/10011/?time=406) and use ffprobe -select_streams v:0 -show_entries packet=pts_time -of csv ~/Downloads/fmp4/prog_index.m3u8 to display the PTS of the output, it doesn't start from 0 but has a one-frame PTS offset. I also tried opening it with MP4Box, which likewise shows that the first frame's DTS and CTS do not start from 0.
However, if I use AVAssetReader to read the same output video and get the PTS of the first frame, it returns 0, so I can't use it to calculate the PTS difference between the two videos either.
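For reference, a minimal sketch of that AVAssetReader check (the file URL is whatever the writer produced):

import AVFoundation

// Hedged sketch of the AVAssetReader check described above
// (track selection and error handling are simplified).
func firstVideoPTS(of url: URL) async throws -> CMTime? {
    let asset = AVURLAsset(url: url)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else { return nil }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)
    reader.startReading()

    // The first sample buffer carries the presentation timestamp in question.
    guard let sample = output.copyNextSampleBuffer() else { return nil }
    return CMSampleBufferGetPresentationTimeStamp(sample)
}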
Can I get some help understanding why there is a difference between the fMP4 PTS seen by AVAssetWriter/AVAssetReader and by tools like ffprobe?
We are encountering an issue where AVPlayer throws the error:
Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" > Underlying Error Domain[(null)]:Code[0]:Desc[(null)]
This error seems to occur intermittently during video playback, especially after extended usage or when switching between different streams. We observe error -11819 (AVFoundationErrorDomain) in the Conviva platform for some of our users, but we haven't been able to reproduce it so far, and we need support to determine the root cause and/or best practices to prevent it.
Some questions we have:
What typically triggers this error?
Could it be related to memory/resource constraints, network instability, or backgrounding?
Are there any recommended ways to handle or recover from this error gracefully?
Any insights or guidance would be greatly appreciated. Thanks!
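For what it's worth while investigating, -11819 corresponds to AVError.mediaServicesWereReset; a hedged sketch of one commonly suggested recovery pattern, where rebuildPlayer() is a placeholder for app-specific teardown and re-creation of the AVPlayer/AVPlayerItem and their observers:

import AVFoundation

// Hedged sketch: rebuild the playback stack when media services are reset.
final class PlaybackRecovery {
    private var observer: NSObjectProtocol?

    func startObserving(rebuildPlayer: @escaping () -> Void) {
        observer = NotificationCenter.default.addObserver(
            forName: AVAudioSession.mediaServicesWereResetNotification,
            object: nil,
            queue: .main
        ) { _ in
            // Tear down and recreate the player, item, and observers here.
            rebuildPlayer()
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}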
Hello,
Is there a way to handle a 403 error returned by the server, e.g. when a token has expired?
I cannot find any information about this, and everything I tried wasn't working (addObserver, NotificationCenter with .AVPlayerItemNewErrorLogEntry, AVPlayerItemPlaybackStalled, ...).
Thank you very much.
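For context, here is the shape of the error-log observation that was attempted (a hedged sketch; whether AVPlayer surfaces the 403 this way appears to depend on the stream, and the token refresh itself is app-specific):

import AVFoundation

// Hedged sketch: inspect the item's error log when a new entry is posted.
func observeErrorLog(for item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewErrorLogEntry,
        object: item,
        queue: .main
    ) { notification in
        guard let item = notification.object as? AVPlayerItem,
              let event = item.errorLog()?.events.last else { return }
        print("Error log entry: status \(event.errorStatusCode), domain \(event.errorDomain)")
        if event.errorStatusCode == 403 {
            // Refresh the token and reload or replace the current item here.
        }
    }
}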
I want to create a Live Photo. The project includes a .jpg image and a .mov video (2 seconds).
Two permission keys have been added in Xcode:
Privacy - Photo Library Usage Description
Privacy - Photo Library Additions Usage Description
Simulator: iPhone 16, iOS 18.3
The code in ContentView.swift:
private func saveLivePhoto(imageURL: URL, videoURL: URL, completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges {
        let creationRequest = PHAssetCreationRequest.forAsset()
        let options = PHAssetResourceCreationOptions()
        options.shouldMoveFile = false
        creationRequest.addResource(with: .photo, fileURL: imageURL, options: options)
        creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: options)
    } completionHandler: { success, error in
        DispatchQueue.main.async {
            print(error)
            completion(success, error)
        }
    }
}
guard let imageURL = Bundle.main.url(forResource: "livephoto", withExtension: "jpeg"),
      let videoURL = Bundle.main.url(forResource: "livephoto", withExtension: "mov") else {
    showAlertMessage(title: "error", message: "cant find Live Photo ")
    return
}
print("imageURL: \(imageURL)")
print("videoURL: \(videoURL)")
saveLivePhoto(imageURL: imageURL, videoURL: videoURL) { success, error in
    if success {
        xxxxx
    } else {
        xxxxx
    }
}
Really need help, thanks
I am creating an app that decodes H.265 elementary streams on iOS.
I use VideoToolbox to decode from H.265 to NV12.
The decoded data is enqueued into an AVSampleBufferDisplayLayer as CMSampleBuffers.
However, nothing is displayed in the VideoPlayerView. It remains black.
The decoding in VideoToolbox is successful. I confirmed this by saving the NV12 data in the CMSampleBuffer to a file and displaying it with a tool.
Why is nothing displayed in the VideoPlayerView?
I can provide other source code as well.
//
//  VideoPlayerView.swift
//  H265Decoder
//
//  Created by Kohshin Tokunaga on 2025/02/15.
//

import SwiftUI
import AVFoundation

struct VideoPlayerView: UIViewRepresentable {

    // Return an H265Player as the coordinator, and start playback there.
    func makeCoordinator() -> H265Player {
        H265Player()
    }

    func makeUIView(context: Context) -> UIView {
        let uiView = UIView(frame: .zero)

        // Base layer for attaching sublayers
        uiView.backgroundColor = .black  // Screen background color (for iOS)

        // Create the display layer and add it to uiView.layer
        let displayLayer = context.coordinator.displayLayer
        displayLayer.frame = uiView.bounds
        displayLayer.backgroundColor = UIColor.clear.cgColor
        uiView.layer.addSublayer(displayLayer)

        // Start playback
        context.coordinator.startPlayback()

        return uiView
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Reset the frame of the AVSampleBufferDisplayLayer when the view's size changes.
        let displayLayer = context.coordinator.displayLayer
        displayLayer.frame = uiView.layer.bounds

        // Optionally update the layer's background color, etc.
        uiView.backgroundColor = .black
        displayLayer.backgroundColor = UIColor.clear.cgColor

        // Flush transactions if necessary
        CATransaction.flush()
    }
}
//
//  H265Player.swift
//  H265Decoder
//
//  Created by Kohshin Tokunaga on 2025/02/15.
//

import Foundation
import AVFoundation
import CoreMedia

class H265Player: NSObject, VideoDecoderDelegate {

    let displayLayer = AVSampleBufferDisplayLayer()
    private var decoder: H265Decoder?

    override init() {
        super.init()

        // Initial configuration for the display layer
        displayLayer.videoGravity = .resizeAspect

        // Initialize the decoder (delegate = self)
        decoder = H265Decoder(delegate: self)

        // For simple playback, set isBaseline to true
        decoder?.isBaseline = true
    }

    func startPlayback() {
        // Load the file "cars_320x240.h265"
        guard let url = Bundle.main.url(forResource: "temp2", withExtension: "h265") else {
            print("File not found")
            return
        }
        do {
            let data = try Data(contentsOf: url)

            // Set FPS and video size as needed
            let packet = VideoPacket(data: data,
                                     type: .h265,
                                     fps: 30,
                                     videoSize: CGSize(width: 1080, height: 1920))

            // Decode as a single packet
            decoder?.decodeOnePacket(packet)
        } catch {
            print("Failed to load file: \(error)")
        }
    }

    // MARK: - VideoDecoderDelegate

    func decodeOutput(video: CMSampleBuffer) {
        // When decoding is complete, send the output to AVSampleBufferDisplayLayer
        displayLayer.enqueue(video)
    }

    func decodeOutput(error: DecodeError) {
        print("Decoding error: \(error)")
    }
}
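One diagnostic that may help narrow this down is the display layer's documented rendering status and error; a minimal sketch:

import AVFoundation

// Hedged diagnostic sketch: when the view stays black, check the layer's
// status and error right after enqueueing samples.
func checkDisplayLayer(_ layer: AVSampleBufferDisplayLayer) {
    if layer.status == .failed {
        print("Display layer failed: \(String(describing: layer.error))")
        layer.flush()  // or flushAndRemoveImage() before re-enqueueing
    }
}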
We encounter an issue with AVPlayer in the case of EXT-X-DISCONTINUITY misalignment between audio and video produced after the insertion of gaps.
The initial objective is to introduce an EXT-X-DISCONTINUITY in the audio playlist after some missing segments (EXT-X-GAP) whose durations are aligned with the video segment durations, in order to handle irregular audio durations.
Please find below an example of corresponding video and audio playlists:
video:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-MEDIA-SEQUENCE:872524632
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-TARGETDURATION:2
#USP-X-TIMESTAMP-MAP:MPEGTS=7096045027,LOCAL=2025-05-09T12:38:32.369100Z
#EXT-X-MAP:URI="hls/StreamingBasic-video=979200.m4s"
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:32.369111Z
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524632.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524633.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524634.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524635.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524636.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524637.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524638.m4s
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.383111Z
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524639.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524640.m4s
audio:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-MEDIA-SEQUENCE:872524632
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-TARGETDURATION:2
#USP-X-TIMESTAMP-MAP:MPEGTS=7096045867,LOCAL=2025-05-09T12:38:32.378400Z
#EXT-X-MAP:URI="hls/StreamingBasic-audio_99500_eng=98800.m4s"
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:32.378444Z
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524632.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524633.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524634.m4s
#EXTINF:1.984, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524635.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524636.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524637.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524638.m4s
#EXT-X-DISCONTINUITY
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.778444Z
#EXTINF:1.6213, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524639.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524640.m4s
In this case playback is broken with AVPlayer.
Is this conformant with HTTP Live Streaming?
Is it an AVPlayer bug?
What are the guidelines for handling such gaps?
Based on the iPhone 14 Max camera, implement model-based recognition and draw a rectangular box around the recognized object, with the width and height calculated using LiDAR and displayed in centimeters on the continuously updated live image.
Hello. My app uses AVAudioRecorder to generate recording files, and for a few users the resulting files are consistently only 4 KB in size and cannot be played. Most users generate audio files normally; only a few hit this occasionally. After uninstalling and reinstalling the app it works normally, but the problem reappears after a period of time. The problematic audio files generated each time are identical in size and cannot be played. I added the audioRecorderDidFinishRecording delegate method, which reports that the recording completed normally. The users also report that recording appears to work, but the generated file is broken. How should I handle this issue? I look forward to your reply.
- (void)startRecordWithOrderID:(NSString *)orderID {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryRecord error:nil];
    [audioSession setActive:YES error:nil];

    NSMutableDictionary *settings = [[NSMutableDictionary alloc] init];
    [settings setObject:[NSNumber numberWithFloat:8000.0] forKey:AVSampleRateKey];
    [settings setObject:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
    [settings setObject:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    [settings setObject:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
    [settings setObject:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
    [settings setObject:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

    NSString *path = [WDUtility createDirInDocument:@"audios" withOrderID:orderID withPathExtension:@"wav"];
    NSURL *tmpFile = [NSURL fileURLWithPath:path];
    recorder = [[AVAudioRecorder alloc] initWithURL:tmpFile settings:settings error:nil];
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    [recorder record];
}
Is it possible to use AVExternalStorageDevice to access external storage from a connected camera or USB drive (via a USB-C or Lightning connector) on an iPad/iPhone?
I have tested the following code on an iPhone 14 (iOS 18.1.1) and an iPad (10th generation, iPadOS 18.3.1), and both return false for:
// returns false on iPhone 14, iPad gen 10
print(AVExternalStorageDeviceDiscoverySession.isSupported)
The following code returns nil when I try to access the shared external storage discovery session:
// returns nil on iOS devices
print(AVExternalStorageDeviceDiscoverySession.shared)
The following returns false, without displaying a permission dialog:
AVExternalStorageDevice.requestAccess(completionHandler: { (granted: Bool) in
    // returns false with no permission dialog
    print(granted)
})
What types of iOS devices are supported by AVExternalStorageDeviceDiscoverySession?
What situations has it been used for (e.g. connecting to a camera via the external storage protocol, accessing photos from an SD card with an adapter, or accessing photos from a USB drive)?
Is there any sample code for using the AVExternalStorageDevice API?
Hi, I’ve developed a photo app that includes a photo deletion feature.
Some users have reported encountering PHPhotosError.operationInterrupted (3301) when attempting to delete photos.
Initially, I suspected that some of the assets might have a sourceType of typeiTunesSynced, since the documentation notes that iTunes-synced assets cannot be edited or deleted.
However, after checking the logs, all of the assets involved are of typeUserLibrary.
Additionally, the user mentioned that some photos in the iPhone Photos app do not show a delete button.
I’m unsure whether the absence of the delete button is related to the 3301 error.
I’d like to confirm the following:
Under what conditions does PHPhotosError.operationInterrupted (3301) occur, and how should it be handled?
Why do some photos in the iPhone Photos app not show a delete button?
The code for deleting photos is as follows:
PHPhotoLibrary *library = [PHPhotoLibrary sharedPhotoLibrary];
[library performChanges:^{
    PHFetchResult *assetsToBeDeleted = [PHAsset fetchAssetsWithLocalIdentifiers:delUrls options:nil];
    if (assetsToBeDeleted) {
        [PHAssetChangeRequest deleteAssets:assetsToBeDeleted];
    }
} completionHandler:^(BOOL success, NSError *error) {
    // handle success / error here
}];
I am playing protected HLS streams whose authorization token expires after 3 minutes. I am trying to handle this with AVAssetResourceLoaderDelegate. I can refresh the token and keep playing, but the problem is that in the middle of the session the player stalls for a short time, about one second.
Here's my code :
import AVFoundation

class APLCustomAVARLDelegate: NSObject, AVAssetResourceLoaderDelegate {

    static let httpsScheme = "https"
    static let redirectErrorCode = 302
    static let badRequestErrorCode = 400

    private var token: String?
    private var retryDictionary = [String: Int]()
    private let maxRetries = 3

    private func schemeSupported(_ scheme: String) -> Bool {
        let supported = isHttpsSchemeValid(scheme)
        print("Scheme '\(scheme)' supported: \(supported)")
        return supported
    }

    private func reportError(loadingRequest: AVAssetResourceLoadingRequest, error: Int) {
        let nsError = NSError(domain: NSURLErrorDomain, code: error, userInfo: nil)
        print("Reporting error: \(nsError)")
        loadingRequest.finishLoading(with: nsError)
    }

    // Handle token renewal requests to prevent playback stalls
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForRenewalOfRequestedResource renewalRequest: AVAssetResourceRenewalRequest) -> Bool {
        print("Resource renewal requested for URL: \(renewalRequest.request.url?.absoluteString ?? "unknown URL")")

        // Handle renewal the same way we handle initial requests
        guard let scheme = renewalRequest.request.url?.scheme else {
            print("No scheme found in the renewal URL.")
            return false
        }
        if isHttpsSchemeValid(scheme) {
            return handleHttpsRequest(renewalRequest)
        }
        print("Scheme not supported for renewal.")
        return false
    }

    private func isHttpsSchemeValid(_ scheme: String) -> Bool {
        let isValid = scheme == APLCustomAVARLDelegate.httpsScheme
        print("httpsScheme scheme '\(scheme)' valid: \(isValid)")
        return isValid
    }

    private func generateHttpsURL(sourceURL: URL) -> URL? {
        // If you need to modify the URL, do it here
        // Currently this just returns the same URL
        let urlString = sourceURL.absoluteString
        print("Generated HTTPS URL: \(urlString)")
        return URL(string: urlString)
    }

    private func handleHttpsRequest(_ loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        print("Handling HTTPS request.")
        guard let sourceURL = loadingRequest.request.url,
              var redirectURL = generateHttpsURL(sourceURL: sourceURL) else {
            print("Failed to generate HTTPS URL.")
            reportError(loadingRequest: loadingRequest, error: APLCustomAVARLDelegate.badRequestErrorCode)
            return true
        }

        // Track retry attempts with a dictionary keyed by request URL
        let urlString = sourceURL.absoluteString
        let currentRetries = retryDictionary[urlString] ?? 0
        if currentRetries < maxRetries {
            retryDictionary[urlString] = currentRetries + 1
        } else {
            // Too many retries, report a more specific error
            reportError(loadingRequest: loadingRequest, error: NSURLErrorTimedOut)
            retryDictionary.removeValue(forKey: urlString)
            return true
        }

        if var urlComponents = URLComponents(url: redirectURL, resolvingAgainstBaseURL: false) {
            var queryItems = urlComponents.queryItems ?? []

            // Generate a fresh token each time
            let freshToken = AESTimeBaseEncription.secureEncryptSecretText()

            // Check if the token already exists
            if let existingTokenIndex = queryItems.firstIndex(where: { $0.name == "token" }) {
                // Update the existing token
                queryItems[existingTokenIndex].value = freshToken
            } else {
                // Add the token if it doesn't exist
                queryItems.append(URLQueryItem(name: "token", value: freshToken))
            }

            urlComponents.queryItems = queryItems
            redirectURL = urlComponents.url!
        }

        let redirectRequest = URLRequest(url: redirectURL)
        let response = HTTPURLResponse(url: redirectURL, statusCode: APLCustomAVARLDelegate.redirectErrorCode, httpVersion: nil, headerFields: nil)

        print("Redirecting HTTPS to URL: \(redirectURL)")
        loadingRequest.redirect = redirectRequest
        loadingRequest.response = response
        loadingRequest.finishLoading()

        // If successful, reset the retry counter
        if retryDictionary[urlString] == maxRetries {
            retryDictionary.removeValue(forKey: urlString)
        }
        return true
    }
}
I am following the Apple sample code and trying to add a manual focus lens position slider:
@available(iOS 18.0, *)
private func addCameraControls() {
    if !self.session.controls.isEmpty {
        for control in self.session.controls {
            self.session.removeControl(control)
        }
    }
    self.cameraControlFocusSlider = nil

    // Focus Slider
    if self.videoDevice!.isLockingFocusWithCustomLensPositionSupported {
        self.cameraControlFocusSlider = AVCaptureSlider("Focus", symbolName: "dot.square", in: 0.0...1.0)
        self.cameraControlFocusSlider!.setActionQueue(self.sessionQueue) { focusValue in
            // Do manual focus
        }
        if self.session.canAddControl(self.cameraControlFocusSlider!) {
            self.session.addControl(self.cameraControlFocusSlider!)
        }
    }
}
So there are these AVCaptureSessionControlsDelegate methods:
final func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeActive")
}

final func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillEnterFullscreenAppearance")
}

final func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillExitFullscreenAppearance")
}

final func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeInactive")
}
So when self.cameraControlFocusSlider is presented, I have to show the current value of the lens position. The lens position can change from autofocus and also from manual focus by the user through the app UI. Is there a way to see whether self.cameraControlFocusSlider is active or being used?
Please note that I will have more than one AVCaptureSlider in the final code.