Create view-level services for media playback, complete with user controls, chapter navigation, and support for subtitles and closed captioning using AVKit.

AVKit Documentation

Posts under AVKit tag

75 Posts
Post not yet marked as solved
0 Replies
193 Views
I'm working on an app that uses CoreHaptics to play a synchronised pattern of vibrations and audio. The problem is that the audio only plays through the iPhone's speakers (if the mute switch is not on). As soon as I connect my AirPods to the phone, the audio stops playing but the haptics continue. My code looks something like this:

let engine = CHHapticEngine()
...
var events = [CHHapticEvent]()
...
let volume: Float = 1
let decay: Float = 0.5
let sustained: Float = 0.5
let audioParameters = [
    CHHapticEventParameter(parameterID: .audioVolume, value: volume),
    CHHapticEventParameter(parameterID: .decayTime, value: decay),
    CHHapticEventParameter(parameterID: .sustained, value: sustained)
]
let breathingTimes = pacer.breathingTimeInSeconds
let combinedTimes = breathingTimes.inhale + breathingTimes.exhale
let audioEvent = CHHapticEvent(
    audioResourceID: selectedAudio,
    parameters: audioParameters,
    relativeTime: 0,
    duration: combinedTimes
)
events.append(audioEvent)
...
let pattern = try CHHapticPattern(events: events, parameterCurves: [])
let player = try engine.makeAdvancedPlayer(with: pattern)
...
try player.start(atTime: CHHapticTimeImmediate)

My idea of activating an audio session before the player starts, to indicate to the system that audio is being played, also didn't change the outcome:

try AVAudioSession.sharedInstance().setActive(true)

Is there a different way to route the audio from CoreHaptics to an output other than the integrated speakers?
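One avenue worth trying (a sketch under stated assumptions, not a confirmed fix): CHHapticEngine has an initializer that accepts an AVAudioSession, so creating the engine with an explicitly configured session, rather than activating the shared session separately, may give the system the routing context the haptic engine's audio events need:

```swift
import AVFoundation
import CoreHaptics

// Sketch: configure the session first, then hand it to the haptic engine
// so the engine's audio output is tied to that session and its route.
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playback, mode: .default)
try session.setActive(true)

// CHHapticEngine(audioSession:) associates the engine's audio with the
// given session instead of a private one (iOS 13+).
let engine = try CHHapticEngine(audioSession: session)
try engine.start()
```

Whether the engine's audio events then follow a Bluetooth route is not something the documentation spells out, so treat this as an experiment rather than a guaranteed answer.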
Post not yet marked as solved
1 Replies
221 Views
I'm building an app using SwiftUI, and am perplexed at why it seems so difficult to simply play an audio file that is in my assets. One would think it possible to write some code like play(sound: "(name).m4a"), but this seems unsupported; you must write elaborate, verbose code. Can anyone comment on why it doesn't 'just work'? I understand that much more complex audio code requires more, but it seems that simply playing a file could be supported.
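For what it's worth, a one-call helper like the one described can be written in a few lines on top of AVAudioPlayer. This is a minimal sketch, assuming the sound file is bundled as a plain resource (note that asset catalogs store audio as NSDataAsset, which needs AVAudioPlayer(data:) instead):

```swift
import AVFoundation

// Keep a strong reference; if the player is deallocated, playback stops.
private var player: AVAudioPlayer?

// Hypothetical play(sound:) wrapper of the kind the post wishes existed.
func play(sound name: String) {
    guard let url = Bundle.main.url(forResource: name, withExtension: nil) else {
        print("sound not found: \(name)")
        return
    }
    player = try? AVAudioPlayer(contentsOf: url)
    player?.play()
}

// Usage: play(sound: "chime.m4a")
```

The strong-reference requirement is the usual trap: a locally scoped AVAudioPlayer is released before it makes a sound, which is likely why naive attempts appear "unsupported".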
Post not yet marked as solved
1 Replies
256 Views
I'm using AVPlayer to play MP3 files from a remote URL. I'm having issues with the initial loading time of the MP3: it is very slow (around 5-8 seconds). I compared it with other third-party players and it's much slower; I also compared it with an Android player and it's much slower there too. So the problem is not with the URL itself nor with the network connection. Another interesting point is that after AVPlayer starts playing the MP3, seeking is very fast (almost immediate). Does that mean the player downloads the entire MP3 file before it starts playing, and that's the reason it is so slow? Can I control this behaviour? If not, any other ideas about what the reason could be?
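Two AVPlayer knobs are commonly tried for shortening time-to-first-audio. This is a sketch, not a guaranteed fix; the hostname and behaviour of the server (byte-range support, keep-alive) matter as much as the client settings:

```swift
import AVFoundation

// Hypothetical remote MP3 URL for illustration.
let url = URL(string: "https://example.com/track.mp3")!

let item = AVPlayerItem(url: url)
// Ask for only a small forward buffer before playback can begin,
// instead of letting the player choose a larger default.
item.preferredForwardBufferDuration = 2

let player = AVPlayer(playerItem: item)
// Start as soon as any media is buffered rather than waiting until the
// player judges that playback can continue without stalling.
player.automaticallyWaitsToMinimizeStalling = false
player.play()
```

If the startup delay persists regardless of these settings, capturing the request timing server-side (time to first byte, range requests issued) is the next step, since AVPlayer probes the file with range requests before committing to playback.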
Post not yet marked as solved
0 Replies
537 Views
I reset the old player item (removing all observers) and the AVPlayerViewController, then add a new AVPlayerViewController instance, AVPlayer, and player item when playing a new asset/stream. It works fine with no crash on tvOS 14, 13, etc., but on tvOS 15.2 and above I get the following stack trace. Below are the details I could collect from Firebase; please check whether they help infer the cause of the crash. Thanks!

Crashed: com.apple.main-thread SIGSEGV 0x00000007f2e51110
Crashed: com.apple.main-thread
0  libobjc.A.dylib         0x7624    class_getMethodImplementation + 32
1  Foundation              0xa1f30   _NSKVONotifyingOriginalClassForIsa + 28
2  Foundation              0x9db38   _NSKeyValueObservationInfoGetObservances + 272
3  Foundation              0xa8c50   -[NSObject(NSKeyValueObservingPrivate) _changeValueForKeys:count:maybeOldValuesDict:maybeNewValuesDict:usingBlock:] + 244
4  Foundation              0xa9540   -[NSObject(NSKeyValueObservingPrivate) _changeValueForKey:key:key:usingBlock:] + 68
5  Foundation              0xa2080   _NSSetObjectValueAndNotify + 284
6  AVKit                   0xe6b8    -[AVInterstitialController dealloc] + 32
7  AVKit                   0x356e4   -[AVPlayerControllerTVExtras .cxx_destruct] + 144
8  libobjc.A.dylib         0x7d68    object_cxxDestructFromClass(objc_object*, objc_class*) + 112
9  libobjc.A.dylib         0x1dad0   objc_destructInstance + 88
10 libobjc.A.dylib         0x24f90   _objc_rootDealloc + 52
11 libsystem_blocks.dylib  0x37f8    _Block_release + 184
12 libdispatch.dylib       0x4f84    _dispatch_client_callout + 16
13 libdispatch.dylib       0x12164   _dispatch_main_queue_callback_4CF + 916
14 CoreFoundation          0x7a698   __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12
15 CoreFoundation          0x74b18   __CFRunLoopRun + 2528
16 CoreFoundation          0x73bf4   CFRunLoopRunSpecific + 572
17 GraphicsServices        0x6afc    GSEventRunModal + 160
18 UIKitCore               0xa9ccd0  -[UIApplication _run] + 1080
19 UIKitCore               0xaa20cc  UIApplicationMain + 164
20 XXXXXXX                 0xc474    (Missing UUID xxxxxxxxx…)
21 ???                     0x1030e91d0 (Missing)
Post not yet marked as solved
2 Replies
484 Views
I reset the old player item (removing all observers) and the AVPlayerViewController, then add a new AVPlayerViewController instance, AVPlayer, and player item when playing a new asset/stream. It works fine with no crash on tvOS 14, 13, etc., but on tvOS 15.2 and above I get the following stack trace.

Foundation  _NSKVONotifyingOriginalClassForIsa
Foundation  _NSKeyValueObservationInfoGetObservances
Foundation  -[NSObject(NSKeyValueObservingPrivate) _changeValueForKeys:count:maybeOldValuesDict:maybeNewValuesDict:usingBlock:]
Foundation  -[NSObject(NSKeyValueObservingPrivate) _changeValueForKey:key:key:usingBlock:]
Foundation  _NSSetObjectValueAndNotify
AVKit       -[AVInterstitialController dealloc]
AVKit       -[AVPlayerControllerTVExtras .cxx_destruct]
Post not yet marked as solved
0 Replies
247 Views
Using AVPlayerView with a running movie, the first scroll event (in either direction), and only the first, resets the playhead to the start of the video if it is the initial event the AVPlayerView receives; after that, the scroll wheel works as expected. This is a pain when, e.g., a third of the way through a movie, bumping the mouse wheel sends you back to the beginning. The bug also occurs even when the playhead has been advanced in code with seekToTime(), or when the scroll event is simulated with Keyboard Maestro (a wonderfully useful app for debugging and much more; no relation to the developer). Again, it happens only for the first scroll event, provided nothing else (see below) has happened. In AppKit it is possible to intercept scroll wheel events, but one truly useful feature of AVPlayerView is its mapping of scroll events to the left-/right-arrow keys, which provide frame-wise forward/reverse movement. You have to give that up if you intercept scroll events, and there is no way AFAIK to otherwise pass AVPlayerView an "advance- or reverse-frame" message, at least not in exactly the same increments as the scroll wheel. Similarly, I can't figure out a way to send a "forward-one-scroll-unit" event on initial video start to suppress the bug. The AVPlayerView j-k-l shortcuts might be useful for FCP users, but I could do without them; alternatively, they should be remappable. If you use these keys before scrolling, or manually advance the playhead in the UI, the bug evaporates. The only way to replicate it is to start playing video with the play button (or in code), wait long enough that you can discern whether the playhead resets, then trigger the scroll wheel (or a two-finger swipe on the trackpad) in either direction. Since the bug occurs only with the first scroll event, you have to close the AVPlayerView and repeat to replicate it. A long debug cycle...
Since the bug occurs with both AppKit and SwiftUI, I'm guessing that SwiftUI's PlayerView is just a convenience wrapper around the AppKit version. SwiftUI doesn't offer the developer an equivalent level of event control, which makes the problem even worse. This report applies to macOS; YMMV with iOS. I have seen similar earlier reports by searching for "Calling AVPlayer seekToTime: results in incorrect scrollWheel behavior".
Post not yet marked as solved
0 Replies
265 Views
Can I use Face ID/Touch ID when my app is in Picture in Picture mode? Is it possible at all? I always get the error LAErrorSystemCancel.
Post not yet marked as solved
0 Replies
242 Views
Hi, I use AVSpeechSynthesizer in one of my apps, called Trip Tracker GPS - All in One, and I've encountered an annoying issue which I have no idea how to fix. My users have started to complain about it too, and I urgently need a solution. Please help. Here is the issue: when a user uses the app to track a route, a voice speaks the travel information, such as the user's current location, speed, and travel time. But sometimes the voice has an echo which makes it hard to understand. The voice does not always echo, which makes this very difficult to debug. Can Apple technical support tell me in what scenario and why this echo happens? App link: https://apps.apple.com/us/app/trip-tracker-gps-all-in-one/id1032770064
Post not yet marked as solved
0 Replies
260 Views
Situation: my team uses AVPlayer to play live audio on iPhones. We would like to better understand why a user experiences buffering.

What we are currently doing: we monitor the following AVPlayer attributes:
buffering reason
indicated bitrate
observed bitrate
error log events

What we have noticed:
Buffering reason - always toMinimizeStalls, because the buffer is empty.
Indicated bitrate - reports the BANDWIDTH from the manifest URL, as expected.
Observed bitrate - values reported here can be lower than the indicated bitrate, yet the stream still plays without buffering. I would expect values under the indicated bitrate to cause buffering, as described on the Apple developer website.
Error log events - occasionally the error log reports an error code and message, but around 60% of the time we get no details indicating why the user is experiencing buffering. And when we do get error codes, there doesn't appear to be any mapping of what they mean.

Questions:
Is there a way to get signal strength from an iPhone? (A weak signal would give us some reasoning for buffering.)
What is the recommended approach for determining the reason for buffering? (How do we distinguish between a server-side issue and a client-side issue?)
Are there AVPlayer settings we can manipulate to reduce buffering?
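The access and error logs mentioned above are both exposed on AVPlayerItem. A minimal sketch of pulling the relevant fields, assuming `item` is the AVPlayerItem that is playing the live stream:

```swift
import AVFoundation

// Sketch: dump per-segment delivery stats and transport errors for an item.
func logDiagnostics(for item: AVPlayerItem) {
    // Access log: one event per span of delivered media (variant switches,
    // stalls, and bitrate measurements live here).
    if let accessLog = item.accessLog() {
        for event in accessLog.events {
            print("indicated: \(event.indicatedBitrate)",
                  "observed: \(event.observedBitrate)",
                  "stalls: \(event.numberOfStalls)")
        }
    }
    // Error log: HTTP/transport errors the player encountered while fetching.
    if let errorLog = item.errorLog() {
        for event in errorLog.events {
            print("status: \(event.errorStatusCode)",
                  event.errorComment ?? "(no comment)")
        }
    }
}
```

Correlating numberOfStalls with the observed-vs-indicated bitrate gap over time is one way to separate "client could not download fast enough" (observed bitrate sagging before a stall) from a server-side hiccup (error-log entries clustered around the stall).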
Post not yet marked as solved
1 Replies
367 Views
We currently have an ARKit app on the store which uses ARKit to provide both camera images and audio data for export to video. So far this works well and without issues. We discovered recently that if we plug a 3rd-party microphone (e.g., a RODE VideoMic Me-L) into the Lightning port on an iPhone 11-13, the app seems to freeze and crash on starting the ARSession with the following error: com.apple.arkit.error Code=102 "Required sensor failed." Despite the error, this does not seem to be related to microphone permissions (which searches on this topic have brought up). At this point we have added most plist permissions to the app with no success. We can reproduce this with Apple's own RealityKit2 "Underwater" example. Simply add the following line at line 216 of UnderwaterView.swift:

configuration.providesAudioData = true // ADD ME
configuration.planeDetection.insert(.horizontal)
session.run(configuration)

Plug in the 3rd-party mic (e.g., RODE VideoMic Me-L), run the app, and it will bomb out with:

2022-01-11 15:59:48.710585+0000 Underwater[3089:1945857] [Session] ARSession <0x113428a80>: did fail with error: Error Domain=com.apple.arkit.error Code=102 "Required sensor failed." UserInfo={NSLocalizedFailureReason=A sensor failed to deliver the required input., NSUnderlyingError=0x2819d1320 {Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x2819d1f20 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}}}, NSLocalizedRecoverySuggestion=Make sure that the application has the required privacy settings., NSLocalizedDescription=Required sensor failed.}

We're struggling to find a solution here and could really use some advice if anyone understands what the issue may be. Otherwise, we'll submit a bug report to Apple and see where we go from there.
Post not yet marked as solved
1 Replies
489 Views
I am working on audio recording. When the application is running in the foreground I start audio recording, and if the app then goes to the background the recording keeps working fine. My question is how to start audio recording when the app is already in the background. Here is how my recording function is triggered: I have a Bluetooth LE device with buttons and an iOS app. The two are paired (the Bluetooth LE device and the iPhone running the app), and the app listens for events on the Bluetooth LE device, such as a button press. When the user hits a button on the Bluetooth LE device, the app captures the event and I am able to run code even while the app is in the background, but I am not able to start a voice recording. I have already enabled the relevant Background Modes. Here is my code for audio recording:

func startRecording() {
    DispatchQueue.global(qos: .background).asyncAfter(deadline: DispatchTime.now(), qos: .background) {
        let audioFilename = self.getDocumentsDirectory().appendingPathComponent("recording.m4a")
        print("record Audio \(audioFilename)")
        let settings = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 12000,
            AVNumberOfChannelsKey: 1,
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
        ]
        do {
            self.audioRecorder = try AVAudioRecorder(url: audioFilename, settings: settings)
            self.audioRecorder.delegate = self
            self.audioRecorder.record()
        } catch {
            self.finishRecording(success: false)
        }
    }
}

func getDocumentsDirectory() -> URL {
    let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
    return paths[0]
}

I can't find a proper solution for this. Please suggest the right way to do it. Thanks in advance.
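One thing worth checking (a sketch, not a guaranteed fix): the snippet above never configures an AVAudioSession, and an AVAudioRecorder created without an active record-capable session is a common reason capture fails once the app is no longer frontmost. A minimal session setup, called before constructing the recorder, would look like:

```swift
import AVFoundation

// Sketch: a record-capable session must be configured and active before
// AVAudioRecorder.record() can succeed; .playAndRecord also covers apps
// that play audio cues. The .allowBluetooth option is shown on the
// assumption the mic input may come over Bluetooth.
func prepareRecordingSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
    try session.setActive(true)
}
```

Even with this, be aware that iOS is generally designed to let an app *continue* audio work begun in the foreground; cold-starting microphone capture from a backgrounded state may still be refused by the system, so testing with the session activated while the app is still frontmost is a useful control experiment.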
Post not yet marked as solved
0 Replies
466 Views
Hi! I have a question about the iOS15 picture-in-picture APIs for video calls in VoIP apps. I understand that for an app to continue using the camera when a video call is using PiP in background mode, it needs the com.apple.developer.avfoundation.multitasking-camera-access entitlement. Is this entitlement needed for development purposes as well, i.e. when developing such an app and running on device via Xcode, do I need that entitlement to be able to test the feature locally? I've followed the PiP guides and created a sample app that feeds frames into an AVSampleBufferDisplayLayer ready for PiP - I can see video from the remote end of the call (coming in from a web browser) rendering as expected when the app is in the foreground, but when I press the home button to start PiP multitasking, the PiP UI just shows a spinner. My logs show that frames are still being sent to the AVSampleBufferDisplayLayer in the background, but nothing is rendering. I was wondering if this is connected in some way to me not having the entitlement. So I guess there are 2 questions here: Does the app need the com.apple.developer.avfoundation.multitasking-camera-access entitlement to work during development, or is it only a requirement when submitting to the app store? Could the absence of the entitlement be the reason why PiP isn't rendering video frames from the remote end (which isn't an iOS device)? Thanks for your help in advance! Ceri
Post not yet marked as solved
0 Replies
206 Views
Hi, I'm trying to add custom actions to AVPlayerViewController on iOS. I was able to use transportBarCustomMenuItems on tvOS, but I can't find an iOS equivalent. In Apple's TV app for iOS they use custom menus like this.
Post not yet marked as solved
0 Replies
279 Views
I know that if you want background audio from AVPlayer you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly. I have all that squared away, and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController. If you want PiP to behave as expected when you put your app into the background by switching to another app or going to the Home Screen, you can't perform the detachment operation, otherwise the PiP display fails. On an iPad, if PiP is active and you lock your device, you continue to get background audio playback. However, on an iPhone, if PiP is active and you lock the device, the audio pauses. And if PiP is inactive and you lock the device, the audio pauses and you have to manually tap play on the lock screen controls; that behavior is the same on iPad and iPhone. My questions are: Is there a way to keep background audio playback going when PiP is inactive and the device is locked (iPhone and iPad)? Is there a way to keep background audio playback going when PiP is active and the device is locked (iPhone)?
Post not yet marked as solved
2 Replies
377 Views
I'm trying to enforce a duration limit for picked videos. It doesn't appear that there's a configuration option in PHPickerConfiguration unless I missed it. I'm able to get the duration from the URL returned by loadFileRepresentation but it appears that this loads the entire file which somewhat defeats the point of limiting video size and can take a long time. It also looks like I can use the local assetIdentifier but this requires initializing PHPickerConfiguration with a PHPhotoLibrary which requires asking the user for permission and complicates the whole flow when I just want them to pick a single file. Is there a way to take advantage of PHPickerViewController and let the user choose a video and enforce a video limit without loading the entire file / requiring photo library permissions? Thanks!
Post not yet marked as solved
0 Replies
296 Views
Hello everyone. I'm seeing a weird crash in Bugsnag. It concerns the player on tvOS, happens when exiting the player, and involves two system classes. Can someone help me understand what's going on here?

Unable to activate constraint with anchors <NSLayoutXAxisAnchor:0x2831d5480 "AVFocusProxyView:0x1224b3370.left"> and <NSLayoutXAxisAnchor:0x28356cf40 "AVPlayerLayerView:0x1224bf9b0.left"> because they have no common ancestor. Does the constraint or its anchors reference items in different view hierarchies? That's illegal.
Post not yet marked as solved
0 Replies
355 Views
I want to make a custom UI that integrates each video feed from my FaceTime group activity participants. I found that the app Share+ in the App Store integrates the video from each FaceTime participant into its own UI, so I know it's possible. Can anyone point me to the relevant documentation showing how I can get the video of each FaceTime group member to put into my own AVSampleBufferDisplayLayer or otherwise?