Create view-level services for media playback, complete with user controls, chapter navigation, and support for subtitles and closed captioning using AVKit.

AVKit Documentation

Posts under AVKit tag

75 Posts
Post not yet marked as solved
0 Replies
161 Views
I'm using AVPlayer to play a live stream URL, which works fine, but when a DAI ad loads I get a black screen in the player. I've added an observer for AVPlayerItemNewAccessLogEntryNotification as below:

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemAccessLogEntry:)
                                                 name:AVPlayerItemNewAccessLogEntryNotification
                                               object:player.currentItem];

When the player goes to the black screen, I get the notification event:

    - (void)playerItemAccessLogEntry:(NSNotification *)notification {
        DLog(@"VideoWithAdsController:playerItemAccessLogEntry");
        AVPlayerItemAccessLog *accessLog = [((AVPlayerItem *)notification.object) accessLog];
        AVPlayerItemAccessLogEvent *lastEvent = accessLog.events.lastObject;
        NSLog(@"The IP address of the server that was the source of the last delivered media segment = %@", lastEvent.serverAddress);
        NSLog(@"A count of changes to the property serverAddress = %ld", (long)lastEvent.numberOfServerAddressChanges);
    }

The console output is:

    The IP address of the server that was the source of the last delivered media segment = (null)
    A count of changes to the property serverAddress = -1

I have attached a screenshot of the black screen. What is the solution? Thanks in advance.
Posted by Gaurav-7. Last updated.
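For readers hitting similar DAI black-screen issues, the same access-log observation can be sketched in Swift; the stream URL below is a placeholder and the handler is illustrative only:

```swift
import AVFoundation

// Sketch: observe new access-log entries on the current player item.
// The stream URL is a placeholder.
let player = AVPlayer(url: URL(string: "https://example.com/live/stream.m3u8")!)

let token = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemNewAccessLogEntry,
    object: player.currentItem,
    queue: .main
) { notification in
    guard let item = notification.object as? AVPlayerItem,
          let lastEvent = item.accessLog()?.events.last else { return }
    // serverAddress is nil and numberOfServerAddressChanges is -1 when no
    // media segment has been delivered yet, matching the output in the post.
    print("Server address:", lastEvent.serverAddress ?? "(null)")
    print("Server address changes:", lastEvent.numberOfServerAddressChanges)
}
```

Remember to call `NotificationCenter.default.removeObserver(token)` when the player is torn down.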
Post marked as solved
1 Replies
336 Views
In my app I'm trying to play an Apple Music video using VideoPlayer(player: AVPlayer(url: video.url!)), where "video" is a MusicVideo instance, but all I get is a blank player. video.title and video.artistName work fine. I've also tried the fileURLWithPath and string initializers, which work with hosted videos no problem. (Apologies for the poor use of terms; I'm a newbie.)
Posted by AntMaz. Last updated.
Post not yet marked as solved
10 Replies
4.3k Views
I am running into serious issues when using the new VideoPlayer or any kind of AVPlayer in Xcode 12 beta 4. On the simulator and on my physical device (both running iOS 14 beta 4), the VideoPlayer stays black. Even running a different app that uses AVPlayer with a UIViewControllerRepresentable results in the AVPlayer not working.

    import SwiftUI
    import AVKit

    @main
    struct SerienStreamApp: App {
        let player = AVPlayer(url: URL(string: "https://www.radiantmediaplayer.com/media/big-buck-bunny-360p.mp4")!)
        var body: some Scene {
            WindowGroup {
                VideoPlayer(player: player)
            }
        }
    }

The error I get (the repeated CATransformLayer warnings are abbreviated):

    2020-08-12 14:32:20.260745+0200 SerienStream[11426:2122834] libMobileGestalt MobileGestaltCache.c:166: Cache loaded with 4527 pre-cached in CacheData and 47 items in CacheExtra.
    2020-08-12 14:32:20.336493+0200 SerienStream[11426:2122544] <CATransformLayer: 0x2812101a0> - changing property masksToBounds in transform-only layer, will have no effect
    [... the masksToBounds and allowsGroupBlending warnings repeat for several layers ...]
    2020-08-12 14:32:20.419000+0200 SerienStream[11426:2122826] Metal API Validation Enabled
    2020-08-12 14:32:20.514484+0200 SerienStream[11426:2122544] [] [14:32:20.514] FigSubtitleSampleCreateFromPropertyList signalled err=50 (kFigCFBadPropertyListErr) (NULL or bad plist) at /Library/Caches/com.apple.xbs/Sources/EmbeddedCoreMedia/EmbeddedCoreMedia-2747.2.1.1/Prototypes/ClosedCaptions/FigCaptionCommand.c:792
Posted. Last updated.
Post not yet marked as solved
0 Replies
193 Views
I'm working on an app that uses Core Haptics to play a synchronized pattern of vibrations and audio. The problem is that the audio only gets played through the iPhone's speakers (if the mute switch is not turned on). As soon as I connect my AirPods to the phone, the audio stops playing, but the haptics continue. My code looks something like this:

    let engine = CHHapticEngine()
    ...
    var events = [CHHapticEvent]()
    ...
    let volume: Float = 1
    let decay: Float = 0.5
    let sustained: Float = 0.5
    let audioParameters = [
        CHHapticEventParameter(parameterID: .audioVolume, value: volume),
        CHHapticEventParameter(parameterID: .decayTime, value: decay),
        CHHapticEventParameter(parameterID: .sustained, value: sustained)
    ]
    let breathingTimes = pacer.breathingTimeInSeconds
    let combinedTimes = breathingTimes.inhale + breathingTimes.exhale
    let audioEvent = CHHapticEvent(
        audioResourceID: selectedAudio,
        parameters: audioParameters,
        relativeTime: 0,
        duration: combinedTimes
    )
    events.append(audioEvent)
    ...
    let pattern = try CHHapticPattern(events: events, parameterCurves: [])
    let player = try engine.makeAdvancedPlayer(with: pattern)
    ...
    try player.start(atTime: CHHapticTimeImmediate)

My idea of activating an audio session before the player starts, to indicate to the system that audio is being played, also didn't change the outcome:

    try AVAudioSession.sharedInstance().setActive(true)

Is there a different way to route the audio from Core Haptics to an output other than the integrated speakers?
Posted. Last updated.
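One avenue worth exploring for the routing question above (an assumption, not a confirmed fix): CHHapticEngine has an initializer that accepts an AVAudioSession, so the haptic audio can be tied to a session configured for playback:

```swift
import AVFoundation
import CoreHaptics

// Sketch, not a confirmed fix: attach the haptic engine to a session
// configured for playback, so its audio follows the session's output route.
func makeEngine() throws -> CHHapticEngine {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback)
    try session.setActive(true)

    let engine = try CHHapticEngine(audioSession: session)
    try engine.start()
    return engine
}
```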
Post not yet marked as solved
1 Replies
221 Views
I'm building an app using SwiftUI, and am perplexed at why it seems so difficult to simply play an audio file that is in my assets. One would think it possible to write some code like play(sound: "(name).m4a"), but this seems unsupported; you must write elaborate, verbose code. Can anyone comment on why it doesn't 'just work'? I understand that much more complex audio work requires more, but it seems that simply playing a file could be supported.
Posted. Last updated.
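For what it's worth, the one-line play(sound:) call the poster wishes for can be approximated with a small wrapper around AVAudioPlayer; the helper type below is illustrative, not an Apple API:

```swift
import AVFoundation

// Illustrative helper, not an Apple API. A strong reference to the player
// is kept so it isn't deallocated mid-playback.
final class SoundPlayer {
    static let shared = SoundPlayer()
    private var player: AVAudioPlayer?

    func play(sound name: String) {
        guard let url = Bundle.main.url(forResource: name, withExtension: nil) else {
            print("Sound file not found:", name)
            return
        }
        do {
            player = try AVAudioPlayer(contentsOf: url)
            player?.play()
        } catch {
            print("Playback failed:", error)
        }
    }
}

// Usage: SoundPlayer.shared.play(sound: "chime.m4a")
```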
Post not yet marked as solved
1 Replies
256 Views
I'm using AVPlayer to play MP3 files from a remote URL. I'm having some issues with the initial loading time of the MP3; it is very slow (around 5-8 seconds). I compared it with other third-party players and it's much slower; I also compared it with an Android player, and it is again much slower. So the problem is not with the URL itself nor with the network connection. Another interesting point: after AVPlayer starts playing the MP3, seeking is very fast (almost immediate). Does that mean the player downloads the entire MP3 file before starting playback, and that's the reason it is so slow? Can I control this behaviour? If not, any other ideas what the reason could be?
Posted. Last updated.
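Two AVPlayer knobs are relevant to the start-up delay described above; whether they fully explain a 5-8 second delay is untested here, the URL is a placeholder, and the values are starting points rather than recommendations:

```swift
import AVFoundation

// Sketch of two settings that can shorten time-to-first-audio.
let url = URL(string: "https://example.com/audio/episode.mp3")!
let item = AVPlayerItem(url: url)
item.preferredForwardBufferDuration = 2 // seconds to buffer before starting

let player = AVPlayer(playerItem: item)
player.automaticallyWaitsToMinimizeStalling = false // don't wait for a deep buffer
player.playImmediately(atRate: 1.0)
```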
Post not yet marked as solved
24 Replies
2.9k Views
I am working on a podcast application and have finished all the streaming-related parts. The problem is explained below; how can I achieve this?

Current: the player view is hidden by default. When the user presses the "Play" button, the player view unhides and the player starts streaming. But it won't stop streaming if the user presses another "Play" button, so both streams play simultaneously.

Goal: when the user starts playing a new podcast by pressing another Play button, I want the other players to stop playing and disappear.

Extra: the whole podcast view awakes from the xib cell. So the problem is: how can I manipulate one cell view's members when pressing a button in another cell view?
Posted. Last updated.
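One common pattern for the cell-to-cell coordination asked about above is to broadcast a notification when any cell starts playback, so every other cell can stop itself; the notification name and cell class below are hypothetical:

```swift
import UIKit
import AVFoundation

// Hypothetical notification name used to coordinate between cells.
extension Notification.Name {
    static let podcastDidStartPlaying = Notification.Name("podcastDidStartPlaying")
}

final class PodcastCell: UITableViewCell {
    private var player: AVPlayer?
    private var observer: NSObjectProtocol?

    override func awakeFromNib() {
        super.awakeFromNib()
        // Stop this cell's player whenever another cell begins playback.
        observer = NotificationCenter.default.addObserver(
            forName: .podcastDidStartPlaying, object: nil, queue: .main
        ) { [weak self] note in
            guard let self, (note.object as? PodcastCell) !== self else { return }
            self.player?.pause()
            self.player = nil // also hide this cell's player view here
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }

    func playTapped(url: URL) {
        // Announce first, so every other cell stops before we start.
        NotificationCenter.default.post(name: .podcastDidStartPlaying, object: self)
        player = AVPlayer(url: url)
        player?.play()
    }
}
```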
Post not yet marked as solved
2 Replies
484 Views
I reset the old player item (removing all observers as well) and the AVPlayerViewController, then add a new AVPlayerViewController instance, AVPlayer, and player item when playing a new asset/stream, etc. It works fine with no crash on tvOS 14, 13, etc. But on tvOS 15.2 and above I get the following stack trace:

    Foundation _NSKVONotifyingOriginalClassForIsa
    Foundation _NSKeyValueObservationInfoGetObservances
    Foundation -[NSObject(NSKeyValueObservingPrivate) _changeValueForKeys:count:maybeOldValuesDict:maybeNewValuesDict:usingBlock:]
    Foundation -[NSObject(NSKeyValueObservingPrivate) _changeValueForKey:key:key:usingBlock:]
    Foundation _NSSetObjectValueAndNotify
    AVKit -[AVInterstitialController dealloc]
    AVKit -[AVPlayerControllerTVExtras .cxx_destruct]
Posted. Last updated.
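Not a confirmed workaround for the tvOS 15.2 crash above, but as general KVO hygiene when swapping player items, block-based NSKeyValueObservation tokens can be invalidated explicitly, which avoids the manual addObserver/removeObserver bookkeeping that such dealloc-time crashes often involve:

```swift
import AVFoundation

// Sketch: replace the player item, invalidating the old observation first.
final class PlayerHolder {
    private var player: AVPlayer?
    private var statusObservation: NSKeyValueObservation?

    func replaceItem(with url: URL) {
        // Tear down the old observation before the old item goes away.
        statusObservation?.invalidate()

        let item = AVPlayerItem(url: url)
        statusObservation = item.observe(\.status, options: [.new]) { item, _ in
            print("Item status changed:", item.status.rawValue)
        }

        if let player {
            player.replaceCurrentItem(with: item)
        } else {
            player = AVPlayer(playerItem: item)
        }
        player?.play()
    }
}
```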
Post not yet marked as solved
0 Replies
247 Views
Using AVPlayerView with a running movie, the first scroll event (either direction), and only the first, resets the playhead to the start of the video if it is the initial event the AVPlayerView receives; after that, the scroll wheel works as expected. This is a pain when, e.g., a third of the way through a movie, bumping the mouse wheel sends you back to the beginning. The bug also occurs regardless of whether the playhead is advanced in code with seekToTime(), or whether the scroll event is simulated with Keyboard Maestro (a wonderfully useful app for debugging and much more; no relation to the developer). Again, it happens only for the first scroll event, and only if nothing else (see below) has happened. In AppKit it is possible to intercept scroll-wheel events, but one truly useful feature of AVPlayerView is that it maps scroll events to the left-/right-arrow keys, which provide frame-wise forward/reverse movement. You have to give that up if you intercept scroll events, and there's no way AFAIK to otherwise pass AVPlayerView an "advance- or reverse-frame" message of exactly the amount the scroll wheel produces. Similarly, I can't figure out a way to send a "forward-one-scroll-unit" event on initial video start to suppress the bug. The AVPlayerView j-k-l shortcuts might be useful for FCP users, but I could do without them; alternatively, they should be remappable. If you use these keys before scrolling, or manually advance the playhead through the UI, the bug evaporates. The only way to replicate it is to start playing the video with the play button (or in code), wait long enough that you can discern whether the playhead resets, then trigger the scroll wheel (or a two-finger swipe on the trackpad) in either direction. Since the bug occurs only with the first scroll event, you have to close the AVPlayerView and repeat to replicate it. A long debug cycle...
Since the bug occurs with both AppKit and SwiftUI, I'm guessing that SwiftUI's VideoPlayer is just a convenience wrapper of the AppKit version. SwiftUI doesn't offer the developer an equivalent level of event control, which makes the problem even worse. This report applies to macOS; YMMV on iOS. I have seen similar earlier reports by searching for "Calling AVPlayer seekToTime: results in incorrect scrollWheel behavior".
Posted. Last updated.
Post not yet marked as solved
0 Replies
242 Views
Hi, I use AVSpeechSynthesizer in one of my apps, Trip Tracker GPS - All in One, but I've encountered an annoying issue that I have no idea how to fix. My users have started to complain about it as well, and I urgently need a solution. Please help. Here is the issue: when a user uses the app to track a route, a voice speaks the travel information, such as the user's current location, speed, and travel time. But sometimes the voice has an echo, which makes it hard to understand. The voice does not always echo, which makes debugging very difficult. Can Apple technical support tell me in what scenario and why this echo happens? App link: https://apps.apple.com/us/app/trip-tracker-gps-all-in-one/id1032770064
Posted. Last updated.
Post marked as solved
2 Replies
839 Views
I have a Catalyst application that uses (as expected) MPNowPlayingInfoCenter to set the now-playing info and MPRemoteCommandCenter to receive the media events for play/pause/stop/favorite/etc. The code is shared across iOS, tvOS, and watchOS, and it works correctly there. It seems not to work on macOS (the app is compiled as a Catalyst application) on Big Sur (and Monterey, FWIW): the media keys on the keyboard start the Music app, and the music section of Control Center neither shows the now-playing info nor sends the media-control messages to the app. I seem to remember that it used to work on Catalina (at least the media-key part), and it certainly worked in a previous version of the same app that was a UIKit one. Is this a bug (worth a feedback to Apple) or something wrong on my side? Did I forget some magic capability for macOS? The app is sandboxed and uses the hardened runtime, in case that is significant. Thank you for any hint!
Posted by gtufano. Last updated.
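For comparison, the setup the poster describes (which works on iOS/tvOS/watchOS but reportedly not under Catalyst on Big Sur) typically looks like this; the metadata values are placeholders:

```swift
import MediaPlayer

// Minimal sketch of remote-command and now-playing registration.
func configureNowPlaying() {
    let commandCenter = MPRemoteCommandCenter.shared()
    commandCenter.playCommand.addTarget { _ in
        // resume playback here
        return .success
    }
    commandCenter.pauseCommand.addTarget { _ in
        // pause playback here
        return .success
    }

    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: "Episode title",
        MPMediaItemPropertyArtist: "Artist name",
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
}
```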
Post not yet marked as solved
0 Replies
265 Views
Can I use Face ID/Touch ID while my app is in Picture in Picture mode? Is it possible at all? I always get the error LAErrorSystemCancel.
Posted. Last updated.
Post not yet marked as solved
1 Replies
367 Views
We currently have an ARKit app on the store which uses ARKit to provide both camera images and audio data for export to video. So far this works well and without issues. We discovered recently that if we plug a 3rd-party microphone (e.g., a RODE VideoMic Me-L) into the Lightning port on an iPhone 11-13, the app seems to freeze and crash on starting the ARSession with the following error:

    com.apple.arkit.error Code=102 "Required sensor failed."

Despite the error, this does not seem to be related to microphone permissions (which searches on this topic have brought up). At this point we have added most plist permissions to the app with no success. We can reproduce this with Apple's own RealityKit2 "Underwater" example. Simply add the first line below at line 216 of UnderwaterView.swift:

    configuration.providesAudioData = true // ADD ME
    configuration.planeDetection.insert(.horizontal)
    session.run(configuration)

Plug in the 3rd-party mic (e.g., RODE VideoMic Me-L), run the app, and it will bomb out with:

    2022-01-11 15:59:48.710585+0000 Underwater[3089:1945857] [Session] ARSession <0x113428a80>: did fail with error: Error Domain=com.apple.arkit.error Code=102 "Required sensor failed." UserInfo={NSLocalizedFailureReason=A sensor failed to deliver the required input., NSUnderlyingError=0x2819d1320 {Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x2819d1f20 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}}}, NSLocalizedRecoverySuggestion=Make sure that the application has the required privacy settings., NSLocalizedDescription=Required sensor failed.}

We're struggling to find a solution here and could really use advice if anyone understands what the issue may be. Otherwise, we'll submit a bug report to Apple and see where we go from there.
Posted. Last updated.
Post not yet marked as solved
0 Replies
466 Views
Hi! I have a question about the iOS 15 picture-in-picture APIs for video calls in VoIP apps. I understand that for an app to continue using the camera while a video call is in PiP in background mode, it needs the com.apple.developer.avfoundation.multitasking-camera-access entitlement. Is this entitlement needed for development purposes as well, i.e. when developing such an app and running it on device via Xcode, do I need the entitlement to test the feature locally? I've followed the PiP guides and created a sample app that feeds frames into an AVSampleBufferDisplayLayer ready for PiP. I can see video from the remote end of the call (coming in from a web browser) rendering as expected when the app is in the foreground, but when I press the home button to start PiP multitasking, the PiP UI just shows a spinner. My logs show that frames are still being sent to the AVSampleBufferDisplayLayer in the background, but nothing is rendering. I was wondering if this is connected in some way to me not having the entitlement. So I guess there are two questions here:

1. Does the app need the com.apple.developer.avfoundation.multitasking-camera-access entitlement to work during development, or is it only a requirement when submitting to the App Store?
2. Could the absence of the entitlement be the reason why PiP isn't rendering video frames from the remote end (which isn't an iOS device)?

Thanks for your help in advance! Ceri
Posted. Last updated.
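For context, the iOS 15 sample-buffer PiP wiring referenced in the post is sketched below, with the playback delegate reduced to stubs suitable for a live call; this does not answer the entitlement question, and the class name is illustrative:

```swift
import AVKit
import AVFoundation

// Sketch of sample-buffer-based PiP for a live video call.
final class CallPiPController: NSObject, AVPictureInPictureSampleBufferPlaybackDelegate {
    let displayLayer = AVSampleBufferDisplayLayer()
    private var pipController: AVPictureInPictureController?

    func setUpPiP() {
        guard AVPictureInPictureController.isPictureInPictureSupported() else { return }
        let source = AVPictureInPictureController.ContentSource(
            sampleBufferDisplayLayer: displayLayer,
            playbackDelegate: self
        )
        pipController = AVPictureInPictureController(contentSource: source)
        pipController?.canStartPictureInPictureAutomaticallyFromInline = true
    }

    // MARK: AVPictureInPictureSampleBufferPlaybackDelegate (stubs)
    func pictureInPictureController(_ c: AVPictureInPictureController, setPlaying playing: Bool) {}
    func pictureInPictureControllerTimeRangeForPlayback(_ c: AVPictureInPictureController) -> CMTimeRange {
        // A live call has no fixed duration.
        CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
    }
    func pictureInPictureControllerIsPlaybackPaused(_ c: AVPictureInPictureController) -> Bool { false }
    func pictureInPictureController(_ c: AVPictureInPictureController, didTransitionToRenderSize newRenderSize: CMVideoDimensions) {}
    func pictureInPictureController(_ c: AVPictureInPictureController, skipByInterval skipInterval: CMTime, completion completionHandler: @escaping () -> Void) {
        completionHandler()
    }
}
```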
Post not yet marked as solved
1 Replies
489 Views
I am working on audio recording. When the application is running in the foreground I start audio recording, and it keeps working after going to the background. My question is how to start audio recording when I am already in the background. My audio recording function is fired like this: I have a Bluetooth LE device with buttons and an iOS app. The two are paired (the Bluetooth LE device and the iPhone running the iOS app), and the iOS app listens for events on the Bluetooth LE device, such as a button press. When the user hits a button on the Bluetooth LE device, the iOS app captures the event and I am able to run code even if the app is in the background, but I am not able to start a voice recording. I have already enabled the Background Modes. Here is my code for audio recording:

    func startRecording() {
        DispatchQueue.global(qos: .background).asyncAfter(deadline: DispatchTime.now(), qos: .background) {
            let audioFilename = self.getDocumentsDirectory().appendingPathComponent("recording.m4a")
            print("record Audio \(audioFilename)")
            let settings = [
                AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
                AVSampleRateKey: 12000,
                AVNumberOfChannelsKey: 1,
                AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
            ]
            do {
                self.audioRecorder = try AVAudioRecorder(url: audioFilename, settings: settings)
                self.audioRecorder.delegate = self
                self.audioRecorder.record()
            } catch {
                self.finishRecording(success: false)
            }
        }
    }

    func getDocumentsDirectory() -> URL {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        return paths[0]
    }

I can't find a proper solution for this. Please suggest a proper way to do it. Thanks in advance.
Posted. Last updated.
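One detail worth checking for the background-recording question above (an assumption, not a confirmed fix): the shared AVAudioSession should be configured and activated for recording before AVAudioRecorder's record() is called. A minimal sketch:

```swift
import AVFoundation

// Sketch: configure the audio session for recording before starting.
// Whether this alone is sufficient to *begin* recording from the
// background is exactly the open question in the post.
func prepareRecordingSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.record, mode: .default)
    try session.setActive(true)
}
```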
Post not yet marked as solved
0 Replies
260 Views
Situation: my team uses AVPlayer to play live audio on iPhones. We would like to better understand why a user experiences buffering.

What we are currently doing: we monitor the following AVPlayer attributes: buffering reason, indicated bitrate, observed bitrate, and error log events.

What we have noticed:
Buffering reason - is always toMinimizeStalls, because the buffer is empty.
Indicated bitrate - reports the BANDWIDTH from the manifest URL, as expected.
Observed bitrate - values reported here can be lower than the indicated bitrate yet still stream without any buffering. I would expect values under the indicated bitrate to cause buffering, as described on the Apple developer website.
Error log events - occasionally the error log reports an error code and message; however, around 60% of the time we have no details indicating why the user is experiencing buffering. When we do get error codes, there doesn't appear to be any map of what they mean.

Questions:
Is there a way to get signal strength from an iPhone? (A weak signal would give us some explanation for the buffering.)
What is the recommended approach for determining the reasons for buffering? (How do we distinguish between a server-side issue and a client-side issue?)
Are there AVPlayer settings we can manipulate to reduce buffering?
Posted by schretze. Last updated.
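For the buffering-attribution question above, two additional client-side signals can be observed alongside the access and error logs; the handler bodies below are placeholders:

```swift
import AVFoundation

// Sketch: two extra signals that help attribute buffering events.
func observeStalls(for item: AVPlayerItem) {
    let center = NotificationCenter.default

    // Fired when playback stalls because media data stopped arriving.
    center.addObserver(forName: .AVPlayerItemPlaybackStalled,
                       object: item, queue: .main) { _ in
        print("Playback stalled")
    }

    // Fired when a new entry is added to the item's error log.
    center.addObserver(forName: .AVPlayerItemNewErrorLogEntry,
                       object: item, queue: .main) { note in
        guard let item = note.object as? AVPlayerItem,
              let event = item.errorLog()?.events.last else { return }
        print("Error \(event.errorStatusCode): \(event.errorComment ?? "no comment")")
    }
}
```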