Post not yet marked as solved
I want to use AVPlayerViewController to display a video, but it should auto-play.
Previously I used AVPlayer for that and listened for the .AVPlayerItemDidPlayToEndTime notification, but I wonder if there is a better way, e.g. using AVPlayerLooper, so I don't have to rely on .AVPlayerItemDidPlayToEndTime anymore.
I wrote something like this, but it is not working: I get a black screen with video controls, probably because the AVPlayerViewController does not have any playable content...
struct VideoPlayerQueuedView: UIViewControllerRepresentable {
    let videoUrl: URL

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let queuePlayer = AVQueuePlayer()
        let playerViewController = AVPlayerViewController()
        // Create an AVPlayerItem from the videoUrl
        let playerItem = AVPlayerItem(url: videoUrl)
        // Create an AVPlayerLooper with the queuePlayer and the playerItem as the template item
        let playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: playerItem)
        // Set the player property of AVPlayerViewController
        playerViewController.player = queuePlayer
        return playerViewController
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
        // Update the video player if needed
    }
}
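The black screen is most likely because AVPlayerLooper must stay alive for looping to work: the local `playerLooper` above is deallocated as soon as makeUIViewController returns, and nothing ever calls play(). A minimal sketch of one fix, retaining the looper and player in the representable's Coordinator (the Coordinator shape and `LoopingVideoPlayerView` name are my assumptions, not from the original post):

```swift
import SwiftUI
import AVKit

struct LoopingVideoPlayerView: UIViewControllerRepresentable {
    let videoUrl: URL

    // The Coordinator outlives makeUIViewController, so it can keep the
    // looper (and player) alive for the lifetime of the view.
    final class Coordinator {
        let player = AVQueuePlayer()
        var looper: AVPlayerLooper?
    }

    func makeCoordinator() -> Coordinator { Coordinator() }

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        let item = AVPlayerItem(url: videoUrl)
        // Retain the looper; if it is deallocated, looping stops.
        context.coordinator.looper = AVPlayerLooper(player: context.coordinator.player,
                                                    templateItem: item)
        controller.player = context.coordinator.player
        context.coordinator.player.play() // auto-play
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {}
}
```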
Post not yet marked as solved
I'm getting the following error in Xcode and I can't figure out how to fix it.
Cannot load underlying module for 'MediaPlayer'
I've searched Google and have come across lots of other people with similar issues, where importing some framework produces this error alongside the import.
https://stackoverflow.com/questions/76256875/cannot-load-underlying-module-for-scenekit
https://developer.apple.com/forums/thread/115059
https://stackoverflow.com/questions/32673866/cocoapods-cannot-load-underlying-module-for-x
The error is being shown inline with this bit of code:
import SwiftUI
import MusicKit
import MediaPlayer <-- this line here
I can build and run the project without issue and the error disappears, but it quickly reappears a short while later, which is very annoying.
How can I resolve this please?
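A common workaround for spurious "Cannot load underlying module" errors is clearing Xcode's caches so the module maps are rebuilt. This is a hedged suggestion rather than a guaranteed fix; the paths below are Xcode's defaults:

```shell
# Quit Xcode first. These are the default cache locations; adjust if
# you have moved DerivedData in Xcode > Settings > Locations.
rm -rf "$HOME/Library/Developer/Xcode/DerivedData"
rm -rf "$HOME/Library/Caches/org.swift.swiftpm"   # Swift Package caches
echo "caches cleared"
```

Reopening the project afterwards forces a full reindex, which often makes the phantom error go away for longer stretches.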
Post not yet marked as solved
Hello,
We are using HLS for our iOS and tvOS streaming applications. Our content is DRM-protected, but we want to add another security layer: a CDN token.
We want to send that CDN token in a header or in query parameters; either is applicable on our CDN side. The problem is on the client side: we need to send the token and refresh it at a given interval.
We add the token at the initial state using
let asset = AVURLAsset(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
and add an interceptor with asset.resourceLoader.setDelegate. That works seamlessly.
We use AVAssetResourceLoaderDelegate, and we can intercept only the master playlist and the media playlists via
func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool
so we can refresh the CDN token only for playlists. The token can be in query params or headers; it does not matter.
For example,
#EXTM3U
#EXT-X-VERSION:3
#EXTINF:10.0
https://chunk1?cdntoken=A.ts
#EXTINF:10.0
https://chunk2?cdntoken=A.ts
#EXTINF:10.0
https://chunk3?cdntoken=A.ts
#EXTINF:10.0
Assume this is our .m3u8 file for the given live video playlist. It has three chunks, with the CDN token in the query params. If we give these chunks to AVPlayer, it will play them in order.
When we change to a new CDN token in the query params, the chunk URLs change and our player stalls, because our CDN adds the new token to each chunk's URL. This means our next .m3u8 playlist will look like this:
#EXT-X-VERSION:3
#EXTINF:10.0
https://chunk4?cdntoken=B.ts
#EXTINF:10.0
https://chunk5?cdntoken=B.ts
#EXTINF:10.0
https://chunk6?cdntoken=B.ts
#EXTINF:10.0
The CDN token is changed from A to B on the CDN side, and the CDN sends a new playlist like that. That is why our player stalls. Is there any way to keep the player from stalling when the chunk URLs change?
When we put the new CDN token in a header instead, the chunk URLs do not change (unlike in the first question), but AVPlayer does not allow us to intercept chunk URLs: before it requests https://chunk1?cdntoken=A.ts, I want to intercept the request and add the new CDN token to the header. Is there any way to intercept chunk URLs the way we intercept playlists?
Thanks for answers in advance
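One commonly used workaround for the first question is to route the playlists through the resource loader with a custom URL scheme (AVAssetResourceLoader is only consulted for schemes AVFoundation cannot handle itself), then rewrite the stale tokens inside the playlist text before handing it back. Segment requests with a plain https:// scheme still bypass the delegate, so the token must already be correct in the playlist you return. A sketch, not a guaranteed solution; the `TokenRefreshingLoader` name, the `cdn-intercept` scheme, and the regex are all assumptions:

```swift
import AVFoundation

// Create the asset with a custom scheme so playlist requests hit the
// delegate, e.g. AVURLAsset(url: URL(string: "cdn-intercept://host/main.m3u8")!)
final class TokenRefreshingLoader: NSObject, AVAssetResourceLoaderDelegate {
    var currentToken = "B"  // refreshed elsewhere on your schedule

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let customURL = loadingRequest.request.url,
              var comps = URLComponents(url: customURL, resolvingAgainstBaseURL: false) else {
            return false
        }
        comps.scheme = "https"  // map back to the real playlist URL
        guard let realURL = comps.url else { return false }

        URLSession.shared.dataTask(with: realURL) { data, _, error in
            guard let data, var playlist = String(data: data, encoding: .utf8) else {
                loadingRequest.finishLoading(with: error)
                return
            }
            // Rewrite stale tokens so every chunk URL is valid when
            // AVPlayer fetches it directly. The pattern is illustrative.
            playlist = playlist.replacingOccurrences(of: "cdntoken=[^&.\\s]+",
                                                     with: "cdntoken=\(self.currentToken)",
                                                     options: .regularExpression)
            loadingRequest.dataRequest?.respond(with: Data(playlist.utf8))
            loadingRequest.finishLoading()
        }.resume()
        return true
    }
}
```

Because live playlists are re-fetched continuously, each refresh passes through the delegate, so the next playlist always carries the current token and the chunk URLs AVPlayer sees never go stale.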
Post not yet marked as solved
Turn on Address Sanitizer in Xcode, use a real device, and put a Test.mp3 file in the Xcode project. It will then crash when you initialise an AVAudioPlayer with the mp3 file (with a wav file it works fine). I have filed this in Feedback Assistant: FB12425453.
var player: AVAudioPlayer?

func playSound() {
    if let url = Bundle.main.url(forResource: "Test", withExtension: "mp3") {
        self.player = try? AVAudioPlayer(contentsOf: url) // --> deallocation of non allocated memory problem --> with a "wav" file it works
        ....
    }
}
Post not yet marked as solved
Hi,
I am having an issue with sound on a network video stream; the stream is loaded from an m3u.
During playback there is no audio from the device, but when using headphones or AirPlay, audio works correctly.
The other peculiar thing is that the device Simulator works fine. This may be related to AirPlay working, but I don't know.
This is the view handling the playback. I'm not sure where the issue is.
I can also play the videos fine when embedding the AVPlayer in its own view, but that looks messy when you have to dismiss a second window when closing the video.
#if os(iOS)
import SwiftUI
import AVKit
import MediaPlayer

struct iOSVideoLibraryView: View {
    @ObservedObject var videoLibrary: VideoLibrary
    @State private var isPlayerDismissed = false
    let LiveStreams = [GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible())]
    let VODStreams = [GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible())]

    var body: some View {
        NavigationView {
            ScrollView {
                LazyVGrid(columns: LiveStreams, spacing: 20) {
                    ForEach(videoLibrary.videos, id: \.title) { video in
                        if video.type == "LIVE" {
                            Button(action: {
                                isPlayerDismissed = false // Reset the dismissal flag
                                presentVideoPlayer(videoURL: video.referenceURL)
                            }) {
                                VStack {
                                    Image(systemName: "play.circle.fill")
                                        .font(.system(size: 30)) // icon
                                        .foregroundColor(.blue)
                                    Text(video.title)
                                        .frame(width: 100, height: 50) // title bounds
                                        .font(Font.caption)
                                        .background(Color.blue)
                                        .foregroundColor(.white)
                                        .cornerRadius(3)
                                }
                                .frame(width: 70) // main button container
                                .padding()
                                .background(Color.blue.opacity(0.2))
                                .cornerRadius(10)
                            }
                        } else {
                            // Handle non-LIVE videos
                        }
                    }
                }
                .padding()
            }
            .navigationBarTitle("Live Streams")
        }
    }

    private func presentVideoPlayer(videoURL: URL) {
        let playerViewController = CustomAVPlayerViewController()
        let player = AVPlayer(url: videoURL)
        playerViewController.player = player
        player.isMuted = false
        player.play()
        DispatchQueue.main.async {
            playerViewController.modalPresentationStyle = .fullScreen
            UIApplication.shared.windows.first?.rootViewController?.present(playerViewController, animated: true, completion: nil)
        }
    }
}

class PlayerManager: NSObject, AVPictureInPictureControllerDelegate {
    static let shared = PlayerManager()

    func pictureInPictureControllerWillStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        // Perform any necessary actions when picture-in-picture starts
    }

    func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        // Perform any necessary actions when picture-in-picture stops
    }

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: Error) {
        // Perform any necessary actions when picture-in-picture fails to start
    }
}

class CustomAVPlayerViewController: AVPlayerViewController {
    let playerManager = PlayerManager.shared
    let customPlayer = AVPlayer()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        if AVPictureInPictureController.isPictureInPictureSupported() {
            if let playerItem = customPlayer.currentItem {
                let playerLayer = AVPlayerLayer(player: customPlayer)
                playerLayer.videoGravity = .resizeAspectFill
                let pictureInPictureController = AVPictureInPictureController(playerLayer: playerLayer)
                pictureInPictureController?.delegate = playerManager
                if let pictureInPictureController = pictureInPictureController,
                   pictureInPictureController.isPictureInPicturePossible {
                    pictureInPictureController.startPictureInPicture()
                }
            }
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        customPlayer.addObserver(self, forKeyPath: "currentItem", options: .new, context: nil)
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        customPlayer.removeObserver(self, forKeyPath: "currentItem")
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "currentItem" {
            if let playerItem = customPlayer.currentItem {
                // Handle player item change
            }
        }
    }
}
#endif
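Silence from the built-in speaker while headphones and AirPlay work is a classic symptom of the ring/silent switch muting playback when no audio session category has been configured (the Simulator ignores the switch, which would also explain that observation). One thing worth checking, as a hedged sketch: activate a .playback session before starting the player.

```swift
import AVFoundation

// The .playback category makes AVPlayer audio ignore the ring/silent
// switch, which otherwise mutes the built-in speaker even though
// headphone and AirPlay routes keep playing.
func activatePlaybackSession() {
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }
}
```

Calling this once before presentVideoPlayer(videoURL:) runs would be enough to test the theory.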
Post not yet marked as solved
As of the latest builds of both iOS 16.6 and iOS 17.0, playing albums from a user's library using ApplicationMusicPlayer plays the songs on that album out of order. This is a dealbreaker for my app, and I've had to revert to the Media Player framework for reliable behavior.
If I fetch an album from a MusicLibraryRequest and load it into the queue using the API introduced in 16.4, init(album:startingAt:), it starts at track 1 but then plays the rest of the tracks in random order. This happens whether I skip tracks or let them play through.
The player's shuffleMode is .off. The issue does not occur with albums fetched from the Apple Music catalog and loaded using that same API, nor does it occur for MPMediaItemCollections loaded into an applicationQueuePlayer via a queue descriptor.
I've submitted this issue as FB12495051 and provided a sysdiagnose. Please let me know if I can provide any other information.
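For context, a sketch of the repro path described above. The title filter, the first-track lookup, and the exact parameter types are my assumptions from memory of the MusicKit API and may need adjusting:

```swift
import MusicKit

// Sketch: fetch a library album and queue it with init(album:startingAt:).
// Per the report above, playback starts at track 1 but then continues
// out of order even though shuffleMode is .off.
func playLibraryAlbum(named title: String) async throws {
    var request = MusicLibraryRequest<Album>()
    request.filter(matching: \.title, equalTo: title)
    guard let album = try await request.response().items.first else { return }

    let detailed = try await album.with(.tracks)
    guard let firstTrack = detailed.tracks?.first else { return }

    let player = ApplicationMusicPlayer.shared
    player.state.shuffleMode = .off
    player.queue = ApplicationMusicPlayer.Queue(album: detailed, startingAt: firstTrack)
    try await player.play()
}
```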
Post not yet marked as solved
Hi! I'm currently developing an app that plays music stored locally. It was working fine previously, but after updating my device to iOS 17 I started getting error -54 when I try to play a file. I also noticed that when getting the list of files from MPMediaQuery.songs(), I encounter the following errors:
I suspect it might be an issue with file permissions, but I can't figure out what I am missing. I have already checked that MPMediaLibrary.authorizationStatus() is authorized.
Does anyone know what the issue might be? Thank you
Post not yet marked as solved
Hi,
I am currently using a third-party audio library to play audio in my app. It's worth noting that my app deals exclusively with network audio, i.e. audio stored on the network: Shoutcast streams and plain remote MP3 files. Is there a way to do this with native Apple APIs? My motivation is that I want to adopt SharePlay, lock screen player support, and other native goodies. I use SwiftUI. P.S. Sorry for the random tags; I am blind and the interface for choosing tags cannot be used with VoiceOver.
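The native stack can do this: AVPlayer handles remote MP3 and Shoutcast-style streams directly, MPNowPlayingInfoCenter feeds the lock screen, and MPRemoteCommandCenter receives its transport buttons (SharePlay would additionally use the GroupActivities framework). A minimal sketch; the class name and the metadata chosen are illustrative:

```swift
import AVFoundation
import MediaPlayer

// Streams a remote URL with AVPlayer and wires up lock screen metadata
// and transport controls using the native MediaPlayer APIs.
final class StreamPlayer {
    private let player = AVPlayer()

    func play(url: URL, title: String) {
        // A .playback session enables background audio and the lock screen UI.
        try? AVAudioSession.sharedInstance().setCategory(.playback)
        try? AVAudioSession.sharedInstance().setActive(true)

        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()

        // Populate the lock screen / Control Center metadata.
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [
            MPMediaItemPropertyTitle: title,
            MPNowPlayingInfoPropertyIsLiveStream: true
        ]

        // Respond to lock screen transport controls.
        let commands = MPRemoteCommandCenter.shared()
        commands.playCommand.addTarget { [weak self] _ in
            self?.player.play(); return .success
        }
        commands.pauseCommand.addTarget { [weak self] _ in
            self?.player.pause(); return .success
        }
    }
}
```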
Post not yet marked as solved
Hi Team,
I am trying to play audio using AVPlayer from the background: after the first item finishes playing, I fetch the next item's data from an API and start it.
I start playback in the foreground, then put the app in the background. After a few minutes the first audio item completes, and when I try to start AVPlayer with the next audio item from the background, the audio does not play.
The same audio plays fine as the next item when the app is in the foreground.
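Starting a new player item while backgrounded generally requires the audio background mode plus an audio session that is active and stays active; without them, iOS suspends the app once playback stops. A sketch of the usual prerequisites (the Info.plist key is the standard one; where you call this is an assumption):

```swift
import AVFoundation

// Prerequisites for continuing playback in the background:
// 1. Info.plist: UIBackgroundModes contains "audio".
// 2. The audio session uses the .playback category and remains active,
//    so starting the next AVPlayerItem while backgrounded is permitted.
func configureBackgroundAudio() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback)
    try session.setActive(true)
}
```

If the session is deactivated (or was never configured) when the first item ends, the app loses its background execution grant before the next item can start, which matches the symptom described.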
Post not yet marked as solved
If I disable playback controls on an AVPlayerViewController (showsPlaybackControls), some features of MPNowPlayingInfoCenter no longer work (play/pause, skip forward and backward).
I need custom video and audio controls for the player in my app; that's why I disabled the iOS playback controls. But I also need the MPNowPlayingInfoCenter features. Is there another way to achieve this?
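When the system controls are hidden, the usual approach is to register your own handlers with MPRemoteCommandCenter; lock screen play/pause and skip then go through those handlers rather than through the view controller's controls. A sketch (the 15-second skip intervals and the externally owned player are assumptions):

```swift
import AVFoundation
import MediaPlayer

// With showsPlaybackControls = false, wire the remote commands yourself
// so the lock screen transport keeps working; `player` is assumed to be
// an AVPlayer owned elsewhere in the app.
func wireRemoteCommands(to player: AVPlayer) {
    let center = MPRemoteCommandCenter.shared()

    center.playCommand.addTarget { _ in player.play(); return .success }
    center.pauseCommand.addTarget { _ in player.pause(); return .success }

    center.skipForwardCommand.preferredIntervals = [15]
    center.skipForwardCommand.addTarget { _ in
        player.seek(to: player.currentTime() + CMTime(seconds: 15, preferredTimescale: 600))
        return .success
    }

    center.skipBackwardCommand.preferredIntervals = [15]
    center.skipBackwardCommand.addTarget { _ in
        player.seek(to: player.currentTime() - CMTime(seconds: 15, preferredTimescale: 600))
        return .success
    }
}
```

Your in-app custom controls drive the same AVPlayer, so both UIs stay in sync as long as you also keep MPNowPlayingInfoCenter's nowPlayingInfo updated.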
Post not yet marked as solved
I imported the MediaPlayer framework and used MPMusicPlayerPlayParameters in my SwiftUI project.
While the app is launching, if it was built with Xcode 14.3.1 or newer, it instantly crashes on the Simulator or on a real device.
This problem persists on iOS 16.0 to 16.4 (I haven't tried iOS 15 or below). I also tried building and running the app with Xcode 15 beta 4, but it still crashes.
For this reason, I archive and submit to the App Store with Xcode 14.2.
(The build made by Xcode 14.2 works perfectly!)
Here is the error message:
dyld[17235]: Symbol not found: _$sSo27MPMusicPlayerPlayParametersCSe05MediaB0Mc
Referenced from:
<77FEF170-9C51-3580-8F8B-2ADD2F1B3FD1> /Users/[UserName]/Library/Developer/CoreSimulator/Devices/72CE26D8-4DD4-4319-B0C7-DE52D6645875/data/Containers/Bundle/Application/C808623F-5372-40F0-907F-E86E12AE6EDD/[AppName].app/[AppName]
Expected in:
<06F636E1-695C-34F1-809D-7CA24F67AFE9> /Library/Developer/CoreSimulator/Volumes/iOS_20B72/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 16.1.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/MediaPlayer.framework/MediaPlayer
I believe this is an Apple issue; however, if there is any way to fix it, please let me know.
Post not yet marked as solved
I have noticed changes Apple Music made to my library; take, in particular, a changed album edition that is reflected in how the title is listed. I can see the new title in the Music app on two different devices.
On one device MPMediaQuery returns the album with the new title. The other device (an iPad with less memory, in case that matters) still returns the old edition. Is there anything I can do to make sure the data returned is up to date and matches what the Music app shows?
Post not yet marked as solved
Hi
I'm developing a full-duplex iPhone voice chat application, and I'd like to intercept Bluetooth headset button events to perform certain actions in my app while maintaining full-duplex audio. I'm using the MediaPlayer framework to intercept remote Bluetooth AVRCP play/pause events (MPRemoteCommandEvent), and I set the AVAudioSession to use the BluetoothA2DP option. However, when I do this, I can't seem to use the Bluetooth microphone as an audio input: when I query AVAudioSession for available inputs, Bluetooth is not returned. I'm guessing this is because A2DP is a half-duplex protocol, but my understanding is that AVRCP events are only available with A2DP. The other Bluetooth profile choice is HSP (the AVAudioSession Bluetooth option), which works for full-duplex audio but does not appear to provide a way to intercept the profile's AT commands unless I'm in an actual telephone call. For example, when I use HSP and press a button on my headset, I see the AT+CHUP command being sent from the headset to the phone in the logs.
Two questions:
Is there a way to use a bluetooth microphone while using A2DP for output at the same time?
If the above can't be done, is there a way to intercept the HSP AT control commands from a headset without being in a telephone call?
Thanks.
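For diagnosing the first question, a small sketch that activates the session with both Bluetooth options and prints the available inputs can confirm what each profile exposes. Note that A2DP carries no voice channel, so the Bluetooth microphone typically appears only via the HFP-based .allowBluetooth option; treat the combination below as an experiment, not a confirmed solution:

```swift
import AVFoundation

// Activate playAndRecord with both Bluetooth options and list the
// inputs the system offers. With .allowBluetoothA2DP alone the
// Bluetooth mic is generally absent; .allowBluetooth (HFP) exposes it.
func dumpBluetoothInputs() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            options: [.allowBluetooth, .allowBluetoothA2DP])
    try session.setActive(true)
    for input in session.availableInputs ?? [] {
        print(input.portType.rawValue, input.portName)
    }
}
```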
Post not yet marked as solved
Hi.
In iOS 17 Apple introduced crossfading in the Music app. In my app I use MPMusicPlayerController.applicationQueuePlayer and also want to support crossfading. In an early iOS 17 beta my app crossfaded automatically with the same setting as the Music app, which may have been a bug. However, I have not found any way to enable crossfade for my app.
Does anybody know how to do that?
Thanks,
Dirk
Post not yet marked as solved
Problem Description
This HLS video, https://lf3-vod-cdn-tos.douyinstatic.com/obj/vodsass/hls/main.m3u8, starts producing noise at 22 seconds when played directly in Safari on macOS 12.6.6, and the same happens in Safari on iOS 16.5.1. There is no noise when playing it via MSE on the Mac with a third-party open-source web player such as hls.js in Safari.
Test tool
hls.js test demo: https://hlsjs.video-dev.org/demo/
Post not yet marked as solved
There's unexpected behaviour when using the append(_:) and prepend(_:) methods on iOS 17 beta 6 and public beta 4.
Observed Behaviour
When queuing with the mentioned methods on recent iOS 17 betas, the supplied music isn't queued and the currently playing music pauses. When using applicationQueuePlayer, it even crashes the app.
FB13010449
Sample Project
Post not yet marked as solved
Am I the only one having this problem?
The Music app on the iOS 17 beta does not cleanly segue between songs that are intended to have no gap between them. When gapless songs are played (e.g., Pink Floyd's "Dark Side of the Moon", The Beatles' "Abbey Road"), a noticeable gap is heard between songs. These are songs in my music library that I sync from my computer, not Apple Music streams. It should be noted that this bug also exists in iOS 16.6, and I've verified that the problem does not occur on an older phone running iOS 15.7.8.
More importantly, when I put the device in Airplane mode, the songs segue correctly, without any gaps. I suspect that the Music app is phoning home to Apple (a bad practice in and of itself) and something is interrupting playback queuing. Even when I have cellular access turned off for the Music app and am not connected to Wi-Fi, the problem persists. The only way to make gapless playback work is to turn off all of the device radios via Airplane mode.
Understand that this is NOT a cross-fade issue. It is a gapless issue. And it's not an Apple Music streaming issue. The problem seems to be more prevalent for songs encoded as 128 kbps AAC.
In comparison, the Music app on macOS (Ventura, 13.5) operates correctly. It's only iOS that no longer performs gapless playback.
I've filed bug reports (FB12992049 and FB13019931), but have not heard anything from Apple.
Like I asked at the beginning, am I the only one having this problem? It's extremely maddening that Apple can't get this right. I can't play my Pink Floyd, Alan Parsons, and my other AOR playlists, including my late 60's Beatles. Steve Jobs would be rolling in his grave.
Post not yet marked as solved
Hi.
I am transitioning my app from using MPMediaQuery to using MusicLibraryRequest from MusicKit. This is working fine for playlists in the user's library.
I also allow my app users to play their podcasts. I currently use MPMediaQuery.podcasts(). Does anyone know if there is an equivalent using MusicLibraryRequest?
Cheers,
Ian
Post not yet marked as solved
In our application, we play video-on-demand (VOD) content and display subtitles in different languages.
Post not yet marked as solved
I've integrated MPVolumeView into my view, and it correctly responds to hardware volume changes as expected. However, once I initiate audio streaming using AVAudioEngine to capture microphone audio and AudioUnit for decoding, the MPVolumeView ceases to reflect changes made using the hardware volume buttons. Additionally, even when I adjust the volume using the slider on MPVolumeView, it doesn't change the system volume. Has anyone else encountered this issue? What might be causing MPVolumeView to stop responding to hardware volume changes once streaming starts?
For the AVAudioSession mode, I use the default setting, because using .voiceChat permanently prevents MPVolumeView from updating when the device volume changes.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, options: [.allowBluetooth])
    try session.setActive(true)
} catch {
    print(error.localizedDescription)
}