Post not yet marked as solved
In our application, we play video-on-demand (VOD) content and display subtitles in different languages.
Post not yet marked as solved
Hi.
I am transitioning my app from using MPMediaQuery to using MusicLibraryRequest from MusicKit. This is working fine for playlists in the user's library.
I also allow my app users to play their podcasts. I currently use MPMediaQuery.podcasts(). Does anyone know if there is an equivalent using MusicLibraryRequest?
Cheers,
Ian
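For reference, here is a minimal sketch of the library request pattern the post describes as working (assuming iOS 16+ and MusicKit authorization). MusicLibraryRequest is generic over library item types such as Playlist, Album, and Song; no podcast type is documented, which is exactly the gap the question is about.

```swift
import MusicKit

// Sketch: the playlist request that works today. There is no documented
// MusicLibraryRequest<Podcast> counterpart to MPMediaQuery.podcasts().
func fetchLibraryPlaylists() async throws -> MusicItemCollection<Playlist> {
    var request = MusicLibraryRequest<Playlist>()
    request.limit = 50 // cap the page size
    return try await request.response().items
}
```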
Post not yet marked as solved
There’s an unexpected behaviour when using the append(_:) & prepend(_:) methods on iOS 17 Beta 6 and Public Beta 4.
Observed Behaviour
When queuing using the mentioned methods on recent iOS 17 Betas, the supplied music isn't queued up and the now playing music pauses. When using applicationQueuePlayer, it even ends up crashing the app.
FB13010449
Sample Project
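For context, a minimal repro sketch of the call pattern described above (the store IDs are placeholders, not values from the original report):

```swift
import MediaPlayer

// Queue one item, then append another while it plays.
let player = MPMusicPlayerController.applicationQueuePlayer
player.setQueue(with: MPMusicPlayerStoreQueueDescriptor(storeIDsQueue: ["0000000001"]))
player.play()
// On earlier iOS releases this appends to the live queue; on the betas
// mentioned above, the report says playback pauses instead (and
// applicationQueuePlayer can crash the app).
player.append(MPMusicPlayerStoreQueueDescriptor(storeIDsQueue: ["0000000002"]))
```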
Is it possible to completely remove the before/next buttons from the CPNowPlayingTemplate?
I've already tried to set live-streaming mode, but the buttons are still there.
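One avenue worth trying (an assumption, not a confirmed fix): the template's transport buttons mirror the system remote commands, so disabling the track-change commands may remove or disable the corresponding buttons.

```swift
import MediaPlayer

// Disable the previous/next track remote commands that CarPlay surfaces.
let center = MPRemoteCommandCenter.shared()
center.nextTrackCommand.isEnabled = false
center.previousTrackCommand.isEnabled = false
```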
Post not yet marked as solved
Hi
I have this piece of code in my app that is supposed to open a file from the Music (old iTunes) app and play it. But I get 'Attempted to register account monitor for types client is not authorized to access "com.apple.account.iTunesStore"'. Any suggestions on how to fix this? What entitlements do I need to set?
Code and error logs are below
```swift
func showiPOD() {
    // Present the system media picker for any audio item.
    let mediaPicker = MPMediaPickerController(mediaTypes: .anyAudio)
    mediaPicker.delegate = self
    mediaPicker.allowsPickingMultipleItems = false
    mediaPicker.showsCloudItems = true // show items from iCloud as well; needs to be tested
    self.present(mediaPicker, animated: true, completion: nil)
}
```
Error log:
```
2023-01-24 09:31:22.018992-0800 Smart Practice[526:16253] [Entitlements] MSVEntitlementUtilities - Process Smart Practice PID[526] - Group: (null) - Entitlement: com.apple.accounts.appleaccount.fullaccess - Entitled: NO - Error: (null)
2023-01-24 09:31:22.022520-0800 Smart Practice[526:16253] [core] Attempted to register account monitor for types client is not authorized to access: {(
    "com.apple.account.iTunesStore"
)}
```
Post not yet marked as solved
Hey there, we're using a CDN with HTTP Referer checks in place for streaming media. When streaming with AirPlay, what Referer is set in the HTTP header? For instance, for Google Chromecast it's: https://www.gstatic.com/
Post not yet marked as solved
Hello,
I'm using systemMusicPlayer to play an Apple Music live radio station obtained from the Apple Music API, but it doesn't work. How can I do that?
Error:
```
Test[46751:13235249] [SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=6 "Failed to prepare to play" UserInfo={NSDebugDescription=Failed to prepare to play}
```
My implementation:
```swift
let musicPlayerController = MPMusicPlayerController.systemMusicPlayer
musicPlayerController.beginGeneratingPlaybackNotifications()
musicPlayerController.setQueue(with: ["ra.978194965"])
musicPlayerController.play()
```
API response:
```json
{
  "id": "ra.978194965",
  "type": "stations",
  "href": "/v1/catalog/us/stations/ra.978194965",
  "attributes": {
    "artwork": {
      "width": 4320,
      "url": "https://is2-ssl.mzstatic.com/image/thumb/Features114/v4/e5/10/76/e5107683-9e51-ebc5-3901-d8fbd65f2c2a/source/{w}x{h}sr.jpeg",
      "height": 1080,
      "textColor3": "332628",
      "textColor2": "120509",
      "textColor4": "33272a",
      "textColor1": "000000",
      "bgColor": "f4f4f4",
      "hasP3": false
    },
    "url": "https://music.apple.com/us/station/apple-music-1/ra.978194965",
    "mediaKind": "audio",
    "supportedDrms": [
      "fairplay",
      "playready",
      "widevine"
    ],
    "requiresSubscription": false,
    "name": "Apple Music 1",
    "kind": "streaming",
    "radioUrl": "itsradio://music.apple.com/us/station/ra.978194965",
    "playParams": {
      "id": "ra.978194965",
      "kind": "radioStation",
      "format": "stream",
      "stationHash": "CgkIBRoFlaS40gMQBA",
      "mediaType": 0
    },
    "editorialNotes": {
      "name": "Apple Music 1",
      "short": "The new music that matters.",
      "tagline": "The new music that matters."
    },
    "isLive": true
  }
}
```
Thank you!
Best regards,
MichaelNg
Post not yet marked as solved
Hello,
We are using HLS for our streaming iOS and tvOS applications. We have DRM protection on our applications, but we want to add another security layer: a CDN token.
We want to send the CDN token in a header or in query parameters; either is applicable on our CDN side. The problem is on the client side: we need to send the token and refresh it at a given interval.
We add the token data at the initial state with
```swift
let asset = AVURLAsset(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
```
and add an interceptor with asset.resourceLoader.setDelegate. It works seamlessly.
We use AVAssetResourceLoaderDelegate and can intercept only the master playlist and the variant playlists via
```swift
func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool
```
so we can refresh the CDN token only at the playlist level. The token data can be in query params or in a header; it does not matter.
For example, assume this is our .m3u8 file for a given live video playlist:
```
#EXTM3U
#EXT-X-VERSION:3
#EXTINF:10.0
https://chunk1?cdntoken=A.ts
#EXTINF:10.0
https://chunk2?cdntoken=A.ts
#EXTINF:10.0
https://chunk3?cdntoken=A.ts
#EXTINF:10.0
```
It has three chunks with the CDN token in the query params. If we hand these chunks to AVPlayer, it plays them in order.
When the CDN issues a new token, the query params change, which changes our chunk URLs, and our player stalls. This is because our CDN adds the new token to each chunk's URL. That means our next .m3u8 playlist is going to look like this:
```
#EXT-X-VERSION:3
#EXTINF:10.0
https://chunk4?cdntoken=B.ts
#EXTINF:10.0
https://chunk5?cdntoken=B.ts
#EXTINF:10.0
https://chunk6?cdntoken=B.ts
#EXTINF:10.0
```
The CDN token is changed from A to B on the CDN side, and the CDN serves the new playlist accordingly. That's why our player stalls. Is there any way to keep the player from stalling when the chunk URLs are edited?
When we put the new CDN token in a header instead, the chunk URLs do not change (unlike the first case), but AVPlayer does not allow us to intercept chunk URLs. Before it requests https://chunk1?cdntoken=A.ts, we want to intercept the request and add the new token to the header. Is there any way to intercept chunk URLs the way we intercept playlists?
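One workaround sketch for the first question (query-parameter tokens): since the delegate already intercepts playlist loads, the playlist text can be rewritten before it reaches AVPlayer so segment URLs always carry the current token. This is a minimal, hypothetical sketch; the `cdntoken` parameter name matches the examples above, and `rewriteCDNToken` is an illustrative helper, not an existing API.

```swift
import Foundation

// Rewrite every cdntoken value in a playlist to the current token, so the
// chunk URLs AVPlayer sees stay consistent across token rotations.
func rewriteCDNToken(inPlaylist playlist: String, newToken: String) -> String {
    playlist.replacingOccurrences(
        of: "cdntoken=[A-Za-z0-9]+",
        with: "cdntoken=\(newToken)",
        options: .regularExpression
    )
}

// Inside resourceLoader(_:shouldWaitForLoadingOfRequestedResource:), after
// fetching the playlist bytes yourself, you would respond with the rewritten
// text (sketch):
//   let rewritten = rewriteCDNToken(inPlaylist: original, newToken: currentToken)
//   loadingRequest.dataRequest?.respond(with: Data(rewritten.utf8))
//   loadingRequest.finishLoading()
```

This keeps the URLs AVPlayer has already buffered valid only as long as the old token is honored, so it pairs best with a CDN-side grace period for recently rotated tokens.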
Thanks in advance for any answers.
Post not yet marked as solved
Problem Description
This HLS video https://lf3-vod-cdn-tos.douyinstatic.com/obj/vodsass/hls/main.m3u8 plays with noise starting at 22 seconds when played directly in Safari on macOS 12.6.6, and the same noise appears in Safari on iOS 16.5.1. However, there is no noise when playing it via MSE on the Mac in Safari using a third-party open-source web player such as hls.js.
Test tool
hls.js test demo: https://hlsjs.video-dev.org/demo/
Post not yet marked as solved
Hi
I'm developing a full-duplex iPhone voice chat application, and I'd like to intercept Bluetooth headset button events to perform certain actions in my app while maintaining full-duplex audio. I'm using the MediaPlayer framework to intercept remote Bluetooth AVRCP MPRemoteCommandEvent play/pause events, as well as setting the AVAudioSession to use the BluetoothA2DP category; however, when I do this, I can't seem to use the Bluetooth microphone as an audio input. Specifically, when I query AVAudioSession for available inputs, Bluetooth is not returned. I'm guessing this is because A2DP is a half-duplex protocol, but my understanding is that AVRCP events are only available with A2DP.
The other Bluetooth profile choice is HSP (the AVAudioSession Bluetooth category), which works for full-duplex audio but does not appear to provide a way to intercept the various AT commands from this profile unless I'm in an actual telephone call. For example, when I use HSP and press a button on my headset, I see in the logs the AT+CHUP command being sent from the headset to the phone.
Two questions:
Is there a way to use a bluetooth microphone while using A2DP for output at the same time?
If the above can't be done, is there a way to intercept the HSP AT control commands from a headset without being in a telephone call?
Thanks.
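For question 1, the relevant session options can at least be sketched (this does not work around the A2DP input limitation; it only lets the system choose between the HFP and A2DP routes):

```swift
import AVFoundation

// .allowBluetooth selects the HFP/HSP route (full duplex, includes the
// headset mic); .allowBluetoothA2DP is output-only. With both set, the
// system picks a route, but A2DP still provides no Bluetooth mic input.
func configureVoiceChatSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,
                            options: [.allowBluetooth, .allowBluetoothA2DP])
    try session.setActive(true)
}
```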
Post not yet marked as solved
I'm getting the following error in Xcode and I can't figure out how to fix it.
Cannot load underlying module for 'MediaPlayer'
I've searched Google and have come across lots of other people with similar issues whereby they're importing something and it's producing this error alongside it.
https://stackoverflow.com/questions/76256875/cannot-load-underlying-module-for-scenekit
https://developer.apple.com/forums/thread/115059
https://stackoverflow.com/questions/32673866/cocoapods-cannot-load-underlying-module-for-x
The error is being shown inline with this bit of code:
```swift
import SwiftUI
import MusicKit
import MediaPlayer // <-- this line here
```
I can build and run the project without issue, and the error disappears, but it quickly reappears a short while later, which is very annoying.
How can I resolve this, please?
Post marked as Apple Recommended
We are using AVPlayerViewController to display and play video. Up to iOS 15 the video player options were visible, but on iOS 16 they are not.
Do we need to make changes for iOS 16 to display the video player options?
Post not yet marked as solved
I have noticed changes Apple Music made to my library; consider in particular a changed album edition that is reflected in how the title is listed. I can see the new title in the Music app on two different devices.
On one device MPMediaQuery returns the album with the new title. The other device (an iPad with less memory, in case that matters) is still returning the old edition. Is there anything I can do to make sure the data returned is up to date and matches what is seen in the Music app?
Post not yet marked as solved
I imported the MediaPlayer framework and used MPMusicPlayerPlayParameters in my SwiftUI project.
If the app is built with Xcode 14.3.1 or a newer version, it crashes instantly at launch on both the simulator and a real device.
This problem persists on iOS 16.0 to 16.4 (I haven't tried iOS 15 or below). I also tried to build and run the app with Xcode 15 beta 4, but it still crashes.
For this reason, I archive and submit to the App Store with Xcode 14.2
(a build made with Xcode 14.2 works perfectly!).
Here is the error message:
```
dyld[17235]: Symbol not found: _$sSo27MPMusicPlayerPlayParametersCSe05MediaB0Mc
Referenced from:
<77FEF170-9C51-3580-8F8B-2ADD2F1B3FD1> /Users/[UserName]/Library/Developer/CoreSimulator/Devices/72CE26D8-4DD4-4319-B0C7-DE52D6645875/data/Containers/Bundle/Application/C808623F-5372-40F0-907F-E86E12AE6EDD/[AppName].app/[AppName]
Expected in:
<06F636E1-695C-34F1-809D-7CA24F67AFE9> /Library/Developer/CoreSimulator/Volumes/iOS_20B72/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 16.1.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/MediaPlayer.framework/MediaPlayer
```
I believe this is Apple related issue however if there is any way that can fix the issue, please let me know.
Post not yet marked as solved
If I disable playback controls for an AVPlayer (showsPlaybackControls), some features of MPNowPlayingInfoCenter no longer work (play/pause, skip forward and backward).
I need custom video and audio controls on my AVPlayer in my app, which is why I disabled the iOS playback controls. But I also need the features of MPNowPlayingInfoCenter. Is there another way to achieve this?
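For reference, the usual pairing when the system controls are disabled is to register remote-command handlers and publish now-playing info manually. A hedged sketch; the `player` parameter and the metadata values are placeholders:

```swift
import AVFoundation
import MediaPlayer

// With showsPlaybackControls = false, wire the lock screen / Control
// Center transport to your own player and publish metadata yourself.
func wireRemoteCommands(for player: AVPlayer) {
    let center = MPRemoteCommandCenter.shared()
    _ = center.playCommand.addTarget { _ in player.play(); return .success }
    _ = center.pauseCommand.addTarget { _ in player.pause(); return .success }
    center.skipForwardCommand.preferredIntervals = [15]
    _ = center.skipForwardCommand.addTarget { event in
        guard let skip = event as? MPSkipIntervalCommandEvent else { return .commandFailed }
        player.seek(to: player.currentTime() + CMTime(seconds: skip.interval, preferredTimescale: 600))
        return .success
    }
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: "Example title", // placeholder
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
}
```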
Post not yet marked as solved
Hi Team,
I am trying to play audio with AVPlayer from the background. I fetch data from an API; after the first item finishes playing, I start the next item taken from the API.
First I play audio in the foreground, then put my app in the background. After a few minutes the first audio completes, and I try to initiate AVPlayer with the next audio content from the background, but the audio does not play.
The same audio plays fine in next mode when the app is in the foreground.
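Two prerequisites commonly bite here and may be worth checking (a sketch, under the assumption that the Audio background mode capability is enabled in Signing & Capabilities):

```swift
import AVFoundation

// A .playback session that stays active lets a new AVPlayer item start
// while the app is in the background; without the background audio mode
// (or with the session deactivated between items) the app suspends and
// the next item cannot start.
func configureBackgroundPlayback() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default)
    try session.setActive(true)
}
```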
Post not yet marked as solved
Hi,
I am currently using a third-party audio library to play audio in my app. It's worth noting that my app deals exclusively with network audio, i.e. audio stored on the network: Shoutcast streams and plain remote MP3 files. Is there a way to do this with native Apple APIs? My motivation is that I want to adopt SharePlay, lock screen player support, and other native goodies. I use SwiftUI. P.S. Sorry for the random tags; I am blind and the interface for choosing tags cannot be used with VoiceOver.
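The native baseline for this use case is quite small; a sketch (the URL is a placeholder):

```swift
import AVFoundation

// AVPlayer streams remote MP3 / Shoutcast-style audio directly.
let streamURL = URL(string: "https://example.com/stream.mp3")!
let player = AVPlayer(url: streamURL)
player.play()
// Lock-screen support then comes from MPNowPlayingInfoCenter and
// MPRemoteCommandCenter, and SharePlay from the GroupActivities framework.
```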
Post not yet marked as solved
As of the latest builds of both iOS 16.6 and iOS 17.0, playing albums from a user's library using ApplicationMusicPlayer plays the songs on that album out of order. This is a dealbreaker for my app, and I've had to revert to the Media Player framework for reliable behavior.
If I fetch an album from a MusicLibraryRequest and load it into the queue using the API introduced in 16.4, init(album:startingAt:), it starts at track 1 but then plays the rest of the tracks in random order. This happens whether I skip tracks or let them play through.
The shuffleMode of the player is .off. The issue does not occur with albums fetched from the Apple Music catalog and loaded using that same API, nor does it occur for MPMediaItemCollections loaded into an applicationQueuePlayer via a queue descriptor.
I've submitted this issue as FB12495051 and provided a sysdiagnose. Please let me know if I can provide any other information.
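For readers hitting the same thing, the code path in question looks roughly like this (a sketch of the reported setup, not a fix):

```swift
import MusicKit

// Load a library album into ApplicationMusicPlayer via the 16.4 API.
func playLibraryAlbum() async throws {
    var request = MusicLibraryRequest<Album>()
    request.limit = 1
    guard let album = try await request.response().items.first else { return }
    let detailed = try await album.with(.tracks)
    guard let firstTrack = detailed.tracks?.first else { return }
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(album: detailed, startingAt: firstTrack)
    try await player.play()
}
```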
Post not yet marked as solved
Hi,
I am having an issue with sound on a network video stream; the stream is loaded from an m3u.
During playback there is no audio from the device; however, when using headphones or AirPlay, audio works correctly.
The other peculiar thing is that the simulator works fine. This may be related to AirPlay working, but I don't know.
This is the view handling the playback; I'm not sure where the issue is.
I can also play the videos fine when embedding the AVPlayer in its own view, but that looks messy when you have to dismiss a second window when closing the video.
```swift
#if os(iOS)
import SwiftUI
import AVKit
import MediaPlayer

struct iOSVideoLibraryView: View {
    @ObservedObject var videoLibrary: VideoLibrary
    @State private var isPlayerDismissed = false
    let LiveStreams = [GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible())]
    let VODStreams = [GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible())]

    var body: some View {
        NavigationView {
            ScrollView {
                LazyVGrid(columns: LiveStreams, spacing: 20) {
                    ForEach(videoLibrary.videos, id: \.title) { video in
                        if video.type == "LIVE" {
                            Button(action: {
                                isPlayerDismissed = false // Reset the dismissal flag
                                presentVideoPlayer(videoURL: video.referenceURL)
                            }) {
                                VStack {
                                    Image(systemName: "play.circle.fill")
                                        .font(.system(size: 30)) // icon
                                        .foregroundColor(.blue)
                                    Text(video.title)
                                        .frame(width: 100, height: 50) // title bounds
                                        .font(Font.caption)
                                        .background(Color.blue)
                                        .foregroundColor(.white)
                                        .cornerRadius(3)
                                }
                                .frame(width: 70) // main button container
                                .padding()
                                .background(Color.blue.opacity(0.2))
                                .cornerRadius(10)
                            }
                        } else {
                            // Handle non-LIVE videos
                        }
                    }
                }
                .padding()
            }
            .navigationBarTitle("Live Streams")
        }
    }

    private func presentVideoPlayer(videoURL: URL) {
        let playerViewController = CustomAVPlayerViewController()
        let player = AVPlayer(url: videoURL)
        playerViewController.player = player
        player.isMuted = false
        player.play()
        DispatchQueue.main.async {
            playerViewController.modalPresentationStyle = .fullScreen
            UIApplication.shared.windows.first?.rootViewController?.present(playerViewController, animated: true, completion: nil)
        }
    }
}

class PlayerManager: NSObject, AVPictureInPictureControllerDelegate {
    static let shared = PlayerManager()

    func pictureInPictureControllerWillStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        // Perform any necessary actions when picture-in-picture starts
    }

    func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        // Perform any necessary actions when picture-in-picture stops
    }

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: Error) {
        // Perform any necessary actions when picture-in-picture fails to start
    }
}

class CustomAVPlayerViewController: AVPlayerViewController {
    let playerManager = PlayerManager.shared
    let customPlayer = AVPlayer()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        if AVPictureInPictureController.isPictureInPictureSupported() {
            if customPlayer.currentItem != nil {
                let playerLayer = AVPlayerLayer(player: customPlayer)
                playerLayer.videoGravity = .resizeAspectFill
                let pictureInPictureController = AVPictureInPictureController(playerLayer: playerLayer)
                pictureInPictureController?.delegate = playerManager
                if let pictureInPictureController = pictureInPictureController,
                   pictureInPictureController.isPictureInPicturePossible {
                    pictureInPictureController.startPictureInPicture()
                }
            }
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        customPlayer.addObserver(self, forKeyPath: "currentItem", options: .new, context: nil)
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        customPlayer.removeObserver(self, forKeyPath: "currentItem")
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "currentItem" {
            // Handle player item change
        } else {
            super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        }
    }
}
#endif
```
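One hypothesis worth testing for the silent-on-device symptom (an assumption, not a confirmed diagnosis): by default an app's audio session uses the .ambient category, which the ring/silent switch mutes on the device speaker, while headphones and AirPlay keep playing. Setting .playback avoids that:

```swift
import AVFoundation

// Keep video audio audible regardless of the ring/silent switch.
func activatePlaybackSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .moviePlayback)
    try session.setActive(true)
}
```

Calling this before player.play(), e.g. at the top of presentVideoPlayer, would exercise the hypothesis.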
Hey there Apple Music team! I'm excited to dig into the sessions coming up this week, and what I've seen so far from the developer documentation diffs looks great: audio quality, artist images, and a way to interface with a user's music library in MusicKit. Love it!
The thing at the very top of my WWDC wishlist this year was macOS/Mac Catalyst support for the ApplicationMusicPlayer class. I just finished installing Ventura and Xcode 14, and sadly it looks like the support story is the same as on Big Sur: no API availability on macOS, and an available Mac Catalyst API that ultimately results in the same error from a feedback I submitted on Big Sur: FB9851840
The connection to service named com.apple.Music.MPMusicPlayerApplicationControllerInternal was invalidated: failed at lookup with error 3 - No such process.
Is that the end of the story on Ventura, or is there a chance support might be added in a later beta? Is there any additional detail at all that can be shared? I field several requests a week asking if/when my app is coming to the Mac, and I would really love to be able to make that happen. If there is anything at all I can do to test and help overcome the engineering challenges alluded to in the past, I am ready, willing, and able!
In any case, thanks for the great work, and I'm looking forward to spending time with the new stuff this summer.