Hey all!
In my personal quest to make my apps future-proof by moving to Swift 6, one of my apps has a problem when setting an artwork image in MPNowPlayingInfoCenter.
Here's what I'm using to set the metadata:
import MediaPlayer
import UIKit

func setMetadata(title: String? = nil, artist: String? = nil, artwork: String? = nil) async throws {
    let defaultArtwork = UIImage(named: "logo")!
    var nowPlayingInfo = [
        MPMediaItemPropertyTitle: title ?? "***",
        MPMediaItemPropertyArtist: artist ?? "***",
        MPMediaItemPropertyArtwork: MPMediaItemArtwork(boundsSize: defaultArtwork.size) { _ in
            defaultArtwork
        }
    ] as [String: Any]

    if let artwork = artwork {
        guard let url = URL(string: artwork) else { return }
        let (data, response) = try await URLSession.shared.data(from: url)
        guard (response as? HTTPURLResponse)?.statusCode == 200 else { return }
        guard let image = UIImage(data: data) else { return }
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size) { _ in
            image
        }
    }

    MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
}
The app crashes when hitting:
MPMediaItemPropertyArtwork: MPMediaItemArtwork(boundsSize: defaultArtwork.size) { _ in
    defaultArtwork
}
or:
nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size) { _ in
    image
}
Commenting out these two makes the app work again; I have no clue why. The closest I've come to a workaround is sketched below.
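From what I can tell, this may be Swift 6's stricter concurrency at work: the MPMediaItemArtwork request handler is created in a @MainActor context but invoked by the system off the main thread, which can trip a dispatch assertion at runtime. A minimal sketch of the workaround I'm experimenting with, assuming that diagnosis is right, marks the handler @Sendable so no main-actor isolation is inferred:

// Sketch: @Sendable should prevent @MainActor inference on the handler,
// letting the system call it from a non-main queue without asserting.
// This assumes the crash really is an isolation assertion.
let artwork = MPMediaItemArtwork(boundsSize: defaultArtwork.size) { @Sendable _ in
    defaultArtwork
}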
Thanks in advance
Media Player
Find and play songs, audio podcasts, audio books, and more from within your app using Media Player.
Posts under Media Player tag
My Environment:
Device: Mac (Apple Silicon, arm64)
OS: macOS 15.6.1
Description:
I'm developing a music app and have encountered an issue where I cannot update the playbackState in MPNowPlayingInfoCenter after my app loses audio focus to another app. Even though my app correctly sets MPNowPlayingInfoCenter.default().playbackState = .paused, the system's Now Playing UI (Control Center, Lock Screen, AirPods controls) does not reflect this change. The UI remains stuck until the app that currently holds audio focus also changes its playback state.
I've observed this same behavior in other third-party music apps from the App Store, which suggests it might be a system-level issue.
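For reference, the update itself is just the line below, wrapped here in an illustrative helper; reportPaused is not a real API, only a name for this sketch:

import MediaPlayer

// Sketch: publishing the paused state after the app stops its own audio.
func reportPaused() {
    MPNowPlayingInfoCenter.default().playbackState = .paused
}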
Steps to Reproduce:
Use the two most popular music apps in the Chinese App Store, NetEase Cloud Music and QQ Music (let's call them App A and App B):
Start playback in App A.
Start playback in App B. (App B now has audio focus, and App A is still playing).
Attempt to pause App A via the system's Control Center or its own UI.
Observed Behavior: App A's audio stream stops, but in the system's Now Playing controls, App A still appears to be playing. The progress bar continues to advance, and the pause button becomes unresponsive.
If you then pause App B, the Now Playing UI for App A immediately corrects itself and displays the proper "paused" state.
My Questions:
Is there a specific procedure required to update MPNowPlayingInfoCenter when an app is not the current "Now Playing" application?
Is this a known issue or expected behavior in macOS?
Are there any official workarounds or solutions to ensure the UI updates correctly?
How can media resources in my app be recommended to the system media control center, just like TikTok in the picture?
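For reference, a minimal sketch of the standard Now Playing setup (metadata plus remote commands); my question is what is needed beyond this baseline. The title string is a placeholder:

import MediaPlayer

// Sketch: baseline integration with the system media controls.
func publishToSystemControls() {
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: "Example title" // placeholder
    ]
    let center = MPRemoteCommandCenter.shared()
    _ = center.playCommand.addTarget { _ in .success }
    _ = center.pauseCommand.addTarget { _ in .success }
}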
I want to use Apple ringtones in my app so that the app can play some of them. Could this lead to any legal issues or App Store rejection? What are the guidelines and potential concerns I should be aware of?
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
AudioToolbox
Media Player
Audio
After Picture-in-Picture is opened on the camera interface, the custom UI cannot be displayed. Is there any way to make the custom UI visible? If custom UI cannot be displayed, how do teleprompter-type apps in the store manage to show custom teleprompter text within the camera?
I have a small .mov I created using Screenshot, and I want to use it as an App Preview. I managed to resize it to the required 1920x1080 and add a sound track using ffmpeg (Homebrew), drop it into an iMovie App Preview project, share it as a file, and drag that file to App Store Connect/Apps/myApp/"App Previews and Screenshots", only to have it rejected for "frame rate too high" (30 fps is required). There appears to be no way to specify the frame rate in Screenshot or in iMovie during share. Aside from using the third-party app HandBrake to edit the file, what can be done?
Maybe more importantly, why is 30 fps required when it isn't a standard output of Screenshot or an iMovie App Preview project?
BTW: iMovie's App Preview Help shows submission of non-1920x1080 files to App Store Connect.
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
Image I/O
Media Player
App Store Connect
Hi all,
I'm working on an audio visualizer app that plays files from the user's music library utilizing MediaPlayer and AVAudioEngine. I'm working on getting the music library functionality working before the visualizer aspect.
After setting up the engine for file playback, my app inexplicably crashes with an EXC_BREAKPOINT with code = 1. Usually this means I'm unwrapping a nil value, but I think I'm handling the optionals correctly with guard statements. I'm not able to pinpoint where it's crashing. I think it's either in the play function or the setupAudioEngine function. I removed the processAudioBuffer function and my code still crashes the same way, so it's not that. The device that I'm testing this on is running iOS 26 beta 3, although my app is designed for iOS 18 and above.
After commenting out code, it seems that the app crashes at the scheduleFile call in the play function, but I'm not fully sure.
Here is the setupAudioEngine function:
private func setupAudioEngine() {
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }

    engine.attach(playerNode)
    engine.attach(analyzer)
    engine.connect(playerNode, to: analyzer, format: nil)
    engine.connect(analyzer, to: engine.mainMixerNode, format: nil)

    analyzer.installTap(onBus: 0, bufferSize: 1024, format: nil) { [weak self] buffer, _ in
        self?.processAudioBuffer(buffer)
    }
}
Here is the play function:
func play(_ mediaItem: MPMediaItem) {
    guard let assetURL = mediaItem.assetURL else {
        print("No asset URL for media item")
        return
    }

    stop()

    do {
        audioFile = try AVAudioFile(forReading: assetURL)
        guard let audioFile else {
            print("Failed to create audio file")
            return
        }

        duration = Double(audioFile.length) / audioFile.fileFormat.sampleRate

        if !engine.isRunning {
            try engine.start()
        }

        playerNode.scheduleFile(audioFile, at: nil)
        playerNode.play()

        DispatchQueue.main.async { [weak self] in
            self?.isPlaying = true
            self?.startDisplayLink()
        }
    } catch {
        print("Error playing audio: \(error)")
        DispatchQueue.main.async { [weak self] in
            self?.isPlaying = false
            self?.stopDisplayLink()
        }
    }
}
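One variation I still want to test (a sketch, not a confirmed fix): connect the nodes using the file's processing format instead of nil before scheduling, in case a format mismatch is what trips the trap:

// Sketch: reconnect with the file's actual processing format before
// scheduling, so the engine isn't left to infer a format from nil.
let file = try AVAudioFile(forReading: assetURL)
engine.connect(playerNode, to: analyzer, format: file.processingFormat)
engine.connect(analyzer, to: engine.mainMixerNode, format: file.processingFormat)
playerNode.scheduleFile(file, at: nil)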
Here is a link to my test project if you want to try it out for yourself:
https://github.com/aabagdi/VisualMan-example
Thanks!
Hi Apple Music API / MusicKit / MediaPlayer Team,
Similar to how currentPlaybackRate keeps the same pitch, it would be great to have a currentPlaybackPitch parameter as well. Alternatively, adding a preservesPitch parameter would also work.
I see that iOS 26 AutoMix on Apple Music currently does pitch shifting during music transitions, so maybe this is something that could be exposed in the later betas of iOS 26?
The main feature request we get is to allow simple pitch changes to the Apple Music audio we play through our app; a sketch of the API shape we'd imagine is below. Is this being considered?
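For clarity, the shape we'd imagine (hypothetical API; neither property exists today):

// Hypothetical additions to the music player API, for illustration only:
// player.currentPlaybackPitch = 2.0   // e.g. a shift in semitones
// player.preservesPitch = false       // let rate changes bend pitch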
Topic:
Media Technologies
SubTopic:
General
Tags:
Apple Music API
FairPlay Streaming
Media Player
MusicKit
Just wondering if anyone else is having issues with currentPlaybackRate in the release version of iOS 15.4? In my particular case this is using MPMusicPlayerController.applicationQueuePlayer.
I've always had issues controlling this property reliably, but from what I can see it is now completely non-operational in 15.4.
I've isolated this behavior in a trivial project, and will file a radar, but hoping others may have some insight first.
FWIW- This is my trivial test case:
import MediaPlayer
import UIKit

class ViewController: UIViewController {
    lazy var player: MPMusicPlayerApplicationController = {
        let player = MPMusicPlayerController.applicationQueuePlayer
        player.repeatMode = .none
        player.shuffleMode = .off
        player.beginGeneratingPlaybackNotifications()
        return player
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        NotificationCenter.default.addObserver(forName: .MPMusicPlayerControllerPlaybackStateDidChange, object: nil, queue: .main) { [weak self] notification in
            guard let notificationPlayer = notification.object as? MPMusicPlayerApplicationController,
                  notificationPlayer === self?.player else {
                return
            }
            debugPrint("Player state now: \(notificationPlayer.playbackState)")
        }
    }

    @IBAction func goAction(_ sender: Any) {
        guard let item = MPMediaQuery.songs().items?.randomElement() else {
            debugPrint("Unable to access media items")
            return
        }
        debugPrint("Now playing item: \(item.title ?? "")")
        player.setQueue(with: [item.playbackStoreID])
        player.prepareToPlay() { error in
            guard error == nil else {
                debugPrint("Player error: \(error!.localizedDescription)")
                return
            }
            DispatchQueue.main.async { [weak self] in
                self?.player.play()
            }
        }
    }

    @IBAction func slowAction(_ sender: Any) {
        debugPrint("Setting currentPlaybackRate to 0.5")
        player.currentPlaybackRate = 0.5
        checkPlaybackRate()
    }

    @IBAction func fastAction(_ sender: Any) {
        debugPrint("Setting currentPlaybackRate to 1.5")
        player.currentPlaybackRate = 1.5
        checkPlaybackRate()
    }

    func checkPlaybackRate(afterSeconds delay: TimeInterval = 1.0) {
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
            debugPrint("After \(delay) seconds currentPlaybackRate now: \(self.player.currentPlaybackRate)")
        }
    }
}
Typical console output:
"Now playing item: I Know You Know"
"Player state now: MPMusicPlaybackState(rawValue: 2)"
"Player state now: MPMusicPlaybackState(rawValue: 1)"
"Setting currentPlaybackRate to 1.5"
"After 1.0 seconds currentPlaybackRate now: 1.0"
"Setting currentPlaybackRate to 0.5"
"After 1.0 seconds currentPlaybackRate now: 1.0"
Hi everyone,
I’m working on an iOS MusicKit app that overlays a metronome on top of Apple Music playback, using ApplicationMusicPlayer. To line the clicks up perfectly I’d like access to low-level audio analysis data—ideally a waveform / spectrogram or beat grid—while the track is playing.
I’ve noticed that several approved DJ apps (e.g. djay, Serato, rekordbox) can already:
• Display detailed scrolling waveforms of Apple Music songs
• Scratch, loop or time-stretch those tracks in real time
That implies they receive decoded PCM frames or at least high-resolution analysis data from Apple Music under a special entitlement.
My questions:
Does MusicKit (or any public framework) expose real-time audio buffers, FFT bins, or beat markers for streaming Apple Music content?
If not, is there an Apple program or entitlement that developers can apply for—similar to the “DJ with Apple Music” initiative—to gain that deeper access?
Where can I find official documentation or a point of contact for this kind of request?
I’ve searched the docs and forums but only see standard MusicKit playback APIs, which don’t appear to expose raw audio for DRM-protected songs. Any guidance, links or insider tips on the proper application process would be hugely appreciated!
Thanks in advance.
When I import both AppIntents and MediaPlayer in the same file, I get the error “No such module '_MediaPlayer_AppIntents’”
To reproduce:
Create a new project with Xcode 26 beta, selecting storyboard UI and swift language.
In ViewController.swift add import AppIntents and import MediaPlayer
Build
Get the error: “No such module '_MediaPlayer_AppIntents’”
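That is, these two lines in the same file are enough to reproduce it:

// Both imports in one file trigger the build error on Xcode 26 beta:
// "No such module '_MediaPlayer_AppIntents'"
import AppIntents
import MediaPlayer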
Submitted as FB18189693
We are seeing logs where, on iOS devices, some keyframe requests are made,
but from the Safari browser we don't see any request like this. Could you please explain what this is?
/d8ceb9244ff889b42b82eb807327531-c27dbcb10e0bbf3cde6c-1/d8ceb9244ff88e9b42b82eb807327531-c27dbcb10e0bbf3cde6c-1/keyframes/hls/.
I’m building a music app using Apple Music streaming via ApplicationMusicPlayer.
My goal is to decrease the volume of the current song during the last 10 seconds, and when the next track begins, restore the volume to its normal level.
I know that ApplicationMusicPlayer doesn’t expose a volume API, and I want to avoid triggering the system volume HUD.
✅ Using Apple Music streaming (not local files)
❓ Is it possible to implement per-track fade-out/fade-in logic with ApplicationMusicPlayer?
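For context, detecting the fade window is easy; it's the volume ramp itself that I can't find an API for. A sketch of the detection side, where songDuration is an assumed input taken from the current entry's metadata:

import Foundation
import MusicKit

// Sketch: flag the last 10 seconds of the current track.
// songDuration is assumed, not something this code fetches.
func isInFadeWindow(songDuration: TimeInterval) -> Bool {
    let player = ApplicationMusicPlayer.shared
    return songDuration - player.playbackTime <= 10
}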
Appreciate any clarification or official guidance!
I have an app that displays artwork via MPMediaItem.artwork, requesting an image with a specific size. How do I get a media item's MPMediaItemAnimatedArtwork, and how do I get the preview image and video to display to the user?
Howdy. I'm trying to access media from a user's song library and receive:
<ICUserIdentityStoreACAccountBackend: 0x148f8af30> Failed to initialize active account, error=Error Domain=ICError Code=-7013 "Client is not entitled to access account store" UserInfo={NSDebugDescription=Client is not entitled to access account store}
I'm told I need to add a Media Library Access Capability. Nothing like this shows up in Xcode under Signing & Capabilities > +Capabilities. Also I can't find anything like this in my account in dev.apple.com.
How do I enable myself and a test user using another iPhone device to access my music and their music respectively?
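In case it matters, this is how I'm requesting access at runtime (a sketch; my understanding is that the NSAppleMusicUsageDescription Info.plist key is also required):

import MediaPlayer

// Sketch: runtime request for media library access.
MPMediaLibrary.requestAuthorization { status in
    print("Media library authorization status: \(status.rawValue)")
}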
Thanks!
Topic:
Media Technologies
SubTopic:
General
Tags:
App Tracking Transparency
Media Player
iOS
MusicKit
In iOS 18, CarPlay shows an error: “There was a problem loading this content” after playback starts. Audio works fine, but the Now Playing screen doesn’t load. I’m using MPPlayableContentManager. This worked fine in iOS 17. Anyone else seeing this error in iOS 18?
I am working on an app that needs to play an audio alert while in the background.
During debug testing, the function works fine,
but once I test standalone without the debugger attached, the function does not work; it only plays the sound when I come back to the app.
Is there any way to trace this issue?
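For what it's worth, my understanding is that background playback needs the Audio background mode capability plus an active playback audio session; this is the setup I would double-check first (a sketch, not a confirmed fix):

import AVFoundation

// Sketch: session configuration usually required before background audio.
// Also requires the "Audio, AirPlay, and Picture in Picture" background
// mode under Signing & Capabilities.
try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
try AVAudioSession.sharedInstance().setActive(true)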
Is there any way we can detect the status of the Show When Muted and Show on Skip Back device settings in code?
I downloaded the official camera sample code (https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview). It's a .swiftpm package, so I created a SwiftUI project, copied the official sample code into it, built it, and ran it on an iPhone 13 for testing. I found black empty areas at the top and bottom of the application interface, meaning the camera preview does not fill the screen. I have tried many methods but cannot get a full-screen preview. How can I modify the code?
Let's consider the following code.
I've created an actor that loads a list of .mp3 files from a Bundle and then makes them available for audio playback.
Unfortunately, I'm experiencing a memory leak at the play method, specifically at:
player.play()
From Instruments I get:
_malloc_type_malloc_outlined libsystem_malloc.dylib
start_wqthread libsystem_pthread.dylib
import AVFoundation

// Note: UncheckedSendable, Sound, and AudioPlayerClient.Sound come from
// the surrounding project and are not defined in this snippet.
private actor AudioActor {
    enum Failure: Error {
        case soundsNotLoaded([AudioPlayerClient.Sound: Error])
    }

    enum Player {
        case music(AVAudioPlayer)
    }

    var players: [Sound: Player] = [:]
    let bundles: [Bundle]

    init(bundles: UncheckedSendable<[Bundle]>) {
        self.bundles = bundles.wrappedValue
    }

    func load(sounds: [Sound]) throws {
        try AVAudioSession.sharedInstance().setActive(true, options: [])
        var errors: [Sound: Error] = [:]
        for sound in sounds {
            // `bundle` was undefined in the original; resolve the URL
            // against the stored `bundles` array instead.
            guard let url = bundles
                .compactMap({ $0.url(forResource: sound.name, withExtension: "mp3") })
                .first
            else { continue }
            do {
                self.players[sound] = try .music(AVAudioPlayer(contentsOf: url))
            } catch {
                errors[sound] = error
            }
        }
        guard errors.isEmpty
        else { throw Failure.soundsNotLoaded(errors) }
    }

    func play(sound: Sound, loops: Int?) throws {
        guard let player = self.players[sound]
        else { return }
        switch player {
        case let .music(player):
            player.numberOfLoops = loops ?? -1
            player.play()
        }
    }

    func stop(sound: Sound) throws {
        guard let player = self.players[sound]
        else { throw Failure.soundsNotLoaded([:]) }
        switch player {
        case let .music(player):
            player.stop()
        }
    }
}
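For completeness, this is roughly how the actor gets exercised (a sketch; .background stands in for one of the project's Sound cases):

// Hypothetical usage; assumes a Sound case named .background exists.
let audio = AudioActor(bundles: UncheckedSendable([Bundle.main]))
Task {
    try await audio.load(sounds: [.background])
    try await audio.play(sound: .background, loops: nil)
}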