Media Player


Find and play songs, audio podcasts, audio books, and more from within your app using Media Player.

Posts under Media Player tag

50 Posts


MPMediaItemPropertyArtwork crashes on Swift 6
Hey all! In my personal quest to make future-proof apps by moving to Swift 6, one of my apps has a problem when setting an artwork image in MPNowPlayingInfoCenter. Here's what I'm using to set the metadata:

```
func setMetadata(title: String? = nil, artist: String? = nil, artwork: String? = nil) async throws {
    let defaultArtwork = UIImage(named: "logo")!

    var nowPlayingInfo = [
        MPMediaItemPropertyTitle: title ?? "***",
        MPMediaItemPropertyArtist: artist ?? "***",
        MPMediaItemPropertyArtwork: MPMediaItemArtwork(boundsSize: defaultArtwork.size) { _ in
            defaultArtwork
        }
    ] as [String: Any]

    if let artwork = artwork {
        guard let url = URL(string: artwork) else { return }
        let (data, response) = try await URLSession.shared.data(from: url)
        guard (response as? HTTPURLResponse)?.statusCode == 200 else { return }
        guard let image = UIImage(data: data) else { return }
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size) { _ in
            image
        }
    }

    MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
}
```

The app crashes when hitting either

```
MPMediaItemPropertyArtwork: MPMediaItemArtwork(boundsSize: defaultArtwork.size) { _ in defaultArtwork }
```

or

```
nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size) { _ in image }
```

Commenting these two out makes the app work again. I have no clue why. Thanks in advance!
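One possible explanation, offered as a guess: MPMediaItemArtwork invokes its request handler on an arbitrary background queue, and under Swift 6 strict concurrency a closure created in a @MainActor context inherits that isolation and traps when called off the main actor. A minimal sketch of a commonly suggested workaround, with an illustrative helper name, is to build the artwork in a nonisolated context so the handler carries no actor isolation:

```
import MediaPlayer
import UIKit

// Sketch, assuming the crash is a main-actor isolation assertion: a
// nonisolated helper keeps the request handler free of @MainActor isolation,
// so the system may safely call it from any queue.
nonisolated func makeArtwork(for image: UIImage) -> MPMediaItemArtwork {
    MPMediaItemArtwork(boundsSize: image.size) { _ in image }
}
```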
Replies: 6 · Boosts: 0 · Views: 1.7k · Activity: 4d
MPNowPlayingInfoCenter playbackState fails to update after losing audio focus on macOS
My Environment:
- Device: Mac (Apple Silicon, arm64)
- OS: macOS 15.6.1

Description: I'm developing a music app and have encountered an issue where I cannot update the playbackState in MPNowPlayingInfoCenter after my app loses audio focus to another app. Even though my app correctly calls [MPNowPlayingInfoCenter defaultCenter].playbackState = .paused, the system's Now Playing UI (Control Center, Lock Screen, AirPods controls) does not reflect the change. The UI remains stuck until the app that currently holds audio focus also changes its playback state. I've observed the same behavior in other third-party music apps from the App Store, which suggests it might be a system-level issue.

Steps to Reproduce (using the two most popular music apps in the Chinese App Store, NetEase Cloud Music and QQ Music; call them App A and App B):
1. Start playback in App A.
2. Start playback in App B. (App B now has audio focus, and App A is still playing.)
3. Attempt to pause App A via the system's Control Center or its own UI.

Observed Behavior: App A's audio stream stops, but in the system's Now Playing controls App A still appears to be playing. The progress bar continues to advance, and the pause button becomes unresponsive. If you then pause App B, the Now Playing UI for App A immediately corrects itself and displays the proper "paused" state.

My Questions:
1. Is there a specific procedure required to update MPNowPlayingInfoCenter when an app is not the current "Now Playing" application?
2. Is this a known issue or expected behavior on macOS?
3. Are there any official workarounds or solutions to ensure the UI updates correctly?
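For reference, the usual macOS setup ties playbackState updates to remote command handlers; a minimal sketch, assuming an app managing its own player (whether this helps after losing focus is exactly the open question above, since the system reflects only one Now Playing app at a time):

```
import MediaPlayer

// Minimal sketch: register remote commands and mirror the player's state.
// macOS decides which app the Now Playing UI reflects; state changes from an
// app that is not the current Now Playing app may not be shown immediately.
func configureRemoteCommands(pause: @escaping () -> Void, play: @escaping () -> Void) {
    let center = MPRemoteCommandCenter.shared()
    center.pauseCommand.addTarget { _ in
        pause() // stop the app's own audio first
        MPNowPlayingInfoCenter.default().playbackState = .paused
        return .success
    }
    center.playCommand.addTarget { _ in
        play()
        MPNowPlayingInfoCenter.default().playbackState = .playing
        return .success
    }
}
```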
Replies: 0 · Boosts: 0 · Views: 101 · Activity: 1w
Making an app preview
I have a small .mov I created using Screenshot, and I want to use it as an app preview. I managed to resize it to the required 1920x1080, add a soundtrack using ffmpeg (installed via Homebrew), drop it into an iMovie App Preview project, share it as a file, and drag that file to App Store Connect > Apps > myApp > "App Previews and Screenshots", only to have it rejected for "frame rate too high": 30 fps is required. There appears to be no way to specify the frame rate in Screenshot, nor in iMovie during "share". Aside from using a third-party app like HandBrake to edit the file, what can be done? Maybe more importantly, why is 30 fps required when it isn't a standard output of Screenshot or an iMovie App Preview project? BTW: iMovie's App Preview Help shows submitting non-1920x1080 files to App Store Connect.
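Since ffmpeg is already part of the workflow here, one option is a final re-encode that forces 30 fps while copying the audio untouched (file names are placeholders; untested against App Store Connect's exact validator):

```
ffmpeg -i preview.mov -filter:v fps=30 -c:a copy preview-30fps.mov
```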
Replies: 0 · Boosts: 0 · Views: 84 · Activity: Jul ’25
Execution breakpoint when trying to play a music library file with AVAudioEngine
Hi all, I'm working on an audio visualizer app that plays files from the user's music library using MediaPlayer and AVAudioEngine. I'm getting the music library functionality working before the visualizer aspect. After setting up the engine for file playback, my app inexplicably crashes with an EXC_BREAKPOINT with code = 1. Usually this means I'm unwrapping a nil value, but I think I'm handling the optionals correctly with guard statements, and I'm not able to pinpoint where it's crashing. I think it's either in the play function or the setupAudioEngine function. I removed the processAudioBuffer function and my code still crashes the same way, so it's not that. The device I'm testing on is running iOS 26 beta 3, although my app is designed for iOS 18 and above. After commenting out code, it seems the app crashes at the scheduleFile call in the play function, but I'm not fully sure.

Here is the setupAudioEngine function:

```
private func setupAudioEngine() {
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }

    engine.attach(playerNode)
    engine.attach(analyzer)

    engine.connect(playerNode, to: analyzer, format: nil)
    engine.connect(analyzer, to: engine.mainMixerNode, format: nil)

    analyzer.installTap(onBus: 0, bufferSize: 1024, format: nil) { [weak self] buffer, _ in
        self?.processAudioBuffer(buffer)
    }
}
```

Here is the play function:

```
func play(_ mediaItem: MPMediaItem) {
    guard let assetURL = mediaItem.assetURL else {
        print("No asset URL for media item")
        return
    }

    stop()

    do {
        audioFile = try AVAudioFile(forReading: assetURL)
        guard let audioFile else {
            print("Failed to create audio file")
            return
        }

        duration = Double(audioFile.length) / audioFile.fileFormat.sampleRate

        if !engine.isRunning {
            try engine.start()
        }

        playerNode.scheduleFile(audioFile, at: nil)
        playerNode.play()

        DispatchQueue.main.async { [weak self] in
            self?.isPlaying = true
            self?.startDisplayLink()
        }
    } catch {
        print("Error playing audio: \(error)")
        DispatchQueue.main.async { [weak self] in
            self?.isPlaying = false
            self?.stopDisplayLink()
        }
    }
}
```

Here is a link to my test project if you want to try it yourself: https://github.com/aabagdi/VisualMan-example

Thanks!
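One possibility worth ruling out, offered as a guess: with format: nil the engine infers connection formats before any file is known, and a mismatch with the scheduled file's format can trap inside the engine. A sketch that wires the graph from the file's own processing format (the post doesn't show the analyzer's type, so an EQ node stands in):

```
import AVFoundation

// Minimal sketch, assuming the same graph as the post; AVAudioUnitEQ stands in
// for the unspecified `analyzer` node.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let analyzer = AVAudioUnitEQ()

func connectAndSchedule(_ audioFile: AVAudioFile) throws {
    engine.attach(playerNode)
    engine.attach(analyzer)

    // Connect with the file's processing format instead of `nil`, so the
    // graph agrees with the buffers scheduleFile will deliver.
    let format = audioFile.processingFormat
    engine.connect(playerNode, to: analyzer, format: format)
    engine.connect(analyzer, to: engine.mainMixerNode, format: format)

    try engine.start()
    playerNode.scheduleFile(audioFile, at: nil)
    playerNode.play()
}
```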
Replies: 8 · Boosts: 0 · Views: 591 · Activity: Jul ’25
currentPlaybackPitch or preservesPitch parameter
Hi Apple Music API / MusicKit / MediaPlayer team,

Similar to how currentPlaybackRate keeps the same pitch, it would be great to have a currentPlaybackPitch parameter as well. Alternatively, adding a preservesPitch parameter would also work. I see that iOS 26 AutoMix on Apple Music currently does pitch shifting during music transitions, so maybe this is something that could be exposed in the later betas of iOS 26? The main feature request we get is to have simple pitch changes for the Apple Music tracks we play through our app. Is this being considered?
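For context, pitch control already exists for local, non-DRM playback through AVAudioEngine, which may be why it feels within reach; a minimal sketch (this does not apply to DRM-protected Apple Music streams, which is what the request above is about):

```
import AVFoundation

// Minimal sketch: pitch shifting for local, non-DRM audio with AVAudioEngine.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()
timePitch.pitch = 300 // in cents: +300 raises the pitch by three semitones

engine.attach(player)
engine.attach(timePitch)
engine.connect(player, to: timePitch, format: nil)
engine.connect(timePitch, to: engine.mainMixerNode, format: nil)
```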
Replies: 0 · Boosts: 0 · Views: 392 · Activity: Jul ’25
MPMediaPlayback.currentPlaybackRate no longer working in iOS 15.4?
Just wondering if anyone else is having issues with currentPlaybackRate in the release version of iOS 15.4? In my particular case this is using MPMusicPlayerController.applicationQueuePlayer. I've always had issues controlling this property reliably, but from what I can see it is now completely non-operational in 15.4. I've isolated this behavior in a trivial project and will file a radar, but I'm hoping others may have some insight first. FWIW, this is my trivial test case:

```
class ViewController: UIViewController {
    lazy var player: MPMusicPlayerApplicationController = {
        let player = MPMusicPlayerController.applicationQueuePlayer
        player.repeatMode = .none
        player.shuffleMode = .off
        player.beginGeneratingPlaybackNotifications()
        return player
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        NotificationCenter.default.addObserver(forName: .MPMusicPlayerControllerPlaybackStateDidChange, object: nil, queue: .main) { [weak self] notification in
            guard let notificationPlayer = notification.object as? MPMusicPlayerApplicationController,
                  notificationPlayer === self?.player else {
                return
            }
            debugPrint("Player state now: \(notificationPlayer.playbackState)")
        }
    }

    @IBAction func goAction(_ sender: Any) {
        guard let item = MPMediaQuery.songs().items?.randomElement() else {
            debugPrint("Unable to access media items")
            return
        }
        debugPrint("Now playing item: \(item.title ?? "")")
        player.setQueue(with: [item.playbackStoreID])
        player.prepareToPlay { error in
            guard error == nil else {
                debugPrint("Player error: \(error!.localizedDescription)")
                return
            }
            DispatchQueue.main.async { [weak self] in
                self?.player.play()
            }
        }
    }

    @IBAction func slowAction(_ sender: Any) {
        debugPrint("Setting currentPlaybackRate to 0.5")
        player.currentPlaybackRate = 0.5
        checkPlaybackRate()
    }

    @IBAction func fastAction(_ sender: Any) {
        debugPrint("Setting currentPlaybackRate to 1.5")
        player.currentPlaybackRate = 1.5
        checkPlaybackRate()
    }

    func checkPlaybackRate(afterSeconds delay: TimeInterval = 1.0) {
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
            debugPrint("After \(delay) seconds currentPlaybackRate now: \(self.player.currentPlaybackRate)")
        }
    }
}
```

Typical console output:

```
"Now playing item: I Know You Know"
"Player state now: MPMusicPlaybackState(rawValue: 2)"
"Player state now: MPMusicPlaybackState(rawValue: 1)"
"Setting currentPlaybackRate to 1.5"
"After 1.0 seconds currentPlaybackRate now: 1.0"
"Setting currentPlaybackRate to 0.5"
"After 1.0 seconds currentPlaybackRate now: 1.0"
```
Replies: 25 · Boosts: 0 · Views: 9.1k · Activity: Jul ’25
How can third-party iOS apps obtain real-time waveform / spectrogram data for Apple Music tracks (similar to djay & other DJ apps)?
Hi everyone, I'm working on an iOS MusicKit app that overlays a metronome on top of Apple Music playback, using ApplicationMusicPlayer. To line the clicks up perfectly I'd like access to low-level audio analysis data, ideally a waveform / spectrogram or beat grid, while the track is playing.

I've noticed that several approved DJ apps (e.g. djay, Serato, rekordbox) can already:
- Display detailed scrolling waveforms of Apple Music songs
- Scratch, loop, or time-stretch those tracks in real time

That implies they receive decoded PCM frames, or at least high-resolution analysis data, from Apple Music under a special entitlement.

My questions:
1. Does MusicKit (or any public framework) expose real-time audio buffers, FFT bins, or beat markers for streaming Apple Music content?
2. If not, is there an Apple program or entitlement that developers can apply for, similar to the "DJ with Apple Music" initiative, to gain that deeper access?
3. Where can I find official documentation or a point of contact for this kind of request?

I've searched the docs and forums but only see standard MusicKit playback APIs, which don't appear to expose raw audio for DRM-protected songs. Any guidance, links, or insider tips on the proper application process would be hugely appreciated! Thanks in advance.
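For what it's worth, raw PCM is reachable today only for non-protected audio the app plays itself; a minimal sketch of tapping AVAudioEngine for waveform data, which does not work for DRM-protected Apple Music streams:

```
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: nil)

// Tap the mixer to receive PCM buffers for waveform / FFT analysis.
engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: nil) { buffer, _ in
    guard let samples = buffer.floatChannelData?[0] else { return }
    let frameCount = Int(buffer.frameLength)
    // Peak amplitude of this buffer, e.g. to drive a scrolling waveform view.
    var peak: Float = 0
    for i in 0..<frameCount { peak = max(peak, abs(samples[i])) }
    print("peak:", peak)
}
```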
Replies: 1 · Boosts: 2 · Views: 161 · Activity: Jul ’25
Xcode 26 beta error: “No such module '_MediaPlayer_AppIntents’”
When I import both AppIntents and MediaPlayer in the same file, I get the error “No such module '_MediaPlayer_AppIntents’”.

To reproduce:
1. Create a new project with Xcode 26 beta, selecting storyboard UI and the Swift language.
2. In ViewController.swift, add import AppIntents and import MediaPlayer.
3. Build.
4. Get the error: “No such module '_MediaPlayer_AppIntents’”

Submitted as FB18189693.
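For concreteness, the failing file reduces to just the two imports side by side, exactly as the steps describe:

```
// ViewController.swift in a fresh storyboard-based project
import AppIntents
import MediaPlayer
// Building with Xcode 26 beta reportedly fails here with:
// “No such module '_MediaPlayer_AppIntents'”
```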
Replies: 1 · Boosts: 0 · Views: 108 · Activity: Jun ’25
Can I Fade Out Track Volume Before End Using ApplicationMusicPlayer?
I'm building a music app using Apple Music streaming via ApplicationMusicPlayer. My goal is to decrease the volume of the current song during the last 10 seconds, and when the next track begins, restore the volume to its normal level. I know that ApplicationMusicPlayer doesn't expose a volume API, and I want to avoid triggering the system volume HUD.

✅ Using Apple Music streaming (not local files)
❓ Is it possible to implement per-track fade-out/fade-in logic with ApplicationMusicPlayer?

Appreciate any clarification or official guidance!
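For contrast, local-file playback does offer exactly this via AVAudioPlayer's built-in fade; a minimal sketch, assuming a non-DRM file URL (ApplicationMusicPlayer exposes no equivalent, as noted above):

```
import AVFoundation

// Sketch under the assumption of a local (non-DRM) file; `fileURL` is a placeholder.
func playWithEndFade(fileURL: URL) throws {
    let player = try AVAudioPlayer(contentsOf: fileURL)
    player.play()

    // Start fading to silence 10 seconds before the end; the closure's strong
    // capture keeps the player alive for the duration of the track.
    let fadeStart = max(0, player.duration - 10)
    DispatchQueue.main.asyncAfter(deadline: .now() + fadeStart) {
        player.setVolume(0, fadeDuration: 10)
    }
}
```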
Replies: 0 · Boosts: 0 · Views: 51 · Activity: Jun ’25
All boys wanna access Media Library (sung to All Gurls Wanna Have Fun)
Howdy. I'm trying to access media from a user's song library and receive:

```
<ICUserIdentityStoreACAccountBackend: 0x148f8af30> Failed to initialize active account, error=Error Domain=ICError Code=-7013 "Client is not entitled to access account store" UserInfo={NSDebugDescription=Client is not entitled to access account store}
```

I'm told I need to add a Media Library Access capability. Nothing like this shows up in Xcode under Signing & Capabilities > +Capabilities. I also can't find anything like this in my account on dev.apple.com. How do I enable myself, and a test user on another iPhone, to access my music and their music respectively? Thanks!
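One thing commonly missing in this situation (hedging, since the full project setup isn't shown): media library access is granted via the NSAppleMusicUsageDescription Info.plist key plus a runtime authorization request, not a Signing & Capabilities entry. A minimal sketch:

```
import MediaPlayer

// Requires the NSAppleMusicUsageDescription key ("Privacy - Media Library
// Usage Description") in Info.plist; without it, access is denied outright.
MPMediaLibrary.requestAuthorization { status in
    if status == .authorized {
        let songs = MPMediaQuery.songs().items ?? []
        print("Found \(songs.count) songs")
    } else {
        print("Media library access not granted: \(status.rawValue)")
    }
}
```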
Replies: 0 · Boosts: 0 · Views: 126 · Activity: Jun ’25
The camera preview cannot be displayed in full screen
I downloaded the official camera sample code (https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview). It's a .swiftpm package, so I created a SwiftUI project, copied the official sample code into it, built it, and ran it on an iPhone 13 for testing. I found black empty areas at the top and bottom of the application interface, meaning the app cannot display the preview in full screen. I have tried many methods but cannot get a full-screen preview. How can I modify the code?
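Without seeing the modified project, a frequent culprit in SwiftUI camera previews is the preview view respecting the safe area; a hypothetical wrapper illustrating the common fix (the `CameraPreview` placeholder stands in for the sample's preview view):

```
import SwiftUI

// Hypothetical wrapper: fill the whole screen, extending under the notch
// and home indicator, and crop rather than letterbox the preview.
struct FullScreenCameraView<Preview: View>: View {
    let preview: Preview

    var body: some View {
        preview
            .scaledToFill()                                      // fill, cropping edges
            .frame(maxWidth: .infinity, maxHeight: .infinity)
            .ignoresSafeArea()                                   // no black bars top/bottom
    }
}
```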
Replies: 1 · Boosts: 0 · Views: 146 · Activity: Apr ’25
Memory leak in AVAudioPlayer
Let's consider the following code. I've created an actor that loads a list of .mp3 files from a Bundle and then makes them available for audio playback. Unfortunately, I'm experiencing a memory leak at the play method:

```
player.play()
```

From Instruments I get:

```
_malloc_type_malloc_outlined libsystem_malloc.dylib
start_wqthread libsystem_pthread.dylib
```

```
private actor AudioActor {
    enum Failure: Error {
        case soundsNotLoaded([AudioPlayerClient.Sound: Error])
    }

    enum Player {
        case music(AVAudioPlayer)
    }

    var players: [Sound: Player] = [:]
    let bundles: [Bundle]

    init(bundles: UncheckedSendable<[Bundle]>) {
        self.bundles = bundles.wrappedValue
    }

    func load(sounds: [Sound]) throws {
        try AVAudioSession.sharedInstance().setActive(true, options: [])
        var errors: [Sound: Error] = [:]
        for sound in sounds {
            // Note: the original referenced an undefined `bundle`; searching the
            // stored `bundles` for the resource is presumably the intent.
            guard let url = bundles.compactMap({ $0.url(forResource: sound.name, withExtension: "mp3") }).first else { continue }
            do {
                self.players[sound] = try .music(AVAudioPlayer(contentsOf: url))
            } catch {
                errors[sound] = error
            }
        }
        guard errors.isEmpty else {
            throw Failure.soundsNotLoaded(errors)
        }
    }

    func play(sound: Sound, loops: Int?) throws {
        guard let player = self.players[sound] else { return }
        switch player {
        case let .music(player):
            player.numberOfLoops = loops ?? -1
            player.play()
        }
    }

    func stop(sound: Sound) throws {
        guard let player = self.players[sound] else {
            throw Failure.soundsNotLoaded([:])
        }
        switch player {
        case let .music(player):
            player.stop()
        }
    }
}
```
Replies: 0 · Boosts: 0 · Views: 73 · Activity: Mar ’25