I am working on an app that needs to play an alert sound in the background.
During debug testing the function works fine, but once I test a standalone build without the debugger attached, it does not work; the sound only plays when I bring the app back to the foreground.
Is there any way to trace this issue?
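For reference, one common cause of exactly this symptom is the audio-session configuration, which a debug session can mask. A minimal sketch of the setup background playback requires, assuming the alert is played with AVAudioPlayer; the target also needs the "audio" entry in UIBackgroundModes (Signing & Capabilities > Background Modes):

import AVFoundation

func prepareBackgroundAudio() throws {
    let session = AVAudioSession.sharedInstance()
    // The default .soloAmbient category is silenced in the background;
    // .playback keeps audio running when the app moves to the background.
    try session.setCategory(.playback, mode: .default)
    try session.setActive(true)
}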
Media Player
Find and play songs, audio podcasts, audio books, and more from within your app using Media Player.
Posts under Media Player tag
46 Posts
Is there any way to detect the status of the Show When Muted and Show on Skip Back device settings in code?
I downloaded the official camera sample code (https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview). It's a .swiftpm package, so I created a new SwiftUI project, copied the sample code into it, built it, and ran it on an iPhone 13. There are black empty areas at the top and bottom of the interface, meaning the app cannot show the camera preview in full screen. I have tried many approaches but still cannot get a full-screen preview. How should I modify the code?
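A hedged sketch of one possible fix, assuming the sample's structure (ViewfinderView and DataModel are the names used in that tutorial; treat the details as assumptions): let the preview fill the screen and extend under the safe areas.

struct FullScreenCameraView: View {
    @StateObject private var model = DataModel()

    var body: some View {
        ViewfinderView(image: $model.viewfinderImage)
            // Fill instead of fit, so the 4:3 camera frame is cropped
            // rather than letterboxed with black bars.
            .aspectRatio(contentMode: .fill)
            // Extend under the notch and the home indicator.
            .ignoresSafeArea()
            .task {
                await model.camera.start()
            }
    }
}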
Let's consider the following code.
I've created an actor that loads a list of .mp3 files from a Bundle and then makes them available for audio playback.
Unfortunately, I'm experiencing a memory leak in the play method, at player.play().
From Instruments I get:
_malloc_type_malloc_outlined libsystem_malloc.dylib
start_wqthread libsystem_pthread.dylib
private actor AudioActor {
    enum Failure: Error {
        case soundsNotLoaded([AudioPlayerClient.Sound: Error])
    }

    enum Player {
        case music(AVAudioPlayer)
    }

    var players: [Sound: Player] = [:]
    let bundles: [Bundle]

    init(bundles: UncheckedSendable<[Bundle]>) {
        self.bundles = bundles.wrappedValue
    }

    func load(sounds: [Sound]) throws {
        try AVAudioSession.sharedInstance().setActive(true, options: [])
        var errors: [Sound: Error] = [:]
        for sound in sounds {
            // Search every provided bundle for the sound file.
            guard let url = bundles.lazy
                .compactMap({ $0.url(forResource: sound.name, withExtension: "mp3") })
                .first
            else { continue }
            do {
                self.players[sound] = try .music(AVAudioPlayer(contentsOf: url))
            } catch {
                errors[sound] = error
            }
        }
        guard errors.isEmpty
        else { throw Failure.soundsNotLoaded(errors) }
    }

    func play(sound: Sound, loops: Int?) throws {
        guard let player = self.players[sound]
        else { return }
        switch player {
        case let .music(player):
            player.numberOfLoops = loops ?? -1
            player.play()
        }
    }

    func stop(sound: Sound) throws {
        guard let player = self.players[sound]
        else { throw Failure.soundsNotLoaded([:]) }
        switch player {
        case let .music(player):
            player.stop()
        }
    }
}
Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now-playing item would be set to nil and you would receive a notification that the player stopped.
In iOS 12 and later, nowPlayingItem still contains the current song and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or any remote access) generates the same conditions making it difficult to correctly detect the difference.
It would be nice if they added a notification that playback was done (similar to the other players).
Any suggestions?
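For reference, a sketch of the position-based heuristic that follows from the behavior described above: when the pause notification arrives, compare the playback position to the item duration to tell a natural end from a user pause. The one-second tolerance is an arbitrary assumption.

import MediaPlayer

let player = MPMusicPlayerController.systemMusicPlayer
player.beginGeneratingPlaybackNotifications()

// Retain this token for as long as the observation should live.
let token = NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    guard player.playbackState == .paused,
          let item = player.nowPlayingItem else { return }
    // Heuristic: a pause within one second of the end counts as "finished".
    let remaining = item.playbackDuration - player.currentPlaybackTime
    if remaining < 1.0 {
        print("Playback likely finished:", item.title ?? "?")
    }
}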
I am using MusicKit's ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists that contain hundreds of songs. When I try to play a collection of more than roughly 300 songs, I always get an error message saying:
"Prepare to play failed" UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}))
It doesn't matter if songs are downloaded to the device or not.
I am aware that there is another initializer for player's queue that accepts Playlist instances but in my app users can choose to sort playlist tracks in different order than the default and that makes using that initializer not feasible for me.
I tried everything I could think of, including falling back on MPMusicPlayerController and passing an array of MPMusicPlayerPlayParameters to it, but the result was the same.
typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared
let entries: [QueueEntry] = tracks
    .compactMap {
        guard let song = $0 as? Song else { return nil }
        return QueueEntry(song)
    }

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // prepareToPlay failed
    } catch {
        print(error)
    }
}
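One workaround that might sidestep the timeout (an assumption, not a confirmed fix): seed the queue with a small batch so the initial prepare-to-play is cheap, then append the remainder once playback has started. A sketch reusing player and tracks from above; the batch size of 100 is arbitrary:

let songs: [Song] = tracks.compactMap { $0 as? Song }
let firstBatch = Array(songs.prefix(100))
let remainder = Array(songs.dropFirst(100))

Task(priority: .high) { [player] in
    do {
        player.queue = ApplicationMusicPlayer.Queue(for: firstBatch)
        try await player.play()
        // Append the rest only after playback has started.
        try await player.queue.insert(remainder, position: .tail)
    } catch {
        print(error)
    }
}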
In SwiftUI there is a built-in component for displaying album artwork, ArtworkImage, but there is no equivalent for UIKit.
My current approach is to use the .url() method to read the image's URL and then download the image or read it from disk, but the performance is much worse than it was with MPMediaItem's artwork and its image(at:) method.
let artworkQueue = DispatchQueue(
    label: "MusicKit-ArtworkQueue",
    qos: .default,
    attributes: .concurrent
)
let artworkSemaphore = DispatchSemaphore(value: 5)

extension Song {
    func artworkImage(for size: CGSize, completion: @escaping (UIImage?) -> Void) {
        artworkQueue.async {
            artworkSemaphore.wait()
            defer {
                artworkSemaphore.signal()
            }
            let imageURL = artwork?.url(
                width: Int(size.width),
                height: Int(size.height)
            )
            // I hate doing this as it might very well break in the future
            guard let imageURL, imageURL.scheme == "musicKit"
            else {
                return completion(nil)
            }
            guard let imageData = try? Data(contentsOf: imageURL),
                  let image = UIImage(data: imageData) else {
                return completion(nil)
            }
            completion(image)
        }
    }
}
I really dislike this approach because it feels hacky, but it somewhat works. You might ask what the semaphore is for: without it, I noticed MusicKit choking after reading too many artworks at once.
Can someone from Apple please provide us with an example of how to use MusicKit with UIKit properly?
Ideally (IMO) we would have a method defined on Song and the other MusicKit structures that returns the image for us, just like MPMediaItem's artwork. It would make our lives so much easier.
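For comparison, a sketch of the same approach expressed with async/await instead of a semaphore-gated queue. Data(contentsOf:) is kept deliberately, since it is unclear whether URLSession would serve the musicKit:// URLs described above (an assumption either way):

import MusicKit
import UIKit

extension Song {
    func artworkImage(for size: CGSize) async -> UIImage? {
        guard let imageURL = artwork?.url(
            width: Int(size.width),
            height: Int(size.height)
        ) else { return nil }

        // Hop off the caller's executor for the blocking read.
        let data = try? await Task.detached(priority: .utility) {
            try Data(contentsOf: imageURL)
        }.value
        return data.flatMap(UIImage.init(data:))
    }
}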
When setting the now playing info for playing media in MPNowPlayingInfoCenter we can set artwork. But it seems the Apple API for creating the artwork is crashing on iOS 18 (FB15145734).
On iOS 17 this gave the warning that the completion handler was not run on the main thread.
I've tried to seek help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231
but it seems that it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.
.task {
    await MainActor.run {
        let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
        var nowPlayingInfo = [String: Any]()

        let image = NSImage(named: "image")!

        // warning: data race detected: @MainActor function at
        // MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(
            boundsSize: image.size,
            requestHandler: { _ in
                // Not on main thread here!
                return image
            }
        )
        nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    }
}
I'm wondering if there is an alternative method to set the now playing artwork?
Hi,
I just generated an HDR10 MVHEVC file; the mediainfo output is below:
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Codec configuration box : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
and uploaded the segment files and prog_index.m3u8 to a web server.
I found that the HLS stream cannot be played in Safari;
the URL is http://ip/vod/prog_index.m3u8.
However, if I remove the Transfer characteristics : PQ tag when generating the MVHEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play in Safari.
Is there any way to play HLS PQ video in Safari? Thanks.
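One detail that may matter here (an assumption about the setup, since only the media playlist prog_index.m3u8 is mentioned): Safari is normally pointed at a multivariant playlist whose EXT-X-STREAM-INF declares the video range, with VIDEO-RANGE=PQ for HDR10 content. A sketch of such a playlist referencing the generated media playlist; the BANDWIDTH, CODECS, and RESOLUTION values are placeholders:

#EXTM3U
#EXT-X-VERSION:7
#EXT-X-STREAM-INF:BANDWIDTH=20000000,CODECS="hvc1.2.4.L153.B0",VIDEO-RANGE=PQ,RESOLUTION=3840x2160
prog_index.m3u8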
Hello. I am attempting to display the music playing inside my app in Now Playing. I've tried a few different methods and keep running into unknown issues. I'm new to Objective-C and Apple development, so I'm at a loss for how to continue.
Currently, I have an external call to viewDidLoad upon initialization. Then, when I'm ready to play the music, I call playMusic. I have it hardcoded to play an MP3 called "1". I believe I have all the signing set up, as the music keeps playing after I exit the app. However, nothing appears in Now Playing. There are no errors or issues that I can see while the app is running. This is the only file I have in Xcode relating to this feature.
Please let me know where I'm going wrong or if there is another object I need to use!
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController <AVAudioPlayerDelegate>
@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) MPRemoteCommandCenter *commandCenter;
@property (nonatomic, strong) MPMusicPlayerController *controller;
@property (nonatomic, strong) MPNowPlayingSession *nowPlayingSession;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    NSLog(@"viewDidLoad started.");
    [self setupAudioSession];
    [self initializePlayer];
    [self createNowPlayingSession];
    [self configureNowPlayingInfo];
    NSLog(@"viewDidLoad completed.");
}

- (void)setupAudioSession {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    NSError *setCategoryError = nil;
    if (![audioSession setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError]) {
        NSLog(@"Error setting category: %@", [setCategoryError localizedDescription]);
    } else {
        NSLog(@"Audio session category set.");
    }

    NSError *activationError = nil;
    if (![audioSession setActive:YES error:&activationError]) {
        NSLog(@"Error activating audio session: %@", [activationError localizedDescription]);
    } else {
        NSLog(@"Audio session activated.");
    }
}

- (void)initializePlayer {
    NSString *soundFilePath = [NSString stringWithFormat:@"%@/base/game/%@", [[NSBundle mainBundle] resourcePath], @"bgm/1.mp3"];
    if (!soundFilePath) {
        NSLog(@"Audio file not found.");
        return;
    }
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    self.player = [AVPlayer playerWithURL:soundFileURL];
    NSLog(@"Player initialized with URL: %@", soundFileURL);
}

- (void)createNowPlayingSession {
    self.nowPlayingSession = [[MPNowPlayingSession alloc] initWithPlayers:@[self.player]];
    NSLog(@"Now Playing Session created with players: %@", self.nowPlayingSession.players);
}

- (void)configureNowPlayingInfo {
    MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];

    CMTime duration = self.player.currentItem.duration;
    Float64 durationSeconds = CMTimeGetSeconds(duration);

    CMTime currentTime = self.player.currentTime;
    Float64 currentTimeSeconds = CMTimeGetSeconds(currentTime);

    NSDictionary *nowPlayingInfo = @{
        MPMediaItemPropertyTitle: @"Example Title",
        MPMediaItemPropertyArtist: @"Example Artist",
        MPMediaItemPropertyPlaybackDuration: @(durationSeconds),
        MPNowPlayingInfoPropertyElapsedPlaybackTime: @(currentTimeSeconds),
        MPNowPlayingInfoPropertyPlaybackRate: @(self.player.rate)
    };

    infoCenter.nowPlayingInfo = nowPlayingInfo;
    NSLog(@"Now Playing info configured: %@", nowPlayingInfo);
}

- (void)playMusic {
    [self.player play];
    [self createNowPlayingSession];
    [self configureNowPlayingInfo];
}

- (void)pauseMusic {
    [self.player pause];
    [self configureNowPlayingInfo];
}

@end
So I'm using AVAudioEngine. When playing audio I become the 'now playing' app using MPNowPlayingInfoCenter/MPRemoteCommandCenter APIs.
When configuring MPRemoteCommandCenter I add a play/pause command target via -addTargetWithHandler on the togglePlayPauseCommand property.
Now I also have a play/pause button in my app's UI. When I pause playback from my app's UI (meaning I'm the active app, in the foreground), what I do is this:
I pause the AVAudioPlayerNode I'm using with AVAudioEngine. I do not stop, reset, etc. the AVAudioEngine; I only pause the player node. My thought process is that the user just pressed pause and is very likely to hit play again to resume playback soon, because my app is in the foreground and the user just hit the pause button.
Now, if my app moves to the background and I receive a memory warning, I presume it would make sense to tear down the engine or pause it. Perhaps I'm wrong about this?
When I initially hit the play button in my app's UI, I also activate my AVAudioSession. I do this in a high-priority NSOperation, since the documentation warns that "we recommend that applications not activate their session from a thread where a long blocking operation will be problematic."
So now I'm playing and I hit pause from my app's UI. Then I quickly bring up the Now Playing center and I see that I'm the Now Playing app, but the play/pause button shows the pause icon instead of the play icon, even though I'm in the paused state. I do set MPNowPlayingInfoCenter's playbackState to MPNowPlayingPlaybackStatePaused when I pause. Not surprisingly, this doesn't work: the documentation states it is for macOS only.
So the only way to get MPRemoteCommandCenter to show the play image for the play/pause button is to deactivate my AVAudioSession when I pause playback? Since I change the active state of my audio session in an NSOperation, per the documentation's recommendation above, the play/pause toggle in the remote command center won't update immediately, because I'm doing it on another thread.
IMO it feels inappropriate for a play/pause button to wait on an NSOperation activating the audio session before updating its UI when I already know my play/paused state; it should update right away, like the button in my app does. Wouldn't it be nicer to just use MPNowPlayingInfoCenter's playbackState property on iOS too? If I'm no longer the Now Playing app/active audio session, it doesn't matter, since I'm not in the Now Playing UI; just ignore it.
Also, is it recommended that I deactivate my audio session explicitly every time the user pauses audio in my app (while I'm in the foreground)?
Also, when I do deactivate the audio session, I get an error, AVAudioSessionErrorCodeIsBusy (though the button in the Now Playing center does update to the proper image). I do this:
- (void)pause
{
    [self.playerNode pause];
    [self runOperationToDeactivateAudioSession];

    // This does nothing on iOS:
    MPNowPlayingInfoCenter *nowPlayingCenter = [MPNowPlayingInfoCenter defaultCenter];
    nowPlayingCenter.playbackState = MPNowPlayingPlaybackStatePaused;
}
So in -runOperationToDeactivateAudioSession I get the AVAudioSessionErrorCodeIsBusy error. According to the documentation:
Starting in iOS 8, if the session has running I/Os at the time that deactivation is requested, the session will be deactivated, but the method will return NO and populate the NSError with the code property set to AVAudioSessionErrorCodeIsBusy to indicate the misuse of the API.
So pausing the player node isn't enough to meet the deactivation criteria; I guess I have to pause or stop the audio engine. I could wait until I receive a scene-went-to-background notification or similar before deactivating my audio session (which is async, so the button may not update to the correct image in time). This seems like a lot of code to write just to get a play/pause toggle to update, especially in an iPad multi-window scene environment.
What's the recommended approach?
Should I always pause the AVAudioEngine itself instead of just the player node?
Should I always explicitly deactivate my audio session when the user pauses playback from my app's UI even if I'm in the foreground?
I personally like the idea of just being able to set
[MPNowPlayingInfoCenter defaultCenter].playbackState = MPNowPlayingPlaybackStatePaused;
but maybe that's because it would make things easier for me. This does feel overcomplicated, though. If anyone can share tips on how I should handle this, I'd appreciate it.
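For reference, a sketch of the flow suggested by the documentation excerpt above: pausing the engine (not just the player node) stops the running I/O that AVAudioSessionErrorCodeIsBusy complains about, after which deactivation should succeed. playerNode, engine, and sessionQueue are placeholders for the objects described in the post:

func pausePlayback() {
    playerNode.pause()
    engine.pause() // stops the running I/O; the prepared state is kept

    // Deactivate off the main thread, per the documentation's advice.
    sessionQueue.async {
        do {
            try AVAudioSession.sharedInstance().setActive(
                false,
                options: .notifyOthersOnDeactivation
            )
        } catch {
            print("Deactivating the session failed: \(error)")
        }
    }
}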
In our Apple TV application, we use the native AVPlayer for live playback functionality. Until tvOS 17.6 and during the tvOS 18 beta, the Pause/Resume feature worked as expected, allowing us to pause live playback. However, after updating to tvOS 18.1, the pause functionality no longer works.
The same app still works fine on tvOS 17, but on tvOS 18, attempting to pause live playback has no effect. We reviewed the tvOS 18 release notes but couldn't find any relevant changes or deprecations related to AVPlayer or live playback behavior.
Has there been any change in the handling of live playback or the Pause/Resume functionality in tvOS 18.1? Any guidance or suggestions to address this issue would be greatly appreciated.
Thank you!
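Not an explanation for the regression, but a diagnostic sketch that might narrow it down: pausing live content depends on the stream exposing a seekable (DVR) window, so logging it at the moment pause fails could show whether the window collapsed. player stands for the AVPlayer described above:

// Diagnostic sketch: log the seekable window before pausing.
if let item = player.currentItem {
    for value in item.seekableTimeRanges {
        let range = value.timeRangeValue
        print("seekable:",
              CMTimeGetSeconds(range.start), "...",
              CMTimeGetSeconds(CMTimeRangeGetEnd(range)))
    }
}
player.pause()
print("timeControlStatus after pause:", player.timeControlStatus.rawValue)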
Topic: Media Technologies
SubTopic: Streaming
Tags: FairPlay Streaming, Media Player, HTTP Live Streaming
I'm playing library items (MPMediaItem) and Apple Music tracks (Track) in MPMusicPlayerApplicationController.applicationQueuePlayer, but I can't use the actual queue functionality because I can't figure out how to get both media types into the same queue. If there's a way to do that, it would solve my problem, but I've given up on it.
Because I can't use a queue, I have to detect when a song ends so that I can put the next song in the queue and play it. The only way I've found to detect when a song ends is to watch the playbackState, and I've actually got that mostly working, but it's really ugly, because you also get a paused playbackState when a Bluetooth speaker disconnects, etc.
The only answer I've been able to find on the internet is to watch MPMusicPlayerControllerNowPlayingItemDidChange and, when it fires with nowPlayingItem set to nil, treat the song as ended. But that's not the case: when a song ends, nowPlayingItem remains the same. There's got to be an answer to this problem, right?
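On the first problem, a sketch that may get both media types into one queue (an assumption, not a confirmed solution): MPMediaItem exposes playbackStoreID (iOS 10.3+), so a store-IDs queue can be built from library items and catalog tracks alike. For purely local items playbackStoreID can be "0", which would break this:

import MediaPlayer
import MusicKit

// mediaItems: [MPMediaItem] and tracks: [Track] are placeholders for
// the two collections described above.
func enqueueMixed(mediaItems: [MPMediaItem], tracks: [Track]) {
    let player = MPMusicPlayerController.applicationQueuePlayer
    let libraryIDs = mediaItems.map(\.playbackStoreID)
    let catalogIDs = tracks.map { $0.id.rawValue }
    let descriptor = MPMusicPlayerStoreQueueDescriptor(
        storeIDs: libraryIDs + catalogIDs
    )
    player.setQueue(with: descriptor)
    player.play()
}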
In our Apple TV application, we are using the native AVPlayer for live playback functionality. During live restart playback, we intermittently encounter an error when the playback timeline approaches the actual live event end time.
Error:
The operation couldn’t be completed. (CoreMediaErrorDomain error -16839 - Unable to get playlist before long download timer) / Failure reason:
Scenario:
The live event is scheduled from 7:00 AM to 8:00 AM.
Restart playback begins at 7:20 AM, allowing the user to watch the event from the start while the live stream continues in real-time.
As the restart playback timeline approaches the actual event end time (8:00 AM), AVPlayer displays an error, and playback continues in the background.
Topic: Media Technologies
SubTopic: Streaming
Tags: FairPlay Streaming, Media Player, HTTP Live Streaming
Other applications (Calibre, Thorium Reader, ...) support media overlays even if rendition:layout is set to reflowable.
Why not Apple Books?
Apple Books only supports media overlays on pre-paginated content.
I use an AVPlayer in a window view. I found that when I move the window to different positions, the default behavior is that the sound changes according to the window position. However, in some cases I don't want this default behavior; I would like the sound not to change.
Hi, I'm wondering about one of the properties in the MPNowPlayingInfoCenter: MPNowPlayingInfoPropertyElapsedPlaybackTime. The docs say that updating this property frequently is not required, because the system can automatically calculate elapsed playback time based on the infrequent values we provide.
Is performance harmed by updating this property every second? Should I add some filtering/throttling to update this property infrequently? Am I overthinking this, and it doesn't matter either way?
Kind regards.
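For context, a sketch of the event-driven pattern the docs describe: publish the elapsed time only when it jumps (play, pause, seek, rate change) and let the system interpolate in between, so a per-second timer shouldn't be needed. Names are placeholders:

import MediaPlayer

func publishProgress(elapsed: TimeInterval, rate: Float) {
    let center = MPNowPlayingInfoCenter.default()
    var info = center.nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = elapsed
    // Rate 1 tells the system to keep advancing the displayed time;
    // rate 0 tells it to hold the position while paused.
    info[MPNowPlayingInfoPropertyPlaybackRate] = rate
    center.nowPlayingInfo = info
}

// Call on play/resume, on pause, and after every seek,
// not from a repeating timer.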
I've been enjoying the iOS 18 beta and wanted to share something I noticed since upgrading. When I stop an episode midway through and add it back to my queue, my phone now skips the episode as already played. So I either miss the rest of the episode if I don't catch it, or I have to go back and reset it to hear the partially finished episode. Hopefully this can be addressed in future updates! :)
Hey there, I'm trying to display all of a user's albums using the MediaPlayer library. Many albums return nil for artwork, but I know the artwork exists because it shows up in the stock Music app. There doesn't seem to be much rhyme or reason to what shows up and what doesn't: all downloaded albums display artwork, and some cloud album artwork displays as well. Here's the code I'm using to debug this.
let query = MPMediaQuery.albums()
if let albumCollections = query.collections {
    albums = albumCollections
}

for album in albums {
    let artwork = album.representativeItem?.artwork
    print(artwork, artwork?.image(at: CGSize(width: 100, height: 100)))
}
Any help would be greatly appreciated. Thanks!
Hello!
I am working on an app connected to an external streamer.
I would like to display the currently playing song on the Lock Screen.
I tried updating the information in MPNowPlayingInfoCenter, but I need to play a sound on my iPhone for the controls to be displayed.
Is there a way to do this without playing a sound?
If not, would playing a silent sound be the only solution? Would that be validated by Apple? :-/
Thank you
Frederic