A couple of weeks ago I got help here playing one song; the solution to my problem was that I wasn't adding the song (a Track) to the queue correctly. Now I want to add a whole playlist's worth of songs to the queue, but when I try to add an array of Track I get an error. The other part of this issue for me is: how do I access an individual song off the queue after I add it? I see I can do ApplicationMusicPlayer.shared.queue.currentEntry, but I think I'm missing or misunderstanding something here. Anyway, I'll post the code I have to show how I'm attempting this at the moment.
In this scenario we're getting passed in a playlist from another view.
import SwiftUI
import MusicKit

struct PlayBackView: View {
    @State var song: Track?
    @State private var songs: [Track] = []
    @State var playlist: Playlist

    private let player = ApplicationMusicPlayer.shared

    var body: some View {
        VStack {
            // Album Cover
            HStack(spacing: 20) {
                if let artwork = player.queue.currentEntry?.artwork {
                    ArtworkImage(artwork, height: 100)
                } else {
                    Image(systemName: "music.note")
                        .resizable()
                        .frame(width: 100, height: 100)
                }

                VStack(alignment: .leading) {
                    // Song Title
                    Text(player.queue.currentEntry?.title ?? "Song Title Not Found")
                        .font(.title)
                        .fixedSize(horizontal: false, vertical: true)
                }
            }
        }
        .padding()
        .task {
            await loadTracks()

            // It's here I thought I could do something like this,
            // since I can do it with one singular track:
            // player.queue = [song]
            // ...but assigning the array of tracks errors out:
            // player.queue = songs

            do {
                try await player.queue.insert(songs, position: .afterCurrentEntry)
            } catch {
                print(error.localizedDescription)
            }
        }
    }

    @MainActor
    private func loadTracks() async {
        do {
            let detailedPlaylist = try await playlist.with([.tracks])
            let tracks = detailedPlaylist.tracks ?? []
            setTracks(tracks)
        } catch {
            print(error.localizedDescription)
        }
    }

    @MainActor
    private func setTracks(_ tracks: MusicItemCollection<Track>) {
        songs = Array(tracks)
    }
}
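A minimal sketch of one possible direction (unverified, names assumed): MusicKit's ApplicationMusicPlayer.Queue has an initializer that accepts a whole collection of playable items, so instead of assigning an array, a Queue can be built from the playlist's tracks; the "current song" then comes back as a queue entry rather than a Track.

import MusicKit

// Sketch, not a confirmed fix: queue the whole playlist via Queue(for:).
func playAll(from playlist: Playlist) async throws {
    let player = ApplicationMusicPlayer.shared
    let detailed = try await playlist.with([.tracks])
    guard let tracks = detailed.tracks else { return }

    // Queue(for:startingAt:) takes any collection of playable items.
    player.queue = ApplicationMusicPlayer.Queue(for: tracks)
    try await player.play()

    // The current song is then exposed as a queue entry, not a Track.
    if let entry = player.queue.currentEntry {
        print("Now playing: \(entry.title)")
    }
}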
Device: iPhone 11, iOS 17.5.1
Main Thread
libsystem_kernel.dylib___ulock_wait (in libsystem_kernel.dylib) +8
libdispatch.dylib__dlock_wait (in libdispatch.dylib) +52
libdispatch.dylib__dispatch_thread_event_wait_slow (in libdispatch.dylib) +52
libdispatch.dylib___DISPATCH_WAIT_FOR_QUEUE__ (in libdispatch.dylib) +364
libdispatch.dylib__dispatch_sync_f_slow (in libdispatch.dylib) +144
MediaToolbox_fpic_CopyCurrentEvent (in MediaToolbox) +132
AVFCore___104-[AVPlayer _setRate:withVolumeRampDuration:playImmediately:rateChangeReason:affectsCoordinatedPlayback:]_block_invoke_2 (in AVFCore) +244
AVFCore-[AVPlayer _setRate:withVolumeRampDuration:playImmediately:rateChangeReason:affectsCoordinatedPlayback:] (in AVFCore) +276
AVFCore-[AVPlayer setRate:] (in AVFCore) +56
The stack above is where we call AVPlayer pause.
Thread 81 name: fpic-sync
libsystem_kernel.dylib___ulock_wait (in libsystem_kernel.dylib) +8
libdispatch.dylib__dlock_wait (in libdispatch.dylib) +52
libdispatch.dylib__dispatch_thread_event_wait_slow (in libdispatch.dylib) +52
libdispatch.dylib___DISPATCH_WAIT_FOR_QUEUE__ (in libdispatch.dylib) +364
libdispatch.dylib__dispatch_sync_f_slow (in libdispatch.dylib) +144
MediaToolbox_itemasync_CopyProperty (in MediaToolbox) +588
MediaToolbox_fpic_CurrentItemMoment (in MediaToolbox) +184
MediaToolbox___fpic_EstablishCurrentEventForCurrentItem_block_invoke (in MediaToolbox) +136
libdispatch.dylib__dispatch_client_callout (in libdispatch.dylib) +16
libdispatch.dylib__dispatch_lane_barrier_sync_invoke_and_complete (in libdispatch.dylib) +52
MediaToolbox_fpic_ServiceCurrentEvent (in MediaToolbox) +600
MediaToolbox___fpic_NotifyServiceCurrentEvent_block_invoke (in MediaToolbox) +912
libdispatch.dylib__dispatch_call_block_and_release (in libdispatch.dylib) +28
libdispatch.dylib__dispatch_client_callout (in libdispatch.dylib) +16
libdispatch.dylib__dispatch_lane_serial_drain (in libdispatch.dylib) +744
libdispatch.dylib__dispatch_lane_invoke (in libdispatch.dylib) +428
libdispatch.dylib__dispatch_root_queue_drain (in libdispatch.dylib) +388
libdispatch.dylib__dispatch_worker_thread (in libdispatch.dylib) +256
libsystem_pthread.dylib__pthread_start (in libsystem_pthread.dylib) +132
libsystem_pthread.dylib_thread_start (in libsystem_pthread.dylib) +4
Thread 93 name: com.apple.coremedia.player.async.0x303c60240.P/GR
libsystem_kernel.dylib_mach_msg2_trap (in libsystem_kernel.dylib) +8
libsystem_kernel.dylib_mach_msg2_internal (in libsystem_kernel.dylib) +76
libsystem_kernel.dylib_mach_msg_overwrite (in libsystem_kernel.dylib) +432
libsystem_kernel.dylib_mach_msg (in libsystem_kernel.dylib) +20
libdispatch.dylib__dispatch_mach_send_and_wait_for_reply (in libdispatch.dylib) +540
libdispatch.dylib_dispatch_mach_send_with_result_and_wait_for_reply (in libdispatch.dylib) +56
libxpc.dylib_xpc_connection_send_message_with_reply_sync (in libxpc.dylib) +260
CoreMedia_FigXPCConnectionSendSyncMessageCreatingReply (in CoreMedia) +288
CoreMedia_FigXPCRemoteClientSendSyncMessageCreatingReply (in CoreMedia) +44
MediaToolbox_remoteXPCPlayer_SetRateWithOptions (in MediaToolbox) +148
MediaToolbox_playerasync_runOneCommand (in MediaToolbox) +768
MediaToolbox_playerasync_runAsynchronousCommandOnQueue (in MediaToolbox) +180
libdispatch.dylib__dispatch_client_callout (in libdispatch.dylib) +16
libdispatch.dylib__dispatch_lane_serial_drain (in libdispatch.dylib) +744
libdispatch.dylib__dispatch_lane_invoke (in libdispatch.dylib) +428
libdispatch.dylib__dispatch_root_queue_drain (in libdispatch.dylib) +388
libdispatch.dylib__dispatch_worker_thread (in libdispatch.dylib) +256
libsystem_pthread.dylib__pthread_start (in libsystem_pthread.dylib) +132
libsystem_pthread.dylib_thread_start (in libsystem_pthread.dylib) +4
After updating to iOS 18, the display automatically dims while the iPhone is on a video call with LINE, and it stays dimmed.
I successfully retrieved strings, arrays, and other data through a custom AudioObjectPropertySelector, but only with fixed return values. Whenever I modify the code to return dynamic data, it results in an error. Below is my code.
case kPlugIn_CustomPropertyID:
{
    *((CFStringRef*)outData) = CFSTR("qin@@@123");
    *outDataSize = sizeof(CFStringRef);
}
break;

case kPlugIn_ContainDic:
{
    CFMutableDictionaryRef mutableDic1 = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                                                   0,
                                                                   &kCFTypeDictionaryKeyCallBacks,
                                                                   &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(mutableDic1, CFSTR("xingming"), CFSTR("qinmu"));
    *((CFDictionaryRef*)outData) = mutableDic1;
    *outDataSize = sizeof(CFPropertyListRef);
    // *((CFPropertyListRef*)outData) = mutableDic;
}
break;

case kPlugIn_ContainArray:
{
    CFMutableArrayRef mutableArray = CFArrayCreateMutable(kCFAllocatorDefault, 0, &kCFTypeArrayCallBacks);
    CFArrayAppendValue(mutableArray, CFSTR("Hello"));
    CFArrayAppendValue(mutableArray, CFSTR("World"));
    *((CFArrayRef*)outData) = mutableArray;
    *outDataSize = sizeof(CFArrayRef);
}
break;
These are fixed returns, and there are no issues when I retrieve the data.
When I change the return data in kPlugIn_ContainDic to the following, the first time I restart the CoreAudio service and retrieve the data, it works fine. However, when I attempt to retrieve it again, it results in an error:
case kPlugIn_ContainDic:
{
    *outDataSize = sizeof(CFPropertyListRef);
    *((CFPropertyListRef*)outData) = mutableDic;
}
break;
error code:
HALC_ShellDevice::CreateIOContextDescription: failed to get a description from the server
HAL_HardwarePlugIn_ObjectGetPropertyData: no object
HALPlugIn::ObjectGetPropertyData: got an error from the plug-in routine, Error: 560947818 (!obj)
The declaration and usage of mutableDic are as follows:
static CFMutableDictionaryRef mutableDic;

static OSStatus BlackHole_Initialize(AudioServerPlugInDriverRef inDriver, AudioServerPlugInHostRef inHost)
{
    OSStatus theAnswer = 0;
    gPlugIn_Host = inHost;
    if (mutableDic == NULL) {
        mutableDic = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                               100,
                                               &kCFTypeDictionaryKeyCallBacks,
                                               &kCFTypeDictionaryValueCallBacks);
    }
    return theAnswer;
}
static OSStatus BlackHole_AddDeviceClient(AudioServerPlugInDriverRef inDriver, AudioObjectID inDeviceObjectID, const AudioServerPlugInClientInfo* inClientInfo)
{
    CFStringRef string = CFStringCreateWithFormat(kCFAllocatorDefault, NULL, CFSTR("%u"), inClientInfo->mClientID);
    CFMutableDictionaryRef dic = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                                           0,
                                                           &kCFTypeDictionaryKeyCallBacks,
                                                           &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(dic, CFSTR("clientID"), string);
    CFDictionarySetValue(dic, CFSTR("bundleID"), inClientInfo->mBundleID);
    CFDictionarySetValue(mutableDic, string, dic);
    // The dictionary retains its keys and values, so release the local references.
    CFRelease(string);
    CFRelease(dic);
    return 0;
}
Can someone tell me why this happens?
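One possibility worth checking (an assumption, not a confirmed diagnosis): the HAL host releases the object it receives through outData after serializing it, so handing out the static mutableDic without retaining it would leave the static pointer dangling after the first query. A sketch of the retained variant, in the same style as the plugin code above:

case kPlugIn_ContainDic:
{
    *outDataSize = sizeof(CFPropertyListRef);
    // Hand the caller its own reference; the static mutableDic keeps one too.
    *((CFPropertyListRef*)outData) = (CFPropertyListRef)CFRetain(mutableDic);
    // Alternatively, return a snapshot copy:
    // *((CFPropertyListRef*)outData) = CFDictionaryCreateCopy(kCFAllocatorDefault, mutableDic);
}
break;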
When I set a custom exposure duration, such as 1/8, and then switch back to continuous auto exposure, scenes that previously metered at 1/17 now meter at something like 1/5 or 1/10. As a result, the preview becomes laggy and overexposed. I'm not sure why this is happening.
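For context, a condensed sketch of the switch being described (assumed capture setup; the device comes from an already-configured session, and error handling is simplified):

import AVFoundation
import CoreMedia

func setCustomThenAutoExposure(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Lock exposure to a 1/8 s duration, keeping the current ISO.
    device.setExposureModeCustom(duration: CMTime(value: 1, timescale: 8),
                                 iso: AVCaptureDevice.currentISO,
                                 completionHandler: nil)

    // ...later, hand control back to the system:
    if device.isExposureModeSupported(.continuousAutoExposure) {
        device.exposureMode = .continuousAutoExposure
    }
}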
I am fetching playlist songs from the user's library, and for my use case I also need the releaseDate (or at least the year) of each song. However, the releaseDate is always nil since I upgraded to Sequoia. I am pretty sure this was working before the upgrade, but I couldn't find any documentation on changes related to this.
Furthermore, I noticed the IDs also now seem to be the catalog IDs instead of the global ones like i.PkdZbQXsPJ4DX04.
Here's, in a nutshell, what I am doing:
func fetchSongs(playlist: Playlist) async throws {
    let detailedPlaylist = try await playlist.with([.tracks])
    var currentTracks = detailedPlaylist.tracks

    while let tracks = currentTracks {
        for track in tracks {
            guard case .song(let song) = track else {
                print("This is not a song")
                continue
            }
            print(song.releaseDate as Any) // always nil since Sequoia
        }
        currentTracks = try await tracks.nextBatch()
    }
}
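A workaround I'm considering (an assumption on my part, not a confirmed fix): look each song up in the catalog, where releaseDate is typically populated.

import MusicKit

func catalogReleaseDate(for song: Song) async throws -> Date? {
    // Fetch the catalog counterpart of the song by its ID.
    let request = MusicCatalogResourceRequest<Song>(matching: \.id, equalTo: song.id)
    let response = try await request.response()
    return response.items.first?.releaseDate
}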
<<<< FigPlayerInterstitial >>>> fpic_ServiceCurrentEvent signalled err=-15671 (kFigPlayerInterstitialError_ClientReleased) (no primary) at FigPlayerInterstitialCoordinator.m:7885
(the line above repeats continuously while playing)
My project uses AVPlayer (AVPlayerViewController) to play video. There are continuous warning logs while playing, and when the player deallocates it prints the information below.
<<<< PlayerRemoteXPC >>>> remoteXPCItem_handleSetProperty signalled err=-12860 (kFigPlayerError_ParamErr) (propertyValue should be MTAudioProcessingTap) at FigPlayer_RemoteXPC.m:2760
This only happens on iOS 18, and I have no idea what it means. I can't find any information about FigPlayerInterstitial or these errors.
iPhone 13 mini updated to iOS 18.
CarPlay is wired on my 2021 RAM Laramie.
After the update, premium audio is lost; I can only hear low-quality audio.
When I manually switch the car from USB to Bluetooth, the audio comes out of the phone's speaker instead of the truck's.
I was wondering if anyone could assist with the following query.
Apple's Private Relay functionality requires companies to register all email-sending subdomains for the service to function properly. With 26 markets and 3 subdomains per market for one department, and another department with around 20 markets and even more subdomains, we exceed the limit of 100 sending domains.
As a result, we're unable to register all the domains currently being used to send emails to our customers.
Does anyone have any recommendations for overcoming this?
It's simple to reproduce. The bug is simply that when you queue a bunch of songs to play, it will always queue fewer than you gave it.
Here, I'm attempting to play an Apple-curated playlist, and it will only queue a subset, usually fewer than 15, but as few as 1 out of 100. Use the system's forward and backward controls to test it out.
Here is the code; just paste it into the ContentView file and make sure you have the capability to run it.
import SwiftUI
import MusicKit

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Play Music") {
                Task {
                    await playMusic()
                }
            }
        }
    }
}

func getOnlySongsFromTracks(tracks: MusicItemCollection<Track>?) async throws -> MusicItemCollection<Song>? {
    guard let tracks else { return nil }
    var songs = [Song]()
    for track in tracks {
        if case .song(let song) = track {
            songs.append(song)
            print("track is song \(track.debugDescription)")
        } else {
            print("track not song \(track.debugDescription)")
        }
    }
    return MusicItemCollection(songs)
}
func playMusic() async {
    // Request authorization
    let status = await MusicAuthorization.request()
    guard status == .authorized else {
        print("Music authorization denied.")
        return
    }

    do {
        // Perform a hardcoded search for a playlist
        let searchTerm = "2000"
        let request = MusicCatalogSearchRequest(term: searchTerm, types: [Playlist.self])
        let response = try await request.response()

        guard let playlist = response.playlists.first else {
            print("No playlists found for the search term '\(searchTerm)'.")
            return
        }

        // Fetch the songs in the playlist
        let detailedPlaylist = try await playlist.with([.tracks])

        guard let songCollection = try await getOnlySongsFromTracks(tracks: detailedPlaylist.tracks) else {
            print("no songs found")
            return
        }
        guard let t = detailedPlaylist.tracks else {
            print("no tracks")
            return
        }

        // Create a queue and play
        let musicPlayer = ApplicationMusicPlayer.shared
        let q = ApplicationMusicPlayer.Queue(for: t)
        musicPlayer.queue = q
        try await musicPlayer.play()

        print("Now playing playlist: \(playlist.name)")
    } catch {
        print("An error occurred: \(error.localizedDescription)")
    }
}
Hello,
I'm getting an unknown, never-before-seen error at application launch when running my iOS SpriteKit game on the iOS 18 arm64 simulator from Xcode 16.0 (16A242d):
AudioConverterOOP.cpp:847 Failed to prepare AudioConverterService: -302
This occurs on all iOS 18 simulator devices, between application(_:didFinishLaunchingWithOptions:) and the first applicationDidBecomeActive(_:). At that point the SKScene object may already have been initialized by SpriteKit, but the scene's didMove(to:) method hasn't been called yet.
Also, note that the error message is emitted from a secondary (non-main) thread, obviously not one created by the app.
After the error occurs, no SKScene is able to play audio. This never happened on iOS versions prior to 18, neither on physical devices nor on the simulator.
Has anyone seen anything like this on a physical device running 18?
Unfortunately, at the moment I cannot test myself on an 18 device, only on the simulator...
Thank you,
D.
Hi, I recently updated to iOS 18. Music does keep playing in the background with the Camera app open, but I'm unable to play music with the Notes app open.
Hello! The new lower latency support for AirPods in Game Mode is impressive, but I'm not sure of the best way to handle the transition into/out of Game Mode while audio is playing. In order to lower the latency, the system appears to drop some number of samples, with the result being a good deal less latency. My use case is macOS where it's easier to switch in/out of the fullscreen game (a simple swipe left), thus causing more issues for Game Mode since the audio is playing the entire time. It would be nice if offscreen games could remain in game mode, but I understand not wanting to give developers that control.
Are there any best practices for avoiding or masking the audio glitch caused by this skip-ahead? Is there a system event I can receive to know when Game Mode is about to be enabled or disabled, where I could perhaps fade out the audio? My callback checks the inTimestamp->mSampleTime value to detect gaps, but it only rarely detects a Game Mode gap, even though the audio skip-ahead always happens.
BTW, I am currently only developing on macOS (15.0) and I'm working at a low level with AudioUnit callbacks and a SpatialMixer. I am not currently using any higher-level audio APIs.
And here are a few questions I don't necessarily expect answers to, but it doesn't hurt to ask: are there any additional technical details about how this latency reduction works, or exactly how much of a reduction is achieved (said another way, how many samples are dropped)? How much does this affect AirPods battery life? And finally, is there a way to query the actual latency value? I check the value for kAudioDevicePropertyLatency, but it seems to always report 160 ms for AirPods. Thanks!
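For reference, the kind of sample-time gap check described above looks roughly like this (a simplified Swift rendering; the names are illustrative and the real callback is a C AURenderCallback):

import AudioToolbox

var expectedNextSampleTime: Float64 = -1

func noteRenderTimestamp(_ timestamp: AudioTimeStamp, frameCount: UInt32) {
    let current = timestamp.mSampleTime
    if expectedNextSampleTime >= 0, current != expectedNextSampleTime {
        // Discontinuity: the system skipped ahead (e.g. on a Game Mode switch).
        print("sample-time jump of \(current - expectedNextSampleTime) frames")
    }
    expectedNextSampleTime = current + Float64(frameCount)
}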
Hi
I'm trying to stream an H.264 video feed from a Uniview IP camera in a browser, but the stream just doesn't display: I either get a single frame or a black screen. I see the same issue in Safari on the Mac and in any browser on an iPhone. However, the video stream works just fine using hls.js on Windows or Android.
We grab the RTSP stream from the camera and use nginx to serve the .m3u8 URL. Even if we save the stream to a file and try to play it on the iPhone, it has the same issue (unless we use a separate media player like VLC).
I know that if we use ffmpeg to re-encode as H.264, rather than copy the stream, it will play. My guess is there is an incompatibility between how Uniview encodes the video and what Apple can accept.
I've asked Uniview, and they are not sure what the problem is either.
Is there a way to get more debug information on why a particular HLS stream is failing in Safari on the Mac or iPhone?
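One avenue that might surface more detail (a sketch under the assumption that an AVPlayer can be pointed at the same stream; the URL below is a placeholder): AVPlayerItem keeps an HLS error log that often names the failure.

import AVFoundation

let player = AVPlayer(url: URL(string: "https://example.com/camera/index.m3u8")!)
player.play()

// After a playback attempt, dump the item's error log events.
if let item = player.currentItem, let log = item.errorLog() {
    for event in log.events {
        print(event.errorStatusCode, event.errorDomain, event.errorComment ?? "")
    }
}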
- (AVPlayerViewController *)avPlayerVC {
    if (!_avPlayerVC) {
        _avPlayerVC = [[AVPlayerViewController alloc] init];
        _avPlayerVC.videoGravity = AVLayerVideoGravityResizeAspectFill;
        _avPlayerVC.showsPlaybackControls = NO;
        [self addSubview:_avPlayerVC.view];
        [_avPlayerVC.view mas_makeConstraints:^(MASConstraintMaker *make) {
            make.edges.mas_equalTo(0);
        }];
        [self sendSubviewToBack:_avPlayerVC.view];
    }
    return _avPlayerVC;
}
I add this inside a cell, and the UI freezes completely. This only happens on iOS 18.
Hi,
In my app I am using MusicLibraryRequest<Artist> to fetch all of the artists in someone's library collection. With this response I then fetch each artist's albums: artist.with([.albums]).
The response from this only gives albums in the user's library collection. I would like to augment it with all of the albums for an artist from the full catalog.
I'm using MusicKit and targeting iOS18 and visionOS 2.
Could someone please point me towards the best way to approach this?
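One possible approach (a sketch, not necessarily the best way): hop from the library artist to its catalog counterpart by name, then load the catalog albums relationship.

import MusicKit

func catalogAlbums(for libraryArtist: Artist) async throws -> MusicItemCollection<Album>? {
    // Search the catalog by name, since library IDs differ from catalog IDs.
    var request = MusicCatalogSearchRequest(term: libraryArtist.name, types: [Artist.self])
    request.limit = 1
    let response = try await request.response()
    guard let catalogArtist = response.artists.first else { return nil }

    // Load the albums relationship on the catalog artist.
    let detailed = try await catalogArtist.with([.albums])
    return detailed.albums
}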
I have a FairPlay-encrypted HLS stream playing in an AVPlayer, and I want to generate scrubbing thumbnails using AVAssetImageGenerator. I am able to generate thumbnails for clear streams, but I get errors for protected content.
How can I generate thumbnails for protected content?
func getImageThumbnail(forTime time: CMTime) {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.cancelAllCGImageGeneration()

    generator.generateCGImagesAsynchronously(forTimes: [NSValue(time: time)]) { [weak self] requestedTime, cgImage, actualTime, result, error in
        if let error {
            print("Error generating thumbnail: \(error.localizedDescription)")
            return
        }
        guard let cgImage else { return }
        DispatchQueue.main.async {
            self?.playerImg.image = UIImage(cgImage: cgImage)
        }
    }
}
Hello everyone. I am using the QRCodeScanner library in my project. QR code scanning worked on earlier versions of iPadOS, but on iPadOS 18 it has stopped working.
I'm a musician/DJ. I jumped from a 14 Pro Max on iOS 17 to a 16 Pro Max on iOS 18.1 beta 4. For each audio source (Music app, YT Music, etc.) I compared the same track/EQ/volume side by side. With the new device, the sound is overall a bit muffled, and the music is damped when highs and lows are mixed. It's most noticeable when listening to high vocals and acoustic instruments. Drum and bass sound much like on an old Nokia. On the 14 Pro it's nothing like that. Thank you.
Hello,
I recently started integrating HLS downloads into my application by using AVAssetDownloadTask and AVAssetDownloadConfiguration. I took an example from the documentation as a basis, with only one small difference: the minimum target for my application is iOS 16, so I replaced urlSession(_:assetDownloadTask:willDownloadTo:) with urlSession(_:assetDownloadTask:didFinishDownloadingTo:).
And I encountered the following issue: after pausing a download and resuming it later, the progress no longer functions as expected.
Could you, please, help me with this? What are the right approaches to implementing pause and progress tracking?
Some details:
I used devices with iOS 16.0.2 and 17.6.1 for testing.
There was no code in the example that pauses the download and resumes it, so I used the suspend and resume methods to do this.
Also, I have tried to track downloading progress using two different approaches:
Using task.progress.observe(\.fractionCompleted) { ... }, which was presented in the example. In this scenario, after a pause, an observation callback will only be called once, when the download has completed, despite the fact that data is being successfully downloaded over the network.
Using urlSession(_:assetDownloadTask:didLoad:totalTimeRangesLoaded:timeRangeExpectedToLoad:) and calculating progress as totalTimeRangesLoaded.reduce(0.0) { $0 + CMTimeGetSeconds($1.timeRangeValue.duration) / CMTimeGetSeconds(timeRangeExpectedToLoad.duration) }. In this scenario, I have noticed that the result of the calculation does not always increase, but sometimes there are outliers. Example of logs: 68%, 69%, 70%, 72%, 63%, 65%, 66%, 69%, 70%, 71%, 72%. Such fluctuations are most easily reproduced when I try to resume the download after pause. However, sometimes they occur spontaneously. It's important to mention, that this method marked as deprecated, perhaps for this reason.
In both cases download is successful, the problem is with progress reporting only.
Full version of code can be found here.
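For reference, a condensed sketch of the flow described above (names assumed; session construction, delegate wiring, and error handling omitted):

import AVFoundation

final class HLSDownloader: NSObject, AVAssetDownloadDelegate {
    private var task: AVAssetDownloadTask?
    private var observation: NSKeyValueObservation?

    func start(asset: AVURLAsset, in session: AVAssetDownloadURLSession) {
        let configuration = AVAssetDownloadConfiguration(asset: asset, title: "Movie")
        let task = session.makeAssetDownloadTask(downloadConfiguration: configuration)

        // Progress tracking via KVO, as in the documentation example.
        observation = task.progress.observe(\.fractionCompleted) { progress, _ in
            print("progress: \(progress.fractionCompleted)")
        }
        task.resume()
        self.task = task
    }

    // Pause and resume via the underlying URLSessionTask methods.
    func pause() { task?.suspend() }
    func resume() { task?.resume() }
}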