Just wondering if anyone else is having issues with currentPlaybackRate in the release version of iOS 15.4? In my particular case this is with MPMusicPlayerController.applicationQueuePlayer.
I've never been able to control this property entirely reliably, but from what I can see it is now completely non-operational in 15.4.
I've isolated this behavior in a trivial project and will file a radar, but I'm hoping others may have some insight first.
FWIW, this is my trivial test case:
import MediaPlayer
import UIKit

class ViewController: UIViewController {
    // Application queue player; playback notifications are enabled once at setup.
    lazy var player: MPMusicPlayerApplicationController = {
        let player = MPMusicPlayerController.applicationQueuePlayer
        player.repeatMode = .none
        player.shuffleMode = .off
        player.beginGeneratingPlaybackNotifications()
        return player
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        NotificationCenter.default.addObserver(forName: .MPMusicPlayerControllerPlaybackStateDidChange, object: nil, queue: .main) { [weak self] notification in
            guard let notificationPlayer = notification.object as? MPMusicPlayerApplicationController,
                  notificationPlayer === self?.player else {
                return
            }
            debugPrint("Player state now: \(notificationPlayer.playbackState)")
        }
    }

    @IBAction func goAction(_ sender: Any) {
        // Queue a random library song and start playback.
        guard let item = MPMediaQuery.songs().items?.randomElement() else {
            debugPrint("Unable to access media items")
            return
        }
        debugPrint("Now playing item: \(item.title ?? "")")
        player.setQueue(with: [item.playbackStoreID])
        player.prepareToPlay { error in
            guard error == nil else {
                debugPrint("Player error: \(error!.localizedDescription)")
                return
            }
            DispatchQueue.main.async { [weak self] in
                self?.player.play()
            }
        }
    }

    @IBAction func slowAction(_ sender: Any) {
        debugPrint("Setting currentPlaybackRate to 0.5")
        player.currentPlaybackRate = 0.5
        checkPlaybackRate()
    }

    @IBAction func fastAction(_ sender: Any) {
        debugPrint("Setting currentPlaybackRate to 1.5")
        player.currentPlaybackRate = 1.5
        checkPlaybackRate()
    }

    // Read the rate back after a delay to see whether the setter stuck.
    func checkPlaybackRate(afterSeconds delay: TimeInterval = 1.0) {
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
            debugPrint("After \(delay) seconds currentPlaybackRate now: \(self.player.currentPlaybackRate)")
        }
    }
}
Typical console output:
"Now playing item: I Know You Know"
"Player state now: MPMusicPlaybackState(rawValue: 2)"
"Player state now: MPMusicPlaybackState(rawValue: 1)"
"Setting currentPlaybackRate to 1.5"
"After 1.0 seconds currentPlaybackRate now: 1.0"
"Setting currentPlaybackRate to 0.5"
"After 1.0 seconds currentPlaybackRate now: 1.0"
                    
                  
General: Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.
Hi all,
Apple dropping ongoing development for FireWire devices that were supported by the Core Audio driver standard is a catastrophe for a lot of struggling musicians, who need to keep up with the security updates that come with new OS releases while continuing to use their hard-earned investments in very expensive, still-pristine audio devices that have been reduced to e-waste by Apple's seemingly tone-deaf response to the cries for ongoing support.
I have one of said audio devices, and I'd like to keep using it while keeping my 2019 Intel MacBook Pro up to date with the latest security updates and OS features.
This is probably not the first time you gurus have had someone make the logical leap to a request like this, but I was wondering whether it might somehow be possible to shoehorn the code from previous versions of macOS, which allowed the Mac to speak to the audio features of such devices, into the Ventura version of the OS.
Would it be possible? Would it involve a lot of work? I don't think I'd be the only person willing to pay for a third-party application or utility that restored this functionality.
There have to be hundreds of thousands of people who would happily spare some cash to keep their multi-thousand-dollar investment in gear from being so thoughtlessly consigned to the scrap heap.
Any comments or layman-friendly explanations as to why this couldn't happen would be gratefully received!
Thanks,
em
                    
                  
                
                    
macOS version: 26.0 (25A354)
Xcode version: 26.0 (17A324)
The project fails to compile with the following errors:
SwiftExplicitDependencyCompileModuleFromInterface arm64 /Users/zhz/Library/Developer/Xcode/DerivedData/ModuleCache.noindex/AssetsLibrary-HTIJ05N58KN3.swiftmodule
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:10:25: error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
8 | public import _StringProcessing
9 | public import _SwiftConcurrencyShims
10 | extension AssetsLibrary.ALAssetsLibrary {
|                         `- error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
11 |   #if compiler(>=5.3) && $NonescapableTypes
12 |   @available(iOS, introduced: 9.0, deprecated: 9.0, obsoleted: 26.0)
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/System/Library/Frameworks/AssetsLibrary.framework/Headers/ALAssetsLibrary.h:80:12: note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
78 |
79 | OS_EXPORT AL_DEPRECATED(4, "Use PHPhotoLibrary from the Photos framework instead")
80 | @interface ALAssetsLibrary : NSObject {
|            `- note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
81 | @package
82 |     id _internal;
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:1:1: error: failed to build module 'AssetsLibrary'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.17.14 clang-1700.3.17.1)', while this compiler is 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.19.9 clang-1700.3.19.1)'). Please select a toolchain which matches the SDK.
                    
                  
                
                    
                      Hello,
I'm new to the Swift MusicKit API and am starting with the implementation in iOS 16.
I'm getting stuck on an issue where there is no background or text color associated with the Artwork object. Is this something you have to make an additional property request for, and if so, how do you do that?
let catalogSearch = MusicCatalogResourceRequest<Album>(matching: \.id, equalTo: item.id)
let catalogResponse = try await catalogSearch.response()
guard let firstItem = catalogResponse.items.first else {
    return
}
In this example, firstItem.artwork only contains the url and what look like incorrect max width/height values.
Here's a printout of firstItem.artwork:
Optional(Artwork(
  urlFormat: "musicKit://artwork/library/5F37858D-F46B-4F12-BA67-40FA8DD63D87/{w}x{h}?at=item&fat=&id=7718670444435992305&lid=5F37858D-F46B-4F12-BA67-40FA8DD63D87&mt=music&aat=Music122/v4/37/25/f5/3725f515-249f-7b91-77bb-f479cd48201c/22UMGIM32254.rgb.jpg",
  maximumWidth: 0,
  maximumHeight: 0
))
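For what it's worth, here's a minimal sketch of how I'd expect to read those colors, assuming they are populated for catalog (as opposed to library) artwork; the property names come from MusicKit's Artwork type:
import MusicKit

// Sketch: fetch a catalog album and inspect its artwork's color metadata.
// Assumes catalog artwork (unlike library artwork) carries these values.
func printArtworkColors(matching id: MusicItemID) async throws {
    let request = MusicCatalogResourceRequest<Album>(matching: \.id, equalTo: id)
    let response = try await request.response()
    guard let artwork = response.items.first?.artwork else { return }
    // All of these are optional and may be nil for some items.
    print("background:", artwork.backgroundColor as Any)
    print("primary text:", artwork.primaryTextColor as Any)
    print("secondary text:", artwork.secondaryTextColor as Any)
}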
                    
                  
                
                    
Hey there, I'm trying to display all of a user's albums using the MediaPlayer framework. Many albums return nil artwork, but I know the artwork exists because it shows up in the default Music app. There doesn't seem to be much rhyme or reason to what shows up and what doesn't. All downloaded albums display artwork, and some cloud album artwork displays as well. Here's the code I'm using to debug this:
let query = MPMediaQuery.albums()
if let albumCollections = query.collections {
  albums = albumCollections
}
for album in albums {
  let artwork = album.representativeItem?.artwork
  print(artwork, artwork?.image(at: CGSize(width: 100, height: 100)))
}
Any help would be greatly appreciated. Thanks!
                    
                  
                
                    
I often find that basic actions in MusicKit are incredibly slow compared to Apple's Music app. I've tried different OS versions, devices, and networks, as well as Apple's sample code, throughout the last several years, and it's always the same. Does anyone else have this issue?
                    
                  
                
                    
Hi, is there a way to display animated cover art in an app? Not every album has animated cover art, but for the ones that do I would like to display it instead of the static artwork.
Thank you :)
                    
                  
                
                    
I'm wondering if anyone has run into issues with currentPlaybackRate in the release version of iOS 26.0?
There is no issue on iOS 18.5.
When I change currentPlaybackRate on iOS 26.0, it seems to be forcibly reset to 0 or 1.0, for both DRM and non-DRM content. Playback also becomes abnormal and unstable, with noise, whenever currentPlaybackRate is not 1.0, and the player switches between the stopped and playing states frequently.
                    
                  
                
                    
My app has been using the iTunes Search API (itunes.apple.com/search) for a few years now, but at some point over the last week or so (late Sept. 2025) it stopped returning track results with explicit content, regardless of whether I provide "explicit=Yes" (which is the default anyway, according to the API documentation: https://performance-partners.apple.com/search-api). Has anyone else experienced this with this API, and have you figured out a workaround?
FYI, I also use the more robust Apple Music API in another part of my app, which isn't affected by this issue, so I know it's technically an alternative. I just need to stick with the iTunes Search API in this particular case. (The call I'm making is sketched below.) Thanks.
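For reference, the request is roughly this; the search term is a placeholder, and I pass "explicit=Yes" explicitly while testing even though it's documented as the default. Explicit tracks normally come back with trackExplicitness set to "explicit":
import Foundation

// Sketch of the request in question; the term is a hypothetical example.
func searchSongs(term: String) async throws -> Data {
    var components = URLComponents(string: "https://itunes.apple.com/search")!
    components.queryItems = [
        URLQueryItem(name: "term", value: term),
        URLQueryItem(name: "media", value: "music"),
        URLQueryItem(name: "entity", value: "song"),
        URLQueryItem(name: "explicit", value: "Yes"),
    ]
    // Results with trackExplicitness == "explicit" are the ones that
    // have stopped appearing for me.
    let (data, _) = try await URLSession.shared.data(from: components.url!)
    return data
}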
                    
                  
                
                    
I'm playing library items (MPMediaItem) and Apple Music tracks (Track) in MPMusicPlayerApplicationController.applicationQueuePlayer, but I can't use the actual queue functionality because I can't figure out how to get both media types into the same queue. If there's a way to get both types into a single queue, that would solve my problem, but I've given up on that one.
Because I can't use a queue, I have to be able to detect when a song ends so that I can put the next song in the queue and play it. The only way I can figure out to detect when a song ends is by watching playbackState, and I've actually got that pretty much working, but it's really ugly, because you get a playbackState of .paused when a song ends, when a Bluetooth speaker disconnects, and so on.
The only answer I've been able to find on the internet is to watch MPMusicPlayerControllerNowPlayingItemDidChange and, when that fires and nowPlayingItem is nil, a song has ended. But that's not the case: when a song ends, nowPlayingItem remains the same. There's got to be an answer to this problem, right? (The heuristic I'm currently leaning on is sketched below.)
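Here's that heuristic as a minimal sketch (the tolerance and function name are mine): when the state flips to paused, treat it as end-of-song only if the playhead is at, or very near, the item's duration. A user pause or a Bluetooth disconnect usually leaves the playhead mid-track.
import MediaPlayer

// Heuristic sketch: a pause within a second of the end of the item is
// treated as the song finishing; anything else is a real pause.
func playbackLooksFinished(_ player: MPMusicPlayerController) -> Bool {
    guard player.playbackState == .paused,
          let item = player.nowPlayingItem else { return false }
    let remaining = item.playbackDuration - player.currentPlaybackTime
    return remaining < 1.0 // tolerance in seconds; tune as needed
}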
                    
                  
                
                    
Has anyone been able to successfully use MusicCatalogSearchRequest in a Playgrounds app?
I have configured my playground like a regular app: an app ID with automatic music token generation turned on, and music access authorized within the app itself, but whenever I make a MusicCatalogSearchRequest I get an error thrown with .developerTokenRequestFailed.
Considering MusicKit is restricted in the Simulator, it wouldn't surprise me if the same is true in Playgrounds, but it would be super helpful if I could prototype with MusicKit in Playgrounds 4!
                    
                  
                
                    
                      Hello!
I am having trouble setting start times for songs when using the ApplicationMusicPlayer.
When I initialize a new MusicPlayer.Queue.Entry using the following constructor, I am seeing strange results:
init(
    _ playableMusicItem: PlayableMusicItem,
    startTime: TimeInterval? = nil,
    endTime: TimeInterval? = nil
)
It appears that any value I provide for startTime is also applied to the endTime. For example:
MusicPlayer.Queue.Entry(playable, startTime: TimeInterval(30), endTime: TimeInterval(183))
provides the following console output:
MusicPlayer.Queue.Entry(id: "3D6A3DA3-595E-4657-8DBA-DDD245BBB7EF", transientItem: PlayableMusicItem, startTime: 30.0, endTime: 30.0)
I have also tried setting endTime to nil, with the same result. Does anyone have any experience setting start times for songs using MusicKit's ApplicationMusicPlayer? A sketch of the workaround I'm considering is below.
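This is a minimal sketch, assuming the Entry initializer itself is what's misbehaving: enqueue without a start time, then seek once playback begins. Whether seeking via playbackTime matches the intended behavior of startTime is my assumption.
import MusicKit

// Workaround sketch: skip the Entry start/end times and seek manually.
func playFromThirtySeconds(_ song: Song) async throws {
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: [song])
    try await player.play()
    player.playbackTime = 30 // seek to the intended start time
}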
Any feedback is greatly appreciated!
                    
                  
                
                    
The two ScreenCaptureKit WWDC22 sessions show how to capture with the new framework, but the retina factor is hardcoded to 2 in SCStreamConfiguration.
When used on a non-retina display, the screen capture floats in the upper-left corner of the image buffer.
There does not seem to be a simple way to retrieve the retina factor from the SCShareableContent data (when configuring the capture).
When processing the streaming output, the SCStreamFrameInfo attachment is supposed to have a scaleFactor property, but .scaleFactor does not return a value.
I have found that the attachment dictionary contains SCStreamUpdateFrameDisplayResolution. This entry gives me the retina factor, but it is not an official SCStreamFrameInfo key; I list the keys to access it.
What is the proper way to handle retina scale factors with ScreenCaptureKit? (My current workaround is sketched below.)
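For now I'm working around it by matching the SCDisplay to its NSScreen and reading the backing scale factor. This is my own approach, not an official API:
import AppKit
import ScreenCaptureKit

// Workaround sketch: look up the NSScreen whose CGDirectDisplayID matches
// the SCDisplay, and use its backing scale factor instead of hardcoding 2.
func scaleFactor(for display: SCDisplay) -> CGFloat {
    let screen = NSScreen.screens.first { screen in
        let key = NSDeviceDescriptionKey("NSScreenNumber")
        return (screen.deviceDescription[key] as? CGDirectDisplayID) == display.displayID
    }
    return screen?.backingScaleFactor ?? 1.0
}

// Then size the stream configuration in pixels:
// config.width  = display.width  * Int(scaleFactor(for: display))
// config.height = display.height * Int(scaleFactor(for: display))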
                    
                  
                
                    
I have a custom app running on a Mac Studio with Ventura that grabs a snapshot image from a network camera. It then adds some extra information into the EXIF "MakerNote" field. However, the metadata cannot be read back out of the image when running Ventura; it can, however, be read out of the same image file on a Mac that is not running Ventura.
It would appear Apple has removed support for reading MakerNote in Ventura but still supports writing MakerNote in Ventura.
This code is about 7 years old, is written in Objective-C, and worked with no issues until Ventura came along.
Calls used:
CGImageDestinationAddImageFromSource(); // used to write the image to disk with the extra metadata - works on Ventura
CGImageSourceCopyPropertiesAtIndex(); // used to read the metadata from an image - does not return "MakerNote" data
Is there a new way to read EXIF "MakerNote" data from image files that was introduced with Ventura?
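For reference, this is a minimal Swift sketch of my read path, which comes back empty on Ventura (the Swift translation and function name are mine; the original code is Objective-C):
import Foundation
import ImageIO

// Read the EXIF dictionary and pull out the MakerNote entry, if any.
func readMakerNote(at url: URL) -> Any? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any] else {
        return nil
    }
    return exif[kCGImagePropertyExifMakerNote] // nil on Ventura in my testing
}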
                    
                  
                
                    
I'm building a UIKit app that reads the user's Apple Music library and displays it. MusicKit has the Artwork structure, which I need to use to display artwork images in the app. Since I'm not using SwiftUI I cannot use the ArtworkImage view that is the recommended way of displaying those images, but the Artwork structure has a method that returns a URL for the image, which can be used to load it.
The way I have it set up is really simple:
extension MusicKit.Song {

    func imageURL(for cgSize: CGSize) -> URL? {
        return artwork?.url(
            width: Int(cgSize.width),
            height: Int(cgSize.height)
        )
    }

    // Note: Data(contentsOf:) is synchronous, so this blocks the calling thread.
    func localImage(for cgSize: CGSize) -> UIImage? {
        guard let url = imageURL(for: cgSize),
              url.scheme == "musicKit",
              let data = try? Data(contentsOf: url) else {
            return nil
        }
        return .init(data: data)
    }

}
Now, every time I access the .artwork property (so, a lot of times) the main thread gets blocked and the console output gets bombarded with messages like these:
2023-07-26 11:49:47.317195+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: <MPMediaLibraryArtwork: 0x289591590> with error; Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
2023-07-26 11:49:47.317262+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: file:///var/mobile/Media/iTunes_Control/iTunes/Artwork/Originals/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7
2023-07-26 11:49:47.323099+0200 Plum[998:297013] [Plum] IIOImageWriteSession:121: cannot create: '/var/mobile/Media/iTunes_Control/iTunes/Artwork/Caches/320x320/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7.sb-f9c7943d-6ciLNp'error = 1 (Operation not permitted)
My guess is that the most performance-heavy task here is performing the color analysis for each artwork, but IMO backgroundColor should not be a stored property if that's the case. I am not planning to use it anywhere, and if it must be computed it should be an async property so it doesn't block the caller.
I know I can move the call to a background thread, and that fixes the blocked main thread, but the loading times for each artwork are still terribly slow, and that impacts the UX.
SwiftUI's ArtworkImage loads the artworks much more quickly and without the errors, so there must be a better way to do it.
                    
                  
                
                    
Hi,
I just generated an HDR10 MVHEVC file; the mediainfo is below:
Color range                              : Limited
Color primaries                          : BT.2020
Transfer characteristics                 : PQ
Matrix coefficients                      : BT.2020 non-constant
Codec configuration box                  : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
and uploaded the segment files and prog_index.m3u8 to a web server.
I found that the HLS stream cannot be played in Safari; the URL is http://ip/vod/prog_index.m3u8.
However, if I remove the Transfer characteristics : PQ tag when generating the MVHEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play in Safari.
Is there any way to play HLS PQ video in Safari? Thanks.
                    
                  
                
                    
Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now-playing item would be set to nil and you would receive a notification that the player stopped.
In iOS 12 and later, nowPlayingItem still contains the current song, and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or from any remote control) generates the same conditions, making it difficult to correctly detect the difference.
It would be nice if they added a notification that playback finished (similar to the other players).
Any suggestions?
                    
                  
                
                    
                      I have an app that gets data from Music.app with both the iTunesLibrary and MusicKit.
iTunesLibrary has ITLibArtist.sortName and ITLibAlbum.sortTitle and ITLibAlbum.sortAlbumArtist.
I can’t seem to find an equivalent in MusicKit. How are those properties obtained using MusicKit? Thanks.
FYI I have filed FB15554956 on this. You also may see my code at https://github.com/bolsinga/itunes_json
                    
                  
                
                    
I am building an app for macOS and am trying to implement code to add songs to a library playlist (included below). The issue I am having is that if I use MusicKit to load a user's library playlists, the ID for the playlist (which is just a string of numbers) does not work with the Add Tracks to a Library Playlist endpoint of the Apple Music API. If I retrieve the playlists from the Apple Music API and use that playlist ID (which is different from the ID I get from MusicKit), my code works fine and adds the song to the playlist. The problem with getting a user's library playlists from the Apple Music API is that it does not give me all of the library playlists that I get when using MusicKit, and it also does not give me artwork for playlists that have the collage of album covers, so I would prefer to use MusicKit to get the playlists.
I have also tested retrieving a single playlist using the Apple Music API with the playlist ID from MusicKit, and it does not work; I get an error that the resource cannot be found. Since this is a macOS app I cannot use MusicKit to add songs to library playlists.
Does anyone know a way to resolve this, or a possible workaround? Ideally I want to use MusicKit to get the library playlists and have some way to use the playlist ID to add songs to that playlist. Below is my code for adding a song to a playlist using the Apple Music API, which works correctly only if I originally get the library playlist's ID from a playlist retrieved from the Apple Music API.
Also, does anyone know why the playlist IDs are not universal and differ between MusicKit and the Apple Music API? For songs and tracks it does not seem to matter whether I use MusicKit or the Apple Music API; the IDs are in the correct format for the Apple Music API to use and work with my code. Thanks everyone for any and all help!
func addToPlaylist(songs: [Track], playlist: Playlist, alert: Binding<AlertItem?>) async {
    let tracks = AppleMusicPlaylistPostRequestBody(data: songs.compactMap {
        AppleMusicPlaylistPostRequestItem(id: $0.id.rawValue, type: "songs") // or "library-songs"
    })
    
    let playlistID = playlist.id
    
    // Build the request URL for adding a song to a playlist
    guard let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks") else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Invalid URL for the playlist.")
        return
    }
    
    // Authorization Header
    guard let musicUserToken = try? await MusicUserTokenProvider().getUserMusicToken() else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Unable to retrieve Music User Token.")
        return
    }
    
    do {
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.setValue("Bearer \(musicUserToken)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        
        let encoder = JSONEncoder()
        let data = try encoder.encode(tracks)
        request.httpBody = data
        
        let musicRequest = MusicDataRequest(urlRequest: request)
        let musicRequestResponse = try await musicRequest.response()
        
        // Check if the request was successful (status 201)
        if musicRequestResponse.urlResponse.statusCode == 201 {
            alert.wrappedValue = AlertItem(title: "Success", message: "Song successfully added to the playlist.")
        } else {
            print("Status Code: \(musicRequestResponse.urlResponse.statusCode)")
            print("Response Data: \(String(data: musicRequestResponse.data, encoding: .utf8) ?? "No Data")")
            
            // Attempt to decode the error response into the AppleMusicErrorResponse model
            if let appleMusicError = try? JSONDecoder().decode(AppleMusicErrorResponse.self, from: musicRequestResponse.data) {
                let errorMessage = appleMusicError.errors.first?.detail ?? "Unknown error occurred."
                alert.wrappedValue = AlertItem(title: "Error", message: errorMessage)
            } else {
                alert.wrappedValue = AlertItem(title: "Error", message: "Failed to add song to the playlist.")
            }
        }
    } catch {
        alert.wrappedValue = AlertItem(title: "Error", message: "Network error: \(error.localizedDescription)")
    }
}
                    
                  
                
                    
https://api.media.apple.com/v1/feed/exports/song_2024-11-02T16-02/parts?limit=200&offset=400
This is the API I use to get the Parquet file URLs. I need all the URLs in one API hit; right now, if I don't provide the limit, it defaults to 100, and the maximum is 200.
How can I get all the records in one hit? Or at least the count of Parquet records in one hit? (My current paging workaround is sketched below.)
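In the meantime I'm paging through the endpoint like this. The response shape below is a guess for illustration (field names are assumed), so adjust the Decodable to the actual feed schema:
import Foundation

// Hypothetical shape of one page of the parts response.
struct PartsPage: Decodable {
    struct Part: Decodable { let url: String } // assumed field name
    let data: [Part]
}

// Paging sketch: advance `offset` by `limit` until a short page comes back.
func fetchAllPartURLs(developerToken: String) async throws -> [String] {
    let limit = 200
    var offset = 0
    var urls: [String] = []
    while true {
        let endpoint = "https://api.media.apple.com/v1/feed/exports/song_2024-11-02T16-02/parts?limit=\(limit)&offset=\(offset)"
        var request = URLRequest(url: URL(string: endpoint)!)
        request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
        let (data, _) = try await URLSession.shared.data(for: request)
        let page = try JSONDecoder().decode(PartsPage.self, from: data)
        urls += page.data.map(\.url)
        if page.data.count < limit { break } // short page = last page
        offset += limit
    }
    return urls
}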