Hi there,
I recently launched a DJ app on the Mac App Store, and I was wondering how I could access songs from Apple Music for mixing purposes, the way Serato, rekordbox, djay, and other DJ apps do?
Thanks,
Gunek
MusicKit
Let users play Apple Music and their local music library from your app using MusicKit.
Posts under the MusicKit tag (91 posts)
Hi everyone,
We’re currently developing a music-based app using MusicKit, and we recently noticed that iOS 26 beta introduces a new “Automix” feature in the Apple Music app. This enables seamless DJ-style transitions between songs—beyond the standard crossfade functionality.
We’re trying to understand:
Will this Automix feature be accessible to third-party apps that use MusicKit?
If not available in the initial iOS 26 release, is there a plan to expose it through public APIs in a future update?
Is there any technical documentation, WWDC session, or roadmap info regarding Automix support via MusicKit?
This functionality would be a significant enhancement for our app, especially for intelligent audio transitions and curated playlists.
Thanks.
Just updated my computer, phone, and dev tools to the latest versions of everything. Now when I run my app in a previously-working simulator (iPhone 16 w. iOS 18.5) I get:
Failed retrieving MusicKit tokens: fetching the developer token is not supported in the simulator when running on this version of macOS; please upgrade your Mac to macOS Ventura.
Also:
<ICCloudServiceStatusMonitor: 0x600003320e60>: Invoking 1 completion handler for MusicKit tokens. error=<ICError.DeveloperTokenFetchingFailed (-8200) "Failed to fetch media token from <AMSMediaTokenService: 0x6000029049a0>." { underlyingErrors: [ <AMSErrorDomain.300 "Token request encoding failed The token request encoder finished with an error." { userInfo: { AMSDescription : "Token request encoding failed", AMSFailureReason : "The token request encoder finished with an error." }; underlyingErrors: [ <AMSErrorDomain.5 "Anisette Failed Platform not supported" { userInfo: { AMSDescription : "Anisette Failed", AMSFailureReason : "Platform not supported" };
Anybody know what gives here? The Ventura message is absurd because I'm on Tahoe 26.1. The same code works on a physical phone running iOS 26.
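For reference, a minimal sketch of the kind of call that triggers the automatic developer-token fetch in the simulator (the search term is just a placeholder):
import MusicKit

// Any MusicKit request made in the simulator triggers the automatic developer-token
// fetch that produces the error above.
func reproduceTokenFetch() async {
    guard await MusicAuthorization.request() == .authorized else { return }
    do {
        var request = MusicCatalogSearchRequest(term: "test", types: [Song.self])
        request.limit = 1
        _ = try await request.response()
    } catch {
        print("MusicKit request failed: \(error)")
    }
}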
If I make a request to https://api.music.apple.com/v1/storefronts/us with the proper developer JWT in the header, I receive a successful response with a list of storefronts. If I remove the token, I get back a 401 error.
If I call any other catalog-based query, I get back a 500 error.
For instance: https://api.music.apple.com/v1/catalog/us/albums/310730204
returns a 500 error with the body being
{"message":"An unexpected error occurred"}
I'm not sure what I can do to fix this. Please help.
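For reference, a minimal URLSession sketch of the failing request (the developer token is supplied externally):
import Foundation

// Reproduces the catalog request above; a 500 with {"message":"An unexpected error occurred"}
// comes back even though the same token works for /v1/storefronts/us.
func fetchAlbum(developerToken: String) async throws {
    let url = URL(string: "https://api.music.apple.com/v1/catalog/us/albums/310730204")!
    var request = URLRequest(url: url)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    let (data, response) = try await URLSession.shared.data(for: request)
    let status = (response as? HTTPURLResponse)?.statusCode ?? -1
    print(status, String(data: data, encoding: .utf8) ?? "")
}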
My app is properly configured with MusicKit. I've generated a JWT using my valid credentials (Team ID, Key ID, private key), and I’ve ensured the time settings are correct via NTP.
When I call:
https://api.music.apple.com/v1/catalog/jp/search?term=ado&types=songs
I consistently receive a 500 Internal Server Error.
The JWT is generated using ES256 with valid iat and exp values. I’ve confirmed the token decodes properly using jwt.io, and it's passed via the Authorization: Bearer header.
Things I’ve confirmed:
Key ID, Team ID, private key are correct
App ID is configured with MusicKit capability
JWT is generated and signed correctly
macOS time is synced via NTP
Used both curl and Python to test — same result
Is there anything else I should check on the Apple Developer Console (like App ID, Certificates, or provisioning profile)?
Or could this be a backend issue on Apple’s side?
Any guidance would be appreciated.
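For reference, a hedged CryptoKit sketch of how the token is generated, matching the structure described above (ES256 signature, iss/iat/exp claims, kid in the header); the team ID, key ID, and .p8 contents are placeholders:
import Foundation
import CryptoKit

// Hedged sketch of the developer-token structure described above.
func makeDeveloperToken(teamID: String, keyID: String, p8PEM: String) throws -> String {
    func base64URL(_ data: Data) -> String {
        data.base64EncodedString()
            .replacingOccurrences(of: "+", with: "-")
            .replacingOccurrences(of: "/", with: "_")
            .replacingOccurrences(of: "=", with: "")
    }
    let now = Int(Date().timeIntervalSince1970)
    let header: [String: Any] = ["alg": "ES256", "kid": keyID]
    let claims: [String: Any] = ["iss": teamID, "iat": now, "exp": now + 12 * 60 * 60]

    let headerPart = base64URL(try JSONSerialization.data(withJSONObject: header))
    let claimsPart = base64URL(try JSONSerialization.data(withJSONObject: claims))
    let signingInput = Data("\(headerPart).\(claimsPart)".utf8)

    // ES256 in JWT uses the raw 64-byte (r || s) signature form.
    let key = try P256.Signing.PrivateKey(pemRepresentation: p8PEM)
    let signature = try key.signature(for: signingInput)
    return "\(headerPart).\(claimsPart).\(base64URL(signature.rawRepresentation))"
}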
I use
https://api.music.apple.com/v1/me/library/playlists/${playlistId}/tracks
to add tracks to a playlist I created.
How do I DELETE tracks from the playlist?
The documentation does not mention a method for this. I have tried calling DELETE methods in various combinations but nothing seems to work.
Is this possible?
Hi,
I’m an iOS developer building an app with a use case that needs advanced playback on Apple Music subscription streams, specifically:
• Real-time tempo change (BPM) during playback — i.e., time-stretch with key-lock, not just crossfade.
• Beat-matched transitions between tracks.
From what I can tell, this capability seems to exist only for approved partners and isn’t available through public MusicKit.
Question: What’s the official request path to be evaluated for that restricted partner entitlement (application form, questionnaire, NDA, or internal team/BD contact)? If the entitlement identifier is internal, how can I get my account routed to the right Apple Music team?
For reference, publicly announced partners include Algoriddim djay, Serato DJ Pro, rekordbox (AlphaTheta), and Engine DJ—all of which appear to implement mixing features that imply advanced playback (tempo/beat-matching) on Apple Music content. I’d prefer not to share product details publicly for the moment and can provide specifics privately if needed.
Thanks in advance!
Topic: Media Technologies. SubTopic: Audio. Tags: Apple Music API, FairPlay Streaming, MusicKit, AVFoundation
I'm new to iOS development and have been trying to make heads or tails of the documentation. I know the data fields returned for songs differ between the user library and the catalog, but whenever I search the Apple site I can't find a list of each. For example, I'm trying to get the releaseDate of a song in my library, and it seems I'll have to cross-query the catalog entry using song.catalogID or song.isrc, but when I try to use them I can't find a cross-reference between the two. I'm totally turned around.
I'm also trying to determine whether a song in my library has been favorited. isFavorited (or something similar) doesn't seem to be a thing. With the code below I'm trying to display a solid star if the song has been favorited and an empty one if it's not. It seems like a basic request, but I can't find anything on how to do it; I've searched the docs and Googled without luck.
Does Apple want us to query the user's Favorited Songs playlist or something? How do I know which playlist that is?
I know isFavorited isn't a thing, I'm just using it here so you can see what my intention is:
HStack(spacing: 10) {
    Image(systemName: song.isFavorited ? "star.fill" : "star")
        .foregroundColor(song.isFavorited ? .yellow : .gray)
    Image(systemName: "magnifyingglass")
}
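For what it's worth, here is the hedged cross-reference sketch I'm experimenting with for releaseDate: it assumes the library Song exposes a non-nil isrc (which may not always hold) and looks up the catalog counterpart from that:
import MusicKit

// Hedged sketch: resolve a library song to its catalog entry via ISRC and read releaseDate.
// Assumption: librarySong.isrc is non-nil; for some library items it may not be populated.
func catalogReleaseDate(for librarySong: Song) async throws -> Date? {
    guard let isrc = librarySong.isrc else { return nil }
    let request = MusicCatalogResourceRequest<Song>(matching: \.isrc, equalTo: isrc)
    let response = try await request.response()
    return response.items.first?.releaseDate
}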
If I fetch a library playlist like the generated "Favorites" playlist via MusicKit like this
guard let initialTracks = try await playlist.with([.tracks]).tracks else {
return nil
}
I get a list of tracks like this:
...
TrackID: i.e5gmPS6rZ856
TrackID: i.4ZQMxU0OxNg0
TrackID: i.J198KH4P85K4
TrackID: i.J1AaRC4P85K4
TrackID: i.4BPqWt0OxNg0
TrackID: 4473570282773028026
TrackID: 4473570282773028025
TrackID: 4015088256684964387
TrackID: 4473570282773028024
TrackID: 7541557725362154249
TrackID: 4473570282773028027
I save the IDs for later use, but when I want to fetch them, only the ones with IDs that start with "i." work.
static func getLibrarySong(from id: String) async -> Song? {
    var request = MusicLibraryRequest<Song>()
    request.filter(matching: \.id, equalTo: MusicItemID(id))
    do {
        let response = try await request.response()
        return response.items.first
    } catch {
        ...
    }
}
Or the Apple Music API endpoint:
static func getLibrarySongFromAPI(with id: String) async -> Song? {
    guard let url = AppleMusicURL.getURL(for: .getSongById, id: id) else {
        return nil
    }
    do {
        let dataRequest = MusicDataRequest(urlRequest: URLRequest(url: url))
        let dataResponse = try await dataRequest.response()
        let response = try JSONDecoder().decode(SongsResponse.self, from: dataResponse.data)
        return response.data.first
    } catch {
        ...
    }
}
Neither function above works for the plain numeric IDs like 4473570282773028024, so it seems the ID is wrong, but how do I make it work?
Otherwise I can fetch all the songs fine, in the catalog or in the library, but these few songs can't be fetched individually, only with the try await playlist.with([.tracks]) fetch that gets the whole playlist. And obviously that isn't always possible.
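For reference, here is the fallback I'm considering, on the assumption (not confirmed) that the plain numeric IDs are catalog identifiers rather than library identifiers:
import MusicKit

// Hedged sketch: try the library first, then fall back to the catalog.
// Assumption (not confirmed): the plain numeric track IDs are catalog song IDs.
func getSong(from id: String) async -> Song? {
    var libraryRequest = MusicLibraryRequest<Song>()
    libraryRequest.filter(matching: \.id, equalTo: MusicItemID(id))
    if let response = try? await libraryRequest.response(), let song = response.items.first {
        return song
    }
    let catalogRequest = MusicCatalogResourceRequest<Song>(matching: \.id, equalTo: MusicItemID(id))
    let catalogResponse = try? await catalogRequest.response()
    return catalogResponse?.items.first
}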
Thanks in advance!
Hi there,
I recently launched a DJ app on the Mac App Store, and I was wondering how I could access the raw data of songs from Apple Music, the way Serato, rekordbox, djay, etc. do?
Thanks,
Gunek
Topic: Media Technologies. SubTopic: General. Tags: Apple Music API, MusicKit, Performance Partners Program, Apple Music Feed
Fetching the featured artists in a playlist no longer works in the iOS 26.1 beta.
let detailedPlaylist = try await playlist.with([.tracks, .featuredArtists], preferredSource: .library)
This throws an error when using .library, and using .catalog returns an empty array.
It works correctly on iOS 26.0 and iOS 18.
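For reference, a minimal repro sketch of the behavior described above (the playlist is assumed to have been fetched from the user's library elsewhere):
import MusicKit

// Repro sketch: .library throws, .catalog returns an empty collection on the iOS 26.1 beta.
func loadFeaturedArtists(for playlist: Playlist) async {
    do {
        let detailed = try await playlist.with([.featuredArtists], preferredSource: .library)
        print("library featured artists:", detailed.featuredArtists?.count ?? 0)
    } catch {
        print("library source failed:", error)
    }
    if let detailed = try? await playlist.with([.featuredArtists], preferredSource: .catalog) {
        print("catalog featured artists:", detailed.featuredArtists?.count ?? 0)
    }
}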
I'm wondering if anyone has run into issues with currentPlaybackRate in the released version of iOS 26.0?
There is no such issue on iOS 18.5.
When I change currentPlaybackRate on iOS 26.0, the value seems to be forcibly reset to 0 or 1.0, regardless of whether the content is DRM or non-DRM. Playback also becomes unstable and noisy whenever currentPlaybackRate is not 1.0, and the player frequently switches between the stopped and playing states.
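For reference, a minimal repro sketch of the pattern described above, assuming the MediaPlayer application music player (the exact player type is an assumption on my part):
import Foundation
import MediaPlayer

// Repro sketch: set a non-default rate, then log it a moment later.
// On iOS 26.0 the rate reportedly snaps back to 0 or 1.0.
func reproducePlaybackRateIssue() {
    let player = MPMusicPlayerController.applicationMusicPlayer
    player.play()
    player.currentPlaybackRate = 1.25
    DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
        print("currentPlaybackRate:", player.currentPlaybackRate)
    }
}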
Hi everyone,
I’m working on an iOS MusicKit app that overlays a metronome on top of Apple Music playback. To line the clicks up perfectly I’d like access to low-level audio analysis data—ideally a waveform / spectrogram or beat grid—while the track is playing.
I’ve noticed that several approved DJ apps (e.g. djay, Serato, rekordbox) can already:
• Display detailed scrolling waveforms of Apple Music songs
• Scratch, loop or time-stretch those tracks in real time
That implies they receive decoded PCM frames or at least high-resolution analysis data from Apple Music under a special entitlement.
My questions:
1. Does MusicKit (or any public framework) expose real-time audio buffers, FFT bins, or beat markers for streaming Apple Music content?
2. If not, is there an Apple program or entitlement that developers can apply for—similar to the “DJ with Apple Music” initiative—to gain that deeper access?
3. Where can I find official documentation or a point of contact for this kind of request?
I’ve searched the docs and forums but only see standard MusicKit playback APIs, which don’t appear to expose raw audio for DRM-protected songs. Any guidance, links or insider tips on the proper application process would be hugely appreciated!
Thanks in advance.
Hey there,
We've been seeing a high rate of 403 - Invalid Authentication on the v1/me/library/artists endpoint for the past few days.
Does anyone have the same issue?
I’m trying to automate Apple Music on macOS Tahoe 26 using ScriptingBridge. Scripts that previously worked for controlling playback, fetching track info, or manipulating playlists no longer function.
For example, code like this used to work:
import ScriptingBridge
let music = SBApplication(bundleIdentifier: "com.apple.Music") as! MusicApplication
print(music.currentTrack?.name ?? "No track playing")
But now it fails, returning nil for track info and failing to send playback commands.
Questions:
Has ScriptingBridge been deprecated or broken in Tahoe 26 for Apple Music?
Any guidance or example code would be appreciated.
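For reference, a hedged diagnostic I've been considering: running the same query through NSAppleScript to distinguish a ScriptingBridge-specific failure from denied Apple Events automation consent (error -1743):
import Foundation

// Diagnostic sketch: query the current track via NSAppleScript instead of ScriptingBridge.
// Requires an NSAppleEventsUsageDescription entry in Info.plist (and, for sandboxed apps,
// the com.apple.security.automation.apple-events entitlement).
func checkCurrentTrackViaAppleScript() {
    let source = "tell application \"Music\" to get name of current track"
    guard let script = NSAppleScript(source: source) else { return }
    var error: NSDictionary?
    let result = script.executeAndReturnError(&error)
    if let error {
        print("AppleScript error:", error)   // -1743 indicates automation consent was denied
    } else {
        print("Current track:", result.stringValue ?? "unknown")
    }
}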
Hi, when using ApplicationMusicPlayer from MusicKit, my app automatically gets the media controls on the lock screen: play/pause, skip buttons, playback position, etc.
I would like to customize these. I've tried a bunch of things, e.g. using MPRemoteCommandCenter, but so far I haven't had any success.
Does anyone know how I can customize the media controls of ApplicationMusicPlayer?
Thank you.
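For reference, a sketch of roughly the MPRemoteCommandCenter approach I tried (whether ApplicationMusicPlayer honors these overrides is exactly my question):
import MediaPlayer
import MusicKit

// Sketch: disable some default commands and attach custom handlers.
// ApplicationMusicPlayer manages the system now-playing UI itself, so these
// overrides may simply be ignored, which is the behavior being asked about.
func configureRemoteCommands() {
    let center = MPRemoteCommandCenter.shared()
    center.nextTrackCommand.isEnabled = false
    center.changePlaybackPositionCommand.isEnabled = false
    _ = center.playCommand.addTarget { _ in
        Task { try? await ApplicationMusicPlayer.shared.play() }
        return .success
    }
    _ = center.pauseCommand.addTarget { _ in
        ApplicationMusicPlayer.shared.pause()
        return .success
    }
}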
Please add a currently-playing-track endpoint to the Apple Music API. It's kind of wild that Apple Music goes after Spotify without having such a useful endpoint.
Hi folks,
I'm trying to generate a provisioning profile that includes both HealthKit and MusicKit entitlements.
The HealthKit piece is fine and included in the profile. However, despite selecting MusicKit under Services in the App ID setup, the entitlement doesn't seem to be included in the profile.
Other steps taken: set up the app in App Store Connect and generated a Media ID and key. I've tried both automatic and manual signing.
Are there specific tricks to getting this one to work?
I’m running HomePod OS 26 on two HomePod minis and OS 18.6 on my main HomePod (original).
I’ve enabled Crossfade in the Home app.
I’m playing Apple Music directly on the HomePod mini.
Crossfade just doesn’t work on any HomePod.
I could understand it not working on the original HomePod, but why isn’t it working on the minis running OS 26?
I’ve tried disabling and re-enabling Crossfade, rebooting the HomePods, etc., but nothing helps.