I take it that MusicKit JS is built with TypeScript, based on the attributions in the script: https://js-cdn.music.apple.com/musickit/v3/musickit.js
The script points to https://js-cdn.music.apple.com/musickit/v1/acknowledgements.txt – I assume this should be the v3 URL for the v3 version? It returns the same content nonetheless.
This contains attributions for TypeScript.
Currently there's a third-party effort on DefinitelyTyped, which publishes the NPM package @types/musickit-js. The latest version it supports is v1.
However, there is no version compatible with v3.
This makes it hard to use MusicKit JS v3 in a TypeScript project.
Please publish the types, ideally on the CDN alongside the musickit.js file. Also consider publishing an officially Apple-supported DefinitelyTyped package, or helping to maintain the existing @types/musickit-js, to make consuming this even easier.
Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.
I'm getting an interesting error attempting to compile my app in the Xcode 26 beta.
error: Unable to find module dependency: '_MediaPlayer_AppIntents' (in target 'icatcher' from project 'icatcher')
note: A dependency of main module 'MainModuleCrossImportOverlays' (in target 'icatcher' from project 'icatcher')
Unable to find module dependency: '_MediaPlayer_AppIntents'
Not sure what to try, or what to pull, to fix this issue.
Hi,
I'm sending an API request to:
https://api.music.apple.com/v1/me/library/playlists?limit=$limit&offset=$offset
To list all of the user's library playlists; however, the resulting objects do not contain the playlist artwork in the JSON. I've tried adding the extend and include query parameters as well, but to no avail.
A partial example of the response:
{"id": PLAYLIST_ID, "type": "library-playlists", "href": "/v1/me/library/playlists/PLAYLIST_ID", "attributes": {"lastModifiedDate": "2024-09-18T20:18:24Z", "canEdit": true, "name": "Afro Party Anthems", "isPublic": false, "description": {"standard": "Definitive African party starters"}, "hasCatalog": false, "dateAdded": "2022-03-10T18:30:56Z", "playParams": {"id": PLAYLIST_ID, "kind": "playlist", "isLibrary": true}}, "relationships": {"catalog": {"href": "/v1/me/library/playlists/PLAYLIST_ID/catalog", "data": []}}}
Is there a way to get the artwork URL without sending a request for each playlist? And if not can this be fixed?
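For reference, here is a minimal sketch of the request as issued (Swift, URLSession; the developer and user tokens are placeholders, and the include/extend values shown are only examples of what was tried):

import Foundation

// Minimal sketch of the library-playlists request described above.
// developerToken and userToken are placeholders; the include/extend values
// are illustrative, not a confirmed way to get artwork back.
func fetchLibraryPlaylists(limit: Int, offset: Int,
                           developerToken: String,
                           userToken: String) async throws -> Data {
    var components = URLComponents(string: "https://api.music.apple.com/v1/me/library/playlists")!
    components.queryItems = [
        URLQueryItem(name: "limit", value: String(limit)),
        URLQueryItem(name: "offset", value: String(offset)),
        URLQueryItem(name: "include", value: "catalog"),
        URLQueryItem(name: "extend", value: "artwork"),
    ]
    var request = URLRequest(url: components.url!)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    request.setValue(userToken, forHTTPHeaderField: "Music-User-Token")
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}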
It's been an ask for a few years, and I'm wondering if there are any plans, or whether the '26 SDKs/tools allow Apple Music to work in the Simulator. I develop for the Vision Pro, so the usual 'fix' of running on the device is a bit of a hard ask.
At the very least, a small sample library that works in the Simulator would be welcome (similar to how Photos works).
Cheers
It's been well over a year since Apple added favoriting of artists back to Apple Music (the little star icon on an artist page), yet I still haven't seen a way to get this data for an authenticated user from the Apple Music API. I was expecting to hear something about this during WWDC, but there have been no announcements that I've caught.
Has anyone else heard anything? People assume that when they provide access to their Apple Music account we can actually get to the data in that account, and we end up looking a little dumb when we can't retrieve this core data.
I am using https://developer.apple.com/documentation/applemusicapi/add-tracks-to-a-library-playlist to add tracks to playlists. This endpoint works fine for all playlists except collaborative playlists.
For collaborative playlists I get the following 500 error as a response:
"errors": [
{
"id": "<some id>",
"title": "Upstream Service Error",
"detail": "Unable to update tracks",
"status": "500",
"code": "50001"
}
]
}
Steps to reproduce:
Create a playlist in your library.
Use the API to add a song (a request sketch follows these steps).
Confirm that it works.
Make that same playlist collaborative.
Update the playlist ID in your API request (making a playlist collaborative changes its ID).
Confirm that you get the 500 error.
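A minimal sketch of the step-2 request (Swift, URLSession; the tokens and IDs are placeholders):

import Foundation

// Adds one catalog song to a library playlist via the endpoint linked above.
// developerToken, userToken, playlistID and songID are placeholders.
func addSong(_ songID: String,
             toPlaylist playlistID: String,
             developerToken: String,
             userToken: String) async throws -> Int {
    let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    request.setValue(userToken, forHTTPHeaderField: "Music-User-Token")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(
        withJSONObject: ["data": [["id": songID, "type": "songs"]]])
    let (_, response) = try await URLSession.shared.data(for: request)
    // 2xx for a regular playlist; 500 once the playlist has been made collaborative.
    return (response as! HTTPURLResponse).statusCode
}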
With older iOS versions, when the user tapped the Mute/Volume button on AVPlayerViewController to unmute, the system restored the device volume to the level it had before the user muted.
On iOS 26, when the user taps the unmute button on screen, the volume starts from 0 (it is not restored), although it is still restored if the user unmutes by pressing the physical volume buttons.
As I understand it, the volume bar/button on AVPlayerViewController is an MPVolumeView, and I cannot control it, so this is system behavior.
But I have received complaints that this is a bug, and I could not find any documentation describing this change in the Mute button behavior.
I need some basis to explain this situation. Thank you.
Dear Apple Developer Team,
On iOS 26, the contents of PDF pages appear to be swapped.
Could you please advise if there is a workaround or a planned fix for this issue?
Steps to Reproduce:
Download the attached PDF on iOS 26.
Open the PDF in the Files app.
Tap the PDF to view it in Quick Look.
Navigate to page 5.
Expected Result:
The page number displayed at the bottom should be 5.
Actual Result:
The page number displayed at the bottom is 4.
Issue:
This is not limited to page 5; the contents of multiple pages appear to be swapped.
I have also submitted feedback via Feedback Assistant (FB20743531) on October 20.
Best regards,
Yoshihito Suezawa
Hello,
We are trying to use the new Background Upload Extension to improve uploads of assets (Photos, Live Photos, Videos) in the background in our application.
1 - We implemented the code to upload still images.
We would like to do the same for Live Photos, but we are not sure how to proceed, since a Live Photo is composed of 2 resources that we would like to upload in the same job so that we keep the association between the 2 resources.
Steps:
A Live Photo is captured on the device.
Our application is notified of new content in the Photo Library, and the asset is queued for upload by our application.
The system calls the background upload extension and the Live Photo is prepared for upload, but we can schedule a job for only one resource (still photo or paired video), so we cannot upload both resources in a single job.
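For context, this is roughly how the two resources of a Live Photo can be enumerated (a minimal sketch, assuming the queued PHAsset is accessible; it does not answer how to schedule both in one upload job):

import Photos

// Returns the still-photo and paired-video resources of a Live Photo asset.
// `asset` is assumed to be the PHAsset queued for upload.
func livePhotoResources(for asset: PHAsset) -> (photo: PHAssetResource?, pairedVideo: PHAssetResource?) {
    let resources = PHAssetResource.assetResources(for: asset)
    let photo = resources.first { $0.type == .photo }
    let pairedVideo = resources.first { $0.type == .pairedVideo }
    return (photo, pairedVideo)
}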
2 - Is there a way to synchronise the application and the extension so that the application does not process the same data or execute the same requests as the extension? For example, our application works with tokens, and we would like to prevent a token from being consumed by the application and the extension at the same time.
3 - Is there a command in Xcode or Terminal to start or stop the extension, similar to what exists for processing tasks (https://developer.apple.com/documentation/backgroundtasks/starting-and-terminating-tasks-during-development)?
I’m trying to build a playlist editor on macOS. I can create playlists via the Apple Music HTTP API, but DELETE always returns 401, even immediately after creation with the same tokens.
Minimal repro:
#!/usr/bin/env bash
set -euo pipefail
BASE_URL="https://api.music.apple.com/v1"
PLAYLIST_NAME="${PLAYLIST_NAME:-blah}"
: "${APPLE_MUSIC_DEV_TOKEN:?}"
: "${APPLE_MUSIC_USER_TOKEN:?}"
create_body="$(mktemp)"
delete_body="$(mktemp)"
trap 'rm -f "$create_body" "$delete_body"' EXIT
curl -sS --compressed -o "$create_body" -w "Create status: %{http_code}\n" \
  -X POST "${BASE_URL}/me/library/playlists" \
  -H "Authorization: Bearer ${APPLE_MUSIC_DEV_TOKEN}" \
  -H "Music-User-Token: ${APPLE_MUSIC_USER_TOKEN}" \
  -H "Content-Type: application/json" \
  -d "{\"attributes\":{\"name\":\"${PLAYLIST_NAME}\"}}"
playlist_id="$(python3 - "$create_body" <<'PY'
import json, sys
with open(sys.argv[1], "r", encoding="utf-8") as f:
    data = json.load(f)
print(data["data"][0]["id"])
PY
)"
curl -sS --compressed -o "$delete_body" -w "Delete status: %{http_code}\n" \
  -X DELETE "${BASE_URL}/me/library/playlists/${playlist_id}" \
  -H "Authorization: Bearer ${APPLE_MUSIC_DEV_TOKEN}" \
  -H "Music-User-Token: ${APPLE_MUSIC_USER_TOKEN}" \
  -H "Content-Type: application/json"
I capture the response bodies like this:
cat "$create_body"
cat "$delete_body"
Result:
Create: 201
Delete: 401
I also checked the latest macOS SDK’s MusicKit interfaces, and MusicLibrary.createPlaylist/edit/add(to:) are marked @available(macOS, unavailable), so I can’t create/delete via MusicKit on macOS either.
Question: How can I implement a playlist editor on macOS (create/delete/modify) if:
MusicKit write APIs are unavailable on macOS, and
The HTTP API can create but DELETE returns 401?
Any guidance or official workaround would be hugely appreciated.
I’ve been using the Apple Music API for quite a while now, and a recent change must have happened that is quite disruptive.
On many occasions, artists release singles from an album as part of promoting that album. For recent examples, Harry Styles released “Aperture” (a single) to promote his upcoming album “Kiss All The Time. Disco, Occasionally”. Similarly, Bruno Mars released “I Just Might”, a single from the upcoming album “The Romantic”.
Previously, those would be returned by the “artists/{artist_id}/albums” endpoint with a “- Single” suffix. But it seems a recent change happened where they now only appear as playable tracks inside the album.
This behavior is also evident in the Apple Music app itself. Those singles no longer appear under “Singles & EPs”. Instead, they are only visible if the single becomes popular enough to be shown under “Top Songs”. Otherwise one would have to know to tap on the (future) album to discover whether there are released singles.
Meanwhile, Spotify’s API returns those as singles properly, just like the Apple Music API used to.
This change must be recent, but the question is whether it’s intentional, and if so, how the API can be used from here on out to “extract” those singles and represent them.
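One thing worth checking is whether the catalog artist “singles” view still lists these releases. A minimal sketch (Swift, URLSession; developerToken and artistID are placeholders, and the storefront is hard-coded to "us" for brevity):

import Foundation

// Fetches the artist's "singles" view from the catalog, as an alternative to
// scanning artists/{artist_id}/albums for "- Single" entries.
func fetchArtistSingles(artistID: String, developerToken: String) async throws -> Data {
    let url = URL(string: "https://api.music.apple.com/v1/catalog/us/artists/\(artistID)/view/singles")!
    var request = URLRequest(url: url)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}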
Is it possible to perform speech-to-text using AVAudioEngine to capture microphone input while being on a FaceTime call at the same time?
I tried implementing this, but whenever I attempt to start the AVAudioEngine while a FaceTime call is active, I get the following error:
“The operation couldn’t be completed. (OSStatus error 2003329396)”
I assume this might be due to microphone resource restrictions during FaceTime, but I’d like to confirm whether this limitation is at the system level or if there’s any possible workaround or entitlement that allows concurrent microphone access.
Has anyone encountered this issue or found a solution?
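For reference, a minimal sketch of the kind of setup that triggers the error (the speech-recognition plumbing is omitted, and the session options shown are just one configuration to try; they may not overcome a system-level restriction during a call):

import AVFoundation

// Configure the audio session for recording alongside other audio, then tap the mic.
// Per the report above, starting the engine fails with OSStatus 2003329396 while a
// FaceTime call is active.
func startListening() throws -> AVAudioEngine {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .voiceChat, options: [.mixWithOthers])
    try session.setActive(true)

    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // Feed `buffer` into an SFSpeechAudioBufferRecognitionRequest here.
    }
    try engine.start()
    return engine
}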
On macOS 26.2 (Tahoe), the Preview app fails to render many USDZ models correctly.
The issue appears inconsistent:
• Some USDZ files open normally in Preview
• Some USDZ files open but the viewport turns completely black (no geometry, no material, no lighting)
• All the same files open correctly when opened using “Open With → Xcode”
This was not the behavior on macOS 15.7.2, where the exact same USDZ files rendered consistently in Preview without failures.
Hello,
I'm new to the Swift MusicKit API and am starting with the implementation in iOS 16.
I'm getting stuck on an issue where there is no background or text color associated with the Artwork object. Is this something you have to make an additional property request for, and if so, how do you do that?
// Look up the album in the catalog by its identifier.
let catalogSearch = MusicCatalogResourceRequest<Album>(matching: \.id, equalTo: item.id)
let catalogResponse = try await catalogSearch.response()
guard let firstItem = catalogResponse.items.first else {
    return
}
In this example, firstItem.artwork only contains the URL and what look like incorrect maximum width/height values.
Here's a printout of firstItem.artwork:
Optional(Artwork(
urlFormat: "musicKit://artwork/library/5F37858D-F46B-4F12-BA67-40FA8DD63D87/{w}x{h}?at=item&fat=&id=7718670444435992305&lid=5F37858D-F46B-4F12-BA67-40FA8DD63D87&mt=music&aat=Music122/v4/37/25/f5/3725f515-249f-7b91-77bb-f479cd48201c/22UMGIM32254.rgb.jpg",
maximumWidth: 0,
maximumHeight: 0
))
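For what it's worth, the color properties exist on Artwork as optionals; reading them (continuing from the snippet above) looks like this, though whether they are ever populated for library-resolved artwork like the one printed here is exactly what I'm unsure about:

// backgroundColor and primaryTextColor are optional CGColor values on Artwork.
if let artwork = firstItem.artwork {
    print(artwork.backgroundColor as Any)   // nil here
    print(artwork.primaryTextColor as Any)  // nil here
}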
I often find that basic actions in MusicKit are incredibly slow compared to Apple's Music app. I've tried different versions, devices, and networks, as well as Apple's sample code, throughout the last several years, and it is all the same. Does anyone else have this issue?
I'm wondering if anyone has run into issues with currentPlaybackRate in the released version of iOS 26.0.
There is no such issue on iOS 18.5.
When I change currentPlaybackRate on iOS 26.0, it seems to be forcibly reset to 0 or 1.0, regardless of whether the content is DRM or non-DRM. Playback is also abnormal and unstable, with noise, whenever currentPlaybackRate is not 1.0, and it frequently flips between the stopped and playing states.
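A minimal sketch of the call in question (the application music player is assumed here; the actual player setup may differ):

import MediaPlayer

// Set a non-default playback rate on the application music player.
let player = MPMusicPlayerController.applicationMusicPlayer
player.play()
player.currentPlaybackRate = 1.5   // expected 1.5; observed snapping to 0 or 1.0 on iOS 26.0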
Hi team,
In the Apple Music Feed datasets, we've noticed some unexpected values in the song and album tables.
The primaryartists column from either song or album may contain a "non-default" artist name such as the katakana name shown in the example below:
select id, name, namedefault, primaryartists from amf_song where id = '1698723329'

id         | name                 | namedefault | primaryartists
-----------+----------------------+-------------+------------------------------------------
1698723329 | {default=California} | California  | [{id=1264818718, name=チャペル・ローン}]

select * from amf_artist where id = '1264818718'

id         | name                                         | namedefault   | namepronunciation
-----------+----------------------------------------------+---------------+---------------------
1264818718 | {default=Chappell Roan, ja=チャペル・ローン} | Chappell Roan | {ja=チャペルローン}
Shouldn't the primaryartists column be showing the namedefault instead of the Japanese language version?
When can we expect this bug to be resolved?
Thanks,
If I make a request to https://api.music.apple.com/v1/storefronts/us with the proper developer JWT token in the header, I receive a successful response with a list of storefronts. If I remove the token, I do get back a 401 error.
If I call any other catalog-based query, I get back a 500 error.
For instance: https://api.music.apple.com/v1/catalog/us/albums/310730204
returns a 500 error with the body being
{"message":"An unexpected error occurred"}
I'm not sure what I can do to fix this. Please help.
macOS version: 26.0 (25A354)
Xcode version: Version 26.0 (17A324)
The project fails to compile with the following error:
SwiftExplicitDependencyCompileModuleFromInterface arm64 /Users/zhz/Library/Developer/Xcode/DerivedData/ModuleCache.noindex/AssetsLibrary-HTIJ05N58KN3.swiftmodule
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:10:25: error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
8 | public import _StringProcessing
9 | public import _SwiftConcurrencyShims
10 | extension AssetsLibrary.ALAssetsLibrary {
| `- error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
11 | #if compiler(>=5.3) && $NonescapableTypes
12 | @available(iOS, introduced: 9.0, deprecated: 9.0, obsoleted: 26.0)
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/System/Library/Frameworks/AssetsLibrary.framework/Headers/ALAssetsLibrary.h:80:12: note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
78 |
79 | OS_EXPORT AL_DEPRECATED(4, "Use PHPhotoLibrary from the Photos framework instead")
80 | @interface ALAssetsLibrary : NSObject {
| `- note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
81 | @package
82 | id _internal;
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:1:1: error: failed to build module 'AssetsLibrary'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.17.14 clang-1700.3.17.1)', while this compiler is 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.19.9 clang-1700.3.19.1)'). Please select a toolchain which matches the SDK.
My app has been using the iTunes Search API (itunes.apple.com/search) for a few years now, but at some point over the last week or so (late Sept. 2025) it stopped returning track results with explicit content, regardless of whether I provide "explicit=Yes" (which is the default anyway, according to the API documentation: https://performance-partners.apple.com/search-api). Has anyone else experienced this with this API, and have you figured out a workaround?
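A minimal sketch of the kind of query affected (Swift, URLSession; the search term is a placeholder, and explicit=Yes is passed even though the documentation says it is the default):

import Foundation

// Search for songs, explicitly asking for explicit content to be included.
func searchSongs(term: String) async throws -> Data {
    var components = URLComponents(string: "https://itunes.apple.com/search")!
    components.queryItems = [
        URLQueryItem(name: "term", value: term),
        URLQueryItem(name: "media", value: "music"),
        URLQueryItem(name: "entity", value: "song"),
        URLQueryItem(name: "explicit", value: "Yes"),
    ]
    let (data, _) = try await URLSession.shared.data(from: components.url!)
    return data
}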
FYI, I also use the more robust Apple Music API in another part of my app, which isn't affected by this issue, so I know it's technically an alternative. I just need to stick with the iTunes Search API in this particular case. Thanks.