I take it that MusicKit JS is built with TypeScript, based on the attributions in the script: https://js-cdn.music.apple.com/musickit/v3/musickit.js
In the script it points to https://js-cdn.music.apple.com/musickit/v1/acknowledgements.txt – I assume this should be the v3 URL for the v3 version? It returns the same content nonetheless.
This contains attributions for TypeScript.
Currently there's a third-party effort on DefinitelyTyped, which publishes the npm package @types/musickit-js. The latest version it supports is v1; there is no version compatible with v3.
This makes it hard to use MusicKit JS v3 in a TypeScript project.
Please publish the types, ideally on the CDN alongside the musickit.js file. Also consider publishing an officially Apple-supported DefinitelyTyped package, or helping to maintain the existing @types/musickit-js, to make consuming this even easier.
When I create an SFSpeechRecognizer object, I find that SFLocalSpeechRecognitionClient remains in memory and is never released.
You can reproduce this with a demo containing a single UIButton whose touch action is:
SFSpeechRecognizer(locale: Locale(identifier: "zh_CN"))
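A fuller sketch of that demo (the view controller and button wiring are illustrative):

import UIKit
import Speech

final class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let button = UIButton(type: .system)
        button.setTitle("Create recognizer", for: .normal)
        button.addTarget(self, action: #selector(createRecognizer), for: .touchUpInside)
        button.frame = CGRect(x: 100, y: 200, width: 200, height: 44)
        view.addSubview(button)
    }

    // The recognizer goes out of scope immediately, yet
    // SFLocalSpeechRecognitionClient stays resident in memory.
    @objc private func createRecognizer() {
        _ = SFSpeechRecognizer(locale: Locale(identifier: "zh_CN"))
    }
}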
I maintain a couple of CoreImage libraries that provide custom Metal-kernel-backed CIFilters. On iOS/iPadOS 26, the CIColorKernel.apply() method invoked in the CIFilter subclass fails to add the coreimage::destination parameter to the Metal function call:
-[CIColorKernel applyWithExtent:arguments:options:] argument count mismatch for kernel 'FractalNoise3D', expected 13 but saw 12.
I've compiled the code with Xcode 26 and deployed to iOS 18 devices without any breakage, so this is definitely an iOS problem, not an Xcode problem.
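For context, the failing pattern is essentially this (a sketch; the kernel name and argument list are illustrative, not the library's actual code):

import CoreImage

final class NoiseFilter: CIFilter {
    // Load the color kernel from the app's compiled Metal library.
    static let kernel: CIColorKernel = {
        let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
        let data = try! Data(contentsOf: url)
        return try! CIColorKernel(functionName: "fractalNoise3D", fromMetalLibraryData: data)
    }()

    override var outputImage: CIImage? {
        // On iOS 18 this succeeds; on iOS/iPadOS 26 it fails with the
        // argument-count mismatch because coreimage::destination is not appended.
        Self.kernel.apply(extent: CGRect(x: 0, y: 0, width: 512, height: 512),
                          arguments: [] /* kernel parameters elided */)
    }
}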
Library here: https://github.com/JoshuaSullivan/SimplexNoiseFilter
Feedback ID: FB17874311
I'm getting an interesting error when attempting to compile my app in Xcode 26 beta:
error: Unable to find module dependency: '_MediaPlayer_AppIntents' (in target 'icatcher' from project 'icatcher')
note: A dependency of main module 'MainModuleCrossImportOverlays' (in target 'icatcher' from project 'icatcher')
Unable to find module dependency: '_MediaPlayer_AppIntents'
I'm not sure what to try next to fix this issue.
Hi,
I'm sending an API request to:
https://api.music.apple.com/v1/me/library/playlists?limit=$limit&offset=$offset
to list all of the user's library playlists. However, the resulting objects do not contain the playlist artwork in the JSON. I've tried adding the extend and include query parameters as well, but to no avail.
A partial example of the response:
{"id": PLAYLIST_ID, "type": "library-playlists", "href": "/v1/me/library/playlists/PLAYLIST_ID", "attributes": {"lastModifiedDate": "2024-09-18T20:18:24Z", "canEdit": true, "name": "Afro Party Anthems", "isPublic": false, "description": {"standard": "Definitive African party starters"}, "hasCatalog": false, "dateAdded": "2022-03-10T18:30:56Z", "playParams": {"id": PLAYLIST_ID, "kind": "playlist", "isLibrary": true}}, "relationships": {"catalog": {"href": "/v1/me/library/playlists/PLAYLIST_ID/catalog", "data": []}}}
Is there a way to get the artwork URL without sending a request for each playlist? And if not, can this be fixed?
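For reference, the request looks essentially like this (a sketch; the tokens are placeholders, and extend/include reflect the variants I tried):

import Foundation

func libraryPlaylistsRequest(developerToken: String, userToken: String,
                             limit: Int, offset: Int) -> URLRequest {
    var components = URLComponents(string: "https://api.music.apple.com/v1/me/library/playlists")!
    components.queryItems = [
        URLQueryItem(name: "limit", value: String(limit)),
        URLQueryItem(name: "offset", value: String(offset)),
        // Variants tried without success:
        URLQueryItem(name: "extend", value: "artwork"),
        URLQueryItem(name: "include", value: "catalog")
    ]
    var request = URLRequest(url: components.url!)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    request.setValue(userToken, forHTTPHeaderField: "Music-User-Token")
    return request
}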
I am getting high error rates from the Apple Music API. This has been happening for months now, and it is quite frustrating. It is a mix of 404, 504, and random 500 errors. I hit these endpoints all of the time, so it is not like I am hitting a resource that doesn't exist. Why is this happening? Is this a known issue that is getting worked on?
I have an app (currently in development) that needs to use FFmpeg, so I searched for how to embed FFmpeg in Apple apps and found this article: https://doc.qt.io/qt-6/qtmultimedia-building-ffmpeg-ios.html
It works correctly for iOS but not for macOS (I have made macOS-specific changes using ChatGPT and traditional web searching).
Drive link for the file and instructions which I'm following: https://drive.google.com/drive/folders/11wqlvb8SU2thMSfII4_Xm3Kc2fPSCZed?usp=share_link
Could someone from Apple, or anyone else, help me figure out what I'm doing wrong?
Use case: When SharePlay -ing a fully immersive 3D scene (e.g. a virtual stage), I would like to shine lights on specific Personas, so they show up brighter when someone in the scene is recording the feed (think a camera person in the scene wearing Vision Pro).
Note: This spotlight effect only needs to render in the camera person's headset and does NOT need to be journaled or shared.
Before I dive into this, my technical question: can environmental and/or scene lighting affect Persona brightness in a SharePlay session? If not, is there a way to programmatically make Personas "brighter" when recording?
My screen recordings always seem to turn out darker than what's rendered in the environment, and manually adjusting the contrast tends to blow out the details in a Persona's face (especially in visionOS 26).
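For concreteness, this is the kind of light I'd attach (a RealityKit sketch; whether Personas respond to scene lights at all is exactly the open question):

import RealityKit

// Hypothetical helper: aim a spotlight at a Persona's position.
func makeSpotlight(at position: SIMD3<Float>, target: SIMD3<Float>) -> Entity {
    let light = Entity()
    var spot = SpotLightComponent()
    spot.intensity = 6_000           // tune experimentally
    spot.innerAngleInDegrees = 20
    spot.outerAngleInDegrees = 45
    light.components.set(spot)
    light.look(at: target, from: position, relativeTo: nil)
    return light
}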
My app is properly configured with MusicKit. I've generated a JWT using my valid credentials (Team ID, Key ID, private key), and I’ve ensured the time settings are correct via NTP.
When I call:
https://api.music.apple.com/v1/catalog/jp/search?term=ado&types=songs
I consistently receive a 500 Internal Server Error.
The JWT is generated using ES256 with valid iat and exp values. I’ve confirmed the token decodes properly using jwt.io, and it's passed via the Authorization: Bearer header.
Things I’ve confirmed:
Key ID, Team ID, private key are correct
App ID is configured with MusicKit capability
JWT is generated and signed correctly
macOS time is synced via NTP
Used both curl and Python to test — same result
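For reference, the token is minted essentially like this (a minimal CryptoKit sketch; the team ID, key ID, and .p8 contents are placeholders):

import Foundation
import CryptoKit

func makeDeveloperToken(teamID: String, keyID: String, p8PEM: String) throws -> String {
    // Base64url without padding, as JWT requires.
    func b64url(_ data: Data) -> String {
        data.base64EncodedString()
            .replacingOccurrences(of: "+", with: "-")
            .replacingOccurrences(of: "/", with: "_")
            .replacingOccurrences(of: "=", with: "")
    }
    let header = #"{"alg":"ES256","kid":"\#(keyID)"}"#
    let now = Int(Date().timeIntervalSince1970)
    let payload = #"{"iss":"\#(teamID)","iat":\#(now),"exp":\#(now + 15_000)}"#
    let signingInput = b64url(Data(header.utf8)) + "." + b64url(Data(payload.utf8))
    let key = try P256.Signing.PrivateKey(pemRepresentation: p8PEM)
    let signature = try key.signature(for: Data(signingInput.utf8))   // ES256, raw r||s
    return signingInput + "." + b64url(signature.rawRepresentation)
}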
Is there anything else I should check on the Apple Developer Console (like App ID, Certificates, or provisioning profile)?
Or could this be a backend issue on Apple’s side?
Any guidance would be appreciated.
It's been an ask for a few years, and I'm wondering if there are any plans, or whether the '26 SDKs/tools allow Apple Music to work in the simulator. I develop for the Vision Pro, so the usual 'fix' of running on the device is a bit of a hard ask.
At the very least, a small sample library that works in the simulator would be welcome (similar to how Photos works).
Cheers
In iOS 26, AVSpeechSynthesizer reads Mandarin text with Cantonese pronunciation.
No matter how I set the language, or change my phone's system settings, it doesn't work.
import AVFoundation

let utterance = AVSpeechUtterance(string: "你好啊")
//let voice = AVSpeechSynthesisVoice(language: "zh-CN") // doesn't work
let voice = AVSpeechSynthesisVoice(language: "zh-Hans") // doesn't work either
utterance.voice = voice
let synth = AVSpeechSynthesizer()
synth.speak(utterance)
Hi Apple Music API / MusicKit / MediaPlayer Team,
Similar to how currentPlaybackRate keeps the same pitch, it would be great to have a currentPlaybackPitch parameter as well. Alternatively, adding a preservesPitch parameter would also work.
I see that iOS 26 AutoMix on Apple Music currently does pitch shifting during music transitions, so maybe this is something that could be exposed in later betas of iOS 26?
The main feature request we get is for simple pitch changes to the Apple Music audio we play through our app. Is this being considered?
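For context, this control already exists for self-rendered audio via AVAudioUnitTimePitch; the ask is equivalent control for Apple Music playback, which does not go through an AVAudioEngine graph like this sketch:

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()
timePitch.pitch = 300      // in cents; +300 = up three semitones
timePitch.rate = 1.25      // rate change, with pitch controlled independently

engine.attach(player)
engine.attach(timePitch)
engine.connect(player, to: timePitch, format: nil)
engine.connect(timePitch, to: engine.mainMixerNode, format: nil)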
Hello everyone,
I'm working on implementing a screen sharing feature using RPSystemBroadcastPickerView and a Broadcast Upload Extension to share the entire app screen in an iOS application.
The Broadcast Upload Extension is set up following Apple's ReplayKit guidelines. However, I’m encountering an issue during the broadcast startup sequence:
❗ Problem Description
The Screen Broadcast UI appears as expected
I tap “Start Broadcast”
The countdown (3 → 2 → 1) completes
Then it immediately reverts to the "Start Broadcast" screen, and screen sharing does not begin
No error messages are displayed
None of the extension lifecycle methods (broadcastStarted(withSetupInfo:), processSampleBuffer, etc.) are called
There are no logs or crash reports, neither in the main app nor in the extension
✅ What Has Been Verified
Info.plist of the Broadcast Upload Extension includes:
NSExtensionPointIdentifier = com.apple.broadcast-services-upload
NSExtensionPrincipalClass set correctly
RPBroadcastProcessMode = RPBroadcastProcessModeSampleBuffer
preferredExtension is set properly to the extension’s bundle identifier
Extension is listed in the main app's build settings under "Frameworks, Libraries, and Embedded Content"
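For completeness, the picker is wired up roughly like this (the extension bundle ID is a placeholder):

import ReplayKit
import UIKit

func addBroadcastPicker(to view: UIView) {
    let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
    picker.preferredExtension = "com.example.app.BroadcastUploadExtension"
    picker.showsMicrophoneButton = false
    view.addSubview(picker)
}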
⚠️ Additional Concern
We noticed that in Xcode (latest version), the Broadcast Upload Extension is listed under "Embedded Frameworks" with the setting "Embed Without Signing", and there is no option to change it to "Embed & Sign". We're wondering if this could be the reason the extension fails to launch correctly at runtime, despite being detected by the broadcast picker.
❓ Questions
Has anyone faced similar issues where the broadcast never starts despite correct setup?
Could the "Embed Without Signing" be causing the system to silently cancel or ignore the extension at runtime?
Are there any provisioning profile or entitlement requirements specific to Broadcast Upload Extensions that might trigger this behavior silently?
Any insights, suggestions, or workarounds would be greatly appreciated.
Thank you in advance!
It's been well over a year since Apple added favoriting of artists back to Apple Music (the little star icon on an artist page), yet I still haven't seen a way to get this data for an authenticated user from the Apple Music API. I was expecting to hear something about this during WWDC, but there have been no announcements that I've caught.
Has anyone else heard anything? People assume that when they provide access to their Apple Music account, we can actually get to the data in it, and we end up looking a little dumb when we can't retrieve this core data.
I am using Apple's original Lightning Digital AV Adapter (Lightning-to-HDMI dongle) to connect my iPhone to an external display via an HDMI cable.
I need to synchronize rendering with the external display's refresh rate, so I create a new CADisplayLink tied to the external display's UIScreen: UIScreen.screens[externalDisplayIdx].displayLink(withTarget:, selector:).
The callback is being called regularly, but with increasing delay relative to CADisplayLink.timestamp, so each time the callback is called I have less and less time to draw the next frame (see the snippet below).
Assuming 60 FPS, the value of secondsTillDeadline starts at an arbitrary value in the range of approx -0.0001 to 0.0166667, and then it slowly decreases towards zero (and for a brief period it goes into small negative numbers). Once it reaches zero, it flips back to 0.0166667 and continues to decrease again. This cycle repeats indefinitely.
Changing the external display's resolution (UIScreen's mode) or the CADisplayLink's preferredFrameRateRange to a lower FPS does not seem to have any effect on the temporal drifting (even the rate of change seems to be the same).
When I create a new CADisplayLink for the iPhone's main screen, the value of secondsTillDeadline is stable, it does not drift and it is very close to 0.0166667, as expected.
Is this drift caused by the external monitor or by Apple's Lightning-to-HDMI dongle ...or is the problem somewhere else?
Can the drifting be stopped?
func onDisplayLinkUpdate(displayLink: CADisplayLink) {
// Gradually decreases from 0.01667 to -0.0001, then flips back to 0.01667 and continues to decrease
let secondsTillDeadline = displayLink.targetTimestamp - CACurrentMediaTime()
}
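For completeness, the link is created roughly like this (the screen index is hard-coded for illustration, and onDisplayLinkUpdate is assumed to be an @objc method on self):

// Assuming the external display is the second screen.
let externalScreen = UIScreen.screens[1]
if let link = externalScreen.displayLink(withTarget: self,
                                         selector: #selector(onDisplayLinkUpdate(displayLink:))) {
    link.add(to: .main, forMode: .common)
}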
On an iPhone running iOS 26 beta 5, url(for: FilePath("subdir/asset.mov")) almost always throws this error:
The URL for “subdir/asset.mov” couldn’t be retrieved: “asset.mov” couldn’t be copied to “subdir” because an item with the same name already exists.
Yet, contents(at: FilePath("subdir/asset.mov")) always returns Data for a playable AVMovie.
How can I avoid this url(for:) error?
The asset pack in question is downloaded. The error persists even after deleting the pack, redownloading, and relaunching, in various combinations.
// Assets repo root
subdir.aar
subdir/asset.mov
subdir/asset_thumb.heic
subdir/Manifest.json
// Manifest.json
{
  "assetPackID": "subdir",
  "downloadPolicy": {
    "onDemand": {}
  },
  "fileSelectors": [
    {
      "directory": "subdir"
    }
  ],
  "platforms": [
    "iOS",
    "visionOS"
  ]
}
xcrun ba-package subdir/Manifest.json -o subdir.aar
xcrun ba-serve --host 192.168.0.10 -p 443 subdir.aar
I am working on updating a live blog in Apple News. As far as I know, there is an update endpoint for updating content in Apple News. Is there any feature in Apple News that can trigger an event when the original content is updated, so that I can pull the updated content?
When making a call to https://api.music.apple.com/v1/me/library/artists to get a user's library artists, it returns the following (as an example):
[
{
id: 'r.FCwruQb',
type: 'library-artists',
href: '/v1/me/library/artists/r.FCwruQb?l=en-US',
attributes: { name: 'A Great Big World' }
},
{
id: 'r.7VSWOgj',
type: 'library-artists',
href: '/v1/me/library/artists/r.7VSWOgj?l=en-US',
attributes: { name: 'Aaliyah' }
},
...
]
If I try to use an artist id from that returned data to look up additional information about the artist by calling https://api.music.apple.com/v1/catalog/us/artists/{id}, it fails.
User Library Artists don't seem to equal Catalog Artists.
It'd be great if there was a way to use these interchangeably. Am I missing something?
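A sketch of the two lookups, for clarity (the tokens are placeholders):

import Foundation

func fetch(_ path: String, developerToken: String, userToken: String) async throws -> Data {
    var request = URLRequest(url: URL(string: "https://api.music.apple.com" + path)!)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    request.setValue(userToken, forHTTPHeaderField: "Music-User-Token")
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}

// Works: returns library artists with IDs like "r.FCwruQb".
// let library = try await fetch("/v1/me/library/artists", ...)
// Fails: the catalog endpoint does not accept a library ID.
// let catalog = try await fetch("/v1/catalog/us/artists/r.FCwruQb", ...)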
I donate an INPlayMediaIntent to the system (the donation succeeds), but it doesn't show up in Control Center.
My code is as follows:
let mediaItems = mediaItems.map { $0.inMediaItem }
let intent = if #available(iOS 13.0, *) {
    INPlayMediaIntent(mediaItems: mediaItems,
                      mediaContainer: nil,
                      playShuffled: false,
                      playbackRepeatMode: .none,
                      resumePlayback: true,
                      playbackQueueLocation: .now,
                      playbackSpeed: nil,
                      mediaSearch: nil)
} else {
    INPlayMediaIntent(mediaItems: mediaItems,
                      mediaContainer: nil,
                      playShuffled: false,
                      playbackRepeatMode: .none,
                      resumePlayback: true)
}
intent.suggestedInvocationPhrase = "播放音乐" // "Play music"
let interaction = INInteraction(intent: intent, response: nil)
interaction.donate { error in
    if let error = error {
        print("Intent donation failed: \(error.localizedDescription)")
    } else {
        print("Intent donation succeeded ✅")
    }
}
How can media resources in my app be recommended to the system's media controls in Control Center, just like TikTok in the attached screenshot?
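My understanding (an assumption on my part) is that the Now Playing surface in Control Center is driven by MPNowPlayingInfoCenter and MPRemoteCommandCenter rather than by intent donation alone; this is the sketch I'm comparing against:

import MediaPlayer

func publishNowPlaying(title: String, duration: TimeInterval) {
    // Describe the current item so the system can surface it.
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: 0,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
    // Respond to the transport controls shown in Control Center.
    let commands = MPRemoteCommandCenter.shared()
    commands.playCommand.addTarget { _ in .success }
    commands.pauseCommand.addTarget { _ in .success }
}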