Has anyone been able to use MusicCatalogSearchRequest successfully in a Swift Playgrounds app?
I have configured my playground similarly to a regular app: an app ID with automatic music token generation turned on, and music access authorized within the app itself. But whenever I run a MusicCatalogSearchRequest, an error is thrown with .developerTokenRequestFailed.
Considering MusicKit is restricted in the Simulator, it would not surprise me if the same applies to playgrounds, but it would be super helpful if I could prototype with MusicKit in Swift Playgrounds 4!
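For context, a minimal sketch of the call that fails for me (the search term and limit are arbitrary):

import MusicKit

func runSearch() async {
    // Authorization succeeds in the playground; the failure happens on the search itself.
    guard await MusicAuthorization.request() == .authorized else { return }
    do {
        var request = MusicCatalogSearchRequest(term: "daft punk", types: [Song.self])
        request.limit = 5
        let response = try await request.response()
        print(response.songs)
    } catch {
        // In the playground this throws with .developerTokenRequestFailed.
        print("Search failed:", error)
    }
}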
The two ScreenCaptureKit WWDC22 sessions show how to capture with the new framework, but the Retina factor is hardcoded to 2 in SCStreamConfiguration.
When capturing a non-Retina display, the screen capture floats in the upper-left corner of the image buffer.
There does not seem to be a simple way to retrieve the Retina factor from the SCShareableContent data (when configuring the capture).
When processing the streaming output, the SCStreamFrameInfo attachment is supposed to have a scaleFactor property, but .scaleFactor does not return a value.
I have found that the attachment dictionary contains SCStreamUpdateFrameDisplayResolution. This entry gives me the Retina factor, but it is not an official SCStreamFrameInfo key; I have to enumerate the dictionary keys to access it.
What is the proper way to handle Retina factors with ScreenCaptureKit?
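In case it helps frame the question, the workaround I am experimenting with (an assumption on my part, not an official recipe) is to match the SCDisplay to its NSScreen and use backingScaleFactor to size the configuration in pixels:

import AppKit
import ScreenCaptureKit

func makeConfiguration(for display: SCDisplay) -> SCStreamConfiguration {
    let config = SCStreamConfiguration()

    // Find the NSScreen whose CGDirectDisplayID matches this SCDisplay.
    let scale = NSScreen.screens.first { screen in
        let key = NSDeviceDescriptionKey("NSScreenNumber")
        return (screen.deviceDescription[key] as? NSNumber)?.uint32Value == display.displayID
    }?.backingScaleFactor ?? 1.0

    // SCDisplay.width/height appear to be in points; multiply by the scale factor to get pixels.
    config.width = Int(CGFloat(display.width) * scale)
    config.height = Int(CGFloat(display.height) * scale)
    return config
}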
I'm building a UIKit app that reads the user's Apple Music library and displays it. MusicKit has the Artwork structure, which I need to use to display artwork images in the app. Since I'm not using SwiftUI, I cannot use the ArtworkImage view that is the recommended way of displaying those images, but the Artwork structure has a method that returns a URL for the image, which can be used to load the image.
The way I have it set up is really simple:
import MusicKit
import UIKit

extension MusicKit.Song {
    // Resolve the artwork URL for the requested size.
    func imageURL(for cgSize: CGSize) -> URL? {
        return artwork?.url(
            width: Int(cgSize.width),
            height: Int(cgSize.height)
        )
    }

    // Load the artwork synchronously; Data(contentsOf:) blocks the calling thread.
    func localImage(for cgSize: CGSize) -> UIImage? {
        guard let url = imageURL(for: cgSize),
              url.scheme == "musicKit",
              let data = try? Data(contentsOf: url) else {
            return nil
        }
        return .init(data: data)
    }
}
Now, every time I access the .artwork property (so a lot of times), the main thread gets blocked and the console output gets bombarded with messages like these:
2023-07-26 11:49:47.317195+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: <MPMediaLibraryArtwork: 0x289591590> with error; Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
2023-07-26 11:49:47.317262+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: file:///var/mobile/Media/iTunes_Control/iTunes/Artwork/Originals/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7
2023-07-26 11:49:47.323099+0200 Plum[998:297013] [Plum] IIOImageWriteSession:121: cannot create: '/var/mobile/Media/iTunes_Control/iTunes/Artwork/Caches/320x320/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7.sb-f9c7943d-6ciLNp'error = 1 (Operation not permitted)
My guess is that the most performance-heavy task here is performing the color analysis for each artwork, but IMO backgroundColor should not be a stored property if that's the case. I am not planning to use it anywhere, and if it has to exist it should be a computed async property so it doesn't block the caller.
I know I can move the call to a background thread, and that fixes the main-thread blocking, but the loading times for each artwork are still terribly slow, and that hurts the UX.
SwiftUI's ArtworkImage loads the artwork much quicker and without the errors, so there must be a better way to do it.
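For completeness, this is the off-main-thread variant I am using in the meantime (a minimal sketch building on the extension above, not a real fix for the slow loads):

extension MusicKit.Song {
    // Resolve the URL on the caller, then do the blocking read off the main thread.
    func localImageAsync(for cgSize: CGSize) async -> UIImage? {
        guard let url = imageURL(for: cgSize), url.scheme == "musicKit" else { return nil }
        let data = await Task.detached(priority: .userInitiated) {
            try? Data(contentsOf: url)
        }.value
        guard let data else { return nil }
        return UIImage(data: data)
    }
}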
Hi,
I just generated an HDR10 MVHEVC file; its mediainfo is below:
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Codec configuration box : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
Then I uploaded the segment files and prog_index.m3u8 to a web server.
I found that the HLS stream cannot be played in Safari.
The URL is http://ip/vod/prog_index.m3u8
I also checked: if I remove the Transfer characteristics : PQ tag when generating the MVHEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play in Safari.
Is there any way to play HLS PQ video in Safari? Thanks.
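One thing I am not sure about (an assumption, not a confirmed fix): I am pointing Safari directly at the media playlist. For HDR content, a multivariant (master) playlist that declares VIDEO-RANGE=PQ may be required; a sketch with placeholder bandwidth and resolution values:

#EXTM3U
#EXT-X-VERSION:7
#EXT-X-STREAM-INF:BANDWIDTH=20000000,RESOLUTION=3840x2160,VIDEO-RANGE=PQ
prog_index.m3u8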
Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now-playing item would be set to nil and you would receive a notification that the player had stopped.
In iOS 12 and later, nowPlayingItem still contains the current song, and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or from any remote control) generates the same conditions, making it difficult to correctly detect the difference.
It would be nice if they added a notification that playback finished (similar to the other players).
Any suggestions?
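The workaround I am currently using (a heuristic, not an official end-of-playback signal) compares the playhead to the item's duration when the paused notification arrives:

import MediaPlayer

let player = MPMusicPlayerController.systemMusicPlayer
player.beginGeneratingPlaybackNotifications()

NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    guard player.playbackState == .paused,
          let item = player.nowPlayingItem else { return }
    // Heuristic: if the playhead is at (or within a second of) the end of the
    // item, treat the pause as "playback finished" rather than a user pause.
    let finished = player.currentPlaybackTime >= item.playbackDuration - 1.0
    print(finished ? "Playback likely ended" : "User paused")
}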
I noticed that Instagram and iMessage support receiving shared lyrics from Apple Music. Specifically, when users long-press lyrics, a sheet pops up showing iMessage and Instagram. Clicking on either app generates a beautifully formatted lyrics image.
I've looked through MusicKit documentation but couldn't find any related APIs.
How can I implement this functionality in my app?
I am developing an iOS application that supports screen mirroring to Google TV (or Chromecast with Google TV). My goal is to mirror the iPhone/iPad screen in real time to a Google TV device.
What I Have Tried So Far
I have explored multiple approaches but haven't found a direct way to achieve low-latency screen mirroring. Here are some of my findings:
Google Cast SDK:
Google Cast SDK is primarily designed for casting media (videos, images, audio) rather than real-time mirroring. It supports custom receiver applications, but there are no direct APIs for full screen mirroring. Casting a recorded video is possible, but it introduces latency and is not real-time.
ReplayKit for Screen Capture:
RPScreenRecorder.shared().startCapture(handler: ...) allows capturing the iPhone screen as a video stream. However, sending this stream to Google TV in real time is a challenge. I could potentially encode the video as HLS and stream it, but the delay is significant.
RTSP/UDP Streaming:
Some third-party libraries support RTSP/UDP streaming for real-time screen sharing. Google TV does not natively support RTSP, making this approach difficult.
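To make the ReplayKit approach above concrete, the capture side looks roughly like this (a sketch of the capture only; encoding and transport to the TV are the open questions):

import ReplayKit

func startMirrorCapture() {
    RPScreenRecorder.shared().startCapture(handler: { sampleBuffer, bufferType, error in
        guard error == nil, bufferType == .video else { return }
        // Encode the CMSampleBuffer (e.g. with a VTCompressionSession) and send it
        // over whatever transport the receiver supports.
    }, completionHandler: { error in
        if let error { print("Capture failed: \(error)") }
    })
}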
My Questions:
Is it possible to achieve real-time screen mirroring on Google TV using the Google Cast SDK?
Does Google TV support WebRTC or any low-latency streaming protocol that can be used from iOS?
Are there any alternative approaches to mirror an iOS screen to Google TV with minimal latency?
I would appreciate any guidance, code examples, or references to relevant documentation.
We are planning to develop an application using the Apple Music API.
We would like to design our system based on the details of the rate limits mentioned below and have a few questions:
https://developer.apple.com/documentation/applemusicapi/generating-developer-tokens#Request-Rate-Limiting
Regarding the Catalog API (/v1/catalog/*), we understand that server-side caching is enabled, making it less likely to reach the rate limit. Is this understanding correct? (Excluding the search API)
For APIs like the Library API (/v1/me/library/*), where responses vary by user, we assume they are more likely to reach the rate limit. Is this correct?
We plan to implement optimizations to minimize unnecessary API calls. Given this, would the current Music API be able to handle a significant increase in users? (Assuming a DAU of around 100,000 to 1,000,000)
If the API cannot support this scale, would it be allowed under Apple’s policy to cache responses from the Catalog API (/v1/catalog/*) via our proxy server to avoid hitting the rate limit?
The third question is the one we most want to confirm.
In the past, when using Lightning, many external devices had to go through MFi certification. However, since the iPhone 15 switched from Lightning to USB-C, is MFi certification still required?
Our company has developed several UVC devices, and we have confirmed that iPads can read frames from external cameras through the external device type in AVFoundation. However, this is not supported on iPhones.
We are currently exploring feasible ways to enable UVC device support on iPhones. Is MFi certification the only option? If so, is the MFi certification process for USB-C the same as it was for Lightning? Does it still require purchasing an MFi chip and manufacturing specially designed USB-C cables?
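For reference, this is roughly how we discover the external cameras on iPad (assuming iPadOS 17+ and the .external device type); the same code returns no devices on iPhone:

import AVFoundation

let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified
)
for device in discovery.devices {
    print("External camera:", device.localizedName)
}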
When using the MusicKit SDK for Android 1.1.2, I found that MediaContainerType defines only three types:
NONE = 0;
ALBUM = 1;
PLAYLIST = 2;
The RADIO_STATION type is not defined.
However, the documentation of com.apple.android.music.playback.model states that the RADIO_STATION type is supported.
This problem causes an error after I pass in a station ID:
MediaSessionManager com.apple.android.music.sdk.testapp D onPlaybackError() Quincy java.io.IOException
May I ask how to solve this problem?
I take it that MusicKit JS is built with TypeScript, based on the attributions in the script: https://js-cdn.music.apple.com/musickit/v3/musickit.js
In the script it points to https://js-cdn.music.apple.com/musickit/v1/acknowledgements.txt – I assume this should be the v3 URL for the v3 version? It returns the same content nonetheless.
This contains attributions for TypeScript.
Currently there's a third-party effort with DefinitelyTyped, which publishes the NPM package @types/musickit-js. The latest supported version available is v1.
However, there is no version compatible with v3.
This makes it hard to use MusicKit JS v3 in a TypeScript project.
Please publish the types, ideally on the CDN along with the musickit.js file. Also consider publishing an officially Apple-supported DefinitelyTyped package, or help maintain the existing @types/musickit-js to make consuming this even easier.
I'm getting an interesting error attempting to compile my app in Xcode 26 beta.
error: Unable to find module dependency: '_MediaPlayer_AppIntents' (in target 'icatcher' from project 'icatcher')
note: A dependency of main module 'MainModuleCrossImportOverlays' (in target 'icatcher' from project 'icatcher')
Unable to find module dependency: '_MediaPlayer_AppIntents'
Not sure what to try or change to fix this issue.
Hi,
I'm sending an API request to:
https://api.music.apple.com/v1/me/library/playlists?limit=$limit&offset=$offset
to list all of the user's library playlists. However, the resulting objects do not contain the playlist artwork in the JSON. I've tried adding the extend and include attributes as well, but to no avail.
A partial example of the response:
{"id": PLAYLIST_ID, "type": "library-playlists", "href": "/v1/me/library/playlists/PLAYLIST_ID", "attributes": {"lastModifiedDate": "2024-09-18T20:18:24Z", "canEdit": true, "name": "Afro Party Anthems", "isPublic": false, "description": {"standard": "Definitive African party starters"}, "hasCatalog": false, "dateAdded": "2022-03-10T18:30:56Z", "playParams": {"id": PLAYLIST_ID, "kind": "playlist", "isLibrary": true}}, "relationships": {"catalog": {"href": "/v1/me/library/playlists/PLAYLIST_ID/catalog", "data": []}}}
Is there a way to get the artwork URL without sending a request for each playlist? And if not can this be fixed?
It's been an ask for a few years and I'm wondering if there are any plans, or whether the '26 SDKs/Tools allow Apple Music to work in the simulator? I develop for the Vision Pro so the usual 'fix' of running on the device is a bit of a hard ask.
At the very least, a small sample library that works in the simulator would be welcome (similar to how Photos works).
Cheers
It's been well over a year since Apple added favoriting of artists back to Apple Music (the little star icon on an artist page), yet I still haven't seen a way to get this data for an authenticated user from the Music API. I was expecting to hear something about this during WWDC, but there have been no announcements that I've caught.
Has anyone else heard anything? People assume when they provide access to their Apple Music account that we can actually get to the data in their Apple Music account, and we end up looking a little dumb not being able to get this core data.
I am using https://developer.apple.com/documentation/applemusicapi/add-tracks-to-a-library-playlist
to add tracks to playlists. This endpoint works fine for all playlists except for collaborative playlists.
For collaborative playlists I get the following 500 error as a response:
"errors": [
{
"id": "<some id>",
"title": "Upstream Service Error",
"detail": "Unable to update tracks",
"status": "500",
"code": "50001"
}
]
}
Steps to reproduce:
Create a playlist in your library.
Use the api to add a song.
Confirm that it works.
Make that same playlist collaborative.
Update the playlist ID in your api request (as making a playlist collaborative changes its id)
Confirm that you get the 500 error.
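For reference, the request I am sending looks roughly like this (IDs and tokens are placeholders):

curl -X POST "https://api.music.apple.com/v1/me/library/playlists/PLAYLIST_ID/tracks" \
  -H "Authorization: Bearer DEVELOPER_TOKEN" \
  -H "Music-User-Token: MUSIC_USER_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"data":[{"id":"SONG_ID","type":"songs"}]}'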
With older iOS versions, when the user taps the Mute/Volume button on AVPlayerViewController to unmute, the system restores the device volume to the level it was at when the user muted.
On iOS 26, when the user taps the unmute button on screen, the volume starts from 0 (it is not restored), but it is still restored if the user unmutes by pressing the physical volume buttons.
As I understand it, the volume bar/button on AVPlayerViewController is an MPVolumeView and I cannot control it, so this is system behavior.
But I have received complaints that this is a bug, and I did not find documentation describing this change in Mute button behavior.
I need some basis to explain this situation. Thank you.
Dear Apple Developer Team,
On iOS 26, the contents of PDF pages appear to be swapped.
Could you please advise if there is a workaround or a planned fix for this issue?
Steps to Reproduce:
Download the attached PDF on iOS 26.
Open the PDF in the Files app.
Tap the PDF to view it in Quick Look.
Navigate to page 5.
Expected Result:
The page number displayed at the bottom should be 5.
Actual Result:
The page number displayed at the bottom is 4.
Issue:
This is not limited to page 5—multiple page contents appear to be swapped.
I have also submitted feedback via Feedback Assistant (FB20743531) on October 20.
Best regards,
Yoshihito Suezawa
Hello,
We are trying to use the new Background Upload Extension to improve uploads of assets (Photos, Live Photos, Videos) in the background in our application.
1 - We implemented the code to upload still images.
We would like to do the same for Live Photos, but we are not sure how to proceed, since a Live Photo is composed of two resources that we would like to upload in the same job so that we keep the association between the two resources.
Steps:
A Live Photo is captured on the device.
Our application is notified for new content in the Photo Library and the asset is queued for upload by our application.
The system calls the background upload extension and the Live Photo is prepared for upload, but we can schedule a job for only one resource (still photo or paired video), so we cannot upload both resources in a single job.
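For clarity, this is what we mean by the two resources (a minimal sketch using PHAssetResource; the function name is ours):

import Photos

// A Live Photo is backed by two PHAssetResources: the still photo and the
// paired video. A single-resource upload job can carry only one of them.
func liveResources(for asset: PHAsset) -> (photo: PHAssetResource?, video: PHAssetResource?) {
    let resources = PHAssetResource.assetResources(for: asset)
    let photo = resources.first { $0.type == .photo }
    let video = resources.first { $0.type == .pairedVideo }
    return (photo, video)
}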
2 - Is there a way to synchronize the application and the extension so that the application does not process the same data or execute the same requests as the extension? For example, our application works with tokens, and we would like to prevent those tokens from being consumed by the application and the extension at the same time.
3 - Is there a command in Xcode or the terminal to start or stop the extension, similar to what exists for processing tasks
(https://developer.apple.com/documentation/backgroundtasks/starting-and-terminating-tasks-during-development)?