Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now playing item would be set to nil and you would receive a notification that the player had stopped.
In iOS 12 and later, nowPlayingItem still contains the current song and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or from any remote control) generates the same conditions, making it difficult to reliably tell the two apart.
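One heuristic that may help: when the pause notification arrives, compare the playback position to the item's duration; a pause landing at (or within a fraction of a second of) the end of the item is more likely to be end-of-playback than a user pause. A minimal sketch (the 0.5-second tolerance is my own guess, not anything documented):

import MediaPlayer

let player = MPMusicPlayerController.systemMusicPlayer
player.beginGeneratingPlaybackNotifications()

NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    guard player.playbackState == .paused,
          let item = player.nowPlayingItem else { return }
    // Heuristic: a pause within ~0.5 s of the end of the current item
    // is treated as "playback finished" rather than a user pause.
    let remaining = item.playbackDuration - player.currentPlaybackTime
    if remaining < 0.5 {
        print("Looks like playback finished, not a pause")
    }
}

It's still guesswork around timing, though.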
It would be nice if they added a notification that playback was done (similar to the other players).
Any suggestions?
So I have been using the iOS 18 beta, and first and foremost it's a really great update. I love the timed messages and the car sickness feature. My only problem with the update is Screen Time. I have Screen Time on my phone, set so that I can ignore the limit once I run out, but since updating to the iOS 18 beta I can't seem to claim more screen time: I can press the button all I want, but it won't give me any more. This is, I imagine, a minor fix, and I would really appreciate it being fixed soon. Other than that small detail, amazing update. Love what you're doing over at Apple!
It's simple to reproduce. The bug is that when you queue a batch of songs to play, the player always queues fewer than you gave it.
Here, I'm attempting to play an Apple curated playlist; it will only queue a subset, usually fewer than 15, and as few as 1 out of 100. Use the system's forward and backward controls to verify.
Here is the code. Just paste it into the ContentView file and make sure you have the capability to run it.
import SwiftUI
import MusicKit

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Play Music") {
                Task {
                    await playMusic()
                }
            }
        }
    }
}

func getOnlySongsFromTracks(tracks: MusicItemCollection<Track>?) async throws -> MusicItemCollection<Song>? {
    var songs: [Song]?
    if let t = tracks {
        songs = [Song]()
        for track in t {
            if case let .song(song) = track {
                songs?.append(song)
                print("track is song \(track.debugDescription)")
            } else {
                print("track not song \(track.debugDescription)")
            }
        }
    }
    if let songs = songs {
        let topSongs = MusicItemCollection(songs)
        return topSongs
    }
    return nil
}
func playMusic() async {
    // Request authorization
    let status = await MusicAuthorization.request()
    guard status == .authorized else {
        print("Music authorization denied.")
        return
    }
    do {
        // Perform a hardcoded search for a playlist
        let searchTerm = "2000"
        let request = MusicCatalogSearchRequest(term: searchTerm, types: [Playlist.self])
        let response = try await request.response()
        guard let playlist = response.playlists.first else {
            print("No playlists found for the search term '\(searchTerm)'.")
            return
        }
        // Fetch the songs in the playlist
        let detailedPlaylist = try await playlist.with([.tracks])
        guard let songCollection = try await getOnlySongsFromTracks(tracks: detailedPlaylist.tracks) else {
            print("no songs found")
            return
        }
        guard let t = detailedPlaylist.tracks else {
            print("no tracks")
            return
        }
        // Create a queue and play
        let musicPlayer = ApplicationMusicPlayer.shared
        let q = ApplicationMusicPlayer.Queue(for: t)
        musicPlayer.queue = q
        try await musicPlayer.play()
        print("Now playing playlist: \(playlist.name)")
    } catch {
        print("An error occurred: \(error.localizedDescription)")
    }
}
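One variation worth testing while reproducing this: the code above builds songCollection but then queues the full track collection t. Queueing only the extracted songs would rule out non-song tracks (music videos, for example) as the cause of the truncation; a sketch of that one-line change (untested as a fix):

// Hypothesis check: queue only the extracted songs instead of the raw tracks.
let q = ApplicationMusicPlayer.Queue(for: songCollection)
musicPlayer.queue = q
try await musicPlayer.play()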
I have Sync 3.4 on my Ford F-150, and CarPlay does not work with iOS 18.1. Sync says I don't have anything connected. Something is up.
Hello,
I'm trying to get video from an HDMI USB capture card and show it in a preview layer at 60 fps. The device I am using (ShadowCast 2) supports 1080p at 60 fps in "yuvs" and "420v".
This is my code to build the preview layer, with the uninteresting parts stripped away and error handling removed.
I am matching against AVFrameRateRange because the capture device does not report exactly 60.00 but rather <AVFrameRateRange: 0x600000875680 60.00 - 60.00 (1000000 / 60000240 - 1000000 / 60000240)> fps.
@Observable
final class AVFoundationService: AVService {
    // Live View
    private let session: AVCaptureSession = .init()

    var previewLayer: AVCaptureVideoPreviewLayer {
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspect
        return layer
    }

    var activeVideoDevice: AVCaptureDevice? {
        // TODO: implement correct logic
        // (videoDevices comes from the stripped-out part of the class)
        if let device = videoDevices.first(where: { $0.localizedName.contains("Shadow") }) {
            return device
        }
        return AVCaptureDevice.default(for: .video)
    }

    func setupStreamDemo(completion: @escaping (Error?) -> Void) {
        session.beginConfiguration()
        if let device = activeVideoDevice {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) {
                    session.addInput(input)
                } else {
                    print("explode")
                }
                for format in device.formats {
                    let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
                    if dimensions.width == 1920 && dimensions.height == 1080 && format.formatDescription.mediaSubType.description == "'yuvs'" {
                        // Note: the original checked minFrameRate twice; this checks both ends of the range.
                        let foundFPS = format.videoSupportedFrameRateRanges.first {
                            Int($0.minFrameRate) == 60 && Int($0.maxFrameRate) == 60
                        }
                        try device.lockForConfiguration()
                        device.activeFormat = format
                        device.activeVideoMinFrameDuration = foundFPS!.minFrameDuration
                        device.activeVideoMaxFrameDuration = foundFPS!.minFrameDuration
                        device.unlockForConfiguration()
                    }
                }
            } catch {
                return completion(error)
            }
        }
        session.commitConfiguration()
        session.startRunning()
        completion(nil)
    }
}
I am using the following code in SwiftUI to show the AVCaptureVideoPreviewLayer.
struct VideoPreviewView: NSViewRepresentable {
    private let previewLayer: AVCaptureVideoPreviewLayer

    func makeNSView(context: Context) -> NSView {
        let view = NSView()
        view.layer = self.previewLayer
        view.wantsLayer = true // layer-hosting view: assign the layer first, then set wantsLayer
        view.layer?.frame = view.bounds
        return view
    }

    func updateNSView(_ nsView: NSView, context: Context) {
        if let layer = nsView.layer as? AVCaptureVideoPreviewLayer {
            layer.session = self.previewLayer.session
        }
    }
}
When I run the app, it ignores whatever I set in device.activeVideoMinFrameDuration and/or device.activeVideoMaxFrameDuration. If I set 10 fps, it runs at 30; if I set 60, it still runs at 30.
If I start QuickTime in parallel to my app and begin a recording from the USB capture card, it switches to the 60 fps mode.
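To narrow it down, a minimal diagnostic sketch is to read the device state back after the session starts (assuming device is the configured capture device from above):

// Diagnostic: check whether startRunning() silently reset the configuration.
session.startRunning()
print("Active format: \(device.activeFormat)")
print("Min frame duration: \(device.activeVideoMinFrameDuration)")
print("Max frame duration: \(device.activeVideoMaxFrameDuration)")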
I am on macOS Sequoia 15.0 with Xcode 16.0.
What am I doing wrong?
I have an app that gets data from Music.app with both the iTunesLibrary and MusicKit.
iTunesLibrary has ITLibArtist.sortName and ITLibAlbum.sortTitle and ITLibAlbum.sortAlbumArtist.
I can’t seem to find an equivalent in MusicKit. How are those properties obtained using MusicKit? Thanks.
FYI I have filed FB15554956 on this. You also may see my code at https://github.com/bolsinga/itunes_json
Hi,
I'm working on an integration with the Apple Music Feed, but over the last day, I've been getting a 500 Upstream Service Error on 99% of the API calls I make.
Remarkably, retrying the same endpoint 20-30 times sometimes gives the correct response, but mostly it's a 500 error.
Just to give an example, this:
GET https://api.media.apple.com/v1/feed/album/latest
Returns this generic error:
{
  "errors": [
    {
      "id": "U4ARRA2QDCGYKYRI2IPEJVBTHY",
      "title": "Upstream Service Error",
      "detail": "Call to get metadata for album feed failed",
      "status": "500",
      "code": "50001"
    }
  ]
}
The same goes for other feeds like song, artist, and so on.
Before today, I did get the same error message sometimes, but a few retries would solve the issue.
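As a stopgap I've wrapped calls in a small retry helper with exponential backoff so the occasional success still gets through; a rough sketch (the attempt count and delays are arbitrary):

// Stopgap: retry a flaky request with exponential backoff.
func withRetries<T>(maxAttempts: Int = 5, _ operation: () async throws -> T) async throws -> T {
    var delay: UInt64 = 500_000_000 // 0.5 s, in nanoseconds
    for attempt in 1...maxAttempts {
        do {
            return try await operation()
        } catch {
            guard attempt < maxAttempts else { throw error }
            try await Task.sleep(nanoseconds: delay)
            delay *= 2 // back off: 0.5 s, 1 s, 2 s, ...
        }
    }
    fatalError("unreachable: the loop always returns or throws")
}

This obviously doesn't fix the underlying 500s, it just papers over them.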
Any insight on what's happening and/or an ETA on fixing it?
Thank you
I am building an app for macOS and am trying to implement code to add songs to a library playlist (included below). The issue: if I use MusicKit to load a user's library playlists, the playlist ID (just a string of numbers) does not work with the Add Tracks to a Library Playlist endpoint of the Apple Music API. If I retrieve the playlists from the Apple Music API and use that playlist ID (which differs from the ID I get from MusicKit), my code works fine and adds the song to the playlist. The problem is that the Apple Music API does not return all of the library playlists that I get from MusicKit, and it also does not return artwork for playlists that have the collage of album covers, so I would prefer to use MusicKit to get the playlists.
I have also tried retrieving a single playlist through the Apple Music API using the playlist ID from MusicKit, and it does not work; I get an error that the resource cannot be found. Since this is a macOS app, I cannot use MusicKit to add songs to library playlists.
Does anyone know a way to resolve this, or a possible workaround? Ideally I want to use MusicKit to get the library playlists and still have some way to use the playlist ID to add songs to that playlist. Below is my code for adding a song to a playlist using the Apple Music API, which works correctly only if I originally got the library playlist's ID from a playlist retrieved via the Apple Music API.
Also, does anyone know why the playlist IDs are not universal and differ between MusicKit and the Apple Music API? For songs and tracks it does not seem to matter which I use; the IDs are in the correct format for the Apple Music API to use and work with my code. Thanks, everyone, for any and all help!
func addToPlaylist(songs: [Track], playlist: Playlist, alert: Binding<AlertItem?>) async {
    let tracks = AppleMusicPlaylistPostRequestBody(data: songs.compactMap {
        AppleMusicPlaylistPostRequestItem(id: $0.id.rawValue, type: "songs") // or "library-songs"
    })
    let playlistID = playlist.id

    // Build the request URL for adding a song to a playlist
    guard let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks") else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Invalid URL for the playlist.")
        return
    }

    // Authorization header
    guard let musicUserToken = try? await MusicUserTokenProvider().getUserMusicToken() else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Unable to retrieve Music User Token.")
        return
    }

    do {
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.setValue("Bearer \(musicUserToken)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")

        let encoder = JSONEncoder()
        let data = try encoder.encode(tracks)
        request.httpBody = data

        let musicRequest = MusicDataRequest(urlRequest: request)
        let musicRequestResponse = try await musicRequest.response()

        // Check if the request was successful (status 201)
        if musicRequestResponse.urlResponse.statusCode == 201 {
            alert.wrappedValue = AlertItem(title: "Success", message: "Song successfully added to the playlist.")
        } else {
            print("Status Code: \(musicRequestResponse.urlResponse.statusCode)")
            print("Response Data: \(String(data: musicRequestResponse.data, encoding: .utf8) ?? "No Data")")
            // Attempt to decode the error response into the AppleMusicErrorResponse model
            if let appleMusicError = try? JSONDecoder().decode(AppleMusicErrorResponse.self, from: musicRequestResponse.data) {
                let errorMessage = appleMusicError.errors.first?.detail ?? "Unknown error occurred."
                alert.wrappedValue = AlertItem(title: "Error", message: errorMessage)
            } else {
                alert.wrappedValue = AlertItem(title: "Error", message: "Failed to add song to the playlist.")
            }
        }
    } catch {
        alert.wrappedValue = AlertItem(title: "Error", message: "Network error: \(error.localizedDescription)")
    }
}
https://api.media.apple.com/v1/feed/exports/song_2024-11-02T16-02/parts?limit=200&offset=400
This is the API used to get the parquet file URLs. I need all the URLs in one API call; right now, if I don't provide a limit, it defaults to 100, and the maximum is 200.
How can I get all the records in one call? Or the count of parquet records in one call?
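If a single call isn't possible, paging until a batch comes back short seems to be the fallback; a rough sketch under that assumption, where fetchParts stands in for whatever wrapper performs the actual request (it is not a real API):

// Page through the parts endpoint 200 at a time until the last page.
// `fetchParts` is a hypothetical wrapper around the HTTP call above.
func allPartURLs(fetchParts: (_ limit: Int, _ offset: Int) async throws -> [URL]) async throws -> [URL] {
    let limit = 200 // documented maximum per request
    var offset = 0
    var urls: [URL] = []
    while true {
        let batch = try await fetchParts(limit, offset)
        urls.append(contentsOf: batch)
        if batch.count < limit { break } // short batch means last page
        offset += limit
    }
    return urls
}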
I noticed that Instagram and iMessage support receiving shared lyrics from Apple Music. Specifically, when users long-press lyrics, a sheet pops up showing iMessage and Instagram. Clicking on either app generates a beautifully formatted lyrics image.
I've looked through MusicKit documentation but couldn't find any related APIs.
How can I implement this functionality in my app?
I am developing an iOS application that supports screen mirroring to Google TV (or Chromecast with Google TV). My goal is to mirror the iPhone/iPad screen in real time to a Google TV device.
What I Have Tried So Far
I have explored multiple approaches but haven't found a direct way to achieve low-latency screen mirroring. Here are some of my findings:
Google Cast SDK:
Google Cast SDK is primarily designed for casting media (videos, images, audio) rather than real-time mirroring. It supports custom receiver applications, but there are no direct APIs for full screen mirroring. Casting a recorded video is possible, but it introduces latency and is not real-time.
ReplayKit for Screen Capture:
RPScreenRecorder.shared().startCapture(handler: ...) allows capturing the iPhone screen as a video stream (see the sketch after this list). However, sending this stream to Google TV in real time is a challenge. I could potentially encode the video as HLS and stream it, but the delay is significant.
RTSP/UDP Streaming:
Some third-party libraries support RTSP/UDP streaming for real-time screen sharing. Google TV does not natively support RTSP, making this approach difficult.
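For reference, here is the ReplayKit capture sketch mentioned above; the capture itself is straightforward, and the open question is entirely what to do with each frame (encoding and transport to the TV):

import ReplayKit

// Sketch: capture the screen; the encode/transport step is the unsolved part.
RPScreenRecorder.shared().startCapture(handler: { sampleBuffer, bufferType, error in
    if let error {
        print("Capture error: \(error)")
        return
    }
    guard bufferType == .video else { return }
    // sampleBuffer is a CMSampleBuffer holding one video frame.
    // A real implementation would hand it to a video encoder here.
}, completionHandler: { error in
    if let error {
        print("Failed to start capture: \(error)")
    }
})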
My Questions:
Is it possible to achieve real-time screen mirroring on Google TV using the Google Cast SDK?
Does Google TV support WebRTC or any low-latency streaming protocol usable from iOS?
Are there any alternative approaches to mirror an iOS screen to Google TV with minimal latency?
I would appreciate any guidance, code examples, or references to relevant documentation.
In the past, when using Lightning, many external devices had to go through MFi certification. However, since the iPhone 15 switched from Lightning to USB-C, is MFi certification still required?
Our company has developed several UVC devices, and we have confirmed that iPads can read frames from external cameras through the external device type in AVFoundation. However, this is not supported on iPhones.
We are currently exploring feasible ways to enable UVC device support on iPhones. Is MFi certification the only option? If so, is the MFi certification process for USB-C the same as it was for Lightning? Does it still require purchasing an MFi chip and manufacturing specially designed USB-C cables?
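For reference, this is roughly how we enumerate the UVC camera on iPad today (a sketch; on iPhone, as noted above, external cameras are not supported, so discovery finds nothing):

import AVFoundation

// Sketch: enumerate external (UVC) cameras via the .external device type (iPadOS 17+).
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified
)
for camera in discovery.devices {
    print("Found external camera: \(camera.localizedName)")
}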
When I use the MusicKit SDK for Android 1.1.2, I find that MediaContainerType only defines three types:
NONE = 0;
ALBUM = 1;
PLAYLIST = 2;
The RADIO_STATION type is not defined.
However, the documentation for com.apple.android.music.playback.model states that the RADIO_STATION type is supported.
This causes an error after I pass in a station ID:
MediaSessionManager com.apple.android.music.sdk.testapp D onPlaybackError() Quincy java.io.IOException
How can I solve this problem?
I take it that MusicKit JS is built with TypeScript, based on the attributions in the script: https://js-cdn.music.apple.com/musickit/v3/musickit.js
The script points to https://js-cdn.music.apple.com/musickit/v1/acknowledgements.txt (I assume this should be the v3 URL for the v3 version; it returns the same content regardless), which contains attributions for TypeScript.
Currently there is a third-party effort at DefinitelyTyped, which publishes the NPM package @types/musickit-js. The latest supported version available is v1; there is no version compatible with v3. This makes it hard to use MusicKit JS v3 in a TypeScript project.
Please publish the types, ideally on the CDN alongside the musickit.js file. Also consider publishing an officially Apple-supported DefinitelyTyped package, or helping to maintain the existing @types/musickit-js, to make consuming this even easier.
I'm getting an interesting error when attempting to compile my app in Xcode 26 beta:
error: Unable to find module dependency: '_MediaPlayer_AppIntents' (in target 'icatcher' from project 'icatcher')
note: A dependency of main module 'MainModuleCrossImportOverlays' (in target 'icatcher' from project 'icatcher')
Unable to find module dependency: '_MediaPlayer_AppIntents'
Not sure what to try in order to fix this issue.
Hi,
I'm sending an API request to:
https://api.music.apple.com/v1/me/library/playlists?limit=$limit&offset=$offset
to list all of the user's library playlists. However, the resulting objects do not contain the playlist artwork in the JSON. I've tried adding the extend and include attributes as well, but to no avail.
A partial example of the response:
{"id": PLAYLIST_ID, "type": "library-playlists", "href": "/v1/me/library/playlists/PLAYLIST_ID", "attributes": {"lastModifiedDate": "2024-09-18T20:18:24Z", "canEdit": true, "name": "Afro Party Anthems", "isPublic": false, "description": {"standard": "Definitive African party starters"}, "hasCatalog": false, "dateAdded": "2022-03-10T18:30:56Z", "playParams": {"id": PLAYLIST_ID, "kind": "playlist", "isLibrary": true}}, "relationships": {"catalog": {"href": "/v1/me/library/playlists/PLAYLIST_ID/catalog", "data": []}}}
Is there a way to get the artwork URL without sending a request for each playlist? And if not, can this be fixed?
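A possible workaround, if the client can adopt MusicKit rather than the raw REST endpoint: MusicLibraryRequest returns Playlist values with an artwork property that can be queried directly; a sketch (I haven't verified it covers the collage-style covers):

import MusicKit

// Sketch: fetch library playlists via MusicKit and read artwork directly.
func libraryPlaylistArtworkURLs() async throws -> [String: URL] {
    let request = MusicLibraryRequest<Playlist>()
    let response = try await request.response()
    var urls: [String: URL] = [:]
    for playlist in response.items {
        if let url = playlist.artwork?.url(width: 300, height: 300) {
            urls[playlist.name] = url
        }
    }
    return urls
}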
It's been an ask for a few years, and I'm wondering if there are any plans, or whether the '26 SDKs/tools allow Apple Music to work in the simulator? I develop for the Vision Pro, so the usual 'fix' of running on the device is a bit of a hard ask.
At the very least, a small sample library that works in the simulator would be welcome (similar to how Photos works).
Cheers
It's been well over a year since Apple added favoriting of artists back to Apple Music (the little star icon on an artist page), yet I still haven't seen a way to get this data for an authenticated user from the Apple Music API. I was expecting to hear something about this at WWDC, but there have been no announcements that I've caught.
Has anyone else heard anything? People assume that when they grant access to their Apple Music account we can actually get to the data in it, and we end up looking a little dumb not being able to get this core data.
I've just begun to dip my toes into the iOS 16 waters.
One of the first things I've attempted is to edit a library playlist using:
try await MusicLibrary.shared.edit(targetPlaylist, items: tracksToAdd)
where targetPlaylist is of type MusicItemCollection<MusicKit.Playlist>.Element and tracksToAdd is of type [Track].
The targetPlaylist was created using the new iOS 16 API, here:
let newPlaylist = try await MusicLibrary.shared.createPlaylist(name: name, description: description)
tracksToAdd is derived by performing a MusicLibraryRequest for a specific playlist ID and then doing something like this:
if let tracksToAdd = try await playlist.with(.tracks).tracks {
    // add tracks to target playlist
}
My problem is that when I attempt the edit, I am faced with a rather sad-looking crash.
libdispatch.dylib`dispatch_group_leave.cold.1:
0x10b43d62c <+0>: mov x8, #0x0
0x10b43d630 <+4>: stp x20, x21, [sp, #-0x10]!
0x10b43d634 <+8>: adrp x20, 6
0x10b43d638 <+12>: add x20, x20, #0xfbf ; "BUG IN CLIENT OF LIBDISPATCH: Unbalanced call to dispatch_group_leave()"
0x10b43d63c <+16>: adrp x21, 40
0x10b43d640 <+20>: add x21, x21, #0x260 ; gCRAnnotations
0x10b43d644 <+24>: str x20, [x21, #0x8]
0x10b43d648 <+28>: str x8, [x21, #0x38]
0x10b43d64c <+32>: ldp x20, x21, [sp], #0x10
-> 0x10b43d650 <+36>: brk #0x1
I assume that I must be doing something wrong, but I frankly have no idea how to troubleshoot this.
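One workaround I've been considering (untested, and assuming add(_:to:) accepts Track values) is to add the tracks one at a time instead of using the bulk edit(_:items:):

// Untested sketch: add tracks individually rather than via a bulk edit.
func addTracksIndividually(_ tracks: [Track], to playlist: Playlist) async throws {
    for track in tracks {
        // add(_:to:) returns the updated playlist; ignored here.
        _ = try await MusicLibrary.shared.add(track, to: playlist)
    }
}

I don't know whether that sidesteps the dispatch_group_leave crash, though.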
Any help would be most appreciated. Thanks. @david-apple?