Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under the General subtopic

Post | Replies | Boosts | Views | Activity

iOS 16 MusicKit - Artwork has no background or text colors
Hello, I'm new to the Swift MusicKit API and am starting with the implementation in iOS 16. I'm stuck on an issue where there is no background or text color associated with the Artwork object. Is this something you have to make an additional property request for, and if so, how do you do that? (See the sketch below this entry.)

let catalogSearch = MusicCatalogResourceRequest<Album>(matching: \.id, equalTo: item.id)
let catalogResponse = try await catalogSearch.response()
guard let firstItem = catalogResponse.items.first else {
    return
}

In this example, firstItem.artwork only contains the URL and what look like incorrect maximum width/height values. Here's a printout of firstItem.artwork:

Optional(Artwork(
  urlFormat: "musicKit://artwork/library/5F37858D-F46B-4F12-BA67-40FA8DD63D87/{w}x{h}?at=item&fat=&id=7718670444435992305&lid=5F37858D-F46B-4F12-BA67-40FA8DD63D87&mt=music&aat=Music122/v4/37/25/f5/3725f515-249f-7b91-77bb-f479cd48201c/22UMGIM32254.rgb.jpg",
  maximumWidth: 0,
  maximumHeight: 0
))
1
2
893
Mar ’25
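A minimal sketch of how one might check for artwork colors after a catalog fetch, assuming the identifier refers to an Apple Music catalog album (library-only artwork may legitimately report no colors or sizes); artworkColors(for:) is a hypothetical helper, not part of MusicKit:

import CoreGraphics
import MusicKit

// Hypothetical helper: fetch an album from the catalog and read its artwork colors.
// Assumes `albumID` is a catalog identifier; library items may not carry color metadata.
func artworkColors(for albumID: MusicItemID) async throws -> (background: CGColor?, text: CGColor?) {
    let request = MusicCatalogResourceRequest<Album>(matching: \.id, equalTo: albumID)
    let response = try await request.response()
    guard let artwork = response.items.first?.artwork else { return (nil, nil) }
    // Artwork exposes optional color properties alongside the templated URL.
    return (artwork.backgroundColor, artwork.primaryTextColor)
}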
ScreenCaptureKit and mixed Retina/non-Retina configuration
The two ScreenCaptureKit WWDC22 sessions show how to capture with the new framework, but the Retina factor is hardcoded to 2 in SCStreamConfiguration. When using it on a non-Retina display, the screen capture floats in the upper-left corner of the image buffer. There does not seem to be a simple way to retrieve the Retina factor from the SCShareableContent data (when configuring the capture). When processing the streaming output, the SCStreamFrameInfo attachment is supposed to have a scaleFactor property, but .scaleFactor does not return a value. I have found that the attachment dictionary contains SCStreamUpdateFrameDisplayResolution; this entry gives me the Retina factor, but it is not an official SCStreamFrameInfo key, and I have to list the keys to access it. What is the proper way to handle Retina scale factors with ScreenCaptureKit? (See the sketch below this entry.)
1
1
801
Jun ’25
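One workaround sketch, not an official answer: match the SCDisplay to its NSScreen through the CGDirectDisplayID and use backingScaleFactor to size the stream configuration in pixels. Assumes a macOS app with AppKit available.

import AppKit
import ScreenCaptureKit

// Match the SCDisplay to its NSScreen via the display ID and use backingScaleFactor
// to convert the display's point size into a pixel-sized stream configuration.
func configuration(for display: SCDisplay) -> SCStreamConfiguration {
    let screenNumberKey = NSDeviceDescriptionKey("NSScreenNumber")
    let scale = NSScreen.screens
        .first { ($0.deviceDescription[screenNumberKey] as? CGDirectDisplayID) == display.displayID }?
        .backingScaleFactor ?? 1.0

    let config = SCStreamConfiguration()
    // SCDisplay reports its size in points; multiply by the display's scale factor.
    config.width = Int(CGFloat(display.width) * scale)
    config.height = Int(CGFloat(display.height) * scale)
    return config
}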
MusicKit / macOS : Song.Artwork not nil when there is no Artwork
I'm using iCloud Music Library on macOS 14.1 (23B74) and iOS 17.1, and I'm using MusicKit to find songs that do not have artwork. On iOS, Song.artwork will be nil for items I know do not have artwork. On macOS, Song.artwork is not nil; however, when the songs are shown in Music.app, they do not have artwork. Is this expected? Alternatively, is there a more correct way to determine that a Song has no artwork? (See the sketch below this entry.) I have also filed FB13315721. Thank you for any tips!
1
0
749
Nov ’24
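A heuristic sketch, not an official check: treat artwork as missing when the Artwork value cannot produce a concrete URL at a requested size. This assumes a non-nil Artwork that cannot resolve a URL is effectively empty, which may not hold in every case (the resolved URL could still point to a missing resource).

import MusicKit

// Heuristic: a song "appears" to have artwork only if its Artwork value can
// produce a concrete URL at a requested size.
func appearsToHaveArtwork(_ song: Song) -> Bool {
    guard let artwork = song.artwork else { return false }
    return artwork.url(width: 300, height: 300) != nil
}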
"Remote call timed out" error when trying to play large collection of music items with MusicKit's ApplicationMusicPlayer
I am using MusicKit's ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists that contain hundreds of songs. When I try to play a collection of more than roughly 300 songs, I always get an error saying "Prepare to play failed":

UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}))

It doesn't matter whether the songs are downloaded to the device or not. I am aware that there is another initializer for the player's queue that accepts Playlist instances, but in my app users can choose to sort playlist tracks in a different order than the default, which makes that initializer unusable for me. I tried everything I could think of, including falling back on MPMusicPlayerController and passing an array of MPMusicPlayerPlayParameters to it, but the result was the same. (See the sketch below this entry.)

typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared
let entries: [QueueEntry] = tracks
    .compactMap {
        guard let song = $0 as? Song else { return nil }
        return QueueEntry(song)
    }

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // prepareToPlay failed
    } catch {
        print(error)
    }
}
1
0
636
Mar ’25
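A workaround sketch, not a confirmed fix: start playback with a small batch and append the rest in chunks, so no single "prepare to play" call sees hundreds of entries. This assumes the queue accepts incremental inserts via insert(_:position:); the batch sizes are guesses, and playInChunks(_:using:) is a hypothetical helper.

import MusicKit

// Enqueue a small initial batch, start playback, then append the remainder
// to the tail of the queue in slices.
func playInChunks(_ songs: [Song], using player: ApplicationMusicPlayer = .shared) async throws {
    guard !songs.isEmpty else { return }

    let initialBatch = Array(songs.prefix(50))
    let remainder = Array(songs.dropFirst(50))

    player.queue = ApplicationMusicPlayer.Queue(for: initialBatch)
    try await player.play()

    // Append the remaining songs in chunks of 100.
    var index = 0
    while index < remainder.count {
        let chunk = Array(remainder[index..<min(index + 100, remainder.count)])
        try await player.queue.insert(chunk, position: .tail)
        index += 100
    }
}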
How to detect the end of playback with the system music player?
Since iOS 12 it has become difficult to detect the end of playback using the system music player. In earlier iOS versions, the now-playing item would be set to nil and you would receive a notification that the player stopped. In iOS 12 and later, nowPlayingItem still contains the current song, and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with playbackState set to MPMusicPlaybackStatePaused. Pressing pause in my car (or from any remote control) generates the same conditions, making it difficult to correctly detect the difference. It would be nice if they added a notification that playback finished (similar to the other players). Any suggestions? (See the sketch below this entry.)
1
1
753
Mar ’25
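A heuristic sketch, not an official signal: when the state changes to .paused, compare the playback position with the item's duration to guess whether playback actually finished rather than being paused by the user. The 1-second tolerance is an arbitrary assumption, and a user pausing in the last second of a song would still be misclassified.

import MediaPlayer

// Observes playback-state changes and fires a callback when a pause looks like
// the natural end of the current item.
final class PlaybackEndObserver {
    private let player = MPMusicPlayerController.systemMusicPlayer
    var onPlaybackLikelyEnded: (() -> Void)?

    init() {
        player.beginGeneratingPlaybackNotifications()
        NotificationCenter.default.addObserver(
            forName: .MPMusicPlayerControllerPlaybackStateDidChange,
            object: player,
            queue: .main
        ) { [weak self] _ in
            self?.checkForEnd()
        }
    }

    deinit {
        player.endGeneratingPlaybackNotifications()
    }

    private func checkForEnd() {
        guard player.playbackState == .paused,
              let duration = player.nowPlayingItem?.playbackDuration else { return }
        // Treat a pause within ~1 second of the end as "playback finished".
        if player.currentPlaybackTime >= duration - 1.0 {
            onPlaybackLikelyEnded?()
        }
    }
}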
MusicKit and sorted artist and album names?
I have an app that gets data from Music.app with both the iTunesLibrary framework and MusicKit. iTunesLibrary has ITLibArtist.sortName, ITLibAlbum.sortTitle, and ITLibAlbum.sortAlbumArtist. I can't seem to find an equivalent in MusicKit. How are those properties obtained using MusicKit? (See the sketch below this entry.) Thanks. FYI, I have filed FB15554956 on this. You can also see my code at https://github.com/bolsinga/itunes_json
1
1
503
Nov ’24
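For context, a sketch of the iTunesLibrary side the post refers to; I'm not aware of a MusicKit equivalent for these sort fields, so this is not a MusicKit answer, just the macOS-only API the question compares against.

import iTunesLibrary

// Print the sort names/titles that iTunesLibrary exposes but MusicKit (apparently) does not.
func printSortNames() throws {
    let library = try ITLibrary(apiVersion: "1.0")
    for item in library.allMediaItems.prefix(10) {
        let albumSortTitle = item.album.sortTitle ?? item.album.title ?? "-"
        let artistSortName = item.artist?.sortName ?? item.artist?.name ?? "-"
        print("\(artistSortName) / \(albumSortTitle)")
    }
}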
Phone turns black
My iPhone 15 Plus suddenly turns black and a loading icon keeps spinning. Then it stops and I can use it again; it only lasts a few seconds. I have updated to the iOS 18.1 beta; could this be the issue? Is my phone broken? I have tried restarting my phone.
1
0
466
Dec ’24
Playlist IDs from MusicKit not working with Apple Music API
I am building an app for macOS and I am trying to implement the code to add songs to a library playlist (included below). The issue is that if I use MusicKit to load a user's library playlists, the ID for the playlist (which is just a string of numbers) does not work with the Add Tracks to a Library Playlist endpoint of the Apple Music API. If I retrieve the playlists from the Apple Music API and use that playlist ID (which is different from the ID I get from MusicKit), my code works fine and adds the song to the playlist. The problem is that getting a user's library playlists from the Apple Music API does not give me all of the library playlists that I get when using MusicKit, and it also does not give me artwork for playlists that have the collage of album covers, so I would prefer to use MusicKit to get the playlists. I have also tested retrieving a single playlist using the Apple Music API with the playlist ID from MusicKit, and it does not work: I get an error that the resource cannot be found. Since this is a macOS app I cannot use MusicKit to add songs to library playlists. Does anyone know a way to resolve this, or a possible workaround? Ideally I want to use MusicKit to get the library playlists and have some way to use the playlist ID to add songs to that playlist. (See the sketch below this entry.) Below is my code for adding a song to a playlist using the Apple Music API, which works correctly only if I originally get the library playlist's ID value from a playlist retrieved from the Apple Music API. Also, does anyone know why the playlist IDs are not universal and are different between MusicKit and the Apple Music API? For songs and tracks it does not seem to matter whether I use MusicKit or the Apple Music API; the IDs are in the correct format for the Apple Music API to use and work with my code. Thanks everyone for any and all help!

func addToPlaylist(songs: [Track], playlist: Playlist, alert: Binding<AlertItem?>) async {
    let tracks = AppleMusicPlaylistPostRequestBody(data: songs.compactMap {
        AppleMusicPlaylistPostRequestItem(id: $0.id.rawValue, type: "songs") // or "library-songs"
    })
    let playlistID = playlist.id

    // Build the request URL for adding a song to a playlist
    guard let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks") else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Invalid URL for the playlist.")
        return
    }

    // Authorization header
    guard let musicUserToken = try? await MusicUserTokenProvider().getUserMusicToken() else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Unable to retrieve Music User Token.")
        return
    }

    do {
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.setValue("Bearer \(musicUserToken)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")

        let encoder = JSONEncoder()
        let data = try encoder.encode(tracks)
        request.httpBody = data

        let musicRequest = MusicDataRequest(urlRequest: request)
        let musicRequestResponse = try await musicRequest.response()

        // Check if the request was successful (status 201)
        if musicRequestResponse.urlResponse.statusCode == 201 {
            alert.wrappedValue = AlertItem(title: "Success", message: "Song successfully added to the playlist.")
        } else {
            print("Status Code: \(musicRequestResponse.urlResponse.statusCode)")
            print("Response Data: \(String(data: musicRequestResponse.data, encoding: .utf8) ?? "No Data")")

            // Attempt to decode the error response into the AppleMusicErrorResponse model
            if let appleMusicError = try? JSONDecoder().decode(AppleMusicErrorResponse.self, from: musicRequestResponse.data) {
                let errorMessage = appleMusicError.errors.first?.detail ?? "Unknown error occurred."
                alert.wrappedValue = AlertItem(title: "Error", message: errorMessage)
            } else {
                alert.wrappedValue = AlertItem(title: "Error", message: "Failed to add song to the playlist.")
            }
        }
    } catch {
        alert.wrappedValue = AlertItem(title: "Error", message: "Network error: \(error.localizedDescription)")
    }
}
1
1
770
Dec ’24
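A workaround sketch, not a confirmed solution: fetch the Apple Music API's own library playlist IDs through MusicDataRequest and match them to the MusicKit playlist by name. The response model below is hypothetical (only the fields used are modeled), and name matching is fragile if the user has duplicate playlist names.

import MusicKit

// Hypothetical response models for /v1/me/library/playlists.
struct LibraryPlaylistsResponse: Decodable {
    struct Item: Decodable {
        struct Attributes: Decodable { let name: String }
        let id: String          // API-side library ID, not the MusicKit ID
        let attributes: Attributes
    }
    let data: [Item]
}

// Look up the API-side ID for a MusicKit playlist by matching its name.
func apiPlaylistID(matching playlist: Playlist) async throws -> String? {
    let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists?limit=100")!
    let request = MusicDataRequest(urlRequest: URLRequest(url: url))
    let response = try await request.response()
    let decoded = try JSONDecoder().decode(LibraryPlaylistsResponse.self, from: response.data)
    return decoded.data.first { $0.attributes.name == playlist.name }?.id
}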
Apple music feed Parquet download urls
https://api.media.apple.com/v1/feed/exports/song_2024-11-02T16-02/parts?limit=200&offset=400

This is the API used to get Parquet file URLs. I need all the URLs in one API hit; right now, if I don't provide the limit, it defaults to 100, and the maximum is 200. How can I get all the records in one hit, or at least the count of Parquet records in one hit? (See the sketch below this entry.)
1
1
515
Nov ’24
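If a single request really is capped at 200, one fallback is to page through the endpoint, assuming it follows standard limit/offset semantics and that a short page (fewer than limit items) marks the end. The response model and the url field name here are assumptions, and developerToken is a placeholder; this is a paging sketch, not documented Apple Music Feed behavior.

import Foundation

// Hypothetical shape of one page of the parts response.
struct PartsPage: Decodable {
    struct Part: Decodable { let url: String }
    let data: [Part]
}

// Collect all part URLs by walking limit/offset pages until a short page is returned.
func fetchAllPartURLs(feedID: String, developerToken: String) async throws -> [String] {
    var urls: [String] = []
    var offset = 0
    let limit = 200
    while true {
        var request = URLRequest(url: URL(string:
            "https://api.media.apple.com/v1/feed/exports/\(feedID)/parts?limit=\(limit)&offset=\(offset)")!)
        request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
        let (data, _) = try await URLSession.shared.data(for: request)
        let page = try JSONDecoder().decode(PartsPage.self, from: data)
        urls += page.data.map(\.url)
        if page.data.count < limit { break }   // last page reached
        offset += limit
    }
    return urls
}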
iOS 18.2 crash on CGImageDestinationFinalize
My app reports a lot of crashes from 18.2 users. I have been able to narrow down the issue to this line of code:

CGImageDestinationFinalize(imageDestination)

The error is: Thread 93: EXC_BAD_ACCESS (code=1, address=0x146318000)

But I have no idea why this suddenly started to crash. Here is the code of the function (see the sketch below this entry):

private func estimateSizeUsingThumbnailMethod(fromImageURL url: URL, imageSettings: ImageSettings) -> (Int, Int) {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions),
          let imageProperties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let imageWidth = imageProperties[kCGImagePropertyPixelWidth] as? CGFloat,
          let imageHeight = imageProperties[kCGImagePropertyPixelHeight] as? CGFloat else {
        return (0, 0)
    }

    let maxImageSize = max(imageWidth, imageHeight)
    let thumbMaxSize = min(2400, maxImageSize) // Use original size if possible, but not if larger than 2400; in that case we'll extrapolate from the thumbnail

    let downsampleOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: thumbMaxSize as CFNumber,
    ] as CFDictionary

    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, downsampleOptions) else {
        DLog("CGImage thumb creation error")
        return (0, 0)
    }

    let data = NSMutableData()
    guard let imageDestination = CGImageDestinationCreateWithData(data, UTType.jpeg.identifier as CFString, 1, nil) else {
        DLog("CGImage destination creation error")
        return (0, 0)
    }

    let destinationProperties = [
        kCGImageDestinationLossyCompressionQuality: imageSettings.quality.compressionRatio() // Set JPEG compression ratio
    ] as CFDictionary

    CGImageDestinationAddImage(imageDestination, cgImage, destinationProperties)
    CGImageDestinationFinalize(imageDestination) // <----- CRASHES HERE with EXC_BAD_ACCESS
    ...
}

So far, I'm stuck. Any idea that could help would be greatly appreciated, as I'm afraid this crash will propagate to the official release of 18.2.
1
0
822
Nov ’24
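A defensive sketch, not a root-cause fix: CGImageDestinationFinalize returns a Bool, so check it instead of discarding the result, and keep the ImageIO temporaries inside an autoreleasepool. If the crash is memory pressure on very large images, this may only reduce, not eliminate, the failures; encodeJPEG is a hypothetical standalone helper, not the poster's function.

import CoreGraphics
import Foundation
import ImageIO
import UniformTypeIdentifiers

// Encode a CGImage to JPEG data, surfacing finalize failures as nil instead of ignoring them.
func encodeJPEG(_ image: CGImage, quality: Double) -> Data? {
    autoreleasepool {
        let data = NSMutableData()
        guard let destination = CGImageDestinationCreateWithData(
            data, UTType.jpeg.identifier as CFString, 1, nil) else { return nil }
        let options = [kCGImageDestinationLossyCompressionQuality: quality] as CFDictionary
        CGImageDestinationAddImage(destination, image, options)
        guard CGImageDestinationFinalize(destination) else { return nil }
        return data as Data
    }
}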
Does updating MPNowPlayingInfoPropertyElapsedPlaybackTime frequently harm performance?
Hi, I'm wondering about one of the properties in MPNowPlayingInfoCenter: MPNowPlayingInfoPropertyElapsedPlaybackTime. The docs say that updating this property frequently is not required, because the system can automatically calculate elapsed playback time based on the infrequent values we provide. Is performance harmed by updating this property every second? Should I add some filtering/throttling to update it infrequently? Or am I overthinking this and it doesn't matter either way? (See the sketch below this entry.) Kind regards.
1
0
581
Dec ’24
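A sketch of the approach the documentation seems to suggest (an assumption, not a measured answer): set the elapsed time and playback rate only when the playback state changes or the user seeks, and let the system interpolate in between instead of writing the dictionary every second.

import MediaPlayer

// Update now-playing info only on state changes/seeks; the system extrapolates
// elapsed time from the last value and the playback rate.
func updateNowPlaying(elapsed: TimeInterval, rate: Float, duration: TimeInterval) {
    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = elapsed
    info[MPNowPlayingInfoPropertyPlaybackRate] = rate        // 0.0 when paused, 1.0 when playing
    info[MPMediaItemPropertyPlaybackDuration] = duration
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}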
Need help with run screen record like root user
Hi, on macOS Sonoma and earlier versions, I used my script along with a LaunchDaemon to continuously record my screen. However, after upgrading to macOS Sequoia, the LaunchDaemon no longer works for screen recording; it only works when I use a LaunchAgent. For my workflow, using a LaunchDaemon and running the ffmpeg process as root is much more efficient than running it as a regular user. Does anyone know how to run a script in the background using a LaunchDaemon on macOS Sequoia? Here are my LaunchDaemon plist and script.

LaunchDaemon plist:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>local.ScreenRecord</string>
    <key>Disabled</key>
    <false/>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
    <key>Nice</key>
    <integer>-20</integer>
    <key>ProgramArguments</key>
    <array>
        <string>/bin/bash</string>
        <string>/usr/local/distrib/record.bash</string>
    </array>
</dict>
</plist>

Script:

## !!! START CONFIGURATION !!!
#
FFMPEG_LOC="/usr/local/distrib/ffmpeg"
FFMPEG_INPUT="-hide_banner -f avfoundation -capture_cursor 1 -pixel_format uyvy422"
FFMPEG_VSET="-codec libx264 -r 22 -crf 26 -preset veryfast -b:v 5M -maxrate 7M -bufsize 14M"
TIME_REC="-t 600"
FOLDER_REC="/usr/local/distrib/REC/"
#
## !!! END CONFIGURATION !!!
#
# Find monitor id numbers
MON_ID=$($FFMPEG_LOC -hide_banner -f avfoundation -list_devices true -i "" 2>&1 | awk -F'[]|[]' '/Capture\ screen/ {print $4}')
#
if [ -n "$MON_ID" ]
then
    # Timestamp for the file name
    DATE_REC=$(date +"%m-%d-%Y_%H-%M-%S")
    # Number of output video file
    OUTPUT_NUM=0
    # Full command to record
    FFMPEG_FULL_COMMAND=""
    for MON_NUM in $MON_ID
    do
        FFMPEG_FULL_COMMAND=""$FFMPEG_FULL_COMMAND" "$FFMPEG_INPUT" -i "$MON_NUM" "$FFMPEG_VSET" "$TIME_REC" -map "$OUTPUT_NUM" "$FOLDER_REC""$DATE_REC"_Mon"$MON_NUM".mkv"
        let OUTPUT_NUM=$OUTPUT_NUM+1
    done
    $FFMPEG_LOC $FFMPEG_FULL_COMMAND
else
    echo "No monitors"
    sleep 5
fi
1
0
486
Jan ’25
Question regarding CarPlay Integration for a Note/Voice Recording App
Hello everyone, I am currently working on an app project aimed at users who want to quickly and easily capture their ideas and notes while on the go. The basic concept is to develop an iOS app where users can store both typed notes and voice recordings, essentially a "brain dump" solution. The core functionality (storing, editing, synchronizing via CloudKit, etc.) will be handled within the iOS app. In addition, I plan to integrate a CarPlay extension that allows the driver to start and stop a recording, ideally through a minimalist interface featuring a large record button and a "Done" button. Since the iPhone is often not within immediate reach in the car, the CarPlay integration should serve as a quick trigger to initiate the recording in the iOS app. My questions are as follows:

Has anyone had experience implementing a CarPlay extension for an app that primarily handles notes and voice recordings, rather than falling into the traditional categories like navigation, audio, or communication?

Has such a concept ever been approved by Apple, or are there known hurdles and guidelines that must be observed?

Are there alternative approaches to implementing CarPlay integration in this context in a compliant and effective manner?

I would greatly appreciate any feedback, shared experiences, and tips on best practices. Thank you in advance and best regards!
1
0
444
Feb ’25
How can I add support for Apple Music lyrics sharing in my app?
I noticed that Instagram and iMessage support receiving shared lyrics from Apple Music. Specifically, when users long-press lyrics, a sheet pops up showing iMessage and Instagram. Clicking on either app generates a beautifully formatted lyrics image. I've looked through MusicKit documentation but couldn't find any related APIs. How can I implement this functionality in my app?
1
1
413
Feb ’25
SFSpeechRecognizer throws User denied access to speech recognition
I have created an app where you can speak using SFSpeechRecognizer, and it will recognize your speech as text, translate it, and then speak it back using speech synthesis. All locales for SFSpeechRecognizer, and switching between them, work fine while the app is in the foreground. But after I turn off my screen (the app is still running; I just turned off the screen) and try to create a new recognition task, the recognition task receives this error: "User denied access to speech recognition." The weird thing is that it only happens with some languages: the error occurs with the Croatian or Hungarian locale for speech recognition, but not with the English or Spanish locale. (See the sketch below this entry.)
1
0
361
Mar ’25
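A mitigation sketch, not a confirmed fix: verify authorization explicitly and, when the locale supports it, force on-device recognition. The guess here (purely an assumption) is that server-based recognition for some locales is being denied once the screen locks, while on-device recognition keeps working; locales that don't support on-device recognition would not benefit from this.

import Speech

// Build a recognizer and request for a locale, preferring on-device recognition
// where available; returns nil if authorization or availability checks fail.
func makeRecognitionRequest(localeIdentifier: String) -> (SFSpeechRecognizer, SFSpeechAudioBufferRecognitionRequest)? {
    guard SFSpeechRecognizer.authorizationStatus() == .authorized,
          let recognizer = SFSpeechRecognizer(locale: Locale(identifier: localeIdentifier)),
          recognizer.isAvailable else { return nil }

    let request = SFSpeechAudioBufferRecognitionRequest()
    if recognizer.supportsOnDeviceRecognition {
        request.requiresOnDeviceRecognition = true
    }
    return (recognizer, request)
}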
HDR video metadata
On an iOS 18 phone, I use AVCaptureSession to capture HDR with the x420 format. The output CMSampleBuffer is in the HLG color space, and the propagated attachments contain kCVImageBufferAmbientViewingEnvironmentKey and kCVImageBufferSceneIlluminationKey. Now I use CAMetalLayer to render the CVPixelBuffer to the screen, but the result is brighter than with AVSampleBufferDisplayLayer. Here is my code:

- (void)_updateColorSpaceIfNeed:(CVPixelBufferRef)pixelBuffer {
    CAMetalLayer *layer = (CAMetalLayer *)_mtkView.layer;
    if (![layer isKindOfClass:CAMetalLayer.class]) return;
    layer.wantsExtendedDynamicRangeContent = YES;

    CFDataRef ambientViewingEnvironment = (CFDataRef)CVBufferCopyAttachment(pixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, NULL);
    NSData *data = (__bridge NSData *)ambientViewingEnvironment;
    if (ambientViewingEnvironment) CFRelease(ambientViewingEnvironment);

    CAEDRMetadata *metadata = [CAEDRMetadata HLGMetadataWithAmbientViewingEnvironment:data];
    // CAEDRMetadata *metadata = [CAEDRMetadata HLGMetadata];
    layer.EDRMetadata = metadata;
    layer.pixelFormat = MTLPixelFormatRGBA16Float;

    CGColorSpaceRef colorspace = CGColorSpaceCreateWithName(kCGColorSpaceITUR_2100_HLG);
    layer.colorspace = colorspace;
    if (colorspace) CGColorSpaceRelease(colorspace);
}

Why does the CAEDRMetadata class have "HLGMetadataWithAmbientViewingEnvironment:" and "HLGMetadata" methods, but not an "HLGMetadataWithAmbientViewingEnvironment:sceneIllumination:" method? I want to know how kCVImageBufferAmbientViewingEnvironmentKey and kCVImageBufferSceneIlluminationKey affect tone mapping. Is there any documentation I can refer to?
1
0
406
Mar ’25
Urdu Language Keyboard Bug
Hello Apple, I've been using iOS for many years and never had any issues with the Urdu language keyboard, but since the new 18.4 beta update some words are not typed correctly. For example, my friend's name is "راعنیہ", but the updated version cannot type it together and keeps separating it, like "راعنی ہ". It's so frustrating to use like that, and it's not just one word; there are many other words it cannot type properly. Also, the new font and the no-gap spacing hurt my eyes while reading or even typing. I hope Apple fixes this ASAP. Thank you.
1
0
453
Feb ’25