Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under General subtopic


ShazamKit for Android and 16 KB native library alignment
Hello, I'm working on a Flutter app targeting both Android and iOS, in which I implemented ShazamKit. I first tried the flutter_shazam_kit package, but since it's no longer maintained, I forked it and tried to update it to meet the Google Play Store requirements, as you can see here: https://github.com/mregnauld/flutter_shazam_kit/tree/fix-16k. Unfortunately, after trying everything, my app still doesn't meet the (not so) new 16 KB native library alignment requirement. I'm 100% sure this is the cause, because the error message disappears if I remove the package from my app. After investigating, the problem seems to come from ShazamKit for Android (which you can find here: https://developer.apple.com/download/all/?q=Android%20ShazamKit), and specifically from the .so files inside the .aar file. Is there anything I can do to fix this, or should I wait until the ShazamKit team fixes it? I'm totally stuck, so any help is highly appreciated. Thanks.
Replies: 3 · Boosts: 0 · Views: 188 · Activity: 3h
Apple Music API: Adding to a collaborative playlist gives a 500 error
I am using https://developer.apple.com/documentation/applemusicapi/add-tracks-to-a-library-playlist to add tracks to playlists. This endpoint works fine for all playlists except collaborative ones, for which I get the following 500 error as a response:

```
{
  "errors": [
    {
      "id": "<some id>",
      "title": "Upstream Service Error",
      "detail": "Unable to update tracks",
      "status": "500",
      "code": "50001"
    }
  ]
}
```

Steps to reproduce:

1. Create a playlist in your library.
2. Use the API to add a song. Confirm that it works.
3. Make that same playlist collaborative.
4. Update the playlist ID in your API request (making a playlist collaborative changes its ID).
5. Confirm that you get the 500 error.
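For reference, a minimal Swift sketch of the failing request (a hypothetical helper; the tokens and IDs are placeholders, and the endpoint, headers, and body shape follow the documentation linked above):

```swift
import Foundation

// Hypothetical repro helper: POSTs one catalog song to a library playlist.
// `playlistId` must be the *new* ID the playlist gets after being made collaborative.
func addTrack(playlistId: String, songId: String,
              developerToken: String, musicUserToken: String) async throws {
    let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistId)/tracks")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    request.setValue(musicUserToken, forHTTPHeaderField: "Music-User-Token")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: [
        "data": [["id": songId, "type": "songs"]]
    ])

    let (_, response) = try await URLSession.shared.data(for: request)
    if let http = response as? HTTPURLResponse {
        // Expected: 204 No Content; collaborative playlists reportedly return 500.
        print("Status: \(http.statusCode)")
    }
}
```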
Replies: 5 · Boosts: 0 · Views: 731 · Activity: 1d
iTunes Search API no longer returning explicit results?
My app has been using the iTunes Search API (itunes.apple.com/search) for a few years now, but at some point over the last week or so (late Sept. 2025) it stopped returning track results with explicit content, regardless of whether I pass "explicit=Yes" (which is the default anyway, according to the API documentation: https://performance-partners.apple.com/search-api). Has anyone else experienced this, and have you found a workaround? FYI, I also use the more robust Apple Music API in another part of my app, which isn't affected by this issue, so I know it's technically an alternative; I just need to stick with the iTunes Search API in this particular case. Thanks.
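For context, a minimal sketch of the kind of request involved (the search term is a placeholder; the parameters follow the Search API documentation linked above):

```swift
import Foundation

// Hypothetical repro: search for a track known to have an explicit version.
func searchExplicitTracks() async throws {
    var components = URLComponents(string: "https://itunes.apple.com/search")!
    components.queryItems = [
        URLQueryItem(name: "term", value: "some explicit track"),
        URLQueryItem(name: "media", value: "music"),
        URLQueryItem(name: "entity", value: "song"),
        URLQueryItem(name: "country", value: "US"),
        // Documented as the default, but passing it explicitly
        // no longer appears to surface explicit tracks either.
        URLQueryItem(name: "explicit", value: "Yes")
    ]
    let (data, _) = try await URLSession.shared.data(from: components.url!)
    // Check trackExplicitness in the results ("explicit" vs "notExplicit"/"cleaned").
    print(String(data: data, encoding: .utf8) ?? "")
}
```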
Replies: 1 · Boosts: 1 · Views: 130 · Activity: 2d
Take correctly sized screenshots with ScreenCaptureKit
I've been using CGWindowListCreateImage, which automatically creates an image with the size of the captured window. But SCScreenshotManager.captureImage(contentFilter:configuration:) always creates images with the width and height specified in the provided SCStreamConfiguration. I could set the size explicitly by reading SCWindow.frame or SCContentFilter.contentRect and multiplying the width and height by SCContentFilter.pointPixelScale, but that doesn't work if I want to keep the window shadow with SCStreamConfiguration.ignoreShadowsSingleWindow = false. Is there a way, and what's the best way, to take full-resolution screenshots of the correct size?

```swift
import Cocoa
import ScreenCaptureKit

class ViewController: NSViewController {
    @IBOutlet weak var imageView: NSImageView!

    override func viewDidAppear() {
        imageView.imageScaling = .scaleProportionallyUpOrDown
        view.wantsLayer = true
        view.layer!.backgroundColor = .init(red: 1, green: 0, blue: 0, alpha: 1)

        Task {
            let windows = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true).windows
            let window = windows[0]
            let filter = SCContentFilter(desktopIndependentWindow: window)

            let configuration = SCStreamConfiguration()
            configuration.ignoreShadowsSingleWindow = false
            configuration.showsCursor = false
            // Size the output from the content rect (points) scaled to pixels;
            // this does not account for the shadow margin.
            configuration.width = Int(Float(filter.contentRect.width) * filter.pointPixelScale)
            configuration.height = Int(Float(filter.contentRect.height) * filter.pointPixelScale)
            print(filter.contentRect)

            let windowImage = try await SCScreenshotManager.captureImage(contentFilter: filter,
                                                                         configuration: configuration)
            imageView.image = NSImage(cgImage: windowImage,
                                      size: CGSize(width: windowImage.width, height: windowImage.height))
        }
    }
}
```
Replies: 5 · Boosts: 0 · Views: 870 · Activity: 2d
Metal CIKernel instances with arbitrarily structured data arguments
Hi, the iOS 13 and macOS Catalina release notes say: "Metal CIKernel instances now support arguments with arbitrarily structured data." I've been trying to use this functionality in a CIKernel with mixed results. I'm particularly interested in passing data in the form of a dynamically sized array. It seems to work up to a certain size; beyond that threshold the excess data is discarded and the kernel becomes unstable. I assume there is some kind of memory-alignment issue going on, but I've tried various element types in my array and always get a similar result. I have not found any documentation or sample code covering this. It would be great to know how this is intended to work and what the limitations are. There are two similar unanswered questions about data arguments in the forums, so I'm sure a few others are running into the same issue. Thanks! Michael
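For concreteness, here is a sketch of the pattern in question, assuming a Metal color kernel that takes a `constant float*` parameter (the kernel and helper names are my own):

```swift
import CoreImage

// Pass a dynamically sized [Float] to a Metal CIColorKernel as a Data argument.
// Metal side (sketch): extern "C" float4 weighted(coreimage::sample_t s,
//                                                 constant float *weights,
//                                                 coreimage::destination d);
func applyWeights(to image: CIImage, weights: [Float], kernel: CIColorKernel) -> CIImage? {
    // Pack the array into Data; beyond some byte threshold the kernel
    // reportedly reads truncated/garbage values, as described above.
    let data = weights.withUnsafeBufferPointer { Data(buffer: $0) }
    return kernel.apply(extent: image.extent, arguments: [image, data])
}
```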
Replies: 5 · Boosts: 0 · Views: 386 · Activity: 3d
Ventura Hack for FireWire Core Audio Support on Supported MacBook Pro and others...
Hi all, Apple dropping ongoing development for FireWire devices that were supported by the Core Audio driver standard is a catastrophe for a lot of struggling musicians, who need to keep up with the security updates that come with new OS releases while continuing to use their hard-earned investments in very expensive, still-pristine audio devices that Apple's seemingly tone-deaf disregard for the cries for ongoing support has reduced to e-waste. I have one of said audio devices, and I'd like to keep using it while keeping my 2019 Intel MacBook Pro up to date with the latest security updates and OS features. This is probably not the first time you gurus have had someone make the logical leap to a request like this, but I was wondering whether it might somehow be possible to shoehorn the code from previous versions of macOS, which let the Mac talk to the audio features of such devices, into the Ventura version of the OS. Would it be possible? Would it involve a lot of work? I don't think I'd be the only person willing to pay for a third-party application or utility that restored this functionality. There must be hundreds of thousands of people who would happily spare some cash to stop their multi-thousand-dollar investment in gear from being so thoughtlessly consigned to the scrap heap. Any comments or layman-friendly explanations as to why this couldn't happen would be gratefully received! Thanks, em
Replies: 61 · Boosts: 9 · Views: 32k · Activity: 3d
MPMusicPlayerController.applicationMusicPlayer.currentPlaybackRate no longer working in iOS 26.0 (Tahoe)?
I'm wondering if anyone has run into issues with currentPlaybackRate in the released version of iOS 26.0. There is no problem on iOS 18.5. When I change currentPlaybackRate on iOS 26.0, the system seems to forcibly reset it to 0 or 1.0, for both DRM and non-DRM content. Playback also becomes abnormal and unstable, with noise, whenever currentPlaybackRate is not 1.0, and it frequently flips between the stopped and playing states.
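A minimal sketch of the behavior being described (the queue setup here is arbitrary; per the report above, any content reproduces it):

```swift
import MediaPlayer

// Hypothetical repro for the currentPlaybackRate regression.
func reproducePlaybackRateIssue() {
    let player = MPMusicPlayerController.applicationMusicPlayer
    player.setQueue(with: MPMediaQuery.songs())
    player.play()

    player.currentPlaybackRate = 1.5
    // iOS 18.5: reads back 1.5. iOS 26.0: reportedly snaps to 0 or 1.0,
    // and playback becomes unstable whenever the rate is not 1.0.
    print(player.currentPlaybackRate)
}
```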
Replies: 0 · Boosts: 1 · Views: 121 · Activity: 1w
Clarification on SFSpeechRecognizer system alert message and service URLs for whitelisting
Hello Apple Engineers, I am developing a feature based on SpeechRecognizer (import Speech), and I have two questions:

1. After adding the NSSpeechRecognitionUsageDescription key to my Info.plist, the system shows an authorization alert when I initialize an SFSpeechRecognizer instance. The alert contains a predefined message from Apple: "Speech data from this app will be sent to Apple to process your requests. This will also help Apple improve its speech recognition technology." Is it possible to remove or customize this message?

2. My app runs in a network environment with a whitelist. I need to know which URL the SFSpeechRecognizer instance connects to, and on which port, so that I can add them to the whitelist.

Thank you very much for your support! Best regards, Yu Cheng
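For reference, a minimal sketch of the initialization that triggers the alert, assuming NSSpeechRecognitionUsageDescription is already in Info.plist (the helper name is my own):

```swift
import Speech

func setUpRecognizer() {
    // Presents the system authorization alert on first run; its body text is
    // fixed by the system, with the Info.plist usage string shown alongside it.
    SFSpeechRecognizer.requestAuthorization { status in
        print("Speech recognition authorization: \(status.rawValue)")
    }
    let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    print("On-device recognition supported: \(recognizer?.supportsOnDeviceRecognition ?? false)")
}
```

One note on the second question: when on-device recognition is supported, setting requiresOnDeviceRecognition = true on the recognition request keeps audio off the network entirely, which may sidestep the whitelisting problem altogether.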
Replies: 0 · Boosts: 0 · Views: 27 · Activity: 1w
Failed to change the TTS language to CN or TW
I have a question about TTS. My device's default language is zh-HK (Cantonese); the device is an iPhone 16 Pro on iOS 18.6. I created a function, speakMandarin, because I want the device to speak zh-CN (Putonghua). However, the device only speaks zh-HK (Cantonese), even though I set the AVSpeechSynthesisVoice language to zh-CN.

```swift
func speakMandarin(text: String) {
    print("speakMandarin, \(text)")
    lastError = nil  // Reset error

    // Stop any ongoing speech before starting new
    if synthesizer.isSpeaking {
        synthesizer.stopSpeaking(at: .immediate)
    }

    // Configure speech utterance
    let utterance = AVSpeechUtterance(ssmlRepresentation: text)!
    utterance.rate = 0.5  // Natural speaking speed
    utterance.pitchMultiplier = 1.0
    utterance.volume = 1.0
    utterance.voice = AVSpeechSynthesisVoice(language: "zh-CN")

    let preferredLanguages = ["zh-CN", "zh-TW"]
    var selectedVoice: AVSpeechSynthesisVoice?
    for lang in preferredLanguages {
        if let voice = AVSpeechSynthesisVoice(language: lang) {
            print(lang)
            selectedVoice = voice
            utterance.voice = voice
            break
        }
    }

    // If no Mandarin voice found, use system default
    if selectedVoice == nil {
        selectedVoice = AVSpeechSynthesisVoice(language: nil)
        lastError = "未偵測到普通話語音包,將使用系統預設語音"
        print(lastError)
    }
    utterance.voice = selectedVoice

    print(utterance)
    synthesizer.speak(utterance)
}
```

Here is my log:

```
speakMandarin, <speak>你好!我們來聊聊喜歡的動物吧。你喜歡什麼動物呢?</speak>
zh-CN
[AVSpeechUtterance 0x1194efb80] String: 你好!我們來聊聊喜歡的動物吧。你喜歡什麼動物呢?
Voice: [AVSpeechSynthesisVoice 0x104ceff90] Language: zh-CN, Name: Tingting, Quality: Default [com.apple.voice.compact.zh-CN.Tingting]
Rate: 0.50
Volume: 1.00
Pitch Multiplier: 1.00
Delays: Pre: 0.00(s) Post: 0.00(s)
```
Replies: 0 · Boosts: 0 · Views: 193 · Activity: 1w
ALAssetsLibrary compile error on iOS 26
macOS version: 26.0 (25A354). Xcode version: 26.0 (17A324). The project fails to compile with the following error:

```
SwiftExplicitDependencyCompileModuleFromInterface arm64 /Users/zhz/Library/Developer/Xcode/DerivedData/ModuleCache.noindex/AssetsLibrary-HTIJ05N58KN3.swiftmodule

/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:10:25: error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
 8 | public import _StringProcessing
 9 | public import _SwiftConcurrencyShims
10 | extension AssetsLibrary.ALAssetsLibrary {
   |                         `- error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
11 | #if compiler(>=5.3) && $NonescapableTypes
12 | @available(iOS, introduced: 9.0, deprecated: 9.0, obsoleted: 26.0)

/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/System/Library/Frameworks/AssetsLibrary.framework/Headers/ALAssetsLibrary.h:80:12: note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
78 |
79 | OS_EXPORT AL_DEPRECATED(4, "Use PHPhotoLibrary from the Photos framework instead")
80 | @interface ALAssetsLibrary : NSObject {
   |            `- note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
81 | @package
82 | id _internal;

/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:1:1: error: failed to build module 'AssetsLibrary'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.17.14 clang-1700.3.17.1)', while this compiler is 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.19.9 clang-1700.3.19.1)'). Please select a toolchain which matches the SDK.
```
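As a side note, a minimal sketch of the migration path the error message points to, using PHPhotoLibrary from the Photos framework (the helper name is my own):

```swift
import Photos

// Replacement for ALAssetsLibrary-based enumeration: fetch image assets
// newest-first after requesting read access.
func fetchLatestAssets() {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        guard status == .authorized || status == .limited else { return }
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        let assets = PHAsset.fetchAssets(with: .image, options: options)
        print("Fetched \(assets.count) image assets")
    }
}
```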
Replies: 2 · Boosts: 3 · Views: 634 · Activity: 2w
Donated INPlayMediaIntent to the system, but it does not show in Control Center
I donate an INPlayMediaIntent to the system (the donation succeeds), but it does not show up in Control Center. My code is as follows:

```swift
let mediaItems = mediaItems.map { $0.inMediaItem }

let intent = if #available(iOS 13.0, *) {
    INPlayMediaIntent(mediaItems: mediaItems,
                      mediaContainer: nil,
                      playShuffled: false,
                      playbackRepeatMode: .none,
                      resumePlayback: true,
                      playbackQueueLocation: .now,
                      playbackSpeed: nil,
                      mediaSearch: nil)
} else {
    INPlayMediaIntent(mediaItems: mediaItems,
                      mediaContainer: nil,
                      playShuffled: false,
                      playbackRepeatMode: .none,
                      resumePlayback: true)
}
intent.suggestedInvocationPhrase = "播放音乐"  // "Play music"

let interaction = INInteraction(intent: intent, response: nil)
interaction.donate { error in
    if let error = error {
        print("Intent donation failed: \(error.localizedDescription)")
    } else {
        print("Intent donation succeeded ✅")
    }
}
```
Replies: 3 · Boosts: 0 · Views: 104 · Activity: 3w
MPNowPlayingInfoCenter playbackState fails to update after losing audio focus on macOS
My Environment:
- Device: Mac (Apple Silicon, arm64)
- OS: macOS 15.6.1

Description: I'm developing a music app and have encountered an issue where I cannot update the playbackState in MPNowPlayingInfoCenter after my app loses audio focus to another app. Even though my app correctly sets [MPNowPlayingInfoCenter defaultCenter].playbackState = .paused, the system's Now Playing UI (Control Center, Lock Screen, AirPods controls) does not reflect the change. The UI remains stuck until the app that currently holds audio focus also changes its playback state. I've observed the same behavior in other third-party music apps from the App Store, which suggests it might be a system-level issue.

Steps to Reproduce (using the two most popular music apps in the Chinese App Store, NetEase Cloud Music and QQ Music; call them App A and App B):
1. Start playback in App A.
2. Start playback in App B. (App B now has audio focus, and App A is still playing.)
3. Attempt to pause App A via the system's Control Center or its own UI.

Observed Behavior: App A's audio stream stops, but in the system's Now Playing controls App A still appears to be playing; the progress bar continues to advance and the pause button is unresponsive. If you then pause App B, the Now Playing UI for App A immediately corrects itself and displays the proper "paused" state.

My Questions:
1. Is there a specific procedure required to update MPNowPlayingInfoCenter when an app is not the current "Now Playing" application?
2. Is this a known issue or expected behavior on macOS?
3. Are there any official workarounds or solutions to ensure the UI updates correctly?
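For reference, a minimal sketch of the pause path being described (the method name is my own, and the audio teardown is elided):

```swift
import MediaPlayer

func pausePlayback() {
    // ... stop the app's own audio engine / AVPlayer here ...

    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyPlaybackRate] = 0.0
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info

    // macOS-specific: playbackState must be set explicitly...
    MPNowPlayingInfoCenter.default().playbackState = .paused
    // ...yet the system UI only reflects it after the app that currently
    // holds audio focus also changes its state, per the report above.
}
```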
Replies: 0 · Boosts: 0 · Views: 110 · Activity: 3w
401 Unauthorized when attempting to access Apple Music Feed API
Hello, I am trying to access the Apple Music Feed API, but I receive a 401 Unauthorized error whenever I call it. I have tried using my own code to generate a JWT and call the API directly (the same code can call the standard Apple Music API successfully):

```
> GET /v1/feed/song/latest HTTP/2
> Host: api.media.apple.com
> user-agent: insomnia/2023.5.8
> authorization: Bearer [REDACTED]
> accept: */*

< HTTP/2 401
< content-type: application/json; charset=utf-8
< content-length: 0
< x-apple-jingle-correlation-key: AV5IOHBNM2UUJVOFQ4HZ2TGF6Q
< x-daiquiri-instance: daiquiri:10001:daiquiri-all-shared-ext-7bb7c9b9bb-r459v:7987:25RELEASE91:daiquiri-amp-kubernetes-shared-ext-ak8s-prod-pv4-amp-daiquiri-ingress-prod
```

I also tried the Apple-provided Python example code, which gives me authentication errors too:

```
$ python3 ./apple_music_feed_example.py --key-id NMBH[...] --team-id 3TNZ[...] --secret-key-file-path "/Users/foxt/Documents/am-feed/NMBH[...].p8" --out-dir .
running....
INFO:__main__:Sending requests to https://api.media.apple.com
INFO:__main__:Getting the latest export for feed artist
Exception: Authentication Failed. Did you provide the correct team id, key id, and p8 file?
```

Does this API need to be enabled on my account separately from the main Apple Music API? The documentation reads to me as if anyone with an Apple Developer Program membership can use it, and I did not see any information about other requirements.
Replies: 0 · Boosts: 0 · Views: 338 · Activity: 4w
Can individual Apple Developer accounts stream full tracks with MusicKit?
I have implemented fetching Apple Music preview songs using a Swift framework integrated into a Unity app. My requirement is to fetch full tracks from a user's Apple Music library and play them inside Unity. To do this, I understand that I need to handle authentication, generate a Developer Token, and then obtain a Music User Token to access the user's Apple Music content. Currently, I have an Individual Apple Developer account (not an Organization one). Based on my research, it seems that with an Individual account I can implement this functionality and even upload builds to TestFlight for internal testing, but when releasing the app publicly on the App Store, full-track playback may be restricted for Individual accounts and allowed only for Organization accounts.

👉 Can you confirm whether this understanding is correct?
👉 Specifically, is it possible for an Individual account to fetch and play full-length tracks from a subscribed Apple Music user's library (at least for internal/TestFlight testing)?
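For what it's worth, a sketch of the playback path in question using MusicKit, which manages the developer token automatically once the App ID has the MusicKit service enabled (the helper name is my own, and the account-type question itself is a policy matter this code cannot answer):

```swift
import MusicKit

// Play the first song from the user's Apple Music library at full length.
// Full-length playback requires an active Apple Music subscription on the
// user's account.
func playFirstLibrarySong() async throws {
    guard await MusicAuthorization.request() == .authorized else { return }

    var request = MusicLibraryRequest<Song>()
    request.limit = 1
    let response = try await request.response()
    guard let song = response.items.first else { return }

    let player = ApplicationMusicPlayer.shared
    player.queue = [song]
    try await player.play()
}
```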
Replies: 0 · Boosts: 0 · Views: 149 · Activity: Sep ’25
Use MusicKit's User Library Artists with Catalog Artists?
When making a call to https://api.music.apple.com/v1/me/library/artists to get a user's library artists, it returns the following (as an example):

```
[
  {
    id: 'r.FCwruQb',
    type: 'library-artists',
    href: '/v1/me/library/artists/r.FCwruQb?l=en-US',
    attributes: { name: 'A Great Big World' }
  },
  {
    id: 'r.7VSWOgj',
    type: 'library-artists',
    href: '/v1/me/library/artists/r.7VSWOgj?l=en-US',
    attributes: { name: 'Aaliyah' }
  },
  ...
]
```

If I try to use an artist ID from that returned data to look up additional information about the artist by calling https://api.music.apple.com/v1/catalog/us/artists/{id}, it fails. User library artists don't seem to equal catalog artists. It'd be great if there were a way to use these interchangeably. Am I missing something?
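One avenue worth checking (stated as an assumption, not a confirmed fix): library resources in the Apple Music API expose a catalog relationship, so the catalog counterpart of a library artist may be reachable through that relationship rather than by reusing the r.* library ID against the catalog endpoint:

```swift
import Foundation

// Hypothetical helper: URL for a library artist's catalog counterpart.
// Equivalent query-parameter form: /v1/me/library/artists?include=catalog
func catalogArtistURL(forLibraryArtistId id: String) -> URL {
    URL(string: "https://api.music.apple.com/v1/me/library/artists/\(id)/catalog")!
}
```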
Replies: 0 · Boosts: 0 · Views: 239 · Activity: Aug ’25
Why does CADisplayLink of an external UIScreen drift in time?
I am using Apple's original Lightning Digital AV Adapter (Lightning-to-HDMI dongle) to connect my iPhone to an external display via an HDMI cable. I need to synchronize rendering with the external display's refresh rate, so I create a new CADisplayLink tied to the external display's UIScreen: UIScreen.screens[externalDisplayIdx].displayLink(withTarget:selector:). The callback is called regularly, but with increasing delay relative to CADisplayLink.timestamp, so each time the callback fires I have less and less time to draw the next frame (see the snippet below). Assuming 60 FPS, the value of secondsTillDeadline starts at an arbitrary value in the range of approximately -0.0001 to 0.0166667 and then slowly decreases towards zero (briefly dipping into small negative numbers). Once it reaches zero, it flips back to 0.0166667 and continues to decrease again; this cycle repeats indefinitely. Changing the external display's resolution (the UIScreen's mode) or the CADisplayLink's preferredFrameRateRange to a lower FPS does not seem to affect the drift (even its rate of change appears the same). When I create a new CADisplayLink for the iPhone's main screen, the value of secondsTillDeadline is stable: it does not drift and stays very close to 0.0166667, as expected. Is this drift caused by the external monitor or by Apple's Lightning-to-HDMI dongle, or is the problem somewhere else? Can the drift be stopped?

```swift
func onDisplayLinkUpdate(displayLink: CADisplayLink) {
    // Gradually decreases from 0.01667 to -0.0001,
    // then flips back to 0.01667 and continues to decrease
    let secondsTillDeadline = displayLink.targetTimestamp - CACurrentMediaTime()
}
```
Replies: 3 · Boosts: 0 · Views: 203 · Activity: Aug ’25