With older iOS versions, when the user taps the Mute/Volume button on AVPlayerViewController to unmute, the system restores the device volume to the level it had before the user muted.
On iOS 26, when the user taps the on-screen unmute button, the volume starts from 0 instead of being restored. (It is still restored if the user unmutes with the physical volume buttons.)
As I understand it, the volume bar/button on AVPlayerViewController is an MPVolumeView, which I cannot control, so this is system behavior.
But I have received complaints that this is a bug, and I could not find any documentation describing this change in the Mute button's behavior.
I need some basis for explaining this situation. Thank you.
                    
                  
General
Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.
  
    
                    
                      I use
https://api.music.apple.com/v1/me/library/playlists/${playlistId}/tracks
to add tracks to a playlist I created.
How do I DELETE tracks from the playlist?
The documentation does not mention a method for this. I have tried calling DELETE methods in various combinations but nothing seems to work.
Is this possible?
                    
                  
                
                    
When I'm on FaceTime, my phone randomly ends my call. I have an iPhone 17 on iOS 26.1. Sometimes I won't even be touching the screen and it hangs up. I'm not sure if this is a universal issue or just a me problem. It's getting really annoying.
                    
                  
                
              
                
              
              
                
                
              
              
              
  
  
    
    
  
  
              
                
                
              
            
          
                    
Hello, I'm trying to write a shortcut using Toolbox Pro that gets triggered by an accessibility trigger and then favorites the currently playing song. It works pretty well, but I noticed that for some artists, especially Asian ones, it simply doesn't work. While debugging, I saw that the tool uses the correct song ID, artist ID, and everything else when searching for the song to favorite it. However, I noticed that Apple Music treats artists with romanized names as two separate artists!
https://music.apple.com/br/artist/王菲/41760704
https://music.apple.com/br/artist/faye-wong/41760704?l=en-GB
You can see that the ID is the same (41760704). It seems that when I search for the artist, the first artist (王菲) is returned, so when I open the artist's URLs on the web I can see a star next to the song name, meaning it was liked. However, the same song under the romanized artist (faye-wong) doesn't show the like.
This is very weird, right?
                    
                  
                
                    
                      Hi there,
I recently launched a DJ app on the Mac App Store, and was wondering how I could access the raw audio data of songs from Apple Music, the way Serato, rekordbox, djay, etc. do.
Thanks,
Gunek
                    
                  
                
              
                
              
              
                
              
              
Tags: Apple Music API, MusicKit, Performance Partners Program, Apple Music Feed
  
              
                
                
              
            
          
                    
Fetching the featured artists of a playlist no longer works in the iOS 26.1 beta:
 let detailedPlaylist = try await playlist.with([.tracks, .featuredArtists], preferredSource: .library)
It throws an error when using .library, and using .catalog returns an empty array.
This works correctly on iOS 26.0 and iOS 18.
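For reference, a minimal sketch of both calls (playlist is a MusicKit Playlist I have already fetched):
import MusicKit

// Library source: throws on the iOS 26.1 beta.
let fromLibrary = try await playlist.with([.tracks, .featuredArtists], preferredSource: .library)

// Catalog source: succeeds, but featuredArtists comes back empty on the 26.1 beta.
let fromCatalog = try await playlist.with([.tracks, .featuredArtists], preferredSource: .catalog)
print(fromCatalog.featuredArtists?.count ?? 0) // 0 on 26.1 beta; non-zero on 26.0 and 18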
                    
                  
                
                    
If I fetch a library playlist, like the auto-generated "Favorites" playlist, via MusicKit like this:
guard let initialTracks = try await playlist.with([.tracks]).tracks else {
            return nil
}
I get a list of tracks like this:
...
TrackID:  i.e5gmPS6rZ856
TrackID:  i.4ZQMxU0OxNg0
TrackID:  i.J198KH4P85K4
TrackID:  i.J1AaRC4P85K4
TrackID:  i.4BPqWt0OxNg0
TrackID:  4473570282773028026
TrackID:  4473570282773028025
TrackID:  4015088256684964387
TrackID:  4473570282773028024
TrackID:  7541557725362154249
TrackID:  4473570282773028027
I save the IDs for later use, but when I fetch them later, only the ones with IDs that start with "i." work:
static func getLibrarySong(from id: String) async -> Song? {
        var request = MusicLibraryRequest<Song>()
        request.filter(matching: \.id, equalTo: MusicItemID(id))
        
        do {
            let response = try await request.response()
            
            return response.items.first
        } catch {
            // ... (error handling elided)
            return nil
        }
    }
Or via the Apple Music API endpoint:
static func getLibrarySongFromAPI(with id: String) async -> Song? {
        guard let url = AppleMusicURL.getURL(for: .getSongById, id: id) else {
            return nil
        }
        do {
            let dataRequest = MusicDataRequest(urlRequest: URLRequest(url: url))
            let dataResponse = try await dataRequest.response()
            
            let response = try JSONDecoder().decode(SongsResponse.self, from: dataResponse.data)
            
            return response.data.first
        } catch {
            // ... (error handling elided)
            return nil
        }
    }
Neither function works for the numeric IDs like 4473570282773028024, so it seems the ID is wrong, but how do I make it work?
Otherwise I can fetch all the songs fine, in the catalog or in the library; it's just that these few songs can't be fetched individually, only through the try await playlist.with([.tracks]) fetch that retrieves the whole playlist. But obviously that isn't always possible.
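In the meantime, a workaround sketch I'm considering (assuming playlist is the same library playlist as above): fetch the whole playlist once and index its tracks by raw ID, since the numeric IDs can't be fetched individually.
// Workaround sketch: index tracks by raw ID after a single full fetch.
if let tracks = try await playlist.with([.tracks]).tracks {
    let tracksByID = Dictionary(tracks.map { ($0.id.rawValue, $0) },
                                uniquingKeysWith: { first, _ in first })
    let track = tracksByID["4473570282773028024"] // example numeric ID from above
    print(track?.title ?? "not found")
}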
Thanks in advance!
                    
                  
                
                    
Hi, I'm working on a video editing app that lets you composite and export videos. I use a custom compositor to apply my effects, etc.
In my crash dashboard, I'm seeing reports of an EXC_BAD_ACCESS crash in objc_msgSend. Below is the stack trace.
libobjc.A.dylib  objc_msgSend
libdispatch.dylib  _dispatch_sync_invoke_and_complete_recurse
libdispatch.dylib  _dispatch_sync_f_slow
[symbolication failed]
libdispatch.dylib  _dispatch_client_callout
libdispatch.dylib  _dispatch_lane_barrier_sync_invoke_and_complete
AVFCore  -[AVCustomVideoCompositorSession(AVCustomVideoCompositorSession_FigCallbackHandling) _customCompositorShouldCancelPendingFrames]
AVFCore  _customCompositorShouldCancelPendingFramesCallback
MediaToolbox  remoteVideoCompositor_HandleVideoCompositorClientMessage
CoreMedia  __figXPCConnection_CallClientMessageHandlers_block_invoke
libdispatch.dylib  _dispatch_call_block_and_release
libdispatch.dylib  _dispatch_client_callout
libdispatch.dylib  _dispatch_lane_serial_drain
libdispatch.dylib  _dispatch_lane_invoke
libdispatch.dylib  _dispatch_root_queue_drain_deferred_wlh
libdispatch.dylib  _dispatch_workloop_worker_thread
libsystem_pthread.dylib  _pthread_wqthread
libsystem_pthread.dylib  start_wqthread
What stood out to me is that this is only reported from iOS 26.0+ devices. Part of the stack trace failed to symbolicate ([symbolication failed]), but I'm 90% confident that this is Apple code, not my app's code.
I cannot reproduce this locally. Is this a known issue? What are the possible root-causes, and how can I verify/eliminate them?
Thanks,
                    
                  
                
                    
My app has been using the iTunes Search API (itunes.apple.com/search) for a few years now, but at some point over the last week or so (late Sept. 2025) it stopped returning track results with explicit content, regardless of whether I provide "explicit=Yes" (which is the default anyway, according to the API documentation: https://performance-partners.apple.com/search-api). Has anyone else experienced this, and have you figured out a workaround?
FYI, I also use the more robust Apple Music API in another part of my app, which isn't affected by this issue, so I know it's technically an alternative. I just need to stick with the iTunes Search API in this particular case. Thanks.
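For reference, here's a minimal sketch of the kind of request I'm making (the search term and the parameters other than explicit are just illustrative):
import Foundation

// Build the search request; "explicit=Yes" is the documented default,
// but I pass it explicitly anyway.
var components = URLComponents(string: "https://itunes.apple.com/search")!
components.queryItems = [
    URLQueryItem(name: "term", value: "example artist"), // illustrative term
    URLQueryItem(name: "media", value: "music"),
    URLQueryItem(name: "entity", value: "song"),
    URLQueryItem(name: "explicit", value: "Yes"),
]
let (data, _) = try await URLSession.shared.data(from: components.url!)
// Since late Sept. 2025, tracks with trackExplicitness == "explicit"
// are missing from the results, with or without the explicit parameter.
print(String(data: data, encoding: .utf8) ?? "")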
                    
                  
                
                    
Please add a currently playing track endpoint to the Apple Music API. It's kind of wild that Apple Music goes after Spotify without such a useful endpoint.
                    
                  
                
                    
                      Hello,
I'm working on a Flutter app targeting both Android and iOS, where I implemented ShazamKit.
To achieve that, I first tried the flutter_shazam_kit package, but since it's no longer maintained, I forked it and tried to update it to meet the Google Play Store requirements, as you can see here:
https://github.com/mregnauld/flutter_shazam_kit/tree/fix-16k
Unfortunately, after trying everything, my app still doesn't meet the (not so) new 16 KB native library alignment requirement. I'm 100% sure it comes from that package, because the error message disappears if I remove it from my app.
After investigating, it seems the problem comes from ShazamKit for Android (which you can find here: https://developer.apple.com/download/all/?q=Android%20ShazamKit), and especially the .so files inside the .aar file.
Is there anything I can do to fix that, or should I wait for the ShazamKit team to fix it?
I'm totally stuck, so any help is highly appreciated.
Thanks.
                    
                  
                
                    
I'm wondering if anyone has run into issues with currentPlaybackRate in the released version of iOS 26.0?
There is no such issue on iOS 18.5.
When I change currentPlaybackRate on iOS 26.0, it seems to be forcibly reset to 0 or 1.0, no matter whether the content is DRM or non-DRM. Playback also becomes abnormal and unstable, with noise, whenever currentPlaybackRate is not 1.0, and it frequently switches between the stopped and playing states.
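For context, here is a minimal sketch of what I'm doing (using the application music player; the rate value is just an example):
import MediaPlayer

// Start playback, then request a non-default rate.
let player = MPMusicPlayerController.applicationMusicPlayer
player.play()
player.currentPlaybackRate = 1.5 // example rate
// On iOS 26.0 the rate seems to snap back to 0 or 1.0 shortly afterwards,
// and playback becomes unstable; on iOS 18.5 the same code works as expected.
print(player.currentPlaybackRate)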
                    
                  
                
                    
                      Hello Apple Engineers,
I am developing a feature that uses SFSpeechRecognizer (import Speech), and I have two questions:
1. After adding the NSSpeechRecognitionUsageDescription key to my Info.plist, when I initialize an SFSpeechRecognizer instance the system shows an authorization alert containing a predefined message from Apple:
“Speech data from this app will be sent to Apple to process your requests. This will also help Apple improve its speech recognition technology.”
Is it possible to remove or customize this message?
2. My app runs in a network environment with a whitelist. I need to know which URL the SFSpeechRecognizer instance connects to, and which port it uses, so that I can add it to the whitelist.
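For reference, a minimal sketch of the initialization that triggers the alert (the locale is just an example):
import Speech

// Requesting authorization presents the system alert; its text appears
// to be fixed by the system rather than configurable by the app.
SFSpeechRecognizer.requestAuthorization { status in
    print("Speech authorization status:", status.rawValue)
}
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")) // example locale
print(recognizer?.isAvailable ?? false)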
Thank you very much for your support!
Best regards,
Yu Cheng
                    
                  
                
                    
I have a question about TTS.
My device's default language is zh-HK (Cantonese).
My device is an iPhone 16 Pro on iOS 18.6.
I created a function, speakMandarin, because I want the device to speak zh-CN (Putonghua).
However, the device only speaks zh-HK (Cantonese),
even though I set the AVSpeechSynthesisVoice language to zh-CN.
func speakMandarin(text: String) {
        print("speakMandarin, \(text)")
        lastError = nil // Reset error
        // Stop any ongoing speech before starting new
        if synthesizer.isSpeaking {
            synthesizer.stopSpeaking(at: .immediate)
        }
        
        // Configure the speech utterance (the input is SSML)
        guard let utterance = AVSpeechUtterance(ssmlRepresentation: text) else {
            lastError = "Invalid SSML input"
            return
        }
        utterance.rate = 0.5 // Natural speaking speed
        utterance.pitchMultiplier = 1.0
        utterance.volume = 1.0
        utterance.voice = AVSpeechSynthesisVoice(language: "zh-CN")
        let preferredLanguages = [  "zh-CN" , "zh-TW"]
        var selectedVoice: AVSpeechSynthesisVoice?
        for lang in preferredLanguages {
            if let voice = AVSpeechSynthesisVoice(language: lang) {
                print(lang)
                selectedVoice = voice
                utterance.voice = voice
                break
            }
        }
        
        // If no Mandarin voice found, use system default
        if selectedVoice == nil {
            selectedVoice = AVSpeechSynthesisVoice(language: nil)
            lastError = "未偵測到普通話語音包,將使用系統預設語音"
            print (lastError)
        }
        
        utterance.voice = selectedVoice
        print(utterance)
        synthesizer.speak(utterance)
    }
here is my log
speakMandarin, <speak>你好!我們來聊聊喜歡的動物吧。你喜歡什麼動物呢?</speak>
zh-CN
[AVSpeechUtterance 0x1194efb80] String: 你好!我們來聊聊喜歡的動物吧。你喜歡什麼動物呢?
Voice: [AVSpeechSynthesisVoice 0x104ceff90] Language: zh-CN, Name: Tingting, Quality: Default [com.apple.voice.compact.zh-CN.Tingting]
Rate: 0.50
Volume: 1.00
Pitch Multiplier: 1.00
Delays: Pre: 0.00(s) Post: 0.00(s)
                    
                  
                
                    
macOS version: 26.0 (25A354)
Xcode version: Version 26.0 (17A324)
The project fails to compile with the following error:
SwiftExplicitDependencyCompileModuleFromInterface arm64 /Users/zhz/Library/Developer/Xcode/DerivedData/ModuleCache.noindex/AssetsLibrary-HTIJ05N58KN3.swiftmodule
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:10:25: error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
8 | public import _StringProcessing
9 | public import _SwiftConcurrencyShims
10 | extension AssetsLibrary.ALAssetsLibrary {
|                         `- error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
11 |   #if compiler(>=5.3) && $NonescapableTypes
12 |   @available(iOS, introduced: 9.0, deprecated: 9.0, obsoleted: 26.0)
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/System/Library/Frameworks/AssetsLibrary.framework/Headers/ALAssetsLibrary.h:80:12: note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
78 |
79 | OS_EXPORT AL_DEPRECATED(4, "Use PHPhotoLibrary from the Photos framework instead")
80 | @interface ALAssetsLibrary : NSObject {
|            `- note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
81 | @package
82 |     id _internal;
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:1:1: error: failed to build module 'AssetsLibrary'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.17.14 clang-1700.3.17.1)', while this compiler is 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.19.9 clang-1700.3.19.1)'). Please select a toolchain which matches the SDK.
                    
                  
                
                    
                      Hi,
In the iOS 13 and macOS Catalina release notes it says:
Metal CIKernel instances now support arguments with arbitrarily structured data.
I've been trying to use this functionality in a CIKernel, with mixed results. I'm particularly interested in passing data in the form of a dynamically sized array. It seems to work up to a certain size; beyond that threshold the excess data is discarded and the kernel becomes unstable. I assume there is some kind of memory-alignment issue going on, but I've tried various types in my array and always get a similar result.
I have not found any documentation or sample code regarding this. It would be great to know how this is intended to work and what the limitations are.
In the forums there are two similar unanswered questions about data arguments, so I'm sure there are a few out there with similar issues.
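For reference, here is a minimal sketch of how I pass the array (the kernel name, metallib name, and array size are placeholders; the size at which things break down varies):
import CoreImage

// Load a Metal CIKernel from a compiled metallib (names are placeholders).
let libraryURL = Bundle.main.url(forResource: "default", withExtension: "metallib")!
let libraryData = try Data(contentsOf: libraryURL)
let kernel = try CIKernel(functionName: "myKernel", fromMetalLibraryData: libraryData)

// A dynamically sized float array, passed to the kernel as raw data.
let values: [Float] = (0..<1024).map(Float.init)
let argumentData = values.withUnsafeBufferPointer { Data(buffer: $0) }

let inputImage = CIImage(color: .red).cropped(to: CGRect(x: 0, y: 0, width: 256, height: 256))
// Works for small arrays; beyond some size the data appears to be truncated.
let output = kernel.apply(
    extent: inputImage.extent,
    roiCallback: { _, rect in rect },
    arguments: [inputImage, argumentData, values.count]
)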
Thanks!
Michael
                    
                  
                
                    
I wanted to know when iOS 26 will be officially released.
                    
                  
                
              
                
              
              
                
                
              
              
              
  
  
    
    
  
  
              
                
                
              
            
          
                    
                      My Environment:
Device: Mac (Apple Silicon, arm64)
OS: macOS 15.6.1
Description:
I'm developing a music app and have encountered an issue where I cannot update the playbackState in MPNowPlayingInfoCenter after my app loses audio focus to another app. Even though my app correctly sets MPNowPlayingInfoCenter.default().playbackState = .paused, the system's Now Playing UI (Control Center, Lock Screen, AirPods controls) does not reflect this change. The UI remains stuck until the app that currently holds audio focus also changes its playback state.
I've observed this same behavior in other third-party music apps from the App Store, which suggests it might be a system-level issue.
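For reference, this is essentially the update my app performs when the user pauses (a minimal sketch; the keys are the standard MediaPlayer ones):
import MediaPlayer

// Pause-time update (macOS): report a zero playback rate and a paused state.
let center = MPNowPlayingInfoCenter.default()
var info = center.nowPlayingInfo ?? [:]
info[MPNowPlayingInfoPropertyPlaybackRate] = 0.0
center.nowPlayingInfo = info
center.playbackState = .paused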
Steps to Reproduce:
Use the two most popular music apps in the Chinese App Store, NetEase Cloud Music and QQ Music (let's call them App A and App B):
Start playback in App A.
Start playback in App B. (App B now has audio focus, and App A is still playing).
Attempt to pause App A via the system's Control Center or its own UI.
Observed Behavior: App A's audio stream stops, but in the system's Now Playing controls, App A still appears to be playing. The progress bar continues to advance, and the pause button becomes unresponsive.
If you then pause App B, the Now Playing UI for App A immediately corrects itself and displays the proper "paused" state.
My Questions:
Is there a specific procedure required to update MPNowPlayingInfoCenter when an app is not the current "Now Playing" application?
Is this a known issue or expected behavior in macOS?
Are there any official workarounds or solutions to ensure the UI updates correctly?
                    
                  
                
                    
Hello, I want to know whether there are any restrictions on using MusicKit in a mobile app to manipulate audio with an EQ on tracks coming from Apple Music (without modifying the actual track structure/data, of course; just the audio output).
                    
                  
                
                    
                      Hello,
I am trying to access the Apple Music Feed API, but I receive a 401 Unauthorized error whenever I call it.
I have tried using my own code to generate a JWT and call the API directly (the same code can call the standard Apple Music API successfully):
> GET /v1/feed/song/latest HTTP/2
> Host: api.media.apple.com
> user-agent: insomnia/2023.5.8
> authorization: Bearer [REDACTED]
> accept: */*
< HTTP/2 401 
< content-type: application/json; charset=utf-8
< content-length: 0
< x-apple-jingle-correlation-key: AV5IOHBNM2UUJVOFQ4HZ2TGF6Q
< x-daiquiri-instance: daiquiri:10001:daiquiri-all-shared-ext-7bb7c9b9bb-r459v:7987:25RELEASE91:daiquiri-amp-kubernetes-shared-ext-ak8s-prod-pv4-amp-daiquiri-ingress-prod
and also the Apple-provided Python example code, which gives me authentication errors too:
$ python3 ./apple_music_feed_example.py --key-id NMBH[...] --team-id 3TNZ[...] --secret-key-file-path "/Users/foxt/Documents/am-feed/NMBH[...].p8" --out-dir .
running....
INFO:__main__:Sending requests to https://api.media.apple.com
INFO:__main__:Getting the latest export for feed artist
Exception: Authentication Failed. Did you provide the correct team id, key id, and p8 file?
Does this API need to be enabled on my account separately from the main Apple Music API? The documentation reads to me as if anyone with an Apple Developer Program membership can use this API, and I did not see any information regarding any other requirements.