Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio Documentation

Posts under the Audio subtopic

Post · Replies · Boosts · Views · Created

MusicKit UPCs changing and handling that
I use Universal Product Codes (UPCs) in my app to reliably identify albums, after having used album IDs for a time. Album IDs can change over time for no obvious reason (see here for song IDs), so I switched to UPCs, believing they could not change. Well, apparently they can. A few days ago I populated a JSON with UPCs, including 196871067713. Today, performing a MusicCatalogResourceRequest for that UPC returns nothing. Putting the UPC into an Apple Music link redirects to https://music.apple.com/de/album/folge-89-im-geistergarten/1683337782?l=en-GB, so I assume the UPC has changed from 196871067713 to 1683337782. Apple Music can handle that and redirects to the new value both in the app and on the website, but a MusicCatalogResourceRequest cannot. I filed a suggestion for this (FB15167146), but I need a solution sooner. Can I somehow detect where the URL is redirecting to? Is there a way MusicCatalogResourceRequest can do this? Performing a MusicCatalogSearchRequest could be an option, but seems unreliable when using the title as the search term. Other ideas? Thank you
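One possible workaround, building on the redirect behavior described above: URLSession follows redirects automatically, so the final response URL exposes the current album ID. This is a minimal sketch; the UPC-in-path URL pattern and the helper name are assumptions, not a documented API.

```swift
import Foundation

// Sketch: resolve the current album ID by letting URLSession follow the
// redirect from a UPC-based Apple Music link (URL pattern is an assumption).
func resolveCurrentAlbumID(upc: String, storefront: String = "de") async throws -> String? {
    guard let url = URL(string: "https://music.apple.com/\(storefront)/album/\(upc)") else {
        return nil
    }
    let (_, response) = try await URLSession.shared.data(from: url)
    // After redirects, the last path component should be the current album ID.
    return response.url?.lastPathComponent
}
```

The resolved ID could then feed a MusicCatalogResourceRequest keyed by ID rather than UPC.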
Replies: 1 · Boosts: 1 · Views: 592 · Created: Sep ’24
How to fetch all albums after performing a library request
Hi, in my app I am using MusicLibraryRequest<Artist> to fetch all of the artists in someone's library collection. With this response I then fetch each artist's albums: artist.with([.albums]). The response from this only gives albums in the user's library collection. I would like to augment it with all of the albums for an artist from the full catalogue. I'm using MusicKit and targeting iOS 18 and visionOS 2. Could someone please point me towards the best way to approach this?
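A minimal sketch of one possible approach, with the caveat that matching a library artist to a catalog artist by name is an assumption that may need refinement: search the catalog for the artist, then load the albums relationship on the catalog result.

```swift
import MusicKit

// Sketch: fetch an artist's catalog albums by matching on name.
func catalogAlbums(forArtistNamed name: String) async throws -> [Album] {
    var request = MusicCatalogSearchRequest(term: name, types: [Artist.self])
    request.limit = 1
    let response = try await request.response()
    guard let catalogArtist = response.artists.first else { return [] }

    // Load the albums relationship on the catalog artist.
    let detailed = try await catalogArtist.with([.albums])
    guard let albums = detailed.albums else { return [] }
    return Array(albums)
}
```

Note that the returned collection may be paginated; for prolific artists, further batches would need to be fetched.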
Replies: 1 · Boosts: 0 · Views: 591 · Created: Sep ’24
Has anyone had issues with Voice Control and high-quality visuals on the Vision Pro?
We are working on an app for the Vision Pro that has a high polygon count and lots of high-resolution textures. Everything looks smooth and, in general, very good. The issue is that the moment we turn on Voice Control, even if it is not being used, the visuals at the center start to stutter left to right. Has anyone seen this? It must be a bug. Any workaround? Thanks, Guillermo
Replies: 1 · Boosts: 0 · Views: 504 · Created: Sep ’24
Implementing Enhanced Dialogue on tvOS 18
How does a third-party developer go about supporting the new Enhanced Dialogue option for video apps in tvOS 18? If an app is using the standard AVPlayerViewController, I had assumed it would be a simple-ish matter of building against the tvOS 18 SDK, but apparently not: the options don't appear, not even greyed out.
Replies: 1 · Boosts: 1 · Views: 565 · Created: Sep ’24
How to get Song Information From Queue?
I have a recent post kind of outlining a similar question here. This time, though, I'm confident that inserting an array of Track works when inserting into ApplicationMusicPlayer.shared.queue, but now I'm not sure how to initialize the queue to display the song title and artwork, for example. I'm also not sure how to get the current queue item's artist and album information, which I feel should be easy to do, so maybe I'm missing something obvious. Hope this paints a picture of what I'm trying to do; I'm posting the necessary code here to help debug/figure out this problem.

```swift
import SwiftUI
import MusicKit

struct PlayBackView: View {
    @Environment(\.scenePhase) var scenePhase
    @Environment(\.openURL) private var openURL

    // Adding Enum Here for Question Sake
    enum PlayState {
        case play
        case pause
    }

    @State var song: Track
    @Binding var songs: [Track]?
    @State var isShuffled = false
    @State private var playState: PlayState = .pause
    @State private var songTimer: Int = Int.random(in: 5...30)
    @State private var roundTimer: Int = 5
    @State private var isTimerActive = false
    // @State private var volumeValue = VolumeObserver()
    @State private var isFirstPlay = true
    @State private var isDancing = false
    @State private var player = ApplicationMusicPlayer.shared

    private var isPlaying: Bool {
        return (player.state.playbackStatus == .playing)
    }

    let timer = Timer.publish(every: 1, on: .main, in: .common).autoconnect()

    var playPauseImage: String {
        switch playState {
        case .play:
            "pause.fill"
        case .pause:
            "play.fill"
        }
    }

    var body: some View {
        VStack {
            // Album Cover
            HStack(spacing: 20) {
                if let artwork = player.queue.currentEntry?.artwork {
                    ArtworkImage(artwork, height: 100)
                } else {
                    Image(systemName: "music.note")
                        .resizable()
                        .frame(width: 100, height: 100)
                }

                VStack(alignment: .leading) {
                    /* This is where I want to display song title, album title, and artist name for example */

                    // Song Title
                    Text(player.queue.currentEntry?.title ?? "Unable to Find Song Title")
                        .font(.title)
                        .fixedSize(horizontal: false, vertical: true)

                    // Album Title
                    // Text(player.queue.currentEntry ?? "Album Title Not Found")
                    //     .font(.caption)
                    //     .fixedSize(horizontal: false, vertical: true)

                    // I don't know what the subtitle actually grabs
                    Text(player.queue.currentEntry?.subtitle ?? "Artist Name Not Found")
                        .font(.caption)
                }
            }
            .padding()

            // Play/Pause Button
            Button(action: {
                handlePlayButton()
                isFirstPlay = false
            }, label: {
                Text(playState == .play ? "Pause" : isFirstPlay ? "Play" : "Resume")
                    .frame(maxWidth: .infinity)
            })
            .buttonStyle(.borderedProminent)
            .padding()
            .font(.largeTitle)
            .tint(.red)
        }
        .padding()
        // Maybe I should use the `.task` modifier here?
        .onAppear {
            // I'm sure this code could be improved but don't think it'll help answer the question at the moment.
            Task {
                if let songs = songs {
                    do {
                        if isShuffled {
                            let shuffledSongs = songs.shuffled()
                            try await player.queue.insert(shuffledSongs, position: .tail)
                            handlePlayButton()
                        } else {
                            try await player.queue.insert(songs, position: .tail)
                        }
                    } catch {
                        print(error.localizedDescription)
                    }
                }
            }
        }
    }

    private func handlePlayButton() {
        Task {
            if isPlaying {
                player.pause()
                playState = .pause
                isTimerActive = false
            } else {
                playState = .play
                await playTrack()
                isTimerActive = true
            }
        }
    }

    @MainActor
    private func playTrack() async {
        do {
            try await player.play()
        } catch {
            print(error.localizedDescription)
        }
    }
}

//#Preview {
//    PlayBackView()
//}
```
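A hedged sketch of one way to surface the missing metadata: a queue entry's underlying item can be pattern-matched to a Song, which exposes artistName and albumTitle directly. This assumes the inserted tracks resolve to the .song case; the helper name is illustrative.

```swift
import MusicKit

// Sketch: pull title, artist, and album from the current queue entry,
// assuming it resolves to the .song case of MusicPlayer.Queue.Entry.Item.
func currentSongInfo() -> (title: String, artist: String, album: String)? {
    guard let entry = ApplicationMusicPlayer.shared.queue.currentEntry,
          case .song(let song)? = entry.item else {
        return nil
    }
    return (song.title, song.artistName, song.albumTitle ?? "Unknown Album")
}
```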
Replies: 2 · Boosts: 0 · Views: 732 · Created: Sep ’24
iOS: I need to convert CMSampleBuffer Linear PCM audio to MPEG-4 AAC
Hello, I am a deaf-blind wheelchair user, and I program in Swift using a braille display. I’m reaching out for your help on an issue I’ve been struggling to solve. Basically, when I extract a CMSampleBuffer from an AVAsset of a video, it comes with the Audio Format ID as Linear PCM. However, when I try to pass this CMSampleBuffer to write another video using AVAssetWriter, the video ends up muted. The audio settings of the output video are configured to MPEG-4 AAC, but the input CMSampleBuffer has the Audio Format ID as Linear PCM. I would like to request an extension for CMSampleBuffer that converts Linear PCM audio to MPEG-4 AAC. I’ve searched extensively and couldn’t find anything. Looking forward to your help. Thank you.
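One hedged suggestion, not taken from the post: rather than converting each CMSampleBuffer by hand, an AVAssetWriterInput whose outputSettings specify AAC will perform the compression itself when Linear PCM sample buffers are appended to it. The sample rate, channel count, and bit rate below are illustrative assumptions; they should match the source asset.

```swift
import AVFoundation

// Sketch: let AVAssetWriter do the PCM-to-AAC conversion by configuring
// the audio input with compressed (AAC) output settings.
let audioSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 44_100,       // assumption: match the source
    AVNumberOfChannelsKey: 2,      // assumption: stereo source
    AVEncoderBitRateKey: 128_000
]
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
audioInput.expectsMediaDataInRealTime = false
// After writer.add(audioInput), append the Linear PCM CMSampleBuffers
// to audioInput as usual; the writer compresses them to AAC.
```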
Replies: 1 · Boosts: 0 · Views: 659 · Created: Sep ’24
AudioConverterFillComplexBuffer not working for (E)AC3 in tvOS 18
Since upgrading to tvOS 18, the above function isn't working for me when converting a stream in these formats. It does work for decoding AAC, however. https://developer.apple.com/documentation/audiotoolbox/1503098-audioconverterfillcomplexbuffer?language=objc I pass a valid ioOutputDataPacketSize in, but it always comes out as zero. Has anyone else observed this? I wonder if this is related to the widely discussed issue of 5.1 sound being broken for many people after upgrading to tvOS 18: https://discussions.apple.com/thread/255769102?login=true&sortBy=rank EDIT: further information: the callback gets called once, asking for 1 packet (which is OK). I give it one packet and return noErr. However, after this, the callback is never invoked again. Must be a bug? EDIT 2: the same code continues to work correctly on macOS when decoding the same audio stream.
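For anyone cross-checking their input proc against this report, a hedged sketch of the documented contract: returning noErr with zero packets signals end of stream, while returning a non-zero, app-defined status with zero packets means "no data right now" and lets the converter be called again later. All names below are illustrative.

```swift
import AudioToolbox

// App-defined sentinel meaning "no data available yet" (value is arbitrary).
let kNoMoreDataNowErr: OSStatus = -10_000

// Sketch of an AudioConverter input proc honoring that contract.
let inputProc: AudioConverterComplexInputDataProc = { _, ioNumberDataPackets, ioData, _, _ in
    let havePackets = false  // placeholder for a real pending-buffer check
    if !havePackets {
        ioNumberDataPackets.pointee = 0
        return kNoMoreDataNowErr  // non-zero: converter may call again later
    }
    // ...fill ioData and set ioNumberDataPackets here...
    return noErr
}
```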
Replies: 4 · Boosts: 2 · Views: 672 · Created: Oct ’24
Why Aren't All Songs Being Added to the Queue?
Hi, I've recently been working with the Apple Music API and have had success in loading all the playlists on my account, loading songs from each playlist, and adding songs to the ApplicationMusicPlayer.shared.queue. The problem I'm running into is that not all songs from the playlist are being added to the queue, despite confirming that all the songs are there, based on the PlayBackView.swift I'm about to share with you. I would also like to answer other underlying questions if possible, and I am open to any other suggestions. In this scenario we're also assuming isShuffled is true every time.

1. How can I determine when a song has ended?
2. How can I get the album title information?
3. How can I get the current song title, album information, and artist information without pressing play? I can only seem to update the screen when I press play, meaning ApplicationMusicPlayer.shared.play() is being called.
4. How do I get the endTime of the song? I believe it should be ApplicationMusicPlayer.shared.queue.currentEntry.endTime, but this doesn't seem to work.

```swift
//
//  PlayBackView.swift
//
//  Created by Justin on 8/16/24.
//

import SwiftUI
import MusicKit
import Foundation

enum PlayState {
    case play
    case pause
}

struct PlayBackView: View {
    @State var song: Track
    @Binding var songs: [Track]?
    @State var isShuffled = false
    @State private var playState: PlayState = .pause
    @State private var isFirstPlay = true

    private let player = ApplicationMusicPlayer.shared

    private var isPlaying: Bool {
        return (player.state.playbackStatus == .playing)
    }

    var body: some View {
        VStack {
            // Album Cover
            HStack(spacing: 20) {
                if let artwork = player.queue.currentEntry?.artwork {
                    ArtworkImage(artwork, height: 100)
                } else {
                    Image(systemName: "music.note")
                        .resizable()
                        .frame(width: 100, height: 100)
                }

                VStack(alignment: .leading) {
                    // Song Title
                    Text(player.queue.currentEntry?.title ?? "Song Title Not Found")
                        .font(.title)
                        .fixedSize(horizontal: false, vertical: true)

                    // How do I get AlbumTitle from here??

                    // Artist Name
                    Text(player.queue.currentEntry?.subtitle ?? "Artist Name Not Found")
                        .font(.caption)
                }
            }
            .padding()

            Spacer()

            // Progress View
            // endTime doesn't work and not sure why.
            ProgressView(value: player.playbackTime, total: player.queue.currentEntry?.endTime ?? 1.00)
                .progressViewStyle(.linear)
                .tint(.red.opacity(0.5))

            // Duration View
            HStack {
                Text(durationStr(from: player.playbackTime))
                    .font(.caption)

                Spacer()

                if let duration = player.queue.currentEntry?.endTime {
                    Text(durationStr(from: duration))
                        .font(.caption)
                }
            }

            Spacer()

            Button {
                Task {
                    do {
                        try await player.skipToNextEntry()
                    } catch {
                        print(error.localizedDescription)
                    }
                }
            } label: {
                Label("", systemImage: "forward.fill")
                    .tint(.white)
            }

            Spacer()

            // Play/Pause Button
            Button(action: {
                handlePlayButton()
                isFirstPlay = false
            }, label: {
                Text(playState == .play ? "Pause" : isFirstPlay ? "Play" : "Resume")
                    .frame(maxWidth: .infinity)
            })
            .buttonStyle(.borderedProminent)
            .padding()
            .font(.largeTitle)
            .tint(.red)
        }
        .padding()
        .onAppear {
            if isShuffled {
                songs = songs?.shuffled()
                if let songs, let firstSong = songs.first {
                    player.queue = .init(for: songs, startingAt: firstSong)
                    player.state.shuffleMode = .songs
                }
            }
        }
        .onDisappear {
            player.stop()
            player.queue = []
            player.playbackTime = .zero
        }
    }

    private func handlePlayButton() {
        Task {
            if isPlaying {
                player.pause()
                playState = .pause
            } else {
                playState = .play
                await playTrack()
            }
        }
    }

    @MainActor
    private func playTrack() async {
        do {
            try await player.play()
        } catch {
            print(error.localizedDescription)
        }
    }

    private func durationStr(from duration: TimeInterval) -> String {
        let seconds = Int(duration)
        let minutes = seconds / 60
        let remainder = seconds % 60

        // Format the string to ensure two digits for the remainder (seconds)
        return String(format: "%d:%02d", minutes, remainder)
    }
}

//#Preview {
//    PlayBackView()
//}
```
Replies: 2 · Boosts: 0 · Views: 634 · Created: Oct ’24
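Following up on questions 1, 2, and 4 in the post above, a hedged sketch, assuming the queue entries resolve to songs: Song.duration (an optional TimeInterval) can stand in for the non-existent currentEntry.endTime, albumTitle comes from the same underlying item, and comparing playbackTime against the duration approximates end-of-song detection. The half-second tolerance is an arbitrary choice.

```swift
import MusicKit

// Sketch: duration and album title of the current entry, via its Song item.
func currentSongDetails() -> (album: String?, duration: TimeInterval?)? {
    guard case .song(let song)? = ApplicationMusicPlayer.shared.queue.currentEntry?.item else {
        return nil
    }
    return (song.albumTitle, song.duration)
}

// Sketch: approximate "song has ended" check, polled from a timer.
func songHasEnded() -> Bool {
    let player = ApplicationMusicPlayer.shared
    guard let duration = currentSongDetails()?.duration else { return false }
    return player.playbackTime >= duration - 0.5
        && player.state.playbackStatus != .playing
}
```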
Distorted Audio When Recording External Mics With AVCaptureSession and AVAssetWriter
I’m working on a macOS app, written in Swift. My goal is to record audio from an external microphone, e.g., one connected via USB. For this, I’m using an AVCaptureSession and recording its output with an AVAssetWriter. This works perfectly in principle (and reliably with internal microphones, for example). The problem occurs after my app has successfully completed the first recording and I then want to make additional recordings (which makes me think it might be process-dependent, because it works again after restarting the app). The problem: Noisy or distorted-sounding audio files. In addition, the following error message appears in the Console from CoreAudio / its AudioConverter: Input data proc returned inconsistent 512 packets for 2048 bytes; at 3 bytes per packet, that is actually 682 packets It is easy to reproduce. This problem is reproducible even if I don’t configure the AVAssetWriter manually and instead let it receive its audioSettings using a preset from an AVOutputSettingsAssistant. I’m running on macOS 15.0 (24A335). I’ve filed a feedback including a demo project → FB15333298 🎟️ I would greatly appreciate any help! Have a great day, Martin
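For readers trying to reproduce this, a minimal sketch of the pipeline described, under illustrative assumptions (default microphone, a stock AVOutputSettingsAssistant preset): capture-session audio routed into an AVAssetWriterInput whose settings come from the assistant.

```swift
import AVFoundation

// Sketch: AVCaptureSession audio feeding an AVAssetWriterInput whose
// settings come from an AVOutputSettingsAssistant preset.
let session = AVCaptureSession()
guard let mic = AVCaptureDevice.default(for: .audio),
      let micInput = try? AVCaptureDeviceInput(device: mic) else {
    fatalError("No audio capture device available")
}
session.addInput(micInput)

let audioOutput = AVCaptureAudioDataOutput()
session.addOutput(audioOutput)

// Preset choice is an assumption; the report says presets reproduce it too.
let assistant = AVOutputSettingsAssistant(preset: .preset1280x720)
let writerInput = AVAssetWriterInput(mediaType: .audio,
                                     outputSettings: assistant?.audioSettings)
writerInput.expectsMediaDataInRealTime = true
// Append buffers from the AVCaptureAudioDataOutput delegate to writerInput.
```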
Replies: 6 · Boosts: 0 · Views: 804 · Created: Oct ’24
Audio Interruption Not Being Intercepted in AVAudioSession with Classification
Hi everyone, I’m experiencing an issue where audio interruptions (e.g., phone calls) are not being intercepted while running sound classification in an app that uses the AVAudioSession. Classification works fine, but interruptions aren’t handled, even though I’ve followed Apple’s guidelines on handling audio interruptions [1_Document]. The classification was initially based on [2_Classifier], where it worked perfectly. However, when I adopted classification in a more camera-focused app using [3_Cam], the interruption behavior stopped working. The classification setup is functioning with [3_Cam], but audio interruptions are not triggered. The listener is invoked before starting sound analysis, as suggested in [2_Classifier]:

```swift
startListeningForAudioSessionInterruptions()
try startAnalyzing([(request, observer)])
```

FYI, one change I have made for classification is the following. This works fine in all cases.

```swift
// try audioSession.setCategory(.record, mode: .default)
try audioSession.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
```

I suspect the issue might be related to the AVAudioSession configuration or how the app handles recording and playback together. Is there anything else I should check related to AVAudioSession? Are there additional APIs I could use to pre-check or better handle audio interruptions? Any suggestions or guidance would be greatly appreciated! Platform: Swift 5, Xcode 16, iOS 18. References: Document, Classifier, Cam. Best Regards
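For cross-checking, a minimal sketch of the interruption observer from Apple's guidance, registered on the shared session; if the listener in the camera-focused app registers on a different object or before the session is configured, the notifications can be missed.

```swift
import AVFoundation

// Sketch: observe audio session interruptions (began/ended).
func startListeningForAudioSessionInterruptions() {
    NotificationCenter.default.addObserver(
        forName: AVAudioSession.interruptionNotification,
        object: AVAudioSession.sharedInstance(),
        queue: .main
    ) { notification in
        guard let info = notification.userInfo,
              let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: typeValue) else {
            return
        }
        switch type {
        case .began:
            print("Interruption began: stop sound analysis")
        case .ended:
            print("Interruption ended: resume if appropriate")
        @unknown default:
            break
        }
    }
}
```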
Replies: 1 · Boosts: 0 · Views: 501 · Created: Oct ’24
Toggling AVMusicTrack isMuted
Hi! I have an AVAudioSequencer with some AVMusicTracks that are filled with AVParameterEvents. If I toggle the isMuted property of a track, it will instantly mute when changed to true. However, after setting isMuted back to false, the events only trigger on the next round of the loop and not instantly. Is this intended behaviour, and is there some way to get the events to trigger immediately after toggling isMuted to false?
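For reference, a minimal reproduction sketch; the engine setup and track index are assumptions, and loading of the MIDI data is elided.

```swift
import AVFoundation

// Sketch: toggle isMuted on one track of a playing AVAudioSequencer.
let engine = AVAudioEngine()
let sequencer = AVAudioSequencer(audioEngine: engine)

func toggleMute(trackIndex: Int) {
    guard sequencer.tracks.indices.contains(trackIndex) else { return }
    sequencer.tracks[trackIndex].isMuted.toggle()
    // Observed behavior per the post: muting takes effect immediately,
    // while un-muting only takes effect on the next pass of the loop.
}
```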
Replies: 2 · Boosts: 0 · Views: 636 · Created: Oct ’24
Noise occurs when playing on iOS 18.0 device + AirPods Pro 2
When we tested the audio quality of our VoIP app, we found that when an iOS 18.0 device plays through AirPods Pro 2, we can hear noises similar to peak clipping and distortion, especially when the source audio is loud and high-pitched. Here is the device information we tested: Model: iPhone 16 Pro Max, iPhone 15 Pro; System version: iOS 18.0 (22A3354); Bluetooth headset model: AirPods Pro 2; Bluetooth firmware version: 6F8. We tested multiple apps (including phone calls, FaceTime, Zoom, WeChat, Tencent Meeting), and they all had the above noise problem. We also found two phenomena: if we use the same iOS 18 device with HUAWEI FreeBuds Pro or FreeBuds 2, there is no such noise problem; and if we use an iOS 17 device with the same AirPods Pro 2, there is no such noise problem. Therefore, we suspect it is caused by a compatibility problem between iOS 18.0 and AirPods firmware 6F8. The firmware version of our AirPods Pro 2 is 6F8, which was released on June 26, while iOS 18.0 was released on September 16; maybe they are not very compatible. I hope that subsequent firmware updates can fix this problem.
Replies: 1 · Boosts: 0 · Views: 628 · Created: Oct ’24
Inquiry About Background Volume Button Event in iOS App Development
I’m working on an iOS app for a client, and I have a question regarding a specific feature we're looking to implement. We want the app to respond to a user pressing the volume button three times while the app is in the background. The goal is to allow users to discreetly trigger a safety feature without drawing attention, particularly in situations where they may be in danger or at risk. This feature is critical for the app and would be a valuable addition, as it could potentially help protect users in emergency situations. However, I haven’t found much information on whether iOS allows background listening for volume button presses. Therefore, I would greatly appreciate your insights on the following:

1. Is it possible to listen for volume button presses when the app is in the background, or are there system-level restrictions that prevent this?
2. If it's not directly possible, are there any special provisions, APIs, or entitlements that can be requested from Apple to enable this functionality?
3. In case this feature is not supported, are there alternative approaches to achieve a similar discreet activation mechanism?
4. If this is something that requires special permission or a process, could you please guide me on how to proceed?

I understand that maintaining user privacy and security is a priority for iOS, and I want to ensure that any implementation fully complies with Apple's guidelines. Thanks in advance for your help!
Replies: 1 · Boosts: 0 · Views: 359 · Created: Oct ’24
Seeking Help to Verify Authenticity of M4A Metadata
I'm seeking information about the original file schema for an m4a file recorded directly on an iPhone (iPhone 5 running iOS 9.2.0). I currently have two files from which I extracted metadata using ExifTool. The first file was provided to me by someone who claims it was recorded on an iPhone 5 with iOS 9.2.0. I would like to verify whether this file has been edited.

File 1: File Permissions: -rwx------; Content Create Date: 2016:03:01 14:21:08+07:00

The second file was recorded by me on the same device model and iOS version.

File 2: File Permissions: -rw-r--r--; Date/Time Original: 2024:10:03 11:44:16+07:00

As you can see, the file permissions differ, and the key for the recording date also differs: one uses "Content Create Date" while the other uses "Date/Time Original." I would like to determine if the first file was edited, but I haven't been able to find any official documentation on the m4a schema or metadata structure from audio recorder apps. I reached out to support, and they directed me to this forum. Any insights or help would be appreciated.
Replies: 1 · Boosts: 0 · Views: 431 · Created: Oct ’24
How to observe AVCaptureDevice.DiscoverySession devices property?
At the Apple Developer documentation page https://developer.apple.com/documentation/avfoundation/avcapturedevice/discoverysession you can find the sentence "You can also key-value observe this property to monitor changes to the list of available devices." But how do you use it? I tried it with the code below and tested on my MacBook with EarPods. When I disconnect the EarPods, nothing happened. MacBook Air M2, macOS Sequoia 15.0.1, Xcode 16.0.

```swift
import Foundation
import AVFoundation

let discovery_session = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.microphone],
    mediaType: .audio,
    position: .unspecified
)
let devices = discovery_session.devices
for device in devices {
    print(device.localizedName)
}

let device = devices[0]
let observer = Observer()
discovery_session.addObserver(observer, forKeyPath: "devices", options: [.new, .old], context: nil)

let input = try! AVCaptureDeviceInput(device: device)
let queue = DispatchQueue(label: "queue")
var output = AVCaptureAudioDataOutput()
let delegate = OutputDelegate()
output.setSampleBufferDelegate(delegate, queue: queue)

var session = AVCaptureSession()
session.beginConfiguration()
session.addInput(input)
session.addOutput(output)
session.commitConfiguration()
session.startRunning()

let group = DispatchGroup()
let q = DispatchQueue(label: "", attributes: .concurrent)
q.async(group: group, execute: DispatchWorkItem() {
    sleep(10)
    session.stopRunning()
})
_ = group.wait(timeout: .distantFuture)

class Observer: NSObject {
    override func observeValue(forKeyPath keyPath: String?,
                               of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        print("Change")
        if keyPath == "devices" {
            if let newDevices = change?[.newKey] as? [AVCaptureDevice] {
                print("New devices: \(newDevices.map { $0.localizedName })")
            }
            if let oldDevices = change?[.oldKey] as? [AVCaptureDevice] {
                print("Old devices: \(oldDevices.map { $0.localizedName })")
            }
        }
    }
}

class OutputDelegate: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        print("Output")
    }
}
```
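A hedged alternative worth trying: Swift's block-based KVO, which avoids string key paths. The sketch assumes the devices property is exposed to Swift KVO; note that the returned observation must be kept alive, or it is deallocated immediately and no changes are ever delivered (a common cause of "nothing happened").

```swift
// Sketch: block-based KVO on the discovery session's devices property.
// Keep a strong reference to `observation` for as long as you need updates.
let observation = discovery_session.observe(\.devices, options: [.new, .old]) { session, _ in
    print("Devices changed: \(session.devices.map { $0.localizedName })")
}
```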
Replies: 6 · Boosts: 0 · Views: 815 · Created: Oct ’24
Missing for a decade, full quality + remote control
Is anyone developing a way for users to control an iOS or iPadOS device that is playing Apple Music to a DAC via USB (into an amp) from another iOS or iPadOS device, wirelessly? Specifically, full control: not Accessibility, not output to Apple TV, not HomePods, not firmware-downgraded AirPort Expresses feeding a DAC, or the other hacks mentioned over the past decade that this "connect"-like feature has been desired. Exclusive mode is the key feature iOS and iPadOS offer that audiophiles want here, combined with full or nearly full Apple Music control while sitting on a couch or in a wheelchair across the room.
Replies: 2 · Boosts: 0 · Views: 399 · Created: Oct ’24
Error connecting AVAudioEngine to AVAudioPlayerNode with AVAudioPCMFormatInt16
Hi community, I'm trying to set up an AVAudioFormat with AVAudioPCMFormatInt16, but I get an error:

AVAEInternal.h:125 [AUInterface.mm:539:SetFormat: ([[busArray objectAtIndexedSubscript:(NSUInteger)element] setFormat:format error:&nsErr])] returned false, error Error Domain=NSOSStatusErrorDomain Code=-10868 "(null)"

If I understand error code -10868 correctly, the format is not supported. But how can I use the PCM Int16 format? Here is my method:

```objc
- (void)setupAudioDecoder:(double)sampleRate audioChannels:(double)audioChannels {
    if (self.isRunning) {
        return;
    }

    self.audioEngine = [[AVAudioEngine alloc] init];
    self.audioPlayerNode = [[AVAudioPlayerNode alloc] init];
    [self.audioEngine attachNode:self.audioPlayerNode];

    AVAudioChannelCount channelCount = (AVAudioChannelCount)audioChannels;
    self.audioFormat = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatInt16
                                                        sampleRate:sampleRate
                                                          channels:channelCount
                                                       interleaved:YES];

    NSLog(@"Audio Format: %@", self.audioFormat);
    NSLog(@"Audio Player Node: %@", self.audioPlayerNode);
    NSLog(@"Audio Engine: %@", self.audioEngine);

    // Error on this line
    [self.audioEngine connect:self.audioPlayerNode
                           to:self.audioEngine.mainMixerNode
                       format:self.audioFormat];

    /*NSError *error = nil;
    if (![self.audioEngine startAndReturnError:&error]) {
        NSLog(@"Error initializing the audio engine: %@", error);
        return;
    }
    [self.audioPlayerNode play];
    self.isRunning = YES;*/
}
```

Also, I see the audio engine does not seem to be running:

Audio Engine: ________ GraphDescription ________ AVAudioEngineGraph 0x600003d55fe0: initialized = 0, running = 0, number of nodes = 1

Has anyone already used this format with AVAudioFormat? Thank you!
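A hedged answer sketch, in Swift for brevity: error -10868 is kAudioUnitErr_FormatNotSupported, and the mixer works in Float32, so one common approach is to connect with a Float32 format and convert incoming Int16 data with AVAudioConverter before scheduling it. Sample rate and channel count below are assumptions.

```swift
import AVFoundation

// Sketch: connect with the Float32 format the mixer accepts, and convert
// Int16 source data to Float32 before scheduling it on the player node.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
engine.attach(playerNode)

let int16Format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                sampleRate: 48_000,
                                channels: 2,
                                interleaved: true)!
let floatFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2)!

// Connecting with Int16 is what raises -10868; Float32 works.
engine.connect(playerNode, to: engine.mainMixerNode, format: floatFormat)

// Convert each incoming Int16 buffer to Float32, then schedule it.
let converter = AVAudioConverter(from: int16Format, to: floatFormat)!
```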
Replies: 1 · Boosts: 0 · Views: 630 · Created: Oct ’24