Integrate music and other audio content into your apps.

Posts under the Audio tag

60 Posts

Post

Replies

Boosts

Views

Activity

Making music…
Maybe a bit of a simplistic question, but what are the best ways to create music programmatically within an app? I have a few small, simple games to which I would like to add some background music. I've fiddled with AVAudio, SKAudio, and now AudioKit, but I'm just not quite finding the holy grail of music generation. I don't know if it's more technique or tooling, so I thought I'd hit up the pool of minds here for some suggestions.
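One workable approach for simple game background music is to drive an AVAudioUnitSampler with an AVAudioSequencer inside AVAudioEngine. A minimal sketch, assuming a bundled MIDI file named "loop.mid" (the file name is purely illustrative):

    import AVFoundation

    // Minimal sketch: an AVAudioUnitSampler driven by an AVAudioSequencer.
    let engine = AVAudioEngine()
    let sampler = AVAudioUnitSampler()
    engine.attach(sampler)
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)

    let sequencer = AVAudioSequencer(audioEngine: engine)
    do {
        if let midiURL = Bundle.main.url(forResource: "loop", withExtension: "mid") {
            try sequencer.load(from: midiURL, options: [])
            sequencer.prepareToPlay()
            try engine.start()
            try sequencer.start()
        }
    } catch {
        print("Playback setup failed:", error)
    }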
0
0
111
Jun ’25
Essentials of macOS to read and write MP3 and MP4 audio files
Hi, on macOS I used to open MP3 and MP4 files with ExtAudioFile, but for a few years now that hasn't worked. So I decided to try a different macOS API, using the AudioFileID of the AudioToolbox framework, and wrote a test: https://gist.github.com/joelkraehemann/7f5b241b52ca38c3a765c138fb647588 It fails right at AudioFileOpenWithCallbacks(), returning OSStatus error 1954115647, which means kAudioFileUnsupportedFileTypeError. The filename was set to an MP4 file: ~/Music/test.mp4. How do I fix this? Regards, Joël
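One thing worth checking (a hedged sketch, not a confirmed fix): AudioFileOpenWithCallbacks receives no file name, so AudioToolbox cannot infer the container type unless the inFileTypeHint argument is supplied. Opening by URL with an explicit MPEG-4 type hint sidesteps the inference problem entirely; the path below is the one from the post:

    import AudioToolbox
    import Foundation

    // Sketch: open an MP4 audio file with an explicit file-type hint so
    // AudioToolbox doesn't have to guess the container format.
    var fileID: AudioFileID?
    let path = NSString(string: "~/Music/test.mp4").expandingTildeInPath
    let url = URL(fileURLWithPath: path) as CFURL
    let status = AudioFileOpenURL(url, .readPermission, kAudioFileMPEG4Type, &fileID)
    if status == noErr, let fileID {
        // ... read packets with AudioFileReadPacketData ...
        AudioFileClose(fileID)
    } else {
        print("AudioFileOpenURL failed with status:", status)
    }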
1
0
190
Jun ’25
Is it possible to show a permission dialog when the device is locked?
Hello, I'm trying to handle the following use case right after app installation: display a microphone permission modal on the lock screen before the user answers an incoming call notification. However, I've been searching for a way to show permission modals while the screen is locked and couldn't find any solution in other forums or the documentation. I've also checked several calling apps, and it appears that none of them display permission modals either. Is this an OS limitation? Are there any workarounds available?
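As far as I know, system permission alerts cannot be presented over the lock screen, so the common pattern is to request microphone access ahead of time (e.g., during onboarding) so that incoming-call handling never needs to raise the alert. A minimal sketch:

    import AVFoundation

    // Sketch: prompt for microphone access while the app is in the
    // foreground, long before any lock-screen call arrives.
    AVAudioSession.sharedInstance().requestRecordPermission { granted in
        print("Microphone permission granted:", granted)
    }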
1
0
134
Jun ’25
AirPlay selection not working
I'm trying to implement AirPlay in my app. I can successfully play back sound and trigger the AirPlay selector sheet. If the target device is a Bluetooth-only device, I can connect with no problem and stream the audio to it. But if the device is an AirPlay-specific device such as a HomePod or an Apple TV, when I select it I get a spinning icon indicating that it is trying to connect, and eventually it times out and stops without connecting. I don't believe it is an AirPlay audio issue, because if I go to a different app (for example a podcast app), select my HomePods for output, and then switch back to my app, my audio will correctly stream to the HomePod. Not only that, my icon changes color to indicate that it is connected via AirPlay, and it correctly shows as connected. But I cannot then disconnect it using the AirPlay selector. The issue appears to be on the AirPlay selection side. I have spent several days attempting to troubleshoot it, mostly using ChatGPT to suggest code different from what I have to work around the issue. The suggestions mostly focus on the audio player section, but that doesn't seem to be where the problem really lies.
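If the session configuration is the culprit (an assumption — the post doesn't show the session setup), AirPlay-specific devices such as HomePod and Apple TV generally expect a .playback session, and long-form routing plus the system route picker is the usual recipe:

    import AVFoundation
    import AVKit

    // Hedged sketch: a .playback session with long-form routing, plus the
    // system route picker (the same selector sheet the built-in apps use).
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback,
                                                        mode: .default,
                                                        policy: .longFormAudio)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Session setup failed:", error)
    }

    let routePicker = AVRoutePickerView() // add to the view hierarchy to offer AirPlay selection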
2
0
230
Jun ’25
AVAudioSession dropping wired USBAudio source
My app inputs electrical waveforms from an IV485B39 2-channel USB device using an AVAudioSession. Before attempting to acquire data I make sure the input device is available as follows:

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryRecord error:&err];
    NSArray *inputs = [audioSession availableInputs];

I have been using this code for about 10 years. My app is scriptable, so a user can acquire data from the IV485B39 multiple times with various parameter settings (sampling rates and sample durations). Recently the scripts have been failing to complete, and what I have noticed is that when they fail, the list of available inputs is missing the USBAudio input. While debugging I have noticed that when things work properly the list of inputs includes both the internal microphone and the USBAudio device, as shown below:

    VIB_TimeSeriesViewController:***Available inputs = (
        "<AVAudioSessionPortDescription: 0x11584c7d0, type = MicrophoneBuiltIn; name = iPad Microphone; UID = Built-In Microphone; selectedDataSource = Front>",
        "<AVAudioSessionPortDescription: 0x11584cae0, type = USBAudio; name = 485B39 200095708064650803073200616; UID = AppleUSBAudioEngine:Digiducer.com :485B39 200095708064650803073200616:000957 200095708064650803073200616:1; selectedDataSource = (null)>"
    )

But when it fails I only see the built-in microphone:

    VIB_TimeSeriesViewController:***Available inputs = (
        "<AVAudioSessionPortDescription: 0x11584cef0, type = MicrophoneBuiltIn; name = iPad Microphone; UID = Built-In Microphone; selectedDataSource = Front>"
    )

If I only see the built-in microphone, I immediately repeat the three lines of code above, and most of the time "inputs" then contains both the internal microphone and the USBAudio device. This fix always works on my M2 iPad Pro and my iPhone 14, but some of my customers have older devices, and even with three tries they still get faults about one in ten tries. I rolled my code back to a released version from about 12 months ago, where I know we never had this problem, compiled it against the current libraries, and the problem still exists. I assume this is a problem caused by a change in the AVAudioSession framework. I need to find a way to work around the issue or get the library fixed.
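A defensive pattern that matches the retry behavior described above (a sketch in Swift; the retry count and delay are illustrative assumptions) is to poll availableInputs with a short backoff until the USB port enumerates, rather than trusting the first query after setCategory:

    import AVFoundation

    // Sketch: retry until the USB input appears in availableInputs.
    func waitForUSBInput(retries: Int = 5) -> AVAudioSessionPortDescription? {
        let session = AVAudioSession.sharedInstance()
        try? session.setCategory(.record)
        for _ in 0..<retries {
            if let usb = session.availableInputs?.first(where: { $0.portType == .usbAudio }) {
                return usb
            }
            Thread.sleep(forTimeInterval: 0.2) // give the route time to enumerate
        }
        return nil
    }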
0
0
94
May ’25
Audio session activation occasionally fails from CarPlay
I'm working on adding CarPlay support to an audio app and am running into an issue. Occasionally, when a user opens the app from CarPlay while the main app scene is either not connected or currently in the background, I receive an error when attempting to activate the audio session. The code below mimics my setup:

    do {
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .spokenAudio)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print(error) // NSOSStatusErrorDomain - 560557684: Session activation failed
    }

That error code maps to AVAudioSession.ErrorCode.cannotInterruptOthers. Once in this state, all subsequent attempts to play different pieces of content fail. However, things start working normally if the user opens the app on their phone and tries again from CarPlay (while the app is in the foreground on the phone). I'm not sure why it behaves this way, and I want to note that I do have the audio background mode capability enabled. Has anyone else encountered this? Are there any workarounds or changes I could make to prevent it from happening?
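One workaround worth trying (a hedged sketch, not a confirmed fix): detect .cannotInterruptOthers specifically and retry activation shortly afterward, deferring the attempt until playback is actually requested from CarPlay rather than activating eagerly at launch:

    import AVFoundation

    // Sketch: retry session activation on .cannotInterruptOthers (560557684).
    func activateSession() {
        let session = AVAudioSession.sharedInstance()
        do {
            try session.setCategory(.playback, mode: .spokenAudio)
            try session.setActive(true)
        } catch let error as NSError
            where error.code == AVAudioSession.ErrorCode.cannotInterruptOthers.rawValue {
            // CarPlay-only launches sometimes succeed on a second attempt.
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) { activateSession() }
        } catch {
            print("Audio session activation failed:", error)
        }
    }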
0
1
181
Apr ’25
Does USB Audio in macOS support 12 channels with 16-bit PCM samples?
I have a custom USB Audio Class 2 (UAC2) compatible device. When I connect this custom device to a MacBook with a configuration of up to 10 channels (16-bit), everything seems to work fine. However, when I increase the channel count to 12, the MacBook does not recognize the 12 channels. It only shows the channel count as 0. TN2274 is the only source where I found some information about Apple's Audio Class Drivers, but it doesn't mention any limitations regarding channel counts. Could you let me know the current limitations of the Audio Class Drivers on the latest macOS versions? What configuration should I use to get 12 channels working? P.S. I also found that a 12-channel, 8-bit configuration is detected by the MacBook, but I want it to work with 16-bit samples. For more detail please check FB17098863
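To confirm what macOS actually exposes for the device, the input stream configuration can be queried directly with Core Audio. A sketch using the default input device (status checks omitted for brevity; select your UAC2 device explicitly in real code):

    import CoreAudio

    // Sketch: count the input channels macOS exposes for the default input device.
    var deviceID = AudioDeviceID(0)
    var size = UInt32(MemoryLayout<AudioDeviceID>.size)
    var addr = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultInputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject), &addr, 0, nil, &size, &deviceID)

    addr.mSelector = kAudioDevicePropertyStreamConfiguration
    addr.mScope = kAudioObjectPropertyScopeInput
    AudioObjectGetPropertyDataSize(deviceID, &addr, 0, nil, &size)
    let bufList = UnsafeMutablePointer<AudioBufferList>.allocate(capacity: Int(size))
    defer { bufList.deallocate() }
    AudioObjectGetPropertyData(deviceID, &addr, 0, nil, &size, bufList)
    let channels = UnsafeMutableAudioBufferListPointer(bufList).reduce(0) { $0 + Int($1.mNumberChannels) }
    print("Input channels:", channels)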
6
0
261
Apr ’25
AVPlayerItem.externalMetadata not available
According to the documentation (https://developer.apple.com/documentation/avfoundation/avplayeritem/externalmetadata), AVPlayerItem should have an externalMetadata property. However, it does not appear to be visible to my app. When I try, I get: Value of type 'AVPlayerItem' has no member 'externalMetadata'. The documentation states iOS 12.2+; I am building with a minimum deployment target of iOS 18. Code snippet:

    import Foundation
    import AVFoundation

    // ... in function ...
    // create metadata as described in https://developer.apple.com/videos/play/wwdc2022/110338
    var title = AVMutableMetadataItem()
    title.identifier = .commonIdentifierAlbumName
    title.value = "My Title" as NSString?
    title.extendedLanguageTag = "und"

    var playerItem = await AVPlayerItem(asset: composition)
    playerItem.externalMetadata = [ title ]
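A likely explanation, worth verifying: externalMetadata is declared in AVKit's additions to AVPlayerItem, so the property is only visible when AVKit is imported; importing AVFoundation alone produces exactly this compiler error. A minimal sketch, with `composition` assumed from the post:

    import AVFoundation
    import AVKit // externalMetadata is declared in AVKit's AVPlayerItem additions

    func attachMetadata(to composition: AVComposition) -> AVPlayerItem {
        let title = AVMutableMetadataItem()
        title.identifier = .commonIdentifierAlbumName
        title.value = "My Title" as NSString
        title.extendedLanguageTag = "und"

        let playerItem = AVPlayerItem(asset: composition) // this initializer is not async
        playerItem.externalMetadata = [title]
        return playerItem
    }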
0
0
85
Apr ’25
iOS Mobile Video Audio Playback Issues in React
I'm experiencing issues with audio playback in my React video player component specifically on iOS mobile devices (iPhone/iPad). Even after implementing several recommended solutions, including Apple's own guidelines, the audio still isn't working properly on iOS Safari. It works completely fine on Android. On iOS, I ensured the video doesn't autoplay (it requires user interaction). Here are all the details:

Environment:
- iOS Safari (latest version)
- React 18
- TypeScript
- Video files: MP4 with AAC audio codec

Current Implementation:

    const VideoPlayer: React.FC<VideoPlayerProps> = ({
      src,
      autoplay = true,
    }) => {
      const videoRef = useRef<HTMLVideoElement>(null);
      const isIOSDevice = isIOS(); // Custom iOS detection
      const [touchStartY, setTouchStartY] = useState<number | null>(null);
      const [touchStartTime, setTouchStartTime] = useState<number | null>(null);

      // Handle touch start event for gesture detection
      const handleTouchStart = (e: React.TouchEvent) => {
        setTouchStartY(e.touches[0].clientY);
        setTouchStartTime(Date.now());
      };

      // Handle touch end event with gesture validation
      const handleTouchEnd = (e: React.TouchEvent) => {
        if (touchStartY === null || touchStartTime === null) return;
        const touchEndY = e.changedTouches[0].clientY;
        const touchEndTime = Date.now();
        // Validate if it's a legitimate tap (not a scroll)
        const verticalDistance = Math.abs(touchEndY - touchStartY);
        const touchDuration = touchEndTime - touchStartTime;
        // Only trigger for quick taps (< 200ms) with minimal vertical movement
        if (touchDuration < 200 && verticalDistance < 10) {
          handleVideoInteraction(e);
        }
        setTouchStartY(null);
        setTouchStartTime(null);
      };

      // Simplified video interaction handler following Apple's guidelines
      const handleVideoInteraction = (e: React.MouseEvent | React.TouchEvent) => {
        console.log('Video interaction detected:', {
          type: e.type,
          timestamp: new Date().toISOString()
        });
        // Ensure keyboard is dismissed (iOS requirement)
        if (document.activeElement instanceof HTMLElement) {
          document.activeElement.blur();
        }
        e.stopPropagation();
        const video = videoRef.current;
        if (!video || !video.paused) return;
        // Attempt playback in response to user gesture
        video.play().catch(err => console.error('Error playing video:', err));
      };

      // Effect to handle video source and initial state
      useEffect(() => {
        console.log('VideoPlayer props:', { src, loadingState });
        setError(null);
        setLoadingState('initial');
        setShowPlayButton(false); // Never show custom play button on iOS
        if (videoRef.current) {
          // Set crossOrigin attribute for CORS
          videoRef.current.crossOrigin = "anonymous";
          if (autoplay && !hasPlayed && !isIOSDevice) {
            // Only autoplay on non-iOS devices
            dismissKeyboard();
            setHasPlayed(true);
          }
        }
      }, [src, autoplay, hasPlayed, isIOSDevice]);

      return (
        <Paper
          shadow="sm"
          radius="md"
          withBorder
          onClick={handleVideoInteraction}
          onTouchStart={handleTouchStart}
          onTouchEnd={handleTouchEnd}
        >
          <video
            ref={videoRef}
            autoPlay={!isIOSDevice && autoplay}
            playsInline
            controls
            crossOrigin="anonymous"
            preload="auto"
            onLoadedData={handleLoadedData}
            onLoadedMetadata={handleMetadataLoaded}
            onEnded={handleVideoEnd}
            onError={handleError}
            onPlay={dismissKeyboard}
            onClick={handleVideoInteraction}
            onTouchStart={handleTouchStart}
            onTouchEnd={handleTouchEnd}
            {...(!isFirefoxBrowser && {
              "x-webkit-airplay": "allow",
              "x-webkit-playsinline": true,
              "webkit-playsinline": true
            })}
          >
            <source src={videoSrc} type="video/mp4" />
          </video>
        </Paper>
      );
    };

Apple's Guidelines Implementation:
- Removed custom play controls on iOS
- Using native video controls for user interaction
- Ensuring audio playback is triggered by user gesture
- Following Apple's audio session guidelines
- Properly handling the canplaythrough event

Current Behavior:
- Video plays but without sound on iOS mobile
- Mute/unmute button in native video controls doesn't work
- Audio works fine on desktop browsers and Android devices
- Videos are confirmed to have AAC audio codec
- No console errors related to audio playback
- User interaction doesn't trigger audio as expected

Questions:
- Are there any additional iOS-specific requirements I'm missing?
- Could this be related to iOS audio session handling?
- Are there known issues with React's handling of video elements on iOS?
- Should I be implementing additional audio context initialization?

Any insights or suggestions would be greatly appreciated!
0
0
486
Mar ’25
How to show/hide Now Playing button on CPTabBarTemplate
I'm working on adding CarPlay support to an audio app and I'd like to mimic the behavior of the Apple Music app on launch. Forgive me, but I think using Gherkin syntax here will best describe the desired behavior:

    GIVEN the Apple Music app is in a cold state (not launched or in memory)
    AND another audio app is actively playing audio
    WHEN I launch the Apple Music app from CarPlay
    THEN the Now Playing template is shown via a push
    AND the appropriate Now Playing info is shown
    AND the Now Playing button is shown on the tab bar
    AND the actively playing audio from the other audio app is NOT interrupted

The current behavior in my own app is that I can push the Now Playing template and fill out the MPNowPlayingInfoCenter info dictionary, but it won't render the info or show the Now Playing button on the tab bar until I start playing audio. Also, is there a way to hide the Now Playing button after the queue of content has finished playing? I'm able to pop the Now Playing template, but the Now Playing button is still present, and tapping it navigates the user to the now-blank Now Playing template.
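For reference, a sketch of the pieces involved (the metadata values are illustrative): CarPlay surfaces the Now Playing tab-bar button only once the app is the system's "now playing" app, which in my understanding requires an active audio session with playback underway; populating the info center alone isn't enough, matching the behavior described above.

    import CarPlay
    import MediaPlayer

    // Sketch: push the shared Now Playing template and fill the info center.
    func showNowPlaying(from interfaceController: CPInterfaceController) {
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [
            MPMediaItemPropertyTitle: "Episode Title",          // illustrative
            MPMediaItemPropertyPlaybackDuration: 1800,          // illustrative
            MPNowPlayingInfoPropertyElapsedPlaybackTime: 0,
        ]
        interfaceController.pushTemplate(CPNowPlayingTemplate.shared,
                                         animated: true,
                                         completion: nil)
    }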
1
0
360
Mar ’25
USB Audio Input Blocked When Screen Mirroring via AirPlay
Hello Apple Developers, I am experiencing an issue where USB audio input (e.g., an external USB microphone) is blocked when using AirPlay screen mirroring from my iPhone to a Mac or Apple TV. However, the built-in microphone continues to work without any problem.

Issue Details:
- I am using an iPhone 15 (or latest device) running iOS [your iOS version].
- I connect a USB audio interface/microphone via a USB-C adapter or Lightning adapter.
- The USB microphone works perfectly for audio input before starting AirPlay.
- The moment I enable AirPlay screen mirroring, the USB microphone stops working and disappears from the available audio input sources.
- The built-in microphone continues to function, but I cannot use the USB microphone while mirroring.
- When I stop screen mirroring, the USB microphone immediately becomes available again.

Expected Behavior: I would expect iOS to allow me to continue using an external USB microphone while mirroring my screen, just as it allows the built-in microphone to work.

Questions:
- Is this an intentional restriction in iOS?
- Is there any workaround to enable USB audio input while using AirPlay screen mirroring?
- Is there a way to request a feature or configuration option to allow external USB microphones during AirPlay?

I appreciate any insights or guidance from the Apple team or fellow developers. Thanks in advance! Best regards,
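As far as I know there is no public switch to keep USB input alive during mirroring, but the disappearance can at least be detected and handled gracefully. A sketch that observes route changes and checks whether the USB input is still present:

    import AVFoundation

    // Sketch: observe route changes and fall back when the USB input vanishes.
    NotificationCenter.default.addObserver(
        forName: AVAudioSession.routeChangeNotification,
        object: nil,
        queue: .main) { _ in
        let inputs = AVAudioSession.sharedInstance().availableInputs ?? []
        let hasUSB = inputs.contains { $0.portType == .usbAudio }
        print("USB input available:", hasUSB) // switch to built-in mic if false
    }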
0
0
322
Mar ’25
Why can't I download my music?
A month ago I was listening to music in Apple Music when I saw a song (nothing weird) and wanted to download it, but it took longer than usual and then an error appeared. I can't upload a picture of the text, but it says: "The Song Name / Album / Artist 0% - Stopped (err = -12884)". If I uninstall the app and then reinstall it, the problem is fixed, but when I try to enable Lossless Audio the error appears again. The error is still appearing today, and I'm tired of uninstalling and reinstalling the app so many times. Can anyone help me?
1
0
352
Feb ’25
Distorted Audio When Recording External Mics With AVCaptureSession and AVAssetWriter
I’m working on a macOS app, written in Swift. My goal is to record audio from an external microphone, e.g., one connected via USB. For this, I’m using an AVCaptureSession and recording its output with an AVAssetWriter. This works perfectly in principle (and reliably with internal microphones, for example). The problem occurs after my app has successfully completed the first recording and I then want to make additional recordings (which makes me think it might be process-dependent, because it works again after restarting the app). The problem: noisy or distorted-sounding audio files. In addition, the following error message appears in the Console from CoreAudio / its AudioConverter:

    Input data proc returned inconsistent 512 packets for 2048 bytes; at 3 bytes per packet, that is actually 682 packets

It is easy to reproduce. The problem occurs even if I don’t configure the AVAssetWriter manually and instead let it receive its audioSettings from an AVOutputSettingsAssistant preset. I’m running macOS 15.0 (24A335). I’ve filed a feedback including a demo project → FB15333298 🎟️ I would greatly appreciate any help! Have a great day, Martin
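Given the "works again after restarting the app" observation, one workaround to try (a sketch under that assumption, not a confirmed fix) is to discard and rebuild the entire AVCaptureSession between recordings instead of reusing it, so each recording starts with a fresh audio pipeline; the asset-writer and sample-buffer-delegate wiring is omitted here:

    import AVFoundation

    // Sketch: tear the capture session down completely between recordings.
    final class Recorder {
        private var session: AVCaptureSession?

        func beginRecording(with device: AVCaptureDevice) throws {
            let session = AVCaptureSession()
            session.beginConfiguration()
            let input = try AVCaptureDeviceInput(device: device)
            if session.canAddInput(input) { session.addInput(input) }
            let output = AVCaptureAudioDataOutput()
            if session.canAddOutput(output) { session.addOutput(output) }
            session.commitConfiguration()
            session.startRunning()
            self.session = session
        }

        func endRecording() {
            session?.stopRunning()
            session = nil // drop the session rather than reusing it
        }
    }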
6
0
915
Feb ’25
Best Approach for Reliable Background Audio Playback with Audio Ducking on Command from Server
I am developing an iOS app that needs to play spoken audio on demand from a server while ducking the audio of background music from another app (e.g., SoundtrackYourBrand or Apple Music). This must work even when the app is in the background, and the server dictates when and what audio is played. Ideally, the message should be played within a minute of the server requesting it.

Current Attempt & Observations: I initially tried using Firebase Cloud Messaging (FCM) silent notifications to send a URL to an audio file, which the app would then play using AVPlayer. This works consistently when the app is active, but in the background it only works about 60% of the time. In the cases where it fails, iOS ducks the background music (e.g., from SoundtrackYourBrand) but never plays the spoken audio. Interestingly, when I play the audio without enabling audio ducking, it seems to work 100% of the time in my limited testing, even in the background. The app has background modes enabled for Audio, Background Fetch, and Remote Notifications.

Best Approach to Achieve This? I’d like guidance on the best Apple-compliant approach to reliably play audio on command from the server, even when the app is in the background. Some possible paths:
- Ensuring the app remains active in the background – are there recommended ways to prevent the app from getting suspended, such as background tasks, a special background mode, or a persistent connection to the server?
- Alternative triggering mechanisms – would something like VoIP, Push to Talk, or another background service be better suited to this use case?
- Built-in iOS speech synthesis (AVSpeechSynthesizer) – if playing external audio is unreliable, would generating speech dynamically from text be a more robust approach?
- Streaming audio instead of sending a URL – could continuous streaming from the server keep the app active and allow playback at the right moment?

I want to ensure the solution is reliable and works 100% of the time when needed. Any recommendations on the best approach would be greatly appreciated. Thank you for your time and guidance.
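For the ducking half of this, the usual pattern (a minimal sketch; it does not solve the background-delivery reliability question) is to activate a .playback session with .duckOthers only while the message plays, then deactivate with .notifyOthersOnDeactivation so the music app's volume is restored:

    import AVFoundation

    // Sketch: duck other audio only for the duration of the spoken message.
    func playSpokenAudio(from url: URL, player: AVPlayer) throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, options: [.duckOthers])
        try session.setActive(true)
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }

    // Call when playback finishes so the ducked app returns to full volume.
    func finishSpokenAudio() throws {
        try AVAudioSession.sharedInstance().setActive(false,
                                                      options: [.notifyOthersOnDeactivation])
    }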
0
1
398
Feb ’25
Why is the volume very low when using the real-time recording and playback feature with AEC?
I’ve been researching how to achieve, in iOS, a recording-and-playback effect similar to the hands-free (speakerphone) calling effect in the system Phone app. How can this be implemented? I tried using the voice-chat recording method, but found that the speaker output volume is too low. How should this issue be addressed? I couldn’t find a suitable API. Could you provide me with some documentation or sample code? Thank you.
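For a speakerphone-style effect with echo cancellation, the usual combination (a hedged sketch) is .playAndRecord with the .voiceChat mode, which enables AEC, plus .defaultToSpeaker, since .voiceChat otherwise routes output to the quiet earpiece receiver, which matches the low-volume symptom:

    import AVFoundation

    // Sketch: AEC via .voiceChat, routed to the loudspeaker instead of the receiver.
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord,
                                mode: .voiceChat,
                                options: [.defaultToSpeaker])
        try session.setActive(true)
    } catch {
        print("Session setup failed:", error)
    }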
1
0
428
Feb ’25
PTTFramework w/ AVAudioSession
Hi all, I have spent a lot of time reading the tech note and watching the WWDC video that introduce the PTTFramework on iOS. I currently have a custom setup where I am using AVAudioEngine to schedule and play buffers that are being streamed through a call. I am looking to use the PTTFramework to allow a user to trigger this push-to-talk behavior from the lock screen and the various places the system UI provides. However, I am unsure what the correct behavior is regarding handling of the audio session. Right now I am using .playback when there is no active voice transmission, so that devices such as AirPods can be in A2DP mode where applicable, and then transitioning to the .playAndRecord category only when the mic input should become active. Following this change in my AVAudioEngine manager, I am then manually activating and deactivating the audio session when the engine is either playing/recording or idle. The documentation states that you should not attempt to activate or deactivate the audio session directly, but should allow the framework to handle it. Does that mean I need to either call the request-to-transmit delegate function or set an active participant on the channel manager first, and then wait for the didBecomeActive delegate method to trigger before I actually attempt to play or record any audio? (I am currently using full-duplex mode.) I noticed that the delegate method only triggers if the audio session wasn't active before doing one of the above (setting an active participant, requesting transmit). Lastly, the PTTFramework documentation also mentions support for PTT accessories, and I notice the didBeginTransmittingFrom parameter has a handsfreeButton case. Is there any documentation or resources on what is actually supported out of the box for this? I am currently handling a lot of the push-to-talk behavior over Bluetooth LE, and wanted to make sure there isn't overlap with what the system provides. Thank you!
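Based on the PushToTalk framework's delegate API, the intended pattern appears to be exactly that: start the engine only from the framework's activation callback (channelManager(_:didActivate:)) and stop it on deactivation, never calling setActive yourself. A sketch of that gating, with the remaining required delegate methods stubbed for brevity:

    import PushToTalk
    import AVFAudio

    final class PTTManager: NSObject, PTChannelManagerDelegate {
        let audioEngine = AVAudioEngine()

        // Gate the engine on the framework's activation callbacks
        // instead of activating/deactivating the session manually.
        func channelManager(_ channelManager: PTChannelManager, didActivate audioSession: AVAudioSession) {
            try? audioEngine.start()
        }
        func channelManager(_ channelManager: PTChannelManager, didDeactivate audioSession: AVAudioSession) {
            audioEngine.stop() // never call setActive(false) yourself
        }

        // Remaining required delegate methods, stubbed for brevity.
        func channelManager(_ channelManager: PTChannelManager, didJoinChannel channelUUID: UUID, reason: PTChannelJoinReason) {}
        func channelManager(_ channelManager: PTChannelManager, didLeaveChannel channelUUID: UUID, reason: PTChannelLeaveReason) {}
        func channelManager(_ channelManager: PTChannelManager, channelUUID: UUID, didBeginTransmittingFrom source: PTChannelTransmitRequestSource) {}
        func channelManager(_ channelManager: PTChannelManager, channelUUID: UUID, didEndTransmittingFrom source: PTChannelTransmitRequestSource) {}
        func channelManager(_ channelManager: PTChannelManager, receivedEphemeralPushToken pushToken: Data) {}
        func incomingPushResult(channelManager: PTChannelManager, channelUUID: UUID, pushPayload: [String: Any]) -> PTPushResult { .leaveChannel }
    }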
2
0
617
Feb ’25