Integrate music and other audio content into your apps.

Posts under the Audio tag

85 Posts
Post not yet marked as solved
14 Replies
10k Views
I am experiencing an issue where my Mac's speakers crackle and pop when running an app in the Simulator, or even when previewing SwiftUI with Live Preview. I am using a 16" MacBook Pro (i9) and I'm running Xcode 12.2 on Big Sur (11.0.1). Killing coreaudiod temporarily fixes the problem, but this is not much of a solution. Is anyone else having this problem?
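For anyone who wants to try the workaround mentioned above: the Core Audio daemon can be restarted from Terminal (launchd respawns it automatically). This is just the command the post refers to, not a real fix:

```
sudo killall coreaudiod
```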
Post not yet marked as solved
1 Reply
2.3k Views
Does anyone have a working example of how to play OGG files with Swift? I've been trying for over a year now. I was able to wrap the C Vorbis library in Swift and used it to parse an OGG file successfully. Then I was required to use Objective-C++ to fill the PCM, because this method seems to only be available in C++, and that part hangs my app for a good 40 seconds to several minutes depending on the audio file; it then plays for about 2 seconds and crashes.

I can't get the examples on the Vorbis site to work in Objective-C, and I tried every example on GitHub I could find (most of which are for iOS; I want to play the files on Mac). I also tried the Cricket Audio framework: https://github.com/sjmerel/ck. It has a Swift example and can play its proprietary soundbank format, and it is also supposed to play OGG, but it just doesn't do anything when trying to play OGG, as you can see in the posted issue https://github.com/sjmerel/ck/issues/3.

Right now I believe every player that can play OGGs on Mac is written in Objective-C or C++. Anyway, any help/advice is appreciated. The OGG format is very prevalent in the gaming community. I could use Unity, which I believe plays OGGs through the Mono framework, but I really, really want to stay in Swift.
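Not a full answer, but here is a minimal sketch of the decode-everything-up-front approach, assuming your Swift wrapper already exposes vorbisfile's `ov_fopen`, `ov_info`, `ov_pcm_total`, `ov_read_float`, and `ov_clear` (the bridging setup is not shown, and the function names come from the C library, not from any posted code). Decoding the whole file into one AVAudioPCMBuffer avoids streaming and Objective-C++ entirely:

```swift
import AVFoundation

// Sketch: decode an entire OGG file into a Float32 AVAudioPCMBuffer.
// Assumes libvorbisfile is visible to Swift via a bridging header/module map.
func loadOgg(path: String) -> AVAudioPCMBuffer? {
    var vf = OggVorbis_File()
    guard ov_fopen(path, &vf) == 0 else { return nil }
    defer { ov_clear(&vf) }

    guard let info = ov_info(&vf, -1)?.pointee else { return nil }
    let channels = AVAudioChannelCount(info.channels)
    let frames = AVAudioFrameCount(ov_pcm_total(&vf, -1))

    // The "standard" format is deinterleaved Float32, matching ov_read_float.
    guard let format = AVAudioFormat(standardFormatWithSampleRate: Double(info.rate),
                                     channels: channels),
          let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames)
    else { return nil }

    var bitstream: Int32 = 0
    var written = 0
    while written < Int(frames) {
        var pcm: UnsafeMutablePointer<UnsafeMutablePointer<Float>?>? = nil
        let got = ov_read_float(&vf, &pcm, 4096, &bitstream)
        if got <= 0 { break }               // 0 = end of file, < 0 = stream error
        for ch in 0..<Int(channels) {       // copy each decoded channel plane
            memcpy(buffer.floatChannelData![ch] + written,
                   pcm![ch]!,
                   got * MemoryLayout<Float>.size)
        }
        written += got
    }
    buffer.frameLength = AVAudioFrameCount(written)
    return buffer
}
```

The returned buffer can then be scheduled on an AVAudioPlayerNode (`player.scheduleBuffer(buffer)`) inside a running AVAudioEngine.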
Post not yet marked as solved
1 Reply
1.1k Views
How can I do FFT processing in the AudioQueue output callback (AudioQueueOutputCallback)? Something like this:

```objc
static void audioQueueOutputCallBack(void *input, AudioQueueRef inQueue, AudioQueueBufferRef outQueueBuffer) {
    SYAudioQueue *aq = (__bridge SYAudioQueue *)input;
    dispatch_semaphore_wait(aq->m_mutex, DISPATCH_TIME_FOREVER);
    [aq enterQueue:inQueue withBuffer:outQueueBuffer];
    dispatch_semaphore_signal(aq->m_mutex);
}
```

I know that AVAudioEngine can do FFT processing on an AVAudioPCMBuffer. Alternatively, how can I convert an AudioQueueBufferRef to an AVAudioPCMBuffer?
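For the conversion question, here is a hedged sketch (in Swift, for illustration): build an AVAudioPCMBuffer from the queue's AudioStreamBasicDescription and copy the queue buffer's bytes into it. `asbd` stands for whatever format you created the queue with (an assumption, not from the post), and the single memcpy assumes an interleaved linear-PCM format, which audio queues typically use:

```swift
import AVFoundation

func pcmBuffer(from queueBuffer: AudioQueueBufferRef,
               asbd: AudioStreamBasicDescription) -> AVAudioPCMBuffer? {
    var desc = asbd
    guard let format = AVAudioFormat(streamDescription: &desc) else { return nil }

    let byteCount = Int(queueBuffer.pointee.mAudioDataByteSize)
    let frameCount = AVAudioFrameCount(byteCount / Int(desc.mBytesPerFrame))
    guard let pcm = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount)
    else { return nil }
    pcm.frameLength = frameCount

    // Interleaved LPCM lives in one contiguous block on both sides.
    memcpy(pcm.mutableAudioBufferList.pointee.mBuffers.mData,
           queueBuffer.pointee.mAudioData,
           byteCount)
    return pcm
}
```

From there the buffer can be handed to whatever FFT path you already use for AVAudioPCMBuffer (e.g. Accelerate/vDSP on `floatChannelData`).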
Post not yet marked as solved
11 Replies
8.8k Views
An error is reported when playing HTML5 audio or video elements in WKWebView:

Error acquiring assertion: Error Domain=RBSAssertionErrorDomain Code=3 "Required client entitlement is missing" UserInfo={RBSAssertionAttribute=RBSDomainAttribute| domain:"com.apple.webkit" name:"MediaPlayback" sourceEnvironment:"(null)", NSLocalizedFailureReason=Required client entitlement is missing}

After that, the performance of the web view becomes very poor.

My HTML file contains an audio element and a button; clicking the button plays the audio:

```html
<body>
  <button onclick="handleClick()">PLAY</button>
  <audio id="audio" src="https://ac-dev.oss-cn-hangzhou.aliyuncs.com/test-2022-music.mp3"></audio>
  <script>
    function handleClick() {
      document.getElementById("audio").play();
    }
  </script>
</body>
```

My demo app creates a WKWebView to load that HTML file:

```swift
class ViewController: UIViewController, WKUIDelegate {
    var webView: WKWebView!

    override func loadView() {
        let config = WKWebViewConfiguration()
        config.preferences.javaScriptEnabled = true
        config.allowsInlineMediaPlayback = true
        webView = WKWebView(frame: .zero, configuration: config)
        webView.uiDelegate = self
        view = webView
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        let myURL = URL(string: "https://ac-dev.oss-cn-hangzhou.aliyuncs.com/test-2022-py.html")
        let myRequest = URLRequest(url: myURL!)
        webView.load(myRequest)
    }
}
```

Tapping the button in the HTML plays the audio, and the error shows up in the Xcode console:

iPadN[2133:855729] [assertion] Error acquiring assertion: Error Domain=RBSAssertionErrorDomain Code=3 "Required client entitlement is missing" UserInfo={RBSAssertionAttribute=RBSDomainAttribute| domain:"com.apple.webkit" name:"MediaPlayback" sourceEnvironment:"(null)", NSLocalizedFailureReason=Required client entitlement is missing}

To sum up: this error appears when playing audio or video in HTML; afterwards, app performance drops a lot and the interactive response becomes very slow.
Post not yet marked as solved
1 Reply
2.2k Views
I've noticed that enabling voice processing on AVAudioInputNode changes the node's format, most noticeably the channel count.

```swift
let inputNode = avEngine.inputNode
print("Format #1: \(inputNode.outputFormat(forBus: 0))")
// Format #1: <AVAudioFormat 0x600002bb4be0:  1 ch,  44100 Hz, Float32>
try! inputNode.setVoiceProcessingEnabled(true)
print("Format #2: \(inputNode.outputFormat(forBus: 0))")
// Format #2: <AVAudioFormat 0x600002b18f50:  3 ch,  44100 Hz, Float32, deinterleaved>
```

Is this expected? How should I interpret these channels? My input device is an aggregate device where each channel comes from a different microphone, and I record each channel to a separate file. When voice processing messes with the channel layout, I can no longer rely on this.
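For anyone hitting the same thing, a small sketch (my own, under the assumption that the format swap happens at enable time): whatever the answer to the layout question turns out to be, any tap on the node has to be installed with the format re-queried after toggling voice processing, or the tap's buffers won't match:

```swift
import AVFoundation

let avEngine = AVAudioEngine()
let inputNode = avEngine.inputNode

try! inputNode.setVoiceProcessingEnabled(true)

// Re-query the format *after* toggling voice processing.
let vpFormat = inputNode.outputFormat(forBus: 0)
inputNode.installTap(onBus: 0, bufferSize: 4096, format: vpFormat) { buffer, _ in
    // channelCount reflects the voice-processed layout (3 ch in the post),
    // not the aggregate device's original per-microphone channels.
    print(buffer.format.channelCount)
}
```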
Post not yet marked as solved
1 Reply
1.6k Views
I am trying to figure out why there is no audio on my iPhone or iPad; my code works on other devices. I am testing on an iPad with iOS 15.3.1 and on my computer using Safari. Video works, and both video and audio work on Android, Chrome, etc. This is an audio-only problem on iOS.

My WebRTC code creates HTML5 audio elements like this:

```html
<audio muted="false" autoplay="1" id="xxxx"></audio>
```

When debugging with the iPad connected, I run this volume check:

```js
document.getElementById('***').volume
```

It returns 1, so the volume is at its loudest (HTML5 audio volume ranges from 0 to 1).

```js
document.getElementById('***').ended
```

returns false. Next I call the play() function:

```js
$('#***')[0].play()
  .then((resp) => {
    console.log("Success");
    console.log(resp);
  })
  .catch(error => { console.log(error); });
```

It executes the success response, but there is still no sound. What could be causing this issue on iOS and Safari only?
Post not yet marked as solved
1 Reply
1.6k Views
Hi, I have multiple audio files and I want to decide which channel goes to which output. For example, how can I route four 2-channel audio files to an 8-channel output? Also, if I have an AVAudioPlayerNode playing a 2-channel track through headphones, can I flip the channels on the output for playback, i.e. swap left and right? I have read the following thread, which seeks to do something similar, but it is from 2012 and I do not quite understand how it would work today. Many thanks; I am a bit stumped.
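One possible direction, offered as a sketch under assumptions rather than a verified answer: Core Audio's kAudioOutputUnitProperty_ChannelMap lets you tell an output unit which source channel feeds each hardware channel (one entry per hardware channel; -1 silences it). Applied to AVAudioEngine's output unit, a left/right swap might look like this; the correct scope/element can vary by unit, so treat it as a starting point:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: nil)

if let outputUnit = engine.outputNode.audioUnit {
    // One Int32 per hardware output channel, valued with the source channel
    // index that should feed it. [1, 0] swaps left and right on stereo;
    // an 8-entry map could route four stereo sources on an 8-channel device.
    var channelMap: [Int32] = [1, 0]
    AudioUnitSetProperty(outputUnit,
                         kAudioOutputUnitProperty_ChannelMap,
                         kAudioUnitScope_Global,
                         0,
                         &channelMap,
                         UInt32(MemoryLayout<Int32>.size * channelMap.count))
}
```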
Post not yet marked as solved
9 Replies
3.6k Views
I am getting an error in iOS 16 that doesn't appear in previous iOS versions. I am using RemoteIO to play back live audio at 4000 Hz. The error is the following:

Input data proc returned inconsistent 2 packets for 186 bytes; at 2 bytes per packet, that is actually 93 packets

This is how the audio format and the callback are set:

```objc
// Set the audio format
AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate = 4000;
audioFormat.mFormatID = kAudioFormatLinearPCM;
audioFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mFramesPerPacket = 1;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel = 16;
audioFormat.mBytesPerPacket = 2;
audioFormat.mBytesPerFrame = 2;

// Set the output callback
AURenderCallbackStruct callbackStruct;
callbackStruct.inputProc = playbackCallback;
callbackStruct.inputProcRefCon = (__bridge void * _Nullable)(self);
status = AudioUnitSetProperty(audioUnit,
                              kAudioUnitProperty_SetRenderCallback,
                              kAudioUnitScope_Global,
                              kOutputBus,
                              &callbackStruct,
                              sizeof(callbackStruct));
```

Note that the mSampleRate I set is 4000 Hz. In iOS 15 I get a buffer duration (IOBufferDuration) of 0.02322 seconds and 93 frames in each callback. This is expected, because:

number of frames / buffer duration = sample rate
93 / 0.02322 ≈ 4000 Hz

However, in iOS 16 I am getting the aforementioned error in the callback:

Input data proc returned inconsistent 2 packets for 186 bytes; at 2 bytes per packet, that is actually 93 packets

Since the number of frames equals the number of packets, I am getting 1 or 2 frames per callback while the buffer duration is still 0.02322 seconds. This didn't affect playback of the "raw" signal, but it did affect playback of the "processed" signal.

number of frames / buffer duration = sample rate
2 / 0.02322 ≈ 86 Hz

That doesn't make any sense. This error appears for other sampling rates too (8000, 16000, 32000), but not for 44100. However, I would like to keep 4000 Hz as my sampling rate. I have also tried to set the sampling rate using AVAudioSession's setPreferredSampleRate(_:), but the attempt didn't succeed: the sample rate was still 44100 after calling that function. Any help on this issue would be appreciated.
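For reference, a minimal sketch of the AVAudioSession attempt described above (the category is an assumption; adjust to your app). setPreferredSampleRate is only a request, and iOS hardware commonly stays at 44100 or 48000 Hz, in which case RemoteIO has to convert between the hardware rate and the 4000 Hz client format:

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord)
    try session.setPreferredSampleRate(4000)           // a request, not a guarantee
    try session.setPreferredIOBufferDuration(0.02322)  // 93 frames at 4 kHz
    try session.setActive(true)
    print("Granted sample rate: \(session.sampleRate)") // often still 44100
} catch {
    print("Session setup failed: \(error)")
}
```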
Post not yet marked as solved
2 Replies
1.2k Views
We noticed iOS 16 doesn't seem to support these commands anymore:

```swift
MPRemoteCommandCenter.shared().likeCommand
MPRemoteCommandCenter.shared().dislikeCommand
MPRemoteCommandCenter.shared().bookmarkCommand
```

Or is there another way to show a menu in lieu of the previous button on the lock screen?
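For context, a sketch of the standard way these commands get wired up (generic MPRemoteCommandCenter usage, not code from the post), which is what used to surface the buttons on the lock screen:

```swift
import MediaPlayer

let center = MPRemoteCommandCenter.shared()
center.likeCommand.isEnabled = true
center.likeCommand.localizedTitle = "Like"
_ = center.likeCommand.addTarget { _ in
    // Persist the user's feedback here.
    return .success
}
```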
Post not yet marked as solved
14 Replies
2.8k Views
Hello, after updating to iOS 16.4 I have major issues when trying to play music through my 2015 BMW over both Bluetooth and USB. I have used other phones with earlier iOS versions and they work flawlessly, so I know it's 16.4. I have tried restarting, updating the BMW software, and disconnecting/reconnecting, but no luck, as it's certainly a 16.4 issue. The problems are as follows (for Apple Music, Spotify, SoundCloud, any audio streaming):

- No album artwork
- No song title
- No album title
- No ability to change songs except on the phone
- When attempting to play a song, it only plays the first ~30 seconds or so before restarting at the first song in my library, over and over again

16.4 has made enjoying music in my car impossible. I have submitted two tickets in the Feedback app with no response, and when I try to contact Apple they just tell me to submit feedback and are unable to help. Hoping a dev or someone sees this and is able to fix it. Thank you.
Post not yet marked as solved
1 Reply
1k Views
Hi, I am using WKWebView to display some dynamic content made in the Cocos Creator game engine. For sound to play without user interaction, I was setting WKWebViewConfiguration's mediaTypesRequiringUserActionForPlayback to '0', i.e. no user interaction is required to play audio/video. This setting worked fine and sound autoplay kept working up until iOS 16.0; after updating my iPad to iOS 16.3.1, sound autoplay stopped working. Sound can only be played if I call AudioContext.resume() on user interaction. Can anyone advise how to get audio autoplay working correctly again? Thanks
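For reference, a minimal sketch of the configuration described above (in Swift; an empty WKAudiovisualMediaTypes set means no media type requires a user gesture):

```swift
import WebKit

let config = WKWebViewConfiguration()
config.mediaTypesRequiringUserActionForPlayback = []  // autoplay allowed for audio and video
config.allowsInlineMediaPlayback = true               // commonly paired; not from the post
let webView = WKWebView(frame: .zero, configuration: config)
```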
Post not yet marked as solved
0 Replies
661 Views
Hi Everyone! I am writing a script to help organize all of the songs in my library, and part of this workflow is calling the Get All Library Songs API. I tried both creating the requests.get call myself and using a modified fork of an Apple Music Python repo, which does work with other APIs (for example, searching for music). The problem is that in both cases I always get a 403 error saying that authentication is required. Here is some info about a request I made:

In addition to the above modifications, I tried running my Python script both standalone and wrapped in an Automator app (I created a shell script to run the Python script, then used Automator to wrap it in an application called test), and both resulted in the same error. I verified that the app has access to Media & Apple Music (and logged out and back into my personal Apple Music account to see if anything changed). However, my Apple Music account doesn't show the app at all, nor is there a prompt to allow it access to my account. My test app's Info.plist shows the corresponding flag to enable Apple Music access as well. Can anyone help identify what could be happening here? Is there an explicit permission I may need to enable to unblock me? Is there an issue with this API specifically? Thanks!
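One thing worth checking, offered as an assumption about the cause rather than a confirmed diagnosis: the /v1/me/library endpoints require a Music User Token in addition to the developer token, and omitting the Music-User-Token header yields a 403. A Swift sketch of the request shape (both token values are placeholders you must obtain yourself):

```swift
import Foundation

let developerToken = "<developer JWT from your MusicKit key>"   // placeholder
let musicUserToken = "<user token from MusicKit authorization>" // placeholder

var request = URLRequest(url: URL(string: "https://api.music.apple.com/v1/me/library/songs")!)
request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
request.setValue(musicUserToken, forHTTPHeaderField: "Music-User-Token")

URLSession.shared.dataTask(with: request) { _, response, _ in
    if let http = response as? HTTPURLResponse {
        print("Status: \(http.statusCode)")  // 403 without a valid user token
    }
}.resume()
```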
Post not yet marked as solved
1 Reply
431 Views
Hello, we have published the Voix Audio Recorder application. The app seems to work, but we have a very strange issue on one particular iPad device: the application requests permission for microphone access, but as you can see in the attached images, the microphone permission is missing entirely from the permission list. On similar iPad devices we have not noticed this issue. What could be the cause of this issue, and how can we avoid it?
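For comparison, a sketch of the standard request path (generic AVAudioSession usage, not code from the app): Settings normally lists the microphone row only after the app has actually triggered this prompt at least once, and the prompt itself requires an NSMicrophoneUsageDescription entry in Info.plist:

```swift
import AVFoundation

AVAudioSession.sharedInstance().requestRecordPermission { granted in
    // First call shows the system prompt; Settings lists the permission afterwards.
    print("Microphone access granted: \(granted)")
}
```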
Post not yet marked as solved
0 Replies
459 Views
My application's music/media player is defined in my app as:

```swift
@State var musicPlayer = MPMusicPlayerController.applicationMusicPlayer
```

I want to change the playback rate and increase it slightly, but I'm unsure how to do this. The documentation is here, which states it's part of the MPMediaPlayback protocol, but I'm unsure how to get this to work. I tried the following:

```swift
self.musicPlayer.currentPlaybackRate(1.2)
```

But I just get the following error:

Cannot call value of non-function type 'Float'

When I play a song (self.musicPlayer.play()), how can I set the playback rate at the same time, please?
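The compiler error itself points at the fix (a short note rather than tested guidance): currentPlaybackRate is a Float property on MPMediaPlayback, so it is assigned, not called:

```swift
self.musicPlayer.play()
self.musicPlayer.currentPlaybackRate = 1.2  // assign the property; don't call it
```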
Post not yet marked as solved
0 Replies
384 Views
To reproduce this error, first install a build version (for example, version 14). Then generate some internal audio files within the application. After that, if you update the app to a new build (for example, version 15), you will notice that some of the audio files are not persisted. I made a YouTube video to explain my point: https://youtu.be/fbZ5okq2ddo

I am a Flutter developer. The problem is that after the build update, some data persists while other data does not. The program generates several audio files during normal use. Some of these audio files are generated directly by recording with a microphone, while others are generated by concatenating pre-existing audio files. Interestingly, the audio files generated by concatenation are the ones not persisting after the build update.

Here is the path of one of the audio files that is not persisting:
/var/mobile/Containers/Data/Application/F7288BFF-6A62-49BF-961C-615C17DAE0FE/Library/Application Support/guerreirodafogueiragmailcom_autocura_folder/transMentaltmenv4yT2uHbj8G9p6T.mp3

And here is the path of one of the audio files that is persisting:
/var/mobile/Containers/Data/Application/F7288BFF-6A62-49BF-961C-615C17DAE0FE/Library/Application Support/guerreirodafogueiragmailcom_autocura_folder/revoltas.mp3

Other data, such as lists of strings and images, is not lost. Any suggestions to solve this issue?

The iPhone used to run the app: iPhone SE, iOS 15.7.5. My Xcode version is 14.2 (14C18). My flutter doctor result:
[✓] Flutter (Channel stable, 3.0.5, on macOS 13.2.1 22D68 darwin-arm, locale pt-BR)
[✓] Android toolchain - develop for Android devices (Android SDK version 32.0.0)
[✓] Xcode - develop for iOS and macOS (Xcode 14.2)
[✓] Chrome - develop for the web
[✓] Android Studio (version 2021.2)
[✓] VS Code (version 1.77.0)
[✓] Connected device (3 available)
[✓] HTTP Host Availability
Post not yet marked as solved
0 Replies
425 Views
Hi, does anyone know if there are widespread issues with obtaining the audio entitlement for CarPlay? I know a lot of developers have requested this entitlement without any reply from Apple. Thanks for helping me! Andrea
Post not yet marked as solved
0 Replies
565 Views
Hello, here is an issue I encountered recently. Does anybody have feedback on this?

Issue encountered: AVAudioFile throws when opening WAV files and MPEG-DASH files with a .mp3 extension, but works fine with many other tested combinations of format and extension (for example, an AIFF file with a .mp3 extension is read by AVAudioFile without error). The Music app, AVAudioFile, and ExtAudioFile all fail on the same files. However, previewing an audio file in Finder (select the file and hit the space bar) works regardless of the file extension.

Why do I consider this an issue? AVAudioFile seems to rely on the extension sometimes, but not always, to guess the audio format of the file, which leads to unexpected errors. I would expect AVAudioFile to deal properly with wrong extensions for all supported audio formats. ⚠️ This behaviour can cause real trouble in iOS and macOS applications using audio files coming from the user, which often have unreliable extensions.

I published some code to easily reproduce the issue: https://github.com/ThomasHezard/AVAudioFileFormatIssue

Thank you everybody, have a great day 😎
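As a possible workaround, sketched under assumptions rather than verified against the repro repository: the lower-level AudioFileOpenURL takes an explicit file-type hint, which may let you bypass the extension-based guess for files you know are mislabeled:

```swift
import AudioToolbox
import Foundation

let url = URL(fileURLWithPath: "mislabeled.mp3")  // actually a WAV file

// kAudioFileWAVEType forces a WAV interpretation; 0 would mean "no hint".
var fileID: AudioFileID?
let status = AudioFileOpenURL(url as CFURL,
                              .readPermission,
                              kAudioFileWAVEType,
                              &fileID)
if status == noErr, let fileID {
    // Read data with AudioFileGetProperty / AudioFileReadPacketData, then close.
    AudioFileClose(fileID)
}
```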