Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio Documentation


AudioConverterFillComplexBuffer crash (NativeInt16ToFloat32Scaled_ARM)
Here is the processing pipeline in my application:

    Mic -> AVCaptureOutput -> audio (PCM) -> audio encoder -> AAC packet (encoded)
    Camera -> AVCaptureOutput -> image -> video encoder -> H.264 video packet (encoded)

So my app is a movie encoder. The crash happens when the camera is switched (front camera <-> back camera). The crashing line is the call to AudioConverterFillComplexBuffer, apparently inside NativeInt16ToFloat32Scaled_ARM. What does that mean? Why?

    0 AudioCodecs   0x0000000183fbe2bc NativeInt16ToFloat32Scaled_ARM + 132
    1 AudioCodecs   0x0000000183f63708 AppendInputData(void*, void const*, unsigned int*, unsigned int*, AudioStreamPacketDescription const*) + 56
    2 AudioToolbox  0x000000018411aaac CodecConverter::AppendExcessInput(unsigned int&) + 196
    3 AudioToolbox  0x000000018411a59c CodecConverter::EncoderFillBuffer(unsigned int&, AudioBufferList&, AudioStreamPacketDescription*) + 660
    4 AudioToolbox  0x0000000184124ec0 AudioConverterChain::RenderOutput(CABufferList*, unsigned int, unsigned int&, AudioStreamPacketDescription*) + 116
    5 AudioToolbox  0x0000000184100d98 BufferedAudioConverter::FillBuffer(unsigned int&, AudioBufferList&, AudioStreamPacketDescription*) + 444
    6 AudioToolbox  0x00000001840d8c9c AudioConverterFillComplexBuffer + 340
    7 MovieEncoder  0x0000000100341fd4 __49-[AACEncoder encodeSampleBuffer:completionBlock:]_block_invoke (AACEncoder.m:247)
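NativeInt16ToFloat32Scaled_ARM is the codec's internal routine that converts int16 input samples to float, so a crash there often means the converter was handed buffers that no longer match the input format it was created with; switching between front and back cameras can change the capture audio format (sample rate or channel count). A defensive sketch of that idea, with hypothetical names (FormatAwareEncoder, prepare(for:)), assuming the AAC output ASBD is built elsewhere:

    import AudioToolbox
    import CoreMedia

    final class FormatAwareEncoder {
        private var converter: AudioConverterRef?
        private var inputASBD = AudioStreamBasicDescription()
        private var outputASBD: AudioStreamBasicDescription   // the AAC target format

        init(outputASBD: AudioStreamBasicDescription) { self.outputASBD = outputASBD }

        // Call this before AudioConverterFillComplexBuffer for every sample buffer.
        func prepare(for sampleBuffer: CMSampleBuffer) {
            guard let desc = CMSampleBufferGetFormatDescription(sampleBuffer),
                  let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc)?.pointee else { return }

            // Rebuild the converter whenever the capture format changes
            // (e.g. on a camera switch) instead of feeding it mismatched buffers.
            if converter == nil
                || asbd.mSampleRate != inputASBD.mSampleRate
                || asbd.mChannelsPerFrame != inputASBD.mChannelsPerFrame
                || asbd.mFormatID != inputASBD.mFormatID {
                if let old = converter { AudioConverterDispose(old) }
                inputASBD = asbd
                var newConverter: AudioConverterRef?
                if AudioConverterNew(&inputASBD, &outputASBD, &newConverter) == noErr {
                    converter = newConverter
                }
            }
        }
    }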
1 reply · 1 boost · 959 views · Jan ’16
Playing slow-motion videos from camera makes them lose the slow-motion effect
So, I've never faced a problem like this for so long. I have basically scrutinized every possible website on the internet looking for a solution but have found nothing so far.

I have a custom picker controller in which I can select videos and play them. Beforehand, I was struggling to play slow-motion videos at all (only non-slow-motion videos were playing), but after searching I found the solution here: http://stackoverflow.com/questions/26152396/how-to-access-nsdata-nsurl-of-slow-motion-videos-using-photokit

So, my code to get videos became this:

    let videoOptions = PHVideoRequestOptions()
    videoOptions.version = PHVideoRequestOptionsVersion.Original
    PHImageManager.defaultManager().requestAVAssetForVideo(asset!, options: videoOptions, resultHandler: { (asset, audioMix, info) -> Void in
        if let asset = asset as? AVURLAsset {
            let videoData = NSData(contentsOfURL: asset.URL)
            let videoPath = NSTemporaryDirectory() + "tmpMovie.MOV"
            let videoURL = NSURL(fileURLWithPath: videoPath)
            let writeResult = videoData?.writeToURL(videoURL, atomically: true)
            if let writeResult = writeResult where writeResult {
                print("success")
                let videoVC = VideoViewController(videoUrl: videoURL)
                imagePicker.presentViewController(videoVC, animated: false, completion: nil)
            } else {
                print("failure")
            }
        }
    })

Now slow-motion videos are playing, but in a normal way instead of in slow motion. This question relates to the problem: https://devforums.apple.com/message/903937#903937

I've seen a lot of comments saying they solved the problem by using the Photos framework, but I have no idea how to achieve this and they didn't explain either. It might be something to do with PHAssetMediaSubtypeVideoHighFrameRate. So, how would it be possible to play slow-motion videos? Do I need to change the fps somehow?

Please help, I am quite desperate :/ Objective-C code is welcome as well.
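The usual explanation: a slow-motion Photos asset is delivered as an AVComposition that applies the rate ramp, and requesting PHVideoRequestOptionsVersion.Original bypasses that edit and returns the raw high-frame-rate file, which then plays at normal speed. A sketch in current Swift that keeps the effect by requesting the current version and playing the item directly (asset is the PHAsset; presenter is whatever view controller does the presenting):

    import Photos
    import AVKit

    let options = PHVideoRequestOptions()
    options.version = .current   // .original would strip the slow-motion edit

    PHImageManager.default().requestPlayerItem(forVideo: asset, options: options) { playerItem, _ in
        // For slo-mo assets the underlying asset is an AVComposition,
        // so play it directly rather than copying a file URL.
        DispatchQueue.main.async {
            let playerVC = AVPlayerViewController()
            playerVC.player = AVPlayer(playerItem: playerItem)
            presenter.present(playerVC, animated: true) { playerVC.player?.play() }
        }
    }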
4 replies · 0 boosts · 2.1k views · Mar ’16
Processing / tapping an HLS audio stream (or global app output)
I'm trying to do some realtime audio processing on audio served from an HLS stream (i.e. an AVPlayer created using an M3U8 HTTP URL). Attaching an AVAudioMix configured with an `audioTapProcessor` doesn't seem to have any effect; none of the callbacks except `init` are being invoked. Is this a known limitation? If so, is this documented somewhere?

If the above is a limitation, what are my options using some of the other audio APIs? I looked into `AVAudioEngine` as well, but it doesn't seem like there's any way I can configure any of the input node types to use an HLS stream. Am I wrong? Are there lower-level APIs available to play HLS streams that provide the necessary hooks?

Alternatively, is there some generic way to tap into all audio being output by my app regardless of its source?

Thanks a lot!
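For context, this is the standard way a tap is attached to a file-based asset (a pass-through sketch). For an HLS item, asset.tracks typically comes back empty, so there is no AVAssetTrack to build the input parameters from, which is consistent with the tap callbacks never firing:

    import AVFoundation
    import MediaToolbox

    func attachTap(to item: AVPlayerItem, asset: AVAsset) {
        guard let track = asset.tracks(withMediaType: .audio).first else {
            print("No regular audio track (typical for HLS); tap cannot be attached")
            return
        }
        var callbacks = MTAudioProcessingTapCallbacks(
            version: kMTAudioProcessingTapCallbacksVersion_0,
            clientInfo: nil,
            init: { _, clientInfo, tapStorageOut in tapStorageOut.pointee = clientInfo },
            finalize: nil,
            prepare: nil,
            unprepare: nil,
            process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
                // Pull the source audio through unchanged; real processing
                // would modify the samples in bufferListInOut here.
                MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                   flagsOut, nil, numberFramesOut)
            })

        var tap: Unmanaged<MTAudioProcessingTap>?
        guard MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                         kMTAudioProcessingTapCreationFlag_PostEffects, &tap) == noErr,
              let tapRef = tap?.takeRetainedValue() else { return }

        let params = AVMutableAudioMixInputParameters(track: track)
        params.audioTapProcessor = tapRef
        let mix = AVMutableAudioMix()
        mix.inputParameters = [params]
        item.audioMix = mix
    }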
8 replies · 0 boosts · 4.1k views · Apr ’16
How to preserve kCMFormatDescriptionExtension_FieldXXX after encoding?
I am currently working with AVAssetWriterInput and an interlaced source.

I want to know the proper procedure to preserve the 'fiel' format description extension through an AVAssetWriterInput compression operation. Does anyone have an answer on this?

From Technical Note TN2162, "Uncompressed Y´CbCr Video in QuickTime Files": The 'fiel' ImageDescription Extension: Field/Frame Information.

What I have tried is the following.

First, decode into a sample buffer with field/frame information:
- Decode the DV-NTSC source and get the current FieldCount/Detail of the source data
- Create a kCVPixelFormatType_422YpCbCr8 CVImageBuffer from the source
- CMSetAttachments() kCVImageBufferFieldCountKey/kCVImageBufferFieldDetailKey with FieldCount=2/FieldDetail=14 on the imageBuffer, same as the DV-NTSC source
- CMVideoFormatDescriptionCreateForImageBuffer()
- CMSampleBufferCreateForImageBuffer()
- Now the decompressed sample buffer has kCMFormatDescriptionExtension_FieldXXX, same as the source

Second, encode using ProRes 422:
- Next I create an AVAssetWriterInput to transcode from _422YpCbCr8 into AVVideoCodecAppleProRes422
- Now AVAssetWriter can produce a ProRes 422 mov file, but it does not contain the format description extension 'fiel'
- The ProRes 422 encoder does preserve 'colr', 'pasp', and 'clap' same as the source, but no 'fiel' is there

I have also tried to serve a format description with kCMFormatDescriptionExtension_FieldXXX via initWithMediaType:outputSettings:sourceFormatHint:, but with no success.
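No confirmed fix here; it may simply be that the ProRes encoding path does not propagate 'fiel' (worth a Feedback Assistant report). For completeness, a sketch of the sourceFormatHint variant of the attempt, passing the field extensions explicitly; the 720x486 dimensions and the numeric values 2/14 are taken from the post and TN2162:

    import AVFoundation
    import CoreMedia

    var hint: CMVideoFormatDescription?
    let extensions: [CFString: Any] = [
        kCMFormatDescriptionExtension_FieldCount: 2,
        kCMFormatDescriptionExtension_FieldDetail: 14,  // spatial, first line late (TN2162)
    ]
    CMVideoFormatDescriptionCreate(
        allocator: kCFAllocatorDefault,
        codecType: kCMVideoCodecType_AppleProRes422,
        width: 720, height: 486,
        extensions: extensions as CFDictionary,
        formatDescriptionOut: &hint)

    let input = AVAssetWriterInput(
        mediaType: .video,
        outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.proRes422,
            AVVideoWidthKey: 720,
            AVVideoHeightKey: 486,
        ],
        sourceFormatHint: hint)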
1 reply · 0 boosts · 670 views · Sep ’16
Problems with AVAudioPlayerNode's scheduleBuffer function
Hi, I'm having two problems using the scheduleBuffer function of AVAudioPlayerNode.

Background: my app generates audio programmatically, which is why I am using this function. I also need low latency, so I'm using a strategy of scheduling a small number of buffers and using the completion handler to keep the process moving forward by scheduling one more buffer for each one that completes.

I'm seeing two problems with this approach:

One, the total memory consumed by my app grows steadily while the audio is playing, which suggests that the audio buffers are never being deallocated or some other runaway process is underway. (The Leaks tool doesn't detect any leaks, however.)

Two, audio playback sometimes stops, particularly on slower devices. By "stops", what I mean is that at some point I schedule a buffer and the completion block for that buffer is never called. When this happens, I can't even clear the problem by stopping the player.

Now, regarding the first issue, I suspected that if my completion block recursively scheduled another buffer with another completion block, I would probably end up blowing out the stack with an infinite recursion. To get around this, instead of directly scheduling the buffer in the completion block, I set it up to enqueue the schedule in a dispatch queue. However, this doesn't seem to solve the problem.

Any advice would be appreciated. Thanks.
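A sketch of the buffer-chaining strategy described above, with two details that may help (assumptions, not a confirmed fix): hopping to a private serial queue in the completion handler to avoid re-entrancy, and using the .dataPlayedBack callback type (iOS 11+) so completions track playback rather than buffer consumption:

    import AVFoundation

    final class BufferScheduler {
        private let engine = AVAudioEngine()
        private let player = AVAudioPlayerNode()
        private let queue = DispatchQueue(label: "audio.scheduler") // serial: no re-entrant scheduling
        private let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!

        func start() throws {
            engine.attach(player)
            engine.connect(player, to: engine.mainMixerNode, format: format)
            try engine.start()
            for _ in 0..<3 { scheduleNext() }   // keep a few buffers in flight
            player.play()
        }

        private func scheduleNext() {
            let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 1024)!
            buffer.frameLength = buffer.frameCapacity
            // ... fill buffer.floatChannelData with generated samples ...

            // .dataPlayedBack fires after the audio is actually heard, which
            // makes the pacing easier to reason about than data consumption.
            player.scheduleBuffer(buffer, completionCallbackType: .dataPlayedBack) { [weak self] _ in
                self?.queue.async { self?.scheduleNext() }  // hop off the callback thread
            }
        }
    }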
9 replies · 0 boosts · 3.7k views · Feb ’17
AudioComponentInstanceNew takes several seconds to complete
I'm using a VoiceProcessingIO audio unit in my VoIP application on Mac. The problem is that, at least since Mojave, AudioComponentInstanceNew blocks for at least 2 seconds. Profiling shows that internally it's waiting on some mutex and then on some message queue. My code to initialize the audio unit is as follows:

    OSStatus status;
    AudioComponentDescription desc;
    AudioComponent inputComponent;
    desc.componentType = kAudioUnitType_Output;
    desc.componentSubType = kAudioUnitSubType_VoiceProcessingIO;
    desc.componentFlags = 0;
    desc.componentFlagsMask = 0;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;
    inputComponent = AudioComponentFindNext(NULL, &desc);
    status = AudioComponentInstanceNew(inputComponent, &unit);

Here's a profiler screenshot showing the two system calls in question. So, is this a bug or intended behavior?
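One mitigation to consider (an assumption: it moves the wait off the calling thread rather than removing it): the asynchronous AudioComponentInstantiate, available since macOS 10.11. A Swift sketch:

    import AudioToolbox

    // Sketch: create the VoiceProcessingIO instance asynchronously so the
    // multi-second wait doesn't block the thread that needs to stay responsive.
    func createVoiceProcessingUnit(completion: @escaping (AudioUnit?) -> Void) {
        var desc = AudioComponentDescription(
            componentType: kAudioUnitType_Output,
            componentSubType: kAudioUnitSubType_VoiceProcessingIO,
            componentManufacturer: kAudioUnitManufacturer_Apple,
            componentFlags: 0,
            componentFlagsMask: 0)
        guard let component = AudioComponentFindNext(nil, &desc) else {
            completion(nil)
            return
        }
        AudioComponentInstantiate(component, []) { instance, status in
            completion(status == noErr ? instance : nil)
        }
    }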
4 replies · 0 boosts · 2.2k views · Oct ’18
Is AudioOutputUnitStop synchronous?
That is, will my render callback ever be called after AudioOutputUnitStop() returns? In other words, is it safe to free resources used by the render callback, or do I need to add realtime-safe communication between the stopping thread and the callback thread?

This question applies to both macOS HAL output and iOS Remote IO output units.
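A conservative teardown sketch while waiting for an authoritative answer: it assumes the render callback should be treated as potentially live until the unit is uninitialized, rather than relying on AudioOutputUnitStop() alone to synchronize:

    import AudioToolbox

    // Sketch (assumption): uninitialize before freeing callback state, so the
    // question of whether AudioOutputUnitStop() fully synchronizes with the
    // render thread doesn't have to be answered.
    func tearDown(unit: AudioUnit, freeRenderResources: () -> Void) {
        AudioOutputUnitStop(unit)            // stop I/O
        AudioUnitUninitialize(unit)          // render callbacks should not run after this
        AudioComponentInstanceDispose(unit)
        freeRenderResources()                // now release state the callback used
    }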
5 replies · 0 boosts · 4.8k views · Jun ’19
AVAssetReaderOutput.copyNextSampleBuffer() sometimes hangs forever
I'm using AVAssetReaders with AVSampleBufferDisplayLayers to display multiple videos at once. I'm seeing this issue on iOS 13.1.3 and 13.2b2, on various hardware like the iPad 10.5 and iPad 12.9.

It works well for a while, then a random call to copyNextSampleBuffer never returns, blocking that thread indefinitely and eating up resources. I have tried different threading approaches to no avail:

If copyNextSampleBuffer() and reader.cancelReading() are done on the same queue, then copyNextSampleBuffer() gets stuck and the cancelReading() never gets processed because the queue is blocked. If I manually (with the debugger) jump in on that blocked queue and execute cancelReading(), an EXC_BREAKPOINT immediately crashes the app.

If copyNextSampleBuffer() and reader.cancelReading() are done on different queues, then copyNextSampleBuffer() crashes with EXC_BAD_ACCESS.

Here's the stack trace (same-queue approach). I don't understand why it's stuck; my expectation is that copyNextSampleBuffer should always return (i.e. with nil in the error case).

VideoPlayerView: UIView with AVSampleBufferDisplayLayer
AVAssetFactory: singleton with the queue that creates & manages all AVAssetReader / AVAsset* objects

    * thread #22, queue = 'AVAssetFactory'
      frame #0: 0x00000001852355f4 libsystem_kernel.dylib`mach_msg_trap + 8
      frame #1: 0x0000000185234a60 libsystem_kernel.dylib`mach_msg + 72
      frame #2: 0x00000001853dc068 CoreFoundation`__CFRunLoopServiceMachPort + 216
      frame #3: 0x00000001853d7188 CoreFoundation`__CFRunLoopRun + 1444
      frame #4: 0x00000001853d68bc CoreFoundation`CFRunLoopRunSpecific + 464
      frame #5: 0x000000018f42b6ac AVFoundation`-[AVRunLoopCondition _waitInMode:untilDate:] + 400
      frame #6: 0x000000018f38f1dc AVFoundation`-[AVAssetReaderOutput copyNextSampleBuffer] + 148
      frame #7: 0x000000018f3900f0 AVFoundation`-[AVAssetReaderTrackOutput copyNextSampleBuffer] + 72
    * frame #8: 0x0000000103309d98 Photobooth`closure #1 in AVAssetFactory.nextSampleBuffer(reader=0x00000002814016f0, retval=(Swift.Optional<CoreMedia.CMSampleBuffer>, Swift.Optional<AVFoundation.AVAssetReader.Status>) @ 0x000000016dbd1cb8) at AVAssetFactory.swift:108:34
      frame #9: 0x0000000102f4f480 Photobooth`thunk for @callee_guaranteed () -> () at <compiler-generated>:0
      frame #10: 0x0000000102f4f4a4 Photobooth`thunk for @escaping @callee_guaranteed () -> () at <compiler-generated>:0
      frame #11: 0x000000010bfe6c04 libdispatch.dylib`_dispatch_client_callout + 16
      frame #12: 0x000000010bff5888 libdispatch.dylib`_dispatch_lane_barrier_sync_invoke_and_complete + 124
      frame #13: 0x0000000103309a5c Photobooth`AVAssetFactory.nextSampleBuffer(reader=0x00000002814016f0, self=0x0000000281984f60) at AVAssetFactory.swift:101:20
      frame #14: 0x00000001032ab690 Photobooth`closure #1 in VideoPlayerView.setRequestMediaLoop(self=0x000000014b8da1d0, handledCompletion=false) at VideoPlayerView.swift:254:70
      frame #15: 0x0000000102dce978 Photobooth`thunk for @escaping @callee_guaranteed () -> () at <compiler-generated>:0
      frame #16: 0x000000018f416848 AVFoundation`-[AVMediaDataRequester _requestMediaDataIfReady] + 80
      frame #17: 0x000000010bfe5828 libdispatch.dylib`_dispatch_call_block_and_release + 24
      frame #18: 0x000000010bfe6c04 libdispatch.dylib`_dispatch_client_callout + 16
      frame #19: 0x000000010bfedb74 libdispatch.dylib`_dispatch_lane_serial_drain + 744
      frame #20: 0x000000010bfee744 libdispatch.dylib`_dispatch_lane_invoke + 500
      frame #21: 0x000000010bff9ae4 libdispatch.dylib`_dispatch_workloop_worker_thread + 1324
      frame #22: 0x000000018517bfa4 libsystem_pthread.dylib`_pthread_wqthread + 276

I've tried all kinds of other things, like making sure the AVAssets and all objects are made on one queue, and stopping the AVAssetReaders from ever deallocating to see if that helps. Nothing works. Any ideas?
4 replies · 0 boosts · 2.3k views · Oct ’19
CA::Render::Encoder::grow - iOS 13 - some app crashes reported in Organizer
We have an iOS app in the App Store. Recently, we see in Organizer / Crashes for that app a new type of crash that seems bound to iOS 13 devices (21 devices affected in the last few weeks, all with iOS 13.1, .2, or .3). Xcode reports that the crashes occurred in QuartzCore, in CA::Render::Encoder::grow(unsigned long).

Any idea what it can be and how it could be fixed? It doesn't look related to our code. Could it be a recent Apple bug? (We didn't have these crash reports before iOS 13.) Thank you in advance.

Here is an extract from the crash log:

    OS Version: iPhone OS 13.3 (17C54)
    Release Type: User
    Baseband Version: n/a
    Report Version: 104

    Exception Type: EXC_CRASH (SIGABRT)
    Exception Codes: 0x0000000000000000, 0x0000000000000000
    Exception Note: EXC_CORPSE_NOTIFY
    Triggered by Thread: 0

    Thread 0 name:
    Thread 0 Crashed:
    0  libsystem_kernel.dylib   0x0000000191d27ec4 __pthread_kill + 8
    1  libsystem_pthread.dylib  0x0000000191c431d8 pthread_kill$VARIANT$mp + 136 (pthread.c:1458)
    2  libsystem_c.dylib        0x0000000191b97844 abort + 100 (abort.c:110)
    3  QuartzCore               0x00000001989af340 CA::Render::Encoder::grow(unsigned long) + 304 (render-coding.cpp:562)
    4  QuartzCore               0x00000001989afa80 CA::Render::Encoder::encode_data_async(void const*, unsigned long, void (*)(void const*, void*), ... + 172 (render-coding.h:272)
    5  QuartzCore               0x000000019886c358 CA::Render::Image::encode(CA::Render::Encoder*) const + 748 (render-image.cpp:401)
    6  QuartzCore               0x000000019888510c CA::Render::Layer::encode(CA::Render::Encoder*) const + 112 (render-coding.h:388)
    7  QuartzCore               0x00000001989b3c2c CA::Render::encode_set_object(CA::Render::Encoder*, unsigned long, unsigned int, CA::Render::Obje... + 192 (render-coding.cpp:2151)
    8  QuartzCore               0x00000001988f78a4 invocation function for block in CA::Context::commit_transaction(CA::Transaction*, double) + 1568 (CAContextInternal.mm:1632)
    9  QuartzCore               0x00000001989add88 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block... + 364 (CALayer.mm:2647)
    10 QuartzCore               0x00000001989add00 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block... + 228 (CALayer.mm:2633)
       [frames 11 through 18 repeat frame 10: recursive CA::Layer::commit_if_needed calls]
    19 QuartzCore               0x00000001988f6924 CA::Context::commit_transaction(CA::Transaction*, double) + 2872 (CAContextInternal.mm:2288)
    20 QuartzCore               0x000000019891fc08 CA::Transaction::commit() + 676 (CATransactionInternal.mm:438)
    21 UIKitCore                0x0000000195769698 -[_UIContextBinder updateBindableOrderWithTest:force:] + 704 (_UIContextBinder.m:280)
    22 UIKitCore                0x0000000195769214 -[_UIContextBinder createContextsWithTest:creationAction:] + 100 (_UIContextBinder.m:233)
    23 UIKitCore                0x00000001961d2510 -[UIWindowScene _prepareForResume] + 84 (UIWindowScene.m:712)
    24 UIKitCore                0x00000001955e364c -[UIScene _emitSceneSettingsUpdateResponseForCompletion:afterSceneUpdateWork:] + 860 (UIScene.m:1055)
    25 UIKitCore                0x00000001955e45b8 -[UIScene scene:didUpdateWithDiff:transitionContext:completion:] + 220 (UIScene.m:1315)
    26 UIKitCore                0x0000000195b57248 -[UIApplicationSceneClientAgent scene:handleEvent:withCompletion:] + 464 (UIApplicationSceneClientAgent.m:80)
    27 FrontBoardServices       0x0000000197051248 -[FBSSceneImpl updater:didUpdateSettings:withDiff:transitionContext:completion:] + 544 (FBSSceneImpl.m:551)
    28 FrontBoardServices       0x0000000197075d28 __88-[FBSWorkspaceScenesClient sceneID:updateWithSettingsDiff:transitionContext:completion:]_bloc... + 120 (FBSWorkspaceScenesClient.m:356)
    29 FrontBoardServices       0x000000019705af04 -[FBSWorkspace _calloutQueue_executeCalloutFromSource:withBlock:] + 232 (FBSWorkspace.m:357)
    30 FrontBoardServices       0x0000000197075c5c __88-[FBSWorkspaceScenesClient sceneID:updateWithSettingsDiff:transitionContext:completion:]_bloc... + 184 (FBSWorkspaceScenesClient.m:355)
    31 libdispatch.dylib        0x0000000191bfd184 _dispatch_client_callout + 16 (object.m:495)
    32 libdispatch.dylib        0x0000000191ba5fd8 _dispatch_block_invoke_direct$VARIANT$mp + 224 (queue.c:466)
    33 FrontBoardServices       0x000000019709a418 __FBSSERIALQUEUE_IS_CALLING_OUT_TO_A_BLOCK__ + 40 (FBSSerialQueue.m:173)
    34 FrontBoardServices       0x000000019709a0e4 -[FBSSerialQueue _queue_performNextIfPossible] + 404 (FBSSerialQueue.m:216)
    35 FrontBoardServices       0x000000019709a60c -[FBSSerialQueue _performNextFromRunLoopSource] + 28 (FBSSerialQueue.m:247)
    36 CoreFoundation           0x0000000191eaea00 __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24 (CFRunLoop.c:1922)
    37 CoreFoundation           0x0000000191eae958 __CFRunLoopDoSource0 + 80 (CFRunLoop.c:1956)
    38 CoreFoundation           0x0000000191eae0f0 __CFRunLoopDoSources0 + 180 (CFRunLoop.c:1992)
    39 CoreFoundation           0x0000000191ea923c __CFRunLoopRun + 1080 (CFRunLoop.c:2882)
    40 CoreFoundation           0x0000000191ea8adc CFRunLoopRunSpecific + 464 (CFRunLoop.c:3192)
    41 GraphicsServices         0x000000019be2e328 GSEventRunModal + 104 (GSEvent.c:2246)
    42 UIKitCore                0x0000000195fa3ae0 UIApplicationMain + 1936 (UIApplication.m:4773)
    43 <Our app name>           0x0000000100d9d32c main + 88 (main.m:14)
    44 libdyld.dylib            0x0000000191d32360 start + 4
4 replies · 1 boost · 2.4k views · Jan ’20
Random 561015905 on AVAudioSession setActive
I'm trying to make an app that plays audio when a push is received, even while the ringer is silent. I have the audio background mode enabled and the app works fine (usually), but sometimes I get a mysterious 561015905 error.

The error is AVAudioSession.ErrorCode.cannotStartPlaying, and the docs indicate: "This error type can occur if the app's Information property list doesn't permit audio use. It can also occur if the app is in the background and using a category that doesn't allow background audio." https://developer.apple.com/documentation/avfoundation/avaudiosession/errorcode/cannotstartplaying

However, the audio mode is definitely enabled, and the category is always playback, which should allow background playing. (This works 95-99% of the time.)

So, with intense testing, I have found that once every 20-100 incoming pushes I get this error:

    Error: Error Domain=NSOSStatusErrorDomain Code=561015905 "(null)"

Why?
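Since 561015905 is the FourCC '!pla' (AVAudioSession.ErrorCode.cannotStartPlaying), one pragmatic workaround is a short retry around activation, on the assumption that the failure is transient when the app is woken in the background. A sketch:

    import AVFoundation

    // Sketch (assumption): configure the category once, activate right before
    // playback, and retry briefly on a transient activation failure.
    func activatePlaybackSession(attempts: Int = 3) throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default)
        var lastError: Error?
        for _ in 0..<attempts {
            do {
                try session.setActive(true)
                return
            } catch {
                lastError = error
                Thread.sleep(forTimeInterval: 0.1)  // brief backoff before retrying
            }
        }
        throw lastError!
    }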
2 replies · 1 boost · 2k views · Jun ’20
How to cache an HLS video while playing it
Hi, I'm working on an app where a user can scroll through a feed of short videos (a bit like TikTok). I do some pre-fetching of videos a couple of positions ahead of the user's scroll position, so the video can start playing as soon as he or she scrolls to it. Currently, I just pre-fetch by initializing a few AVPlayers. However, I'd like to add a better caching system. I'm looking for the best way to get videos to start playing as quickly as possible, while minimizing re-downloading of videos if a user scrolls away from a video and back to it.

Is there a way that I can cache the contents of an AVPlayer that has loaded an HLS video?

Alternatively, I've explored using AVAssetDownloadTask to download the HLS videos. My issue is that I can't download the full video and then play it - I need to start playing the video as soon as the user scrolls to it, even if it's not done downloading. Is there a way to start an HLS download with an AVAssetDownloadTask, and then start playing the video while it continues downloading? Thank you!
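Playing during download is supported per Apple's HLS download guidance: create the player item from the download task's urlAsset, and playback and the ongoing download share the same local cache. A sketch (the session identifier and asset title are placeholders):

    import AVFoundation

    final class HLSPrefetcher: NSObject, AVAssetDownloadDelegate {
        private lazy var session: AVAssetDownloadURLSession = {
            let config = URLSessionConfiguration.background(withIdentifier: "hls-prefetch")
            return AVAssetDownloadURLSession(configuration: config,
                                             assetDownloadDelegate: self,
                                             delegateQueue: .main)
        }()

        func prefetchAndPlay(url: URL) -> AVPlayer? {
            let asset = AVURLAsset(url: url)
            guard let task = session.makeAssetDownloadTask(asset: asset,
                                                           assetTitle: "feed-video",
                                                           assetArtworkData: nil,
                                                           options: nil) else { return nil }
            task.resume()
            // Play from the task's asset: playback starts immediately while
            // the download keeps filling the cache in the background.
            return AVPlayer(playerItem: AVPlayerItem(asset: task.urlAsset))
        }

        func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                        didFinishDownloadingTo location: URL) {
            // Persist `location` (a relative path) for later offline reuse.
        }
    }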
10 replies · 0 boosts · 6.5k views · Jun ’20
iOS 14 - HLS metadata not showing user defined URL frames within id3 tag
Hello, I am using an HLS audio stream and I get metadata to show the song info, artist info, etc. in the app using AVPlayer. In iOS 14, I no longer see the artwork data (which is embedded in the WXXX frame of the id3 tag) coming across along with the other metadata. The same stream is working fine and showing artwork data in iOS 13. Did anything change in iOS 14 that affects this metadata? Please let me know. Thank you.
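Not an answer to what changed, but a hedged suggestion: AVPlayerItemMetadataOutput is the current API for observing timed metadata and may still deliver the WXXX frame where the older timedMetadata route does not. A sketch:

    import AVFoundation

    final class MetadataListener: NSObject, AVPlayerItemMetadataOutputPushDelegate {
        func attach(to item: AVPlayerItem) {
            let output = AVPlayerItemMetadataOutput(identifiers: nil) // nil = all identifiers
            output.setDelegate(self, queue: .main)
            item.add(output)
        }

        func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                            didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                            from track: AVPlayerItemTrack?) {
            // Each group carries the ID3 items (title, artist, WXXX, ...) for a time range.
            for item in groups.flatMap(\.items) {
                print(item.identifier ?? "?", item.value ?? "nil")
            }
        }
    }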
4 replies · 0 boosts · 1.5k views · Dec ’20
AVAssetWriter leading to huge kernel_task memory usage
I am currently working on a macOS app which will be creating very large video files with up to an hour of content. However, generating the images and adding them to AVAssetWriter leads to VTDecoderXPCService using 16+ GB of memory and kernel_task using 40+ GB (the max I saw was 105 GB). It seems like the generated video is not streamed onto the disk but rather kept in memory, to be written to disk all at once. How can I force it to stream the data to disk while the encoding is happening?

By the way, my app itself consistently needs around 300 MB of memory, so I don't think I have a memory leak here. Here is the relevant code:

    func analyse() {
        self.videoWritter = try! AVAssetWriter(outputURL: outputVideo, fileType: AVFileType.mp4)
        let writeSettings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: videoSize.width,
            AVVideoHeightKey: videoSize.height,
            AVVideoCompressionPropertiesKey: [
                AVVideoAverageBitRateKey: 10_000_000,
            ]
        ]
        self.videoWritter!.movieFragmentInterval = CMTimeMake(value: 60, timescale: 1)
        self.frameInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: writeSettings)
        self.frameInput?.expectsMediaDataInRealTime = true
        self.videoWritter!.add(self.frameInput!)
        if self.videoWritter!.startWriting() == false {
            print("Could not write file: \(self.videoWritter!.error!)")
            return
        }
    }

    func writeFrame(frame: Frame) {
        /* some more code to determine the correct frame presentation time stamp */
        let newSampleBuffer = self.setTimeStamp(frame.sampleBuffer, newTime: self.nextFrameStartTime!)
        self.frameInput!.append(newSampleBuffer)
        /* some more code */
    }

    func setTimeStamp(_ sample: CMSampleBuffer, newTime: CMTime) -> CMSampleBuffer {
        var timing: CMSampleTimingInfo = CMSampleTimingInfo(
            duration: CMTime.zero,
            presentationTimeStamp: newTime,
            decodeTimeStamp: CMTime.zero
        )
        var newSampleBuffer: CMSampleBuffer?
        CMSampleBufferCreateCopyWithNewTiming(
            allocator: kCFAllocatorDefault,
            sampleBuffer: sample,
            sampleTimingEntryCount: 1,
            sampleTimingArray: &timing,
            sampleBufferOut: &newSampleBuffer
        )
        return newSampleBuffer!
    }

My specs: MacBook Pro 2018, 32 GB memory, macOS Big Sur 11.1.
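Two details in the code above look suspect for offline encoding (assumptions, but both are cheap to test): expectsMediaDataInRealTime = true disables the writer's backpressure, and appending as fast as frames are generated lets buffers pile up in memory. The standard pull-model sketch:

    import AVFoundation

    // Sketch: for offline encoding, set expectsMediaDataInRealTime to false and
    // honor the writer input's backpressure, so frames are produced only as
    // fast as the encoder can drain them to disk.
    func writeFrames(input: AVAssetWriterInput, queue: DispatchQueue,
                     nextSampleBuffer: @escaping () -> CMSampleBuffer?) {
        input.expectsMediaDataInRealTime = false
        input.requestMediaDataWhenReady(on: queue) {
            while input.isReadyForMoreMediaData {
                guard let sample = nextSampleBuffer() else {
                    input.markAsFinished()   // no more frames to write
                    return
                }
                input.append(sample)
            }
            // Not ready any more: return and wait to be called again.
        }
    }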
5 replies · 0 boosts · 2.0k views · Dec ’20
What causes "issue_type = overload" in coreaudiod with USB audio interface?
I have a USB audio interface that is causing kernel traps and the audio output to "skip" or drop out every few seconds. This behavior occurs with a completely fresh install of Catalina as well as Big Sur with the stock Music app on a 2019 MacBook Pro 16 (full specs below).

The Console logs show coreaudiod got an error from a kernel trap, a "USB Sound assertion" in AppleUSBAudio/AppleUSBAudio-401.4/KEXT/AppleUSBAudioDevice.cpp at line 6644, and the Music app "skipping cycle due to overload." I've added a short snippet from the Console logs around the time of the audio skip/dropout. The more complete logs are at this gist: https://gist.github.com/djflux/08d9007e2146884e6df1741770de5105

I've also opened a Feedback Assistant ticket (FB9037528): https://feedbackassistant.apple.com/feedback/9037528

Does anyone know what could be causing this issue? Thanks for any help. Cheers, Flux aka Andy.

Hardware Overview:
    Model Name: MacBook Pro
    Model Identifier: MacBookPro16,1
    Processor Name: 8-Core Intel Core i9
    Processor Speed: 2.4 GHz
    Number of Processors: 1
    Total Number of Cores: 8
    L2 Cache (per Core): 256 KB
    L3 Cache: 16 MB
    Hyper-Threading Technology: Enabled
    Memory: 64 GB
    System Firmware Version: 1554.80.3.0.0 (iBridge: 18.16.14347.0.0,0)

System Software Overview:
    System Version: macOS 11.2.3 (20D91)
    Kernel Version: Darwin 20.3.0
    Boot Volume: Macintosh HD
    Boot Mode: Normal
    Computer Name: mycomputername
    User Name: myusername
    Secure Virtual Memory: Enabled
    System Integrity Protection: Enabled

USB interface: Denon DJ DS1

Snippet of Console logs:

    error   21:07:04.848721-0500 coreaudiod  HALS_IOA1Engine::EndWriting: got an error from the kernel trap, Error: 0xE00002D7
    default 21:07:04.848855-0500 Music       HALC_ProxyIOContext::IOWorkLoop: skipping cycle due to overload
    default 21:07:04.857903-0500 kernel      USB Sound assertion (Resetting engine due to error returned in Read Handler) in /AppleInternal/BuildRoot/Library/Caches/com.apple.xbs/Sources/AppleUSBAudio/AppleUSBAudio-401.4/KEXT/AppleUSBAudioDevice.cpp at line 6644
    ...
    default 21:07:05.102746-0500 coreaudiod  Audio IO Overload inputs: 'private' outputs: 'private' cause: 'Unknown' prewarming: no recovering: no
    default 21:07:05.102926-0500 coreaudiod  CAReportingClient.mm:508 message {
        HostApplicationDisplayID = "com.apple.Music";
        cause = Unknown;
        deadline = 2615019;
        "input_device_source_list" = Unknown;
        "input_device_transport_list" = USB;
        "input_device_uid_list" = "AppleUSBAudioEngine:Denon DJ:DS1:000:1,2";
        "io_buffer_size" = 512;
        "io_cycle" = 1;
        "is_prewarming" = 0;
        "is_recovering" = 0;
        "issue_type" = overload;
        lateness = "-535";
        "output_device_source_list" = Unknown;
        "output_device_transport_list" = USB;
        "output_device_uid_list" = "AppleUSBAudioEngine:Denon DJ:DS1:000:1,2";
    }: (null)
37 replies · 12 boosts · 15k views · Apr ’21
Background audio issues with AVPictureInPictureController
I know that if you want background audio from AVPlayer you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly. I have that all squared away, and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController.

If you want PiP to behave as expected when you put your app into the background (by switching to another app or going to the home screen), you can't perform the detachment operation, otherwise the PiP display fails.

On an iPad, if PiP is active and you lock your device, you continue to get background audio playback. However, on an iPhone, if PiP is active and you lock the device, the audio pauses. If PiP is inactive and you lock the device, the audio pauses and you have to manually tap play on the lock-screen controls; this is the same between iPad and iPhone devices.

My questions are:
- Is there a way to keep background-audio playback going when PiP is inactive and the device is locked? (iPhone and iPad)
- Is there a way to keep background-audio playback going when PiP is active and the device is locked? (iPhone)
1 reply · 0 boosts · 1.1k views · Dec ’21
How to obtain an AVAudioFormat for a canonical format?
I receive a buffer from [AVSpeechSynthesizer convertToBuffer:fromBuffer:] and want to schedule it on an AVAudioPlayerNode. The player node's output format needs to be something that the next node can handle, and as far as I understand most nodes can handle a canonical format. The format provided by AVSpeechSynthesizer is not something that AVAudioMixerNode supports. So the following:

    AVAudioEngine *engine = [[AVAudioEngine alloc] init];
    playerNode = [[AVAudioPlayerNode alloc] init];
    AVAudioFormat *format = [[AVAudioFormat alloc] initWithSettings:utterance.voice.audioFileSettings];
    [engine attachNode:self.playerNode];
    [engine connect:self.playerNode to:engine.mainMixerNode format:format];

throws an exception:

    Thread 1: "[[busArray objectAtIndexedSubscript:(NSUInteger)element] setFormat:format error:&nsErr]: returned false, error Error Domain=NSOSStatusErrorDomain Code=-10868 \"(null)\""

I am looking for a way to obtain the canonical format for the platform so that I can use AVAudioConverter to convert the buffer. Since different platforms have different canonical formats, I imagine there should be some library way of doing this. Otherwise each developer will have to redefine it for each platform the code will run on (macOS, iOS, etc.) and keep it updated when it changes. I could not find any constant or function which can make such a format, ASBD, or settings.

The smartest way I could think of, which does not work:

    AudioStreamBasicDescription toDesc;
    FillOutASBDForLPCM(toDesc, [AVAudioSession sharedInstance].sampleRate,
                       2, 16, 16, kAudioFormatFlagIsFloat, kAudioFormatFlagsNativeEndian);
    AVAudioFormat *toFormat = [[AVAudioFormat alloc] initWithStreamDescription:&toDesc];

Even the provided example for iPhone, in the documentation linked above, uses kAudioFormatFlagsAudioUnitCanonical and AudioUnitSampleType, which are deprecated. So what is the correct way to do this?
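One likely answer (hedged): AVAudioFormat's standardFormatWithSampleRate:channels: initializer produces the engine's "standard" format, deinterleaved Float32, which is what AVAudioEngine nodes expect on every platform, so no per-platform ASBD needs to be hand-built. A Swift sketch wiring it up with AVAudioConverter (utterance is assumed to be the AVSpeechUtterance from the post):

    import AVFoundation

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)

    // Source format as produced by the speech synthesizer.
    let sourceFormat = AVAudioFormat(settings: utterance.voice!.audioFileSettings)!

    // "Standard" = deinterleaved Float32 at the given rate: the canonical
    // engine format, valid on macOS and iOS alike.
    let canonical = AVAudioFormat(standardFormatWithSampleRate: sourceFormat.sampleRate,
                                  channels: sourceFormat.channelCount)!

    engine.connect(player, to: engine.mainMixerNode, format: canonical)

    let converter = AVAudioConverter(from: sourceFormat, to: canonical)!
    // ... convert each synthesizer buffer with
    // converter.convert(to:error:withInputFrom:) and schedule the result on `player`.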
4 replies · 0 boosts · 1.7k views · Feb ’22
Low playback volume when using kAudioUnitSubType_VoiceProcessingIO audio unit
I’m developing a voice communication app for the iPad with both playback and record, and I'm using an AudioUnit of type kAudioUnitSubType_VoiceProcessingIO to have echo cancellation. When playing audio before initializing the recording audio unit, the volume is high. But if I play the audio after initializing the audio unit, or when switching to RemoteIO and then back to VPIO, the playback volume is low.

It seems like a bug in iOS. Is there any solution or workaround for this? Searching the net, I only found this post without any solution: https://developer.apple.com/forums/thread/671836
1 reply · 2 boosts · 1.9k views · Feb ’22
Enabling Voice Processing changes channel count on the input node.
I've noticed that enabling voice processing on AVAudioInputNode changes the node's format - most noticeably the channel count.

    let inputNode = avEngine.inputNode
    print("Format #1: \(inputNode.outputFormat(forBus: 0))")
    // Format #1: <AVAudioFormat 0x600002bb4be0:  1 ch,  44100 Hz, Float32>

    try! inputNode.setVoiceProcessingEnabled(true)

    print("Format #2: \(inputNode.outputFormat(forBus: 0))")
    // Format #2: <AVAudioFormat 0x600002b18f50:  3 ch,  44100 Hz, Float32, deinterleaved>

Is this expected? How can I interpret these channels? My input device is an aggregate device where each channel comes from a different microphone. I then record each channel to a separate file. But when voice processing messes with the channel layout, I cannot rely on this anymore.
1 reply · 1 boost · 2.3k views · Jul ’22
Multi Channel Audio With AVAudioEngine, Flipping Audio Channels
Hi, I have multiple audio files and I want to decide which channel goes to which output. For example, how do I route four 2-channel audio files to an 8-channel output? Also, if I have an AVAudioPlayerNode playing a 2-channel track through headphones, can I flip the channels on the output for playback, i.e. flip left and right?

I have read the following thread which seeks to do something similar, but it is from 2012 and I do not quite understand how it would work in the modern day. Many thanks, I am a bit stumped.
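On the second question, one self-contained approach (a sketch, assuming a stereo source file) is to swap the two channels' samples in the AVAudioPCMBuffer before scheduling it on the player node; the four-files-to-eight-channels routing additionally needs channel maps, which this sketch does not cover:

    import AVFoundation

    // Sketch: flip left/right by swapping channel data in the buffer itself.
    func flippedBuffer(from file: AVAudioFile) throws -> AVAudioPCMBuffer {
        let format = file.processingFormat
        precondition(format.channelCount == 2, "expects a stereo file")
        let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                      frameCapacity: AVAudioFrameCount(file.length))!
        try file.read(into: buffer)

        if let data = buffer.floatChannelData {
            // Swap each sample between channel 0 (left) and channel 1 (right).
            for i in 0..<Int(buffer.frameLength) {
                let tmp = data[0][i]
                data[0][i] = data[1][i]
                data[1][i] = tmp
            }
        }
        return buffer
    }

    // Usage: player.scheduleBuffer(try flippedBuffer(from: file), completionHandler: nil)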
1 reply · 1 boost · 1.8k views · Sep ’22