AVAudioEngine


Use a group of connected audio node objects to generate and process audio signals and perform audio input and output.

AVAudioEngine Documentation

Posts under AVAudioEngine tag

56 Posts
Post not yet marked as solved
3 Replies
1.5k Views
I receive a buffer from [AVSpeechSynthesizer convertToBuffer:fromBuffer:] and want to schedule it on an AVAudioPlayerNode. The player node's output format needs to be something that the next node can handle, and as far as I understand most nodes can handle a canonical format. The format provided by AVSpeechSynthesizer is not something that AVAudioMixerNode supports. So the following:

AVAudioEngine *engine = [[AVAudioEngine alloc] init];
playerNode = [[AVAudioPlayerNode alloc] init];
AVAudioFormat *format = [[AVAudioFormat alloc] initWithSettings:utterance.voice.audioFileSettings];
[engine attachNode:self.playerNode];
[engine connect:self.playerNode to:engine.mainMixerNode format:format];

throws an exception:

Thread 1: "[[busArray objectAtIndexedSubscript:(NSUInteger)element] setFormat:format error:&nsErr]: returned false, error Error Domain=NSOSStatusErrorDomain Code=-10868 \"(null)\""

I am looking for a way to obtain the canonical format for the platform so that I can use AVAudioConverter to convert the buffer. Since different platforms have different canonical formats, I imagine there should be some library way of doing this; otherwise each developer will have to redefine it for each platform the code runs on (macOS, iOS, etc.) and keep it updated when it changes. I could not find any constant or function that can produce such a format, ASBD, or settings. The smartest way I could think of, which does not work:

AudioStreamBasicDescription toDesc;
FillOutASBDForLPCM(toDesc, [AVAudioSession sharedInstance].sampleRate,
                   2, 16, 16, kAudioFormatFlagIsFloat, kAudioFormatFlagsNativeEndian);
AVAudioFormat *toFormat = [[AVAudioFormat alloc] initWithStreamDescription:&toDesc];

Even the provided example for iPhone, in the documentation linked above, uses kAudioFormatFlagsAudioUnitCanonical and AudioUnitSampleType, which are deprecated. So what is the correct way to do this?
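One avenue worth checking, sketched here in Swift by the editor rather than taken from the post: AVAudioFormat's standard-format initializer produces the deinterleaved native-endian Float32 layout that AVAudioEngine nodes generally accept, and AVAudioConverter can convert the synthesizer's buffer into it. The function name below is illustrative.

import AVFoundation

// A minimal sketch: convert a synthesizer buffer to the "standard" format
// (deinterleaved native-endian Float32), which the mixer accepts.
func makeStandardBuffer(from source: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard let standardFormat = AVAudioFormat(standardFormatWithSampleRate: source.format.sampleRate,
                                              channels: source.format.channelCount),
          let converter = AVAudioConverter(from: source.format, to: standardFormat),
          let output = AVAudioPCMBuffer(pcmFormat: standardFormat,
                                        frameCapacity: source.frameLength) else { return nil }
    var consumed = false
    var error: NSError?
    converter.convert(to: output, error: &error) { _, outStatus in
        if consumed {
            outStatus.pointee = .endOfStream
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return source
    }
    return error == nil ? output : nil
}

The converted buffer can then be scheduled on the player node connected to the mixer with the same standard format.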
Posted
by
Post not yet marked as solved
1 Replies
1.8k Views
I’m developing a voice communication app for the iPad with both playback and recording, using an AudioUnit of type kAudioUnitSubType_VoiceProcessingIO to get echo cancellation. When playing audio before initializing the recording audio unit, the volume is high. But if I play audio after initializing the audio unit, or when switching to RemoteIO and then back to VoiceProcessingIO, the playback volume is low. This seems like a bug in iOS; is there any solution or workaround? Searching the net I only found this post without any solution: https://developer.apple.com/forums/thread/671836
Posted
by
Post not yet marked as solved
1 Replies
2.2k Views
I've noticed that enabling voice processing on AVAudioInputNode changes the node's format, most noticeably the channel count.

let inputNode = avEngine.inputNode
print("Format #1: \(inputNode.outputFormat(forBus: 0))")
// Format #1: <AVAudioFormat 0x600002bb4be0:  1 ch,  44100 Hz, Float32>
try! inputNode.setVoiceProcessingEnabled(true)
print("Format #2: \(inputNode.outputFormat(forBus: 0))")
// Format #2: <AVAudioFormat 0x600002b18f50:  3 ch,  44100 Hz, Float32, deinterleaved>

Is this expected? How should I interpret these channels? My input device is an aggregate device where each channel comes from a different microphone, and I record each channel to a separate file. But when voice processing changes the channel layout, I can no longer rely on this.
Posted
by
Post not yet marked as solved
1 Replies
1.6k Views
Hi, I have multiple audio files and I want to decide which channel goes to which output. For example, how do I route four 2-channel audio files to an 8-channel output? Also, if I have an AVAudioPlayerNode playing a 2-channel track through headphones, can I flip the channels on the output for playback, i.e. swap left and right? I have read the following thread, which seeks to do something similar, but it is from 2012 and I do not quite understand how it would apply today. Many thanks, I am a bit stumped.
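One possible avenue, sketched here as an assumption rather than a confirmed answer, is the output unit's channel map (kAudioOutputUnitProperty_ChannelMap), reachable from AVAudioEngine's outputNode. Swapping left and right on a stereo route might look roughly like this; for an 8-channel device the map would have eight entries.

import AVFoundation
import AudioToolbox

let engine = AVAudioEngine()
// ... attach player nodes and connect them to the mixer as usual ...

// Sketch: each entry of the map selects the source channel feeding that
// hardware output channel; [1, 0] swaps left and right. Scope/element
// values follow Apple's channel-mapping Q&A and should be verified
// against your device configuration.
if let outputUnit = engine.outputNode.audioUnit {
    var channelMap: [Int32] = [1, 0]
    let size = UInt32(MemoryLayout<Int32>.size * channelMap.count)
    let status = AudioUnitSetProperty(outputUnit,
                                      kAudioOutputUnitProperty_ChannelMap,
                                      kAudioUnitScope_Output,
                                      0,
                                      &channelMap,
                                      size)
    if status != noErr { print("Failed to set channel map: \(status)") }
}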
Posted
by
Post not yet marked as solved
10 Replies
3.8k Views
I work on a video conferencing application which makes use of AVAudioEngine and the videoChat AVAudioSession.Mode. This past Friday, an internal user reported an "audio cutting in and out" issue with their new iPhone 14 Pro, and I was able to reproduce the issue later that day on my iPhone 14 Pro Max. No other iOS devices running iOS 16 are exhibiting this issue. I have narrowed down the root cause to the videoChat AVAudioSession.Mode after changing line 53 of the ViewController.swift file in Apple's "Using Voice Processing" sample project (https://developer.apple.com/documentation/avfaudio/audio_engine/audio_units/using_voice_processing) from:

try session.setCategory(.playAndRecord, options: .defaultToSpeaker)

to:

try session.setCategory(.playAndRecord, mode: .videoChat, options: .defaultToSpeaker)

This only causes issues on my iPhone 14 Pro Max device, not on my iPhone 13 Pro Max, so it seems specific to the new iPhones only. I am also seeing the following logged to the console using either device, which appears to be specific to iOS 16, but I am not sure if it is related to the videoChat issue or not:

2022-09-19 08:23:20.087578-0700 AVEchoTouch[2388:1474002] [as] ATAudioSessionPropertyManager.mm:71  Invalid input size for property 1684431725
2022-09-19 08:23:20.087605-0700 AVEchoTouch[2388:1474002] [as] ATAudioSessionPropertyManager.mm:225  Invalid input size for property 1684431725

I am assuming 1684431725 is 'dfcm', but I am not sure what Audio Session property that might be.
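As a side note, a numeric property ID like 1684431725 can be checked against a four-character code by decoding its bytes; a small sketch (the helper name is the editor's), which does print dfcm for that value:

// Decode a 32-bit Core Audio property ID into its four-character code.
func fourCharCode(from value: UInt32) -> String {
    let bytes = [UInt8((value >> 24) & 0xFF),
                 UInt8((value >> 16) & 0xFF),
                 UInt8((value >> 8) & 0xFF),
                 UInt8(value & 0xFF)]
    return String(bytes: bytes, encoding: .ascii) ?? "????"
}

print(fourCharCode(from: 1684431725)) // "dfcm"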
Posted
by
Post not yet marked as solved
0 Replies
480 Views
I've created an app that applies stereo EQ to songs with AVAudioUnitEQ and AVAudioEngine. It works great on MPMediaItems with no protected assets. I'd like to apply the EQ to Apple Music tracks for users that have an active subscription. In order to play tracks with AVAudioEngine, I create an AVAudioFile from the MPMediaItem assetURL. When I try to get the URL of an Apple Music track, it returns nil even though I have an active subscription. Is it possible to get the URL of an Apple Music track that an active subscriber has downloaded to their library? If so, I think I'd be all set getting it to work with AVAudioEngine. If it's not possible to get the URL, does anyone know if there's some other method to play Apple Music tracks with AVAudioEngine?
Posted
by
Post not yet marked as solved
2 Replies
1k Views
I am using AVSpeechSynthesizer to get an audio buffer, and AVAudioEngine with AVAudioPlayerNode to play it back, but I am getting this error:

[avae] AVAEInternal.h:76 required condition is false: [AVAudioPlayerNode.mm:734:ScheduleBuffer: (_outputFormat.channelCount == buffer.format.channelCount)]
2023-05-02 03:14:35.709020-0700 AudioPlayer[12525:308940] *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: _outputFormat.channelCount == buffer.format.channelCount'

Can anyone please help me play the AVAudioBuffer from the AVSpeechSynthesizer write method?
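The error says the player node's connection format has a different channel count than the synthesizer's buffers. A minimal sketch of one way around that, assuming the engine can be connected lazily using the first buffer's own format (names are illustrative, not from the post):

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let synthesizer = AVSpeechSynthesizer()
engine.attach(player)

var engineStarted = false
let utterance = AVSpeechUtterance(string: "Hello world")

synthesizer.write(utterance) { buffer in
    guard let pcm = buffer as? AVAudioPCMBuffer, pcm.frameLength > 0 else { return }
    if !engineStarted {
        // Connect with the synthesizer's own format so
        // _outputFormat.channelCount matches buffer.format.channelCount.
        engine.connect(player, to: engine.mainMixerNode, format: pcm.format)
        try? engine.start()
        player.play()
        engineStarted = true
    }
    player.scheduleBuffer(pcm)
}

Alternatively, an AVAudioConverter can be used to convert each buffer to whatever format the player node is already connected with.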
Posted
by
Post marked as solved
1 Replies
1k Views
Hi, I know there are many ways to record voice on iOS using one of these well-known frameworks: AVFoundation, AudioToolbox, Core Audio. But what I want to do is record a phone call, and there are several challenges. Interruptions: when I get a call, the system interrupts any running app in order to handle the call, so how can I make the voice recording app keep recording in the background, so that after I receive a call I can open the app and run the record function? And even if I solve the previous issue, how can I record the sound that comes from the phone's top (earpiece) speaker? If anyone has an idea or any thoughts, please share them with me.
Posted
by
Post not yet marked as solved
0 Replies
554 Views
Our app uses the playAndRecord category with the options interruptSpokenAudioAndMixWithOthers, allowBluetoothA2DP, allowAirPlay, and defaultToSpeaker. While the app is backgrounded, Spotify and other music apps play normally when the iPhone is not connected to anything or is connected to Bluetooth. However, once the iPhone is connected to CarPlay, either via Bluetooth or via USB cable, the music apps' playback quality becomes flat, which suggests the recording is using the Hands-Free Profile (which would also happen if allowBluetooth were set and the phone were connected to a Bluetooth device). Is there any way to keep using the iPhone microphone to record while connected to CarPlay, so that music apps can play normally?
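For reference, the session configuration described above corresponds roughly to this sketch (the activation call is an assumption, not stated in the post):

import AVFoundation

do {
    let session = AVAudioSession.sharedInstance()
    // Category and options as listed in the post.
    try session.setCategory(.playAndRecord,
                            options: [.interruptSpokenAudioAndMixWithOthers,
                                      .allowBluetoothA2DP,
                                      .allowAirPlay,
                                      .defaultToSpeaker])
    try session.setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}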
Posted
by
Post not yet marked as solved
0 Replies
504 Views
private func updateNowPlayingInfo() {
    var nowPlayingInfo = [String: Any]()
    nowPlayingInfo[MPMediaItemPropertyTitle] = songLabel.text
    nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = Int(Double(audioLengthSamples) / audioSampleRate)
    nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = Int(Double(currentPosition) / audioSampleRate)
    print(isPlaying)
    print("updateNow")
    let playbackRate = isPlaying ? self.timeEffect.rate : 0.0
    print(playbackRate)
    nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = playbackRate
    MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
}

Whenever I press my play/pause button in the app, I expect Control Center and the lock screen to reflect this. However, the Control Center symbol stays as pause regardless of what I do in the app. Am I missing anything? Running on device with iOS 16.4.1.

private func configureRemoteCommandCenter() {
    let commandCenter = MPRemoteCommandCenter.shared()
    // Play command
    commandCenter.playCommand.isEnabled = true
    commandCenter.playCommand.addTarget { [weak self] event in
        // Handle the play command
        self?.playOrPauseFunction()
        return .success
    }
    // Pause command
    commandCenter.pauseCommand.isEnabled = true
    commandCenter.pauseCommand.addTarget { [weak self] event in
        // Handle the pause command
        self?.playOrPauseFunction()
        return .success
    }
    commandCenter.togglePlayPauseCommand.isEnabled = true
    commandCenter.togglePlayPauseCommand.addTarget { [weak self] event in
        self?.playOrPauseFunction()
        return .success
    }
    commandCenter.changePlaybackRateCommand.isEnabled = true
    commandCenter.changePlaybackPositionCommand.isEnabled = true
    commandCenter.changePlaybackPositionCommand.addTarget { [unowned self] event in
        guard let event = event as? MPChangePlaybackPositionCommandEvent else {
            return .commandFailed
        }
        currentPosition = AVAudioFramePosition(event.positionTime * audioSampleRate)
        scrubSeek(to: Double(currentPosition))
        updateNowPlayingInfo()
        return .success
    }
    // Add handlers for other remote control events here...
}
Posted
by
Post not yet marked as solved
0 Replies
526 Views
I am developing a MacOS video/audio chat app that uses the audio input + audio only intermittently. The rest of the time I need to stop and tear down AVAudioEngine to allow other applications such as music players to use audio. I have found that just pausing or stopping the engine is not enough, I need to completely tear it down and force a deinit by setting engine = nil in my objective C code (with ARC enabled). What I have learned is that I have to make sure to tear down and detach absolutely everyhing, otherwise AVAudioEngine will fail to start the next time, especially when using a bluetooth headset. However, after months of trial and error, I have something that appears to be almost stable. However, I am sometimes hitting the crash show below after alloc + init of AVAudioEngine instance, when enabling voice processing. The crash is found when building with address-sanitizer enabled, and the logging above the line is my own: stopping audio engine disabling voice processing... voice processing disabled engine stopped waiting for engine... starting audio engine... enabling voice processing... ==75508==ERROR: AddressSanitizer: heap-buffer-overflow on address 0x000111e11be0 at pc 0x000103123360 bp 0x00016d231c90 sp 0x00016d231450 WRITE of size 52 at 0x000111e11be0 thread T218 #0 0x10312335c in wrap_memcpy+0x244 (libclang_rt.asan_osx_dynamic.dylib:arm64e+0x1b35c) (BuildId: f0a7ac5c49bc3abc851181b6f92b308a32000000200000000100000000000b00) #1 0x1077f407c (CoreAudio:arm64e+0xc07c) (BuildId: 3318bd64e64f3e69991d605d1bc10d7d32000000200000000100000000030d00) #2 0x1078f1484 (CoreAudio:arm64e+0x109484) (BuildId: 3318bd64e64f3e69991d605d1bc10d7d32000000200000000100000000030d00) #3 0x1a3d661a0 in AudioUnitGetProperty+0x1c0 (AudioToolboxCore:arm64e+0x2101a0) (BuildId: 3a76e12cd37d3545bb42d52848e0bd7032000000200000000100000000030d00) #4 0x207d8be38 in AVAudioIOUnit_OSX::_GetHWFormat(unsigned int, unsigned int*)+0x76c (AVFAudio:arm64e+0xbde38) (BuildId: 4a3f05007b8c35c98be4e78396ca9eeb32000000200000000100000000030d00) #5 0x207d8aea4 in invocation function for block in AVAudioIOUnit::IOUnitPropertyListener(void*, ComponentInstanceRecord*, unsigned int, unsigned int, unsigned int)+0x15c (AVFAudio:arm64e+0xbcea4) (BuildId: 4a3f05007b8c35c98be4e78396ca9eeb32000000200000000100000000030d00) #6 0x103149f74 in __wrap_dispatch_async_block_invoke+0xc0 (libclang_rt.asan_osx_dynamic.dylib:arm64e+0x41f74) (BuildId: f0a7ac5c49bc3abc851181b6f92b308a32000000200000000100000000000b00) #7 0x1a1d4a870 in _dispatch_call_block_and_release+0x1c (libdispatch.dylib:arm64e+0x2870) (BuildId: 8e87dc0ea5703933b37d5e05ad51620632000000200000000100000000030d00) #8 0x1a1d4c3fc in _dispatch_client_callout+0x10 (libdispatch.dylib:arm64e+0x43fc) (BuildId: 8e87dc0ea5703933b37d5e05ad51620632000000200000000100000000030d00) #9 0x1a1d53a84 in _dispatch_lane_serial_drain+0x298 (libdispatch.dylib:arm64e+0xba84) (BuildId: 8e87dc0ea5703933b37d5e05ad51620632000000200000000100000000030d00) #10 0x1a1d545f4 in _dispatch_lane_invoke+0x17c (libdispatch.dylib:arm64e+0xc5f4) (BuildId: 8e87dc0ea5703933b37d5e05ad51620632000000200000000100000000030d00) #11 0x1a1d5f240 in _dispatch_workloop_worker_thread+0x284 (libdispatch.dylib:arm64e+0x17240) (BuildId: 8e87dc0ea5703933b37d5e05ad51620632000000200000000100000000030d00) #12 0x1a1ef8070 in _pthread_wqthread+0x11c (libsystem_pthread.dylib:arm64e+0x3070) (BuildId: b401cfb38dfe32db92b3ba8af0f8ca6e32000000200000000100000000030d00) #13 0x1a1ef6d90 in start_wqthread+0x4 (libsystem_pthread.dylib:arm64e+0x1d90) 
(BuildId: b401cfb38dfe32db92b3ba8af0f8ca6e32000000200000000100000000030d00) 0x000111e11be0 is located 0 bytes to the right of 32-byte region [0x000111e11bc0,0x000111e11be0) allocated by thread T218 here: #0 0x10314ae68 in wrap_malloc+0x94 (libclang_rt.asan_osx_dynamic.dylib:arm64e+0x42e68) (BuildId: f0a7ac5c49bc3abc851181b6f92b308a32000000200000000100000000000b00) #1 0x207d8bdd4 in AVAudioIOUnit_OSX::_GetHWFormat(unsigned int, unsigned int*)+0x708 (AVFAudio:arm64e+0xbddd4) (BuildId: 4a3f05007b8c35c98be4e78396ca9eeb32000000200000000100000000030d00) #2 0x207d8aea4 in invocation function for block in AVAudioIOUnit::IOUnitPropertyListener(void*, ComponentInstanceRecord*, unsigned int, unsigned int, unsigned int)+0x15c (AVFAudio:arm64e+0xbcea4) (BuildId: 4a3f05007b8c35c98be4e78396ca9eeb32000000200000000100000000030d00) #3 0x103149f74 in __wrap_dispatch_async_block_invoke+0xc0 (libclang_rt.asan_osx_dynamic.dylib:arm64e+0x41f74) (BuildId: f0a7ac5c49bc3abc851181b6f92b308a32000000200000000100000000000b00) #4 0x1a1d4a870 in _dispatch_call_block_and_release+0x1c (libdispatch.dylib:arm64e+0x2870) (BuildId: 8e87dc0ea5703933b37d5e05ad51620632000000200000000100000000030d00) #5 0x1a1d4c3fc in _dispatch_client_callout+0x10 (libdispatch.dylib:arm64e+0x43fc) (BuildId: 8e87dc0ea5703933b37d5e05ad51620632000000200000000100000000030d00) #6 0x1a1d53a84 in _dispatch_lane_serial_drain+0x298 (libdispatch.dylib:arm64e+0xba84) (BuildId: 8e87dc0ea5703933b37d5e05ad51620632000000200000000100000000030d00) #7 0x1a1d545f4 in _dispatch_lane_invoke+0x17c (libdispatch.dylib:arm64e+0xc5f4) (BuildId: 8e87dc0ea5703933b37d5e05ad51620632000000200000000100000000030d00) #8 0x1a1d5f240 in _dispatch_workloop_worker_thread+0x284 (libdispatch.dylib:arm64e+0x17240) (BuildId: 8e87dc0ea5703933b37d5e05ad51620632000000200000000100000000030d00) #9 0x1a1ef8070 in _pthread_wqthread+0x11c (libsystem_pthread.dylib:arm64e+0x3070) (BuildId: b401cfb38dfe32db92b3ba8af0f8ca6e32000000200000000100000000030d00) #10 0x1a1ef6d90 in start_wqthread+0x4 (libsystem_pthread.dylib:arm64e+0x1d90) (BuildId: b401cfb38dfe32db92b3ba8af0f8ca6e32000000200000000100000000030d00) This is on a Macbook M2 Pro running MacOS 13.3.1 (a) (22E772610a). What is the best way to proceed with this, it looks to me like a bug in AVAudioEngine/CoreAudio. Best regards, Jacob Gorm Hansen
Posted
by
Post not yet marked as solved
1 Replies
981 Views
I'm trying to change the audio input (microphone) between all the available devices from AVAudioSession.sharedInstance().availableInputs. I'm using AVAudioSession.routeChangeNotification to pick up automatic route changes when devices are connected/disconnected, changing the preferred input with setPreferredInput and then restarting my audio engine, and that works fine. But when I try to change the preferred input programmatically, the engine's capture inputNode doesn't change; it keeps capturing from the last connected device. Even though AVAudioSession.sharedInstance().currentRoute.inputs changes, audioEngine?.inputNode doesn't respond to the setPreferredInput call. WhatsApp seems to have done this without any issues. Any suggestions or leads are highly appreciated. Thanks.
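For context, a minimal sketch of the pattern described (stop, switch the preferred input, rebuild the tap, restart); whether this also refreshes inputNode in every case is exactly the open question here:

import AVFoundation

// Sketch: switch the capture device, assuming the engine can be stopped
// and the input tap rebuilt around the route change.
func switchInput(to input: AVAudioSessionPortDescription, engine: AVAudioEngine) throws {
    engine.stop()
    engine.inputNode.removeTap(onBus: 0)
    try AVAudioSession.sharedInstance().setPreferredInput(input)
    // Re-query the format after the route change; it may differ per device.
    let format = engine.inputNode.outputFormat(forBus: 0)
    engine.inputNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, time in
        // Handle captured audio here.
    }
    try engine.start()
}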
Posted
by
Post not yet marked as solved
0 Replies
498 Views
Thank you for this new API. Today, when using AUVoiceIO, voice gets processed ahead of rendering to ensure the echo canceller is capable of filtering it out from the input. Will other audio be processed in the same way, for example rendered as mono at a 16 kHz sampling rate? I'm asking because I'm wondering whether this API will unlock the ability to use wide-band, stereo, high-quality other audio (for example game audio) while also using voice. Thanks!
Posted
by
Post not yet marked as solved
0 Replies
919 Views
New in iOS 17, we can control the amount of 'other audio' ducking through AVAudioEngine. Is this also possible with AVAudioSession? In my app I don't use voice input, but I do play voice audio while music from other apps plays in the background. Often the music either drowns out the voice, if I use the .mixWithOthers option, or the ducked music is not loud enough if I use .duckOthers. It would be awesome to have the level of control that AVAudioEngine has.
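For reference, the AVAudioEngine-side control referenced above is, as far as I can tell, the voice-processing ducking configuration on the input node introduced in iOS 17; a sketch, to be verified against the current SDK:

import AVFoundation

let engine = AVAudioEngine()
// The ducking configuration requires voice processing on the I/O unit.
try engine.inputNode.setVoiceProcessingEnabled(true)

if #available(iOS 17.0, *) {
    // Sketch: reduce how strongly other audio is ducked while voice is active.
    engine.inputNode.voiceProcessingOtherAudioDuckingConfiguration =
        AVAudioVoiceProcessingOtherAudioDuckingConfiguration(
            enableAdvancedDucking: true,
            duckingLevel: .min)
}

Note that this is engine-side only; it does not answer whether an equivalent exists on AVAudioSession, which is the question being asked.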
Posted
by
Post not yet marked as solved
0 Replies
478 Views
Hi, I am developing a POC music player app. I use AVAudioSession; I have implemented background music and integration with the command center. I am now focusing on volume. I am able to receive volume changes with systemVolumeDidChange. As for setting the volume, I am able to set it using MPVolumeView, but not for remote Wi-Fi audio devices (for example, HomePods). I have the following open points: the native Podcasts app is able to control volume when connected to HomePods, how does it do that? The native Podcasts app also has icons for AirPods, HomePods, even car Bluetooth. Are there icon properties for audioSession.currentRoute.outputs, or what should I use instead? Here is an example of what I would like to achieve:
Posted
by
Post not yet marked as solved
0 Replies
623 Views
The Situation

I'm on macOS and I have an AVCaptureSession with camera and audio device inputs which are fed into an AVCaptureMovieFileOutput. What I am looking for is a way to map audio device input channels to file output audio channels, preferably using an explicit channel map. By default, AVCaptureMovieFileOutput takes (presumably) the maximum number of input channels from an audio device that matches an audio format supported by the capture output, and records all of them. This works as expected for mono devices like the built-in microphone and stereo USB mics, the result being either a 1ch mono or a 2ch stereo audio track in the recorded media file. However, the user experience breaks down for 2ch input devices that have an input signal on only one channel, which is reasonable for a 2ch audio interface with one mic connected. This produces a stereo track with the one input channel panned hard to one side. It gets even weirder for multichannel interfaces. For example, an 8ch audio input device results in a 7.1 audio track in the recorded media file with input audio mapped to separate tracks. This is far from ideal during playback, where audio sources are surprisingly coming from seemingly random directions.

The Favored Solution

Ideally, users should be able to select which channels of their audio input device will be mapped to which audio channel in the recorded media file via UI. The resulting channel map would be configured somewhere on the capture session.

The Workaround

I have found that AVCaptureFileOutput does not respond well to channel layouts that are not standard audio formats like mono, stereo, quadrophonic, 5.1, and 7.1. This means channel descriptions and channel bitmaps are out of the question. What does work is configuring the output with one of the supported channel layouts and disabling audio channels via AVCaptureConnection. With that, the output's encoder produces reasonable results for mono and stereo input devices, if the configured channel layout is kAudioChannelLayoutTag_Stereo, but anything else is mixed down to mono. I am somewhat sympathetic to this solution insofar as, in lieu of an explicit channel map, the best guess the audio encoder could make is mixing every enabled channel down to mono. But, as described above, this breaks for 2ch input devices where only one channel is connected to a signal source. The result is a stereo track with audio hard panned to one side.

The Question

Is there a way to implement the described favored solution with AVCapture* API only, and if not, what's the preferred way of dealing with this scenario: going directly for AVAudioEngine and AVAssetWriter?
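For reference, the channel-disabling workaround described above might look roughly like this on macOS (a sketch with an assumed helper name; it does not solve the underlying mapping problem):

import AVFoundation

// Sketch (macOS): keep only the first audio channel enabled on the file
// output's audio connection, so a 2ch interface with one mic connected
// doesn't end up hard-panned. Assumes `movieOutput` is already added to
// a configured AVCaptureSession.
func keepOnlyFirstChannel(on movieOutput: AVCaptureMovieFileOutput) {
    guard let audioConnection = movieOutput.connection(with: .audio) else { return }
    for (index, channel) in audioConnection.audioChannels.enumerated() {
        channel.isEnabled = (index == 0)
    }
}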
Posted
by
Post not yet marked as solved
2 Replies
1.3k Views
I cannot seem to create an AVAudioFile from a URL to be played in an AVAudioEngine. Here is my complete code, following the documentation.

import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController {
    let audioEngine = AVAudioEngine()
    let audioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        streamAudioFromURL(urlString: "https://samplelib.com/lib/preview/mp3/sample-9s.mp3")
    }

    func streamAudioFromURL(urlString: String) {
        guard let url = URL(string: urlString) else {
            print("Invalid URL")
            return
        }
        let audioFile = try! AVAudioFile(forReading: url)
        let audioEngine = AVAudioEngine()
        let playerNode = AVAudioPlayerNode()
        audioEngine.attach(playerNode)
        audioEngine.connect(playerNode, to: audioEngine.outputNode, format: audioFile.processingFormat)
        playerNode.scheduleFile(audioFile, at: nil, completionCallbackType: .dataPlayedBack) { _ in
            /* Handle any work that's necessary after playback. */
        }
        do {
            try audioEngine.start()
            playerNode.play()
        } catch {
            /* Handle the error. */
        }
    }
}

I am getting the following error on let audioFile = try! AVAudioFile(forReading: url):

Thread 1: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=com.apple.coreaudio.avfaudio Code=2003334207 "(null)" UserInfo={failed call=ExtAudioFileOpenURL((CFURLRef)fileURL, &_extAudioFile)}

I have tried many other .mp3 file URLs as well as .wav and .m4a and none seem to work. The documentation makes this look so easy, but I have been trying for hours to no avail. If you have any suggestions, they would be greatly appreciated!
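A likely cause, offered as an observation rather than a confirmed diagnosis: AVAudioFile(forReading:) opens files through ExtAudioFileOpenURL and works with local file URLs, not with HTTP streaming URLs, so one approach is to download the data to a local file first. A minimal sketch (helper name is illustrative):

import AVFoundation

// Sketch: download the remote file to a temporary local URL, then open
// it with AVAudioFile. Error handling kept minimal for brevity.
func loadRemoteAudioFile(from remoteURL: URL,
                         completion: @escaping (AVAudioFile?) -> Void) {
    URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, _ in
        guard let tempURL = tempURL else { completion(nil); return }
        // Keep the original file extension so the format can be recognized.
        let localURL = FileManager.default.temporaryDirectory
            .appendingPathComponent(remoteURL.lastPathComponent)
        try? FileManager.default.removeItem(at: localURL)
        do {
            try FileManager.default.moveItem(at: tempURL, to: localURL)
            completion(try AVAudioFile(forReading: localURL))
        } catch {
            completion(nil)
        }
    }.resume()
}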
Posted
by
Post not yet marked as solved
1 Replies
939 Views
I am analysing sounds by tapping the mic on the Mac. All is working well, but it disrupts other sounds that I assume are low priority, e.g. dragging an item off the Dock, sending a message in Messages, speaking something in Shortcuts or Terminal. Other sounds, like Music.app playing or Siri speaking, are not disrupted. The disruption sounds like the last part of the sound being repeated two extra times, very noticeable. This is the code:

import Cocoa
import AVFAudio

class AudioHelper: NSObject {
    let audioEngine = AVAudioEngine()

    func start() async throws {
        audioEngine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: nil) { buffer, time in
        }
        try audioEngine.start()
    }
}

I have tried increasing the buffer size, changing the QoS to utility (in the hope the sound analysis would become less important than the disrupted sounds), and running on a non-main thread, but no luck. macOS 13.4.1. Any assistance would be appreciated.
Posted
by
Post not yet marked as solved
0 Replies
1.1k Views
I have a timer in one of my apps. I now want to add audio that plays at the end of the timer. It's a workout app, and the sound should remind the user that it is time for the next exercise. The audio should duck music playback (Apple Music / Spotify) and also work in the background. Background audio is enabled for the app. I am not able to achieve everything at the same time. I set the audio session to the playback category with the duckOthers option:

do {
    try AVAudioSession.sharedInstance().setCategory(
        .playback,
        options: .duckOthers
    )
} catch {
    print(error)
}

For playback I just use AVAudioPlayer. When the user starts the timer, I schedule a timer in the future and play the sound. While this works perfectly in the foreground, the sound is not played back when going to the background, as timers are not fired in the background, but rather when the user puts the app back in the foreground. I have also tried using AVAudioEngine and AVAudioPlayerNode, as the latter can start playback delayed. The case from above works now, but the audio ducking begins immediately when initialising the AVAudioEngine, which is also not what I want. Is there any other approach that I am not aware of?
Posted
by
Post not yet marked as solved
0 Replies
682 Views
Hi, I'm working hard with Logic Pro and it's the 4th time that the application crashes. This is report I receive. What can I do to fix? Thank you in advance Translated Report (Full Report Below) Process: Logic Pro X [1433] Path: /Applications/Logic Pro X.app/Contents/MacOS/Logic Pro X Identifier: com.apple.logic10 Version: 10.7.7 (5762) Build Info: MALogic-5762000000000000~2 (1A85) App Item ID: 634148309 App External ID: 854029738 Code Type: X86-64 (Native) Parent Process: launchd [1] User ID: 501 Date/Time: 2023-07-01 09:16:42.7422 +0200 OS Version: macOS 13.3.1 (22E261) Report Version: 12 Bridge OS Version: 7.4 (20P4252) Anonymous UUID: F5E0021C-707D-3E26-12BC-6E1D779A746A Time Awake Since Boot: 2700 seconds System Integrity Protection: enabled Crashed Thread: 0 Dispatch queue: com.apple.main-thread Exception Type: EXC_BAD_ACCESS (SIGSEGV) Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000010 Exception Codes: 0x0000000000000001, 0x0000000000000010 Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11 Terminating Process: exc handler [1433] VM Region Info: 0x10 is not in any region. Bytes before following region: 140737486778352 REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL UNUSED SPACE AT START ---> shared memory 7fffffe7f000-7fffffe80000 [ 4K] r-x/r-x SM=SHM Thread 0 Crashed:: Dispatch queue: com.apple.main-thread 0 Logic Pro X 0x108fe6972 0x108a75000 + 5708146 1 Logic Pro X 0x108def2d3 0x108a75000 + 3646163 2 Foundation 0x7ff80e4b3f35 __NSFireDelayedPerform + 440 3 CoreFoundation 0x7ff80d623478 CFRUNLOOP_IS_CALLING_OUT_TO_A_TIMER_CALLBACK_FUNCTION + 20 4 CoreFoundation 0x7ff80d622ff3 __CFRunLoopDoTimer + 807 5 CoreFoundation 0x7ff80d622c19 __CFRunLoopDoTimers + 285 6 CoreFoundation 0x7ff80d608f79 __CFRunLoopRun + 2206 7 CoreFoundation 0x7ff80d608071 CFRunLoopRunSpecific + 560 8 HIToolbox 0x7ff817070fcd RunCurrentEventLoopInMode + 292 9 HIToolbox 0x7ff817070dde ReceiveNextEventCommon + 657 10 HIToolbox 0x7ff817070b38 _BlockUntilNextEventMatchingListInModeWithFilter + 64 11 AppKit 0x7ff81069a7a0 _DPSNextEvent + 858 12 AppKit 0x7ff81069964a -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 1214 13 Logic Pro X 0x10a29885d 0x108a75000 + 25311325 14 MAToolKit 0x1117f0e37 0x1116ec000 + 1068599 15 MAToolKit 0x1117f64ae 0x1116ec000 + 1090734 16 AppKit 0x7ff8108864b1 -[NSWindow(NSEventRouting) _handleMouseDownEvent:isDelayedEvent:] + 4330 17 AppKit 0x7ff8107fdcef -[NSWindow(NSEventRouting) _reallySendEvent:isDelayedEvent:] + 404 18 AppKit 0x7ff8107fd93f -[NSWindow(NSEventRouting) sendEvent:] + 345 19 Logic Pro X 0x108ebf486 0x108a75000 + 4498566 20 AppKit 0x7ff8107fc319 -[NSApplication(NSEvent) sendEvent:] + 345 21 Logic Pro X 0x10a2995f4 0x108a75000 + 25314804 22 Logic Pro X 0x10a2990c9 0x108a75000 + 25313481 23 Logic Pro X 0x10a29337f 0x108a75000 + 25289599 24 Logic Pro X 0x10a29962e 0x108a75000 + 25314862 25 Logic Pro X 0x10a2990c9 0x108a75000 + 25313481 26 AppKit 0x7ff810ab6bbe -[NSApplication _handleEvent:] + 65 27 AppKit 0x7ff81068bcdd -[NSApplication run] + 623 28 AppKit 0x7ff81065fed2 NSApplicationMain + 817 29 Logic Pro X 0x10956565d 0x108a75000 + 11470429 30 dyld 0x7ff80d1d441f start + 1903 Thread 1:: caulk.messenger.shared:17 0 libsystem_kernel.dylib 0x7ff80d4ef52e semaphore_wait_trap + 10 1 caulk 0x7ff816da707e caulk::semaphore::timed_wait(double) + 150 2 caulk 0x7ff816da6f9c caulk::concurrent::details::worker_thread::run() + 30 3 caulk 0x7ff816da6cb0 void* 
caulk::thread_proxy<std::__1::tuple<caulk::thread::attributes, void (caulk::concurrent::details::worker_thread::)(), std::__1::tuplecaulk::concurrent::details::worker_thread*>>(void) + 41 4 libsystem_pthread.dylib 0x7ff80d52e1d3 _pthread_start + 125 5 libsystem_pthread.dylib 0x7ff80d529bd3 thread_start + 15 Thread 2:: com.apple.NSEventThread 0 libsystem_kernel.dylib 0x7ff80d4ef5b2 mach_msg2_trap + 10 1 libsystem_kernel.dylib 0x7ff80d4fd72d mach_msg2_internal + 78 2 libsystem_kernel.dylib 0x7ff80d4f65e4 mach_msg_overwrite + 692 3 libsystem_kernel.dylib 0x7ff80d4ef89a mach_msg + 19 4 SkyLight 0x7ff81219f7ac CGSSnarfAndDispatchDatagrams + 160 5 SkyLight 0x7ff8124b8cfd SLSGetNextEventRecordInternal + 284 6 SkyLight 0x7ff8122d8360 SLEventCreateNextEvent + 9 7 HIToolbox 0x7ff81707bfea PullEventsFromWindowServerOnConnection(unsigned int, unsigned char, __CFMachPortBoost*) + 45 8 HIToolbox 0x7ff81707bf8b MessageHandler(__CFMachPort*, void*, long, void*) + 48 9 CoreFoundation 0x7ff80d637e66 __CFMachPortPerform + 244 10 CoreFoundation 0x7ff80d60a5a3 CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE1_PERFORM_FUNCTION + 41 11 CoreFoundation 0x7ff80d60a4e3 __CFRunLoopDoSource1 + 540 12 CoreFoundation 0x7ff80d609161 __CFRunLoopRun + 2694 13 CoreFoundation 0x7ff80d608071 CFRunLoopRunSpecific + 560 14 AppKit 0x7ff8107fa909 _NSEventThread + 132 15 libsystem_pthread.dylib 0x7ff80d52e1d3 _pthread_start + 125 16 libsystem_pthread.dylib 0x7ff80d529bd3 thread_start + 15
Posted
by