AVAudioSession


Use the AVAudioSession object to communicate to the system how you intend to use audio in your app.

AVAudioSession Documentation

Posts under AVAudioSession tag

84 Posts
Post not yet marked as solved
1 Answer
646 Views
I'm trying to update my SwiftUI view when the system volume changes. Ultimately, my use case is to display a low-volume warning overlay when the system volume is low. Right now, though, let's say I'm just trying to show the current volume in a Text label. I've tried using onReceive with a publisher on AVAudioSession.sharedInstance().outputVolume, as below, but my onReceive block only fires once when the view appears, and is not called when the volume subsequently changes. Why does the code below not work, and what is the best way to update SwiftUI views when the volume changes?

```swift
struct Test: View {
    @State var volume: Float = 0

    var body: some View {
        Text("current volume is \(volume)")
            .onReceive(AVAudioSession.sharedInstance().publisher(for: \.outputVolume)) { value in
                self.volume = value
            }
    }
}
```

Thanks!
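A minimal sketch of one possible explanation, assuming the publisher stays silent because the shared session was never activated: outputVolume only posts the KVO changes this Combine publisher relies on while the audio session is active.

```swift
import SwiftUI
import AVFAudio

struct VolumeReadout: View {
    @State private var volume = AVAudioSession.sharedInstance().outputVolume

    var body: some View {
        Text("current volume is \(volume)")
            .onAppear {
                // outputVolume is only KVO-observable while the session is active.
                try? AVAudioSession.sharedInstance().setActive(true)
            }
            .onReceive(AVAudioSession.sharedInstance().publisher(for: \.outputVolume)) { value in
                volume = value
            }
    }
}
```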
Posted Last updated.
Post not yet marked as solved
0 Answers
42 Views
Every time the AVAudioSession is re-activated (after being deactivated) and the audio engine is restarted (calling stop() and play()), the output from the audio player node seems to ignore the audio session category, until the player node is explicitly connected again:

```swift
audioEngine.connect(playerNode, to: audioEngine.outputNode, format: audioFile.processingFormat)
```

The documentation regarding this behavior is not clear, so I would like to clarify the following:
- Should audioEngine.connect be called every time the AVAudioSession is activated?
- Should audioEngine.connect be called after the audio engine is stopped (audioEngine.stop())?
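For reference, a minimal sketch of the restart sequence with the explicit reconnect the post describes; the helper name and the idea of reconnecting on every reactivation are assumptions drawn from the post, not a documented requirement.

```swift
import AVFAudio

// Hypothetical restart helper: re-establish the player-node connection
// each time the session is reactivated, per the workaround above.
func restartPlayback(engine: AVAudioEngine,
                     playerNode: AVAudioPlayerNode,
                     file: AVAudioFile) throws {
    try AVAudioSession.sharedInstance().setActive(true)
    engine.connect(playerNode, to: engine.outputNode, format: file.processingFormat)
    engine.prepare()
    try engine.start()
    playerNode.scheduleFile(file, at: nil, completionHandler: nil)
    playerNode.play()
}
```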
Posted Last updated.
Post not yet marked as solved
1 Answer
62 Views
I would like to know whether the following features can be implemented in Swift, and whether any of them would cause problems in App Review. Almost all of them are DND-related. Where I already know the relevant framework, I have noted it in parentheses ("O" marks the items I believe are possible), and I want to confirm whether they are actually possible with that framework. I'm a beginner in Swift development and couldn't find a solution by searching, so I'm asking here.
- Silence the phone's ringer
- Catch call events when making a call or during a call — O (CallKit)
- Turn off all notifications except those from allowed first-party apps
- Send a notification when the company's app is in the background (via an API call) — O
- Check whether the phone is currently set to sound, vibrate, or silent — O (AVAudioSession)
- Force-switch between sound, vibrate, and silent — O (AVAudioSession)
- Turn off SMS notifications
Posted by dehien. Last updated.
Post not yet marked as solved
0 Answers
53 Views
I am trying to record and play audio from a keyboard extension in Swift, but it throws an error on the line recordingSession.setActive(true):

Error: failed to record. The operation couldn't be completed. (OSStatus error 561015905.)

I have already set the key RequestsOpenAccess to true in Info.plist and granted full access to the keyboard extension.
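As an aside, OSStatus values in this range are four-character codes: 561015905 is 0x21706C61, i.e. '!pla' (AVAudioSession.ErrorCode.cannotStartPlaying). A small sketch for decoding such codes:

```swift
// Decode an OSStatus into its four-character code: 561015905 -> "!pla".
func fourCharCode(from status: Int32) -> String {
    let n = UInt32(bitPattern: status)
    let bytes = stride(from: 24, through: 0, by: -8).map { UInt8((n >> UInt32($0)) & 0xFF) }
    return String(bytes: bytes, encoding: .ascii) ?? "\(status)"
}

print(fourCharCode(from: 561015905)) // "!pla"
```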
Posted Last updated.
Post marked as solved
2 Answers
122 Views
WWDC 2022 announced CallKit for watchOS 9, but didn't explain how to use it. I was able to convert the Speakerbox sample from iOS to watchOS. CallKit itself seems to work fine, but I'm stuck on a networking issue: I cannot understand how to stream RTP audio. I have background audio enabled. If I configure the audio session for CallKit with the .playAndRecord category, I cannot specify the .longFormAudio routing policy, as it's not applicable to this category. Without .longFormAudio, the low-level networking API is denied ("Path was denied by NECP policy").
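For context, a sketch of the constraint being described (the .voiceChat mode here is an assumption): the .longFormAudio routing policy is only accepted with the .playback category, so combining it with .playAndRecord throws.

```swift
import AVFAudio

func configureCallAudio() {
    let session = AVAudioSession.sharedInstance()
    do {
        // Throws: .longFormAudio cannot be combined with .playAndRecord.
        try session.setCategory(.playAndRecord, mode: .voiceChat, policy: .longFormAudio)
    } catch {
        print("setCategory failed: \(error)")
    }
}
```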
Posted by Artem_XZ. Last updated.
Post not yet marked as solved
0 Answers
80 Views
I've got a crash on iOS 16; the same code works fine on other versions. Here is the code I called:

```objc
NSError *error = nil;
ret = [audioSession overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&error];
```

Exception info:

```
Exception Type: SIGABRT
Exception Codes: #0 at 0x1d0524280
Triggered by Thread: 0
Application Specific Information:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSXPCEncoder _checkObject:]: This coder only encodes objects that adopt NSSecureCoding (object is of class 'NSNull').'
Last Exception Backtrace:
0   CoreFoundation    0x00000001914400c8 + 164
1   libobjc.A.dylib   0x0000000191063098 objc_exception_throw + 60
2   Foundation        0x000000019201e7d0 + 308
3   Foundation        0x0000000191fbd184 + 40
4   Foundation        0x0000000192019474 + 184
5   Foundation        0x00000001920248f8 + 568
6   Foundation        0x000000019200901c + 488
7   Foundation        0x0000000191fdaad4 + 252
8   Foundation        0x0000000192007730 + 232
9   Foundation        0x0000000191fdd030 + 1220
10  CoreFoundation    0x00000001913d6dd0 + 1016
11  CoreFoundation    0x00000001913d61a0 _CF_forwarding_prep_0 + 96
12  AudioSession      0x000000019f738f38 + 316
13  AudioSession      0x000000019f738d8c + 156
14  AudioSession      0x000000019f7481c0 + 140
15  Runner            0x00000001025dc728 -[ARTCEngineKit switchAudioCategaryWithSpeaker:] ARTCEngineKit.mm:2013 (in Runner)
```
Posted by SheepWolf. Last updated.
Post marked as solved
1 Answer
189 Views
Hi there, whenever I want to use the microphone for my ShazamKit app while connected to AirPods, my app crashes with an "Invalid input sample rate." message. I've tried multiple formats but keep getting this crash. Any pointers would be really helpful.

```swift
func configureAudioEngine() {
    do {
        try audioSession.setCategory(.playAndRecord, options: [.mixWithOthers, .defaultToSpeaker, .allowAirPlay, .allowBluetoothA2DP, .allowBluetooth])
        try audioSession.setActive(false, options: .notifyOthersOnDeactivation)
    } catch {
        print(error.localizedDescription)
    }

    guard let engine = audioEngine else { return }
    let inputNode = engine.inputNode
    let inputNodeFormat = inputNode.inputFormat(forBus: 0)
    let audioFormat = AVAudioFormat(
        standardFormatWithSampleRate: inputNodeFormat.sampleRate,
        channels: 1
    )

    // Install a "tap" on the audio engine's input so that we can send buffers
    // from the microphone to the signature generator.
    engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: audioFormat) { buffer, audioTime in
        self.addAudio(buffer: buffer, audioTime: audioTime)
    }
}
```
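One variation sometimes suggested (an assumption about the cause, not a confirmed fix): activate the session rather than deactivating it, and tap with the input node's own output format, since a hand-built format whose sample rate disagrees with the actual AirPods route can trigger "Invalid input sample rate".

```swift
try audioSession.setActive(true)

let inputNode = engine.inputNode
// Let the node dictate the tap format instead of constructing one.
let tapFormat = inputNode.outputFormat(forBus: 0)
inputNode.installTap(onBus: 0, bufferSize: 1024, format: tapFormat) { buffer, audioTime in
    self.addAudio(buffer: buffer, audioTime: audioTime)
}
```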
Posted Last updated.
Post not yet marked as solved
0 Answers
93 Views
Voice Isolation does a great job with noise suppression when the user is holding the phone in hand (the FaceTime use case). But when the phone is about 4 feet away from the user, Voice Isolation quality drops substantially, and we are seeing that it is better not to use it. Our use case requires the user to mount the phone on a tripod and sit approximately 4 feet away from the camera. In this case we see the worst performance from Voice Isolation, presumably because of the heavy signal processing and the weaker original signal to begin with.
Posted Last updated.
Post not yet marked as solved
0 Answers
78 Views
I have an app that works like a PTT app. Right now I want to use the remote events detected by MPRemoteCommandCenter, but I cannot. I'm setting up the session with:

```swift
try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
```

Is this possible? With this configuration I cannot receive events correctly. If I change .allowBluetooth to .allowBluetoothA2DP, I receive the events from pressing any of the buttons on the wireless headset, but it will not use the mic from the headset. Any suggestions? Thanks!
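For reference, a minimal sketch of the two configurations being compared, plus a remote-command handler; the trade-off the post observes is that HFP (.allowBluetooth) routes the headset mic, while A2DP (.allowBluetoothA2DP) is playback-only. The handler body is hypothetical.

```swift
import AVFAudio
import MediaPlayer

func configurePTTSession() throws {
    // HFP route: the headset mic is usable, but (per the post) remote
    // events are missed.
    try AVAudioSession.sharedInstance().setCategory(
        .playAndRecord, mode: .default,
        options: [.defaultToSpeaker, .allowBluetooth])

    // A2DP alternative: remote events arrive, but the headset mic is unused:
    // options: [.defaultToSpeaker, .allowBluetoothA2DP]

    let center = MPRemoteCommandCenter.shared()
    _ = center.togglePlayPauseCommand.addTarget { _ in
        // Hypothetical handler: toggle push-to-talk transmit here.
        return .success
    }
}
```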
Posted Last updated.
Post not yet marked as solved
0 Answers
99 Views
In my Unity project I have a button that allows players to record their voice by accessing the microphone. When the recording ends, the app's audio is muted if the phone's silent switch is on; everything works correctly when silent mode is off. The Unity function that ends the recording is Microphone.End(). Thanks.
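A hedged sketch of a workaround often suggested for this symptom, assuming Unity leaves the audio session in a record-oriented state after Microphone.End(): reset the session category from a small native plugin (the function name and bridging are hypothetical). Note that the .playback category ignores the silent switch, while the ambient categories respect it.

```swift
import AVFAudio

// Hypothetical native helper, callable from Unity after Microphone.End().
@_cdecl("ResetAudioSessionAfterRecording")
func resetAudioSessionAfterRecording() {
    let session = AVAudioSession.sharedInstance()
    // .playback keeps playing with the silent switch on; use .ambient
    // instead if the app should respect the switch.
    try? session.setCategory(.playback, mode: .default)
    try? session.setActive(true)
}
```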
Posted by senna125. Last updated.
Post not yet marked as solved
1 Answer
638 Views
The physical-volume-button-press notification "AVSystemController_SystemVolumeDidChangeNotification" stopped working with the release of iOS 15. Observing AVAudioSession.outputVolume (https://developer.apple.com/documentation/avfaudio/avaudiosession/1616533-outputvolume?language=objc) seems like the preferred approach, but the limitation is that it doesn't provide a callback when the volume is already at max (press up) or min (press down). Is there any alternative to get the callback even when the volume is at max or min? There is a new notification available, "SystemVolumeDidChange", that works similarly to "AVSystemController_SystemVolumeDidChangeNotification". Can you please confirm whether it is a private API, or whether it may be used in apps published on the App Store? Our use case requires the physical volume button press notification even if the volume is already at max or min. Is there any other alternative approach available?
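For reference, a minimal sketch of the documented KVO approach the post refers to; as noted above, it only fires when the value actually changes, so presses at the limits are invisible.

```swift
import AVFAudio

final class VolumeWatcher {
    private var observation: NSKeyValueObservation?

    func start() {
        let session = AVAudioSession.sharedInstance()
        try? session.setActive(true) // outputVolume is observable while active
        observation = session.observe(\.outputVolume, options: [.new]) { _, change in
            print("volume changed to \(change.newValue ?? 0)")
        }
    }
}
```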
Posted by Yash12. Last updated.
Post not yet marked as solved
6 Answers
1.7k Views
What are the requirements to support the Voice Isolation / Wide Spectrum microphone modes on iOS 15? I see that it's possible to programmatically display the selection menu, but the new options say they are unavailable inside my app. I have a dummy app that creates a standard AVAudioSession and sets the mode to .voiceChat (I have tried many values here), but I still can't switch the microphone mode. It also has the VoIP flag enabled on the Capabilities tab. Docs for showing the Microphone Mode prompt: https://developer.apple.com/documentation/avfoundation/avcapturedevice/systemuserinterface/microphonemodes
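A minimal sketch of the documented calls involved, assuming an active audio session: presenting the picker goes through AVCaptureDevice, and the current/preferred modes can be inspected but not set programmatically.

```swift
import AVFoundation

// Show the system Microphone Modes picker (per the docs linked above).
func presentMicModePicker() {
    if AVCaptureDevice.preferredMicrophoneMode != .voiceIsolation {
        AVCaptureDevice.showSystemUserInterface(.microphoneModes)
    }
    // Read-only: the mode the system actually applied.
    print("active mode:", AVCaptureDevice.activeMicrophoneMode.rawValue)
}
```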
Posted Last updated.
Post not yet marked as solved
0 Answers
216 Views
Since iOS 15.4 I have been having a serious issue with AirPlay. Please note, this issue is against the AVAudio framework, and I reproduced it with every project that uses this framework. For example: React Native TrackPlayer has this problem, the Swift Radio Project has this issue too, and I found some modules on GitHub that report the same issue.

What happens: When I open my app and start playing my audio stream, it starts playing on the iPhone. When I change the output to my MacBook Pro or HomePod (or any other AirPlay device), there is no sound. The strange thing is that the HomePod lights up, but there is no sound. When stopping and starting the stream again while connected via AirPlay, it works correctly. However, switching the output back to the iPhone stops the stream again.

What I expected: I expected AirPlay to work as it did prior to iOS 15.4, meaning the audio keeps playing when switching from iPhone to HomePod and vice versa. I tested it with an older iOS version (15.2), which was available a few months ago, and it worked perfectly. So this is certainly an issue with the newer versions.

What causes the issue: The issue seems to occur only with live audio streams. When I test with an MP3 (it doesn't matter whether it is offline or online), it works perfectly fine and as intended. This problem only happens with live audio streams. I have also sent a bug report many times, but Apple didn't reply and the bug is still there in iOS 15.5. I hoped Apple would be more interested in such breaking bugs. I also tried removing .longFormAudio and adding .mixWithOthers. That works, but then the controls in Control Center and on the home screen are missing, of course. Is there any way to solve this, or a way to talk to the Apple developers so we can see what we can do about it?
Posted Last updated.
Post not yet marked as solved
0 Answers
142 Views
I'm working on an application that records audio. The user can choose between quality settings, and one of the settings results in a FLAC recording. Here are the settings:

```swift
[
    AVFormatIDKey: kAudioFormatFLAC,
    AVSampleRateKey: 48000,
    AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue,
    AVEncoderBitRateKey: 192000,
    AVLinearPCMBitDepthKey: 32,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsFloatKey: false
]
```

The AVAudioRecorder code is pretty simple:

```swift
...
let recorder = try AVAudioRecorder(url: outputUrl, settings: audioQuality.settings)
recorder.delegate = self
recorder.isMeteringEnabled = true
recorder.prepareToRecord()
recorder.record()
self.recorder = recorder
...
```

These settings used to work, at least on iOS 15.1. Now on iOS 15.4.1 (maybe earlier) the audio is corrupted. There is no issue with AVAudioPlayer output. It starts recording, displays time, and stops recording. And the AVAudioRecorder delegate function audioRecorderDidFinishRecording(_:successfully:) is called after recorder.stop() with the successfully flag == true. It works fine with other settings, and it used to work with the given settings earlier. Any idea why this started to happen? PS: I've tried changing the settings to what I found on Stack Overflow (flac) but with no success.
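One hedged simplification worth trying (an assumption, not a confirmed fix): the AVLinearPCM* keys describe linear PCM only, and FLAC is lossless so AVEncoderBitRateKey arguably has no effect; a reduced dictionary removes those variables.

```swift
import AVFAudio

// Hypothetical reduced FLAC settings: drop keys that don't apply to the
// FLAC encoder and let it choose its own parameters.
let flacSettings: [String: Any] = [
    AVFormatIDKey: Int(kAudioFormatFLAC),
    AVSampleRateKey: 48_000,
    AVNumberOfChannelsKey: 1, // assumption: mono capture
    AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue,
]
```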
Posted by Qicetosh. Last updated.
Post not yet marked as solved
0 Answers
141 Views
I am trying to use AVAudioEngine to listen to mic samples and play them simultaneously via an external speaker or headphones (assuming they are attached to the iOS device). I tried the following using AVAudioPlayerNode and it works, but there is too much delay in the audio playback. Is there a way to hear the sound in realtime, without delay? I wonder why the scheduleBuffer API has so much delay.

```swift
var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!
var audioEngineRunning = false

public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.inputFormat(forBus: 0)
    playerNode = AVAudioPlayerNode()
    engine.attach(playerNode)
    self.mixer = engine.mainMixerNode
    engine.connect(self.playerNode, to: self.mixer, format: playerNode.outputFormat(forBus: 0))
    engine.inputNode.installTap(onBus: 0, bufferSize: 4096, format: format, block: { buffer, time in
        self.playerNode.scheduleBuffer(buffer, completionHandler: nil)
    })
    do {
        engine.prepare()
        try self.engine.start()
        audioEngineRunning = true
        self.playerNode.play()
    } catch {
        print("error couldn't start engine")
        audioEngineRunning = false
    }
}
```
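A lower-latency sketch, assuming direct monitoring is acceptable: skip the player node and buffer scheduling (each 4096-frame tap buffer alone adds roughly 85 ms at 48 kHz before scheduling even starts) and connect the input node straight to the mixer, optionally requesting a shorter IO buffer.

```swift
import AVFAudio

func setupLowLatencyMonitoring() throws -> AVAudioEngine {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    // Ask for a ~5 ms IO buffer; the hardware may grant something larger.
    try session.setPreferredIOBufferDuration(0.005)
    try session.setActive(true)

    let engine = AVAudioEngine()
    let format = engine.inputNode.outputFormat(forBus: 0)
    // Route the mic straight to output; no tap, no scheduleBuffer queue.
    engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)
    engine.prepare()
    try engine.start()
    return engine
}
```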
Posted Last updated.
Post not yet marked as solved
0 Answers
160 Views
I am using AVAudioSession with the playAndRecord category as follows:

```swift
private func setupAudioSessionForRecording() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setActive(false)
        try audioSession.setPreferredSampleRate(Double(48000))
    } catch {
        NSLog("Unable to deactivate Audio session")
    }
    let options: AVAudioSession.CategoryOptions = [.allowAirPlay, .mixWithOthers]
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playAndRecord, mode: AVAudioSession.Mode.default, options: options)
    } catch {
        NSLog("Could not set audio session category \(error)")
    }
    do {
        try audioSession.setActive(true)
    } catch {
        NSLog("Unable to activate AudioSession")
    }
}
```

Next I use AVAudioEngine to repeat what I say in the microphone to the external speakers (on a TV connected to the iPhone with an HDMI cable):

```swift
//MARK:- AudioEngine
var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!
var audioEngineRunning = false

public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    engine.connect(self.engine.inputNode, to: self.engine.outputNode, format: nil)
    do {
        engine.prepare()
        try self.engine.start()
    } catch {
        print("error couldn't start engine")
    }
    audioEngineRunning = true
}

public func stopAudioEngine() {
    engine.stop()
    audioEngineRunning = false
}
```

The issue is that after I speak for a few seconds, I hear a kind of reverb/humming noise that keeps getting amplified and repeated. If I use a RemoteIO unit instead, no such noise comes out of the speakers. I am not sure whether my setup of AVAudioEngine is correct. I have tried all kinds of AVAudioSession configurations, but nothing changes. A sample recording with the background speaker noise is posted in this Stack Overflow question: https://stackoverflow.com/questions/72170548/echo-when-using-avaudioengine-over-hdmi#comment127514327_72170548
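A sketch of one mitigation to try, assuming the noise is a feedback loop between the open microphone and the speakers: enable voice processing (which includes echo cancellation) on the input node before connecting it. This is an assumption about the cause, not a confirmed diagnosis.

```swift
import AVFAudio

func setupEchoCancelledPassthrough() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    // Must be enabled before the engine starts; turns on the system's
    // voice processing (AEC) for the input/output pair.
    try engine.inputNode.setVoiceProcessingEnabled(true)
    engine.connect(engine.inputNode, to: engine.outputNode, format: nil)
    engine.prepare()
    try engine.start()
    return engine
}
```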
Posted Last updated.
Post marked as solved
1 Answer
145 Views
We start a voice recording via:

```swift
self.avAudioRecorder = try AVAudioRecorder(
    url: self.recordingFileUrl,
    settings: settings
)
self.avAudioRecorder.record()
```

At a certain point, we stop the recording via:

```swift
self.avAudioRecorder.stop()
```

I was wondering, is it safe to perform a file copy on self.recordingFileUrl immediately after self.avAudioRecorder.stop()? Has all recording data been flushed to self.recordingFileUrl, and has the file been closed properly?
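A conservative pattern, as a sketch: treat the AVAudioRecorderDelegate callback, rather than the return of stop(), as the point where the file is complete. The class name and backupUrl destination here are hypothetical.

```swift
import AVFAudio

final class RecordingDelegate: NSObject, AVAudioRecorderDelegate {
    let backupUrl: URL // hypothetical copy destination
    init(backupUrl: URL) { self.backupUrl = backupUrl }

    func audioRecorderDidFinishRecording(_ recorder: AVAudioRecorder,
                                         successfully flag: Bool) {
        guard flag else { return }
        // By the time this fires, the recorder has finished writing the file.
        try? FileManager.default.copyItem(at: recorder.url, to: backupUrl)
    }
}
```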
Posted by yccheok. Last updated.
Post not yet marked as solved
2 Answers
798 Views
I want to record both IMU data and audio data from AirPods Pro. I have tried many times and failed. I can successfully record the IMU data and the iPhone's microphone data simultaneously, but when I choose the AirPods Pro microphone in the setCategory() call, the IMU data collection stops. If I change recordingSession.setCategory(.playAndRecord, mode: .default, options: .allowBluetooth) to recordingSession.setCategory(.playAndRecord, mode: .default), everything is okay except the audio is recorded from the phone. If I add options: .allowBluetooth, the IMU updates stop. Could you give me some suggestions? Below are some parts of my code.

```swift
let My_IMU = CMHeadphoneMotionManager()
let My_writer = CSVWriter()
var write_state: Bool = false

func test() {
    recordingSession = AVAudioSession.sharedInstance()
    do {
        try recordingSession.setCategory(.playAndRecord, mode: .default, options: .allowBluetooth)
        try recordingSession.setActive(true)
        recordingSession.requestRecordPermission() { [unowned self] allowed in
            DispatchQueue.main.async {
                if allowed == false { print("failed to record!") }
            }
        }
    } catch {
        print("failed to record!")
    }

    let audioFilename = getDocumentsDirectory().appendingPathComponent("test_motion_Audio.m4a")
    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 8000,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    do {
        audioRecorder = try AVAudioRecorder(url: audioFilename, settings: settings)
        audioRecorder.delegate = self
        audioRecorder.record()
    } catch {
        print("Fail to record!")
        finishRecording()
    }

    write_state.toggle()
    let dir = FileManager.default.urls(
        for: .documentDirectory,
        in: .userDomainMask
    ).first!

    let filename = "test_motion_Audio.csv"
    let fileUrl = dir.appendingPathComponent(filename)
    My_writer.open(fileUrl)

    // APP is presumably the CMHeadphoneMotionManager instance (My_IMU above).
    APP.startDeviceMotionUpdates(to: OperationQueue.current!, withHandler: { [weak self] motion, error in
        guard let motion = motion, error == nil else { return }
        self?.My_writer.write(motion)
    })
}
```
Posted by wang0665. Last updated.