I'm using Release 34 (Safari 11.0, WebKit 12604.1.27.0.1). I use the following code to get microphone input and forward the audio to the speakers -- note that I use a headset, so there is no feedback.

```javascript
var _streamSource;
var _audioContext = new webkitAudioContext();

function startMic() {
    var constraints = { audio: true, video: false };
    navigator.mediaDevices.getUserMedia(constraints)
        .then(micStarted)
        .catch(micDenied);
}

function micStarted(stream) {
    // Route the microphone stream straight to the output device.
    _streamSource = _audioContext.createMediaStreamSource(stream);
    _streamSource.connect(_audioContext.destination);
}

function micDenied(error) {
    console.log(error);
}
```

The playback of the mic input is horribly distorted, with a buzzing sound that makes it hard to listen to. I tried the same code in other browsers and the microphone audio plays back fine, without the buzz.
Search results for "Popping Sound": 19,356 results found
Hi, I have a simple AVAudioPlayer which runs with a custom rate from -16% to +16%. If the rate is not exactly 1.0, the sound quality is really bad. I tried 48 kHz AIFF and WAV formats and set a 48 kHz sample rate on my audio session as well. The quality is still bad... can anyone help me?
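For reference, a minimal sketch of the setup being described, assuming a bundled file named loop.wav (a placeholder) and the -16%...+16% range mapped to rate values 0.84...1.16; enableRate has to be set before playback starts for rate changes to take effect:

```swift
import AVFoundation

final class VariableRatePlayer {
    private var player: AVAudioPlayer?

    /// Plays a bundled sound at the given rate. "loop.wav" is a placeholder asset name.
    func play(rate: Float) throws {
        guard let url = Bundle.main.url(forResource: "loop", withExtension: "wav") else { return }
        let p = try AVAudioPlayer(contentsOf: url)
        p.enableRate = true       // must be set before prepareToPlay()/play()
        p.rate = rate             // e.g. 0.84 ... 1.16 for -16% ... +16%
        p.prepareToPlay()
        p.play()
        player = p                // keep a strong reference while playing
    }
}
```

If the built-in time stretching is not good enough, AVAudioEngine with an AVAudioUnitTimePitch node offers the same kind of rate control plus a separate pitch parameter, which may be worth comparing.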
Dear Apple, I am currently using your iPad Pro 12.9" for a lot of my editing work, using Lightroom Mobile, Photoshop Mobile and the LumaFusion video app. As a photographer it was great to get support for RAW files, especially when importing DSLR picture files from an SD card into the iPad. But when doing video editing on the iPad, I am missing the ability to import externally recorded sound from an SD card. The Apple Lightning adapter only supports video and picture files, not sound files. Why is that? The only way I can get my externally recorded sound into the LumaFusion video app is to use a Wi-Fi hotspot unit that supports SD cards. Using that unit I am able to connect my iPad to the hotspot and transfer the audio files onto the iPad, where I can then use them for my video editing. I just downloaded the beta version of iOS 11 on my iPad Pro and was hoping that the function of letting me import sound files into the iPad by using the original Apple Lightning...
When using my 2015 MacBook Pro plugged into my audio interface for playback, I'm getting a crackling noise. When I play back directly from my Mac with headphones or through the built-in speakers it plays fine. I've tried replacing the interface, cords and wires, and still no luck. If I can't fix this I'll probably sell my Mac.
Hello, I want to record and play back audio on my iOS device. Can someone give me a short hint as to which classes are useful for this use case? I googled around this topic a lot and read a lot of different things, but I'm not really sure which classes are the right ones for me. Examples: AVAudioSinkNode, AVAudioSourceNode, AVAudioSession, AVAudioRecorder, AVAudioPlayer, AVAudioQueue, etc. Maybe someone could show me a code snippet or sample project (if possible). My current state: I would try it with AVAudioRecorder and AVAudioPlayer. My recorder class logic:

```swift
if let dir = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first {
    let file = dir.appendingPathComponent("audiofile.wav")
    // setup audio session with crazy framework
    let format: [String: Any] = [
        AVSampleRateKey: 44100.0,
        AVNumberOfChannelsKey: 2,
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVEncoderBitRateKey: 320000,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    // `recorder` is assumed to be a stored property; this runs in a throwing setup method.
    recorder = try AVAudioRecorder(url: file, settings: format)
}
```
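Not the author's code, just a hedged sketch of the two pieces the snippet above leaves out: the audio session setup hinted at by the comment, and simple playback of the recorded file with AVAudioPlayer. The category, mode and options values and the convention of returning the player are assumptions.

```swift
import AVFoundation

// Session setup alluded to by the "setup audio session" comment above.
func configureSessionForRecordAndPlayback() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
    try session.setActive(true)
}

// Plays back the file written by the recorder; AVAudioPlayer is enough for simple playback.
// The caller must keep a strong reference to the returned player while it plays.
func playRecording() throws -> AVAudioPlayer? {
    guard let dir = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first else {
        return nil
    }
    let file = dir.appendingPathComponent("audiofile.wav")
    let player = try AVAudioPlayer(contentsOf: file)
    player.prepareToPlay()
    player.play()
    return player
}
```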
How can I identify whether recorded audio is pleasant or unpleasant in iOS? I am recording the sound of a motor vehicle and want to measure whether the sound is pleasant/unpleasant or noisy; basically I want to measure the motor sound of a vehicle for maintenance purposes in iOS.
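Pleasant versus unpleasant is ultimately a subjective or domain-specific judgment, but as a sketch of the measurement side only, AVAudioRecorder's built-in metering reports average and peak levels in dBFS. This assumes an already configured, record-capable audio session, and it says nothing about how to classify the readings:

```swift
import AVFoundation

// Rough loudness probe using AVAudioRecorder metering.
// Only quantifies level in dBFS; classifying "pleasant" vs. "unpleasant"
// would need further, domain-specific analysis.
func startMeteredRecording(to url: URL) throws -> AVAudioRecorder {
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVSampleRateKey: 44_100.0,
        AVNumberOfChannelsKey: 1
    ]
    let recorder = try AVAudioRecorder(url: url, settings: settings)
    recorder.isMeteringEnabled = true
    recorder.record()
    return recorder
}

// Call periodically (e.g. from a timer) to sample the current levels.
func currentLevels(of recorder: AVAudioRecorder) -> (average: Float, peak: Float) {
    recorder.updateMeters()
    return (recorder.averagePower(forChannel: 0), recorder.peakPower(forChannel: 0))
}
```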
Hi all, I need to be able to capture the audio being sent to a specific audio driver (such as the internal speakers) and manipulate it. Is there a specific API that allows me to capture audio without inserting anything into the audio graph? (The use case is to add audio to an NDI video feed for ingestion into another application.) Thanks!
Hello, I started to set up stereo audio recording (both audio and video are recorded) and the audio quality seems to be lower than the quality obtained with the native Camera application (configured for stereo). Using Console to check the log, I found a difference between the Camera app and mine regarding MXSessionMode (of mediaserverd): the Camera application gives MXSessionMode = SpatialRecording, while mine gives MXSessionMode = VideoRecording. How can I configure the capture session to finally get MXSessionMode = SpatialRecording? Any suggestions? Best regards
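For comparison, below is a sketch of the usual stereo built-in-microphone configuration on the shared AVAudioSession (iOS 14+). Whether this alone changes MXSessionMode is unverified; when an AVCaptureSession is involved you may also need to set automaticallyConfiguresApplicationAudioSession = false so the capture session does not override it. The portrait orientation and data-source choice below are assumptions for a portrait capture:

```swift
import AVFoundation

// Standard stereo built-in-mic configuration via the shared audio session (iOS 14+).
func configureStereoCapture() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .videoRecording, options: [])

    // Find the built-in mic and a data source that supports the stereo polar pattern.
    guard
        let builtInMic = session.availableInputs?.first(where: { $0.portType == .builtInMic }),
        let stereoSource = builtInMic.dataSources?.first(where: {
            $0.supportedPolarPatterns?.contains(.stereo) == true
        })
    else { return }

    try session.setPreferredInput(builtInMic)
    try builtInMic.setPreferredDataSource(stereoSource)
    try stereoSource.setPreferredPolarPattern(.stereo)
    try session.setPreferredInputOrientation(.portrait)   // match the UI orientation
    try session.setActive(true)
}
```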
I have a link in a UITextView in a view controller which is part of a UINavigationController that I present modally. When I peek and pop (force touch / 3D Touch) twice on the link, it opens the link in Safari, which is the default behavior. But when I go back to my app, the view controller is dismissed. Has anyone encountered this issue, and do you know how to stop the view controller from being dismissed? Thanks in advance.
I want to add vibration and sound to the notification on Apple Watch when it arrives there. I don't know how to do it. Currently my notifications are being displayed on the Apple Watch, but without any sound or vibration. I want to add these to the notification.
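For a local notification, the sound (and the accompanying haptic when the notification is delivered to Apple Watch) comes from the notification content itself; a minimal sketch, assuming notification authorization including the .sound option has already been granted:

```swift
import UserNotifications

// Schedules a local notification that plays the default sound;
// when it is delivered to Apple Watch, the default haptic accompanies it.
func scheduleTestNotification() {
    let content = UNMutableNotificationContent()
    content.title = "Test"
    content.body = "Notification with sound"
    content.sound = .default   // omit this and the notification is silent

    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)
    let request = UNNotificationRequest(identifier: UUID().uuidString,
                                        content: content,
                                        trigger: trigger)
    UNUserNotificationCenter.current().add(request)
}
```

For remote notifications the equivalent is including "sound": "default" in the aps payload.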
Is anyone having issues with inputs on external interfaces such as MOTU, Apollo or PreSonus USB/FireWire/Thunderbolt 2/3 devices? Specifically the MOTU UltraLite mk3 Hybrid. The drivers work in High Sierra, but in Mojave, for some reason, the inputs show up in Audio MIDI Setup but are completely greyed out. Outputs are working fine.
Hi Apple Team, we have a technical query regarding one feature: audio recognition and live captioning. We are developing an app for the deaf community to remove communication barriers. We want to know whether there is any possibility to recognize the sound from other applications on an iPhone and show live captions in our application (based on iOS).
I want to use the APPs to listen to spatialized audio that I am synthetically generating, and have that audio be at a certain location with respect to the head. As you turn your head, the sound moves; as you walk around, you can hear the sound at different positions. Is there example code for this out there? How do I make this happen on iOS? There are so many APIs for outputting sound on iOS; which one would be the quickest way to get spatialized 3D audio working?
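Among those APIs, one comparatively quick starting point is AVAudioEngine with an AVAudioEnvironmentNode, which renders a mono source at a 3D position around the listener. The sketch below assumes the synthesized audio is available as a mono file (a synthesized stream could instead be scheduled as buffers on the same player node), and head tracking, e.g. from CMHeadphoneMotionManager, is left to the caller:

```swift
import AVFoundation

final class SpatialSynthPlayer {
    private let engine = AVAudioEngine()
    private let environment = AVAudioEnvironmentNode()
    private let player = AVAudioPlayerNode()

    func start(withMonoFile url: URL) throws {
        let file = try AVAudioFile(forReading: url)   // source must be mono to be spatialized

        engine.attach(environment)
        engine.attach(player)

        // Mono source -> environment (spatializer) -> main mixer.
        engine.connect(player, to: environment, format: file.processingFormat)
        engine.connect(environment, to: engine.mainMixerNode,
                       format: engine.mainMixerNode.outputFormat(forBus: 0))

        player.renderingAlgorithm = .HRTFHF                    // binaural rendering for headphones
        player.position = AVAudio3DPoint(x: 2, y: 0, z: -1)    // to the right and ahead of the listener
        environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)

        try engine.start()
        player.scheduleFile(file, at: nil)
        player.play()
    }

    // Call from your head-tracking callback to keep the source fixed in the world
    // as the head turns.
    func updateListener(yawDegrees: Float) {
        environment.listenerAngularOrientation =
            AVAudio3DAngularOrientation(yaw: yawDegrees, pitch: 0, roll: 0)
    }
}
```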
I'm using a context menu for each photo, in three columns. The code is like below:

```swift
List {
    LazyVGrid(columns: columns) {
        View()
            .contextMenu {
                Button()
            }
    }
}
```

This pops up the whole list, not the individual element, when I long-press an element to show the context menu: the whole list is highlighted when the context menu appears. How can I make it highlight the individual element instead of the whole list? If I remove either the List or the LazyVGrid the problem goes away, but then the columns are not laid out the way I want. Also, I don't want to change List to a VStack or ScrollView, which don't have a refresh control. Thanks in advance.
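For anyone trying to reproduce it, here is a self-contained sketch of the structure described above; PhotoGridView, the Photo strings and the Share button are placeholders, and the refreshable modifier stands in for the refresh control the post wants to keep:

```swift
import SwiftUI

// Self-contained reproduction of the layout described above.
struct PhotoGridView: View {
    private let columns = Array(repeating: GridItem(.flexible()), count: 3)
    private let photos = (1...12).map { "Photo \($0)" }

    var body: some View {
        List {
            LazyVGrid(columns: columns) {
                ForEach(photos, id: \.self) { photo in
                    Text(photo)
                        .frame(height: 100)
                        .contextMenu {
                            Button("Share") { }
                        }
                }
            }
        }
        .refreshable { /* pull-to-refresh, the reason List is kept */ }
    }
}
```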
In my app there is a spinning wheel. I am using keyframe animations. I want to add sound effects to the wheel rotation animation. I tried AVAudioPlayer, but I want the audio speed to change with the speed of the wheel rotation. Please help me ASAP.
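One way to tie the sound to the wheel is AVAudioEngine with an AVAudioUnitVarispeed effect, whose rate can be updated continuously while a looped buffer plays. The tick.wav file, the speed clamp and the class name below are illustrative assumptions; the wheel speed would come from your keyframe animation, e.g. sampled via a CADisplayLink:

```swift
import AVFoundation

// Loops a wheel-tick sound and lets the caller vary its speed continuously.
final class WheelSoundPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let varispeed = AVAudioUnitVarispeed()

    func start(tickFileURL url: URL) throws {
        let file = try AVAudioFile(forReading: url)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                            frameCapacity: AVAudioFrameCount(file.length)) else { return }
        try file.read(into: buffer)

        engine.attach(player)
        engine.attach(varispeed)
        engine.connect(player, to: varispeed, format: file.processingFormat)
        engine.connect(varispeed, to: engine.mainMixerNode, format: file.processingFormat)

        try engine.start()
        player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
        player.play()
    }

    /// Call from the animation update (e.g. a CADisplayLink) with the current wheel speed.
    func setWheelSpeed(_ normalizedSpeed: Float) {
        varispeed.rate = max(0.25, min(4.0, normalizedSpeed))   // varispeed supports 0.25x ... 4x
    }
}
```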