Search results for

Popping Sound

19,600 results found

Post

Replies

Boosts

Views

Activity

iOS Sound Recognition: to what extent can developers access Apple's built-in Sound Recognition?
Hi, I am currently developing an app whose core functionality relies on detecting user laughter in the background. Early on we noticed Apple's built-in Sound Recognition feature. At its core, I am guessing that Sound Recognition requires the user's permission to access the microphone 24/7. Currently, using the conventional avenue of background audio recording, a yellow indicator is present at the top of the iPhone screen to show that recording is in progress; this does not appear to be the case for Sound Recognition. If all sound processing/recognition is kept on-device, is there any way to avoid the yellow dot and detect laughter in a way similar to how Apple's Sound Recognition does it? In the Sound Recognition settings available to the user in the Settings app, the only detectable people sounds are baby crying, coughing, and shouting. Is it also possible to add laughter to this list some
2
0
894
Aug ’24
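For the sound-recognition question above: a minimal sketch of on-device classification with the SoundAnalysis framework, assuming iOS 15+ and the built-in .version1 classifier, which (as an assumption here) exposes a "laughter" label. Note that this still taps the microphone through AVAudioEngine, so the recording indicator is not avoided; the 0.8 confidence threshold is arbitrary.

import AVFoundation
import SoundAnalysis

final class LaughterDetector: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        self.analyzer = analyzer

        // Built-in classifier shipped with the OS; runs entirely on-device.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)

        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        try engine.start()
    }

    // SNResultsObserving
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.identifier == "laughter",   // label name is an assumption
              top.confidence > 0.8 else { return }
        print("Laughter detected, confidence \(top.confidence)")
    }
}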
Play sound on watch
I tried to play a sound, putting the files in the WatchKit app, then:
let soundURL = NSURL.fileURLWithPath(NSBundle.mainBundle().pathForResource("ShipBullet", ofType: "wav")!)
let asset = WKAudioFileAsset(URL: soundURL)
let sound = WKAudioFilePlayerItem(asset: asset)
let audioPlayer = WKAudioFilePlayer(playerItem: sound)
audioPlayer.play()
And the app crashed. Am I doing something wrong?
13
0
6k
Jun ’15
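A sketch of a safer version of the snippet above, using the same Swift 2-era watchOS API. The assumption here is that the crash comes from pathForResource returning nil because the audio file isn't bundled with the WatchKit extension target, so the force unwrap is replaced with a guard.

import WatchKit

func playShipBullet() {
    // If this guard fails, ShipBullet.wav is not in the WatchKit extension bundle,
    // which is the most common cause of the crash described above.
    guard let path = NSBundle.mainBundle().pathForResource("ShipBullet", ofType: "wav") else {
        print("ShipBullet.wav not found in the extension bundle")
        return
    }
    let asset = WKAudioFileAsset(URL: NSURL.fileURLWithPath(path))
    let item = WKAudioFilePlayerItem(asset: asset)
    let player = WKAudioFilePlayer(playerItem: item)
    player.play()
}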
Passkey QR code pop-up question
We are using performRequestsWithOptions to enable passkeys in our iOS app: [authController performRequestsWithOptions:ASAuthorizationControllerRequestOptionPreferImmediatelyAvailableCredentials]; Based on the Apple docs, this tells the authorization controller to prefer credentials that are immediately available on the local device, and to fail silently if there are no credentials available. However, in recent testing we found that on one device the QR code pops up even though there is no credential on the device. Is this a bug in the OS? If it is a bug, what conditions trigger it? Is there a recommendation to mitigate the issue? Should we move to the new API? Thank you.
1
0
539
Sep ’24
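For the passkey question above, a minimal Swift sketch of the same request (iOS 16+). The relying-party identifier and challenge are placeholders, and delegate wiring is only hinted at; the intent of .preferImmediatelyAvailableCredentials is that the request fails through the delegate's error callback rather than showing the QR/nearby-device sheet when no local passkey exists.

import AuthenticationServices

func assertLocalPasskeyOnly(challenge: Data,
                            delegate: ASAuthorizationControllerDelegate) -> ASAuthorizationController {
    // "example.com" is a placeholder relying-party identifier.
    let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
        relyingPartyIdentifier: "example.com")
    let request = provider.createCredentialAssertionRequest(challenge: challenge)

    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.delegate = delegate   // delegate receives the success or error callback
    // presentationContextProvider should also be set in a real app.
    controller.performRequests(options: .preferImmediatelyAvailableCredentials)
    return controller                // keep a strong reference while the request is in flight
}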
Audio Units v3 on OS X: Instantiating custom audio units
I'm working on writing a custom audio unit using the v3 APIs, and I'm having trouble getting it to instantiate correctly under some circumstances: I can instantiate my Audio Unit if I include its view controller and AUAudioUnit subclass within the associated app (either directly or by linking them in from a framework) and then register it using AUAudioUnit.registerSubclass(asComponentDescription:name:version:), but I haven't been able to get it to work by installing the .appex file anywhere. The system just doesn't find it. I have tried putting the .appex in ~/Library/Audio/Plug-Ins/Components and /Library/Audio/Plug-Ins/Components, but it never shows up in the list returned by AVAudioUnitComponentManager.sharedAudioUnitComponentManager().componentsMatchingDescription(). I created a minimal test project using the templates provided by Xcode for an Audio Unit Extension, and I've been careful to be sure the Info.plist file for the extension has been filled in
6
0
4.8k
May ’16
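For the AUv3 question above, a sketch of the in-app registration path plus instantiation, in modern Swift; the subclass, four-char codes, and display name are placeholders. The .appex route, by contrast, relies on the extension staying embedded in an installed app so the system can discover it; copying the .appex into ~/Library/Audio/Plug-Ins/Components is not expected to work.

import AVFoundation
import AudioToolbox

// Placeholder subclass; a real unit overrides the bus and render plumbing.
final class MyAudioUnit: AUAudioUnit {}

let desc = AudioComponentDescription(componentType: kAudioUnitType_Effect,
                                     componentSubType: 0x64656D6F,      // 'demo' (placeholder)
                                     componentManufacturer: 0x41424344, // 'ABCD' (placeholder)
                                     componentFlags: 0,
                                     componentFlagsMask: 0)

// In-process registration: visible only inside this app, not system-wide.
AUAudioUnit.registerSubclass(MyAudioUnit.self, as: desc,
                             name: "Demo: MyAudioUnit", version: 1)

AVAudioUnit.instantiate(with: desc, options: .loadInProcess) { avUnit, error in
    // On success, avUnit?.auAudioUnit is an instance of MyAudioUnit.
    print(avUnit?.auAudioUnit as Any, error as Any)
}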
Sounds set to default
I have updated my iPad Air 2 to iOS 9 beta 4 and now all sounds except ringtones have been reset to the default. If I try to change the tone for any of the other sounds, the list of different tones comes up, but tapping any of the tones returns me to the home screen and does not change the tone to that selection.
4
0
563
Jul ’15
How to merge audio
In the app I'm creating, I'm trying to combine two audio files; one is longer than the other. I need the first one to run for a certain time and then the second, but all as one audio file (the audio plays while the app is in the background). Here is what I have so far:
func merge(audio1: NSURL, audio2: NSURL, time: Double, date: NSDate) {
    var ok1 = false
    var ok2 = false

    var composition = AVMutableComposition()
    var compositionAudioTrack1: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
    var compositionAudioTrack2: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())

    var documentDirectoryURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).first! as! NSURL
    var fileDestinationUrl = documentDirectoryURL.URLByAppendingPathComponent("resultmerge.wav")
    println(fileDestinationUrl)
    var url1 = audio1
0
0
1.7k
Aug ’15
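A hedged sketch of one way to do the merge described above with current API: trim the first file to a caller-supplied duration, append the second after it on a single composition track, and export. The output URL, the M4A preset, and the firstDuration parameter are assumptions layered on the question, not the poster's code.

import AVFoundation
import CoreMedia

func mergeSequentially(first: URL, second: URL, firstDuration: CMTime?, output: URL,
                       completion: @escaping (Error?) -> Void) {
    let composition = AVMutableComposition()
    guard let track = composition.addMutableTrack(withMediaType: .audio,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid) else {
        return completion(NSError(domain: "merge", code: -1))
    }

    let assetA = AVURLAsset(url: first)
    let assetB = AVURLAsset(url: second)
    guard let trackA = assetA.tracks(withMediaType: .audio).first,
          let trackB = assetB.tracks(withMediaType: .audio).first else {
        return completion(NSError(domain: "merge", code: -2))
    }

    do {
        // Use only the requested portion of the first file, then append the second after it.
        let lengthA = firstDuration ?? assetA.duration
        try track.insertTimeRange(CMTimeRange(start: .zero, duration: lengthA), of: trackA, at: .zero)
        try track.insertTimeRange(CMTimeRange(start: .zero, duration: assetB.duration), of: trackB, at: lengthA)
    } catch {
        return completion(error)
    }

    guard let export = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A) else {
        return completion(NSError(domain: "merge", code: -3))
    }
    export.outputURL = output
    export.outputFileType = .m4a
    export.exportAsynchronously { completion(export.error) }
}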
Play .amr audio
How can I play .amr audio data using AVAudioPlayer? I can play .wav, .mp3, and other audio formats, but not .amr. AVAudioPlayer initializes fine with AMR data and reports the correct duration, but it won't play...
0
0
698
Oct ’17
Audio Units for DAWs
Hello all, I've been working with custom AudioUnits for AVAudioEngine on iOS for a while (Audio Unit v3), and I now want to build an Audio Unit plug-in for macOS DAWs (Logic Pro or Ableton Live, for example). I've been reading about the subject for a while, but I can't manage to understand how it works. It seems that part of the solution is Audio Unit Extensions, as I saw in WWDC 2015 session 508 and WWDC 2016 session 507, but I don't get how I can reuse the .appex product in Logic Pro or Ableton Live. I've used third-party audio units with these applications, but I always got them as .component bundles, which are copied into the /Library/Audio/Plug-Ins directory. I tried to copy the .appex into the same folder, but it doesn't seem to be recognized by Logic Pro. Any idea what I'm missing here? Thanks to you all 🙂 Tom
2
0
1.7k
Apr ’18
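Related to the question above, a small sketch for checking whether an AUv3 has been registered system-wide on macOS. The working assumption (stated here, not quoted from Apple) is that the .appex stays embedded inside a normal app bundle; once that app has been launched, the component should become visible to hosts, with no copy into /Library/Audio/Plug-Ins needed. This lists every installed effect unit via a wildcard description.

import AVFoundation
import AudioToolbox

// A wildcard description: componentType set, everything else zero, matches all effect units.
let wildcard = AudioComponentDescription(componentType: kAudioUnitType_Effect,
                                         componentSubType: 0,
                                         componentManufacturer: 0,
                                         componentFlags: 0,
                                         componentFlagsMask: 0)

for component in AVAudioUnitComponentManager.shared().components(matching: wildcard) {
    print(component.manufacturerName, component.name, component.versionString,
          component.typeName, component.hasCustomView)
}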
Problems with Pop-up Login in a Packaged PWA
Hi there, we're a wine e-commerce company and we have a project in progress to launch our webstore on the App Store. To achieve this, we're using PWABuilder to package our website as a PWA, allowing us to submit it to the app stores. Naturally, given that it's a Google technology, everything went smoothly with the Play Store, and our app is already available there. However, we're encountering some challenges with the App Store. We need to implement additional features in our webstore, such as Apple ID login and Apple Pay payment integration. But apart from that, we are facing another issue: our third-party login options (Facebook and Google) function as pop-ups on our website but do not work within our Swift application. When we run the application in Xcode and try to log in via a third party, we get this error: 2023-07-28 09:50:26.633655-0300 grandeadega[12858:1201551] [Process] 0x133858818 - [pageProxyID=6, webPageID=7, PID=12915] WebPageProxy::didFailProvisionalLoadForFrame: frameID=957
3
0
1.2k
Sep ’23
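For the pop-up login issue above, one hedged sketch (not PWABuilder's generated code): a WKWebView silently drops window.open() pop-ups unless the app's WKUIDelegate handles them, and routing the pop-up URL back into the existing web view is a common workaround. Google sign-in may additionally refuse embedded web views, in which case ASWebAuthenticationSession is the usual fallback.

import WebKit

final class PopupHandlingUIDelegate: NSObject, WKUIDelegate {
    func webView(_ webView: WKWebView,
                 createWebViewWith configuration: WKWebViewConfiguration,
                 for navigationAction: WKNavigationAction,
                 windowFeatures: WKWindowFeatures) -> WKWebView? {
        // target="_blank" / window.open() arrives with no target frame.
        // Load it in the existing web view instead of creating a new window.
        if navigationAction.targetFrame == nil, let url = navigationAction.request.url {
            webView.load(URLRequest(url: url))
        }
        return nil
    }
}

// Usage: keep a strong reference to the delegate, e.g.
//   let uiDelegate = PopupHandlingUIDelegate()
//   webView.uiDelegate = uiDelegate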
Solving 6S distorted audio - avoid bad audio format assumptions
The internal speaker on the iPhone 6s models only supports a sample rate of 48kHz, while previous iPhone models supported a collection of sample rates. Some developers are running into problems (generally classified as distorted or bad-sounding audio) due to incorrect assumptions being made when the requested "preferred" sample rate ends up being different from the current actual hardware sample rate. If you ignore these kinds of differences and, for example, set the client format to the hardware format expecting 44.1kHz when the actual sample rate is 48kHz, your application will suffer problems like audio distortion, with the further possibility of other failures. Additionally, even if your application is specifying a client format of 44.1kHz, the render callback (in the case of AURemoteIO (kAudioUnitSubType_RemoteIO)) may call you for a varying number of frames in cases where sample rate conversion is involved. Therefore, it is important that the application never assume
0
0
12k
Oct ’15
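A small sketch of the defensive pattern described above: request a preferred rate, then read back the actual hardware rate after activation and derive the client format from that, instead of assuming 44.1kHz. The session category and mono channel count are assumptions.

import AVFoundation

func makeClientFormat() throws -> AVAudioFormat? {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord)
    try session.setPreferredSampleRate(44_100)   // a request, not a guarantee
    try session.setActive(true)

    // On the iPhone 6s speaker this comes back as 48_000 regardless of the request.
    let actualRate = session.sampleRate
    return AVAudioFormat(standardFormatWithSampleRate: actualRate, channels: 1)
}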
Replacement for Inter-App Audio?
With iOS Inter-App Audio and AUGraph being deprecated (according to the iOS 13 beta release notes), how does one expose the existence of an Audio Unit or custom AVAudioNode audio source to another audio destination app, and how does a destination app (recorder, player, and/or visualizer) receive audio samples from the audio source app (synthesizer or on-screen keyboard)?
1
0
2.8k
Aug ’19