I tried to play a sound by putting the files in the WatchKit app, then: `let soundURL = NSURL.fileURLWithPath(NSBundle.mainBundle().pathForResource("ShipBullet", ofType: "wav")!)`, `let asset = WKAudioFileAsset(URL: soundURL)`, `let sound = WKAudioFilePlayerItem(asset: asset)`, `let audioPlayer = WKAudioFilePlayer(playerItem: sound)`, `audioPlayer.play()`. And the app crashed. Am I doing something wrong?
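A hedged sketch of the same flow with the optionals guarded: a nil result from the bundle lookup (for example, if ShipBullet.wav is not in the WatchKit extension target) would make the force unwrap crash exactly as described. The file name and .wav extension come from the post; everything else is an assumption.

```swift
import WatchKit

// Minimal sketch, assuming ShipBullet.wav is bundled with the WatchKit extension target.
func playShipBulletSound() -> WKAudioFilePlayer? {
    // Guard the lookup instead of force-unwrapping it; a missing resource is a
    // common cause of this exact crash.
    guard let soundURL = Bundle.main.url(forResource: "ShipBullet", withExtension: "wav") else {
        print("ShipBullet.wav not found in the extension bundle")
        return nil
    }
    let asset = WKAudioFileAsset(url: soundURL)
    let item = WKAudioFilePlayerItem(asset: asset)
    let player = WKAudioFilePlayer(playerItem: item)
    // The player becomes ready asynchronously; keep a reference and check its status
    // in real code rather than assuming play() succeeds immediately.
    player.play()
    return player
}
```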
Search results for "Popping Sound" (19,350 results found)
In the app that I'm creating, I'm trying to combine two audio files, one larger than the other. I need the first one to run for a certain time and then the second, but all as one audio file (the audio plays while the app is in the background). Here is what I have so far: `func merge(audio1: NSURL, audio2: NSURL, time: Double, date: NSDate) { var ok1 = false; var ok2 = false; var composition = AVMutableComposition(); var compositionAudioTrack1: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID()); var compositionAudioTrack2: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID()); var documentDirectoryURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).first! as! NSURL; var fileDestinationUrl = documentDirectoryURL.URLByAppendingPathComponent("resultmerge.wav"); println(fileDestinationUrl); var url1 = audio1 …`
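A hedged sketch of the sequential-merge approach the post is after: put the first `secondsOfFirst` seconds of audio1 on the timeline, append audio2 right after it, and export one combined file. The M4A output format is an assumption (AVAssetExportSession does not export WAV with this preset), as are the function and parameter names.

```swift
import AVFoundation

enum MergeError: Error { case noAudioTrack, exportFailed }

// Sketch only: error handling and background-task bookkeeping are omitted.
func mergeSequentially(audio1: URL, audio2: URL, secondsOfFirst: Double,
                       outputURL: URL,
                       completion: @escaping (Error?) -> Void) {
    let composition = AVMutableComposition()
    let assetA = AVURLAsset(url: audio1)
    let assetB = AVURLAsset(url: audio2)
    guard let track = composition.addMutableTrack(withMediaType: .audio,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid),
          let srcA = assetA.tracks(withMediaType: .audio).first,
          let srcB = assetB.tracks(withMediaType: .audio).first else {
        completion(MergeError.noAudioTrack); return
    }
    do {
        // First file: only its opening `secondsOfFirst` seconds.
        let rangeA = CMTimeRange(start: .zero,
                                 duration: CMTime(seconds: secondsOfFirst, preferredTimescale: 600))
        try track.insertTimeRange(rangeA, of: srcA, at: .zero)
        // Second file: appended immediately after the first segment.
        let rangeB = CMTimeRange(start: .zero, duration: assetB.duration)
        try track.insertTimeRange(rangeB, of: srcB, at: rangeA.duration)
    } catch {
        completion(error); return
    }
    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetAppleM4A) else {
        completion(MergeError.exportFailed); return
    }
    export.outputURL = outputURL
    export.outputFileType = .m4a
    export.exportAsynchronously { completion(export.error) }
}
```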
We are using performRequestsWithOptions to enable passkeys in our iOS app: [authController performRequestsWithOptions:ASAuthorizationControllerRequestOptionPreferImmediatelyAvailableCredentials]; Based on Apple's documentation, this tells the authorization controller to prefer credentials that are immediately available on the local device, and to fail silently if there are no credentials available. However, in recent testing we found that on one device a QR code pops up even though there is no credential on the device. Is this a bug in the OS? If so, what conditions trigger it? Is there a recommended mitigation? Should we move to the new API? Thank you.
Topic: Privacy & Security
SubTopic: General
Tags: Passkeys in iCloud Keychain, Authentication Services
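For the passkey question above, a minimal Swift sketch of how the option is wired up for a sign-in (assertion) request; the relying-party identifier, challenge, and function name are placeholders, not values from the post.

```swift
import AuthenticationServices

// Hedged sketch (iOS 16+). "example.com" and the challenge are placeholders.
func signInWithLocalPasskeyOnly(
    challenge: Data,
    delegate: ASAuthorizationControllerDelegate & ASAuthorizationControllerPresentationContextProviding
) {
    let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
        relyingPartyIdentifier: "example.com")
    let request = provider.createCredentialAssertionRequest(challenge: challenge)
    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.delegate = delegate
    controller.presentationContextProvider = delegate
    // With this option, the documented behavior is to fail silently
    // (ASAuthorizationError.canceled) when no passkey exists on the device,
    // rather than presenting the QR / "other device" sheet.
    controller.performRequests(options: .preferImmediatelyAvailableCredentials)
}
```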
I'm working on writing a custom audio unit using the v3 APIs, and I'm having trouble getting it to instantiate correctly under some circumstances. I can instantiate my Audio Unit if I include its view controller and AUAudioUnit subclass within the associated app (either directly or by linking them in from a framework) and then registering it using AUAudioUnit.registerSubclass(asComponentDescription:, name:, version:), but I haven't been able to get it to work by installing the .appex file anywhere; the system just doesn't find it. I have tried putting the .appex in ~/Library/Audio/Plug-Ins/Components and /Library/Audio/Plug-Ins/Components, but it never shows up in the list returned by AVAudioUnitComponentManager.sharedAudioUnitComponentManager().componentsMatchingDescription(). I created a minimal test project using the templates provided by Xcode for an Audio Unit Extension, and I've been careful to make sure the Info.plist file for the extension has been filled in
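For reference, a hedged sketch of the in-app registration path the post says already works; the subtype/manufacturer codes and the class name are placeholders, not from the post.

```swift
import AudioToolbox
import AVFoundation

// Placeholder AUAudioUnit subclass; a real unit also overrides inputBusses,
// outputBusses, and internalRenderBlock.
class MyAudioUnit: AUAudioUnit {}

func registerInProcessUnit() {
    let desc = AudioComponentDescription(componentType: kAudioUnitType_Effect,
                                         componentSubType: 0x64656d6f,      // 'demo'
                                         componentManufacturer: 0x64656d6f, // 'demo'
                                         componentFlags: 0,
                                         componentFlagsMask: 0)
    AUAudioUnit.registerSubclass(MyAudioUnit.self,
                                 as: desc,
                                 name: "Demo: MyAudioUnit",
                                 version: 1)
    // Units registered this way are visible only inside this process. For other
    // hosts to enumerate the unit via AVAudioUnitComponentManager, the .appex has
    // to be installed as part of its containing app, not copied into
    // ~/Library/Audio/Plug-Ins/Components.
}
```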
1. Create an animated image with the native method `+ (UIImage *)animatedImageWithImages:duration:`; 2. add an image view to VC1, playing the animated image created in step 1; 3. push VC2; 4. slide to pop VC2 (using the native interactive pop gesture). You will see that the image view doesn't display correctly, but if we tap the `back` button on the navigation bar instead, the animated image displays correctly. Has anyone hit the same problem? Since I don't know how to submit my simple demo, if someone is interested in the issue, contact me and I'll show you the demo. E-mail: welsonxl@163.com
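A minimal Swift sketch of the setup in steps 1–2 for anyone trying to reproduce this; the frame image names and function name are placeholders.

```swift
import UIKit

// Hedged repro sketch of steps 1–2; "frame_N" assets are placeholders.
func makeAnimatedImageView() -> UIImageView {
    let frames = (1...8).compactMap { UIImage(named: "frame_\($0)") }
    let imageView = UIImageView()
    // UIImage.animatedImage(with:duration:) is the Swift spelling of the
    // +animatedImageWithImages:duration: method mentioned in the post.
    imageView.image = UIImage.animatedImage(with: frames, duration: 0.8)
    return imageView
}
```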
I have updated my iPad Air 2 to iOS 9 beta 4 and now all sounds except ringtones have been reset to default. If I try to change the tone for any of the other sounds, the list of available tones comes up, but tapping any of them returns me to the home screen and does not change the tone to that selection.
How do I play .amr audio data using AVAudioPlayer? I can play .wav, .mp3, etc., but not .amr. AVAudioPlayer initializes fine with the AMR data and reports the correct duration, but it won't play...
Hello. I tried to port Core Haptics to my custom keyboard extension, but failed, so I adopted haptic feedback as a suboptimal solution; however, I haven't solved the audio yet. Do you know which sound to use for the click effect we usually hear on the standard iOS keyboard? Thank you for helping me.
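A hedged sketch of the documented route to the standard keyboard click from a custom keyboard extension: conform the input view to UIInputViewAudioFeedback and call playInputClick(), rather than guessing a raw system sound ID. The class name is a placeholder.

```swift
import UIKit

// The system click only plays while a conforming input view is visible.
class ClickingInputView: UIInputView, UIInputViewAudioFeedback {
    var enableInputClicksWhenVisible: Bool { true }
}

// Call this from the key's touch handler.
func playKeyClick() {
    UIDevice.current.playInputClick()
}
```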
Hi, I'm a freshman interested in audio streaming and I would like to build an app that can stream the songs of one band, something like the Bandcamp app: simple, clear audio streaming. Do you have recommendations on where to start, which frameworks to use, and online courses for setting up an app where songs are streamed? Best, Jonas
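As a hedged starting point rather than a full architecture: AVPlayer in AVFoundation can stream a remote audio URL directly. The URL is a placeholder; a real app would also handle buffering state, errors, and background-audio configuration.

```swift
import AVFoundation

func startStreaming() -> AVPlayer? {
    guard let url = URL(string: "https://example.com/band/track1.mp3") else { return nil }
    // Configure the session for playback so audio behaves like a media app.
    try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
    try? AVAudioSession.sharedInstance().setActive(true)
    let player = AVPlayer(url: url)
    player.play()
    return player   // keep a strong reference, or playback stops when it deallocates
}
```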
Hello all, I've been working with custom AudioUnits for AVAudioEngine on iOS for a while (Audio Unit v3), and I now want to build an Audio Unit plug-in for macOS DAWs (Logic Pro or Ableton Live, for example). I've been reading about the subject for a while, but I can't manage to understand how it works. It seems that part of the solution is Audio Unit Extensions, as I saw in WWDC 2015 session 508 and WWDC 2016 session 507, but I don't get how I can reuse the .appex product in Logic Pro or Ableton Live. I've used third-party audio units in these applications, but I always get them as .component bundles, which are copied into the /Library/Audio/Plug-Ins directory. I tried to copy the .appex into the same folder, but it doesn't seem to be recognized by Logic Pro. Any idea what I'm missing here? Thanks to you all 🙂 Tom
Is it possible to use a custom notification sound on watchOS?
Hi, I'm trying to play audio from my HomeKit camera, but can't figure out how to do it. The only documentation I can find is from WWDC when they show you how to play video in an HMCameraView. But how do I get the audio to play in my app? Thanks a lot in advance!
With iOS Inter-App audio and AUGraph being deprecated (according to the iOS 13 beta Release Notes), how does one expose the existence of an Audio Unit or custom AVAudioNode audio source to another audio destination app, and how does a destination app (recorder, player, and/or visualizer) get sent audio samples from the audio source app (synthesizer or on-screen keyboard)?
The internal speaker on the iPhone 6s models only supports a sample rate of 48 kHz, while previous iPhone models supported a collection of sample rates. Some developers are running into problems (generally classified as distorted or bad-sounding audio) due to incorrect assumptions made when the requested preferred sample rate ends up being different from the current actual hardware sample rate. If you ignore this type of difference and, for example, set the client format to the hardware format expecting 44.1 kHz when the actual sample rate is 48 kHz, your application will suffer problems like audio distortion, with the further possibility of other failures. Additionally, even if your application specifies a client format of 44.1 kHz, the render callback (in the case of AURemoteIO, kAudioUnitSubType_RemoteIO) may call you for a varying number of frames when sample rate conversion is involved. Therefore, it is important that the application never assume a fixed number of frames per render call.
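A hedged sketch of the defensive pattern described above: request a preferred rate, but always derive the client format from the rate the hardware actually reports. The channel count and function name are assumptions.

```swift
import AVFoundation

func makeClientFormat() throws -> AVAudioFormat? {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setPreferredSampleRate(44_100)   // a request, not a guarantee
    try session.setActive(true)

    let actualRate = session.sampleRate          // e.g. 48_000 on iPhone 6s-era hardware
    // Build the client format from the actual rate, and size render buffers from the
    // frame count each callback hands you, not from a hard-coded value.
    return AVAudioFormat(standardFormatWithSampleRate: actualRate, channels: 2)
}
```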
Hi there, we're a wine eCommerce company and we have a project in progress to launch our webstore on the App Store. To achieve this, we're using PWABuilder to package our website with PWA technology, allowing us to submit it to the app stores. Naturally, given that it's a Google technology, everything went smoothly with the Play Store, and our app is already available there. However, we're encountering some challenges with the App Store. We need to implement additional features in our webstore, such as Apple ID login and Apple Pay payment integration. But apart from that, we're facing another issue: our third-party login options (Facebook and Google) work as pop-ups on our website but do not work within our Swift application. When we run the application in Xcode and try to log in via a third party, we get this error: 2023-07-28 09:50:26.633655-0300 grandeadega[12858:1201551] [Process] 0x133858818 - [pageProxyID=6, webPageID=7, PID=12915] WebPageProxy::didFailProvisionalLoadForFrame: frameID=957
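A hedged sketch of the usual fix for OAuth pop-ups inside a WKWebView-based wrapper: window.open() has nowhere to load unless the WKUIDelegate supplies a web view for it. The class and view names here are placeholders, not from the post.

```swift
import UIKit
import WebKit

class StoreWebViewController: UIViewController, WKUIDelegate {
    var webView: WKWebView!

    func webView(_ webView: WKWebView,
                 createWebViewWith configuration: WKWebViewConfiguration,
                 for navigationAction: WKNavigationAction,
                 windowFeatures: WKWindowFeatures) -> WKWebView? {
        // The popup must be created with the configuration passed in here, so the
        // Facebook/Google login window has somewhere to load and share session state.
        let popup = WKWebView(frame: view.bounds, configuration: configuration)
        popup.uiDelegate = self
        view.addSubview(popup)
        return popup
    }
}
```

An alternative often suggested for third-party sign-in in wrapped web apps is ASWebAuthenticationSession instead of in-web-view pop-ups.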