Search results for

Popping Sound

19,350 results found

Post

Replies

Boosts

Views

Activity

Reply to Payment options for Premium account and sponsored item features
It sounds like there are two types of 'products' being sold: the actual physical goods and the premium listing (i.e. 'seller can highlight' and 'premium account functionality'). For the physical goods, you cannot use IAP, and depending on how you interpret 11.2 you may or may not be able to use a PayPal-like system within your app. You can use a PayPal-like system accessed through your website for the physical goods. For the premium listing, I think you must use IAP.
Topic: App & System Services SubTopic: StoreKit Tags:
Jul ’15
SpriteKit activates mic?
Hello, every time I run SpriteKit via Xcode my Mac's mic is activated. At first I thought it was a bug with Micro Snitch's software, so I reported it. However, they confirmed that it was an issue with SpriteKit after looking at my code. Here is an example of the log files:```Jul 3, 2015, 6:12:05 AM: Audio Device became inactive: Built-in Microphone – Internal Microphone
Jul 3, 2015, 4:23:41 PM: Audio Device became active: Built-in Microphone – Internal Microphone
Jul 3, 2015, 4:44:01 PM: Audio Device became inactive: Built-in Microphone – Internal Microphone
Jul 3, 2015, 4:58:52 PM: Audio Device became active: Built-in Microphone – Internal Microphone
Jul 3, 2015, 4:59:38 PM: Audio Device became inactive: Built-in Microphone – Internal Microphone
Jul 3, 2015, 5:29:34 PM: Audio Device became active: Built-in Microphone – Internal Microphone
Jul 3, 2015, 5:37:34 PM: Audio Device became inactive: Built-in Microphone – Internal Microphone```Who can I report thi
1
0
353
Jul ’15
AUMIDISynth - Anybody know how to load instruments into it?
In iOS 8 a new audio unit, called AUMIDISynth, has appeared in the Core Audio headers. The following comments in AudioUnitProperties.h suggest it's a multi-timbral GM synth for iOS, which would be a rather nice thing to have, particularly if it supports tuning messages:

Line 3072: @abstract Audio unit property IDs used by AUMIDISynth (iOS) and DLSMusicDevice (OSX)

Line 3081: For a multi-timbral music instrument, this property can return the number of independent patches that are available to be chosen as an active patch for the instrument. For instance, for Apple's DLS Music Device and AUMIDISynth, this value returns the number of patches that are found in a given DLS or SoundFont file when loaded.

Line 3109: @discussion The AUMIDISynth audio unit lets a client create fully GM-compatible Midi Synth.

However, there appears to be no documentation, and the only mention of this unit I've found on the web is from other people who've noticed it in the headers. In particular, I can't see any way
4
0
5.5k
Jul ’15
Is MPMusicPlayerController broken in iOS 8.4?
I have an app that can play back audio using either the built-in iTunes Music player or tracks stored directly in the app. Since iOS 8.4, it appears that I can no longer play linked audio tracks. When I call the play method, nothing happens. Anyone have any tips? This works fine in older versions of iOS. It appears the latest and greatest Apple Music has broken iOS frameworks. Oh yay. This is filed under iOS 8.4 since Apple hasn't updated that in the forums yet; I can only guess why.
1
0
319
Jul ’15
Reply to couldn't communicate with a helper application
It's peculiar that a reinstall didn't fix that. First, try correctly setting the catalog using the following command in Terminal (the whole thing, including the text that appears as a link):```sudo softwareupdate --set-catalog https://swscan.apple.com/content/catalogs/others/index-10.11seed-10.11-10.10-10.9-mountainlion-lion-snowleopard-leopard.merged-1.sucatalog.gz```Also, have you tried booting into Safe Mode, or booting normally but creating a new temporary admin account and updating from there? Follow these steps to start up in safe mode: Start or restart your Mac. Immediately after you hear the startup sound, press and hold the Shift key. Release the Shift key when you see the Apple logo appear on the screen. After the Apple logo appears, it might take longer than usual to reach the login screen or your desktop. This is because your Mac performs a directory check of your startup disk as part of safe mode. If you're not sure whether your Mac started in safe mode, you can use System Information to check this. T
Topic: App & System Services SubTopic: Core OS Tags:
Jul ’15
Audio Unit Extensions - multiple plugin instances
Hi, will the Audio Unit Extensions framework for iOS provide us with a means of running multiple instances of a plugin? I accept this would only be a single instance loaded by iOS, but could multiple audio channels and interfaces of the same effect somehow run at once, with the host picking the right view for embedding? Or are we limited to only one instance of an effect at a time, as now with IAA and AudioBus? Being able to use multiple effects of the same type in a single project would be a really strong capability for mobile (e.g. GarageBand being able to run two instances of the same third-party instrument). As a third-party vendor, it is a stronger value proposition if your users can use more than one instance of what they have paid for at once, and it will really help the ecosystem grow if hosts aren't needing to lean so heavily on the internal effects for key things like quality synths, reverb and filtering. Thanks, Matt
5
0
2k
Jul ’15
Simple Swift sound code(iOS)?
Hi everyone, I am a 15-year-old boy who struggles with playing sound in my iOS app... So here are some lines that I've already tried, and every time I build the app I just get some errors...```01. var audioPlayer = AVAudioPlayer()
02. let audioPath = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(myFile, ofType:mp3))
03. audioPlayer = AVAudioPlayer(contentsOfURL: audioPath, error: nil)
04. audioPlayer.delegate = self
05. audioPlayer.prepareToPlay()
06. audioPlayer.play()```Xcode 7.0 beta (7A120f) says on line 02: Value of optional type String? not unwrapped. (If I try to fix it I get another error which has nothing in common with this line of code... And I am like WHAAAAT?) And on line 04: Cannot assign a value. Here's another code I've tried (similar to the first one):```var player : AVAudioPlayer! = nil //Initialization

@IBAction func playMyFile(sender: AnyObject?) {
    let path = NSBundle.mainBundle().pathForResource(myFile, ofType:mp3)
    let fileURL = NSURL(fileURLWithPath: path)
    player = AVAudioPlayer(contentsO
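For reference, a minimal working sketch in the Swift 2 syntax of the era might look like the following. It is not a definitive fix, but it addresses the two reported errors: the resource name and extension must be string literals (the file name `"myFile.mp3"` here is illustrative), the optional path from `pathForResource` must be unwrapped, and assigning `delegate` requires the owning class to adopt `AVAudioPlayerDelegate`:

```swift
import AVFoundation

class SoundPlayer: NSObject, AVAudioPlayerDelegate {
    // Keep a strong reference so the player isn't deallocated before playback finishes.
    var audioPlayer: AVAudioPlayer?

    func playSound() {
        // pathForResource returns String?, so unwrap it before building the URL.
        guard let path = NSBundle.mainBundle().pathForResource("myFile", ofType: "mp3") else {
            print("myFile.mp3 not found in the app bundle")
            return
        }
        let url = NSURL(fileURLWithPath: path)
        do {
            // In Swift 2 the failable/throwing initializer replaces the old error: parameter.
            let player = try AVAudioPlayer(contentsOfURL: url)
            player.delegate = self   // legal because this class adopts AVAudioPlayerDelegate
            player.prepareToPlay()
            player.play()
            audioPlayer = player
        } catch {
            print("Could not create player: \(error)")
        }
    }
}
```

Storing the player in a property matters: a local `AVAudioPlayer` goes out of scope immediately and playback silently stops.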
3
0
13k
Jul ’15
Reply to AUMIDISynth - Anybody know how to load instruments into it?
To use the MIDISynth AU, you need to include a SoundFont 2 (.sf2) or Downloadable Sounds (.dls) bank file with your app. When you first set up your AU, pass it the URL for the bank file using the kMusicDeviceProperty_SoundBankURL property. This is different from what the Sampler AU does with samples -- here, you are passing it the specific path using [NSBundle bundleForClass: pathForResource:]. The MIDISynth loads instrument presets out of the bank you specify in response to program change messages in your MIDI file. If you use the MIDISynth in conjunction with the MusicPlayer/Sequence API, it will take care of the extra step of pre-loading all the instruments during the preroll stage, before you start playing. If your goal is straightforward MIDI file playback, I *strongly* recommend the new AVMIDIPlayer, which is part of the AVFoundation audio API. This lets you get away from the older C-based APIs completely. The other possibility, if you need more flexibility, is the new iOS 9 AVAud
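The bank-loading step described above might be sketched roughly as follows. This is an illustrative sketch, not a definitive implementation: the bank filename `GMBank.sf2` is an assumption (you supply your own bank; Apple does not ship one), and the explicit `AudioUnitPropertyID`/`AudioUnitScope` wrappers are there because the Core Audio constants import as plain integers in Swift:

```swift
import AudioToolbox

// Sketch: point an already-created AUMIDISynth at a SoundFont bundled with the app.
func loadSoundBank(into synthUnit: AudioUnit) -> OSStatus {
    guard let bankPath = NSBundle.mainBundle().pathForResource("GMBank", ofType: "sf2") else {
        return -1  // illustrative error code: bank file missing from the bundle
    }
    var bankURL = NSURL(fileURLWithPath: bankPath)
    // kMusicDeviceProperty_SoundBankURL tells the synth which .sf2/.dls bank to use.
    return AudioUnitSetProperty(synthUnit,
                                AudioUnitPropertyID(kMusicDeviceProperty_SoundBankURL),
                                AudioUnitScope(kAudioUnitScope_Global),
                                0,
                                &bankURL,
                                UInt32(sizeofValue(bankURL)))
}
```

Set the property before starting the graph, so the synth can parse the bank ahead of the first program change.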
Topic: Media Technologies SubTopic: Audio Tags:
Jul ’15
AVAudioEngine offline render?
I'm looking for a way to do an offline render with AVAudioEngine. In Audio Toolbox terms, this means creating an AUGraph with AUGenericOutput at the end (rather than AURemoteIO or AUHAL) and then calling AudioUnitRender() on this unit to pull samples through the graph and get all the unit effects applied, rather than being connected to actual output hardware and calling AUGraphStart(). Looking at the AV Foundation audio API, I can't quite see a way to do it. The AVAudioEngine is effectively the graph, and there are AVAudioNodes to wrap the nodes within the graph, but the AVAudioOutputNode (and parent AVAudioIONode) don't expose any kind of render: method. AVAudioIONode exposes the underlying AudioUnit as a property, but it's read-only, so I assume this is just AURemoteIO or AUHAL as appropriate, and can't be used to insert an AUGenericOutput. So… can this be done, or am I writing an enhancement request tonight? Oh, since someone might ask: the reason I need this is so that I can post-process some audio
4
0
1.8k
Jul ’15
iOS 9: Playback pauses after device lock (if display inactive)
Hi, I'm using AUGraph to play back audio and found a strange issue: playback pauses after device lock (it works fine in the background, and after lock if the display stays active). And of course I set up the state and category for the audio session:```[[AVAudioSession sharedInstance] setActive:YES error:&activationError];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError];```and added the background mode to the Info.plist:```<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>```Can anyone check? (Is it my mistake, or just a bug inside iOS 9?) Thanks
0
0
233
Jul ’15
Reply to External Audio Files for AUSampler
Thanks for clearing that up! I'd also just like to mention that it doesn't seem we can use NSDownloadsDirectory either, as trying to create the directory returns an error (The operation couldn’t be completed. Operation not permitted). So it would appear that the only options on iOS are loading from the bundle or from the Documents directory. Not ideal when you also want to enable iTunes File Sharing for other content. That being said, I did think of a workaround: put the content in an arbitrary location, and when loading the .aupreset file, rewrite all the file references to point to that location before setting it on the Audio Unit. For now I'm just going to use the Documents directory, but if I ever need to enable iTunes File Sharing I may have to resort to the workaround.
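The rewrite-the-references workaround described above might be sketched like this. It rests on two assumptions worth stating plainly: that the .aupreset is an ordinary plist, and that its sample paths live under a "file-references" dictionary (which is what inspecting AUSampler presets suggests); the function and parameter names are hypothetical:

```swift
import Foundation

// Sketch: load a .aupreset plist and point its sample references at an
// arbitrary content directory before handing the preset to the Audio Unit.
func retargetPreset(presetPath: String, contentDirectory: String) -> [String: AnyObject]? {
    guard let data = NSData(contentsOfFile: presetPath),
          let plist = (try? NSPropertyListSerialization.propertyListWithData(data, options: [], format: nil)) as? [String: AnyObject],
          let fileRefs = plist["file-references"] as? [String: String] else {
        return nil
    }
    var preset = plist
    var rewritten = [String: String]()
    for (key, oldPath) in fileRefs {
        // Keep each file name, but swap in the new directory.
        let fileName = (oldPath as NSString).lastPathComponent
        rewritten[key] = (contentDirectory as NSString).stringByAppendingPathComponent(fileName)
    }
    preset["file-references"] = rewritten
    return preset
}
```

As I understand it, the rewritten dictionary would then be applied to the AUSampler via kAudioUnitProperty_ClassInfo, the same way an unmodified preset is.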
Topic: Media Technologies SubTopic: Audio Tags:
Jul ’15
Reply to Xcode code sense thinks Objective-C file is regular plain-text file
It could be that you've unintentionally instructed Xcode to treat your file as plaintext. Open your file in the main editor and open the File Inspector. Right below the file name is a pop-up button labeled Type. Make sure that's set to Objective-C Source. If that doesn't work, try quitting and relaunching Xcode. That will reset the indexing and code completion subsystems.
Jul ’15