Hi, I am new to iOS programming. I want to learn more about audio-related features in Swift. How can I get a better understanding of the concepts of nodes, the AudioToolbox audio units, and AUGraphs using Swift? Are there any resources with examples available? I want to produce audio spectrums, etc. Your input will be greatly appreciated. Thank you, Fizza
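A note that may help frame the search: AUGraph is deprecated, and AVAudioEngine is Apple's current node-based audio API in Swift. As a minimal sketch of producing a spectrum, the snippet below taps the input node and runs a vDSP FFT; the buffer size and FFT length (1024) are assumptions, not requirements.

```swift
import AVFoundation
import Accelerate

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

let fftSize = 1024
let log2n = vDSP_Length(log2(Float(fftSize)))
let fftSetup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2))!

// Tap the input node; each callback delivers a PCM buffer we can analyze.
input.installTap(onBus: 0, bufferSize: AVAudioFrameCount(fftSize), format: format) { buffer, _ in
    guard let samples = buffer.floatChannelData?[0],
          buffer.frameLength >= AVAudioFrameCount(fftSize) else { return }
    var real = [Float](repeating: 0, count: fftSize / 2)
    var imag = [Float](repeating: 0, count: fftSize / 2)
    real.withUnsafeMutableBufferPointer { realPtr in
        imag.withUnsafeMutableBufferPointer { imagPtr in
            var split = DSPSplitComplex(realp: realPtr.baseAddress!,
                                        imagp: imagPtr.baseAddress!)
            // Pack interleaved samples into split-complex form for the real FFT.
            samples.withMemoryRebound(to: DSPComplex.self, capacity: fftSize / 2) {
                vDSP_ctoz($0, 2, &split, 1, vDSP_Length(fftSize / 2))
            }
            vDSP_fft_zrip(fftSetup, &split, 1, log2n, FFTDirection(FFT_FORWARD))
            var magnitudes = [Float](repeating: 0, count: fftSize / 2)
            vDSP_zvmags(&split, 1, &magnitudes, 1, vDSP_Length(fftSize / 2))
            // magnitudes[i] is the power at frequency i * sampleRate / fftSize.
        }
    }
}

try? engine.start()
```

The same engine can host effect and mixer nodes (attach, then connect) much like the old AUGraph, which is why it is usually the place to start today.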
Search results for "Popping Sound" (19,352 results found)
I'm trying to get streaming audio to work in WatchKit. I simply use AVPlayer with a remote URL. The code works on iOS, but watchOS gives me an error. I added Background Audio to Info.plist.

```swift
do {
    try AVAudioSession.sharedInstance().setCategory(.playback)
    try AVAudioSession.sharedInstance().setActive(true, options: .notifyOthersOnDeactivation)
} catch {
    NSLog(error.localizedDescription)
}
let playBackURL = URL(string: "https://facyremoturl/stream.mp3")
PlayerItem = AVPlayerItem(url: playBackURL!)
PlayerItem.addObserver(self, forKeyPath: "timedMetadata", options: NSKeyValueObservingOptions(), context: nil)
Player = AVPlayer(playerItem: PlayerItem)
Player.play()
```

The logs now show this error:

2019-06-05 14:31:09.149728+0200 watchtest WatchKit Extension[92922:10939355] [] [14:31:09.148] <<< CFByteFlume >>> FigByteFlumeCreateWithHTTP: [0x786984c0] Created CFHTTP byte flume 0x786984b0. 2019-06-05 14:31:09.210511+0200 watchtest WatchKit Extension[92922:10938788] Task <C4B1C312-11B6-4547-8072-EC
Hi, can someone help me with a very annoying issue I have with iMovie? I'm making videos where I take an MP4 video I've filmed and use an MP3 file as the sound. The sound and the clip are both exactly 13 seconds long, but for some reason iMovie keeps cutting off the audio about 1 second before the end of the clip. When I delete the audio from the project and add it back, it cuts again in a different place, so I have to keep removing it and adding it back until it stops cutting it. It's as if it sets the cut at some random point between the end and up to about 2 seconds before the end of the clip. How do I fix this? Kind regards, Ben
How do I configure the audio session in CallKit for an app that uses the WebRTC framework? I tried setting the audio mode to record and play, but the caller can't hear my voice, although I can hear his. Can you please suggest what to do?
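One common pattern for this situation, sketched below under the assumption that the app uses WebRTC's `RTCAudioSession` wrapper: configure the session's category up front, but let CallKit activate it, and only hand the session to WebRTC inside the provider's `didActivate` callback. The class and method names other than the CallKit/AVFoundation APIs are illustrative.

```swift
import AVFoundation
import CallKit

final class CallManager: NSObject, CXProviderDelegate {

    // Configure category/mode before reporting the call, but do NOT call
    // setActive(true) yourself; CallKit owns activation of the session.
    func configureAudioSession() {
        let session = AVAudioSession.sharedInstance()
        do {
            try session.setCategory(.playAndRecord, mode: .voiceChat,
                                    options: [.allowBluetooth])
        } catch {
            print("Audio session setup failed: \(error)")
        }
    }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // Start WebRTC audio only once CallKit has activated the session.
        // If you use WebRTC's wrapper, this is typically (assumption):
        // RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        // Mirror image of the above (assumption):
        // RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    }

    func providerDidReset(_ provider: CXProvider) {
        // Stop audio and tear down calls here.
    }
}
```

A one-way-audio symptom like "the caller can't hear me" often means recording started before CallKit activated the session, which is exactly what deferring to `didActivate` avoids.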
I'm trying to build an audio unit that can load another one. The first step, listing and getting the components only works in the example code if and when the audio unit is loaded by its accompanying main app. But when loaded inside Logic Pro for example, the listed components will be limited to Apple-manufactured ones. On another forum, although not for an AppEx, someone indicated that the solution was to enable Inter-App Audio, which is now deprecated. Tried all three methods of AVAudioUnitComponentManager to get the component list. Please advise. Many thanks, Zoltan
Hey, I would like to know if it's possible to play one of the system sounds for a UILocalNotification. For example, I would like to do something like this: localNotification.soundName = "/Library/Ringtones/Apex.m4r" But that doesn't work. Is there a solution for this? I don't want to import custom sounds if there are already usable sounds for notifications on the device.
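For what it's worth, third-party apps cannot reference arbitrary ringtone files on the device; the only built-in option is the default notification sound. A minimal sketch with the modern UserNotifications API (which replaced UILocalNotification):

```swift
import UserNotifications

// The system default sound is the one built-in sound an app can use
// without bundling an audio file of its own.
let content = UNMutableNotificationContent()
content.title = "Hello"
content.sound = .default
// Anything other than the default sound must ship with the app
// (in the bundle or in Library/Sounds) and be referenced by name.
```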
I'm looking for the control that allows the user to select from a list that pops up when the user clicks the control. I need to know the name of the control so that I can find it in the Object Library.
I'm just noodling around with Xcode and this alert appears every time I run my app in Xcode. Seems to be new with 15.0. I didn't see anything in Settings.
Hello. I'm a game sound designer. My project supports Dolby Atmos via the Wwise plug-in 'Dolby Atmos Renderer'. When I connected the test device (iPhone 12 Pro Max) to the iMac and checked it with Console, I could see the log message in the form Dolby described (Message: DA4MG Atmos_7_1_4 Spatializing [0] times with input channel [12]), so I think Dolby Atmos is working fine. However, the iPhone does not indicate that Dolby Atmos is operating (unlike Apple Music, 'Dolby Atmos' or 'Spatial Audio On' cannot be found). My questions are: 1. Does using 'Dolby Atmos Renderer' meet Apple's spatial audio standards? 2. Do only Apple-certified applications display 'Spatial Audio On' or 'Dolby Atmos'? 3. How do I get 'Spatial Audio On' or 'Dolby Atmos' to display? I also asked Apple, but I'm posting on the forum because other developers may know the solution. Even Tower of Fantasy, which supports Atmos, doesn't show Spatial Audio On.
I assume that Dolby Digital passthrough will work for HLS, but is there any other way to do surround sound without using AVPlayer? Does the Apple TV support multichannel PCM over HDMI?
After installing Xcode 13.3, a prompt keeps popping up asking me to install the command line developer tools. I have already installed them, but the prompt keeps appearing. What should I do?
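A few commands that are commonly suggested for this symptom, as a sketch (they assume Xcode lives at the standard /Applications/Xcode.app path):

```shell
# Show which developer directory is currently active; a stale or missing
# path here is a frequent cause of the repeated install prompt.
xcode-select -p

# Point the tools at the full Xcode installation...
sudo xcode-select -s /Applications/Xcode.app/Contents/Developer

# ...or reset to the default location.
sudo xcode-select --reset
```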
In a VoIP application, when CallKit is enabled and we try to play a video through AVPlayer, the video content updates frame by frame and the audio of the content is not audible. This issue is observed only in iOS 17. Any idea how we can resolve this?
Hi, What are the requirements / restrictions for a custom iOS Notification sound? Is this widely available to new apps on the App Store? I know BBC News & Tinder have custom sounds. Thanks
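For reference, custom notification sounds are available to any App Store app, not just large publishers. The constraints: the file must ship in the app bundle (or in the app's Library/Sounds directory), be in aiff, wav, or caf format, and be 30 seconds or shorter, otherwise the system plays the default sound instead. A minimal sketch ("alert.caf" is a hypothetical filename):

```swift
import UserNotifications

let content = UNMutableNotificationContent()
content.body = "Breaking news"
// The named file must be bundled with the app and be <= 30 seconds,
// or the system falls back to the default sound.
content.sound = UNNotificationSound(named: UNNotificationSoundName("alert.caf"))
```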
Hi there, I'd like to create and expose an Audio Unit Extension for iOS, but there are no parameters that a user can set. Is it possible to expose an Audio Unit Extension without a storyboard? If I leave the storyboard-entry empty in the plist, the extension will crash ...
I am trying to play audio from an SCNNode in SceneKit. I have looked at the sample code and have tried to build my own code using an ARImageAnchor. I'm not sure how to proceed. ARKit recognizes the images; I just need help with the next steps for playing the audio.
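A minimal sketch of the usual next step, assuming an ARSCNView whose delegate receives the node ARKit creates for each recognized image; "sound.mp3" is a hypothetical asset name in the app bundle:

```swift
import SceneKit
import ARKit

// ARSCNViewDelegate callback: ARKit hands us a node positioned at the
// recognized image, and we attach positional audio to it.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard anchor is ARImageAnchor,
          let source = SCNAudioSource(fileNamed: "sound.mp3") else { return }
    source.isPositional = true   // pan/attenuate with the node's position
    source.loops = true
    source.load()                // decode up front so playback starts cleanly
    node.addAudioPlayer(SCNAudioPlayer(source: source))
}
```

Adding the `SCNAudioPlayer` to the node starts playback; removing the player (or the node) stops it.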