Is it possible (and if not, why isn't it possible) for developers to access live phone call audio through an app? I know of a few apps that allow users to equalize sound from music/media apps, but none that can manipulate phone call audio. Is this just a security feature to prevent malicious developers from recording calls?
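As far as I know, no public iOS API exposes the call audio stream to third-party code; CallKit surfaces only call state. A minimal sketch of that boundary (the class name is hypothetical):

import CallKit

// CallKit reports call *metadata* (connected, ended, on hold), never the audio.
final class CallWatcher: NSObject, CXCallObserverDelegate {
    private let observer = CXCallObserver()

    override init() {
        super.init()
        observer.setDelegate(self, queue: nil)
    }

    func callObserver(_ callObserver: CXCallObserver, callChanged call: CXCall) {
        print("Call \(call.uuid): connected=\(call.hasConnected) ended=\(call.hasEnded)")
    }
}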
Search results for "Popping Sound" (19,352 results found)
Hi, my account was approved for the com.apple.developer.playable-content entitlement, but that entitlement is now deprecated and I want to switch to the new com.apple.developer.carplay-audio entitlement. I'm having some problems making the transition; do I need to submit a new request to Apple for the new entitlement? Thanks.
Hi, I have an app ready from Construct 3, and everything is fine except for audio problems that appear only in Xcode. I exported the game from Construct 3 to Xcode using Cordova, but some high audio frequencies are distorting (aliasing). I think it is a problem with the WebM files. If I could add these audio files as m4a directly in Xcode, I think the problem would be solved. Is this something that can be done? Thank you in advance.
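A minimal sketch of the file-based route, assuming the converted m4a files are added to the app bundle in Xcode (the resource name below is hypothetical):

import AVFoundation

// Play a bundled m4a directly, bypassing the WebM decode path.
var player: AVAudioPlayer?

func playBundledAudio() {
    guard let url = Bundle.main.url(forResource: "effect", withExtension: "m4a") else { return }
    player = try? AVAudioPlayer(contentsOf: url)
    player?.prepareToPlay()
    player?.play()
}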
Using WKWebView on iOS, I encountered a problem controlling the audio output level of an <audio> element in an HTML page. I tried the 'volume' property of the <audio> element and also the Web Audio API 'GainNode'. Neither approach worked: the player's output stays at (and is reported as) 1.0. Curiously, within the same scope of code I can change the player's other properties, such as playback rate; that does work. But calls to 'volume' or 'GainNode' are flatly ignored. Two observations make me believe this is a bug. First: if I use the old deprecated UIWebView instead of WKWebView, everything works fine, even the Web Audio API AudioContext, Splitter, Merger, etc. Second: in the macOS version of the app, the very same HTML page and <audio> element behave as expected. Any suggestions for a workaround would be much appreciated, Igor Borodin
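For reference, a sketch of the kind of call being described, driven from the native side and assuming the page contains an <audio> element; per the report above, the assignment is silently ignored by WKWebView on iOS:

import WebKit

// Try to set the <audio> element's volume from native code.
func setWebAudioVolume(_ webView: WKWebView, to level: Double) {
    let js = "document.querySelector('audio').volume = \(level);"
    webView.evaluateJavaScript(js) { _, error in
        if let error = error { print("JS error: \(error)") }
    }
}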
My question is very high level, being totally new to streaming media. Is it feasible to intercept the audio component of a TV broadcast stream and then replace segments of that audio? Conceptually, this capability/app would run on the streaming device (e.g. Apple TV) and be configurable by the end user. Any guidance on the feasibility is appreciated. If it is feasible, I would also appreciate guidance on the technical standards, technical components, SDKs, classes, etc. that I could begin to study for this audio handling. Thanks. P.S. - Are there any legal ramifications of manipulating the audio output (presented to the end user/viewer only) of a broadcast from “x” network/service (fill in your content provider of choice)?
I know that if you want background audio from AVPlayer, you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly. I have all that squared away, and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController. If you want PiP to behave as expected when you background your app by switching to another app or going to the home screen, you can't perform the detachment operation; otherwise the PiP display fails. On an iPad, if PiP is active and you lock your device, you continue to get background audio playback. However, on an iPhone, if PiP is active and you lock the device, the audio pauses. If PiP is inactive and you lock the device, the audio pauses and you have to manually tap play on the lock screen controls; this is the same on iPad and iPhone. My questions are: Is the
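A condensed sketch of the setup being described (session configuration plus PiP driven from the player layer, without detaching the player); the function name is illustrative:

import AVKit
import AVFoundation

// Configure the session for background playback, then create the PiP
// controller from the same layer that stays attached to the player.
func setUpPlayback(with playerLayer: AVPlayerLayer) -> AVPictureInPictureController? {
    try? AVAudioSession.sharedInstance().setCategory(.playback)
    try? AVAudioSession.sharedInstance().setActive(true)

    guard AVPictureInPictureController.isPictureInPictureSupported() else { return nil }
    return AVPictureInPictureController(playerLayer: playerLayer)
}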
I don't see the option anywhere, but I have a game that is a straight view controller using the focus engine, and every time the focus changes the TV plays a sound. I was curious if there is a way to disable that so I can play my own game's sounds.
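If I remember correctly, tvOS 11 added control over focus sounds via UIFocusSoundIdentifier; a sketch, assuming a UIViewController subclass:

import UIKit

class GameViewController: UIViewController {
    // Returning .none silences the system focus sound for focus updates in
    // this environment, so the game can play its own effects instead.
    override func soundIdentifierForFocusUpdate(in context: UIFocusUpdateContext) -> UIFocusSoundIdentifier? {
        return UIFocusSoundIdentifier.none
    }
}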
Hey guys, is it possible to play a sound through the Apple Watch speaker, like when you tap on the watch face and Mickey or Minnie say the time? I want AudioKit to play a sound of my choice (not a sound file/URL) on some user interaction, but it appears that is not possible. The documentation indicates that you can only play a sound file (URL) through a connected Bluetooth device. Is that the current limitation? Thanks, Michael
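A sketch of the file-based route the documentation describes, assuming watchOS 5 or later and a bundled sound file (the resource name is hypothetical; whether playback reaches the built-in speaker or a paired Bluetooth device follows the session's routing policy):

import AVFoundation

// watchOS: configure the session for playback, then play a bundled file.
var player: AVAudioPlayer?

func playChime() {
    guard let url = Bundle.main.url(forResource: "chime", withExtension: "m4a") else { return }
    try? AVAudioSession.sharedInstance().setCategory(.playback)
    try? AVAudioSession.sharedInstance().setActive(true)
    player = try? AVAudioPlayer(contentsOf: url)
    player?.play()
}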
I have AVComposition playback via AVPlayer, where the AVComposition has multiple audio tracks with an audioMix applied. My question is: how can I compute audio meter values for the audio playing back through AVPlayer? Using MTAudioProcessingTap, it seems you can only get a callback for one track at a time. But if that route has to be used, it's not clear how to get sample values for all the audio tracks at a given time in a single callback.
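One possible shape for the per-track route (rather than a single callback): attach a separate tap to each track's input parameters, so each process callback meters its own track. A sketch, with the metering math omitted:

import AVFoundation
import MediaToolbox

// Build an audio mix with one processing tap per audio track.
func makeMeteringMix(for composition: AVComposition) -> AVAudioMix {
    let mix = AVMutableAudioMix()
    var inputParams: [AVMutableAudioMixInputParameters] = []

    for track in composition.tracks(withMediaType: .audio) {
        let params = AVMutableAudioMixInputParameters(track: track)

        var callbacks = MTAudioProcessingTapCallbacks(
            version: kMTAudioProcessingTapCallbacksVersion_0,
            clientInfo: nil,
            init: nil,
            finalize: nil,
            prepare: nil,
            unprepare: nil,
            process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
                // Pull this track's samples, then compute RMS/peak from bufferListInOut.
                MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                   flagsOut, nil, numberFramesOut)
            })

        var tap: Unmanaged<MTAudioProcessingTap>?
        MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                   kMTAudioProcessingTapCreationFlag_PostEffects, &tap)
        params.audioTapProcessor = tap?.takeRetainedValue()
        inputParams.append(params)
    }

    mix.inputParameters = inputParams
    return mix
}

Correlating the per-track values at a given time would still be up to you, e.g. by timestamping each callback's values and aligning them against the player's current time.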
On macOS Monterey beta 7, the Twitter app is no longer producing any sound. I have to share the tweet and open the link in Safari to get any sound.
How many audio files/sources or different sound effects, like gunshots or birds chirping, can iPhone hardware reproduce at the same time? What are the upper limits and restrictions on sound reproduction? Is there an iPhone spec sheet that deals with this specific hardware aspect?
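As far as I know, there is no published per-model voice count; mixing is done in software, so the practical ceiling is CPU and memory rather than a fixed hardware limit. A sketch of the idea with AVAudioEngine (the count of 32 is arbitrary):

import AVFoundation

// Many simultaneous sources, mixed in software by the engine's mixer node.
let engine = AVAudioEngine()
let players = (0..<32).map { _ in AVAudioPlayerNode() }

for player in players {
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: nil)
}

try? engine.start()
// Schedule a buffer or file on each node, then call play() on it.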
Hi, is it possible using MusicKit to play songs through audio filters, such as pitch shift or time stretch filters? Is it possible to play two songs at once for the purpose of fading between them? Is there a way that I can access tempo information about songs? Thanks, Frank
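For audio you can load yourself (Apple Music streams are DRM-protected and, as far as I know, can't be routed through your own DSP), AVAudioUnitTimePitch shows the kind of filter being asked about. A sketch:

import AVFoundation

// Pitch shift and time stretch applied to a player node's output.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()
timePitch.pitch = 300   // in cents; +300 = up three semitones
timePitch.rate = 1.25   // time stretch, independent of pitch

engine.attach(player)
engine.attach(timePitch)
engine.connect(player, to: timePitch, format: nil)
engine.connect(timePitch, to: engine.mainMixerNode, format: nil)

Crossfading two songs would be two player nodes feeding the mixer with opposing volume ramps.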
I'm looking for a sample code project on integrating Spatial Audio into my app, Tunda Island, a music-centric friend-making and dating app. I have gone as far as purchasing the book Exploring MusicKit by Rudrank Riyam, but to no avail.
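While waiting for official sample code, one route is AVAudioEngine's environment node; a minimal sketch (note that only mono sources are spatialized):

import AVFoundation

// Position a mono source in 3D space around the listener.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()
let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)

engine.attach(environment)
engine.attach(player)
engine.connect(player, to: environment, format: mono)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

player.renderingAlgorithm = .HRTFHQ
player.position = AVAudio3DPoint(x: 2, y: 0, z: -1)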
I'm trying to get streaming audio to work in WatchKit. I simply use AVPlayer with a remote URL. The code works on iOS, but watchOS gives me an error. I added Background Audio to the Info.plist.

import AVFoundation

do {
    try AVAudioSession.sharedInstance().setCategory(.playback)
    try AVAudioSession.sharedInstance().setActive(true, options: .notifyOthersOnDeactivation)
} catch {
    NSLog(error.localizedDescription)
}

let playBackURL = URL(string: "https://facyremoturl/stream.mp3")
playerItem = AVPlayerItem(url: playBackURL!)
playerItem.addObserver(self, forKeyPath: "timedMetadata", options: NSKeyValueObservingOptions(), context: nil)
player = AVPlayer(playerItem: playerItem)
player.play()

The logs now show this error:

2019-06-05 14:31:09.149728+0200 watchtest WatchKit Extension[92922:10939355] [] [14:31:09.148] <<< CFByteFlume >>> FigByteFlumeCreateWithHTTP: [0x786984c0] Created CFHTTP byte flume 0x786984b0.
2019-06-05 14:31:09.210511+0200 watchtest WatchKit Extension[92922:10938788] Task <C4B1C312-11B6-4547-8072-EC
Hi all, I'm looking for documentation about the new Audio Unit extension. I just found 'AudioUnitV3Example' in the included documentation, but it throws a 'page not found' error. Any suggestions? Thanks!