Hi Apple Team, We have a technical query regarding one feature: Audio Recognition and Live Captioning. We are developing an app for the deaf community to remove communication barriers. We want to know whether it is possible to recognize the sound coming from other applications on an iPhone and show live captions in our own iOS application.
Search results for "Popping Sound" (19,757 results found)
I want to use the AirPods Pro to listen to spatialized audio that I am synthetically generating, and have that audio sit at a fixed location relative to the head: as you turn your head, the sound moves, and as you walk around you hear it from different positions. Is there example code for this? How do I make this happen on iOS? There are so many APIs for outputting sound on iOS; which one would be the quickest route to working spatialized 3D audio?
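As one possible starting point (a minimal sketch of my own, not from the post): AVAudioEngine with AVAudioEnvironmentNode renders a mono source binaurally at a 3D position; feeding head pose from CMHeadphoneMotionManager into the listener orientation would complete the head-tracking part.

```swift
import AVFoundation

// Minimal sketch: position a synthetic mono source in 3D space.
// Head tracking (CMHeadphoneMotionManager) would continuously update
// environment.listenerAngularOrientation; omitted here for brevity.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let source = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(source)

// 3D positioning requires a mono input format.
let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
engine.connect(source, to: environment, format: mono)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

source.renderingAlgorithm = .HRTFHQ                   // binaural rendering for headphones
source.position = AVAudio3DPoint(x: 1, y: 0, z: -2)   // 1 m right, 2 m ahead
environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)

do {
    try engine.start()
    // Schedule your synthesized buffers on `source`, then:
    source.play()
} catch {
    print("Engine failed to start: \(error)")
}
```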
Dear all, my app has now been rejected almost 16 times, citing the reason below: "We discovered one or more bugs in your app when reviewed on iPhone and iPad running iOS 11.2.5 on Wi-Fi connected to an IPv6 network. Specifically, after inputting the demo account credentials in the fields provided and tapping Login, we are presented with a blank alert with 'OK' listed at the bottom." My base URL is IPv6-enabled. I urgently need help resolving this; my client is really furious. Please help.
```swift
import UIKit
import AppTrackingTransparency

func requestDFA() {
    if #available(iOS 14, *) {
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // The user has authorized tracking.
                print("Tracking authorization status: authorized")
            case .denied:
                // The user declined tracking.
                print("Tracking authorization status: denied")
            case .notDetermined:
                // The user has not made a choice yet.
                print("Tracking authorization status: not determined")
            case .restricted:
                // Tracking is restricted, e.g. by parental controls.
                print("Tracking authorization status: restricted")
            @unknown default:
                print("Tracking authorization status: unknown")
            }
        }
    }
}

@main
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Override point for customization after application launch.
        requestDFA()
        return true
    }

    // MARK: UISceneSession Lifecycle
    func application(_ application: UIApplication,
                     configurationForConnecting connectingSceneSession: UISceneSession,
                     options: UIScene.ConnectionOptions) -> UISceneConfiguration {
        UISceneConfiguration(name: "Default Configuration",
                             sessionRole: connectingSceneSession.role)
    }
}
```
I want to add vibration and sound to notifications when they arrive on Apple Watch. I don't know how to do it: currently my notifications are displayed on the Apple Watch, but without any sound or vibration. How can I add these to the notification?
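For a local notification, sound (and the haptic that accompanies it on the watch) comes from the content's sound property; for a remote push, the equivalent is the "sound" key in the aps payload. A minimal sketch:

```swift
import UserNotifications

// Minimal sketch: a local notification that plays the default sound
// (with its haptic) when delivered on Apple Watch. Without `sound`,
// delivery is silent.
let content = UNMutableNotificationContent()
content.title = "Reminder"
content.body = "This should buzz and chime on the watch."
content.sound = .default

let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)
let request = UNNotificationRequest(identifier: UUID().uuidString,
                                    content: content,
                                    trigger: trigger)
UNUserNotificationCenter.current().add(request) { error in
    if let error { print("Scheduling failed: \(error)") }
}
```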
I have a link in a UITextView in a view controller that is part of a UINavigationController, which I present modally. When I peek and pop (force touch / 3D Touch) on the link twice, it opens the link in Safari, which is the default behavior. But when I return to my app, the view controller has been dismissed. Has anyone encountered this issue, and do you know how to stop the view controller from being dismissed? Thanks in advance.
I am using a context menu for each photo in a three-column grid, so the code looks like this: List { LazyVGrid(columns: columns) { View().contextMenu { Button() } } }. Long-pressing an element pops up the whole list rather than the individual element: the entire list is highlighted when the context menu appears. How can I make it highlight the individual element instead, not the whole list? If I remove either the List or the LazyVGrid, the problem goes away, but then the columns are not laid out the way I want. I also don't want to change the List to a VStack or ScrollView, because those don't have a refresh control. Thanks in advance.
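A self-contained repro of the setup described above; `Photo` and the cell content are hypothetical stand-ins for the real model and views:

```swift
import SwiftUI

struct Photo: Identifiable {
    let id = UUID()
    let name: String
}

struct GridInListView: View {
    let photos = (1...9).map { Photo(name: "Photo \($0)") }
    let columns = Array(repeating: GridItem(.flexible()), count: 3)

    var body: some View {
        List {
            LazyVGrid(columns: columns) {
                ForEach(photos) { photo in
                    Text(photo.name)
                        .frame(maxWidth: .infinity, minHeight: 100)
                        // Long-pressing here highlights the whole List row
                        // (the entire grid) rather than this one cell.
                        .contextMenu {
                            Button("Delete") { }
                        }
                }
            }
        }
        .refreshable { }   // the List is kept for pull-to-refresh
    }
}
```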
Hello, I'm developing a pre-amp that will accept a music stream over Ethernet. On the OS X side, I'm writing a driver (just a plugin) that takes system audio and sends it over Ethernet. I'm able to collect the audio data; now I need to figure out how to stream it out. What is the proper way to do sockets in an audio plugin? I tried using socket callbacks with CFRunLoop, but it seems to freeze all of coreaudiod. Any other suggestions on this architecture would be appreciated. For example, does anyone see a need for a kernel-level driver, or will the plugin suffice? Thanks, Konstantin
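One alternative to CFRunLoop callbacks (a sketch of my own, not from the post) is to do all socket work on a private dispatch queue so nothing blocks coreaudiod; the host, port, and bare-UDP framing here are placeholders, not a finished protocol.

```swift
import Foundation
import Darwin

// Sketch: ship collected audio buffers over UDP from a private dispatch
// queue so no socket work runs inside coreaudiod's run loop.
final class AudioSender {
    private let queue = DispatchQueue(label: "audio.sender")
    private let sock: Int32
    private var dest = sockaddr_in()

    init?(host: String, port: UInt16) {
        sock = socket(AF_INET, SOCK_DGRAM, 0)
        guard sock >= 0 else { return nil }
        dest.sin_family = sa_family_t(AF_INET)
        dest.sin_port = port.bigEndian
        dest.sin_addr.s_addr = inet_addr(host)
    }

    deinit { close(sock) }

    // Pass in a copy of the audio buffer; the real-time IO thread should
    // only enqueue, never touch the socket directly.
    func send(_ data: Data) {
        let sock = self.sock
        let addr = self.dest
        queue.async {
            data.withUnsafeBytes { raw in
                withUnsafePointer(to: addr) {
                    $0.withMemoryRebound(to: sockaddr.self, capacity: 1) { sa in
                        _ = sendto(sock, raw.baseAddress, raw.count, 0,
                                   sa, socklen_t(MemoryLayout<sockaddr_in>.size))
                    }
                }
            }
        }
    }
}
```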
We've been using remote push notifications from a gateway to deliver auth information. In some cases there's no sound to tell the user that a notification has arrived. In our code we specify UNAuthorizationOptionSound, and the notification settings on the devices allow both notifications and sounds for the application. The client code hasn't changed in a couple of years, so I'm wondering whether something might have changed on the sending side; that's not my strongest area, though. Does anyone know whether a change in the call generating the push notification could have cut off the sound, and where I would look for documentation on that?
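Since the client side sounds unchanged, it may be worth comparing the gateway's payload against a known-good one: sound on a remote notification is controlled by the "sound" key inside the aps dictionary, and if the sender omits it the notification arrives silently even when the app requested sound authorization. A sketch of a minimal payload (titles and bodies here are made up):

```swift
import Foundation

// For comparison only: an APNs payload that should produce the default
// sound on delivery.
let payload: [String: Any] = [
    "aps": [
        "alert": ["title": "Sign-in request", "body": "Approve this login?"],
        "sound": "default"
    ]
]
// Force-try for brevity in this sketch.
let body = try! JSONSerialization.data(withJSONObject: payload)
```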
I have an AVAudioEngine, but I don't know how to export the audio from the engine to a file. Can anyone help?
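One common approach (a minimal sketch, assuming an already-configured engine) is to install a tap on the main mixer and write each rendered buffer to an AVAudioFile; for faster-than-real-time export, the engine's offline manual rendering mode is the alternative.

```swift
import AVFoundation

// Minimal sketch: capture the engine's output to a file by tapping the
// main mixer node. `engine` is assumed to be your configured AVAudioEngine.
func recordOutput(of engine: AVAudioEngine, to url: URL) throws -> () -> Void {
    let format = engine.mainMixerNode.outputFormat(forBus: 0)
    let file = try AVAudioFile(forWriting: url, settings: format.settings)

    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        // Write each rendered buffer as it passes through the mixer.
        try? file.write(from: buffer)
    }
    // Call the returned closure to stop recording.
    return { engine.mainMixerNode.removeTap(onBus: 0) }
}
```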
In my app there is a spinning wheel animated with keyframe animations. I want to add sound effects that follow the wheel's rotation, and I tried AVAudioPlayer, but I need the audio speed to change with the speed of the wheel's rotation. Please help me ASAP.
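AVAudioPlayer can vary playback speed through its rate property (0.5x to 2.0x) once enableRate is set before playback. A minimal sketch, assuming a bundled "spin.m4a" asset and a normalized wheel speed coming from the animation callback:

```swift
import AVFoundation

final class WheelSound {
    private let player: AVAudioPlayer

    init?() {
        guard let url = Bundle.main.url(forResource: "spin", withExtension: "m4a"),
              let p = try? AVAudioPlayer(contentsOf: url) else { return nil }
        player = p
        player.enableRate = true      // must be enabled before play()
        player.numberOfLoops = -1     // loop while the wheel spins
        player.prepareToPlay()
    }

    func start() { player.play() }
    func stop()  { player.stop() }

    // Call from the animation callback; `speed` is normalized 0...1.
    func update(speed: Float) {
        player.rate = 0.5 + 1.5 * speed   // map to the 0.5x–2.0x range
    }
}
```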
I am working on an iOS app that mixes an audio file with the user's voice input into a new file while simultaneously playing that audio file's content. You can think of this app as a karaoke player that records both the singer's voice and the original soundtrack to a file while playing the original soundtrack for the singer. I use AUGraph to establish the audio processing flow as follows:

- One mixer AudioUnit (type kAudioUnitType_Mixer, subtype kAudioUnitSubType_MultiChannelMixer) with 2 inputs;
- One resampler AudioUnit (kAudioUnitType_FormatConverter, kAudioUnitSubType_AUConverter) for the necessary sample-rate conversion from the audio file's rate to that of the mic input, so that the formats of the mixer's 2 input buses match;
- Then connect these nodes: mic output (output element of the IO node's input scope) -> mixer input 0; resampler -> mixer input 1; mixer output -> speaker input (input element of the IO node's output scope);
- Set a render…
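For anyone building this today, here is a sketch of the same topology using AVAudioEngine rather than the since-deprecated AUGraph (my own sketch, not the poster's code; trackURL and mixURL are placeholders, and a real app also needs an AVAudioSession configured for .playAndRecord):

```swift
import AVFoundation

// AVAudioEngine inserts sample-rate conversion automatically when the
// file and mic formats differ, so no explicit resampler node is needed.
let engine = AVAudioEngine()
let backingTrack = AVAudioPlayerNode()

func startKaraoke(trackURL: URL, mixURL: URL) throws {
    engine.attach(backingTrack)
    let mixer = engine.mainMixerNode

    // Input 0: microphone. Input 1: backing track.
    engine.connect(engine.inputNode, to: mixer,
                   format: engine.inputNode.outputFormat(forBus: 0))
    let track = try AVAudioFile(forReading: trackURL)
    engine.connect(backingTrack, to: mixer, format: track.processingFormat)

    // Record the mix to a file while it plays through the speaker.
    let mixFile = try AVAudioFile(forWriting: mixURL,
                                  settings: mixer.outputFormat(forBus: 0).settings)
    mixer.installTap(onBus: 0, bufferSize: 4096, format: nil) { buffer, _ in
        try? mixFile.write(from: buffer)
    }

    backingTrack.scheduleFile(track, at: nil)
    try engine.start()
    backingTrack.play()
}
```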
The structure of the view controllers and views in the project is as follows:

UITabBarController
├── VC_Tab1
│   └── VC_Tab1_Child (contains HeaderView)
└── VC_Tab2
    └── VC_Tab2_Child (contains MyButton)

self.navigationController?.popToRootViewController(animated: false)
tabBarController.selectedIndex = 1

When popToRootViewController(animated: false) is called in VC_Tab1_Child, followed by setting the tab controller's selectedIndex = 1, the following results are observed:

viewWillAppear(_:), deinit, viewDidAppear(_:)

The originally expected results are as follows:

viewWillDisappear(_:), viewDidDisappear(_:), deinit, deinit, deinit, headerView.backButton.rx.tap -> Event completed, headerView.backButton.rx.tap -> isDisposed, viewWillAppear(_:), viewDidAppear(_:)

The HeaderView belonging to VC_Tab1_Child was not deallocated, and the resources associated with that view were also not released. Similarly, VC_Tab1_Child.viewWillDisappear and VC_Tab1_Child.viewDidDisappear were not called.
Hello, we are trying to send compressed audio data over BLE. Our test setup is a Linux PC (peripheral) and an iPhone acting as the central. We have the following characteristics on the peripheral for audio transfer: Char1 (Read | Notify) and Char2 (Write). We have a use case where audio transfer happens both ways simultaneously over these two characteristics: Peripheral -> Central (using Char1) and Central -> Peripheral (using Char2). The Peripheral -> Central transfer seems fast enough for audio playback, but with the Central -> Peripheral transfer we see considerable lag; it takes more than double the time of the Peripheral -> Central transfer. We also tried sending a connection parameter update request to set the connection interval to 20-21 ms, but we still see no increase in the transfer speed. Are we missing something that would enable faster transfer rates when sending data from the iPhone to the PC?
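One thing worth checking (a sketch of a common approach, not from the post): if the Char2 writes use .withResponse, each packet waits a full round trip before the next can go out. Write-without-response avoids that, provided the Linux side also exposes the Write Without Response property on Char2; `peripheral` and `audioChar` below are assumed to be the connected CBPeripheral and the discovered Char2.

```swift
import CoreBluetooth

func sendAudio(_ data: Data, to peripheral: CBPeripheral,
               over audioChar: CBCharacteristic) {
    // Pack each write to the negotiated maximum payload size.
    let mtu = peripheral.maximumWriteValueLength(for: .withoutResponse)
    var offset = 0
    while offset < data.count {
        // If the transmit buffer is full, stop here and resume from
        // peripheralIsReady(toSendWriteWithoutResponse:) instead of
        // dropping data.
        guard peripheral.canSendWriteWithoutResponse else { return }
        let end = min(offset + mtu, data.count)
        peripheral.writeValue(data.subdata(in: offset..<end),
                              for: audioChar, type: .withoutResponse)
        offset = end
    }
}
```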
Has anybody had any luck playing sound on Apple Watch? I have tried since watchOS 2 and never got it to work. The watchOS 3 documentation (https://developer.apple.com/library/prerelease/content/documentation/General/Conceptual/WatchKitProgrammingGuide/AudioandVideo.html#//apple_ref/doc/uid/TP40014969-CH24-SW19) says: "Any audio you play is routed to a paired Bluetooth headset if one is available. If no Bluetooth headset is available, audio is routed to the Apple Watch speaker." Here's my code; instead of hearing the sound, the UI hangs. I have placed the sound asset ShipHit.m4a in both the Watch App and the Watch Extension for testing purposes.

let url = URL(fileURLWithPath: Bundle.main.path(forResource: "ShipHit", ofType: "m4a")!)
let asset = WKAudioFileAsset(url: url)
let playerItem = WKAudioFilePlayerItem(asset: asset)
soundPlayer = WKAudioFilePlayer(playerItem: playerItem)
...
if soundPlayer.status == .readyToPlay {
    soundPlayer.play()
}