I have an AVAudioEngine, but I don't know how to export its audio to a file. Can anyone help?
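One common approach is to install a tap on the engine's main mixer node and append each rendered buffer to an AVAudioFile. A minimal sketch, assuming the engine's sources are wired up elsewhere (the file name and buffer size are illustrative):

```swift
import AVFoundation

// Sketch: capture the engine's mixed output into a .caf file.
let engine = AVAudioEngine()
let mixer = engine.mainMixerNode
let format = mixer.outputFormat(forBus: 0)
let url = FileManager.default.temporaryDirectory
    .appendingPathComponent("export.caf")
let file = try AVAudioFile(forWriting: url, settings: format.settings)

mixer.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    do { try file.write(from: buffer) }          // append each rendered buffer
    catch { print("write failed: \(error)") }
}

try engine.start()
// ... play your sources ...
// When finished:
// mixer.removeTap(onBus: 0)
```

This records in real time; for faster-than-real-time export, AVAudioEngine's manual rendering mode (`enableManualRenderingMode(_:format:maximumFrameCount:)`) may be worth looking at.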
Search results for "Popping Sound" (19,356 results found)
I am working on an iOS app that mixes an audio file with the user's voice input into a new file while simultaneously playing the audio file. You can think of this app as a karaoke player that records both the singer's voice and the original soundtrack to a file while playing the soundtrack for the singer. I use AUGraph to establish the audio processing flow:

One mixer AudioUnit (type kAudioUnitType_Mixer, subtype kAudioUnitSubType_MultiChannelMixer) with 2 inputs;
One resampler AudioUnit (kAudioUnitType_FormatConverter, kAudioUnitSubType_AUConverter) for the necessary sample rate conversion from the audio file's sample rate to the mic input's, so that the formats of the mixer's two input buses match;
Then connect these nodes: mic output (output element of the IO node's input scope) -> mixer input 0; resampler output -> mixer input 1; mixer output -> speaker input (input element of the IO node's output scope);
Set a render…
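The same topology can be sketched with AVAudioEngine, which handles the mixing and sample-rate conversion internally; this is not the poster's code, and `backingTrackURL`/`recordingURL` are assumed placeholders:

```swift
import AVFoundation

// Sketch: mic + backing track -> mixer -> speaker, with a tap on the
// mixer recording the combined mix to disk while it plays.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()              // plays the backing track
engine.attach(player)

let track = try AVAudioFile(forReading: backingTrackURL)
let mixer = engine.mainMixerNode

// Mic input -> mixer (the engine converts sample rates as needed)
engine.connect(engine.inputNode, to: mixer,
               format: engine.inputNode.outputFormat(forBus: 0))
// File player -> mixer
engine.connect(player, to: mixer, format: track.processingFormat)

let out = try AVAudioFile(forWriting: recordingURL,
                          settings: mixer.outputFormat(forBus: 0).settings)
mixer.installTap(onBus: 0, bufferSize: 4096,
                 format: mixer.outputFormat(forBus: 0)) { buffer, _ in
    try? out.write(from: buffer)              // record voice + track mix
}

player.scheduleFile(track, at: nil)
try engine.start()
player.play()
```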
Some of our clients who have updated their devices to iOS 11 have reported that they are not receiving any sound or vibration when receiving notifications; after testing myself I can confirm that this is the case. The notification is constructed on our API with "ping.aiff" as the value for sound, like so:

string appleSound = "ping.aiff";
var appleNotificationMessage = $"{{\"aps\":{{\"alert\":\"{message}\",\"sound\":\"{appleSound}\"}}}}";

I've tried to search around a bit but I haven't found anything. Any ideas why this may have stopped working? Notifications are coming through fine, they are just silent, and as far as I can tell the notification settings in the app are fine.
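For reference, the `sound` value must end up as a JSON string inside the `aps` dictionary; a well-formed payload (the alert text is a placeholder) looks like:

```
{
  "aps": {
    "alert": "Your message here",
    "sound": "ping.aiff"
  }
}
```

It may be worth logging the final serialized payload from the API to confirm the quotes survive the string interpolation, since a malformed `sound` value silently falls back to no sound.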
Hi all, I am developing a digital signal processing application using AudioToolbox to capture audio from an audio loopback application (BlackHole).

Environment: macOS Sonoma 14.4.1, Xcode 15.4, QuickTime 10.5 (I also tested with JRiver Media Center), BlackHole 2ch and 16ch

Problem: all audio samples received are zero.

Steps to reproduce:
1. Set Mac Settings > Sound audio output to BlackHole 2ch.
2. Set Mac Settings > Sound audio input to BlackHole 2ch.
3. Authorize Xcode to access the microphone.
4. In Audio MIDI Setup, enable "Use this device for sound input" and "Use this device for sound output". Set the volume of both to 1.0.
5. Play a 44.1 kHz, 16-bit signed integer, stereo FLAC file using QuickTime.
6. Start the C++ application.

Key details of my code below...

AudioStreamBasicDescription asbd = { 0 };
asbd.mFormatID = kAudioFormatLinearPCM;
asbd.mFormatFlags = kLinearPCMFormatFlagIsFloat | kLinearPCMFormatFlagIsPacked;
asbd.mSampleRate = 48000;
asbd.mBitsPerChannel = 32;
asb…
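The ASBD excerpt above cuts off before the remaining fields; for linear PCM, every field must be mutually consistent (bytes per frame, frames per packet, channel count), and an inconsistent description is one plausible cause of all-zero buffers. A hedged, fully populated equivalent for 48 kHz interleaved stereo float, written in Swift for illustration:

```swift
import AudioToolbox

// Fully populated ASBD: 48 kHz, 32-bit float, packed, interleaved stereo.
// For LPCM, mFramesPerPacket is always 1, and
// mBytesPerFrame = mChannelsPerFrame * (mBitsPerChannel / 8).
var asbd = AudioStreamBasicDescription(
    mSampleRate:       48_000,
    mFormatID:         kAudioFormatLinearPCM,
    mFormatFlags:      kLinearPCMFormatFlagIsFloat | kLinearPCMFormatFlagIsPacked,
    mBytesPerPacket:   8,     // 1 frame per packet * 8 bytes per frame
    mFramesPerPacket:  1,
    mBytesPerFrame:    8,     // 2 channels * 4 bytes
    mChannelsPerFrame: 2,
    mBitsPerChannel:   32,
    mReserved:         0
)
```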
I'm using a context menu for each photo in a three-column grid. The code looks like this:

List {
    LazyVGrid(columns: columns) {
        View()
            .contextMenu {
                Button()
            }
    }
}

When I long-press an element to open its context menu, the whole list is highlighted and pops up, not the individual element. How can I make it highlight the individual element rather than the whole list? If I remove either the List or the LazyVGrid, the problem goes away, but then the columns aren't laid out the way I want. I also don't want to change List to a VStack or ScrollView, since those don't have a refresh control. Thanks in advance.
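One thing worth trying (a sketch, not a guaranteed fix): give each cell an explicit interaction and preview shape with `contentShape(_:_:)` (iOS 15+) so the context-menu preview is scoped to the cell rather than the enclosing list row. `PhotoCell` and `items` stand in for the poster's own view and model:

```swift
import SwiftUI

struct PhotoGrid: View {
    let columns = [GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible())]
    let items = Array(0..<30)

    var body: some View {
        List {
            LazyVGrid(columns: columns) {
                ForEach(items, id: \.self) { item in
                    PhotoCell(item: item)
                        // Scope both hit-testing and the context-menu
                        // preview to this cell's rounded rectangle.
                        .contentShape([.interaction, .contextMenuPreview],
                                      RoundedRectangle(cornerRadius: 8))
                        .contextMenu {
                            Button("Delete") { /* ... */ }
                        }
                }
            }
            .listRowInsets(EdgeInsets())
        }
        .refreshable { /* pull-to-refresh stays available on List */ }
    }
}
```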
In my app there is a spinning wheel. I am using KeyFrameAnimations. I want to add sound effects to the wheel rotation animation. I tried AVAudioPlayer. But I want the audio speed to change with the speed of wheel rotation. Please help me ASAP.
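AVAudioPlayer can vary its playback speed if `enableRate` is set before playback starts; the rate can then be driven from the animation. A sketch, where `soundURL` and the wheel-speed callback are assumed to come from the app:

```swift
import AVFoundation

// Sketch: variable-speed playback with AVAudioPlayer.
let player = try AVAudioPlayer(contentsOf: soundURL)
player.enableRate = true      // must be set before play()
player.prepareToPlay()
player.play()

// Called from the keyframe animation as the wheel speed changes;
// AVAudioPlayer supports rates from 0.5x to 2.0x, so clamp to that range.
func wheelDidUpdate(speed: Double) {
    player.rate = Float(min(max(speed, 0.5), 2.0))
}
```

If a wider speed range (or pitch that tracks the speed) is needed, AVAudioEngine with an AVAudioUnitVarispeed node is an alternative.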
Hello, we are trying to send compressed audio data over BLE. Our test setup includes a Linux PC (peripheral) and an iPhone acting as the central. We have the characteristics below on the peripheral for audio transfer:

Char1 (Read | Notify)
Char2 (Write)

We have a use case where audio transfer happens both ways simultaneously using these two characteristics: Peripheral -> Central (using Char1) and Central -> Peripheral (using Char2). The Peripheral -> Central transfer seems to be fast enough for audio playback, but with the Central -> Peripheral transfer we see a considerable lag; it takes more than double the time required for the Peripheral -> Central transfer. We also tried sending a connection parameter update request to set the connection interval between 20 and 21 ms, but we still see no increase in transfer speed. Are we missing something here that would enable faster transfer rates when sending data from the iPhone to the PC?
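One thing to check: if Char2 is written with response, each packet waits for an acknowledgment before the next can go out, which roughly matches the "more than double the time" observation. If the characteristic also supports Write Without Response, the central can stream packets back-to-back, respecting back-pressure. A sketch, where `peripheral` and `char2` are the poster's existing objects:

```swift
import CoreBluetooth

// Sketch: stream data Central -> Peripheral using write-without-response.
func sendChunks(_ data: Data, to peripheral: CBPeripheral,
                char2: CBCharacteristic) {
    let mtu = peripheral.maximumWriteValueLength(for: .withoutResponse)
    var offset = 0
    while offset < data.count {
        // Respect back-pressure; resume remaining chunks from
        // peripheralIsReady(toSendWriteWithoutResponse:) when this is false.
        guard peripheral.canSendWriteWithoutResponse else { break }
        let chunk = data.subdata(in: offset ..< min(offset + mtu, data.count))
        peripheral.writeValue(chunk, for: char2, type: .withoutResponse)
        offset += chunk.count
    }
}
```

This assumes the peripheral advertises the Write Without Response property on Char2; the Linux side would need that enabled as well.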
In an iOS app, it's possible to have a backgrounded app continue to play audio if the Info.plist setting 'Required background modes' includes the item 'App plays audio or streams audio/video using AirPlay'. Using this key in tvOS seems to have no effect, and audio stops when I open another application while my AVPlayerViewController is playing audio. Is this a bug in tvOS, or is there a different key?
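Before concluding it's a platform limitation, it may be worth confirming the audio session is configured for playback, since the background-audio entitlement only takes effect with an active `.playback` session. A minimal sketch:

```swift
import AVFoundation

// Sketch: the "audio" background mode only keeps audio alive when the
// app's session uses the .playback category and has been activated.
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback)
    try session.setActive(true)
} catch {
    print("audio session setup failed: \(error)")
}
```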
Whenever I get a notification, it lowers the audio I'm currently listening to. Everywhere I look, the answer is just to mute my phone or put it on Do Not Disturb, but I still want to hear the ringer go off when I get a notification. Is there a way I can stop the audio from lowering without needing to mute the ringer?
UITabBarController
 ├─ VC_Tab1
 │   └─ VC_Tab1_Child
 │       └─ HeaderView
 └─ VC_Tab2
     └─ VC_Tab2_Child
         └─ MyButton

The structure of the view controllers and views in the project is as described above.

self.navigationController?.popToRootViewController(animated: false)
tabBarController.selectedIndex = 1

When popToRootViewController(animated: false) is called in VC_Tab1_Child, followed by setting the tab controller's selectedIndex = 1, the following results are observed:

viewWillAppear(_:), deinit, viewDidAppear(_:)

The originally expected results are as follows:

viewWillDisappear(_:), viewDidDisappear(_:), deinit, deinit, deinit, headerView.backButton.rx.tap -> Event completed, headerView.backButton.rx.tap -> isDisposed, viewWillAppear(_:), viewDidAppear(_:)

The HeaderView belonging to VC_Tab1_Child was not deallocated, and the resources associated with that view were also not released. Similarly, VC_Tab1_Child.viewWillDisappear and VC_Tab1_Child.viewDidDisappear were not called.
Dear all, my app has now been rejected almost 16 times, citing the reason below:

"We discovered one or more bugs in your app when reviewed on iPhone and iPad running iOS 11.2.5 on Wi-Fi connected to an IPv6 network. Specifically, after inputting the demo account credentials in the fields provided and tapping Login, we are presented with a blank alert with 'OK' listed at the bottom."

My base URL is IPv6-enabled. Please, I need urgent help resolving this. My client is really furious. Please help!
import UIKit
import AppTrackingTransparency

func requestDFA() {
    if #available(iOS 14, *) {
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // User authorized tracking
                print("Tracking authorization status: authorized")
            case .denied:
                // User declined tracking
                print("Tracking authorization status: denied")
            case .notDetermined:
                // User has not yet made a choice
                print("Tracking authorization status: not determined")
            case .restricted:
                // Tracking is restricted, e.g. by parental controls
                print("Tracking authorization status: restricted")
            default:
                print("Tracking authorization status: unknown")
            }
        }
    }
}

@main
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Override point for customization after application launch.
        requestDFA()
        return true
    }

    // MARK: UISceneSession Lifecycle
    func application(_ application: UIApplication, configurationForConnecting connectingSceneSession: UISceneSession, options: UIScene.ConnectionOptio…
Has anybody had any luck playing sound on Apple Watch? I have tried since watchOS 2 and never got it to work. The watchOS 3 documentation (https://developer.apple.com/library/prerelease/content/documentation/General/Conceptual/WatchKitProgrammingGuide/AudioandVideo.html#//apple_ref/doc/uid/TP40014969-CH24-SW19) says: "Any audio you play is routed to a paired Bluetooth headset if one is available. If no Bluetooth headset is available, audio is routed to the Apple Watch speaker." Here's my code; instead of hearing the sound, the UI hangs. I have placed the sound asset ShipHit.m4a in both the Watch App and the Watch Extension for testing purposes.

let url = NSURL.fileURL(withPath: Bundle.main().pathForResource("ShipHit", ofType: "m4a")!)
let asset = WKAudioFileAsset(url: url)
let playerItem = WKAudioFilePlayerItem(asset: asset)
soundPlayer = WKAudioFilePlayer(playerItem: playerItem)
...
if soundPlayer.status == .readyToPlay {
    soundPlayer.play()
}
I'm using a few images and audio clips in my playground/playground book that are from a famous TV show. I'm not sure whether using those images/audio constitutes copyright infringement. The Swift Student Challenge guidelines state that any public-domain images and/or audio files can be used with rightful credits and a brief explanation. So is it safe/sufficient if I give the credits and explanation, or should I consider replacing those assets? Thanks in advance.
Topic: Developer Tools & Services
SubTopic: Swift Playground
Tags: Swift Playground, Swift Student Challenge, WWDC Scholarships
Is there a way to fade out my audio file once the button is pressed, and only have the audio fade on the first loop? At the moment I have an action where the button plays audio and changes the page, but the audio comes in too early. I'm trying to find a way for the audio to come in only when the page changes, so if there's an alternative way, that would be awesome 🙂
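If the audio is played with AVAudioPlayer, `setVolume(_:fadeDuration:)` can handle the fade, and starting playback from the destination page's appearance callback avoids the audio coming in early. A sketch, where `player` is an assumed existing AVAudioPlayer looping the file:

```swift
import AVFoundation

// Sketch: fade the current audio to silence when the button is tapped.
func buttonPressed() {
    player.setVolume(0, fadeDuration: 2.0)   // 2-second fade-out
}

// On the destination page (e.g. viewDidAppear / onAppear), start fresh
// so the audio only begins once the page change has happened.
func pageDidAppear() {
    player.currentTime = 0
    player.volume = 1
    player.play()
}
```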