Attaching an audio file to MSMessageTemplateLayout's mediaFileURL property still appears not to work in Beta 7. Has anyone found a workaround for this?
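(For reference, roughly the setup being reported as broken; audioURL is a placeholder for a bundled audio file and the caption is arbitrary.)

import Messages

func composeAudioMessage(in conversation: MSConversation, audioURL: URL) {
    let layout = MSMessageTemplateLayout()
    layout.mediaFileURL = audioURL   // expected to attach the audio, but reportedly ignored
    layout.caption = "Voice note"

    let message = MSMessage()
    message.layout = layout
    conversation.insert(message) { error in
        if let error { print(error) }
    }
}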
Search results for "Popping Sound": 19,600 results found
I am playing back audio on an AVPlayer while recording audio with an AVCaptureSession. How can I synchronize the recording to the audio playback timeline? I am subtracting both inputLatency and outputLatency from AVAudioSession from the timestamp of the captured audio, but the timing still seems to be off. What other latencies do I need to compensate for?
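(For reference, a sketch of the compensation described above. The AVAudioSession properties are real API; treating one I/O buffer of delay on each side as an extra term is a common rule of thumb here rather than something the docs guarantee, and captureSeconds is a hypothetical timestamp taken from the capture callback and converted to seconds.)

import AVFoundation

let session = AVAudioSession.sharedInstance()

// Map a capture-callback timestamp back onto the playback timeline.
func alignedPlaybackTime(captureSeconds: TimeInterval) -> TimeInterval {
    // Output path: audio left the speaker roughly outputLatency
    // plus one I/O buffer after the player rendered it.
    let outputDelay = session.outputLatency + session.ioBufferDuration
    // Input path: audio reaches the capture callback roughly
    // inputLatency plus one I/O buffer after hitting the mic.
    let inputDelay = session.inputLatency + session.ioBufferDuration
    return captureSeconds - inputDelay - outputDelay
}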
Hi all, I would like to know if it's possible to record all audio played by my app, whether played via AudioUnit, AVPlayer, AudioQueue, or some other source (except system sounds and sounds from other apps). The catch is that some of the audio is generated by code in a static library whose source is hard for me to access, so I would like to intercept the mixed-down audio from all sources before it leaves the app. I'm thinking of setting up an AudioUnit and connecting its input bus to the audio coming from the app, but I'm not sure this can work. I tried setting up a remoteIO audio unit and grabbing samples from the output callback's buffers, but it doesn't work that way; it only produces a very annoying noise, which is mixed with the other app audio 🙂 Thanks in advance for all your thoughts!
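(One commonly suggested approach, sketched below under the assumption that all of the app's audio can be routed through a single AVAudioEngine: tap the main mixer node and write the mixed buffers to a file. Audio played by a separate AVPlayer or by the static library's own output path will not appear in the tap unless it is rerouted through the engine first. The file name is a placeholder.)

import AVFoundation

func startMixdownCapture(engine: AVAudioEngine) throws -> AVAudioFile {
    let mixer = engine.mainMixerNode
    let format = mixer.outputFormat(forBus: 0)
    let url = FileManager.default.temporaryDirectory.appendingPathComponent("mixdown.caf")
    let file = try AVAudioFile(forWriting: url, settings: format.settings)
    // The tap sees the mixed output of every node connected to the mixer.
    mixer.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        try? file.write(from: buffer)   // keep work inside the tap minimal
    }
    try engine.start()
    return file
}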
Is it possible for a macOS app to mute the audio input for whatever device is selected under System Preferences -> Sound -> Input? The goal is to be certain that microphone audio is not being sent to any running application. Cheers, Patrick
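(A sketch of one possibility using the CoreAudio HAL: set kAudioDevicePropertyMute on the default input device. This assumes the device actually publishes a mute control, which not every device does; and since the mute is device-wide state, another process could unmute it again, so it is not a hard guarantee that no app receives microphone audio.)

import CoreAudio

func muteDefaultInput(_ mute: Bool) -> Bool {
    // Find the device selected under System Preferences -> Sound -> Input.
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultInputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var deviceID = AudioDeviceID(0)
    var size = UInt32(MemoryLayout<AudioDeviceID>.size)
    guard AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                     &address, 0, nil, &size, &deviceID) == noErr else { return false }
    // Flip the device-wide input mute control, if the device has one.
    var muteAddress = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyMute,
        mScope: kAudioDevicePropertyScopeInput,
        mElement: kAudioObjectPropertyElementMain)
    var value: UInt32 = mute ? 1 : 0
    return AudioObjectSetPropertyData(deviceID, &muteAddress, 0, nil,
                                      UInt32(MemoryLayout<UInt32>.size), &value) == noErr
}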
Hi, I was following along with this documentation (https://developer.apple.com/documentation/soundanalysis/analyzing_audio_to_classify_sounds) trying to classify sounds within my SwiftUI app. Here's what I have:

let noiseDetector = NoiseDetector()
let model: MLModel = noiseDetector.model
let analysisQueue = DispatchQueue(label: "com.apple.AnalysisQueue")
public var noiseType: String = "default"

class ResultsObserver: NSObject, SNResultsObserving, ObservableObject {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let classification = result.classifications.first else { return }
        noiseType = classification.identifier
        let formattedTime = String(format: "%.2f", result.timeRange.start.seconds)
        print("Analysis result for audio at time: \(formattedTime)")
        let confidence = classification.confidence * 100.0
        let percent = String(format: "%.2f%%", confidence)
        print("\(classification.identifier): \(percent) confidence.\n")
    }

    func request(_ request: SNRequest, didFailWithError error: Error) {
        print("Analysis failed: \(error.localizedDescription)")
    }
}
Hi, I'm pretty new to audio unit development, but I've bought a micro synth (Korg NTS-1). I just want to build an audio unit that receives MIDI input from and sends MIDI output to the NTS-1. How can I build an audio unit with MIDI events as input and output? And can an audio unit (instrument) take the audio signal from an external instrument?
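(A minimal sketch of the MIDI plumbing in an AUv3 unit; the class and port names are placeholders and the extension scaffolding is assumed to already exist. Advertising a MIDI output is what makes the host attach a midiOutputEventBlock, which the unit can call from its render cycle to emit MIDI; incoming MIDI arrives as MIDI-typed render events in internalRenderBlock. As for the last question: an audio unit only sees audio the host routes into its input bus, so sound from an external instrument has to enter the host first, e.g. through an audio input device, before an effect-type unit can process it.)

import AudioToolbox

class MIDIPassThroughAU: AUAudioUnit {
    // Declaring at least one MIDI output is what tells the host to
    // provide a midiOutputEventBlock for this unit to call.
    override var midiOutputNames: [String] { ["toNTS1"] }
}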
Hello everyone, I wanted to post this as a sanity check before I file a Feedback for this bug. I'm using Xcode 16.1 Beta (167B5001e) along with iOS 18.1 Beta. When using a NavigationStack with a bindable path and navigating to another view while searching, it won't let you pop to root by resetting the path. Here's some simple code:

// Base code from:
// https://sarunw.com/posts/how-to-pop-to-root-view-in-swiftui/
import SwiftUI

struct ContentView: View {
    @State private var path: [Int] = []
    var body: some View {
        NavigationStack(path: $path) {
            Button("Start") { path.append(1) }
                .navigationDestination(for: Int.self) { int in
                    DetailView(path: $path, count: int)
                }
                .navigationTitle("Home")
        }
    }
}

struct DetailView: View {
    @Binding var path: [Int]
    @State private var searchText = ""
    let count: Int
    var body: some View {
        Button("Go deeper") { path.append(count + 1) }
            .navigationBarTitle(count.description)
            .toolbar {
                ToolbarItem(placement: .bottomBar) {
                    Button("Pop to Root") { path = [] }
                }
            }
            .searchable(text: $searchText)
    }
}
Hi, when I turn on my mic, my speaker volume drops and I cannot hear any game sounds. This has happened since I updated to iOS 14. Can anyone help me solve this issue?
I need to associate sound with the movement of a sprite. Movement can be the result of physics, not of an SKAction. When the object is sliding, there should be a sliding sound for as long as it slides, then a different sound when it bumps into a rock and goes up in the air. While the object is airborne there is no sound, until it falls again (a falling sound) and then slides down with a sliding sound. The sounds associated with collisions (rock, ground, and so on) are straightforward and work fine, but I am having difficulty associating sound with movement. The closest I have come is to check the velocity of the sprite's physics body every update cycle and play or stop the sound based on whether the velocity is greater than zero. I tried SKAction.playSoundFileNamed first, but the sound kept going even when the object was not moving. I tried adding an SKAudioNode with play and stop, with no luck.
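(A sketch of the velocity-gated approach described above, with the airborne case handled by checking for contacts; the file name and speed threshold are placeholders.)

import SpriteKit

class SlidingSprite: SKSpriteNode {
    private let slideSound = SKAudioNode(fileNamed: "slide.caf")
    private var slidePlaying = false

    func attachSlideSound() {
        slideSound.autoplayLooped = true
        addChild(slideSound)               // positional audio follows the sprite
        slideSound.run(SKAction.pause())   // start silent
    }

    // Call once per frame from the scene's update(_:).
    func updateSlideSound() {
        guard let body = physicsBody else { return }
        let speed = hypot(body.velocity.dx, body.velocity.dy)
        // Sliding means moving fast enough while touching something (not airborne).
        let shouldPlay = speed > 20 && !body.allContactedBodies().isEmpty
        if shouldPlay != slidePlaying {
            slideSound.run(shouldPlay ? SKAction.play() : SKAction.pause())
            slidePlaying = shouldPlay
        }
    }
}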
When setting a custom sound in AlarmKit, the alarm only plays the audio file once. I can understand why push notifications would play a sound only once, but I don’t understand why alarms can only play the sound for less than 30 seconds. We’re already at beta 6, so I’m wondering if Apple still hasn’t fixed this or if they have no intention of fixing it.
When VoiceOver is speaking, audio ducking kicks in and reduces my app's volume too. Is there any way to handle this ducking? When VoiceOver finishes speaking, the app volume doesn't come back. In my app I created my own audio device daemon for enhancing audio. Is there any solution for this?
Hi, I am working on a video application for tvOS. Our streams include multiple options for both subtitles and audio. The subtitles just work: when I change the subtitle option, the subtitle language in the system Settings app changes too. The same is not true for audio, however. When I choose an audio option for a video in my app, the audio setting in Settings.app remains unchanged. Is this known/expected behavior, or do I need to configure my app to work with audio options? Thanks, Halen
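(For comparison, a sketch of how an in-app audio selection is usually made; playerItem is assumed to be an AVPlayerItem whose asset has loaded. If the app already selects options this way and Settings still does not update, the asymmetry with subtitles may be worth a bug report.)

import AVFoundation

func selectAudioTrack(language: String, for playerItem: AVPlayerItem) {
    guard let group = playerItem.asset.mediaSelectionGroup(forMediaCharacteristic: .audible) else { return }
    // Filter the group's options by locale and apply the first match.
    let options = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        with: Locale(identifier: language))
    if let option = options.first {
        playerItem.select(option, in: group)
    }
}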
If you take a look at the differences document for watchOS 3, Apple has posted that Core Audio is now available in watchOS 3: "The Core Audio framework (CoreAudio.framework) provides data types that help you represent audio streams, complex buffers, and time values." So can we use this to capture sample buffers while it is recording audio, and if so, is there any sample code on how to do this? Thanks!
Hi, how can I fade out the background music (an HTML5 audio element) of my browser game under iOS? I can neither set audio.volume nor pipe it through the Web Audio API and modify the gain value. Both solutions work great on basically every non-Apple OS and browser, but they fail in Safari on macOS and, in my experience, in any browser under iOS. This is quite frustrating. So: how can I fade out music in my browser game under iOS? Thanks, Leander
I want to import sound into Reality Composer. How should I prepare and convert audio files on my device before importing a new sound into my composition? Files usually come in many formats, such as .mp3 or .wav, and in different qualities and lengths.