Search results for "Popping Sound"
19,747 results found

Analyzing Audio to Classify Sounds With Core ML
Hi, I was following along with this documentation (https://developer.apple.com/documentation/soundanalysis/analyzing_audio_to_classify_sounds) trying to classify sounds within my SwiftUI app. Here's what I have:

let noiseDetector = NoiseDetector()
let model: MLModel = noiseDetector.model
let analysisQueue = DispatchQueue(label: "com.apple.AnalysisQueue")
public var noiseType: String = "default"

class ResultsObserver: NSObject, SNResultsObserving, ObservableObject {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let classification = result.classifications.first else { return }
        noiseType = classification.identifier
        let formattedTime = String(format: "%.2f", result.timeRange.start.seconds)
        print("Analysis result for audio at time: \(formattedTime)")
        let confidence = classification.confidence * 100.0
        let percent = String(format: "%.2f%%", confidence)
        print("\(classification.identifier): \(percent) confidence.\n")
    }
    func request(_ request: SNRe
0 replies · 0 boosts · 738 views · Jun ’20
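The excerpt above ("Analyzing Audio to Classify Sounds With Core ML") cuts off before the observer is wired up to an analyzer. Below is a minimal sketch of how the remaining SoundAnalysis pieces typically fit together, reusing the post's NoiseDetector, ResultsObserver, and analysisQueue, and assuming microphone input through an AVAudioEngine tap (an assumption, not something stated in the post):

import AVFoundation
import SoundAnalysis

// NoiseDetector, ResultsObserver, and analysisQueue come from the post above.
let engine = AVAudioEngine()
let inputFormat = engine.inputNode.outputFormat(forBus: 0)
let analyzer = SNAudioStreamAnalyzer(format: inputFormat)
let observer = ResultsObserver()

do {
    // Wrap the Core ML model in a classification request and register the observer.
    let request = try SNClassifySoundRequest(mlModel: NoiseDetector().model)
    try analyzer.add(request, withObserver: observer)
} catch {
    print("Unable to prepare sound classification: \(error.localizedDescription)")
}

// Feed captured buffers to the analyzer off the main thread.
engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: inputFormat) { buffer, time in
    analysisQueue.async {
        analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
    }
}
try? engine.start()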
synchronizing audio recording to playback
I am playing back audio on an AVPlayer while recording audio with AVCaptureSession. How can I synchronize the recording to the audio playback timeline? I am subtracting both inputLatency and outputLatency from AVAudioSession from the timestamp of the captured audio, but the timing still seems to be off. What other latencies do I need to compensate for?
2 replies · 0 boosts · 975 views · Jun ’20
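A rough sketch of the latency bookkeeping described in "synchronizing audio recording to playback" above, assuming an active AVAudioSession. Including ioBufferDuration is an assumption about one further offset worth checking, not an answer confirmed in the thread:

import AVFoundation

// Hypothetical helper: shift a capture timestamp (in seconds) back onto the
// playback timeline using the session's reported latencies.
func playbackAlignedTime(forCaptureTime captureTime: TimeInterval) -> TimeInterval {
    let session = AVAudioSession.sharedInstance()
    let inputLatency = session.inputLatency       // microphone capture latency
    let outputLatency = session.outputLatency     // render/speaker latency
    let bufferDuration = session.ioBufferDuration // per-callback I/O buffer length
    return captureTime - inputLatency - outputLatency - bufferDuration
}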
Mute Audio Input on macOS
Is it possible for a macOS app to mute the audio input for whatever is selected under System Preferences -> Sound -> Input? The goal is to be certain that microphone audio is not being sent to any running applications. Cheers, Patrick
0 replies · 0 boosts · 1.4k views · Apr ’20
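One possible direction for "Mute Audio Input on macOS" above, sketched with the Core Audio HAL: look up the current default input device and set its mute property. This assumes the selected input device actually exposes kAudioDevicePropertyMute (not all devices do); it is a sketch, not a confirmed answer:

import CoreAudio

// Hypothetical sketch: mute or unmute the system's current default input device.
func setDefaultInputMuted(_ muted: Bool) -> OSStatus {
    // Find the default input device.
    var deviceID = AudioDeviceID(0)
    var size = UInt32(MemoryLayout<AudioDeviceID>.size)
    var defaultInputAddress = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultInputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var status = AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                            &defaultInputAddress, 0, nil, &size, &deviceID)
    guard status == noErr else { return status }

    // Set the mute property on the device's input scope, if supported.
    var muteAddress = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyMute,
        mScope: kAudioObjectPropertyScopeInput,
        mElement: kAudioObjectPropertyElementMain)
    var muteValue: UInt32 = muted ? 1 : 0
    status = AudioObjectSetPropertyData(deviceID, &muteAddress, 0, nil,
                                        UInt32(MemoryLayout<UInt32>.size), &muteValue)
    return status
}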
How to record audio played by the app
Hi all, I would like to know if it's possible to record all audio played by my app, whether it is played via AudioUnit, AVPlayer, AudioQueue, or some other source (except system sounds / sounds from other apps). The thing is that some of the audio played by the app is generated by code that resides in a static library, the source of which is hard for me to access. Thus I would like to achieve this by intercepting the mixed-down audio from all sources before it leaves the app. I'm thinking of setting up an AudioUnit and connecting its input bus to the audio coming from the app, but I am not sure whether this can work. I tried setting up a RemoteIO audio unit and getting the samples from the output callback's sample buffers, but it does not work that way; it only makes a very annoying noise, which is mixed with the other app audio 🙂 Thanks in advance for all your thoughts!
0 replies · 0 boosts · 1.1k views · Mar ’19
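A commonly suggested alternative to the RemoteIO attempt in "How to record audio played by the app" above: if the app's audio can be routed through a single AVAudioEngine, a tap on the main mixer captures the mixed-down signal before it reaches the hardware. This is a sketch under that assumption; audio that bypasses the engine (for example a separate AVPlayer) would not be captured, which is exactly the limitation the poster is worried about:

import AVFoundation

// Hypothetical sketch: tap the engine's main mixer and write everything this
// engine mixes to a temporary CAF file.
let engine = AVAudioEngine()
// ... attach and connect the app's player/generator nodes to engine.mainMixerNode ...

let captureURL = FileManager.default.temporaryDirectory.appendingPathComponent("appMix.caf")
let mixFormat = engine.mainMixerNode.outputFormat(forBus: 0)

do {
    let captureFile = try AVAudioFile(forWriting: captureURL, settings: mixFormat.settings)
    // The tap sees the mixed-down signal before it reaches the output node.
    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: mixFormat) { buffer, _ in
        try? captureFile.write(from: buffer)
    }
    try engine.start()
} catch {
    print("Capture setup failed: \(error)")
}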
iOS 18 Bug: NavigationStack, paths, searchable and popping to root
Hello everyone, I wanted to post this as a sanity check before I create a Feedback for this bug. I'm using Xcode 16.1 Beta (167B5001e) along with iOS 18.1 Beta. When using a NavigationStack with a bindable path and navigating to another view while searching, it won't allow you to pop to root by resetting the path. Here's some simple code:

// Base code from:
// https://sarunw.com/posts/how-to-pop-to-root-view-in-swiftui/
import SwiftUI

struct ContentView: View {
    @State private var path: [Int] = []

    var body: some View {
        NavigationStack(path: $path) {
            Button("Start") { path.append(1) }
                .navigationDestination(for: Int.self) { int in
                    DetailView(path: $path, count: int)
                }
                .navigationTitle("Home")
        }
    }
}

struct DetailView: View {
    @Binding var path: [Int]
    @State private var searchText = ""
    let count: Int

    var body: some View {
        Button("Go deeper") { path.append(count + 1) }
            .navigationBarTitle(count.description)
            .toolbar {
                ToolbarItem(placement: .bottomBar) {
                    Button("Pop to Root") { path = [] }
                }
            }
            .sea
2 replies · 0 boosts · 1.1k views · Aug ’24
SpriteKit - Associating sound with motion
I need to associate sound with the movement of a sprite. Movement can be a result of physics, not a result of an SKAction. When the object is sliding there should be a sliding sound for the whole time it is sliding, and then a different sound when it bumps into a rock and goes up in the air. When the object is airborne, there is no sound, until it falls again - a falling sound - and then slides down with a sliding sound. The sounds associated with the collision (rock, ground, and so on) are straightforward and work fine. But I am having difficulty associating the sound with movement. The closest result I have is to check the velocity of the sprite's physics body every update cycle and play or stop the sound based on whether the velocity is greater than zero. I tried SKAction.playSoundFileNamed first - the sound kept going even when the object was not moving. I tried adding an SKAudioNode with Play and Stop, with no
0 replies · 0 boosts · 736 views · Nov ’23
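A minimal sketch of the per-update velocity check described in "SpriteKit - Associating sound with motion" above, toggling an SKAudioNode from update(_:). The file name and the speed threshold are placeholder assumptions, not details from the thread:

import SpriteKit

class GameScene: SKScene {
    var sprite: SKSpriteNode!   // configured with a physics body elsewhere in scene setup
    // Hypothetical looping slide sound attached to the moving sprite.
    let slideSound = SKAudioNode(fileNamed: "slide.caf")
    private var isSliding = false

    override func didMove(to view: SKView) {
        slideSound.autoplayLooped = true
        sprite.addChild(slideSound)
        slideSound.run(SKAction.stop())   // start silent
    }

    override func update(_ currentTime: TimeInterval) {
        guard let body = sprite.physicsBody else { return }
        // Treat the sprite as sliding when it moves faster than a small threshold.
        let speed = hypot(body.velocity.dx, body.velocity.dy)
        let sliding = speed > 5.0
        if sliding != isSliding {
            isSliding = sliding
            slideSound.run(sliding ? SKAction.play() : SKAction.stop())
        }
    }
}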
Alarmkit custom sound plays once
When setting a custom sound in AlarmKit, the alarm only plays the audio file once. I can understand why push notifications would play a sound only once, but I don’t understand why alarms can only play the sound for less than 30 seconds. We’re already at beta 6, so I’m wondering if Apple still hasn’t fixed this or if they have no intention of fixing it.
5 replies · 0 boosts · 663 views · Aug ’25
Notification Sound with iOS 17
Why did Apple change the sound for Notifications in iOS 17??? I just finally upgraded today from iOS 16.7.2 to iOS 17.1.1 and among other things I don't like, this one makes no sense whatsoever! Why can't users select the sound we want for Notifications? At least when I had the Tri-Tone sound I could hear it. The new sound is way too quiet, causing me to miss notifications now. PLEASE FIX THIS ASAP!
1 reply · 0 boosts · 679 views · Nov ’23
Custom Keyboard Audio Recording
I am writing a custom keyboard. I need to record audio (which is then sent to our server for processing). I gave the keyboard full access, but it seems audio recording is not working. Is it possible to access the audio device for recording in a custom keyboard? Do I need some other permissions for this? Thanks
Topic: UI Frameworks · SubTopic: AppKit
1 reply · 0 boosts · 1.2k views · Oct ’15
Persisting Audio Options in AVPlayerViewController
Hi, I am working on a video application for tvOS. Our streams include multiple options for both subtitles and audio. The subtitles just work: when I change a subtitle option, the subtitle language in the system Settings app also changes. However, the same is not true for audio options. When I choose an audio option for a video in my app, the audio option setting in Settings.app remains unchanged. Is this known/expected behavior, or do I need to configure my app to work with audio options? Thanks, Halen
0 replies · 0 boosts · 401 views · May ’17
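For reference on "Persisting Audio Options in AVPlayerViewController" above, this is roughly how an audio option is selected programmatically on an AVPlayerItem; whether that selection is then reflected in the system Settings app is exactly the open question in the post. A sketch, assuming a loaded item and a BCP-47 language identifier:

import AVFoundation

// Hypothetical helper: select an audio track on a player item by language code.
func selectAudioOption(languageCode: String, for item: AVPlayerItem) {
    guard let group = item.asset.mediaSelectionGroup(forMediaCharacteristic: .audible) else { return }
    let matches = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        with: [Locale(identifier: languageCode)])
    if let option = matches.first {
        item.select(option, in: group)
    }
}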
Core Audio in watchOS 3
If you take a look at the differences document for watchOS 3, Apple has posted that Core Audio is now available in watchOS 3: "The Core Audio framework (CoreAudio.framework) provides data types that help you represent audio streams, complex buffers, and time values." So can we use this to capture sample buffers while recording audio, and if so, is there any sample code on how to do this? Thanks!
0 replies · 0 boosts · 486 views · Sep ’16