Search results for "Popping Sound": 19,350 results found


Reply to Bouncy ball in RealityKit - game
Thank you! The scale workaround sounds interesting, but I'm having difficulty determining the exact implementation approach within my repository structure. I would greatly appreciate your guidance on implementing it in the code I shared (https://github.com/michaello/BouncyBall-visionOS/). Specifically, I'm uncertain about:
- Where in my current entity hierarchy I should introduce the container entity for scaling
- Whether all physics entities (ball, ground, walls) need to be scaled equally, or if only the ball requires this treatment
- The recommended scale factor to use (10x, 100x?) to prevent premature sleeping
- How to properly adjust the initial velocity and gravity values to maintain the same visual behavior at the larger simulation scale
If you could point me toward specific code changes or provide a small example of the container/scaling setup, it would help a lot! I can't find any sample code on the internet for this, so it's difficult to test your suggestion.
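For illustration, a minimal sketch of the container-scaling workaround being asked about, assuming a simple ball-and-ground scene; the entity names and the 10x factor are placeholders, not taken from the linked repository:

import RealityKit

// Hypothetical sketch: author the whole physics scene 10x larger so the
// solver's sleep thresholds are not hit, then shrink the container so the
// scene looks the same. All physics entities live inside the container,
// so they are all scaled together.
func makeBouncyScene() -> Entity {
    let simulationScale: Float = 10          // assumed factor; tune as needed

    // Container that holds every physics entity; visually scaled back down.
    let container = Entity()
    container.scale = .init(repeating: 1 / simulationScale)

    // Ball: dimensions and velocities are authored at the larger simulation
    // scale (a 0.05 m radius becomes 0.5 m, and so on). If the scene also
    // overrides gravity, the same factor would apply to it.
    let ballRadius: Float = 0.05 * simulationScale
    let ball = ModelEntity(
        mesh: .generateSphere(radius: ballRadius),
        materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    ball.components.set(CollisionComponent(shapes: [.generateSphere(radius: ballRadius)]))
    ball.components.set(PhysicsBodyComponent(
        massProperties: .default,
        material: .generate(friction: 0.4, restitution: 0.9),
        mode: .dynamic
    ))
    // The initial velocity is multiplied by the same factor so the visual
    // motion is unchanged after the container is scaled back down.
    ball.components.set(PhysicsMotionComponent(
        linearVelocity: [0, 2 * simulationScale, 0]
    ))
    ball.position = [0, 1 * simulationScale, 0]

    // Ground: static body, scaled the same way so contacts stay consistent.
    let ground = ModelEntity(
        mesh: .generatePlane(width: 2 * simulationScale, depth: 2 * simulationScale),
        materials: [SimpleMaterial(color: .gray, isMetallic: false)]
    )
    ground.components.set(CollisionComponent(shapes: [
        .generateBox(size: [2 * simulationScale, 0.01, 2 * simulationScale])
    ]))
    ground.components.set(PhysicsBodyComponent(mode: .static))

    container.addChild(ball)
    container.addChild(ground)
    return container
}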
Topic: Spatial Computing, SubTopic: General
Jul ’25
Reply to WebSocket connection in background triggered by BLE accessory
To reiterate, I can’t comment on App Store policy. Speaking technically, the audio background mode still has limitations. The one most relevant to your setup is that you have to start your audio session while your app is in the foreground. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
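As a small illustration of that constraint (not part of the reply), a sketch that configures and activates the session up front, assuming a plain AVAudioSession-based setup; the category shown is a placeholder and depends on the app's actual audio use:

import AVFoundation

// Illustrative only: activate the audio session while the app is in the
// foreground (for example, when the scene becomes active), not on demand
// from a background BLE event.
func activateAudioSessionWhileForegrounded() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
        try session.setActive(true)
    } catch {
        print("Audio session activation failed: \(error)")
    }
}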
Jul ’25
WebSocket connection in background triggered by BLE accessory
Hello everyone, We are building an iOS app using React Native that connects to a custom Bluetooth Low Energy (BLE) accessory. The accessory continuously sends small chunks of audio data to the app over BLE (basically every time the user speaks), which are then streamed in real time to our server via WebSocket for transcription and processing. We need to know whether the following behavior is allowed by the iOS runtime and App Store review policies:
- Can the app open a WebSocket connection in the background (not permanently, just briefly, several times a day) triggered by BLE activity from a registered accessory? Is there a limit to this?
Clarifications:
- The app is not expected to remain permanently awake, only during accessory-triggered events.
- WebSocket is required due to the real-time nature of streaming STT and delivering quick responses (via notifications).
If allowed, are there any specific Info.plist declarations or entitlements we must include? Thanks in advance! Fran
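For context, a minimal sketch of the usual CoreBluetooth background pattern such a setup relies on (the bluetooth-central entry in UIBackgroundModes plus state restoration); the class name and restore identifier are hypothetical:

import CoreBluetooth

// Sketch only. Assumes Info.plist contains "bluetooth-central" in
// UIBackgroundModes; the restore identifier below is a placeholder.
final class AccessoryCentral: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(
            delegate: self,
            queue: nil,
            options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.accessory-central"]
        )
    }

    // Called when the system relaunches the app in the background to deliver
    // pending Bluetooth events for the registered accessory.
    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        // Re-attach delegates to any restored peripherals here.
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Start scanning / reconnecting once the state is .poweredOn.
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        // Audio chunks arrive here while BLE data is being received; this is
        // the window in which a short-lived upload (for example over a
        // URLSessionWebSocketTask) can run.
    }
}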
5 replies · 0 boosts · 199 views
Jul ’25
Reply to WebSocket connection in background triggered by BLE accessory
And just to clarify, the app only needs to be active in the background while receiving data over BLE. My use case involves batch-processing audio streamed from a BLE accessory and then either sending it to the server in real time via WebSocket, or running a lightweight STT model locally on the phone and sending the resulting text to the server via HTTP. In either scenario, the background activity only happens while BLE data is actively being received, and ceases immediately after.
Jul ’25
Reply to Complex view structures are frustratingly too much work
This sounds like you've just not designed your code properly. Yes, @Binding is used to represent a @State var owned by a different component; that is how they are designed to work. They are very useful, and I can see and understand why they exist. As I suggested in your other similar post, just raise the bugs as you find them. If you raise them, they can get fixed.
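A minimal example of that parent/child relationship, with hypothetical view names:

import SwiftUI

// Parent owns the source of truth.
struct CounterParent: View {
    @State private var count = 0

    var body: some View {
        VStack {
            Text("Count: \(count)")
            // The child gets read/write access to the same value via a binding.
            CounterControls(count: $count)
        }
    }
}

// Child mutates the parent's state through @Binding.
struct CounterControls: View {
    @Binding var count: Int

    var body: some View {
        Button("Increment") { count += 1 }
    }
}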
Topic: UI Frameworks, SubTopic: SwiftUI
Jul ’25
How can third-party iOS apps obtain real-time waveform / spectrogram data for Apple Music tracks (similar to djay & other DJ apps)?
Hi everyone, I’m working on an iOS MusicKit app that overlays a metronome on top of Apple Music playback, using ApplicationMusicPlayer. To line the clicks up perfectly I’d like access to low-level audio analysis data—ideally a waveform / spectrogram or beat grid—while the track is playing. I’ve noticed that several approved DJ apps (e.g. djay, Serato, rekordbox) can already:
• Display detailed scrolling waveforms of Apple Music songs
• Scratch, loop or time-stretch those tracks in real time
That implies they receive decoded PCM frames, or at least high-resolution analysis data, from Apple Music under a special entitlement. My questions:
1. Does MusicKit (or any public framework) expose real-time audio buffers, FFT bins, or beat markers for streaming Apple Music content?
2. If not, is there an Apple program or entitlement that developers can apply for—similar to the “DJ with Apple Music” initiative—to gain that deeper access?
3. Where can I find official documentation or a point of contact for this kind of access?
1 reply · 0 boosts · 168 views
Jul ’25
Reply to WebSocket connection in background triggered by BLE accessory
Hi Quinn, Thank you very much for your detailed and thoughtful response. It was incredibly helpful. Just to clarify and expand on our specific use case: we are developing a voice-based wearable (a pendant) that continuously listens for speech through its onboard microphone. The device sends audio in real time via BLE to our mobile app (React Native), which then streams the audio to our backend using a WebSocket connection. The goal is to provide immediate STT transcription and return a response via push notification, all within a few seconds. To be clear:
- The mobile app receives real audio data from a BLE-connected accessory.
- We do not expect the app to remain awake indefinitely. It only becomes active when the device is actively sending audio (a few seconds here, 30 seconds there, depending on how long the user speaks).
- This is not synthetic audio or a hack to stay alive. It’s a legitimate voice input stream triggered by a BLE accessory and transmitted in real time.
Jul ’25
Processing / tapping an HLS audio stream (or global app output)
I'm trying to do some realtime audio processing on audio served from an HLS stream (i.e. an AVPlayer created using an M3U HTTP URL). It doesn't seem like attaching an AVAudioMix configured with an `audioTapProcessor` has any effect; none of the callbacks except `init` are being invoked. Is this a known limitation? If so, is this documented somewhere? If the above is a limitation, what are my options using some of the other audio APIs? I looked into `AVAudioEngine` as well, but it doesn't seem like there's any way I can configure any of the input node types to use an HLS stream. Am I wrong? Are there lower-level APIs available to play HLS streams that provide the necessary hooks? Alternatively, is there some generic way to tap into all audio being output by my app regardless of its source? Thanks a lot!
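For reference, this is the usual tap setup for file-based assets, which is presumably what was attempted; the helper names are illustrative. With an HLS asset the audio tracks are generally not exposed to the client, so there is nothing to attach the tap to, which is consistent with the callbacks never firing:

import AVFoundation
import MediaToolbox

// Sketch of the standard MTAudioProcessingTap setup for a file-based asset.
func makeTap() -> MTAudioProcessingTap? {
    var callbacks = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: nil,
        init: { _, clientInfo, tapStorageOut in tapStorageOut.pointee = clientInfo },
        finalize: { _ in },
        prepare: { _, _, _ in },
        unprepare: { _ in },
        process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
            // Pull the source audio; real-time DSP would happen on these buffers.
            MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                               flagsOut, nil, numberFramesOut)
        }
    )
    var tap: Unmanaged<MTAudioProcessingTap>?
    let status = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                            kMTAudioProcessingTapCreationFlag_PostEffects, &tap)
    return status == noErr ? tap?.takeRetainedValue() : nil
}

func attachTap(to item: AVPlayerItem) async throws {
    // For an HLS stream this array is typically empty, so the tap is never attached.
    guard let audioTrack = try await item.asset.loadTracks(withMediaType: .audio).first else { return }
    let inputParams = AVMutableAudioMixInputParameters(track: audioTrack)
    inputParams.audioTapProcessor = makeTap()
    let mix = AVMutableAudioMix()
    mix.inputParameters = [inputParams]
    item.audioMix = mix
}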
10 replies · 0 boosts · 4.9k views
Sep ’23
StateObject is not deinitialized when using a List(selection:) binding
Hello, I have a simple example using StateObject and List. When I bind the List(selection:) to a property of the StateObject like this: List(selection: $viewModel.selectedIndex) { ... } I noticed that each time I push the view using a NavigationLink, a new instance of the StateObject is created. However, when I pop the view, the deinit of the StateObject is not called. When is deinit actually expected to be called in this case? Example code:

import SwiftUI

@main
struct NavigationViewDeinitSampleApp: App {
    var body: some Scene {
        WindowGroup {
            NavigationStack {
                ContentView()
            }
        }
    }
}

struct Item: Hashable {
    let text: String
}

@MainActor
fileprivate class ContentViewModel: ObservableObject {
    @Published var selectedIndex: Int? = nil

    init() {
        NSLog("ContentViewModel.init")
    }

    deinit {
        NSLog("ContentViewModel.deinit")
    }
}

struct ContentView: View {
    @StateObject private var model = ContentViewModel()

    let items: [Item] = {
        return (0...10).map { i in Item(text: "\(i)") }
    }()

    var body: some View {
        List(selection: $mode
1 reply · 0 boosts · 105 views
Jul ’25
TestFlight installations repeatedly failing on macOS
I'm repeatedly hitting an issue when deploying Xcode Cloud builds to macOS from TestFlight. Once the build appears in TestFlight I hit the Install or Update button and, after a couple of seconds of spinning wheel, the button goes back to its original state and the app fails to install. There's no error pop-up, but I've noticed an Error Domain=ASDErrorDomain Code=710 Invalid hash '***' expected 'yyy' error in the console each time it happens. My project needs to deploy 2 different macOS/Catalyst apps (actually they are 2 different targets in the same project) and it seems completely random as to which will actually successfully install on which machine. For my last build, one of the 2 binaries was failing to install on a Mac Studio on 15.5, but the other was fine. All were fine on 3 other machines I tried. For my latest build, both binaries are fine on the Mac Studio but both now fail to install on an M2 Air on macOS 26 beta 2. I'm now extremely nervous about deploying to my TestFlight beta group
6 replies · 0 boosts · 375 views
Jul ’25
“iOS Team Provisioning Profile” Missing UIBackgroundModes Entitlement
I’m trying to enable Background Modes (specifically audio, background fetch, and remote notifications) in my iOS SwiftUI app, but I’m getting this error: Provisioning profile “iOS Team Provisioning Profile: [my app]” doesn’t include the UIBackgroundModes entitlement. On the developer website, when I create the provisioning profile, it doesn’t give me the option to allow background modes. I added the capability in the Signing & Capabilities section in Xcode and matched the bundle ID to the provisioning profile and certificate, etc., but it still produces this error because the provisioning profile doesn’t have the entitlement.
3 replies · 0 boosts · 226 views
Jul ’25
CheckError.swift:CheckError(_:):211:kAudioUnitErr_InvalidParameter (CheckError.swift:CheckError(_:):211)
I'm getting this error when I launch my application on the iPhone 14 Pro via Xcode. Everything builds OK. I'm using the AudioKit plugin and SoundpipeAudioKit. The error starts as soon as I start the app and repeats continuously. I have background processing turned on, as I'd like the sounds to keep playing through headphones when the phone is locked. I can't find anything online about this error. None of my catches are printing anything in the logs either, so I don't know if this is just something that pops up repeatedly or whether there is something fundamentally wrong.

private func setupAudioSession() {
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
        try session.setActive(true, options: .notifyOthersOnDeactivation)
    } catch {
        errorMessage = "Failed to set up audio session: \(error.localizedDescription)"
        print(errorMessage ?? "")
    }
}

// MARK: - Background Task Handling
private func setupBackgroundTaskH
2 replies · 0 boosts · 73 views
Apr ’25