Search results for "Popping Sound": 19,739 results found

Reply to How to check if a sandboxed app already has the access permission to a URL
You can use the isReadable value from URLResourceValues. However, it sounds like you're not approaching this from the right direction. You should never need to do this in the first place. Your app shouldn't attempt to read random files. It should only attempt to access files that the user has specifically requested. And even then, you shouldn't check readability. That's not reliable. Instead, just try to do what the user asked and report an error if it fails.
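A minimal sketch of that try-then-report approach, assuming a URL the user explicitly picked (the function name and error handling are illustrative, not from this reply):

import Foundation

// Hypothetical helper: attempt the read directly and surface any failure,
// rather than pre-checking isReadable.
func openUserSelectedFile(at url: URL) {
    do {
        let data = try Data(contentsOf: url)
        print("Read \(data.count) bytes from \(url.lastPathComponent)")
    } catch {
        // Report the error to the user; this is more reliable than probing
        // readability up front, which can race with the file system.
        print("Couldn't open \(url.lastPathComponent): \(error.localizedDescription)")
    }
}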
Topic: UI Frameworks · SubTopic: AppKit
Nov ’25
Reply to GenerationError -1 / 1026
This definitely sounds like a bug. Can you please file a Feedback Assistant report against the Foundation Models framework so we can take a look at your Mac's and Xcode's system state? In the meantime, here are a few tactics that might help reset whatever's gone wrong:

1. Close Xcode.
2. Restart your Mac.
3. Open Xcode.

If that doesn't work:

1. Go to Settings -> Apple Intelligence & Siri and turn Apple Intelligence off.
2. Restart your Mac.
3. Turn Apple Intelligence back on.
Nov ’25
Bar button item showing wrong overlay after pop transition on iOS 26.1
I am experiencing a frustrating bug on iOS 26.1 that makes my app look as if it lacks attention to detail. With a NavigationStack whose root view has a top-right confirmation bar button item, and a pushed detail view that has no button in the navigation bar's trailing position, popping back to the root view causes the confirmation bar button item to show a white overlay for about 1-2 seconds before it finally disappears and the correct appearance is restored. Here is the incorrect appearance right after the pop transition: Eventually, after about 1.5 seconds, it returns to what it should look like: Here is the full code that you can use to reliably reproduce this issue on iOS 26.1:

@State private var path: [Int] = []

var body: some View {
    NavigationStack(path: $path) {
        VStack {
            Text("First View")
                .font(.title)
        }
        .navigationDestination(for: Int.self, destination: { param in
            Text("Detail View")
                .font(.title)
        })
        .navigationTitle("First")
        .navigationBarTitleDisplayMode(.inline)
        .toolba…
Replies: 0 · Boosts: 0 · Views: 81
Nov ’25
CarPlay: AVSpeechUtterance not speaking/playing audio in some cars
I am having an issue with the code that I posted below. I capture voice in my CarPlay app, then allow the user to have it read back to them using AVSpeechUtterance. This works fine in some cars, but many of my beta testers report no audio being played. I have also experienced this in a rental car, where the audio was either too quiet or didn't play at all. Does anyone see any issue with the code I posted? This is for CarPlay specifically.

class CarPlayTextToSpeechService: NSObject, ObservableObject, AVSpeechSynthesizerDelegate {
    private var speechSynthesizer = AVSpeechSynthesizer()
    static let shared = CarPlayTextToSpeechService()

    /// Completion callback
    private var completionCallback: (() -> Void)?

    override init() {
        super.init()
        speechSynthesizer.delegate = self
    }

    func configureAudioSession() {
        do {
            try AVAudioSession.sharedInstance().setCategory(
                .playback,
                mode: .voicePrompt,
                options: [.duckOthers, .interruptSpokenAudioAndMixWithOthers, .allowBluetoothHFP]
            )
        } catch {
            prin…
Replies: 2 · Boosts: 0 · Views: 145
Nov ’25
Reply to NEAppPushProvider lifecycle guarantees for safety-critical local networking
"During testing, we noticed that if the user backgrounds the app (still connected to the device's Wi‑Fi) and opens Safari, the extension's stop is invoked with NEProviderStopReason.unrecoverableNetworkChange / noNetworkAvailable, and iOS tears the extension down." What's the specific testing scenario here? More specifically, is this real-world testing with a disconnected device, or is this a development device that's attached to a Mac, particularly with any kind of "active" interaction between the device being tested and the Mac? The one scenario I can think of that might explain this is that Safari has its own network-based debugging infrastructure, and it's (theoretically) possible that might be disrupting the app's normal networking behavior. Related to that point, are you SURE the device is on the same Wi‑Fi network? I can't think of any scenario where NEAppPushProvider would stop working while you remained on the same network, but I can see the device changing networks without that being obvious. What I'd a…
Nov ’25
Correct way for an Audio Unit v3 to return fewer than the requested number of samples given a buffer
I have an AUv3 plugin which uses an FFT, which requires n samples before it can produce any output. Depending on the relation between the host's buffer size and the FFT window size, it may receive several buffers of samples while producing no output, and then dump out what it has once a sufficient number of samples have been received. This means that output is produced in fits and starts, in batches that match the FFT size (modulo oversampling). For example, if being fed buffers of 256 samples with an FFT size of 1024, the output buffer sizes will be 0 for the first 3 buffers; upon the fourth, the first 256 processed samples are returned and the remaining 768 cached; the next three buffers will return the remaining cached samples while processing and buffering subsequent ones, and so forth. The internal mechanics of that I have solved, caching output if the current output buffer is too small, and so forth, so it all works as advertised, and the plugin reports its latency correctly. And when run as an app…
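A minimal sketch of the batching arithmetic described above, assuming a plain FIFO cache (the type and its names are illustrative; a real plugin would do this inside its render block with preallocated storage):

// Hypothetical FIFO illustrating the described cadence: with 256-sample
// host buffers and a 1024-sample FFT, calls yield 0, 0, 0, then 256
// samples per buffer while the remaining 768 drain from the cache.
struct FFTBatcher {
    let fftSize: Int
    private var input: [Float] = []
    private var cachedOutput: [Float] = []

    init(fftSize: Int) { self.fftSize = fftSize }

    mutating func process(_ samples: [Float]) -> [Float] {
        input.append(contentsOf: samples)
        // Once a full window has accumulated, "process" it (identity here)
        // and move it to the output cache.
        while input.count >= fftSize {
            cachedOutput.append(contentsOf: input.prefix(fftSize))
            input.removeFirst(fftSize)
        }
        // Hand back at most one host buffer's worth of processed samples.
        let count = min(samples.count, cachedOutput.count)
        let output = Array(cachedOutput.prefix(count))
        cachedOutput.removeFirst(count)
        return output
    }
}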
Replies: 0 · Boosts: 0 · Views: 182
Nov ’25
Reply to Multiply exr lightmap in Reality Composer Pro Shader Graph
Hello @psqv, thank you for your question! Can you share more about what exactly you're trying to do? I want to make sure I understand correctly so the answer I give is useful. The first thing that may be going wrong is that you say you are sampling the image using Image(float), which only gives you one channel, and then multiplying that with a Color3f. I suspect you need to modify the Image node to be Image(Color3f) as well. You can set this in the inspector for the node in Reality Composer Pro. Your description of your baked lighting setup sounds unusual to me. It sounds like you are trying to multiply the colors of two textures together in the same material. This should be possible (the sampling mode on your image node might be causing it to break); however, I don't think this is the correct way to handle baked lighting. Baked lighting only needs one texture, your base color texture, which already contains all your shadows and highlights. Ultimately, your baked lighting shader graph should…
Topic: Graphics & Games · SubTopic: RealityKit
Nov ’25
Reply to Issues Handling Multiple Incoming Calls in CallKit
"FB20985796 (Issues Handling Multiple Incoming Calls in CallKit)" Perfect, thank you. "I filed a bug in Feedback Assistant. Is there another way to file a bug to get a faster resolution?" No, not really. In this particular case, I think you should plan on working around the issue instead of waiting for the system to address it. This is a VERY longstanding issue (I suspect it's basically always done this), which makes it hard to justify as a high-priority fix. In terms of working around this, there are two tools I'd highlight here:

The direct workaround is to put your active call on hold before you report the next call.

In case you're not aware, you can safely hide new call reports while on an active call by reporting your new call using the same UUID as your currently active call. If your call is still active, you'll get CXErrorCodeIncomingCallErrorCallUUIDAlreadyExists (which you can ignore). If the existing call has ended (this can happen due to communication race conditions in the call infrastructure), then you'll get…
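A minimal sketch of that same-UUID workaround (the helper and its flow are illustrative; production code would coordinate with your call state machine):

import CallKit

// Hypothetical helper: report the new incoming call with the UUID of the
// currently active call so the report stays hidden while that call is live.
func reportHiddenIncomingCall(on provider: CXProvider,
                              activeCallUUID: UUID,
                              update: CXCallUpdate) {
    provider.reportNewIncomingCall(with: activeCallUUID, update: update) { error in
        if let error = error as? CXErrorCodeIncomingCallError,
           error.code == .callUUIDAlreadyExists {
            // The active call is still live; the duplicate report was
            // suppressed, which is the intended outcome here.
            return
        }
        if let error {
            // The existing call may have already ended; handle the failure.
            print("Incoming call report failed: \(error.localizedDescription)")
        }
    }
}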
Topic: App & System Services · SubTopic: General
Nov ’25
Reply to visionOS 3D interactions like the native keyboard when no longer observed in passthrough
Hi @VaiStardom I can confirm SpatialTapGesture doesn't consistently register direct input (trigger callbacks) on entities outside a person's field of view. However, you can use hand tracking to create a custom gesture that works on all entities, regardless of their position, as long as visionOS can track the person's hands. This approach only works within an ImmersiveSpace and requires adding an NSHandsTrackingUsageDescription entry to your app's Info.plist that explains why your app needs hand tracking access. The code snippet below demonstrates this technique by playing a sound when a person taps a cube, even when it's outside their field of view.

import RealityKit
import SwiftUI
import AudioToolbox

struct ImmersiveView: View {
    @State var spatialTrackingSession = SpatialTrackingSession()
    @State var collisionEventSubscription: EventSubscription?

    var body: some View {
        RealityView { content in
            let cube = ModelEntity(
                mesh: .generateBox(size: 0.1),
                materials: [SimpleMaterial(color: .green, isMetallic:…
Topic: Spatial Computing · SubTopic: ARKit
Nov ’25
watchOS longFormAudio cannot deactivate
My workout watch app supports audio playback during exercise sessions. When users carry an Apple Watch, an iPhone, and AirPods, with the AirPods connected to the iPhone, I want to route audio from the Apple Watch to the AirPods for playback. I've implemented this functionality using the following code:

try? session.setCategory(.playback, mode: .default, policy: .longFormAudio, options: [])
try await session.activate()

When users are playing music on the iPhone and trigger my code in the watch app, the Apple Watch correctly guides them to select AirPods, pauses the iPhone's music, and plays my audio. However, when playback finishes and I end the session using the code below:

try session.setActive(false, options: [.notifyOthersOnDeactivation])

the iPhone doesn't automatically resume the previously interrupted music playback; it requires manual intervention. Is this expected behavior, or am I missing other important steps in my code?
Replies: 1 · Boosts: 0 · Views: 295
Nov ’25
Reply to MapKit detailAccessoryView buttons not working on macOS Tahoe
Thanks for filing FB20975128. While we investigate, I suggest switching your buttons to AppKit types as a workaround. It sounds like you're using SwiftUI to share the callout code between macOS and iOS, so I realize the impact of using platform-specific code here, but that will at least give you a path forward while this is investigated. — Ed Ford, DTS Engineer
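A minimal sketch of that AppKit workaround, assuming the buttons live in an MKAnnotationView callout (the title and selector are illustrative):

import AppKit
import MapKit

// Hypothetical setup: use an NSButton for the callout accessory instead of
// a SwiftUI-hosted button while the Tahoe issue is investigated.
func configureCallout(for annotationView: MKAnnotationView,
                      target: AnyObject,
                      action: Selector) {
    let button = NSButton(title: "Details", target: target, action: action)
    button.bezelStyle = .rounded
    annotationView.detailCalloutAccessoryView = button
}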
Nov ’25
Reply to Question about "Notification (NSE) filtering" capability request
The decision of whether the filtering entitlement is granted or not is solely in the hands of the entitlements team. Nobody here will have any involvement in granting or expediting the request. If you feel like the filtering entitlement is something you cannot live without, you will need to make your case to the entitlements team. That said, use of reportNewIncomingVoIPPushPayload() does not depend on the filtering entitlement. The filtering entitlement is only required to make the notification silent. You can still use the functionality of converting notifications to VoIP calls, although the notification will be visible. By adapting your messaging in the notification (using the extension) to something that explains the situation, and changing the interruption-level key in the payload to passive so the notification will not interrupt a foreground app, make a sound, or light up the screen in most cases (while still being visible), you should be able to implement the functionality you are af…
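A minimal sketch of the extension-side adaptation mentioned above, assuming a standard UNNotificationServiceExtension (the message wording is illustrative; the interruption-level key itself is set in the APNs payload, not in this code):

import UserNotifications

class NotificationService: UNNotificationServiceExtension {
    var contentHandler: ((UNNotificationContent) -> Void)?
    var bestAttemptContent: UNMutableNotificationContent?

    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        self.contentHandler = contentHandler
        bestAttemptContent = request.content.mutableCopy() as? UNMutableNotificationContent

        if let content = bestAttemptContent {
            // Adapt the visible messaging to explain the situation to the user.
            content.title = "Incoming call"
            content.body = "Open the app to answer this call."
            contentHandler(content)
        }
    }

    override func serviceExtensionTimeWillExpire() {
        // Deliver the best attempt if the system is about to time us out.
        if let contentHandler, let bestAttemptContent {
            contentHandler(bestAttemptContent)
        }
    }
}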
Nov ’25
Reply to Clarification on SwiftUI Environment Write Performance
"My example above is contrived, but rather than having environment values for .name, .age and .address it sounds like I should just have .person, which is an @Observable object containing the three properties. That way, writes to any of those properties don't trigger the issue you were talking about in the video." You could also just use @Environment(Person.self) private var person instead of a key, and inject it with .environment(person). However, in your example, since those environment values wouldn't likely update very frequently (they're not really in a hot path), the cost of putting them in the environment is probably negligible. In my demo, it really adds up because the environment is basically being updated on every single frame. I wouldn't say that you need to go back to all your uses of environment values and change them to use @Observable classes, but if you're seeing performance issues, or building something new, it's worth considering.
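A minimal sketch of that @Observable approach (the Person type and its properties are illustrative):

import SwiftUI
import Observation

// Hypothetical model: one observable object instead of three separate
// environment values, so a write to one property only invalidates views
// that actually read that property.
@Observable
final class Person {
    var name = "Anon"
    var age = 0
    var address = ""
}

struct NameView: View {
    @Environment(Person.self) private var person

    var body: some View {
        // Re-renders only when `name` changes, not `age` or `address`.
        Text(person.name)
    }
}

struct RootView: View {
    @State private var person = Person()

    var body: some View {
        NameView()
            .environment(person) // inject without a custom EnvironmentKey
    }
}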
Topic: UI Frameworks · SubTopic: SwiftUI
Nov ’25