Search results for "Popping Sound"
19,599 results found

Delay timing to evaluate Menu content until it actually opens (macOS)
SwiftUI’s Menu is also used to display view controls like pop-up buttons. However, in such cases, its content is evaluated the moment the button itself appears, even though it isn’t needed until the menu is actually opened. Additionally, since the menu content isn’t re-evaluated when the menu opens, dynamically generated content can end up out of sync with the actual state, depending on the timing. Considering these points, I’d like to delay generating the menu content until the moment it’s actually opened. Is there a way to defer the evaluation and generation of the Menu’s content until the moment its contents are displayed? Note: I'd like to know about using this within a macOS app.
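One possible workaround on macOS (a sketch, not something suggested in this thread) is to drop down to AppKit, where NSMenuDelegate's menuNeedsUpdate(_:) is called just before a menu is displayed, so the items can be rebuilt at that moment. The LazyPopUpButton wrapper and its makeItems closure below are hypothetical names.

import SwiftUI
import AppKit

// Sketch: an NSPopUpButton whose items are rebuilt in menuNeedsUpdate(_:),
// i.e. only when the menu is about to open.
struct LazyPopUpButton: NSViewRepresentable {
    var makeItems: () -> [String]   // hypothetical item source

    func makeNSView(context: Context) -> NSPopUpButton {
        let button = NSPopUpButton(frame: .zero, pullsDown: false)
        button.menu?.delegate = context.coordinator
        return button
    }

    func updateNSView(_ nsView: NSPopUpButton, context: Context) {
        context.coordinator.makeItems = makeItems
    }

    func makeCoordinator() -> Coordinator { Coordinator(makeItems: makeItems) }

    final class Coordinator: NSObject, NSMenuDelegate {
        var makeItems: () -> [String]
        init(makeItems: @escaping () -> [String]) { self.makeItems = makeItems }

        // Called right before the menu is shown, so the content reflects current state.
        func menuNeedsUpdate(_ menu: NSMenu) {
            menu.removeAllItems()
            for title in makeItems() {
                menu.addItem(NSMenuItem(title: title, action: nil, keyEquivalent: ""))
            }
        }
    }
}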
Topic: UI Frameworks · SubTopic: SwiftUI
1 reply · 0 boosts · 123 views · 1w
Reply to NEAppPushProvider lifecycle guarantees for safety-critical local networking
During testing, we noticed that if the user backgrounds the app (still connected to the device’s Wi‑Fi) and opens Safari, the extension’s stop is invoked with NEProviderStopReason.unrecoverableNetworkChange / noNetworkAvailable, and iOS tears the extension down. What's the specific testing scenario here? More specifically, is this real-world testing with a disconnected device, or is this a development device that's attached to a Mac, particularly with any kind of “active” interactions between the device being tested and the Mac? The one scenario I can think of that might explain this is that Safari has its own network-based debugging infrastructure, and it's (theoretically) possible that might be disrupting the app’s normal networking behavior. Related to that point, are you SURE the device is on the same Wi‑Fi network? I can't think of any scenario where NEAppPushProvider would stop working while you remained on the same network, but I can see the device changing networks without that being obvious. What I'd…
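For anyone narrowing this down, a minimal sketch of logging the exact stop reason in the provider (the subclass name and the Logger subsystem/category strings below are placeholders):

import NetworkExtension
import os

final class PushProvider: NEAppPushProvider {
    private let log = Logger(subsystem: "com.example.app.pushprovider", category: "lifecycle")

    // Record which NEProviderStopReason the system hands us, so the
    // unrecoverableNetworkChange / noNetworkAvailable cases can be correlated
    // with what the device and network were doing at the time.
    override func stop(with reason: NEProviderStopReason, completionHandler: @escaping () -> Void) {
        log.info("NEAppPushProvider stopped, reason: \(reason.rawValue)")
        completionHandler()
    }
}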
1w
Reply to Multiply exr lightmap in Reality Composer Pro Shader Graph
Hello @psqv, thank you for your question! Can you share more about what exactly you're trying to do? I want to make sure I understand correctly so the answer I give is useful. The first thing that may be going wrong is that you say you are sampling the image using Image (float), which only gives you one channel, and then multiplying that with a Color3f. I suspect you need to modify the Image node to be Image (Color3f) as well. You can set this in the inspector for the node in Reality Composer Pro. Your description of your baked lighting setup sounds unusual to me. It sounds like you are trying to multiply the colors of two textures together in the same material. This should be possible (the sampling mode on your Image node might be causing it to break); however, I don't think this is the correct way to handle baked lighting. Baked lighting only needs one texture, your base color texture, which already contains all your shadows and highlights. Ultimately, your baked lighting shader graph should…
Topic: Graphics & Games · SubTopic: RealityKit
1w
Reply to Issues Handling Multiple Incoming Calls in CallKit
FB20985796 (Issues Handling Multiple Incoming Calls in CallKit) Perfect, thank you. I filed a bug in Feedback Assistant. Is there another way to file a bug to get a faster resolution? No, not really. In this particular case, I think you should plan on working around the issue instead of waiting for the system to address this. This is a VERY longstanding issue (I suspect it's basically always done this), which makes it hard to justify as a high-priority fix. In terms of working around this, two tools I'd highlight here: The direct workaround is to put your active call on hold before you report the next call. In case you're not aware, you can safely hide new call reports while on an active call by reporting your new call using the same UUID as your currently active call. If your call is still active, you'll get CXErrorCodeIncomingCallErrorCallUUIDAlreadyExists (which you can ignore). If the existing call has ended (this can happen due to communication race conditions in the call infrastructure), then you'll get…
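A rough sketch of those two tools (the provider, callController, activeCallUUID, and update values are placeholders, not code from this thread):

import CallKit

// 1. Put the currently active call on hold before reporting the next call.
func holdActiveCall(callController: CXCallController, activeCallUUID: UUID) {
    let hold = CXSetHeldCallAction(call: activeCallUUID, onHold: true)
    callController.request(CXTransaction(action: hold)) { error in
        if let error { print("Hold request failed: \(error)") }
    }
}

// 2. Hide a new call report by reusing the active call's UUID. While that call
//    is still active the report fails with callUUIDAlreadyExists, which is safe
//    to ignore; if it has already ended, the new call is shown normally.
func reportHiddenIncomingCall(provider: CXProvider, activeCallUUID: UUID, update: CXCallUpdate) {
    provider.reportNewIncomingCall(with: activeCallUUID, update: update) { error in
        if let nsError = error as NSError?,
           nsError.domain == CXErrorDomainIncomingCall,
           nsError.code == CXErrorCodeIncomingCallError.Code.callUUIDAlreadyExists.rawValue {
            return // expected while the existing call is still active
        }
        // Handle other errors (or a nil error meaning the call was reported) here.
    }
}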
Topic: App & System Services · SubTopic: General
1w
Some issues and questions regarding the use of the BGContinuedProcessingTask API
Hi, I have recently been debugging the BGContinuedProcessingTask API and encountered the following issues; I hope you can provide some answers. First, let me explain my understanding of this API. I believe its purpose is to allow an app to trigger tasks that can be represented with progress indicators and require a certain amount of time to complete. After the app enters the background, these tasks can continue running as a BGContinuedProcessingTask, preventing the system from terminating them before they finish. In the launchHandler of the registration process, we only need to do a few things: (1) determine whether the actual business processing is still ongoing; (2) update the progress, title, and subtitle; (3) handle the expirationHandler; and (4) set the task as completed. Here are some issues I encountered during my debugging process: After I called register and submit, the BGContinuedProcessingTask could not be triggered. The return values from my API calls were all normal. I tried different device…
7 replies · 0 boosts · 216 views · 1w
Reply to visionOS 3d interactions like the native keyboard when no longer observed in passthrough
Hi @VaiStardom I can confirm SpatialTapGesture doesn't consistently register direct input (trigger callbacks) on entities outside a person's field of view. However, you can use hand tracking to create a custom gesture that works on all entities, regardless of their position, as long as visionOS can track the person's hands. This approach only works within an ImmersiveSpace and requires adding an NSHandsTrackingUsageDescription entry to your app's Info.plist that explains why your app needs hand tracking access. The code snippet below demonstrates this technique by playing a sound when a person taps a cube, even when it's outside their field of view.

import RealityKit
import SwiftUI
import AudioToolbox

struct ImmersiveView: View {
    @State var spatialTrackingSession = SpatialTrackingSession()
    @State var collisionEventSubscription: EventSubscription?

    var body: some View {
        RealityView { content in
            let cube = ModelEntity(
                mesh: .generateBox(size: 0.1),
                materials: [SimpleMaterial(color: .green, isMetallic: …
Topic: Spatial Computing · SubTopic: ARKit
1w
Clarification on SwiftUI Environment Write Performance
I'm looking for clarification on a SwiftUI performance point mentioned in the recent Optimize your app's speed and efficiency | Meet with Apple video. (YouTube link not allowed, but the video is available on the Apple Developer channel.) At the 1:48:50 mark, the presenter says: Writing a value to the Environment doesn't only affect the views that read the key you're updating. It updates any view that reads from any Environment key. [abbreviated quote] That statement seems like a big deal if your app relies heavily on Environment values. For context: I'm building a macOS application with a traditional three-panel layout. At any given time, there are many views on screen, plus others that exist in the hierarchy but are currently hidden (for example, views inside tab views or collapsed splitters). Nearly every major view reads something from the environment, often an @Observable object that acts as a service or provider. However, there are a few relatively small values that are written to the environment frequently…
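For reference, the kind of setup being asked about, a small custom environment value written on a hot path, looks roughly like this (HighlightKey, the drag-driven write, and the placeholder content are hypothetical, not from the video or the question):

import SwiftUI

// A small value written to the environment very frequently.
private struct HighlightKey: EnvironmentKey {
    static let defaultValue: CGFloat = 0
}

extension EnvironmentValues {
    var highlight: CGFloat {
        get { self[HighlightKey.self] }
        set { self[HighlightKey.self] = newValue }
    }
}

struct RootView: View {
    @State private var highlight: CGFloat = 0

    var body: some View {
        Text("Three-panel layout goes here") // placeholder for the real hierarchy
            // Per the quoted claim, this write can invalidate every descendant
            // that reads *any* environment key, not just \.highlight.
            .environment(\.highlight, highlight)
            .gesture(DragGesture().onChanged { highlight = $0.translation.width })
    }
}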
4 replies · 0 boosts · 809 views · 1w
Reply to MapKit detailAccessoryView buttons not working on macOS Tahoe
Thanks for filing FB20975128. While we investigate, I suggest switching your buttons to AppKit types as a workaround. It sounds like you're using SwiftUI to share the callout code between macOS and iOS, so I realize the impact of using platform-specific code here, but that will at least give you a path forward while this is investigated. — Ed Ford,  DTS Engineer
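A minimal sketch of that workaround, assuming the callout is configured from an MKMapViewDelegate on macOS (the delegate class, button title, and action are placeholder names):

import MapKit
import AppKit

final class MapDelegate: NSObject, MKMapViewDelegate {
    func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
        guard !(annotation is MKUserLocation) else { return nil }
        let identifier = "pin"
        let view = mapView.dequeueReusableAnnotationView(withIdentifier: identifier)
            ?? MKMarkerAnnotationView(annotation: annotation, reuseIdentifier: identifier)
        view.annotation = annotation
        view.canShowCallout = true
        // Use an AppKit button in the callout instead of a SwiftUI one.
        view.detailCalloutAccessoryView = NSButton(title: "Details",
                                                   target: self,
                                                   action: #selector(showDetails(_:)))
        return view
    }

    @objc private func showDetails(_ sender: NSButton) {
        // Handle the button click here.
    }
}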
1w
Reply to Question about "Notification (NSE) filtering" capability request
The decision of whether the filtering entitlement is granted or not is solely in the hands of the entitlements team. Nobody here will have any involvement in granting or expediting the request. If you feel like the filtering entitlement is something you cannot live without, you would need to make your case to the entitlements team. That said, use of reportNewIncomingVoIPPushPayload() does not depend on the filtering entitlement. The filtering entitlement is only required to make the notification silent. You can still use the functionality of converting notifications to VoIP calls, although the notification will be visible. By adapting your messaging in the notification (using the extension) to something that explains the situation, and by changing the interruption-level key in the payload to passive (so the notification will not interrupt a foreground app, make a sound, or light up the screen in most cases, while still being visible), you should be able to implement the functionality you are after.
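A rough sketch of that combination in the Notification Service Extension (the title/body strings are placeholders, the server-side payload change is noted in a comment, and the app still needs its normal CallKit setup for the reported call):

import UserNotifications
import CallKit

final class NotificationService: UNNotificationServiceExtension {
    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        let content = (request.content.mutableCopy() as? UNMutableNotificationContent) ?? UNMutableNotificationContent()

        // Without the filtering entitlement the notification stays visible, so keep
        // the messaging explanatory. The server-side aps dictionary would also carry
        // "interruption-level": "passive" so it doesn't interrupt a foreground app.
        content.title = "Incoming call"
        content.body = "Open the app to answer."

        // Hand the encrypted call payload to the app, which reports it via CallKit.
        CXProvider.reportNewIncomingVoIPPushPayload(request.content.userInfo) { error in
            if let error { NSLog("reportNewIncomingVoIPPushPayload failed: \(error)") }
            contentHandler(content)
        }
    }
}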
1w
Question about "Notification (NSE) filtering" capability request
We are developing a messaging app which sends End-to-End encrypted data. The application supports multiple types of E2EE data, including text messages and voice over IP calls. Apple's article titled “Sending End-to-End Encrypted VoIP calls” (https://developer.apple.com/documentation/callkit/sending-end-to-end-encrypted-voip-calls) states that the following steps are required to support E2EE VoIP calls: (1) request permission to receive remote notifications through the User Notifications framework; (2) register for VoIP calls using PushKit; (3) add a Notification Service Extension target to your app; and (4) add the com.apple.developer.usernotifications.filtering entitlement to the NSE target’s entitlements file. We have completed steps one through three. We are still missing the filtering entitlement. As of right now, the system does not allow us to use the reportNewIncomingVoIPPushPayload(_:completion:) method because of the missing entitlement.
Below is a short description of how our messaging app works: A user sends a message to another…
1 reply · 0 boosts · 371 views · 1w
Reply to Clarification on SwiftUI Environment Write Performance
My example above is contrived, but rather than having environment values for .name, .age and .address it sounds like I should just have .person, which is an @Observable object containing the three properties. That way, writes to any of those properties don't trigger the issue you were talking about in the video. You could also just use @Environment(Person.self) private var person instead of a key, and inject it with .environment(person). However, in your example, since those environment variables wouldn't likely update very frequently (they're not really in a hot path) the cost of putting them in the environment is probably negligible. In my demo, it really adds up because the environment is basically being updated on every single frame. I wouldn't say that you need to go back to all your uses of environment values and change them to use @Observable classes, but if you're seeing performance issues, or building something new, it's worth considering.
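A minimal sketch of that consolidation, using the contrived example quoted above (the property values are placeholders):

import SwiftUI
import Observation

@Observable
final class Person {
    var name = "Jane"
    var age = 40
    var address = "1 Infinite Loop"
}

struct NameView: View {
    @Environment(Person.self) private var person

    var body: some View {
        // Observation tracks only what this body reads (name), so writes to
        // age or address don't invalidate this view.
        Text(person.name)
    }
}

struct RootView: View {
    @State private var person = Person()

    var body: some View {
        NameView()
            .environment(person) // object injection, no custom EnvironmentKey needed
    }
}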
Topic: UI Frameworks · SubTopic: SwiftUI
1w
FaceTime Screen-Share Audio and Video Experience
FaceTime’s screen-share audio balance is absurd right now. Whenever I share media, the system audio that gets sent through FaceTime is a tiny whisper even at full volume (or even when connected to my speaker or headphones). The moment anyone on the call makes any noise at all, the shared audio ducks so hard it disappears, while the voice (or rustling or air conditioning noise) spikes to painful levels. It’s impossible to watch or listen to anything together. Also, the feature where FaceTime would shrink to a square during screen-sharing has been completely removed. That was a good feature and I'm really confused why it's gone. Now, the FaceTime window stays as a long rectangle that covers part of the content I'm trying to share (unless I use full-screen tiling, but then I can't pull up any other windows during the call) and can't be made smaller than about a third of the screen. You can't resize the window or adjust its dimensions, so it ends up blocking the actual media you're…
1 reply · 0 boosts · 165 views · 1w
Reply to Some issues and questions regarding the use of the BGContinuedProcessingTask API
I discussed these issues with the engineering team and have a bit more follow-up: (1) Currently, there is no way to distinguish whether the task was stopped by the user, by the system due to resource insufficiency, or due to an abnormal task. I would like to know whether there will be more information provided in the future to help distinguish these different scenarios. This is a serious issue I hadn't considered. The engineering team agrees that differentiating between system expiration and user cancellation is a significant oversight in the current API. I can't comment on future plans/scheduling, but this is something I expect the API to address. (2) However, on devices that do not support Dynamic Island, the app directly displays a pop-up notification within the app, and this notification does not disappear when switching between different screens within the same app. The user needs to actively swipe up to dismiss it. I think this experience is too intrusive for users. Just to clarify, the system…
1w