I have an AUv3 plugin which uses an FFT, which requires n samples before it can produce any output. So, depending on the relation between the host's buffer size and the FFT window size, it may receive several buffers of samples, producing no output, and then dump out what it has once a sufficient number of samples have been received. This means that output is produced in fits and starts, in batches that match the FFT size (modulo oversampling). For example, if the plugin is fed buffers of 256 samples with an FFT size of 1024, the output buffer sizes will be 0 for the first 3 buffers; upon the fourth, the first 256 processed samples are returned and the remaining 768 are cached. The next three buffers return the remaining cached samples while processing and buffering subsequent ones, and so forth. The internal mechanics of that I have solved, caching output if the current output buffer is too small, and so on, so it all works as advertised, and the plugin reports its latency correctly. And when run as an app
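For reference, a minimal sketch of the batch-wise caching described above. This is not the poster's actual implementation; the type name and the process(fftBlock:) hook are hypothetical stand-ins for whatever DSP the plugin performs.

    // Minimal sketch of batch-wise FFT output caching (hypothetical names).
    // Input samples accumulate until a full FFT block is available; processed
    // samples are queued and handed back in whatever buffer sizes the host uses.
    final class BlockProcessor {
        private let fftSize: Int
        private var inputFIFO: [Float] = []
        private var outputFIFO: [Float] = []

        init(fftSize: Int) { self.fftSize = fftSize }

        // Stand-in for the real FFT-based processing of one full block.
        private func process(fftBlock: [Float]) -> [Float] { fftBlock }

        // Called once per host render cycle; returns as many processed
        // samples as are available, up to the size of the incoming buffer.
        func render(input: [Float]) -> [Float] {
            inputFIFO.append(contentsOf: input)
            while inputFIFO.count >= fftSize {
                let block = Array(inputFIFO.prefix(fftSize))
                inputFIFO.removeFirst(fftSize)
                outputFIFO.append(contentsOf: process(fftBlock: block))
            }
            let n = min(input.count, outputFIFO.count)
            let out = Array(outputFIFO.prefix(n))
            outputFIFO.removeFirst(n)
            return out   // empty until the first full FFT block has arrived
        }
    }

With 256-sample host buffers and an FFT size of 1024, this returns empty output for the first three calls, 256 samples on the fourth with 768 cached, and so on. In a real render callback you would preallocate ring buffers rather than grow Swift arrays, but the batching behaviour is the same.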
Search results for "Popping Sound": 19,599 results found
I've had this exact issue pop up occasionally as well. Our issues have been a combination of this and what I've posted in this thread: https://developer.apple.com/forums/thread/806242
Topic:
App Store Distribution & Marketing
SubTopic:
TestFlight
Tags:
Hello @psqv, thank you for your question! Can you share more about what exactly you're trying to do? I want to make sure I understand correctly so the answer I give is useful. The first thing that may be going wrong is that you say you are sampling the image using an Image (float) node, which only gives you one channel, and then multiplying that with a Color3f. I suspect you need to change the Image node to Image (Color3f) as well. You can set this in the inspector for the node in Reality Composer Pro. Your description of your baked lighting setup sounds unusual to me. It sounds like you are trying to multiply the colors of two textures together in the same material. This should be possible (the sampling mode on your image node might be causing it to break); however, I don't think this is the correct way to handle baked lighting. Baked lighting only needs one texture, your base color texture, which already contains all your shadows and highlights. Ultimately, your baked lighting shader graph should
Topic:
Graphics & Games
SubTopic:
RealityKit
Tags:
FB20985796 (Issues Handling Multiple Incoming Calls in CallKit)

Perfect, thank you. I filed a bug in Feedback Assistant. Is there another way to file a bug to get a faster resolution?

No, not really. In this particular case, I think you should plan on working around the issue instead of waiting for the system to address this. This is a VERY longstanding issue (I suspect it's basically always done this), which makes it hard to justify as a high-priority fix. In terms of working around this, two tools I'd highlight here:

(1) The direct workaround is to put your active call on hold before you report the next call.

(2) In case you're not aware, you can safely hide new call reports while on an active call by reporting your new call using the same UUID as your currently active call. If your call is still active, you'll get CXErrorCodeIncomingCallErrorCallUUIDAlreadyExists (which you can ignore). If the existing call has ended (this can happen due to communication race conditions in the call infrastructure), then you'll get
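A minimal sketch of the second workaround, assuming a CXProvider is already set up; the helper and parameter names are placeholders, not from the original reply:

    import CallKit

    // Hypothetical helper: report the "new" call under the UUID of the call
    // that's already active, so no additional incoming-call UI is shown.
    func reportHiddenIncomingCall(provider: CXProvider,
                                  activeCallUUID: UUID,
                                  update: CXCallUpdate) {
        provider.reportNewIncomingCall(with: activeCallUUID, update: update) { error in
            if let callError = error as? CXErrorCodeIncomingCallError,
               callError.code == .callUUIDAlreadyExists {
                // Expected while the original call is still active; safe to ignore.
                return
            }
            // Any other error (for example, the original call has already ended)
            // needs to be handled for real.
        }
    }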
Topic:
App & System Services
SubTopic:
General
Tags:
Hi @VaiStardom I can confirm SpatialTapGesture doesn't consistently register direct input (trigger callbacks) on entities outside a person's field of view. However, you can use hand tracking to create a custom gesture that works on all entities, regardless of their position, as long as visionOS can track the person's hands. This approach only works within an ImmersiveSpace and requires adding an NSHandsTrackingUsageDescription entry to your app's Info.plist that explains why your app needs hand tracking access. The code snippet below demonstrates this technique by playing a sound when a person taps a cube, even when it's outside their field of view.

    import RealityKit
    import SwiftUI
    import AudioToolbox

    struct ImmersiveView: View {
        @State var spatialTrackingSession = SpatialTrackingSession()
        @State var collisionEventSubscription: EventSubscription?

        var body: some View {
            RealityView { content in
                let cube = ModelEntity(
                    mesh: .generateBox(size: 0.1),
                    materials: [SimpleMaterial(color: .green, isMetallic:
Topic:
Spatial Computing
SubTopic:
ARKit
Tags:
My workout watch app supports audio playback during exercise sessions. When users carry an Apple Watch, an iPhone, and AirPods, with the AirPods connected to the iPhone, I want to route audio from the Apple Watch to the AirPods for playback. I've implemented this functionality using the following code:

    try? session.setCategory(.playback, mode: .default, policy: .longFormAudio, options: [])
    try await session.activate()

When users are playing music on the iPhone and trigger my code in the watch app, the Apple Watch correctly guides users to select AirPods, pauses the iPhone's music, and plays my audio. However, when playback finishes and I end the session using the code below:

    try session.setActive(false, options: [.notifyOthersOnDeactivation])

the iPhone doesn't automatically resume the previously interrupted music playback; it requires manual intervention. Is this expected behavior, or am I missing other important steps in my code?
Thanks for filing FB20975128. While we investigate, I suggest switching your buttons to AppKit types as a workaround. It sounds like you're using SwiftUI to share the callout code between macOS and iOS, so I realize the impact of using platform-specific code here, but that will at least give you a path forward while this is investigated. — Ed Ford, DTS Engineer
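One way to read that suggestion (my own assumption, not part of the reply above) is to keep the shared SwiftUI callout and host an AppKit NSButton inside it on macOS via NSViewRepresentable, for example:

    import SwiftUI
    #if os(macOS)
    import AppKit

    // Hypothetical wrapper: an AppKit button embedded in the SwiftUI callout.
    struct CalloutButton: NSViewRepresentable {
        let title: String
        let action: () -> Void

        func makeNSView(context: Context) -> NSButton {
            let button = NSButton(title: title,
                                  target: context.coordinator,
                                  action: #selector(Coordinator.perform))
            button.bezelStyle = .rounded
            return button
        }

        func updateNSView(_ nsView: NSButton, context: Context) {
            nsView.title = title
            context.coordinator.action = action
        }

        func makeCoordinator() -> Coordinator { Coordinator(action: action) }

        final class Coordinator: NSObject {
            var action: () -> Void
            init(action: @escaping () -> Void) { self.action = action }
            @objc func perform() { action() }
        }
    }
    #endif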
Topic:
App & System Services
SubTopic:
Maps & Location
Tags:
The decision of whether the filtering entitlement is granted or not is solely in the hands of the entitlements team. Nobody here will have any involvement in granting or expediting the request. If you feel the filtering entitlement is something you cannot live without, you would need to make your case to the entitlements team. That said, use of reportNewIncomingVoIPPushPayload() does not depend on the filtering entitlement. The filtering entitlement is only required to make the notification silent. You can still use the functionality of converting notifications to VoIP calls, although the notification will be visible. By adapting the messaging in the notification (using the extension) to something that explains the situation, and by changing the interruption-level key in the payload to passive (so that the notification will not interrupt a foreground app, make a sound, or light up the screen in most cases, while still being visible), you should be able to implement the functionality you are after
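As an illustration of that non-entitled path (a sketch under my own assumptions, not code from the reply): a Notification Service Extension can hand the payload to CallKit and rewrite the visible message. The interruption-level key would normally be set in the payload by the server; setting interruptionLevel on the modified content here mirrors that idea.

    import UserNotifications
    import CallKit

    class NotificationService: UNNotificationServiceExtension {
        override func didReceive(_ request: UNNotificationRequest,
                                 withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
            // Report the payload as an incoming VoIP call. This does not require
            // the filtering entitlement; the notification itself stays visible.
            CXProvider.reportNewIncomingVoIPPushPayload(request.content.userInfo) { _ in }

            // Adapt the visible notification so it explains the situation and
            // stays unobtrusive (placeholder wording).
            let content = (request.content.mutableCopy() as? UNMutableNotificationContent) ?? UNMutableNotificationContent()
            content.body = "Incoming call. Open the app if it doesn't ring."
            content.interruptionLevel = .passive
            contentHandler(content)
        }
    }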
Topic:
App & System Services
SubTopic:
Notifications
Tags:
My example above is contrived, but rather than having environment values for .name, .age, and .address, it sounds like I should just have .person, which is an @Observable object containing the three properties. That way, writes to any of those properties don't trigger the issue you were talking about in the video. You could also just use @Environment(Person.self) private var person instead of a key, and inject it with .environment(person). However, in your example, since those environment values likely wouldn't update very frequently (they're not really in a hot path), the cost of putting them in the environment is probably negligible. In my demo, it really adds up because the environment is basically being updated on every single frame. I wouldn't say that you need to go back to all your uses of environment values and change them to use @Observable classes, but if you're seeing performance issues, or building something new, it's worth considering.
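A minimal sketch of that approach; the Person type and its properties are just the names from the contrived example, with placeholder values:

    import SwiftUI
    import Observation

    @Observable
    final class Person {
        var name = "Anna"
        var age = 30
        var address = "1 Infinite Loop"
    }

    struct RootView: View {
        @State private var person = Person()

        var body: some View {
            ProfileView()
                .environment(person)   // inject the observable object, no key needed
        }
    }

    struct ProfileView: View {
        @Environment(Person.self) private var person

        var body: some View {
            // Only the read of person.name registers a dependency; writes to
            // person.age or person.address elsewhere won't re-run this body.
            Text(person.name)
        }
    }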
Topic:
UI Frameworks
SubTopic:
SwiftUI
Tags:
Sounds like the classic off-by-one error to me. PDF pages are traditionally one-based, while many functions in PDFKit, e.g. page(at:), are zero-based. https://developer.apple.com/documentation/pdfkit/pdfdocument/page(at:)
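A quick sketch of the distinction, assuming a loaded PDFDocument (the helper name is hypothetical):

    import PDFKit

    // Printed page numbers are one-based; PDFKit's page(at:) is zero-based,
    // so "page 1" of the document is page(at: 0).
    func page(forPrintedPageNumber pageNumber: Int, in document: PDFDocument) -> PDFPage? {
        let index = pageNumber - 1
        guard index >= 0, index < document.pageCount else { return nil }
        return document.page(at: index)
    }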
Topic:
Media Technologies
SubTopic:
General
Tags:
I discussed these issues with the engineering team and have a bit more follow-up:

(1) Currently, there is no way to distinguish whether the task was stopped by the user, by the system due to resource insufficiency, or due to an abnormal task. I would like to know whether there will be more information provided in the future to help distinguish these different scenarios.

This is a serious issue I hadn't considered. The engineering team agrees that differentiating between system expiration and user cancellation is a significant oversight in the current API. I can't comment on future plans/scheduling, but this is something I expect the API to address.

(2) However, on devices that do not support Dynamic Island, the app directly displays a pop-up notification within the app, and this notification does not disappear when switching between different screens within the same app. The user needs to actively swipe up to dismiss it. I think this experience is too intrusive for users.

Just to clarify, the system
Topic:
App & System Services
SubTopic:
Processes & Concurrency
Tags:
We are encountering the following issue with our VoIP application for iPhone, published on the App Store, and would appreciate your guidance on possible countermeasures. The VoIP application (callee side) utilizes a Wi-Fi network. The sequence leading to the issue is as follows:

1. VoIP App (callee): Launches
2. iPhone (callee): Locks (e.g., by short-pressing the power button)
3. VoIP App (callee): Transitions to a suspended state
4. VoIP App (caller): Initiates a VoIP call
5. VoIP App (callee): Receives a local push notification
6. VoIP App (callee): Creates a UDP socket for call control (for SIP send/receive)
7. VoIP App (callee): Creates a UDP socket for audio stream (for RTP send/receive)
8. VoIP App (callee): Exchanges SIP messages (INVITE, 100 Trying, 180 Ringing, etc.) using the call control UDP socket
9. VoIP App (callee): Answers the incoming call
10. VoIP App (callee): Executes performAnswerCallAction()

Immediately after executing performAnswerCallAction() in the above sequence, the sendto() function for both the UDP soc
Topic:
App & System Services
SubTopic:
Notifications
Tags:
APNS
User Notifications
PushKit
Push To Talk
Hi all, I've implemented the new Core Audio Tap API (AudioHardwareCreateProcessTap with CATapDescription) and I'm seeing consistent level attenuation that scales with the number of stereo output pairs exposed by the target device.

What I observe:
- Device with 4 stereo pairs (8 outs) → tap shows −12.04 dB relative to source.
- True 2-ch devices (built-in speakers, AirPods) → ~0 dB attenuation.

The attenuation appears regardless of whether I:
- Create a global (default-output) tap via initStereoGlobalTapButExcludeProcesses:
- Or create a per-process/per-device tap via initWithProcesses:andDeviceUID:withStream:

Additionally, the routing choice inside the sending app matters:
- App output to "System/Default Output" → I often see no attenuation.
- App output directly to a multi-out interface (e.g., RME Fireface) → I see the pair-count-scaled attenuation.

I can query Core Audio for the number of output channels/pairs and gain-compensate (+20·log10(N_pairs) dB), and that matches my measurements for many cases.
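For reference, the gain compensation described above works out like this; a sketch, not the poster's code, and it assumes the stereo-pair count has already been queried from the device:

    import Foundation

    // Pair-count-scaled attenuation: 4 stereo pairs → 20·log10(4) ≈ 12.04 dB.
    func compensationGain(forStereoPairs pairCount: Int) -> (dB: Double, linear: Float) {
        let dB = 20.0 * log10(Double(pairCount))
        let linear = Float(pow(10.0, dB / 20.0))   // equals Float(pairCount)
        return (dB, linear)
    }

    // Example: applying the compensation to tapped samples.
    func compensate(samples: inout [Float], stereoPairs: Int) {
        let gain = compensationGain(forStereoPairs: stereoPairs).linear
        for i in samples.indices { samples[i] *= gain }
    }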
@StevenPeterson Thanks for chiming in and thanks for the video itself, much appreciated.

No, the view bodies are only run if the body uses the environment value for the key(s) used in that view and the value changes.

That makes a lot more sense to me and aligns with the simple demo app I made to test out my original question. Consider:

    struct ContentView: View {
        @Environment(\.name) var name
        @Environment(\.age) var age

        var body: some View {
            Text(name)
        }
    }

If .name is updated anywhere, then ContentView.body will be run, but if .age is updated anywhere, then body is not run because .age is not referenced in the body. What worried me was that if somewhere else a view wrote to .address, then ContentView would be re-evaluated even though it doesn't refer to .address in any way. That's how I (incorrectly) interpreted the slide above.

But in SwiftUI, an update doesn't always cause the view body to run again, but there is still a cost associated with these updates.

That's the real clarifying statement, for me. There's
Topic:
UI Frameworks
SubTopic:
SwiftUI
Tags:
Hi! I'll try to provide some clarification:

Does writing any value to the environment really cause all views in the entire SwiftUI view hierarchy that read any environment key to have their body re-evaluated?

No, the view bodies are only run if the body uses the environment value for the key(s) used in that view and the value changes. But in SwiftUI, an update doesn't always cause the view body to run again, but there is still a cost associated with these updates. In Instruments, these updates are labeled as skipped. The demo in the video shows that the cumulative cost of all the skipped updates (seen in the Consequences detail view) is significant, even without their associated view bodies running (the view that demonstrates this in the video is called CardBackView).

Do environment writes only affect child views, or do they propagate through the entire SwiftUI hierarchy?

Great question! This only applies to child views of the .environment modifier that you're writing with. So if View B uses an .environment m
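A small sketch of the scoping point; the view names and the accentTitle key are placeholders of my own, not from the session:

    import SwiftUI

    // Hypothetical custom environment key, just to illustrate scoping.
    private struct AccentTitleKey: EnvironmentKey {
        static let defaultValue = "Default"
    }

    extension EnvironmentValues {
        var accentTitle: String {
            get { self[AccentTitleKey.self] }
            set { self[AccentTitleKey.self] = newValue }
        }
    }

    struct ViewA: View {
        var body: some View {
            VStack {
                // Only ViewB's subtree sees the written value; the sibling ViewC
                // is outside the .environment modifier, so it still reads the
                // default and isn't affected by this write.
                ViewB().environment(\.accentTitle, "Hello")
                ViewC()
            }
        }
    }

    struct ViewB: View {
        @Environment(\.accentTitle) private var title
        var body: some View { Text(title) }   // "Hello"
    }

    struct ViewC: View {
        @Environment(\.accentTitle) private var title
        var body: some View { Text(title) }   // "Default"
    }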
Topic:
UI Frameworks
SubTopic:
SwiftUI
Tags: