I'm developing a Final Cut Pro X workflow extension that transcribes audio and creates a text output. I need to allow users to drag this text directly from my extension into FCPX's timeline as titles.

Current implementation:
- Using NSFilePromiseProvider as per Apple's guidelines for drag and drop
- Generating valid FCPXML (v1.10) with proper structure: a complete resources section with format and asset references, event and project hierarchy, an asset clip with connected title elements, and proper timing and duration calculations
- Supporting multiple pasteboard types: com.apple.finalcutpro.xml.v1-10, com.apple.finalcutpro.xml.v1-9, com.apple.finalcutpro.xml

What's working:
- Drag operation initiates correctly
- File promise provider is set up properly
- FCPXML generation is successful (verified content)
- All required pasteboard types are registered
- Logging confirms data is being requested and provided

Current pasteboard types offered: com.apple.NSFilePromiseItemMetaData, com.apple.pasteboard.promised-file-name, com.appl
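For context, one common way to offer custom pasteboard flavors alongside a file promise is to subclass NSFilePromiseProvider and override its NSPasteboardWriting methods. A minimal sketch under that assumption (the subclass name and the fcpxml property are hypothetical, not the poster's actual code):

import AppKit

// Hypothetical subclass that puts the already-generated FCPXML on the pasteboard
// in addition to the file promise, so the drop target can read it directly.
final class FCPXMLPromiseProvider: NSFilePromiseProvider {

    // Assumed to be populated by the caller with the generated FCPXML string.
    var fcpxml: String = ""

    private static let fcpxmlTypes: [NSPasteboard.PasteboardType] = [
        .init("com.apple.finalcutpro.xml.v1-10"),
        .init("com.apple.finalcutpro.xml.v1-9"),
        .init("com.apple.finalcutpro.xml"),
    ]

    override func writableTypes(for pasteboard: NSPasteboard) -> [NSPasteboard.PasteboardType] {
        // Keep the file-promise types and add the FCPXML flavors.
        return super.writableTypes(for: pasteboard) + Self.fcpxmlTypes
    }

    override func pasteboardPropertyList(forType type: NSPasteboard.PasteboardType) -> Any? {
        if Self.fcpxmlTypes.contains(type) {
            return fcpxml.data(using: .utf8)
        }
        return super.pasteboardPropertyList(forType: type)
    }
}

Whether FCPX's timeline accepts these flavors from a third-party drag source is a separate question; the sketch only shows how the types can be offered.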
Search results for "Popping Sound": 19,350 results found
Hello, I have a simple example using StateObject and List. When I bind the List(selection:) to a property of the StateObject like this: List(selection: $viewModel.selectedIndex) { ... } I noticed that each time I push the view using a NavigationLink, a new instance of the StateObject is created. However, when I pop the view, the deinit of the StateObject is not called. When is deinit actually expected to be called in this case? Example code:

import SwiftUI

@main
struct NavigationViewDeinitSampleApp: App {
    var body: some Scene {
        WindowGroup {
            NavigationStack {
                ContentView()
            }
        }
    }
}

struct Item: Hashable {
    let text: String
}

@MainActor
fileprivate class ContentViewModel: ObservableObject {
    @Published var selectedIndex: Int? = nil
    init() { NSLog("ContentViewModel.init") }
    deinit { NSLog("ContentViewModel.deinit") }
}

struct ContentView: View {
    @StateObject private var model = ContentViewModel()
    let items: [Item] = {
        return (0...10).map { i in Item(text: "\(i)") }
    }()
    var body: some View {
        List(selection: $mode
Hello everyone, We are building an iOS app using React Native that connects to a custom Bluetooth Low Energy (BLE) accessory. The accessory continuously sends small chunks of audio data to the app through BLE (basically every time the user speaks), which are then streamed in real time to our server via WebSocket for transcription and processing. We need to know if the following behavior is allowed by the iOS runtime and by App Store review policies: Can the app open a WebSocket connection in the background (not permanently, just briefly, several times a day) triggered by BLE activity from a registered accessory? Is there a limit to this? Clarifications: The app is not expected to remain permanently awake, only during accessory-triggered events. WebSocket is required due to the real-time nature of streaming STT and delivering quick responses (via notifications). If allowed, are there any specific Info.plist declarations or entitlements we must include? Thanks in advance! Fran
Topic: App Store Distribution & Marketing
SubTopic: App Review
Tags: App Review, Background Tasks, Core Bluetooth
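On the Info.plist side, BLE-triggered background wake-ups generally rely on the bluetooth-central entry in UIBackgroundModes, optionally combined with Core Bluetooth state restoration; whether a brief WebSocket upload during that wake-up passes review is a policy question this sketch does not answer. A minimal Core Bluetooth setup under those assumptions (the restore identifier and service UUID are hypothetical placeholders):

import CoreBluetooth

// Sketch: a central that iOS can relaunch in the background when the paired
// accessory sends data. Requires "bluetooth-central" in the UIBackgroundModes
// array of Info.plist.
final class AccessoryCentral: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(
            delegate: self,
            queue: nil,
            options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.accessory-central"]
        )
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [CBUUID(string: "180A")], options: nil)
    }

    // Called when iOS relaunches the app to continue Bluetooth work; reattach
    // peripheral delegates here before streaming audio chunks to the server.
    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
    }
}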
Hello, Check out the discussion of thread priority inversions in "Diagnosing performance issues early". Could it be that there is audio processing occurring on the main thread that is waiting on the results of a lower-priority thread? This would cause such a warning.
Topic: Developer Tools & Services
SubTopic: Xcode
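To make the warning concrete, here is a hypothetical illustration (names invented) of the inversion pattern the reply describes: a high-QoS thread blocking on work that only a lower-QoS thread can finish.

import Foundation

// The main thread (high QoS) waits on a semaphore that is signaled by .utility
// work. Semaphores do not propagate QoS, so the waiter's progress is capped by
// the lower-priority thread: a classic priority inversion.
func loadAnalysisBlockingMainThread() -> Float {
    var result: Float = 0
    let done = DispatchSemaphore(value: 0)

    DispatchQueue.global(qos: .utility).async {
        // Stand-in for the lower-priority work the audio code ends up waiting on.
        result = (0..<1_000_000).reduce(Float(0)) { $0 + Float($1).squareRoot() }
        done.signal()
    }

    done.wait()   // high-priority thread blocked on low-priority work
    return result
}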
Hi, we're having trouble implementing search through Siri voice commands. We already did it successfully for audio playback using INPlayMediaIntentHandling. For search, none of the available ways works. Both INSearchForMediaIntentHandling and ShowInAppSearchResultsIntent never open the app in the first place. We tried various commands, but e.g. "Search for ..." sometimes opens the Apple Music app and sometimes shows a Google search widget. Our app is never taken into consideration for providing any results. We implemented all steps mentioned in WWDC videos and documentation (e.g. https://developer.apple.com/documentation/appintents/making-in-app-search-actions-available-to-siri-and-apple-intelligence), but nothing seems to work. We're mainly testing on iOS 18 currently. Any idea why this is not working?
Topic: App & System Services
SubTopic: Automation & Scripting
Tags: Siri and Voice, SiriKit, Intents, App Intents
Since macOS 26, Apple Music has inconsistent drops in the quality of some tracks, seemingly indiscriminately. I don't know if others have experienced it. It doesn't happen on the speakers or when connected via Bluetooth, but the AUX I/O has it quite often. It is more noticeable on headphones with 48 kHz and higher frequency bandwidth. The feedback number is FB18062589.
I have an app that has working audio for most devices except for iOS 18 with AirPods Pro connected. Eventually I realized that this is due to audio spatialization, which is enabled by default. I found an information property list key called AVGameBypassSystemSpatialAudio which can be set to true, and now the audio works for 90% of the game. However, my game has an .MP4 which, when played, is paused when the AirPods are connected, and after force-skipping the video the spatialization is now enabled and the app no longer has audio. I checked through the logs and found this part: mediaplaybackd <<<< FigFilePlayer >>>> itemfig_establishedStereoAudioSpatializationPreferenceForAsset: <0x53451c000|I/OWH.01>: Stereo Spatialization allowed by default due to asset containing video After this line I can see the spatialization being enabled. Does anyone know how to disable this auto setting of stereo spatialization by
Topic: Developer Tools & Services
SubTopic: Xcode
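One avenue worth trying for the in-app .MP4 playback, offered as an assumption rather than a confirmed fix: AVPlayerItem exposes allowedAudioSpatializationFormats, which can be restricted before playback so the video asset does not opt the output into spatialization. A minimal sketch (the URL is a placeholder):

import AVFoundation

// Sketch: restrict which audio formats this item may spatialize before playing
// the .MP4. Whether this is enough to keep the AirPods route non-spatialized is
// an assumption to verify.
func makeNonSpatializedPlayer(for url: URL) -> AVPlayer {
    let item = AVPlayerItem(url: url)
    // An empty option set asks AVFoundation not to spatialize this item's audio.
    item.allowedAudioSpatializationFormats = []
    return AVPlayer(playerItem: item)
}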
I'm using AVFoundation to make a multi-track editor app which can insert multiple tracks and clips, including scaling some clips to change their speed (I'm also not sure whether AVFoundation is the best choice for me). But after scaling with the scaleTimeRange API, there is some short noise in playback. Also, sometimes it's fine when playing the AVMutableComposition using AVPlayer with an AVPlayerItem, but after exporting with AVAssetReader there are some short noise sounds in the result file... Not sure why. Here is an example project, which can be built and run directly: https://github.com/luckysmg/daily_images/raw/refs/heads/main/TestDemo.zip
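For reference, a minimal sketch of the kind of composition-level speed change described above (the asset and the time values are placeholders, not taken from the linked project):

import AVFoundation

// Sketch: insert one second of an asset into a composition, then stretch that
// range to two seconds (0.5x speed).
func makeSlowedComposition(from asset: AVAsset) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let range = CMTimeRange(start: .zero, duration: CMTime(value: 1, timescale: 1))

    try composition.insertTimeRange(range, of: asset, at: .zero)

    // Stretch the inserted second to two seconds. How the stretched audio is
    // resampled (e.g. audioTimePitchAlgorithm on the player item or export
    // session) is a separate setting and a common source of artifacts.
    composition.scaleTimeRange(range, toDuration: CMTime(value: 2, timescale: 1))
    return composition
}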
The CPU Counters modes for Guided configuration in Instruments 26 sample the underlying hardware counters at thread context switch. It sounds like you might want something with more control, though. Could you create a feedback to request this functionality be added to the Manual configurations as well, along with your use cases?
Topic: Developer Tools & Services
SubTopic: Instruments
Hello @leizh007, First, a point of clarification: according to the Apple documentation (https://developer.apple.com/documentation/avfoundation/avcapturesession/interruptionreason/videodevicenotavailablewithmultipleforegroundapps?language=objc), this interruption typically occurs when the app is running in a multi-app layout such as Slide Over, Split View, or Picture in Picture, which that page describes as iPad-only features; note, though, that Picture in Picture is supported on iPhone as well (FaceTime, for example). Now, back to your original question: Why does AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps occur on iPhone? Based on what you've mentioned, it sounds like you are receiving this interruption when there are not multiple foreground apps. That should not happen, so my recommendation is that you file a bug report for this issue using Feedback Assistant if you haven't already. And users have reported that when the above error occurs and causes the camera to go black, restarting th
Topic: Media Technologies
SubTopic: Photos & Camera
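For anyone trying to pin down which interruption case is actually being delivered, a small sketch of logging the reason when the session is interrupted (the observer placement is an assumption; the session is passed in):

import AVFoundation

// Sketch: print the interruption reason whenever the given capture session is
// interrupted, to confirm what the system reports on iPhone.
func observeInterruptions(of session: AVCaptureSession) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: AVCaptureSession.wasInterruptedNotification,
        object: session,
        queue: .main
    ) { note in
        if let number = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? NSNumber,
           let reason = AVCaptureSession.InterruptionReason(rawValue: number.intValue) {
            print("Capture session interrupted, reason: \(reason)")
        }
    }
}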
"While the extension is .stopped, ALL URL LOADS are blocked on the device. Is this to be expected? (shouldFailClosed is set to false)" This sounds like a bug: if shouldFailClosed is false, then when the feature fails to come up, all URLs should be allowed. Please file a feedback report with repro steps and a sysdiagnose.
Topic: App & System Services
SubTopic: Networking
The app is an official Apple app: https://developer.apple.com/documentation/wifiaware/building-peer-to-peer-apps. I have two phones, an iPhone 12 and an iPhone 13, both with Bluetooth turned on and connected to the same Wi-Fi network. The devices paired successfully the first time, but after I reset the Wi-Fi identifier in Settings > Privacy & Security > Paired Devices, the devices could no longer pair. Specifically, one device displays a PIN input pop-up, but the other device does not show the PIN. What could be the reason for this?
This is my native module code implementation. I'm getting a base64 encoded string from the server and passing it to my native PCM player module to play audio.

App.tsx:
PcmPlayer.writeChunk(e.data);

PcmPlayer.swift:

import AVFoundation

@objc(PcmPlayer)
class PcmPlayer: RCTEventEmitter {
    private var engine: AVAudioEngine?
    private var playerNode: AVAudioPlayerNode?
    private var format: AVAudioFormat?
    private var bufferQueue = [Data]()
    private var isPlaying = false
    private var hasEnded = false
    private var scheduledBufferCount = 0
    private let minBufferBytes = 50000
    private let pcmQueue = DispatchQueue(label: "pcm.queue")

    override init() {
        super.init()
    }

    override func supportedEvents() -> [String]! {
        return ["onStatus", "onMessage"]
    }

    @objc(initPlayer:channels:bitsPerSample:)
    func initPlayer(_ sampleRate: NSNumber, channels: NSNumber, bitsPerSample: NSNumber) {
        pcmQueue.async {
            self.stopInternal()
            let session = AVAudioSession.sharedInstance()
            do {
                try session.setCategory(.playback, mode: .default, options: [])
                try
Topic: Media Technologies
SubTopic: Streaming
Hello everyone, I'm new to Swift development and have been working on an audio module that plays a specific sound at regular intervals - similar to a workout timer that signals switching exercises every few minutes. Following the AVFoundation documentation, I'm configuring my audio session like this:

let session = AVAudioSession.sharedInstance()
try session.setCategory(
    .playback,
    mode: .default,
    options: [.interruptSpokenAudioAndMixWithOthers, .duckOthers]
)
self.engine.attach(self.player)
self.engine.connect(self.player, to: self.engine.outputNode, format: self.audioFormat)
try? session.setActive(true)

When it's time to play cues, I schedule playback on a DispatchQueue:

// scheduleAudio uses DispatchQueue
self.scheduleAudio(at: interval.start) {
    do {
        try audio.engine.start()
        audio.node.play()
        for sample in interval.samples {
            audio.node.scheduleBuffer(sample.buffer, at: AVAudioTime(hostTime: sample.hostTime))
        }
    } catch {
        print("Audio activation failed: \(error)")
    }
}

This works p
Hello! I have two mics connected to a USB hub. The USB hub is then connected to my iPad. Both mics are part of the audio session's list of available inputs. The problem is that regardless of which mic I select in my app (using setPreferredInput() on the audio session), the audio keeps coming from the mic that was last connected to the USB hub. Does anyone know if this is a limitation in iPadOS/iOS?
Topic: Media Technologies
SubTopic: Audio
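For comparison, a minimal sketch of the selection flow, including checking the route the system actually chose afterwards (the port name is a placeholder, and the setCategory/setActive placement is an assumption):

import AVFoundation

// Sketch: pick a specific input by name and then inspect currentRoute, since
// setPreferredInput is only a request and the system may still route elsewhere.
func selectInput(named name: String) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)

    guard let port = session.availableInputs?.first(where: { $0.portName == name }) else {
        print("Input \(name) not found among available inputs")
        return
    }

    try session.setPreferredInput(port)

    for input in session.currentRoute.inputs {
        print("Active input: \(input.portName) (\(input.portType.rawValue))")
    }
}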