Swift is a powerful and intuitive programming language for Apple platforms and beyond.

Posts under Swift tag

200 Posts


Launch The Main App from LockedCameraCapture
If the app is launched from LockedCameraCapture and the settings button is tapped, I need to launch the main app.

CameraViewController:

    func settingsButtonTapped() {
        #if isLockedCameraCaptureExtension
        // App was launched from the Lock Screen
        // Launch main app here...
        #else
        // App was launched from the Home Screen
        self.showSettings(animated: true)
        #endif
    }

In this document: https://developer.apple.com/documentation/lockedcameracapture/creating-a-camera-experience-for-the-lock-screen Apple asks you to use:

    func launchApp(with session: LockedCameraCaptureSession, info: String) {
        Task {
            do {
                let activity = NSUserActivity(activityType: NSUserActivityTypeLockedCameraCapture)
                activity.userInfo = [UserInfoKey: info]
                try await session.openApplication(for: activity)
            } catch {
                StatusManager.displayError("Unable to open app - \(error.localizedDescription)")
            }
        }
    }

However, the documentation states that this should be placed within the extension code - LockedCameraCapture. If I do that, how can I call it all the way down from the main app's CameraViewController?
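The documentation sample lives in the extension because only the extension has the LockedCameraCaptureSession. One way to bridge the two — a sketch with hypothetical names (openMainApp, showSettings), assuming the extension hosts the same CameraViewController:

    import UIKit

    final class CameraViewController: UIViewController {
        // Injected by the capture extension; nil when running in the main app.
        var openMainApp: (() -> Void)?

        func settingsButtonTapped() {
            #if isLockedCameraCaptureExtension
            openMainApp?() // ends up calling launchApp(with:info:) in the extension
            #else
            showSettings(animated: true)
            #endif
        }

        func showSettings(animated: Bool) { /* ... */ }
    }

    // In the extension target, where the session is available:
    //   let controller = CameraViewController()
    //   controller.openMainApp = { launchApp(with: session, info: "settings") }

This keeps launchApp(with:info:) in the extension, as the documentation asks, while the shared view controller only knows about a closure.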
Replies: 3 · Boosts: 0 · Views: 502 · Activity: Nov ’25
Async function doesn’t see external changes to an inout Bool in Release build
Why doesn't this async function see external changes to an inout Bool in Release builds (but works in Debug)?

I have a small helper function that waits for a Bool flag to become true, with a timeout:

    public func waitUntilTrue(binding value: inout Bool, timeout maximum: Int) async throws {
        var count = 0
        while value == false {
            count += 1
            try await Task.sleep(nanoseconds: 100_000_000)
            if value == true { return }
            if count > (maximum * 10) { return }
        }
    }

I call it like this:

    var isVPNConnected = false
    adapter.start(tunnelConfiguration: tunnelConfiguration) { [weak self] adapterError in
        guard let self = self else { return }
        if let adapterError = adapterError {
            // handle error
        } else {
            isVPNConnected = true
        }
        completionHandler(adapterError)
    }
    try await waitUntilTrue(binding: &isVPNConnected, timeout: 10)

What I expect: waitUntilTrue should keep looping until the flag becomes true (or the timeout is hit). When the second task sets the flag to true, the first task should see that change and return.

What actually happens: in Debug builds this behaves as expected: when the second task sets the flag to true, the loop inside waitUntilTrue eventually exits. In Release builds the function often never sees the change and gets stuck until the timeout (or forever, depending on the code). It looks like the while value == false condition is using some cached value and never observes the external write.

So my questions are:

- Is the compiler allowed to assume that value (the inout Bool) does not change inside the loop, even though there are await suspension points and another task is mutating the same variable?
- Is this behavior officially "undefined" because I'm sharing a plain Bool across tasks without any synchronization (actors / locks / atomics), so the Debug build just happens to work?
- What is the correct / idiomatic way in Swift concurrency to implement this kind of "wait until flag becomes true with timeout" pattern? Should I avoid inout here completely and use some other primitive (e.g. AsyncStream, CheckedContinuation, Actor, ManagedAtomic, etc.)?
- Is there any way to force the compiler to re-read the Bool from memory each iteration, or is that the wrong way to think about it?

Environment (if it matters):
- Swift: [fill in your Swift version]
- Xcode: [fill in your Xcode version]
- Target: iOS / macOS [fill in as needed]
- Optimization: default Debug vs. Release settings

I'd like to understand why Debug and Release behave differently here, and what the recommended design is for this kind of async waiting logic in Swift.
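For what it's worth: mutating a plain Bool from one task while another reads it, with no synchronization, is a data race (and writing to it while it is also passed inout violates exclusivity), so the behavior is undefined and the Release optimizer is free to hoist the read out of the loop. A race-free variant keeps the flag in an actor so every read goes through isolation; a sketch with hypothetical names:

    import Foundation

    // Owns the flag; all access is serialized by the actor.
    actor ConnectionFlag {
        private(set) var isSet = false
        func set() { isSet = true }
    }

    // Polls the actor until the flag is set or the timeout elapses.
    // Every read is an actor hop, so nothing can be cached stale.
    func waitUntilTrue(_ flag: ConnectionFlag, timeout seconds: Int) async -> Bool {
        for _ in 0..<(seconds * 10) {
            if await flag.isSet { return true }
            try? await Task.sleep(nanoseconds: 100_000_000) // poll every 0.1 s
        }
        return await flag.isSet
    }

    // Usage from the completion handler:
    //   let flag = ConnectionFlag()
    //   adapter.start(...) { error in
    //       if error == nil { Task { await flag.set() } }
    //   }
    //   let connected = await waitUntilTrue(flag, timeout: 10)

A CheckedContinuation or AsyncStream would remove the polling entirely, but the actor version is the smallest change from the original shape of the code.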
Replies: 2 · Boosts: 0 · Views: 1.1k · Activity: Nov ’25
Swift version wrong in Xcode 26.2 beta
The release notes for Xcode 26.2 (https://developer.apple.com/documentation/xcode-release-notes/xcode-26_2-release-notes) state that Swift 6.2.3 ships with it. But checking the Swift version with #if swift(>=6.2.3) returns false. Running swiftc -version prints:

    swift-driver version: 1.127.14.1
    Apple Swift version 6.2 (swiftlang-6.2.3.3.2 clang-1700.6.3.2)

As you can see, there is a mismatch between the marketing version "6.2" and the build version "6.2.3.3.2". Being able to check for the 6.2.3 version is important for my team, because we are impatiently awaiting the change to the tabViewBottomAccessory modifier (https://developer.apple.com/documentation/swiftui/view/tabviewbottomaccessory%28isenabled:content:%29) so that it can be hidden when not needed. Without this fix we have the issue that the accessory shows even without content on iOS 26.1.
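Note that the two compile-time checks test different things: #if swift(...) compares the language mode in effect (which -swift-version can lower), while #if compiler(...) compares the compiler's own version regardless of mode. Given that swiftc above reports its version as 6.2, neither form can see the 6.2.3 patch component; a sketch:

    #if swift(>=6.2.3)
    // Compares the *language mode* in effect, not the toolchain patch
    // version, so this is false here.
    #endif

    #if compiler(>=6.2.3)
    // Compares the compiler's version regardless of language mode; with a
    // reported compiler version of 6.2 this is also false.
    #endif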
Replies: 0 · Boosts: 1 · Views: 114 · Activity: Nov ’25
Module dependency cycle errors in Xcode 26
I just updated to Xcode 26, and some of my Swift packages have been getting strange build errors that I have not been able to resolve. When I try to build my Swift package in Xcode, I get the following error:

    Module dependency cycle: 'UIKit.swiftmodule -> .swiftmodule -> SafariServices.swiftmodule -> UIKit.swiftmodule'

It seems to be related to the change in Xcode 26 that states "Swift explicit modules will be the default mode for building all Swift targets". I see that you can disable this with the build setting SWIFT_ENABLE_EXPLICIT_MODULES=NO, but I don't see a way to do this in Package.swift, as you can't include value assignments like .define("SWIFT_ENABLE_EXPLICIT_MODULES=NO"). Our private SPM repos use CI/CD, so we need to be able to build them independently of any use in a project. I would appreciate any help fixing our Swift package builds in Xcode 26, thanks!
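Package.swift indeed has no field for arbitrary Xcode build settings, but if the CI job builds the package through xcodebuild (rather than swift build), the setting can be passed as a command-line override instead; a sketch with a hypothetical scheme name:

    xcodebuild build -scheme MyPackage -destination 'generic/platform=iOS' SWIFT_ENABLE_EXPLICIT_MODULES=NO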
Replies: 4 · Boosts: 1 · Views: 378 · Activity: Nov ’25
How to correctly fetch data using SwiftData
Hi there! I'm making an app that stores the user's profile data in SwiftData. I was originally going to use UserDefaults, but I thought SwiftData could save Images natively. That turned out not to be true, so I really could switch back to UserDefaults and save images as Data, but I'd like to try to get this working first. Essentially I have text fields and I save their values through a class, allProfileData. Here's the code for that:

    import SwiftData
    import SwiftUI

    @Model
    class allProfileData {
        var profileImageData: Data?
        var email: String
        var bio: String
        var username: String

        var profileImage: Image {
            if let data = profileImageData, let uiImage = UIImage(data: data) {
                return Image(uiImage: uiImage)
            } else {
                return Image("DefaultProfile")
            }
        }

        init(email: String, profileImageData: Data?, bio: String, username: String) {
            self.profileImageData = profileImageData
            self.email = email
            self.bio = bio
            self.username = username
        }
    }

To save this, I create a new instance (I think; I'm new) and save it through the ModelContext:

    import SwiftUI
    import SwiftData

    struct CreateAccountView: View {
        @Query var profiledata: [allProfileData]
        @Environment(\.modelContext) private var modelContext

        let newData = allProfileData(email: "", profileImageData: nil, bio: "", username: "")

        var body: some View {
            Button("Button") {
                newData.email = email
                modelContext.insert(newData)
                try? modelContext.save()
                print(newData.email)
            }
        }
    }

To fetch the data, I originally thought that @Query would fetch it, but I saw that it fetches asynchronously, so I also attempted to fetch manually. Both fetched nothing:

    @Query var profiledata: [allProfileData]
    @Environment(\.modelContext) private var modelContext

    let fetchRequest = FetchDescriptor<allProfileData>()
    let fetchedData = try? modelContext.fetch(fetchRequest)
    print("Fetched count: \(fetchedData?.count ?? 0)")

    if let imageData = profiledata.first?.profileImageData,
       let uiImage = UIImage(data: imageData) {
        profileImage = Image(uiImage: uiImage)
    } else {
        profileImage = Image("DefaultProfile")
    }

No errors. Thanks in advance!
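For reference, a minimal working insert-and-query sketch (type and property names here are placeholders, not the code above). The usual causes of an empty fetch are a model instance created as a stored let on the view (the View struct is recreated on every render) or a missing .modelContainer at the app root:

    import SwiftUI
    import SwiftData

    @Model
    final class Profile {
        var email: String
        init(email: String) { self.email = email }
    }

    struct ProfileListView: View {
        @Environment(\.modelContext) private var modelContext
        // @Query keeps this array up to date after every insert/save.
        @Query private var profiles: [Profile]

        var body: some View {
            VStack {
                List(profiles) { profile in
                    Text(profile.email)
                }
                Button("Create profile") {
                    // Build the model inside the action, not as a stored
                    // property of the view, so each tap inserts a fresh instance.
                    let profile = Profile(email: "user@example.com")
                    modelContext.insert(profile)
                    try? modelContext.save()
                }
            }
        }
    }

    // The container must be attached once near the app root, e.g.:
    //   WindowGroup { ProfileListView() }
    //       .modelContainer(for: Profile.self)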
Replies: 7 · Boosts: 0 · Views: 290 · Activity: Nov ’25
.glassEffect(_:in:) crashing on iOS 26 public beta
In one of my apps, I am using .glassEffect(_:in:) to add a glass effect to various UI elements. The app always crashes when a UI element with the glassEffect(_:in:) modifier is being rendered. This only happens on a device running the iOS 26 public beta. I know this for certain because I connected the particular device to Xcode and ran the app on it. When I comment out the glassEffect modifier, the app doesn't crash. Is it possible to check for particular releases with #available? If not, how should something like this be handled? Also, how do I handle such OS-level errors without the app crashing? Thanks.
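#available can gate on a minimum major.minor OS version, but it cannot distinguish a public beta from the final build of the same version. A sketch of gating the modifier behind a version check anyway, with a hypothetical extension name and a plain fallback:

    import SwiftUI

    extension View {
        @ViewBuilder
        func glassEffectIfSupported() -> some View {
            if #available(iOS 26.1, *) {
                // Only apply on releases assumed to have the fix.
                self.glassEffect(.regular, in: .rect(cornerRadius: 12))
            } else {
                self // fall back to the plain view on releases that crash
            }
        }
    }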
Replies: 8 · Boosts: 0 · Views: 243 · Activity: Nov ’25
Universal Links not opening my app.
My app has been published for 2 months now, and I still can't get Universal Links to work. I've checked a lot of docs as well as videos about setting up universal links. They all give the same clear steps:

- Add the well-known JSON file to the server. Already validated by the AASA web validator.
- Add the Associated Domains entitlement in the project capabilities, with the web page root only, e.g. applinks:example.com.
- Install the app and try tapping a link from the Notes app, or long-press the link to bring up the contextual menu and see whether my app appears among the options to open it.

My app does not open in any of my attempts, and the system always opens Safari instead. I have a couple of screenshots of my testing. I really need help with this.
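For completeness: even with a correct AASA file and entitlement, the app also needs a handler for the incoming browsing activity; a minimal SwiftUI sketch (app and view names are hypothetical):

    import SwiftUI

    @main
    struct MyApp: App {
        var body: some Scene {
            WindowGroup {
                ContentView()
                    // Called when the system hands the app a universal link.
                    .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                        guard let url = activity.webpageURL else { return }
                        print("Opened via universal link: \(url)")
                    }
            }
        }
    }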
Replies: 0 · Boosts: 0 · Views: 288 · Activity: Nov ’25
EditButton selection gets cleared when confirmationDialog appears in SwiftUI List
I'm experiencing an issue where my List selection (using EditButton) gets cleared when a confirmationDialog is presented, making it impossible to delete the selected items.

Environment:
- Xcode 16.0.1
- Swift 5
- iOS 18 (targeting iOS 17+)

Issue Description: When I select items in a List using EditButton and tap a delete button that shows a confirmationDialog, the selection is cleared as soon as the dialog appears. This prevents me from accessing the selected items to delete them.

State variables:

    @State var itemsSelection = Set<Item>()
    @State private var showDeleteConfirmation = false

List with selection:

    List(currentItems, id: \.self, selection: $itemsSelection) { item in
        NavigationLink(value: item) {
            ItemListView(item: item)
        }
    }
    .navigationDestination(for: Item.self) { item in
        ItemViewDetail(item: item)
    }
    .toolbar {
        ToolbarItem(placement: .primaryAction) {
            EditButton()
        }
    }

Delete button with confirmation:

    Button {
        if itemsSelection.count > 1 {
            showDeleteConfirmation = true
        } else {
            deleteItemsSelected()
        }
    } label: {
        Image(systemName: "trash.fill")
            .font(.system(size: 12))
            .foregroundStyle(Color.red)
    }
    .padding(8)
    .confirmationDialog(
        "Delete?",
        isPresented: $showDeleteConfirmation,
        titleVisibility: .visible
    ) {
        Button("Delete", role: .destructive) {
            deleteItemsSelected()
        }
        Button("Cancel", role: .cancel) {}
    } message: {
        Text("Going to delete: \(itemsSelection.count) items?")
    }

Expected Behavior: The selected items should remain selected when the confirmationDialog appears, allowing me to delete them after confirmation.

Actual Behavior: As soon as showDeleteConfirmation becomes true and the dialog appears, itemsSelection becomes empty (count = 0), making it impossible to delete the selected items.

What I've Tried:
- Moving the confirmationDialog to different view levels
- Checking whether this is related to the NavigationLink interaction

Has anyone encountered this issue? Is there a workaround to preserve the selection when showing a confirmation dialog?
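One possible workaround — a sketch that snapshots the selection into separate state before the dialog is presented, so it no longer matters that the dialog clears the live selection (deleteItems(_:) is a hypothetical helper):

    @State private var pendingDeletion = Set<Item>()

    Button {
        pendingDeletion = itemsSelection // copy before the dialog can clear it
        if pendingDeletion.count > 1 {
            showDeleteConfirmation = true
        } else {
            deleteItems(pendingDeletion)
        }
    } label: {
        Image(systemName: "trash.fill")
    }
    .confirmationDialog(
        "Delete?",
        isPresented: $showDeleteConfirmation,
        titleVisibility: .visible
    ) {
        Button("Delete", role: .destructive) { deleteItems(pendingDeletion) }
        Button("Cancel", role: .cancel) {}
    } message: {
        Text("Going to delete: \(pendingDeletion.count) items?")
    }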
Replies: 1 · Boosts: 0 · Views: 144 · Activity: Nov ’25
ASWebAuthenticationSession Async/Await API
Is there any particular reason why ASWebAuthenticationSession doesn't have support for async/await? For example:

    do {
        let callbackURL = try await webAuthSession.start()
    } catch {
        // handle error
    }

I'm curious whether this style of integration doesn't exist for architectural reasons, or whether the legacy completion-handler style is preserved in order to prevent existing integrations from breaking.
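For what it's worth, SwiftUI does expose an async variant through the webAuthenticationSession environment value (iOS 16.4+), and on the UIKit side you can bridge the completion-handler API yourself with a continuation. A sketch, assuming the caller supplies the presentation context provider (the wrapper type and names are hypothetical):

    import AuthenticationServices

    @MainActor
    final class WebAuthenticator {
        private var activeSession: ASWebAuthenticationSession?

        func authenticate(url: URL,
                          callbackURLScheme: String,
                          contextProvider: ASWebAuthenticationPresentationContextProviding) async throws -> URL {
            try await withCheckedThrowingContinuation { continuation in
                let session = ASWebAuthenticationSession(
                    url: url,
                    callbackURLScheme: callbackURLScheme
                ) { [weak self] callbackURL, error in
                    self?.activeSession = nil
                    if let callbackURL {
                        continuation.resume(returning: callbackURL)
                    } else {
                        continuation.resume(throwing: error ?? ASWebAuthenticationSessionError(.canceledLogin))
                    }
                }
                session.presentationContextProvider = contextProvider
                activeSession = session // keep the session alive while it runs
                session.start()
            }
        }
    }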
Replies: 2 · Boosts: 1 · Views: 661 · Activity: Nov ’25
Error when capturing a high-resolution frame with depth data enabled in ARKit
Problem Description

(1) I am using ARKit in an iOS app to provide AR capabilities. Specifically, I'm trying to use the ARSession method captureHighResolutionFrame(using:) to capture a high-resolution frame along with its corresponding depth data:

    open func captureHighResolutionFrame(using photoSettings: AVCapturePhotoSettings?) async throws -> ARFrame

(2) However, when I attempt to do so, the call fails at runtime with the following error, which I captured from the Xcode debugger:

    [AVCapturePhotoOutput capturePhotoWithSettings:delegate:] settings.depthDataDeliveryEnabled must be NO if self.isDepthDataDeliveryEnabled is NO

Code Snippet Explanation

(1) ARConfig and ARSession initialization. The following code configures the ARConfiguration and ARSession. A key part of this setup is setting the videoFormat to the one recommended for high-resolution frame capturing, as suggested by the documentation.

    func start(imagesDirectory: URL, configuration: Configuration = Configuration()) {
        // ... basic setup ...
        let arConfig = ARWorldTrackingConfiguration()
        arConfig.planeDetection = [.horizontal, .vertical]

        // Enable various frame semantics for depth and segmentation
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.smoothedSceneDepth) {
            arConfig.frameSemantics.insert(.smoothedSceneDepth)
        }
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            arConfig.frameSemantics.insert(.sceneDepth)
        }
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            arConfig.frameSemantics.insert(.personSegmentationWithDepth)
        }

        // Set the recommended video format for high-resolution captures
        if let videoFormat = ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing {
            arConfig.videoFormat = videoFormat
            print("Enabled: High-Resolution Frame Capturing by selecting recommended video format.")
        }

        arSession.run(arConfig, options: [.resetTracking, .removeExistingAnchors])
        // ...
    }

(2) Capturing the high-resolution frame. The code below is intended to manually trigger the capture of a high-resolution frame. The goal is to obtain both a high-resolution color image and its associated high-resolution depth data. To achieve this, I explicitly set the isDepthDataDeliveryEnabled property of the AVCapturePhotoSettings object to true.

    func requestImageCapture() async {
        // ... guard statements ...
        print("Manual image capture requested.")
        if #available(iOS 16.0, *) { // Assuming 16.0+ for this API
            if let defaultSettings = arSession.configuration?.videoFormat.defaultPhotoSettings {
                // Create a mutable copy from the default settings, as recommended
                let photoSettings = AVCapturePhotoSettings(from: defaultSettings)
                // Explicitly enable depth data delivery for this capture request
                photoSettings.isDepthDataDeliveryEnabled = true
                do {
                    let highResFrame = try await arSession.captureHighResolutionFrame(using: photoSettings)
                    print("Successfully captured a high-resolution frame.")
                    if let initialDepthData = highResFrame.capturedDepthData {
                        // Process depth data...
                    } else {
                        print("High-resolution frame was captured, but it contains no depth data.")
                    }
                } catch {
                    // The exception is caught here
                    print("Error capturing high-resolution frame: \(error.localizedDescription)")
                }
            }
        }
        // ...
    }

Issue Confirmation & Question

(1) Through debugging, I have confirmed the following behavior: if I call captureHighResolutionFrame without providing the photoSettings parameter, or if photoSettings.isDepthDataDeliveryEnabled is set to false, the method successfully returns a high-resolution ARFrame, but its capturedDepthData is nil.

(2) The error message clearly indicates that settings.depthDataDeliveryEnabled can only be true if the underlying AVCapturePhotoOutput instance's own isDepthDataDeliveryEnabled property is also true.

(3) However, within the context of ARKit and ARSession, I cannot find any public API that would allow me to explicitly access and configure the underlying AVCapturePhotoOutput instance that ARSession manages.

(4) My question is: is there a way to configure the ARSession's internal AVCapturePhotoOutput to enable its isDepthDataDeliveryEnabled property? Or is simultaneously capturing a high-resolution frame and its associated depth data simply not a supported use case in the current ARKit framework?
Replies: 1 · Boosts: 0 · Views: 330 · Activity: Nov ’25
UIWindow makeKeyAndVisible crash
In our app, there is a UIWindow makeKeyAndVisible crash; so far it has appeared once. The crash detail is in the attached crash.txt. In the RCWindowSceneManager class's makeWindowKeyAndVisible method, we check and set a window's windowScene and call makeKeyAndVisible:

    public func makeWindowKeyAndVisible(_ window: UIWindow?) {
        guard let window else { return }
        if let currentWindowScene {
            if window.windowScene == nil || window.windowScene != currentWindowScene {
                window.windowScene = currentWindowScene
            }
            window.makeKeyAndVisible()
        }
    }

I also set a breakpoint on a normal, non-crashing flow and captured the stack there. Why does it crash, and how can we avoid it? Thank you.
Replies: 0 · Boosts: 0 · Views: 109 · Activity: Nov ’25
UITabBarController Titles Not Updating After Runtime Localization (iOS 18 Regression)
Starting with iOS 18, UITabBarController no longer updates tab bar item titles when localized strings are changed or reassigned at runtime. This behavior worked correctly in iOS 17 and earlier, but in iOS 18 the tab bar titles remain unchanged until the app restarts or the view controller hierarchy is reset. This regression appears to be caused by internal UITabBarController optimizations introduced in iOS 18.

Steps to Reproduce:

1. Create a UITabBarController with two or more tabs, each having a UITabBarItem with a title.
2. Localize the tab titles using NSLocalizedString():

    tabBar.items?[0].title = NSLocalizedString("home_tab", comment: "")
    tabBar.items?[1].title = NSLocalizedString("settings_tab", comment: "")

3. Run the app.
4. Change the app's language at runtime (without restarting), or manually reassign the localized titles again:

    tabBar.items?[0].title = NSLocalizedString("home_tab", comment: "")
    tabBar.items?[1].title = NSLocalizedString("settings_tab", comment: "")

5. Observe that the tab bar titles do not update visually.
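A possible workaround to try — a sketch that recreates the UITabBarItem instances instead of mutating the titles in place, on the assumption that the regression comes from the tab bar caching its item views. Whether this bypasses the iOS 18 behavior is untested:

    import UIKit

    func applyLocalizedTitles(to tabBarController: UITabBarController) {
        let titles = [
            NSLocalizedString("home_tab", comment: ""),
            NSLocalizedString("settings_tab", comment: "")
        ]
        for (viewController, title) in zip(tabBarController.viewControllers ?? [], titles) {
            // Replace the item entirely so the tab bar has to rebuild it.
            let old = viewController.tabBarItem
            viewController.tabBarItem = UITabBarItem(title: title,
                                                     image: old?.image,
                                                     selectedImage: old?.selectedImage)
        }
    }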
Replies: 1 · Boosts: 0 · Views: 81 · Activity: Nov ’25
SwiftUI Map menu / chrome placement — three approaches (overlay, ZStack + safeAreaPadding, safeAreaInset): which one is best practice?
Hi everyone, I'm building a full-screen Map (MapKit + SwiftUI) with persistent top/bottom chrome (menu buttons on top, session stats + map controls on bottom). I have three working implementations, and I'd like guidance on which pattern Apple recommends long-term (gesture correctness, safe areas, Dynamic Island/home indicator, and future compatibility).

Version 1 — overlay(alignment:) on Map. Idea: draw chrome using .overlay(alignment:) directly on the map and manage padding manually.

    Map(position: $viewModel.previewMapCameraPosition, scope: mapScope) {
        UserAnnotation {
            UserLocationCourseMarkerView(angle: viewModel.userCourse - mapHeading)
        }
    }
    .mapStyle(viewModel.mapType.mapStyle)
    .mapControls {
        MapUserLocationButton().mapControlVisibility(.hidden)
        MapCompass().mapControlVisibility(.hidden)
        MapPitchToggle().mapControlVisibility(.hidden)
        MapScaleView().mapControlVisibility(.hidden)
    }
    .overlay(alignment: .top) { mapMenu }         // manual padding inside
    .overlay(alignment: .bottom) { bottomChrome } // manual padding inside

Version 2 — ZStack + .safeAreaPadding. Idea: place the map at the back, lay out top/bottom chrome in a VStack inside a ZStack, and use .safeAreaPadding(.all) so the content respects safe areas.

    ZStack(alignment: .top) {
        Map(...).ignoresSafeArea()
        VStack {
            mapMenu
            Spacer()
            bottomChrome
        }
        .safeAreaPadding(.all)
    }

Version 3 — .safeAreaInset on the Map. Idea: make the map full-bleed and then reserve top/bottom space with safeAreaInset, letting SwiftUI manage insets.

    Map(...).ignoresSafeArea()
        .mapStyle(viewModel.mapType.mapStyle)
        .mapControls {
            MapUserLocationButton().mapControlVisibility(.hidden)
            MapCompass().mapControlVisibility(.hidden)
            MapPitchToggle().mapControlVisibility(.hidden)
            MapScaleView().mapControlVisibility(.hidden)
        }
        .safeAreaInset(edge: .top) { mapMenu }         // manual padding inside
        .safeAreaInset(edge: .bottom) { bottomChrome } // manual padding inside

Questions:

1. Safe-area / padding behavior. Version 2 requires the least extra padding and seems to create a small but partial safe-area spacing automatically. Version 3 still needs roughly the same manual padding as Version 1, even though it uses safeAreaInset. Why doesn't safeAreaInset fully handle that spacing?

2. Rotation crash (Metal). When using Version 3 (safeAreaInset + ignoresSafeArea), rotating the device portrait↔landscape several times triggers a Metal crash:

    failed assertion 'The following Metal object is being destroyed while still required… CAMetalLayer Display Drawable'

The same crash can happen with Version 1, though less often. I haven't tested it much with Version 2. Is this a known issue or a race condition between Map's internal Metal rendering and view layout changes?

3. Expected behavior. What's the intended or supported interaction between safeAreaInset, safeAreaPadding, and overlay when embedding persistent chrome inside a SwiftUI Map? Should safeAreaInset normally remove the need for manual padding, or is that by design?
Replies: 0 · Boosts: 0 · Views: 78 · Activity: Nov ’25
UITextField selects all text on focus when the content is long — how to keep the caret at the end?
I'm building a custom input field using UITextField. When the user taps to focus the field and the text is long, the entire text becomes selected by default. This is the same behavior you can see in iOS Safari's search field or the Messages app search field. What I want: when the field becomes first responder, the caret should be placed at the end of the text (after the latest word), without selecting all the text.

Here's the code that builds my text field:

    public func makeTextField() -> UITextField {
        let textField = UITextField()
        textField.autocorrectionType = .no
        textField.setContentCompressionResistancePriority(.required, for: .horizontal)
        textField.setContentCompressionResistancePriority(.required, for: .vertical)
        if #available(iOS 13.0, *) {
            textField.smartInsertDeleteType = .no
        }
        textField.smartQuotesType = .no
        textField.smartDashesType = .no
        textField.autocapitalizationType = .none
        textField.contentMode = .scaleToFill
        if let font = attributes[.font] as? UIFont {
            textField.font = font
        }
        if let color = attributes[.foregroundColor] as? UIColor {
            textField.textColor = color
        }
        // Truncate long text at the head
        let paragraphStyle = NSMutableParagraphStyle()
        paragraphStyle.lineBreakMode = .byTruncatingHead
        textField.defaultTextAttributes[.paragraphStyle] = paragraphStyle
        textField.delegate = self
        textField.backgroundColor = .clear
        textField.addTarget(self, action: #selector(textFieldDidChange(_:)), for: .editingChanged)
        return textField
    }

The entire text is selected when focusing the field if the text is long.

What I've tried: forcing the caret to the end in textFieldDidBeginEditing:

    func textFieldDidBeginEditing(_ textField: UITextField) {
        let end = textField.endOfDocument
        textField.selectedTextRange = textField.textRange(from: end, to: end)
    }

Doing the same asynchronously (on the next run loop) to avoid the system overriding the selection:

    func textFieldDidBeginEditing(_ textField: UITextField) {
        DispatchQueue.main.async {
            let end = textField.endOfDocument
            textField.selectedTextRange = textField.textRange(from: end, to: end)
        }
    }

Despite these, the system still selects all text on focus when the string is long/truncated at the head.
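One more variant worth trying — a sketch that moves the same caret reset into a UITextField subclass so it runs every time the field becomes first responder, regardless of delegate wiring. It may still race with the system's selection pass, so treat it as an experiment rather than a known fix:

    import UIKit

    final class EndCaretTextField: UITextField {
        override func becomeFirstResponder() -> Bool {
            let didBecome = super.becomeFirstResponder()
            if didBecome {
                // Defer one run loop so this runs after the system applies
                // its own "select all" selection.
                DispatchQueue.main.async { [weak self] in
                    guard let self else { return }
                    let end = self.endOfDocument
                    self.selectedTextRange = self.textRange(from: end, to: end)
                }
            }
            return didBecome
        }
    }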
Replies: 2 · Boosts: 0 · Views: 250 · Activity: Nov ’25
MPSMatrixRandom SEGFAULTs when run in an async context
The following minimal snippet segfaults with SDK 26.0 and 26.1. It won't crash if I remove async from the enclosing function signature, but that's impractical in a real project.

    import Metal
    import MetalPerformanceShaders

    let SEED = UInt64(0x0)
    typealias T = Float16

    /* Why run in an async context? Because of a global GPU object, async
       makeMTLFunction, and async makeMTLComputePipelineState.
       Nevertheless, the bug can be triggered without using the global:
    @MainActor let myGPU = MyGPU()
    */

    @main
    struct CMDLine {
        static func main() async {
            let ptr = UnsafeMutablePointer<T>.allocate(capacity: 0)
            async let future: Void = randomFillOnGPU(ptr, count: 0)
            print("Main thread is playing around")
            await future
            print("Successfully reached the end.")
        }

        static func randomFillOnGPU(_ buf: UnsafeMutablePointer<T>, count destbufcount: Int) async {
            // let (device, queue) = await (myGPU.device, myGPU.commandqueue)
            let myGPU = MyGPU()
            let (device, queue) = (myGPU.device, myGPU.commandqueue)
            // Init MTLBuffer, async let makeFunction, makeComputePipelineState, etc.
            let tempDataType = MPSDataType.uInt32
            let randfiller = MPSMatrixRandomMTGP32(device: device, destinationDataType: tempDataType, seed: Int(bitPattern: UInt(SEED)))
            print("randomFillOnGPU: successfully created MPSMatrixRandom.")
            // try await computePipelineState
            // ^ Crashes before this could return
            // Or in this minimal case, after randomFillOnGPU() returns
            // make encoder, set pso, dispatch, commit...
        }
    }

    actor MyGPU {
        let device: MTLDevice
        let commandqueue: MTLCommandQueue

        init() {
            guard let dev: MTLDevice = MPSGetPreferredDevice(.skipRemovable),
                  let cq = dev.makeCommandQueue(),
                  dev.supportsFamily(.apple6) || dev.supportsFamily(.mac2) else {
                print("Unable to get Metal Device! Exiting"); exit(EX_UNAVAILABLE)
            }
            print("Selected device: \(String(format: "%llX", dev.registryID))")
            self.device = dev
            self.commandqueue = cq
            print("myGPU: initialization complete.")
        }
    }

See FB20916929. Apparently the Objective-C autorelease pool is releasing the wrong address during a context switch (across suspension points). I wonder why such an obvious case has not been caught before.
Replies: 0 · Boosts: 0 · Views: 111 · Activity: Nov ’25
Tab bar alignment issue from iOS 26
I have a tab bar with 4 tabs. Since iOS 26, I've been facing an alignment issue with the tabs: the text automatically shrinks, and the tab icon shifts upward for some tabs. I am using the UITab API to create tabs on OS versions 18 and above, and UITabBar on OS versions below 18.
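For reference, a sketch of the version-gated setup described above, with placeholder titles, images, and identifiers (UITab is iOS 18+):

    import UIKit

    func configureTabs(on tabBarController: UITabBarController, controllers: [UIViewController]) {
        if #available(iOS 18.0, *) {
            // iOS 18+: describe tabs with UITab.
            tabBarController.tabs = controllers.enumerated().map { index, controller in
                UITab(title: "Tab \(index + 1)",
                      image: UIImage(systemName: "circle"),
                      identifier: "tab-\(index)") { _ in controller }
            }
        } else {
            // Earlier versions: fall back to the classic view controller array.
            tabBarController.viewControllers = controllers
        }
    }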
Replies: 1 · Boosts: 1 · Views: 189 · Activity: Nov ’25
Xcode SPM (Swift Package Manager) Error
I added the Apple App Store Server Swift Library to Xcode using Swift Package Manager. Both the project and target are set to iOS 14 or higher. However, when I build after adding the library, an error occurs in the library: a message appears stating that the target is set to iOS 12. I'm using Xcode 26.0.1. The error persists in every project I've added it to. I've tried building the library from version 1.0.0 to the latest version, but the same error occurs. Even after completely cleaning the project and running it, the same error persists. Does anyone know how to fix this?
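One thing to check: if the library is pulled in through an intermediate Swift package of your own, a package without a platforms declaration falls back to a much older default deployment target, which can surface as an "iOS 12"-style error even when the app target is iOS 14+. A sketch of pinning it, with a hypothetical package name (verify the product name against the library's own Package.swift):

    // swift-tools-version:5.9
    import PackageDescription

    let package = Package(
        name: "MyPackage",
        platforms: [
            .iOS(.v15) // explicit deployment target for this package
        ],
        dependencies: [
            .package(url: "https://github.com/apple/app-store-server-library-swift.git", from: "1.0.0")
        ],
        targets: [
            .target(
                name: "MyPackage",
                dependencies: [
                    .product(name: "AppStoreServerLibrary", package: "app-store-server-library-swift")
                ]
            )
        ]
    )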
Replies: 1 · Boosts: 0 · Views: 130 · Activity: Nov ’25
ScreenCaptureKit recording output is corrupted when captureMicrophone is true
Hello everyone, I'm working on a screen recording app using ScreenCaptureKit and I've hit a strange issue. My app records the screen to an .mp4 file, and everything works perfectly as long as captureMicrophone is false: I get a valid, playable .mp4 file. However, as soon as I try to enable the microphone by setting streamConfig.captureMicrophone = true, the recording seems to work, but the final .mp4 file is corrupted and cannot be played by QuickTime or any other player. This happens whether capturesAudio (app audio) is on or off.

I've already added the "Privacy - Microphone Usage Description" (NSMicrophoneUsageDescription) to my Info.plist, so I don't think it's a permissions problem. I have my logic split into a ScreenRecorder class that manages state and a CaptureEngine that handles the SCStream.

Here is how I'm configuring my SCStream (ScreenRecorder.swift):

    // This is my main SCStreamConfiguration
    private var streamConfiguration: SCStreamConfiguration {
        var streamConfig = SCStreamConfiguration()
        // ... other HDR/preset config ...

        // These are the problem properties
        streamConfig.capturesAudio = isAudioCaptureEnabled
        streamConfig.captureMicrophone = isMicCaptureEnabled // breaks it if true
        streamConfig.excludesCurrentProcessAudio = false
        streamConfig.showsCursor = false

        if let region = selectedRegion, let display = currentDisplay {
            // My region/frame logic (works fine)
            let regionWidth = Int(region.frame.width)
            let regionHeight = Int(region.frame.height)
            streamConfig.width = regionWidth * scaleFactor
            streamConfig.height = regionHeight * scaleFactor
            // ... (sourceRect logic) ...
        }

        streamConfig.pixelFormat = kCVPixelFormatType_32BGRA
        streamConfig.colorSpaceName = CGColorSpace.sRGB
        streamConfig.minimumFrameInterval = CMTime(value: 1, timescale: 60)
        return streamConfig
    }

And here is how I'm setting up the SCRecordingOutput that writes the file (ScreenRecorder.swift):

    private func initRecordingOutput(for region: ScreenPickerManager.SelectedRegion) throws {
        let screeRecordingOutputURL = try RecordingWorkspace.createScreenRecordingVideoFile(
            in: workspaceURL,
            sessionIndex: sessionIndex
        )
        let recordingConfiguration = SCRecordingOutputConfiguration()
        recordingConfiguration.outputURL = screeRecordingOutputURL
        recordingConfiguration.outputFileType = .mp4
        recordingConfiguration.videoCodecType = .hevc
        let recordingOutput = SCRecordingOutput(configuration: recordingConfiguration, delegate: self)
        self.recordingOutput = recordingOutput
    }

Finally, my CaptureEngine adds these to the SCStream (CaptureEngine.swift):

    class CaptureEngine: NSObject, @unchecked Sendable {
        private(set) var stream: SCStream?
        private var streamOutput: CaptureEngineStreamOutput?
        // ... (dispatch queues) ...

        func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
            let streamOutput = CaptureEngineStreamOutput()
            self.streamOutput = streamOutput
            do {
                stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)

                // Add outputs for raw buffers (not used for file recording)
                try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
                try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
                try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)

                // Add the file recording output
                try stream?.addRecordingOutput(recordingOutput)

                try await stream?.startCapture()
            } catch {
                logger.error("Failed to start capture: \(error.localizedDescription)")
                throw error
            }
        }

        // ... (stopCapture, etc.) ...
    }

When captureMicrophone is false, I get a perfect .mp4 that plays everywhere; when it's true, I get a corrupted video that doesn't play at all.
Replies: 0 · Boosts: 0 · Views: 307 · Activity: Nov ’25
Where are Hugging Face models downloaded by Swift MLX apps cached?
I'm downloading a fine-tuned model from Hugging Face, which is then cached on my Mac when the app first starts. I wanted to test adding a progress bar to show the download progress, and to test this I need to delete the cached model. From what I've seen online, it should be cached at /Users/userName/.cache/huggingface/hub. However, if I delete the files from there using Terminal, the app still seems to be able to access the model. Is the model cached somewhere else? On my iPhone, deleting the app also deletes the cached model (app data), so that is useful.
Replies: 0 · Boosts: 0 · Views: 409 · Activity: Oct ’25
AVAssetExportSession ignores frameDuration 60fps and exports at 30fps, but AVPlayer playback is correct
Hey everyone, I'm stuck on a really frustrating AVFoundation problem. I'm building a video editor that uses a custom AVVideoCompositor to add effects, and I need the final output to be 60 FPS. Here's my setup:

1. I create an AVMutableComposition to sequence my video clips.
2. I create an AVMutableVideoComposition and set the frame rate to 60 FPS:

    videoComposition.frameDuration = CMTime(value: 1, timescale: 60)

3. I assign my CustomVideoCompositor class to the videoComposition.
4. I create an AVPlayerItem with the composition and video composition.

The problem:

- Playback works: when I play the AVPlayerItem in an AVPlayer, it's perfect. It plays at a smooth 60 FPS, and my custom compositor's startRequest method is called 60 times per second.
- Export fails: when I try to export the exact same composition and video composition using AVAssetExportSession, the final .mp4 file is always 30 FPS (or 29.97). I've logged inside my custom compositor during the export, and it's definitely being called 30 times per second, so it's generating the 30 frames. It seems like AVAssetExportSession is just dropping every other frame when it encodes the video.

My source videos are screen recordings, which I recorded with ScreenCaptureKit itself using a minimum frame interval for 60 FPS.

Here is my export function, using the AVAssetExportPresetHighestQuality preset:

    func exportVideo(to outputURL: URL) async throws {
        guard let composition = composition,
              let videoComposition = videoComposition else {
            throw VideoCompositionError.noValidVideos
        }
        try? FileManager.default.removeItem(at: outputURL)

        guard let exportSession = AVAssetExportSession(
            asset: composition,
            presetName: AVAssetExportPresetHighestQuality // Is this the problem?
        ) else {
            throw VideoCompositionError.trackCreationFailed
        }

        exportSession.outputFileType = .mp4
        exportSession.videoComposition = videoComposition // This has the 60fps setting
        try await exportSession.export(to: outputURL, as: .mp4)
    }

I've created a bare-bones sample project that shows this exact bug in action; the resulting video is 60 FPS during playback but only 30 FPS after export: https://github.com/zaidbren/SimpleEditor

My question: why is AVAssetExportSession ignoring my 60 FPS frameDuration and defaulting to 30 FPS, even though AVPlayer respects it?
Replies: 1 · Boosts: 0 · Views: 374 · Activity: Oct ’25