Swift is a powerful and intuitive programming language for Apple platforms and beyond.

Posts under Swift tag

201 Posts

Post · Replies · Boosts · Views · Activity
MPSMatrixRandom SEGFAULTs when run in an async context
The following minimal snippet SEGFAULTs with SDK 26.0 and 26.1. It won't crash if I remove async from the enclosing function signature, but that is impractical in a real project.

    import Metal
    import MetalPerformanceShaders

    let SEED = UInt64(0x0)
    typealias T = Float16

    /*
     Why run in an async context? Because of a global GPU object, async makeMTLFunction,
     and async makeMTLComputePipelineState. Nevertheless, the bug can be triggered without
     using a global:
     @MainActor let myGPU = MyGPU()
     */

    @main
    struct CMDLine {
        static func main() async {
            let ptr = UnsafeMutablePointer<T>.allocate(capacity: 0)
            async let future: Void = randomFillOnGPU(ptr, count: 0)
            print("Main thread is playing around")
            await future
            print("Successfully reached the end.")
        }

        static func randomFillOnGPU(_ buf: UnsafeMutablePointer<T>, count destbufcount: Int) async {
            // let (device, queue) = await (myGPU.device, myGPU.commandqueue)
            let myGPU = MyGPU()
            let (device, queue) = (myGPU.device, myGPU.commandqueue)
            // Init MTLBuffer, async let makeFunction, makeComputePipelineState, etc.
            let tempDataType = MPSDataType.uInt32
            let randfiller = MPSMatrixRandomMTGP32(device: device,
                                                   destinationDataType: tempDataType,
                                                   seed: Int(bitPattern: UInt(SEED)))
            print("randomFillOnGPU: successfully created MPSMatrixRandom.")
            // try await computePipelineState
            // ^ Crashes before this could return,
            //   or in this minimal case, after randomFillOnGPU() returns
            // make encoder, set pso, dispatch, commit...
        }
    }

    actor MyGPU {
        let device: MTLDevice
        let commandqueue: MTLCommandQueue

        init() {
            guard let dev: MTLDevice = MPSGetPreferredDevice(.skipRemovable),
                  let cq = dev.makeCommandQueue(),
                  dev.supportsFamily(.apple6) || dev.supportsFamily(.mac2) else {
                print("Unable to get Metal Device! Exiting"); exit(EX_UNAVAILABLE)
            }
            print("Selected device: \(String(format: "%llX", dev.registryID))")
            self.device = dev
            self.commandqueue = cq
            print("myGPU: initialization complete.")
        }
    }

See FB20916929. Apparently the Objective-C autorelease pool is releasing the wrong address during a context switch (across suspension points). I wonder why such an obvious case has not been caught before.
0 replies · 0 boosts · 68 views · Nov ’25
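If the crash really is the autorelease pool draining the wrong object across a suspension point, one mitigation worth trying is to create the MPS object inside a synchronous helper wrapped in an explicit autoreleasepool, so any autoreleased temporaries are drained before the caller reaches another await. This is only a sketch, not a confirmed fix, and makeRandomFiller is a hypothetical helper name.

```swift
import Metal
import MetalPerformanceShaders

// Hypothetical helper: build the MPS object in a synchronous frame so autoreleased
// temporaries are drained here rather than across an async suspension point.
func makeRandomFiller(device: MTLDevice, seed: UInt64) -> MPSMatrixRandomMTGP32 {
    autoreleasepool {
        MPSMatrixRandomMTGP32(device: device,
                              destinationDataType: .uInt32,
                              seed: Int(bitPattern: UInt(seed)))
    }
}
```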
Tab bar alignment issue from iOS 26
I have a tab bar with 4 tabs. Since iOS 26, I’m facing an issue with tab alignment: the text automatically shrinks, and the tab icon shifts upward for some tabs. I am using the UITab API to create tabs on OS versions 18 and above, and UITabBar items on OS versions below 18.
1 reply · 1 boost · 168 views · Nov ’25
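For reference, a minimal sketch of the kind of availability split described above; the tab titles, images, and view controllers here are placeholders, not the poster's real ones.

```swift
import UIKit

func configureTabs(on tabBarController: UITabBarController) {
    let homeVC = UIViewController()   // placeholder content controllers
    let searchVC = UIViewController()

    if #available(iOS 18.0, *) {
        // iOS 18 and later: describe tabs with UITab
        tabBarController.tabs = [
            UITab(title: "Home", image: UIImage(systemName: "house"),
                  identifier: "home") { _ in homeVC },
            UITab(title: "Search", image: UIImage(systemName: "magnifyingglass"),
                  identifier: "search") { _ in searchVC }
        ]
    } else {
        // Earlier releases: classic view controller array with UITabBarItem
        homeVC.tabBarItem = UITabBarItem(title: "Home",
                                         image: UIImage(systemName: "house"), tag: 0)
        searchVC.tabBarItem = UITabBarItem(title: "Search",
                                           image: UIImage(systemName: "magnifyingglass"), tag: 1)
        tabBarController.viewControllers = [homeVC, searchVC]
    }
}
```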
Xcode SPM (Swift Package Manager) Error
I added the "Apple App Store Server Swift Library" to Xcode using Swift Package Manager. Both the project and target are set to iOS 14 or higher. However, when I build after adding the library, an error occurs in the library: a message appears stating that the target is set to iOS 12. I'm using Xcode 26.0.1. Even after adding it to other projects, the build fails with the same error. I've tried every version of the library from 1.0.0 to the latest, but the same error persists. Even after completely cleaning the project and rebuilding, the same error persists. Does anyone know how to fix this?
1 reply · 0 boosts · 104 views · Nov ’25
ScreenCaptureKit recording output is corrupted when captureMicrophone is true
Hello everyone, I'm working on a screen recording app using ScreenCaptureKit and I've hit a strange issue. My app records the screen to an .mp4 file, and everything works perfectly as long as captureMicrophone is false; in that case I get a valid, playable .mp4 file. However, as soon as I try to enable the microphone by setting streamConfig.captureMicrophone = true, the recording seems to work, but the final .mp4 file is corrupted and cannot be played by QuickTime or any other player. This happens whether capturesAudio (app audio) is on or off.

I've already added the "Privacy - Microphone Usage Description" (NSMicrophoneUsageDescription) to my Info.plist, so I don't think it's a permissions problem. I have my logic split into a ScreenRecorder class that manages state and a CaptureEngine that handles the SCStream.

Here is how I'm configuring my SCStream:

ScreenRecorder.swift

    // This is my main SCStreamConfiguration
    private var streamConfiguration: SCStreamConfiguration {
        var streamConfig = SCStreamConfiguration()
        // ... other HDR/preset config ...

        // These are the problem properties
        streamConfig.capturesAudio = isAudioCaptureEnabled
        streamConfig.captureMicrophone = isMicCaptureEnabled // breaks it if true
        streamConfig.excludesCurrentProcessAudio = false
        streamConfig.showsCursor = false

        if let region = selectedRegion, let display = currentDisplay {
            // My region/frame logic (works fine)
            let regionWidth = Int(region.frame.width)
            let regionHeight = Int(region.frame.height)
            streamConfig.width = regionWidth * scaleFactor
            streamConfig.height = regionHeight * scaleFactor
            // ... (sourceRect logic) ...
        }

        streamConfig.pixelFormat = kCVPixelFormatType_32BGRA
        streamConfig.colorSpaceName = CGColorSpace.sRGB
        streamConfig.minimumFrameInterval = CMTime(value: 1, timescale: 60)
        return streamConfig
    }

And here is how I'm setting up the SCRecordingOutput that writes the file:

ScreenRecorder.swift

    private func initRecordingOutput(for region: ScreenPickerManager.SelectedRegion) throws {
        let screeRecordingOutputURL = try RecordingWorkspace.createScreenRecordingVideoFile(
            in: workspaceURL,
            sessionIndex: sessionIndex
        )

        let recordingConfiguration = SCRecordingOutputConfiguration()
        recordingConfiguration.outputURL = screeRecordingOutputURL
        recordingConfiguration.outputFileType = .mp4
        recordingConfiguration.videoCodecType = .hevc

        let recordingOutput = SCRecordingOutput(configuration: recordingConfiguration, delegate: self)
        self.recordingOutput = recordingOutput
    }

Finally, my CaptureEngine adds these to the SCStream:

CaptureEngine.swift

    class CaptureEngine: NSObject, @unchecked Sendable {
        private(set) var stream: SCStream?
        private var streamOutput: CaptureEngineStreamOutput?
        // ... (dispatch queues) ...

        func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
            let streamOutput = CaptureEngineStreamOutput()
            self.streamOutput = streamOutput

            do {
                stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)

                // Add outputs for raw buffers (not used for file recording)
                try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
                try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
                try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)

                // Add the file recording output
                try stream?.addRecordingOutput(recordingOutput)

                try await stream?.startCapture()
            } catch {
                logger.error("Failed to start capture: \(error.localizedDescription)")
                throw error
            }
        }

        // ... (stopCapture, etc.) ...
    }

When captureMicrophone is false, I get a perfect .mp4 that plays everywhere; when it is true, I get a corrupted video that doesn't play at all.
0 replies · 0 boosts · 260 views · Nov ’25
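One thing worth checking in a setup like the one above (offered only as a sketch, under the assumption that the file ends up corrupted because the recording output is never finalized): make sure the stream is stopped and allowed to flush before the CaptureEngine or its outputs are torn down, and watch the SCRecordingOutputDelegate callbacks for an error when the microphone is enabled. A minimal stop sequence might look like this; names other than the ScreenCaptureKit APIs are placeholders.

```swift
// Sketch of a teardown that lets ScreenCaptureKit finalize the movie file
// before the stream and outputs are released.
func stopCapture() async {
    do {
        try await stream?.stopCapture()   // flushes and finalizes the SCRecordingOutput
    } catch {
        print("Failed to stop capture: \(error.localizedDescription)")
    }
    stream = nil
    streamOutput = nil
}
```

It may also be worth testing whether removing the separate .microphone stream output (while keeping captureMicrophone = true on the configuration) changes the result, to rule out an interaction between the raw-buffer output and the file recording output.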
Where are Hugging Face models downloaded by Swift MLX apps cached?
I'm downloading a fine-tuned model from Hugging Face, which is then cached on my Mac when the app first starts. However, I want to test adding a progress bar to show the download progress, and to test this I need to delete the cached model. From what I've seen online, it should be cached at /Users/userName/.cache/huggingface/hub. However, if I delete the files from there using Terminal, the app still seems to be able to access the model. Is the model cached somewhere else? On my iPhone, deleting the app also deletes the cached model (app data), so that is useful.
0 replies · 0 boosts · 394 views · Oct ’25
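For what it's worth, here is a sketch of one way to check where the model actually lives, assuming the MLX sample code uses the Hub client from swift-transformers, whose default download location is under the app's Documents directory rather than ~/.cache/huggingface (and, for a sandboxed Mac app, that Documents directory sits inside ~/Library/Containers/<bundle-id>/Data). The "huggingface" folder name below is an assumption about that default; verify it before relying on it.

```swift
import Foundation

// Candidate cache root: <Documents>/huggingface (assumed swift-transformers default).
let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
let hubRoot = documents.appendingPathComponent("huggingface")

print("Looking for cached models under:", hubRoot.path)

if FileManager.default.fileExists(atPath: hubRoot.path) {
    // Deleting this folder should force a fresh download (and exercise the progress bar).
    try? FileManager.default.removeItem(at: hubRoot)
    print("Deleted cached models at", hubRoot.path)
} else {
    print("Nothing cached here; the Hub client may be configured with a different downloadBase.")
}
```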
AVAssetExportSession ignores frameDuration 60fps and exports at 30fps, but AVPlayer playback is correct
Hey everyone, I'm stuck on a really frustrating AVFoundation problem. I'm building a video editor that uses a custom AVVideoCompositor to add effects, and I need the final output to be 60 FPS.

So basically:

I create an AVMutableComposition to sequence my video clips.
I create an AVMutableVideoComposition and set the frame rate to 60 FPS: videoComposition.frameDuration = CMTime(value: 1, timescale: 60)
I assign my CustomVideoCompositor class to the videoComposition.
I create an AVPlayerItem with the composition and video composition.

The Problem:

Playback works: When I play the AVPlayerItem in an AVPlayer, it's perfect. It plays at a smooth 60 FPS, and my custom compositor's startRequest method is called 60 times per second.

Export fails: When I try to export the exact same composition and video composition using AVAssetExportSession, the final .mp4 file is always 30 FPS (or 29.97). I've logged inside my custom compositor during the export, and it's definitely being called 30 times per second, so it's only generating 30 frames per second. It seems like AVAssetExportSession is just dropping every other frame when it encodes the video.

My source videos are screen recordings made with ScreenCaptureKit itself, with the minimum frame interval set for 60 FPS.

Here is my export function. I'm using the AVAssetExportPresetHighestQuality preset:

    func exportVideo(to outputURL: URL) async throws {
        guard let composition = composition,
              let videoComposition = videoComposition else {
            throw VideoCompositionError.noValidVideos
        }

        try? FileManager.default.removeItem(at: outputURL)

        guard let exportSession = AVAssetExportSession(
            asset: composition,
            presetName: AVAssetExportPresetHighestQuality // Is this the problem?
        ) else {
            throw VideoCompositionError.trackCreationFailed
        }

        exportSession.outputFileType = .mp4
        exportSession.videoComposition = videoComposition // This has the 60 fps setting

        try await exportSession.export(to: outputURL, as: .mp4)
    }

I've created a bare-bones sample project that shows this exact bug in action. The resulting video is 60 FPS during playback, but only 30 FPS after export. https://github.com/zaidbren/SimpleEditor

My question: Why is AVAssetExportSession ignoring my 60 FPS frameDuration and defaulting to 30 FPS, even though AVPlayer respects it?
1 reply · 0 boosts · 357 views · Oct ’25
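If the preset turns out to be the limiting factor, one workaround people use is to bypass AVAssetExportSession and push the same video composition through an AVAssetReader/AVAssetWriter pair, where the frame rate follows the composition's frameDuration instead of a preset. The sketch below is untested scaffolding under that assumption; it ignores audio and real error handling.

```swift
import AVFoundation

func exportWithWriter(composition: AVComposition,
                      videoComposition: AVVideoComposition,
                      to outputURL: URL) async throws {
    let reader = try AVAssetReader(asset: composition)
    let videoTracks = try await composition.loadTracks(withMediaType: .video)

    // The reader output runs the custom compositor while decoding, one request per frameDuration.
    let readerOutput = AVAssetReaderVideoCompositionOutput(
        videoTracks: videoTracks,
        videoSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    )
    readerOutput.videoComposition = videoComposition
    reader.add(readerOutput)

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.hevc,
        AVVideoWidthKey: videoComposition.renderSize.width,
        AVVideoHeightKey: videoComposition.renderSize.height
    ])
    writerInput.expectsMediaDataInRealTime = false
    writer.add(writerInput)

    guard reader.startReading() else { throw reader.error ?? CocoaError(.fileReadUnknown) }
    guard writer.startWriting() else { throw writer.error ?? CocoaError(.fileWriteUnknown) }
    writer.startSession(atSourceTime: .zero)

    // Pump every composed frame from the reader into the writer.
    await withCheckedContinuation { (continuation: CheckedContinuation<Void, Never>) in
        let queue = DispatchQueue(label: "export.writer")
        writerInput.requestMediaDataWhenReady(on: queue) {
            while writerInput.isReadyForMoreMediaData {
                if let sample = readerOutput.copyNextSampleBuffer() {
                    _ = writerInput.append(sample)
                } else {
                    writerInput.markAsFinished()
                    continuation.resume()
                    return
                }
            }
        }
    }

    await writer.finishWriting()
}
```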
iMessages Deeplink App Switching for iOS 26.0
OK, so for some background: our app has a keyboard extension where we run a dictation service. Due to iOS limitations, this requires the user to press a button on the keyboard, which then brings the user to our app to activate an audio session. Once the audio session has been activated, it takes the user back to the original app to continue using the keyboard and dictation service.

The problem we're running into involves iOS 26.0 and the iMessages app. Whenever our app tries to switch back to the iMessages app using a deep link (specifically the messages:// URL), the iMessages app opens a new message compose sheet. This compose sheet replaces the view or message thread that the user was previously looking at, which we don't want. This behavior appears to happen only on iOS 26 and not on any of the previous iOS versions (tested up to iOS 18.6).

We know that it should be possible to bring the user back to the Messages app without opening this new compose sheet, because similar apps do the same thing and have been verified to work on iOS 26. We've also tried using the sms:// URL, but that always opens a new message compose sheet regardless of whether or not it's iOS 26.0.
3 replies · 0 boosts · 156 views · Oct ’25
AVFoundation Custom Video Compositor Skipping Frames During AVPlayer Playback Despite 60 FPS Frame Duration
I'm building a Swift video editor with AVFoundation and a custom compositor. Despite setting AVVideoComposition.frameDuration to 60 FPS, I'm seeing significant frame skipping during playback.

Console Output Shows Frame Skipping

    Frame #0 at 0.0 ms (fps: 60.0)
    Frame #2 at 33.333333333333336 ms (fps: 60.0)
    Frame #6 at 100.0 ms (fps: 60.0)
    Frame #10 at 166.66666666666666 ms (fps: 60.0)
    Frame #32 at 533.3333333333334 ms (fps: 60.0)
    Frame #62 at 1033.3333333333335 ms (fps: 60.0)
    Frame #96 at 1600.0 ms (fps: 60.0)

Instead of frames every ~16.67 ms (60 FPS), I'm getting irregular intervals, sometimes 33 ms, 67 ms, or hundreds of milliseconds apart.

Renderer.swift (Key Parts)

    @MainActor
    class Renderer: ObservableObject {
        @Published var playerItem: AVPlayerItem?
        private let assetManager: ProjectAssetManager?
        private let compositorId: String

        func buildComposition() async {
            // ... load mouse moves/clicks data ...
            let composition = AVMutableComposition()
            let videoTrack = composition.addMutableTrack(
                withMediaType: .video,
                preferredTrackID: kCMPersistentTrackID_Invalid
            )
            var currentTime = CMTime.zero
            var layerInstructions: [AVMutableVideoCompositionLayerInstruction] = []

            // Insert video segments
            for videoURL in videoURLs {
                let asset = AVAsset(url: videoURL)
                let tracks = try await asset.loadTracks(withMediaType: .video)
                let assetVideoTrack = tracks.first
                let duration = try await asset.load(.duration)
                try videoTrack.insertTimeRange(
                    CMTimeRange(start: .zero, duration: duration),
                    of: assetVideoTrack,
                    at: currentTime
                )
                let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
                let transform = try await assetVideoTrack.load(.preferredTransform)
                layerInstruction.setTransform(transform, at: currentTime)
                layerInstructions.append(layerInstruction)
                currentTime = CMTimeAdd(currentTime, duration)
            }

            let videoComposition = AVMutableVideoComposition()
            videoComposition.frameDuration = CMTime(value: 1, timescale: 60) // 60 FPS

            // Set render size from first video
            if let firstURL = videoURLs.first {
                let firstAsset = AVAsset(url: firstURL)
                let firstTrack = try await firstAsset.loadTracks(withMediaType: .video).first
                let naturalSize = try await firstTrack.load(.naturalSize)
                let transform = try await firstTrack.load(.preferredTransform)
                videoComposition.renderSize = CGSize(
                    width: abs(naturalSize.applying(transform).width),
                    height: abs(naturalSize.applying(transform).height)
                )
            }

            let instruction = CompositorInstruction()
            instruction.timeRange = CMTimeRange(start: .zero, duration: currentTime)
            instruction.layerInstructions = layerInstructions
            instruction.compositorId = compositorId
            videoComposition.instructions = [instruction]
            videoComposition.customVideoCompositorClass = CustomVideoCompositor.self

            let playerItem = AVPlayerItem(asset: composition)
            playerItem.videoComposition = videoComposition
            self.playerItem = playerItem
        }
    }

    class CompositorInstruction: NSObject, AVVideoCompositionInstructionProtocol {
        var timeRange: CMTimeRange = .zero
        var enablePostProcessing: Bool = false
        var containsTweening: Bool = false
        var requiredSourceTrackIDs: [NSValue]?
        var passthroughTrackID: CMPersistentTrackID = kCMPersistentTrackID_Invalid
        var layerInstructions: [AVVideoCompositionLayerInstruction] = []
        var compositorId: String = ""
    }

    class CustomVideoCompositor: NSObject, AVVideoCompositing {
        var sourcePixelBufferAttributes: [String: Any]? = [
            kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)
        ]
        var requiredPixelBufferAttributesForRenderContext: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)
        ]

        func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

        func startRequest(_ asyncVideoCompositionRequest: AVAsynchronousVideoCompositionRequest) {
            guard let sourceTrackID = asyncVideoCompositionRequest.sourceTrackIDs.first?.int32Value,
                  let sourcePixelBuffer = asyncVideoCompositionRequest.sourceFrame(byTrackID: sourceTrackID),
                  let outputBuffer = asyncVideoCompositionRequest.renderContext.newPixelBuffer() else {
                asyncVideoCompositionRequest.finish(with: NSError(domain: "VideoCompositor", code: -1))
                return
            }

            let videoComposition = asyncVideoCompositionRequest.renderContext.videoComposition
            let frameDuration = videoComposition.frameDuration
            let fps = Double(frameDuration.timescale) / Double(frameDuration.value)

            let compositionTime = asyncVideoCompositionRequest.compositionTime
            let seconds = CMTimeGetSeconds(compositionTime)
            let frameInMilliseconds = seconds * 1000
            let frameNumber = Int(round(seconds * fps))

            print("Frame #\(frameNumber) at \(frameInMilliseconds) ms (fps: \(fps))")

            asyncVideoCompositionRequest.finish(withComposedVideoFrame: outputBuffer)
        }

        func cancelAllPendingVideoCompositionRequests() {}
    }

VideoPlayerViewModel

    @MainActor
    class VideoPlayerViewModel: ObservableObject {
        let player = AVPlayer()
        private let renderer: Renderer

        func loadVideo() async {
            await renderer.buildComposition()
            if let playerItem = renderer.playerItem {
                player.replaceCurrentItem(with: playerItem)
            }
        }
    }

What I've Tried

Frame skipping is consistent: exact same timestamps on every playback.
Issue persists even with minimal processing (just passing through buffers).
Occurs regardless of compositor complexity.

Please note that I need every frame at exact millisecond intervals for my application. Frame loss or inconsistent frameInMillisecond values are not acceptable.
1 reply · 0 boosts · 287 views · Oct ’25
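As context for anyone hitting the same thing: during live playback, AVPlayer is allowed to drop composition requests when rendering falls behind, so a custom compositor is not guaranteed to be asked for every 1/60 s frame, and the skipped frame numbers above are consistent with that. If every frame genuinely has to be produced, one option is to pull frames offline through AVAssetReader with the same video composition, where nothing is dropped. A rough sketch under that assumption, separate from the playback path:

```swift
import AVFoundation

func processEveryFrame(composition: AVComposition,
                       videoComposition: AVVideoComposition) async throws {
    let reader = try AVAssetReader(asset: composition)
    let videoTracks = try await composition.loadTracks(withMediaType: .video)

    // Drives the custom compositor once per frameDuration step, independent of
    // real-time rendering pressure.
    let output = AVAssetReaderVideoCompositionOutput(
        videoTracks: videoTracks,
        videoSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    )
    output.videoComposition = videoComposition
    reader.add(output)

    guard reader.startReading() else {
        throw reader.error ?? CocoaError(.fileReadUnknown)
    }

    while let sample = output.copyNextSampleBuffer() {
        let pts = CMSampleBufferGetPresentationTimeStamp(sample)
        print("Composed frame at \(CMTimeGetSeconds(pts) * 1000) ms")
        // Hand the composed pixel buffer to per-frame processing here.
    }
}
```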
Complications not showing up after watchOS 26
We are having an issue with our app after upgrading to watchOS 26. After some time, our complication disappears from the watch face. If you go to add it back, our app shows up with an empty icon placeholder and you are unable to tap it to add it back. Sometimes restarting the watch brings it back; sometimes it does not. Has anyone experienced this? What should we be looking at to figure out why this is happening? Or could this be a bug in watchOS 26?
3 replies · 1 boost · 206 views · Oct ’25
Compile Failure on NSXPCInterface Initializer
I have a project that leverages XPC and has interoperability between Swift and Objective-C++. I am presently getting a compile-time error in one of our unit test targets, "Argument passed to call that takes no arguments", on the following code:

    let interface = NSXPCInterface(with: XPCServiceDelegate.self)

My XPCServiceDelegate protocol is defined as:

    @objc(XPCServiceDelegate)
    public protocol XPCServiceDelegate {
        //...
    }

For the longest time, this code has compiled successfully, and it has not recently changed. There are two confusing things about this error. The first is that I have a different build scheme that correctly compiles other code with the same structure. The other is that I have team members who are able to compile my failing scheme successfully on the same Xcode version, OS version, and branch of our repository.

I've attempted numerous things to try to get this code to compile, but I've run out of ideas. Here's what I've tried:

Clean build on both Xcode 16.4 and Xcode 26 Beta
Delete DerivedData and rebuild on Xcode 16.4 and Xcode 26 Beta
Delete and re-clone our git repository
Uninstall and reinstall Xcode
Attempt to locate cached data for Xcode and clear it out (I'm not sure if I got everything that exists on the system for this)
Ensure all OS and Xcode updates have been applied

The interface specification for NSXPCInterface clearly has an initializer with one argument for the delegate protocol, so I don't know why the compiler would fail here. Is there some kind of forward declaration or shadowing of NSXPCInterface? Do you have any ideas on what I could try next?
12 replies · 0 boosts · 202 views · Oct ’25
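One quick diagnostic for the situation described above, offered only as a guess at a shadowing problem: fully qualify the initializer so the compiler cannot resolve NSXPCInterface to some other declaration that happens to be visible in that test target.

```swift
// If this compiles while the unqualified spelling does not, something visible to the
// test target (a dependency, a generated interface, a local type) is shadowing
// Foundation's NSXPCInterface.
let interface = Foundation.NSXPCInterface(with: XPCServiceDelegate.self)
```

Command-clicking the failing symbol in the editor to see which module Xcode actually resolves it to can confirm the same thing.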
Avoid repeated authorization dialogs when changing network settings
I have a Swift command-line tool that changes proxy settings in System Preferences via the SystemConfiguration framework, does some stuff, and in the end reverts the proxy settings back to the original values. Here is simplified code:

    var authorization: AuthorizationRef?
    let status = AuthorizationCreate(nil, nil, [], &authorization)

    let prefs = SCPreferencesCreateWithAuthorization(nil, "myapp" as CFString, nil, authorization)
    // change proxy settings

    // do some stuff

    let prefs2 = SCPreferencesCreateWithAuthorization(nil, "myapp" as CFString, nil, authorization)
    // change proxy settings back to original

When I try to change settings for the first time, the system dialog appears requesting permission to change network settings. If I try to change settings again within a short period of time, the dialog does not appear again. However, if more than several minutes pass after the first change, the dialog does appear again. Is there a way to create the authorization so that the dialog appears only once per app launch, no matter how much time has passed since the first dialog?
1 reply · 0 boosts · 281 views · Oct ’25
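One avenue worth exploring, sketched below and not verified: pre-acquire the relevant authorization right once at launch, with the extend-rights and pre-authorize flags, and keep using the same AuthorizationRef for both SCPreferences calls. The right name used here ("system.services.systemconfiguration.network") is an assumption, and whether the cached credential survives the whole run still depends on that right's timeout in the authorization policy database.

```swift
import Foundation
import Security

var authorization: AuthorizationRef?
guard AuthorizationCreate(nil, nil, [], &authorization) == errAuthorizationSuccess,
      let authRef = authorization else { exit(EXIT_FAILURE) }

// Assumed right name for SCPreferences network changes; verify before relying on it.
"system.services.systemconfiguration.network".withCString { name in
    var item = AuthorizationItem(name: name, valueLength: 0, value: nil, flags: 0)
    withUnsafeMutablePointer(to: &item) { itemPtr in
        var rights = AuthorizationRights(count: 1, items: itemPtr)
        let flags: AuthorizationFlags = [.interactionAllowed, .extendRights, .preAuthorize]
        let status = AuthorizationCopyRights(authRef, &rights, nil, flags, nil)
        print("Pre-authorization status: \(status)") // errAuthorizationSuccess == 0
    }
}

// Pass authRef to SCPreferencesCreateWithAuthorization as before.
```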
How can I remove a localization from a String Catalog in a Swift Package?
Hello, I'm trying to remove a localization from a String Catalog in a Swift Package. How can I do that? I tried to remove the file and create a new one, but all the languages come back. The only place where I've found a reference to the languages is in Package/.swiftpm/xcode/package.xcworkspace/xcuserdata/user.xcuserdatad/UserInterfaceState.xcuserstate, but I don't know how to edit this file to remove a language. Thank you, Axel
2 replies · 0 boosts · 141 views · Oct ’25
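Since a String Catalog is just a JSON file on disk, one way to strip a language that keeps reappearing in the Xcode UI is to remove its entries from the .xcstrings file directly and commit that. A rough sketch follows, assuming the standard catalog layout ("strings" → key → "localizations" → language code); the path and the "fr" language code are placeholders, and it's worth backing up the file first.

```swift
import Foundation

// Placeholder path and language; point these at the real catalog and the locale to drop.
let catalogURL = URL(fileURLWithPath: "Sources/MyPackage/Resources/Localizable.xcstrings")
let languageToRemove = "fr"

let data = try Data(contentsOf: catalogURL)
var catalog = try JSONSerialization.jsonObject(with: data) as! [String: Any]

if var strings = catalog["strings"] as? [String: Any] {
    for (key, value) in strings {
        guard var entry = value as? [String: Any],
              var localizations = entry["localizations"] as? [String: Any] else { continue }
        localizations.removeValue(forKey: languageToRemove)   // drop the unwanted language
        entry["localizations"] = localizations
        strings[key] = entry
    }
    catalog["strings"] = strings
}

let output = try JSONSerialization.data(withJSONObject: catalog,
                                        options: [.prettyPrinted, .sortedKeys])
try output.write(to: catalogURL)
```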
ManipulationComponent Not Translating using indirect input
When using the new RealityKit Manipulation Component on entities, indirect input will never translate the entity, no matter what settings are applied. Direct manipulation works as expected for both translation and rotation. Is this intended behaviour? This is different from how indirect manipulation works on Model3D. How else can we get translation from this component?

visionOS 26 Beta 2, built from macOS 26 Beta 2 and Xcode 26 Beta 2.

Attached is reproducible sample code; I have tried this in other projects with the same results.

    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "MovieFilmReel", in: reelRCPBundle) {
                ManipulationComponent.configureEntity(
                    immersiveContentEntity,
                    allowedInputTypes: .all,
                    collisionShapes: [ShapeResource.generateBox(width: 0.2, height: 0.2, depth: 0.2)]
                )
                immersiveContentEntity.position.y = 1
                immersiveContentEntity.position.z = -0.5

                var mc = ManipulationComponent()
                mc.releaseBehavior = .stay
                immersiveContentEntity.components.set(mc)

                content.add(immersiveContentEntity)
            }
        }
    }
11 replies · 3 boosts · 1k views · Oct ’25
Add a value to the Photos Caption field
In the iOS Photos app there is a caption field the user can write to. How can you write to this value from Swift when creating a photo? I see apps that do this, but there doesn't seem to be any official way to do it through the Photos library, whether via PHAssetCreationRequest, PHAssetResourceCreationOptions, or by setting EXIF values. I tried setting a bunch of values there, including IPTC values, but nothing appears in the caption field in the iOS Photos app. There must be some way to do it, since I see other apps setting that value somehow after capturing a photo.
1 reply · 1 boost · 123 views · Oct ’25
Prevent default file selector in a SwiftUI DocumentGroup app and show a custom welcome window on launch
I’m building a macOS document-based app using SwiftUI’s DocumentGroup API. By default, when a document-based app launches, macOS automatically shows a file open panel or creates a new untitled document window. However, I want to suppress this default behavior and instead show a custom welcome window when the app starts, something similar to how Xcode or Final Cut Pro shows a “Welcome” or “Start Project” screen first. So basically, when the user opens the app normally, it should not open the document selector or create a document automatically. Instead, it should show my custom SwiftUI or AppKit window.

Here is my code:

    // MyApp.swift
    import SwiftUI
    import AppKit

    @main
    struct PhiaApp: App {
        @NSApplicationDelegateAdaptor(AppDelegate.self) var appDelegate

        var body: some Scene {
            DocumentGroup(newDocument: MyDocumentModel()) { file in
                EditorView(document: file.document, filePath: file.fileURL)
            }
            Settings {
                EmptyView()
            }
        }
    }

Currently I have this setup for my MainApp.swift, where I am using the AppDelegate to create a custom recording window with AppKit and also defining the DocumentGroup to handle opening custom .myapp files. However, when I launch the app, it shows my AppKit window as well as the native macOS file selector for choosing the file to open. I want the app, when opened normally, to show only my custom SwiftUI or AppKit window, without opening the document selector or creating a document automatically. However, the app should still fully support opening .myapp documents by double-clicking in Finder, using the standard File → Open and File → New menu options, and having multiple document windows open at once.

This is my AppDelegate.swift file:

    import AppKit
    import SwiftUI

    class AppDelegate: NSObject, NSApplicationDelegate {
        var panel: Panel?
        private var statusItem: NSStatusItem?

        func applicationDidFinishLaunching(_ notification: Notification) {
            showWindow()
        }

        // MARK: - Window control
        func showWindow() {
            if panel == nil {
                let root = RecordingViewMain()
                let newPanel = Panel(rootView: root)
                if let screen = NSScreen.main {
                    let size = NSSize(width: 360, height: 240)
                    let origin = NSPoint(
                        x: screen.visibleFrame.midX - size.width / 2,
                        y: screen.visibleFrame.midY - size.height / 2
                    )
                    newPanel.setFrame(NSRect(origin: origin, size: size), display: true)
                }
                panel = newPanel
            }
            panel?.makeKeyAndOrderFront(nil)
        }

        func hideWindow() {
            panel?.orderOut(nil)
        }

        @objc private func showPanelAction() {
            showWindow()
        }

        @objc private func quitAction() {
            NSApp.terminate(nil)
        }
    }
2 replies · 0 boosts · 189 views · Oct ’25
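One thing that may be worth trying in the AppDelegate above, offered as an unverified sketch: on macOS, DocumentGroup still routes launch behavior through the document machinery, and the application delegate is asked whether an untitled document (or the open panel) should appear at launch. Returning false there is the classic AppKit way to suppress the automatic document UI while keeping Finder double-clicks and the File menu working; whether it covers every path with SwiftUI's DocumentGroup needs testing on your setup.

```swift
extension AppDelegate {
    // Skip the untitled document / open panel on a plain launch; the custom
    // welcome panel from applicationDidFinishLaunching is shown instead.
    func applicationShouldOpenUntitledFile(_ sender: NSApplication) -> Bool {
        return false
    }
}
```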
UIViewController memory leak with modal presentedViewController
Hi everyone, I'm encountering an unexpected behavior with modal presentations in UIKit. Here’s what happens:

I have UIViewControllerA (let’s call it the "orange" VC) pushed onto a UINavigationController stack.
I present UIViewControllerB (the "red" VC, inside its own UINavigationController as a .formSheet) modally over UIViewControllerA.
After a short delay, I pop UIViewControllerA from the navigation stack.

Issue: After popping UIViewControllerA, the modal UIViewControllerB remains visible on the screen and in memory. I expected that dismissing (popping) the presenting view controller would also dismiss the modal, but it stays.

Expected Behavior: When UIViewControllerA (orange) is popped, I expect the modal UIViewControllerB (red) to be dismissed as well.

Actual Behavior: The modal UIViewControllerB remains on screen and is not dismissed, even though its presenting view controller has been removed from the navigation stack.

Video example: https://youtube.com/shorts/sttbd6p_r_c

Question: Is this the expected behavior? If so, what is the recommended way to ensure that the modal is dismissed when its presenting view controller is removed from the navigation stack?

Code snippet:

    class MainVC: UIViewController {
        private weak var orangeVC: UIViewController?

        override func viewDidLoad() {
            super.viewDidLoad()
            self.view.backgroundColor = .blue

            let dq = DispatchQueue.main
            dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                let vc1 = UIViewController()
                vc1.view.backgroundColor = .orange
                vc1.modalPresentationStyle = .overCurrentContext
                self?.navigationController?.pushViewController(vc1, animated: true)
                self?.orangeVC = vc1

                dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                    let vc2 = UIViewController()
                    vc2.view.backgroundColor = .red
                    vc2.modalPresentationStyle = .formSheet
                    vc2.isModalInPresentation = true
                    let nav = UINavigationController(rootViewController: vc2)
                    if let sheet = nav.sheetPresentationController {
                        sheet.detents = [.medium()]
                    }
                    self?.orangeVC?.present(nav, animated: true)

                    dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                        self?.navigationController?.popViewController(animated: true)
                    }
                }
            }
        }
    }

Thank you for your help!
0 replies · 0 boosts · 94 views · Oct ’25
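For completeness, a sketch of the usual way to make the sheet go away in a setup like this. UIKit ties a modal presentation to the presentation hierarchy rather than to the navigation stack, so popping the presenting controller does not automatically dismiss what it presented; dismissing explicitly before the pop is the straightforward fix. Adapt the names to the real controllers involved.

```swift
// Inside MainVC: tear down anything the orange controller presented, then pop it.
func popOrangeVC() {
    if let presented = orangeVC?.presentedViewController {
        presented.dismiss(animated: false) { [weak self] in
            self?.navigationController?.popViewController(animated: true)
        }
    } else {
        navigationController?.popViewController(animated: true)
    }
}
```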