ScreenCaptureKit

ScreenCaptureKit brings high-performance screen capture, including audio and video, to macOS.

Posts under ScreenCaptureKit tag

38 Posts

Issue: Plain Executables Do Not Appear Under “Screen & System Audio Recording” on macOS 26.1 (Tahoe)
Summary

I am investigating a change in macOS 26.1 (Tahoe) where plain (non-bundled) executables that request screen recording access no longer appear under System Settings → Privacy & Security → Screen & System Audio Recording. This behavior differs from macOS Sequoia, where these executables did appear in the list and could be managed through the UI. Tahoe still prompts for permission and still allows the executable to capture the screen once permission is granted, but the executable never shows up in the UI list. This breaks user expectations and removes UI-based permission management.

To confirm the behavior, I created a small reproduction project with both a plain executable and an identical executable packaged inside an .app bundle. Only the bundled version appears in System Settings.

Observed Behavior

1. Plain executable (from my reproduction project). When running a plain executable that captures the screen:
- macOS displays the normal screen-recording permission prompt.
- Before granting permission, screenshots show only the desktop background.
- After granting permission, screenshots capture the full display.
- The executable does not appear under “Screen & System Audio Recording”.
- Even when permission is granted manually (e.g., by dragging the executable into the pane), the executable still does not appear, which prevents the user from modifying or revoking the permission through the UI.
- If the executable is launched from inside another app (e.g., VS Code, Terminal), the parent app appears in the list instead, not the executable itself.

2. Bundled app version (from the reproduction project). I packaged the same code into a simple .app bundle (ScreenCaptureApp.app). When running the app:
- The same permission prompt appears.
- Pre-permission screenshots show the desktop background.
- Post-permission screenshots capture the full display.
- The app does appear under “Screen & System Audio Recording”.
This bundle uses the same underlying executable; the only difference is packaging.

Hypothesis

macOS 26.1 (Tahoe) appears to require an app bundle for an item to be shown in the Screen Recording privacy UI. Plain executables still request and receive permission, and still function correctly after permission is granted, but do not appear in the System Settings list. This may be an intentional change, undocumented behavior, or a regression.

Reproduction Project

The reproduction project includes:
- screen_capture.go: a simple Go program that captures screenshots in a loop.
- screen_capture_executable: the plain executable built from the Go source.
- ScreenCaptureApp.app/: an app bundle containing the same executable.
- build.sh: builds both the plain executable and the app bundle.
- Permission reset and TCC testing scripts.
The project demonstrates the behavior consistently.

Steps to Reproduce

Plain executable:
1. Build: ./build.sh
2. Reset screen capture permissions: sudo tccutil reset ScreenCapture
3. Run: ./screen_capture_executable
4. Before granting: screenshots show the desktop only.
5. Grant permission when prompted.
6. After granting: full screenshots.
7. The executable does not appear in “Screen & System Audio Recording”.

Bundled app:
1. Build (if not already built): ./build.sh
2. Reset permissions (optional): sudo tccutil reset ScreenCapture
3. Run: open ScreenCaptureApp.app
4. Before granting: screenshots show the desktop.
5. After granting: full screenshots.
6. The app bundle appears in the System Settings list.

Additional Check

I also tested launching the plain executable as a child process of another executable, similar to how some software architectures work. Result:
- The permission prompt appears.
- Permission can be granted.
- The executable still does not appear in the UI, even though TCC tracks it internally, which is consistent with the plain-executable behavior above.
This reinforces that only app bundles are listed.

Questions for Apple

1. Is the removal of plain executables from “Screen & System Audio Recording” an intentional change in macOS Tahoe?
2. If so, does Apple now require all screen-recording-capable binaries to be packaged as .app bundles for the UI to display them?
3. Is there a supported method for making a plain executable (launched by a parent process) appear in the list?
4. If this is not intentional, what is the recommended path for reporting it as a regression?

Files

Unfortunately, the zip file containing my reproduction project can't be uploaded here directly, so here is a Google Drive link instead: https://drive.google.com/file/d/1sXsr3Q0g6_UzlOIL54P5wbS7yBkpMJ7A/view?usp=sharing

Thank you for taking the time to review this. Any insight into whether this change is intentional or a regression would be very helpful.
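For anyone reproducing this without the Go project, a minimal Swift sketch of the same permission round-trip from an unbundled binary is below. It only uses the CoreGraphics preflight/request calls (available since macOS 10.15); the file name is illustrative, not part of the original reproduction project. Note that, as described above, a binary launched from Terminal will have the grant attributed to Terminal rather than to the binary itself.

    import CoreGraphics
    import Foundation

    // check_capture.swift (hypothetical file name): compile with `swiftc check_capture.swift`
    // and run the bare binary to observe the prompt/grant behavior described above.
    if CGPreflightScreenCaptureAccess() {
        print("Screen recording access already granted")
    } else {
        // Shows the system prompt the first time; returns false if access is not (yet) granted.
        let granted = CGRequestScreenCaptureAccess()
        print("Access granted after request: \(granted)")
    }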
Replies: 1 · Boosts: 0 · Views: 41 · Activity: 53m
ScreenCaptureKit recording output is corrupted when captureMicrophone is true
Hello everyone, I'm working on a screen recording app using ScreenCaptureKit and I've hit a strange issue. My app records the screen to an .mp4 file, and everything works perfectly as long as .captureMicrophone is false; in that case I get a valid, playable .mp4 file. However, as soon as I enable the microphone by setting streamConfig.captureMicrophone = true, the recording seems to work, but the final .mp4 file is corrupted and cannot be played by QuickTime or any other player. This happens whether capturesAudio (app audio) is on or off. I've already added the "Privacy - Microphone Usage Description" (NSMicrophoneUsageDescription) to my Info.plist, so I don't think it's a permissions problem.

I have my logic split into a ScreenRecorder class that manages state and a CaptureEngine that handles the SCStream. Here is how I'm configuring my SCStream:

ScreenRecorder.swift

    // This is my main SCStreamConfiguration
    private var streamConfiguration: SCStreamConfiguration {
        var streamConfig = SCStreamConfiguration()
        // ... other HDR/preset config ...

        // These are the problem properties
        streamConfig.capturesAudio = isAudioCaptureEnabled
        streamConfig.captureMicrophone = isMicCaptureEnabled // breaks it if true
        streamConfig.excludesCurrentProcessAudio = false
        streamConfig.showsCursor = false

        if let region = selectedRegion, let display = currentDisplay {
            // My region/frame logic (works fine)
            let regionWidth = Int(region.frame.width)
            let regionHeight = Int(region.frame.height)
            streamConfig.width = regionWidth * scaleFactor
            streamConfig.height = regionHeight * scaleFactor
            // ... (sourceRect logic) ...
        }

        streamConfig.pixelFormat = kCVPixelFormatType_32BGRA
        streamConfig.colorSpaceName = CGColorSpace.sRGB
        streamConfig.minimumFrameInterval = CMTime(value: 1, timescale: 60)
        return streamConfig
    }

And here is how I'm setting up the SCRecordingOutput that writes the file:

ScreenRecorder.swift

    private func initRecordingOutput(for region: ScreenPickerManager.SelectedRegion) throws {
        let screeRecordingOutputURL = try RecordingWorkspace.createScreenRecordingVideoFile(
            in: workspaceURL,
            sessionIndex: sessionIndex
        )
        let recordingConfiguration = SCRecordingOutputConfiguration()
        recordingConfiguration.outputURL = screeRecordingOutputURL
        recordingConfiguration.outputFileType = .mp4
        recordingConfiguration.videoCodecType = .hevc
        let recordingOutput = SCRecordingOutput(configuration: recordingConfiguration, delegate: self)
        self.recordingOutput = recordingOutput
    }

Finally, my CaptureEngine adds these to the SCStream:

CaptureEngine.swift

    class CaptureEngine: NSObject, @unchecked Sendable {
        private(set) var stream: SCStream?
        private var streamOutput: CaptureEngineStreamOutput?
        // ... (dispatch queues) ...

        func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
            let streamOutput = CaptureEngineStreamOutput()
            self.streamOutput = streamOutput
            do {
                stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)

                // Add outputs for raw buffers (not used for file recording)
                try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
                try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
                try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)

                // Add the file recording output
                try stream?.addRecordingOutput(recordingOutput)

                try await stream?.startCapture()
            } catch {
                logger.error("Failed to start capture: \(error.localizedDescription)")
                throw error
            }
        }
        // ... (stopCapture, etc.) ...
    }

When .captureMicrophone is false, I get a perfect .mp4 video that plays everywhere; when it's true, the resulting video is corrupted and doesn't play at all.
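While waiting for answers, one way to narrow this down is to vary the pieces that change when the microphone track is added. The sketch below is a hedged diagnostic, not a confirmed fix: it records into a QuickTime (.mov) container instead of .mp4 and pins an explicit microphone device via microphoneCaptureDeviceID; outputURL is a placeholder for your own file URL.

    import ScreenCaptureKit
    import AVFoundation

    // Diagnostic sketch only: isolate the container and the implicit microphone choice.
    func makeMicDiagnosticConfig(outputURL: URL) -> (SCStreamConfiguration, SCRecordingOutputConfiguration) {
        let streamConfig = SCStreamConfiguration()
        streamConfig.capturesAudio = true
        streamConfig.captureMicrophone = true
        // Explicitly select the default microphone instead of relying on the implicit default device.
        streamConfig.microphoneCaptureDeviceID = AVCaptureDevice.default(for: .audio)?.uniqueID

        let recordingConfig = SCRecordingOutputConfiguration()
        recordingConfig.outputURL = outputURL
        recordingConfig.outputFileType = .mov   // QuickTime container instead of .mp4
        recordingConfig.videoCodecType = .hevc
        return (streamConfig, recordingConfig)
    }

If the .mov file finalizes correctly while the .mp4 does not, that narrows the problem to container finalization rather than the microphone capture itself.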
Replies: 0 · Boosts: 0 · Views: 240 · Activity: 2w
Value of type 'SCRecordingOutput' has no member 'delegate'
Hello, I am trying to capture a screen recording (output.mp4) using ScreenCaptureKit along with the mouse positions during the recording (mouse.json). The recording and the mouse positions (tracked from mouse-movement events only) need to be perfectly synced in order to add effects in post editing. I started off by calling await stream?.startCapture() and then starting my mouse tracking function:

    try await captureEngine.startCapture(configuration: config, filter: filter, recordingOutput: recordingOutput)
    let captureStartTime = Date()
    mouseTracker?.startTracking(with: captureStartTime)

But every time I tested, there was a clear inconsistency between the recorded video and the recorded mouse positions. What I want is to know when the recording "actually" started so that I can start the mouse capture at that same moment, so I tried using the delegates, but I haven't been able to set them up properly.

    import Foundation
    import AVFAudio
    import ScreenCaptureKit
    import OSLog
    import Combine

    class CaptureEngine: NSObject, @unchecked Sendable {
        private let logger = Logger()
        private(set) var stream: SCStream?
        private var streamOutput: CaptureEngineStreamOutput?
        private var recordingOutput: SCRecordingOutput?
        private let videoSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.VideoSampleBufferQueue")
        private let audioSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.AudioSampleBufferQueue")
        private let micSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.MicSampleBufferQueue")

        func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
            // Create the stream output delegate.
            let streamOutput = CaptureEngineStreamOutput()
            self.streamOutput = streamOutput
            do {
                stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)
                try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
                try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
                try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)
                self.recordingOutput = recordingOutput
                recordingOutput.delegate = self
                try stream?.addRecordingOutput(recordingOutput)
                try await stream?.startCapture()
            } catch {
                logger.error("Failed to start capture: \(error.localizedDescription)")
                throw error
            }
        }

        func stopCapture() async throws {
            do {
                try await stream?.stopCapture()
            } catch {
                logger.error("Failed to stop capture: \(error.localizedDescription)")
                throw error
            }
        }

        func update(configuration: SCStreamConfiguration, filter: SCContentFilter) async {
            do {
                try await stream?.updateConfiguration(configuration)
                try await stream?.updateContentFilter(filter)
            } catch {
                logger.error("Failed to update the stream session: \(String(describing: error))")
            }
        }

        func stopRecordingOutputForStream(_ recordingOutput: SCRecordingOutput) throws {
            try self.stream?.removeRecordingOutput(recordingOutput)
        }
    }

    // MARK: - SCRecordingOutputDelegate
    extension CaptureEngine: SCRecordingOutputDelegate {
        func recordingOutputDidStartRecording(_ recordingOutput: SCRecordingOutput) {
            let startTime = Date()
            logger.info("Recording output did start recording \(startTime)")
        }

        func recordingOutputDidFinishRecording(_ recordingOutput: SCRecordingOutput) {
            logger.info("Recording output did finish recording")
        }

        func recordingOutput(_ recordingOutput: SCRecordingOutput, didFailWithError error: any Error) {
            logger.error("Recording output failed with error: \(error.localizedDescription)")
        }
    }

    private class CaptureEngineStreamOutput: NSObject, SCStreamOutput, SCStreamDelegate {
        private let logger = Logger()

        override init() {
            super.init()
        }

        func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of outputType: SCStreamOutputType) {
            guard sampleBuffer.isValid else { return }
            switch outputType {
            case .screen:
                break
            case .audio:
                break
            case .microphone:
                break
            @unknown default:
                logger.error("Encountered unknown stream output type:")
            }
        }

        func stream(_ stream: SCStream, didStopWithError error: Error) {
            logger.error("Stream stopped with error: \(error.localizedDescription)")
        }
    }

I am getting the error: Value of type 'SCRecordingOutput' has no member 'delegate', even though I am targeting macOS 15+ (macOS 26, actually) and macOS only. What is the best way to achieve the desired result? Is there another or better way to do it?
Replies: 1 · Boosts: 0 · Views: 184 · Activity: 4w
Recorded video looks blurry, color-washed, low bitrate, compressed using ScreenCaptureKit
Hello everyone, I am trying to integrate ScreenCaptureKit into my project. I am targeting macOS 26 and followed this official Apple sample project for ScreenCaptureKit: https://developer.apple.com/documentation/ScreenCaptureKit/capturing-screen-content-in-macos I used the exact official code and integrated it into my app, but the results are not good. The video looks blurry and unclear, the colors are washed out, and it is honestly closer to 720p. The first video frame is the result when I integrate it into my app. After that, I used another app (built in Electron, also using ScreenCaptureKit) and its results were a lot better; the second video frame is from a recording made with that application, and it appears very close to the system display. I tried multiple things, but with no impressive results. For my purposes, I want the final recorded video to be as good as the system's display quality. I also tried the HDR local-display and canonical-display dynamic-range options, but that did not help either, and I changed the output to .mov and the codec to .hevc with no improvement. Why is the recorded video not as high quality as the display?
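One common cause of soft, low-resolution-looking output is a stream whose width/height are configured in points rather than pixels, so the encoder only ever sees a smaller frame. A minimal sketch, assuming a display-based SCContentFilter and the macOS 14+ contentRect/pointPixelScale properties:

    import ScreenCaptureKit
    import CoreMedia
    import CoreVideo

    // Size the stream to the filter's pixel dimensions so the encoder gets a
    // full-resolution frame. Assumes `filter` was built for a display.
    func configureFullResolution(for filter: SCContentFilter) -> SCStreamConfiguration {
        let config = SCStreamConfiguration()
        let scale = CGFloat(filter.pointPixelScale)            // e.g. 2.0 on Retina
        config.width = Int(filter.contentRect.width * scale)   // pixels, not points
        config.height = Int(filter.contentRect.height * scale)
        config.captureResolution = .best
        config.showsCursor = true
        config.minimumFrameInterval = CMTime(value: 1, timescale: 60)
        config.pixelFormat = kCVPixelFormatType_32BGRA
        return config
    }

If the configuration is already sized in pixels, the remaining quality levers are the color space and pixel format on the stream and the codec chosen on the recording output.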
Replies: 3 · Boosts: 0 · Views: 321 · Activity: Oct ’25
Take correctly sized screenshots with ScreenCaptureKit
I've been using CGWindowListCreateImage, which automatically creates an image with the size of the captured window. But SCScreenshotManager.captureImage(contentFilter:configuration:) always creates images with the width and height specified in the provided SCStreamConfiguration. I could set the size explicitly by reading SCWindow.frame or SCContentFilter.contentRect and multiplying the width and height by SCContentFilter.pointPixelScale, but it won't work if I want to keep the window shadow with SCStreamConfiguration.ignoreShadowsSingleWindow = false. Is there a way, and what's the best way, to take full-resolution screenshots of the correct size?

    import Cocoa
    import ScreenCaptureKit

    class ViewController: NSViewController {
        @IBOutlet weak var imageView: NSImageView!

        override func viewDidAppear() {
            imageView.imageScaling = .scaleProportionallyUpOrDown
            view.wantsLayer = true
            view.layer!.backgroundColor = .init(red: 1, green: 0, blue: 0, alpha: 1)

            Task {
                let windows = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true).windows
                let window = windows[0]
                let filter = SCContentFilter(desktopIndependentWindow: window)
                let configuration = SCStreamConfiguration()
                configuration.ignoreShadowsSingleWindow = false
                configuration.showsCursor = false
                configuration.width = Int(Float(filter.contentRect.width) * filter.pointPixelScale)
                configuration.height = Int(Float(filter.contentRect.height) * filter.pointPixelScale)
                print(filter.contentRect)
                let windowImage = try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: configuration)
                imageView.image = NSImage(cgImage: windowImage, size: CGSize(width: windowImage.width, height: windowImage.height))
            }
        }
    }
Replies: 5 · Boosts: 0 · Views: 894 · Activity: Oct ’25
SCStreamUpdateFrameContentRect X coordinate always returns 48 instead of expected 0
SCStreamUpdateFrameContentRect X coordinate always returns 48 instead of expected 0

Environment
- Device: MacBook Pro 13-inch
- macOS: Sequoia 15.6.1
- Xcode: 16.4
- Framework: Screen Capture Kit

Issue Description

I'm experiencing an unexpected behavior with Screen Capture Kit where the SCStreamUpdateFrameContentRect X coordinate consistently returns 48 instead of the expected 0.

Code Context

I'm using SCContentSharingPicker to capture screen content and implementing the SCStreamOutput protocol to receive frame data. In my stream(_:didOutputSampleBuffer:of:) method, I'm extracting the content rect information from the sample buffer attachments:

    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of type: SCStreamOutputType) {
        switch type {
        case .screen:
            guard let attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]] else { return }
            guard let attachments = attachmentsArray.first else { return }
            if !attachments.keys.contains(.contentRect) { return }
            print(attachments)
            // X coordinate always shows 48
            /*
             __C.SCStreamFrameInfo(_rawValue: SCStreamUpdateFrameContentRect): {
                 Height = 540;
                 Width = 864;
                 X = 48;   <<-- unexpected value
                 Y = 0;
             }
             */
            return
        // ... other cases
        }
    }

Expected vs Actual Behavior
- Expected: X coordinate should be 0 (indicating the content starts at the left edge of the screen).
- Actual: X coordinate is consistently 48.
- Visual verification: when I display the captured screen content, it appears correctly without any offset, suggesting the actual content should indeed start at X=0.

Main ViewModel Class

    import Foundation
    import ScreenCaptureKit
    import SwiftUICore

    class VM: NSObject, ObservableObject, SCContentSharingPickerObserver, SCStreamDelegate, SCStreamOutput {
        @State var isRecording = false

        // Error handling delegate
        func stream(_ stream: SCStream, didStopWithError error: Error) {
            DispatchQueue.main.async {
                self.isRecording = false
            }
        }

        var picker: SCContentSharingPicker?

        func createPicker() -> SCContentSharingPicker {
            if let p = picker { return p }
            let picker = SCContentSharingPicker.shared
            var config = SCContentSharingPicker.shared.defaultConfiguration //SCContentSharingPickerConfiguration()
            config.allowedPickerModes = .singleDisplay
            config.allowsChangingSelectedContent = false
            config.excludedBundleIDs.append(Bundle.main.bundleIdentifier!)
            picker.add(self)
            picker.isActive = true
            SCContentSharingPicker.shared.present(using: .display)
            return picker
        }

        var stream: SCStream?
        let videoSampleBufferQueue = DispatchQueue(label: "com.example.apple-samplecode.VideoSampleBufferQueue")

        // observer call back for picker
        func contentSharingPicker(_ picker: SCContentSharingPicker, didUpdateWith filter: SCContentFilter, for stream: SCStream?) {
            if let stream = stream {
                stream.updateContentFilter(filter)
            } else {
                let config = SCStreamConfiguration()
                config.capturesAudio = false
                config.captureMicrophone = false
                config.captureResolution = .automatic
                config.captureDynamicRange = .SDR
                config.showMouseClicks = false
                config.showsCursor = false
                // Set the frame rate for screen capture
                config.minimumFrameInterval = CMTime(value: 1, timescale: 5)
                self.stream = SCStream(filter: filter, configuration: config, delegate: self)
                do {
                    try self.stream?.addStreamOutput(self, type: .screen, sampleHandlerQueue: self.videoSampleBufferQueue)
                } catch {
                    print("\(error)")
                }
                self.stream?.updateContentFilter(filter)
                DispatchQueue.main.async {
                    self.stream?.startCapture()
                }
            }
        }

        func contentSharingPicker(_ picker: SCContentSharingPicker, didCancelFor stream: SCStream?) {}

        func contentSharingPickerStartDidFailWithError(_ error: any Error) {
            print(error)
        }

        func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of type: SCStreamOutputType) {
            switch type {
            case .screen:
                guard let attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]] else { return }
                guard let attachments = attachmentsArray.first else { return }
                if !attachments.keys.contains(.contentRect) { return }
                print(attachments)
                return
            case .audio:
                return
            case .microphone:
                return
            @unknown default:
                return
            }
        }

        func outputVideoEffectDidStart(for stream: SCStream) { print("outputVideoEffectDidStart") }
        func outputVideoEffectDidStop(for stream: SCStream) { print("outputVideoEffectDidStop") }
        func streamDidBecomeActive(_ stream: SCStream) { print("streamDidBecomeActive") }
        func streamDidBecomeInactive(_ stream: SCStream) { print("streamDidBecomeInactive") }
    }
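A hedged observation that may explain this: the contentRect attachment describes where the valid content sits inside the output buffer, so a nonzero X can simply mean the content is inset in the buffer (for example when the buffer's aspect ratio differs from the captured display's), rather than an on-screen capture offset. A small sketch for inspecting the related attachments together:

    import Foundation
    import ScreenCaptureKit
    import CoreMedia
    import CoreGraphics

    // Decode the contentRect attachment into a CGRect and log it next to the
    // reported scale attachments, so the inset can be related to buffer geometry.
    func logFrameGeometry(_ sampleBuffer: CMSampleBuffer) {
        guard let attachments = (CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: false)
                as? [[SCStreamFrameInfo: Any]])?.first,
              let rectDict = attachments[.contentRect] as? NSDictionary,
              let contentRect = CGRect(dictionaryRepresentation: rectDict as CFDictionary) else { return }

        // Related scale attachments reported alongside the rect.
        let contentScale = attachments[.contentScale] as? CGFloat ?? 1
        let scaleFactor = attachments[.scaleFactor] as? CGFloat ?? 1

        // One hedged interpretation: scaling the rect by contentScale maps it into buffer pixels.
        let rectInBuffer = contentRect.applying(CGAffineTransform(scaleX: contentScale, y: contentScale))
        print("contentRect: \(contentRect), contentScale: \(contentScale), scaleFactor: \(scaleFactor), in buffer: \(rectInBuffer)")
    }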
Replies: 1 · Boosts: 0 · Views: 53 · Activity: Sep ’25
SCStreamUpdateFrameContentRect X coordinate always returns 48 instead of expected 0
SCStreamUpdateFrameContentRect X coordinate always returns 48 instead of expected 0

Environment
- Device: MacBook Pro 13-inch
- macOS: Sequoia 15.6.1
- Xcode: 16.4
- Framework: Screen Capture Kit

Issue Description

I'm experiencing an unexpected behavior with Screen Capture Kit where the SCStreamUpdateFrameContentRect X coordinate consistently returns 48 instead of the expected 0.

Code Context

I'm using SCContentSharingPicker to capture screen content and implementing the SCStreamOutput protocol to receive frame data. In my stream(_:didOutputSampleBuffer:of:) method, I'm extracting the content rect information from the sample buffer attachments:

    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of type: SCStreamOutputType) {
        switch type {
        case .screen:
            guard let attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]] else { return }
            guard let attachments = attachmentsArray.first else { return }
            if !attachments.keys.contains(.contentRect) { return }
            print(attachments)
            // X coordinate always shows 48
            /*
             __C.SCStreamFrameInfo(_rawValue: SCStreamUpdateFrameContentRect): {
                 Height = 540;
                 Width = 864;
                 X = 48;   <<-- unexpected offset
                 Y = 0;
             }
             */
            return
        // ... other cases
        }
    }

Expected vs Actual Behavior
- Expected: X coordinate should be 0 (indicating the content starts at the left edge of the screen).
- Actual: X coordinate is consistently 48.
- Visual verification: when I display the captured screen content, it appears correctly without any offset, suggesting the actual content should indeed start at X=0.

Additional Information
- The picker is configured with .singleDisplay mode.
- I'm excluding the current app's bundle ID from capture.
- The captured content visually appears correct; only the reported coordinates seem off.

Main ViewModel Class

    import Foundation
    import ScreenCaptureKit
    import SwiftUICore

    class VM: NSObject, ObservableObject, SCContentSharingPickerObserver, SCStreamDelegate, SCStreamOutput {
        @State var isRecording = false

        // Error handling delegate
        func stream(_ stream: SCStream, didStopWithError error: Error) {
            DispatchQueue.main.async {
                self.isRecording = false
            }
        }

        var picker: SCContentSharingPicker?

        func createPicker() -> SCContentSharingPicker {
            if let p = picker { return p }
            let picker = SCContentSharingPicker.shared
            picker.add(self)
            picker.isActive = true
            SCContentSharingPicker.shared.present(using: .display)
            return picker
        }

        var stream: SCStream?
        let videoSampleBufferQueue = DispatchQueue(label: "com.example.apple-samplecode.VideoSampleBufferQueue")

        // observer call back for picker
        func contentSharingPicker(_ picker: SCContentSharingPicker, didUpdateWith filter: SCContentFilter, for stream: SCStream?) {
            if let stream = stream {
                stream.updateContentFilter(filter)
            } else {
                let config = SCStreamConfiguration()
                config.capturesAudio = false
                config.captureMicrophone = false
                config.captureResolution = .automatic
                config.captureDynamicRange = .SDR
                config.showMouseClicks = false
                config.showsCursor = false
                // Set the frame rate for screen capture
                config.minimumFrameInterval = CMTime(value: 1, timescale: 5) // 10 FPS
                self.stream = SCStream(filter: filter, configuration: config, delegate: self)
                do {
                    try self.stream?.addStreamOutput(self, type: .screen, sampleHandlerQueue: self.videoSampleBufferQueue)
                } catch {
                    print("\(error)")
                }
                self.stream?.updateContentFilter(filter)
                DispatchQueue.main.async {
                    self.stream?.startCapture()
                }
            }
        }

        func contentSharingPicker(_ picker: SCContentSharingPicker, didCancelFor stream: SCStream?) {}

        func contentSharingPickerStartDidFailWithError(_ error: any Error) {
            print(error)
        }

        func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of type: SCStreamOutputType) {
            switch type {
            case .screen:
                guard let attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]] else { return }
                guard let attachments = attachmentsArray.first else { return }
                if !attachments.keys.contains(.contentRect) { return }
                print(attachments)
                return
            case .audio:
                return
            case .microphone:
                return
            @unknown default:
                return
            }
        }

        func outputVideoEffectDidStart(for stream: SCStream) { print("outputVideoEffectDidStart") }
        func outputVideoEffectDidStop(for stream: SCStream) { print("outputVideoEffectDidStop") }
        func streamDidBecomeActive(_ stream: SCStream) { print("streamDidBecomeActive") }
        func streamDidBecomeInactive(_ stream: SCStream) { print("streamDidBecomeInactive") }
    }
Replies: 0 · Boosts: 0 · Views: 40 · Activity: Sep ’25
Guidance on Blocking Screenshots/Screen Recordings in macOS App (Unity 6000.0.51f1)
Dear Team, We have developed a macOS app using Unity (6000.0.51f1) that includes learning activities, assessments/tests, audio recording, and video playback functionality. For security and content protection, we want to restrict users' ability to capture screenshots or screen recordings of the app (especially via the built-in Cmd+Shift+5 / Screenshot toolbar). We have attempted several approaches, but they have not been reliable. We would appreciate guidance from Apple or the developer community on the feasibility of this requirement.

Our requirements:
- Block or disable screenshots/screen recordings (particularly the built-in Cmd+Shift+5) for the app.
- Preferably achieve this using public APIs so that the app remains App Store compatible and passes review.
- If full blocking is not possible, at least ensure that any captured content appears blank/black for sensitive sections of the app.

Additionally, we would like our app's window behavior to work like other apps do:
- Red button → close the application completely.
- Yellow button → minimize the application to the Dock.
- Green button → maximize to full screen while still allowing access to the Dock and menu bar.

Any advice, best practices, or references to relevant documentation would be highly valuable. Thank you for your support.
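For the capture-blocking requirement, the public API usually pointed to is NSWindow.sharingType; a minimal sketch follows. Treat it as a starting point rather than a guarantee: another post under this tag reports that non-sharing windows are still captured by ScreenCaptureKit on macOS 15.4+, and there is no public API that disables the system Cmd+Shift+5 UI itself.

    import AppKit

    // Mark a window's content as non-shareable so screenshots/recordings omit it
    // (subject to the OS-version caveat mentioned above).
    func excludeFromCapture(_ window: NSWindow) {
        window.sharingType = .none
    }

    // For a Unity-built app, the main window can typically be reached from a native
    // plugin once the app is active, e.g.:
    // if let window = NSApplication.shared.windows.first { excludeFromCapture(window) }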
Replies: 1 · Boosts: 0 · Views: 106 · Activity: Sep ’25
ScreenCapture + CMSampleBuffer logic issue
I'm trying to build a simple screen recording app on macOS that always records the last x seconds of your screen and saves them whenever you want, as a way to get comfortable with Swift programming and Apple APIs. I was able to get it running for the past 30 seconds and record and store the result. However, I realised there is a core issue with my solution: I was setting SCStreamConfiguration.queueDepth = 900 (to account for 30 fps over 30 seconds), which goes completely against Apple's instructions: https://developer.apple.com/documentation/screencapturekit/scstreamconfiguration/queuedepth?language=objc When I changed queueDepth back to 8, I was only able to record 8 frames, and only those first 8 frames get saved. I am unsure what the flow of the APIs should be when dealing with ScreenCaptureKit. For context, here's my recording manager code that handles this logic (queueDepth = 900):

    import Foundation
    import ScreenCaptureKit
    import AVFoundation

    class RecordingManager: NSObject, ObservableObject, SCStreamDelegate {
        static let shared = RecordingManager()

        @Published var isRecording = false
        private var isStreamActive = false // Custom state flag

        private var stream: SCStream?
        private var streamOutputQueue = DispatchQueue(label: "com.clipback.StreamOutput", qos: .userInteractive)
        private var screenStreamOutput: ScreenStreamOutput? // Strong reference to output
        private var lastDisplayID: CGDirectDisplayID?
        private let displayCheckQueue = DispatchQueue(label: "com.clipback.DisplayCheck", qos: .background)

        // In-memory rolling buffer for last 30 seconds
        private var rollingFrameBuffer: [(CMSampleBuffer, CMTime)] = []
        private let rollingFrameBufferQueue = DispatchQueue(label: "com.clipback.RollingBuffer", qos: .userInteractive)
        private let rollingBufferDuration: TimeInterval = 30.0 // seconds

        // Track frame statistics
        private var frameCount: Int = 0
        private var lastReportTime: Date = Date()

        // Monitor for display availability
        private var displayCheckTimer: Timer?
        private var isWaitingForDisplay = false

        func startRecording() {
            print("[DEBUG] startRecording called.")
            guard !isRecording && !isWaitingForDisplay else {
                print("[DEBUG] Already recording or waiting, ignoring startRecording call")
                return
            }
            isWaitingForDisplay = true
            isStreamActive = true // Set active state
            checkForDisplay()
        }

        private func setupAndStartRecording(for display: SCDisplay, excluding appToExclude: SCRunningApplication?) {
            print("[DEBUG] setupAndStartRecording called for display: \(display.displayID)")
            let excludedApps = [appToExclude].compactMap { $0 }
            let filter = SCContentFilter(display: display, excludingApplications: excludedApps, exceptingWindows: [])
            let config = SCStreamConfiguration()
            config.width = display.width
            config.height = display.height
            config.minimumFrameInterval = CMTime(value: 1, timescale: 30) // 30 FPS
            config.queueDepth = 900
            config.showsCursor = true
            print("[DEBUG] SCStreamConfiguration created: width=\(config.width), height=\(config.height), FPS=\(config.minimumFrameInterval.timescale)")
            stream = SCStream(filter: filter, configuration: config, delegate: self)
            print("[DEBUG] SCStream initialized.")
            self.screenStreamOutput = ScreenStreamOutput { [weak self] sampleBuffer, outputType in
                guard let self = self else { return }
                guard outputType == .screen else { return }
                guard sampleBuffer.isValid else { return }
                guard let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]],
                      let statusRawValue = attachments.first?[.status] as? Int,
                      let status = SCFrameStatus(rawValue: statusRawValue),
                      status == .complete else { return }
                self.trackFrameRate()
                self.handleFrame(sampleBuffer)
            }
            do {
                try stream?.addStreamOutput(screenStreamOutput!, type: .screen, sampleHandlerQueue: streamOutputQueue)
                stream?.startCapture { [weak self] error in
                    print("[DEBUG] SCStream.startCapture completion handler.")
                    guard error == nil else {
                        print("[DEBUG] Failed to start capture: \(error!.localizedDescription)")
                        self?.handleStreamError(error!)
                        return
                    }
                    DispatchQueue.main.async {
                        self?.isRecording = true
                        self?.isStreamActive = true // Update state on successful start
                        print("[DEBUG] Recording started. isRecording = true.")
                    }
                }
            } catch {
                print("[DEBUG] Error adding stream output: \(error.localizedDescription)")
                handleStreamError(error)
            }
        }

        private func handleFrame(_ sampleBuffer: CMSampleBuffer) {
            rollingFrameBufferQueue.async { [weak self] in
                guard let self = self else { return }
                let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
                var retainedBuffer: CMSampleBuffer?
                CMSampleBufferCreateCopy(allocator: kCFAllocatorDefault, sampleBuffer: sampleBuffer, sampleBufferOut: &retainedBuffer)
                guard let buffer = retainedBuffer else {
                    print("[DEBUG] Failed to copy sample buffer")
                    return
                }
                self.rollingFrameBuffer.append((buffer, pts))
                if let lastPTS = self.rollingFrameBuffer.last?.1 {
                    while let firstPTS = self.rollingFrameBuffer.first?.1,
                          CMTimeGetSeconds(CMTimeSubtract(lastPTS, firstPTS)) > self.rollingBufferDuration {
                        self.rollingFrameBuffer.removeFirst()
                    }
                }
            }
        }

        func stream(_ stream: SCStream, didStopWithError error: Error) {
            print("[DEBUG] Stream stopped with error: \(error.localizedDescription)")
            displayCheckQueue.async { [weak self] in // Move to displayCheckQueue for synchronization
                self?.handleStreamError(error)
            }
        }

What could be the reason for this, and what would be the possible fix, logically? I don't understand why it's dependent on queueDepth, and if it is, how can I empty and append newly recorded frames to it so that it continues working? Any help or resource is greatly appreciated!
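A hedged explanation of why this appears tied to queueDepth: the stream recycles a small pool of frame surfaces sized by queueDepth, and CMSampleBufferCreateCopy only makes a shallow copy that still references the delivered frame's pixel buffer, so a 30-second ring of retained sample buffers keeps every slot in the pool occupied and delivery stalls after queueDepth frames. One way out, sketched below, is to deep-copy the pixel data into your own CVPixelBuffer and let the original sample buffer go; the sketch assumes a non-planar format such as kCVPixelFormatType_32BGRA.

    import Foundation
    import CoreVideo
    import CoreMedia

    // Deep-copy the frame's pixels so ScreenCaptureKit can recycle its surface pool.
    func deepCopyPixels(of sampleBuffer: CMSampleBuffer) -> CVPixelBuffer? {
        guard let source = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        var copy: CVPixelBuffer?
        CVPixelBufferCreate(kCFAllocatorDefault,
                            CVPixelBufferGetWidth(source),
                            CVPixelBufferGetHeight(source),
                            CVPixelBufferGetPixelFormatType(source),
                            nil,
                            &copy)
        guard let destination = copy else { return nil }

        CVPixelBufferLockBaseAddress(source, .readOnly)
        CVPixelBufferLockBaseAddress(destination, [])
        defer {
            CVPixelBufferUnlockBaseAddress(destination, [])
            CVPixelBufferUnlockBaseAddress(source, .readOnly)
        }
        guard let srcBase = CVPixelBufferGetBaseAddress(source),
              let dstBase = CVPixelBufferGetBaseAddress(destination) else { return nil }

        // Strides of source and destination may differ; copy row by row.
        let srcStride = CVPixelBufferGetBytesPerRow(source)
        let dstStride = CVPixelBufferGetBytesPerRow(destination)
        let rowBytes = min(srcStride, dstStride)
        for row in 0..<CVPixelBufferGetHeight(source) {
            memcpy(dstBase + row * dstStride, srcBase + row * srcStride, rowBytes)
        }
        return destination
    }

The rolling buffer would then hold (CVPixelBuffer, CMTime) pairs instead of sample buffers, queueDepth can stay at its small default, and the 30-second trimming logic stays the same.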
Replies: 3 · Boosts: 0 · Views: 192 · Activity: Jul ’25
On macOS 15.4+, NSWindow with kCGWindowSharingStateSharingNone still captured by ScreenCaptureKit
I have a custom NSWindow that I want to exclude from screen capture by setting its sharing state to kCGWindowSharingStateSharingNone. The goal is to prevent this window from appearing in the content captured by ScreenCaptureKit. [window setSharingType:NSWindowSharingType::NSWindowSharingNone]; However, on macOS 15.4+ (Sequoia), the window is still captured by ScreenCaptureKit and appears in the shared content. Does anyone know if kCGWindowSharingStateSharingNone is still effective with ScreenCaptureKit on macOS 15.4 and later?
Replies: 1 · Boosts: 0 · Views: 329 · Activity: Jul ’25
macOS app on Sonoma with Xcode Version 16.3 (16E140)
We are working on a screen capture app. I have provisioning set up with a Developer ID certificate for direct distribution and a distribution certificate for Mac App Store distribution; I submitted the app to the store with the distribution-certificate provisioning active. We need to add documentation, so while we are waiting we decided to distribute the app directly, and this is where the problems come in. I made the Developer ID certificate and archived and exported the app, then I manually stapled the app with "xcrun stapler staple Madshot360.app" and created a dmg file with the exported app. The problems are:

1. The app captures a screen area with ScreenCaptureKit. A prior version of the app used a development certificate. When a user runs this new Developer ID build, macOS gets confused because it doesn't connect the new version to the already-permissioned older app version. The user has to manually delete the old permission and then restart the app so the new version creates a new record, which can then be enabled. This is confusing for the user, since the permission says the app is enabled but it really isn't. We experimented with IT using a command line to delete the old app permission. That did not remove the old permission, and now the user can't delete this record at all. What can I do to force the removal of a permission that is broken? The command we ran was: "sudo tccutil reset ScreenCapture com.madwire.Madshot360"

2. The app used to display its normal warning that screen recording needed the user's permission. This is the permission I talk about above. Now there is a second permission dialog that states the following: "Madshot360" is requesting to bypass the system private window picker and directly access your screen and audio. This will allow Madshot360 to record your screen and system audio, including personal or sensitive information that may be visible or audible. Allow, Open System Settings. This is basically what the normal alert does. Why the second window, and how can I stop it from appearing when the user has already allowed it? Is it because the binary is distributed directly from my computer?

Summary:
- What can I do when a permission is broken? Is there a command that IT can use to remove any old permissions before installing the app? This app is to be used internally. Is there a command line that will remove a specific app's permission before installing the app? Remember, the command line I showed you basically further broke the permissions for this app.
- What is causing this second warning dialog to be displayed?
Replies: 4 · Boosts: 0 · Views: 142 · Activity: Jun ’25
ScreenCaptureKit confuses virtual displays
If there are multiple virtual displays connected when an app starts using SCStream, then if there is any change in the configuration of the connected virtual screens (one of them gets disconnected, reconnected, etc.), a new SCStream will always stream the content of the last connected virtual screen, no matter which virtual screen is intended to be streamed. This happens despite the fact that the SCContentFilter is properly configured for the stream: the filter's content has the right displayID with the proper frame in the global display coordinate system, and the filter also reports the proper contentRect size and pointPixelScale. When all virtual displays are disconnected and reconnected, things return to normal; it's as if SCStream somehow gets confused and can't properly handle or internally re-enumerate multiple virtual screens until none is connected. This issue does not normally come up, as most users will probably have at most one virtual display (like a Sidecar display) connected. However, some configurations (systems using multiple DisplayLink displays, or apps using the undocumented CGVirtualDisplay to create virtual screens) do encounter this issue. Note: this is a longstanding problem; it has been like this since the first introduction of ScreenCaptureKit, and even before that it affected CGDisplayStream, which similarly confused virtual screens.
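Not a confirmed workaround, but one thing worth trying while this remains unresolved: re-enumerate SCShareableContent after every display-configuration change and rebuild the filter from the freshly returned SCDisplay whose displayID matches the intended virtual screen, instead of reusing an SCDisplay or filter obtained before the change. A minimal sketch (targetDisplayID is assumed to be known):

    import ScreenCaptureKit

    // Rebuild the content filter from a fresh enumeration, keyed by display ID.
    func rebuildFilter(for targetDisplayID: CGDirectDisplayID) async throws -> SCContentFilter? {
        let content = try await SCShareableContent.excludingDesktopWindows(false,
                                                                           onScreenWindowsOnly: true)
        guard let display = content.displays.first(where: { $0.displayID == targetDisplayID }) else {
            return nil
        }
        return SCContentFilter(display: display, excludingWindows: [])
    }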
Replies: 2 · Boosts: 0 · Views: 108 · Activity: Jun ’25
ScreenCaptureKit and mixed Retina/non-Retina configuration
The two ScreenCaptureKit WWDC22 sessions show how to capture with the new framework, but the Retina factor is hardcoded to 2 in SCStreamConfiguration. When used on a non-Retina display, the screen capture floats in the upper-left corner of the image buffer. There does not seem to be a simple way to retrieve the Retina factor from the SCShareableContent data (when configuring the capture). When processing the streaming output, the SCStreamFrameInfo attachment is supposed to have a scaleFactor property, but .scaleFactor does not return a value. I have found that the attachment dictionary contains SCStreamUpdateFrameDisplayResolution. This entry gives me the Retina factor, but it is not an official SCStreamFrameInfo key, and I have to enumerate the dictionary keys to access it. What is the proper way to handle the Retina factor with ScreenCaptureKit?
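When configuring the capture (rather than inspecting per-frame attachments), one option is to recover the backing scale from AppKit by matching the SCDisplay to its NSScreen; the sketch below assumes the standard NSScreenNumber device-description key.

    import AppKit
    import ScreenCaptureKit

    // Look up the backing scale of the screen that corresponds to an SCDisplay.
    func backingScale(for display: SCDisplay) -> CGFloat {
        let key = NSDeviceDescriptionKey("NSScreenNumber")
        let screen = NSScreen.screens.first {
            ($0.deviceDescription[key] as? NSNumber)?.uint32Value == display.displayID
        }
        return screen?.backingScaleFactor ?? 1.0
    }

    // Usage when building the configuration:
    // config.width  = display.width  * Int(backingScale(for: display))
    // config.height = display.height * Int(backingScale(for: display))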
Replies: 1 · Boosts: 1 · Views: 807 · Activity: Jun ’25
How to reset system window private picker alert with Screen Capture Kit
Hi, I would like to reset the system window private picker alert used with ScreenCaptureKit. I can reset the ScreenCapture permission with tccutil reset ScreenCapture, but that does not reset the system window private picker alert. I tried deleting the application directory from the container, and that does not help either: the system window private picker alert reuses the old approval I gave and does not prompt a new alert. How can I start with fresh ScreenCaptureKit settings for an app under test? Thanks
Replies: 0 · Boosts: 0 · Views: 94 · Activity: Jun ’25
Presenter Overlay Not Showing When Recording a Single Window or Region with ScreenCaptureKit
Hi, I'm using ScreenCaptureKit on macOS 14+ to record a single window. I've noticed that the Presenter Overlay only appears when capturing the entire screen, but it does not appear when recording a specific window or a region. Is there a way to enable the Presenter Overlay while recording a single window or a defined region, similar to how it works with full-screen capture? Any guidance or clarification would be greatly appreciated. Thanks in advance!
Replies: 0 · Boosts: 0 · Views: 120 · Activity: May ’25
I’m using ScreenCaptureKit on macOS to grab frames and measure end-to-end latency (capture → my delegate callback). For each CMSampleBuffer I read:
I’m using ScreenCaptureKit on macOS to grab frames and measure end-to-end latency (capture → my delegate callback). For each CMSampleBuffer I read:

    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds

to get the “capture” timestamp, and I also extract the mach-absolute display time:

    let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]]
    let displayMach = attachments?.first?[.displayTime] as? UInt64
    // convert mach ticks to seconds...

Then I compare both against the current time:

    let now = CACurrentMediaTime()
    let latencyFromPTS = now - pts
    let latencyFromDisplay = now - displayTimeSeconds

But I consistently see negative values for both calculations, i.e. the PTS or displayTime often end up numerically larger than now. This suggests that the “presentation timestamp” and the mach-absolute display time are coming from a different epoch or clock domain than CACurrentMediaTime().

Questions:
- Which clocks/epochs does ScreenCaptureKit use for PTS and for .displayTime?
- How can I align these timestamps with CACurrentMediaTime() so that now - pts and now - displayTime reliably yield non-negative real-world latencies?

Any pointers on the correct clock conversions or APIs to use would be greatly appreciated.
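One detail worth checking before assuming a different epoch: CACurrentMediaTime() is documented as mach_absolute_time() converted to seconds, and the .displayTime attachment is expressed in mach-absolute-time ticks, so the tick-to-seconds conversion itself (timebase numerator/denominator) is a common source of error. A minimal sketch of that conversion, assuming both values are on the host (mach) clock:

    import QuartzCore
    import Darwin

    // Convert mach_absolute_time ticks (as found in the .displayTime attachment)
    // into seconds on the same clock CACurrentMediaTime() is derived from.
    func machTicksToSeconds(_ ticks: UInt64) -> Double {
        var timebase = mach_timebase_info_data_t()
        mach_timebase_info(&timebase)
        let nanos = Double(ticks) * Double(timebase.numer) / Double(timebase.denom)
        return nanos / 1_000_000_000.0
    }

    // Usage inside the stream callback, with `displayMach` extracted as above:
    // let latency = CACurrentMediaTime() - machTicksToSeconds(displayMach)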
Replies: 1 · Boosts: 0 · Views: 137 · Activity: May ’25
SystemAudio Capture API Fails with OSStatus error 1852797029 (kAudioCodecIllegalOperationError)
Issue Description

I'm implementing a system audio capture feature using AudioHardwareCreateProcessTap and AudioHardwareCreateAggregateDevice. The app successfully creates the tap and the aggregate device, but when starting the IO procedure with AudioDeviceStart, it sometimes fails with OSStatus error 1852797029 ("The operation couldn’t be completed. (OSStatus error 1852797029.)"). The error occurs inconsistently, which makes it particularly difficult to debug and reproduce.

Questions
- Has anyone encountered this intermittent "nope" error code (0x6e6f7065) when working with system audio capture?
- Are there specific conditions or system states that might trigger this error sporadically?
- Are there any known workarounds for handling this intermittent failure case?

Any insights or guidance would be greatly appreciated. I'm wondering if anyone else has encountered this specific "nope" error code (0x6e6f7065) when working with system audio capture.
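For what it's worth, in the Core Audio hardware headers the four-character code 'nope' corresponds to kAudioHardwareIllegalOperationError, which may be the more relevant constant here than the codec error named in the title. A small helper for logging intermittent OSStatus failures as their four-character codes:

    import Foundation
    import CoreAudio

    // Render an OSStatus as its four-character code (1852797029 == 0x6E6F7065 == "nope").
    func fourCharCode(_ status: OSStatus) -> String {
        let value = UInt32(bitPattern: status)
        let bytes = [24, 16, 8, 0].map { UInt8((value >> $0) & 0xFF) }
        return String(bytes.map { Character(UnicodeScalar($0)) })
    }

    // Example logging around the failing call:
    // let status = AudioDeviceStart(aggregateDeviceID, ioProcID)
    // if status != noErr { print("AudioDeviceStart failed: \(fourCharCode(status))") }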
Replies: 0 · Boosts: 0 · Views: 113 · Activity: May ’25
"Application" is accessing your screen notification
Hi! I'm developing an application based on Chrome that needs to take regular screenshots of webpages. Under the hood (actually Chromium), it uses SCScreenshotManager to capture screenshots automatically (without user interaction). I've noticed that regularly using this API triggers a user notification saying: "Your Screen 'AppTest' has accessed your screen and system audio 3,594 times in the past 30 days. You can manage this in Settings." How can I prevent this notification from appearing? Are there any specific entitlements (or configuration of SCScreenshotManager) that I can use? Thanks!
Replies: 2 · Boosts: 0 · Views: 134 · Activity: May ’25