ScreenCaptureKit


ScreenCaptureKit brings high-performance screen capture, including audio and video, to macOS.

ScreenCaptureKit Documentation

Posts under ScreenCaptureKit tag

33 Posts
Post not yet marked as solved
0 Replies
155 Views
Hello! I'm trying to find my own window (SCWindow) within an SCDisplay. This is so I can automatically change which SCDisplay to record when I drag a window from one display to another. Is there any way to check which windows are contained in each SCDisplay? Thank you in advance!
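One possible approach (a sketch, not tested against this exact scenario): SCWindow and SCDisplay both expose a frame in the same global coordinate space, so you can resolve the display that contains a window's center point. The `display(containing:in:)` helper below is hypothetical, not part of the framework.

```swift
import ScreenCaptureKit

// Sketch (untested): pick the SCDisplay whose frame contains the
// window's center point. SCWindow.frame and SCDisplay.frame share
// the same global coordinate space.
func display(containing window: SCWindow,
             in content: SCShareableContent) -> SCDisplay? {
    let center = CGPoint(x: window.frame.midX, y: window.frame.midY)
    return content.displays.first { $0.frame.contains(center) }
}

// Usage: refresh shareable content, find your own window by bundle ID,
// then re-resolve the display and rebuild the SCContentFilter.
SCShareableContent.getExcludingDesktopWindows(true, onScreenWindowsOnly: true) { content, error in
    guard let content,
          let myWindow = content.windows.first(where: {
              $0.owningApplication?.bundleIdentifier == Bundle.main.bundleIdentifier
          }) else { return }
    let target = display(containing: myWindow, in: content)
    // rebuild the SCContentFilter with `target` here
}
```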
Posted by gmorubio. Last updated.
Post not yet marked as solved
0 Replies
430 Views
Hi, I'm new to AVAudioEngine (and macOS programming in general). I'm trying to mix microphone audio with ScreenCaptureKit audio using AVAudioEngine, without playing it back. I've created an AVAudioPlayerNode and I'm scheduling buffers in my SCStream handler:

    playerNode.scheduleBuffer(samples)

and I have connected the playerNode to the mainMixerNode:

    audioEngine.connect(audioEngine.inputNode, to: audioEngine.mainMixerNode, format: micFormat)
    audioEngine.connect(playerNode, to: audioEngine.mainMixerNode, format: format)

The problem is that mainMixerNode plays the audio to the speaker, creating a feedback loop. How can I prevent the mixer output from being played back? Also: is this the best way of mixing microphone input with some other input? I ran into AVAudioEngine's manual rendering mode, which seems like the way to go for mixing audio without playing it back. However, I couldn't figure out how to connect microphone input to the AVAudioEngine in manual rendering mode.
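A pattern worth trying (a sketch, under the assumption that a tap installed on an intermediate mixer node is unaffected by mainMixerNode's output volume): route both inputs into your own AVAudioMixerNode, tap that node for the mixed buffers, and set the main mixer's volume to zero so nothing reaches the speakers.

```swift
import AVFoundation

// Sketch (untested): mix through an intermediate mixer node, tap that
// node for the mixed buffers, and silence the main mixer so nothing
// reaches the speakers. The tap sits before mainMixerNode's volume
// control, so muting the output should not affect the tapped audio.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()   // fed from the SCStream handler
let mixNode = AVAudioMixerNode()

engine.attach(playerNode)
engine.attach(mixNode)

let micFormat = engine.inputNode.outputFormat(forBus: 0)
engine.connect(engine.inputNode, to: mixNode, format: micFormat)
engine.connect(playerNode, to: mixNode, format: nil)
engine.connect(mixNode, to: engine.mainMixerNode, format: nil)

// Silence the speaker path; the tap below still receives the full mix.
engine.mainMixerNode.outputVolume = 0

mixNode.installTap(onBus: 0, bufferSize: 4096,
                   format: mixNode.outputFormat(forBus: 0)) { buffer, time in
    // consume the mixed mic + ScreenCaptureKit audio here
}
```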
Posted by tarmac94. Last updated.
Post not yet marked as solved
1 Reply
510 Views
If someone in Apple WWDR sees this, please take the feedback to heart and report it up the chain: When you announce that a technology — such as CGDisplayStream — is being deprecated, and also publish WWDC sessions about the intended replacement — ScreenCaptureKit — then you also need to give third-party developers a clear deadline by which the old technology will be removed, so that they can plan engineering efforts around implementing the new feature and have ample time to communicate this to their customers. If it's important for third-party developers to get on board with this change, you should use every available means to communicate it to them, including multiple email alerts to their registered email addresses. Additionally, if you plan to make a BREAKING change in a framework that results in a wildly different user experience, you should probably hold it off until the summer release of the next major OS. What you should definitely NOT do is roll out a new privacy prompt in a mid-year release of macOS; give your developers, customers, and AppleSeed program participants zero advance notice that this alert is coming; ignore your own Human Interface Guidelines when designing said prompt; and skip the user experience design testing (aka "putting on your customer hat") that an internal alpha cycle should include to refine the experience and find the most effective, least annoying way to present this additional prompt and spur change with your third-party developers. Oh, wait, you've done exactly all of those things the wrong way with respect to ScreenCaptureKit. Right now, a host of Apple device administrators and client platform engineers are sending mountains of feedback to you, and they're also scrambling to contact third-party developers to let them know this is coming. Most of the vendors being discussed in private forums are said to have been caught off guard by this change.
We anticipate that users are not going to like this, and there is no way we can manage it with MDM or configuration profiles. In short, the current experience is a ghastly mess. WE, the administrators, will get blamed for this, not the third-party developers. WE will have to explain to our leadership why this experience is terrible and cannot be managed. Engineers need deadlines to help plan their work and prioritize tasks; in this case, vendors have had no firm deadline for this effort. There is already precedent for Apple announcing estimated deadlines for deprecations and feature removals. You do your developers and customers a great disservice by not communicating schedules to them. Please do better. P.S.: Feedback filed as FB13619326.
Posted. Last updated.
Post not yet marked as solved
0 Replies
262 Views
Hello, I'm playing with ScreenCaptureKit. My scenario is very basic: I capture frames and assign them to a CALayer to display in a view (for previewing) - basically what Apple's sample app does. I have two screens, and I capture only one of them, which has no app windows on it; my app also excludes itself from capture. When I place my app on the screen that is not being captured, I observe that most didOutputSampleBuffer calls receive frames with Idle status (which is expected for a screen that is not updating). However, if I bring my capturing app (which, to remind, is excluded from capture) to the captured screen, the frames start coming with Complete status (i.e. holding pixel buffers). This is not what I expect: from a capture perspective the screen is still not updating, as the app is excluded. The preview the app displays proves that, showing an empty desktop. So it seems that updates to a CALayer trigger screen-capture frames regardless of whether the app is included or excluded. This looks like a bug to me, and it can lead to a noticeable waste of resources and battery when frames are not just displayed on screen but also processed somehow and/or sent over a network. I'm also observing another issue due to this behavior, where capture hangs when queueDepth is set to 3 in this same scenario (but I'll describe it separately). Please advise if I should file a bug somewhere, or maybe there is a rational explanation of this behavior. Thank you
Posted by andsav. Last updated.
Post not yet marked as solved
0 Replies
261 Views
I found that didOutputSampleBuffer is not called for a long time when the screen doesn't change, which sometimes leaves me wondering whether something is wrong. In my design, the app falls back to another screenshot method, such as CGWindowListCreateImage, when no data arrives for a long time, but that is not what I expected to need. I set the minimumFrameInterval to 1/30, but it seems to have no effect:

    [config setMinimumFrameInterval:CMTimeMake(1, 30)];

Is there any setting that gets didOutputSampleBuffer called at least once per second, even with a CMSampleBufferRef whose status is SCFrameStatusIdle? That would let me confirm everything is working without any exception.
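For what it's worth, ScreenCaptureKit deliberately stops delivering frames while the screen is idle, and minimumFrameInterval only caps the maximum frame rate; it does not enforce a minimum one. One workaround (a sketch, with a hypothetical FrameWatchdog class) is to track the time of the last delivered frame yourself and let a timer decide when the stream is merely idle rather than broken:

```swift
import ScreenCaptureKit

// Sketch (untested): keep a timestamp of the last delivered frame and
// use a 1-second watchdog timer to decide when to treat the stream as
// idle, instead of expecting idle callbacks from the framework.
final class FrameWatchdog: NSObject, SCStreamOutput {
    private var lastFrameTime = Date()
    private var timer: Timer?

    func start(fallback: @escaping () -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
            guard let self else { return }
            if Date().timeIntervalSince(self.lastFrameTime) > 1.0 {
                fallback()  // no frame for >1s: screen is idle, not broken
            }
        }
    }

    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                of type: SCStreamOutputType) {
        guard type == .screen else { return }
        lastFrameTime = Date()
    }
}
```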
Posted by Wenqi_Liu. Last updated.
Post not yet marked as solved
0 Replies
315 Views
I notice that macOS Sonoma's System Settings has a "Screen & System Audio Recording" category. I'm a macOS app developer and want to request only the audio permission. I've browsed the documentation and the WWDC code demos for a while, but I still have no idea how to request a "System Audio Recording Only" permission; every demo and document I can find requests both screen recording and system audio.
Posted. Last updated.
Post not yet marked as solved
2 Replies
316 Views
When I try to initialize SCContentFilter(desktopIndependentWindow: window), I get a very weird error: Assertion failed: (did_initialize), function CGS_REQUIRE_INIT, file CGInitialization.c, line 44. This is a brand new CLI project on Sonoma 14.2.1. Any ideas?
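A speculative workaround: that assertion appears to come from CoreGraphics' window-server initialization (CGS), which a bare command-line tool never performs on its own. Bootstrapping NSApplication before touching ScreenCaptureKit may establish that connection; whether this fully resolves the assertion is an assumption to verify.

```swift
import AppKit
import ScreenCaptureKit

// Sketch (speculative): touching NSApplication first forces the
// CoreGraphics/window-server setup that a plain CLI tool skips.
_ = NSApplication.shared

Task {
    let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
    if let window = content.windows.first {
        let filter = SCContentFilter(desktopIndependentWindow: window)
        print("Filtering window: \(window.title ?? "untitled")")
        _ = filter
    }
    exit(0)
}
RunLoop.main.run()   // a CLI tool needs a run loop for the async work
```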
Posted. Last updated.
Post not yet marked as solved
0 Replies
335 Views
Hello, I develop an application called MiniMeters and am using ScreenCaptureKit to get the desktop audio on macOS. As of macOS 14.2, a few users began noticing that the volume of the incoming audio differed depending on the audio device connected. One user's Apogee Symphony Desktop is trimmed -14dB, another user's UAD Apollo Twin is -14dB as well, my MOTU M4 is -6dB, and my MacBook Pro's internal speakers show 0dB. This does not change with changing the output volume on the interface (obviously), nor digitally in the system. I cannot seem to find anything documenting this change. It also affects other applications that use ScreenCaptureKit such as OBS. Does anyone have any idea what that could correlate to so I could potentially compensate? Thanks.
Posted. Last updated.
Post not yet marked as solved
1 Reply
379 Views
I'm trying to capture the screen at the login window. To start the app at the login window, I registered a LaunchAgent for the LoginWindow session. The application starts, but the getShareableContentExcludingDesktopWindows callback is never called, and the application just hangs.
Posted. Last updated.
Post not yet marked as solved
1 Reply
522 Views
So I'm building a colour sampler tool, similar to ColorSlurp, and since CGWindowListCreateImage (which I am using below macOS 14.0) is deprecated as of Sonoma, I am wondering what's the best approach to replacing it. The way I use CGWindowListCreateImage currently is to take a screenshot of a specified area around the mouse pointer every time the mouse moves, which works perfectly fine without issues. Now I've tried replacing CGWindowListCreateImage with SCScreenshotManager.createImage, which is an async function, and as you might expect, running async functions on mouse movements doesn't quite work out that well; it lags behind heavily. So my question is: what are the appropriate ScreenCaptureKit methods to replace the functionality already built with CGWindowListCreateImage? Should I create an SCStream instead? Then I'm worried that I would need to stream the whole screen instead of just the area around the mouse pointer, since updating stream configs is an async function as well. I'd greatly appreciate any sort of direction!
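One direction to explore (a sketch with a hypothetical PointerSampler class): keep SCScreenshotManager, but coalesce the mouse-move events so at most one capture is in flight, and use SCStreamConfiguration.sourceRect to restrict the capture to a small region around the pointer. The 32-point capture size and the display-relative coordinates are assumptions.

```swift
import ScreenCaptureKit

// Sketch (untested): cancel any in-flight capture on each mouse move so
// only the newest request runs, and capture just a small rect around
// the pointer via sourceRect instead of the whole display.
@available(macOS 14.0, *)
final class PointerSampler {
    private var captureTask: Task<Void, Never>?

    func mouseMoved(to point: CGPoint, filter: SCContentFilter) {
        captureTask?.cancel()   // only the most recent position matters
        captureTask = Task {
            let config = SCStreamConfiguration()
            let side: CGFloat = 32
            config.sourceRect = CGRect(x: point.x - side / 2, y: point.y - side / 2,
                                       width: side, height: side)
            config.width = Int(side)
            config.height = Int(side)
            config.showsCursor = false
            guard !Task.isCancelled,
                  let image = try? await SCScreenshotManager.captureImage(
                      contentFilter: filter, configuration: config) else { return }
            _ = image   // sample the colour from `image` here
        }
    }
}
```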
Posted by shipty. Last updated.
Post not yet marked as solved
1 Reply
543 Views
My existing code works properly on devices running iOS < 17: it records the iPhone screen and records audio simultaneously. On iOS 17 devices, however, the screen recording captures only about 2 seconds of video and then stops automatically. As it's an extension, I don't have logs to debug the issue. I have tested the same code on other iPhones running earlier OS versions and it works fine; the issue only appears on iOS 17 devices.

    @try {
        NSLog(@"initAssesWriter");
        NSError *error = nil;
        CGRect screenRect = [[UIScreen mainScreen] bounds];
        _videoWriter = [[AVAssetWriter alloc] initWithURL:_filePath fileType:AVFileTypeMPEG4 error:&error];
        NSParameterAssert(_videoWriter);

        // Configure video
        NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithDouble:2048 * 1024.0], AVVideoAverageBitRateKey,
            nil];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            AVVideoCodecTypeH264, AVVideoCodecKey,
            [NSNumber numberWithInt:screenRect.size.width * 4], AVVideoWidthKey,
            [NSNumber numberWithInt:screenRect.size.height * 4], AVVideoHeightKey,
            videoCompressionProps, AVVideoCompressionPropertiesKey,
            nil];
        _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
        _writerInput.expectsMediaDataInRealTime = YES;
        NSParameterAssert(_writerInput);
        NSParameterAssert([_videoWriter canAddInput:_writerInput]);
        [_videoWriter addInput:_writerInput];

        AudioChannelLayout acl;
        bzero(&acl, sizeof(acl));
        acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
        NSDictionary *audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
            [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
            [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
            [NSData dataWithBytes:&acl length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
            [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
            nil];
        _audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
        _audioWriterInput.expectsMediaDataInRealTime = YES; // seems to work slightly better
        NSParameterAssert(_audioWriterInput);
        NSParameterAssert([_videoWriter canAddInput:_audioWriterInput]);
        [_videoWriter addInput:_audioWriterInput];
        [_videoWriter setMovieFragmentInterval:CMTimeMake(1, 600)];
        [_videoWriter startWriting];
    } @catch (NSException *exception) {
    } @finally {
    }

    -(void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
        @try {
            if (!_isRecordingStarted) {
                [_videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
                _isRecordingStarted = YES;
                [self saveFlurryLogs:@"Assest writer Start Recording" Details:@""];
                NSLog(@"CMSampleBufferGetPresentationTimeStamp");
            }
        } @catch (NSException *exception) {
            [self saveFlurryLogs:@"Recording Start Execption" Details:exception.description];
        } @finally {
        }

        @try {
            switch (sampleBufferType) {
                case RPSampleBufferTypeVideo:
                    // Handle video sample buffer
                    if ([_writerInput isReadyForMoreMediaData]) {
                        [_writerInput appendSampleBuffer:sampleBuffer];
                        NSLog(@"writing matadata Video");
                    }
                    break;
                case RPSampleBufferTypeAudioApp:
                    // Handle audio sample buffer for app audio
                    break;
                case RPSampleBufferTypeAudioMic:
                    // Handle audio sample buffer for mic audio
                    if ([_audioWriterInput isReadyForMoreMediaData]) {
                        [_audioWriterInput appendSampleBuffer:sampleBuffer];
                        NSLog(@"writing matadata Audio");
                    }
                    break;
                default:
                    break;
            }
        } @catch (NSException *exception) {
            [self saveFlurryLogs:@"Packet Write Execption" Details:exception.description];
        } @finally {
        }
    }
Posted by GRishi. Last updated.
Post not yet marked as solved
0 Replies
371 Views
Trying to integrate the new ScreenCaptureKit into our application. The standalone test we made works fine; however, when integrated, starting the stream capture produces this error in the logs:

    (ScreenCaptureKit) [ERROR] _SCStream_RemoteAudioQueueOperationHandlerWithError:1032 Error received from the remote queue -16665

Any insights into what might be causing this? This is what we're passing to addStreamOutput:

    private let sampleQueue = DispatchQueue(label: Bundle.main.bundleIdentifier! + ".SampleQueue")

    self.stream = SCStream(filter: filter, configuration: self.streamConfig, delegate: self)
    do {
        try self.stream?.addStreamOutput(self, type: .screen, sampleHandlerQueue: self.sampleQueue)
    }

We have all the handlers and so on, pretty much verbatim from the Apple-provided sample.
Posted. Last updated.
Post marked as solved
1 Reply
502 Views
I am using ScreenCaptureKit to create screenshot software, but I found that the screenshot captured by the new API, SCScreenshotManager.captureImage, is very blurry. This is my screenshot. It is so blurry. But I hope it's like this. My code is as follows:

    func captureScreen(windows: [SCWindow], display: SCDisplay) async throws -> CGImage? {
        let availableWindows = windows.filter { window in
            Bundle.main.bundleIdentifier != window.owningApplication?.bundleIdentifier
        }
        let filter = SCContentFilter(display: display, including: availableWindows)
        if #available(macOS 14.0, *) {
            let image = try? await SCScreenshotManager.captureImage(
                contentFilter: filter,
                configuration: SCStreamConfiguration.defaultConfig(
                    width: display.width,
                    height: display.height
                )
            )
            return image
        } else {
            return nil
        }
    }

    extension SCStreamConfiguration {
        static func defaultConfig(width: Int, height: Int) -> SCStreamConfiguration {
            let config = SCStreamConfiguration()
            config.width = width
            config.height = height
            config.showsCursor = false
            if #available(macOS 14.0, *) {
                config.captureResolution = .best
            }
            return config
        }
    }
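A likely cause, sketched below: SCDisplay.width and height are measured in points, so on a Retina display requesting that size produces an image at half the pixel resolution, which looks blurry when displayed. On macOS 14 the filter's pointPixelScale can be used to request full pixel dimensions (the retinaConfig helper is illustrative, not part of the framework):

```swift
import ScreenCaptureKit

// Sketch (untested): scale the requested capture size from points to
// pixels using the content filter's pointPixelScale (e.g. 2 on Retina),
// so the screenshot comes back at native resolution.
@available(macOS 14.0, *)
func retinaConfig(for filter: SCContentFilter) -> SCStreamConfiguration {
    let config = SCStreamConfiguration()
    let scale = CGFloat(filter.pointPixelScale)
    config.width = Int(filter.contentRect.width * scale)
    config.height = Int(filter.contentRect.height * scale)
    config.showsCursor = false
    return config
}
```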
Posted by Hylas. Last updated.
Post not yet marked as solved
0 Replies
403 Views
I work on a screen recorder app and I'm having issues with the new presenter overlay mode on macOS 14. Switching to the "Small" overlay is fine, but switching to the "Large" overlay mode causes our AVAssetWriter to fail every time with the following error: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-16364), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x6000028729a0 {Error Domain=NSOSStatusErrorDomain Code=-16364 "(null)"}}, which doesn't provide any helpful information. I'm not sure if we're doing something wrong, but I've reduced our code as much as possible and still get the failure. Is anyone else seeing this, or does anyone have any clues? Alternatively, is there a way to disable presenter overlay until it's fixed? Our app displays a camera and uses ScreenCaptureKit to record the screen along with the camera, which automatically enables the presenter overlay options. I can't find any way to opt out of or turn off the presenter overlay options, which is a bummer. That seems like something that should be controllable from either the AVCaptureSession or the SCStreamConfiguration.
Posted. Last updated.
Post not yet marked as solved
0 Replies
462 Views
I'm encountering performance degradation with my application that utilizes ScreenCaptureKit. Even after explicitly disabling App Nap using the NSAppSleepDisabled key, the problem persists. My application, which relies heavily on ScreenCaptureKit for its core functionality, experiences significant performance drops after running for a short period. When I click on the application, the performance momentarily returns to normal but quickly deteriorates again. I've checked for memory leaks in my application and haven't found any issues in that regard. Has anyone experienced similar performance issues with ScreenCaptureKit? I'm keen to know if there are any known bugs or workarounds to mitigate this problem.
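One avenue that might help (a sketch; the options and reason string are illustrative): NSAppSleepDisabled only addresses App Nap, while an explicit ProcessInfo activity also disables timer coalescing for the process while capture is running.

```swift
import Foundation

// Sketch: declare an explicit activity for the duration of the capture.
// .latencyCritical disables timer coalescing; .userInitiated prevents
// App Nap and idle-system sleep for the duration of the activity.
let activity = ProcessInfo.processInfo.beginActivity(
    options: [.userInitiated, .latencyCritical],
    reason: "Screen capture in progress")

// ... run the SCStream ...

// End the activity when capture stops so the system can nap again.
ProcessInfo.processInfo.endActivity(activity)
```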
Posted by hhajime. Last updated.
Post not yet marked as solved
1 Reply
390 Views
Hey team -- looking to receive a delegate callback in Sonoma for ScreenCaptureKit when the toolbar's "Stop Sharing" control is used. Does this come through in any of these?

    /**
     @abstract stream:didStopStreamWithError:
     @param stream the SCStream object
     @param error the error denoted by the stopping of the stream
     @discussion notifies the delegate that the stream has stopped and the error associated with it
    */
    optional func stream(_ stream: SCStream, didStopWithError error: Error)

    /**
     @abstract outputVideoEffectDidStartForStream:
     @param stream the SCStream object
     @discussion notifies the delegate that the stream's overlay video effect has started.
    */
    @available(macOS 14.0, *)
    optional func outputVideoEffectDidStart(for stream: SCStream)

    /**
     @abstract stream:outputVideoEffectDidStart:
     @param stream the SCStream object
     @discussion notifies the delegate that the stream's overlay video effect has stopped.
    */
    @available(macOS 14.0, *)
    optional func outputVideoEffectDidStop(for stream: SCStream)
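Based on the framework's error codes, a user-initiated stop should surface through stream(_:didStopWithError:) with the userStopped code; the sketch below (untested, with a hypothetical StreamWatcher class) shows how it might be distinguished from a real failure:

```swift
import ScreenCaptureKit

// Sketch (untested): when the user clicks "Stop Sharing" in the system
// UI, the stream stops and the delegate receives an error whose code is
// SCStreamError.Code.userStopped, distinguishable from hard failures.
final class StreamWatcher: NSObject, SCStreamDelegate {
    func stream(_ stream: SCStream, didStopWithError error: Error) {
        if let scError = error as? SCStreamError, scError.code == .userStopped {
            print("User stopped sharing from the system UI")
        } else {
            print("Stream failed: \(error)")
        }
    }
}
```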
Posted. Last updated.
Post not yet marked as solved
0 Replies
321 Views
I am new to iOS. I want to develop an application that can share the screen (display) from an iPhone (or from iPhone to iPhone) for remote support. For this I need a screen capture application, or I want to develop one. Can any Apple developer help me with this? I want to chat and connect with an Apple developer. If any freelancer is available, please connect with me.
Posted. Last updated.
Post not yet marked as solved
0 Replies
501 Views
I want to develop an application that can share the screen (display) from an iPhone (or from iPhone to iPhone) for remote support (or any other method). For this I need a screen capture application, API, or system call. Is there any app on the iPhone or in the App Store, or any screen-sharing client, that can give me screen capture support?
Posted. Last updated.