Continuity Camera


Support automatic camera switching and high-quality, high-resolution photo capture in your macOS app when iPhone is used as a camera for Mac.

Posts under Continuity Camera tag

5 Posts

Lagging Video Feed Using VNGeneratePersonSegmentationRequest in macOS Camera Extension App
I'm developing a macOS application using Swift and a camera extension. I'm using the Vision framework's VNGeneratePersonSegmentationRequest to apply a background blur effect. However, I'm experiencing significant lag in the video feed. I've tried optimizing the request, but the issue persists. Could anyone provide insights or suggestions on how to resolve this lagging issue?

Details:
- Platform: macOS
- Language: Swift
- Framework: Vision

The code snippet I am using is below:

```swift
import Cocoa
import AVFoundation
import Vision
import CoreImage
import CoreImage.CIFilterBuiltins

class ViewController: NSViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    var frameCounter = 0
    let frameSkipRate = 2
    private let visionQueue = DispatchQueue(label: "com.example.visionQueue")
    private let context = CIContext()
    // needToStream, enqueued, readyToEnqueue, sinkQueue, image and enqueue(_:_:)
    // are part of the camera-extension streaming code, defined elsewhere in the project.

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        frameCounter += 1
        if frameCounter % frameSkipRate != 0 { return }

        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)

        performPersonSegmentation(on: ciImage) { [self] mask in
            guard let mask = mask else { return }
            let blurredBackground = self.applyBlur(to: ciImage)
            let resultImage = self.composeImage(with: blurredBackground, mask: mask, original: ciImage)
            let nsImage = ciImageToNSImage(ciImage: resultImage)

            DispatchQueue.main.async { [self] in
                // Update your NSImageView or other UI elements with the composite image
                if needToStream {
                    if (enqueued == false || readyToEnqueue == true), let queue = self.sinkQueue {
                        enqueued = true
                        readyToEnqueue = false
                        if let _ = image,
                           let cgImage = nsImage.cgImage(forProposedRect: nil, context: nil, hints: nil) {
                            enqueue(queue, cgImage)
                        }
                    }
                }
            }
        }
    }

    private func performPersonSegmentation(on image: CIImage,
                                           completion: @escaping (CIImage?) -> Void) {
        let request = VNGeneratePersonSegmentationRequest()
        request.qualityLevel = .fast // Adjust quality level as needed
        request.outputPixelFormat = kCVPixelFormatType_OneComponent8
        let handler = VNImageRequestHandler(ciImage: image, options: [:])

        visionQueue.async {
            do {
                try handler.perform([request])
                guard let result = request.results?.first as? VNPixelBufferObservation else {
                    completion(nil)
                    return
                }
                let maskImage = CIImage(cvPixelBuffer: result.pixelBuffer)
                completion(maskImage)
            } catch {
                print("Error performing segmentation: \(error)")
                completion(nil)
            }
        }
    }

    private func composeImage(with blurredBackground: CIImage, mask: CIImage, original: CIImage) -> CIImage {
        // Invert the mask so the background, not the person, gets blurred
        let invertedMask = mask.applyingFilter("CIColorInvert")
        // Resize the mask to match the original image
        let resizedMask = invertedMask.transformed(by: CGAffineTransform(
            scaleX: original.extent.width / invertedMask.extent.width,
            y: original.extent.height / invertedMask.extent.height))
        // Blend the images using the mask
        let blendFilter = CIFilter(name: "CIBlendWithMask")!
        blendFilter.setValue(blurredBackground, forKey: kCIInputImageKey)
        blendFilter.setValue(original, forKey: kCIInputBackgroundImageKey)
        blendFilter.setValue(resizedMask, forKey: kCIInputMaskImageKey)
        return blendFilter.outputImage ?? original
    }

    private func ciImageToNSImage(ciImage: CIImage) -> NSImage {
        let cgImage = context.createCGImage(ciImage, from: ciImage.extent)!
        return NSImage(cgImage: cgImage, size: ciImage.extent.size)
    }

    private func applyBlur(to image: CIImage) -> CIImage {
        let blurFilter = CIFilter.gaussianBlur()
        blurFilter.inputImage = image
        blurFilter.radius = 7.0 // Adjust the blur radius as needed
        return blurFilter.outputImage ?? image
    }
}
```
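One pattern worth trying, sketched below under stated assumptions (the `SegmentationRenderer` type and its names are illustrative, not part of the question): create the segmentation request and the CIContext once instead of per frame, and keep the whole pipeline in CVPixelBuffers so each frame avoids the CIImage → NSImage → CGImage round trip before it is enqueued.

```swift
import AVFoundation
import Vision
import CoreImage

// Minimal sketch, not the poster's code: one reusable request, one reusable CIContext,
// and output rendered straight into a CVPixelBuffer suitable for enqueueing.
final class SegmentationRenderer {
    private let request: VNGeneratePersonSegmentationRequest = {
        let r = VNGeneratePersonSegmentationRequest()
        r.qualityLevel = .balanced          // .fast trades accuracy for speed
        r.outputPixelFormat = kCVPixelFormatType_OneComponent8
        return r
    }()
    private let ciContext = CIContext()      // create once; recreating it per frame is costly

    /// Returns a pixel buffer with the background blurred, or nil if segmentation fails.
    func process(_ pixelBuffer: CVPixelBuffer) -> CVPixelBuffer? {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        do { try handler.perform([request]) } catch { return nil }
        guard let mask = request.results?.first?.pixelBuffer else { return nil }

        let original = CIImage(cvPixelBuffer: pixelBuffer)
        let maskImage = CIImage(cvPixelBuffer: mask)
            .transformed(by: CGAffineTransform(
                scaleX: original.extent.width / CGFloat(CVPixelBufferGetWidth(mask)),
                y: original.extent.height / CGFloat(CVPixelBufferGetHeight(mask))))

        // Blur the whole frame, then keep the person sharp via the segmentation mask.
        let blurred = original.clampedToExtent()
            .applyingFilter("CIGaussianBlur", parameters: [kCIInputRadiusKey: 7.0])
            .cropped(to: original.extent)
        let composited = original.applyingFilter("CIBlendWithMask", parameters: [
            kCIInputBackgroundImageKey: blurred,
            kCIInputMaskImageKey: maskImage
        ])

        // Render back into a new pixel buffer so the frame can be enqueued
        // without any NSImage/CGImage conversion.
        var output: CVPixelBuffer?
        CVPixelBufferCreate(nil,
                            CVPixelBufferGetWidth(pixelBuffer),
                            CVPixelBufferGetHeight(pixelBuffer),
                            CVPixelBufferGetPixelFormatType(pixelBuffer),
                            nil, &output)
        guard let outBuffer = output else { return nil }
        ciContext.render(composited, to: outBuffer)
        return outBuffer
    }
}
```

If frames are only being streamed to the extension's sink rather than displayed, skipping the hop to the main queue may remove another per-frame stall.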
3 replies · 0 boosts · 374 views · Jul ’24
Continuity for Development Use
Continuity is a very useful feature across Apple devices. Recently, I've wanted to integrate Continuity into my app by invoking Sketch from the app. I couldn't find an API for this, nor can I find the program that is invoked when I right-click in Finder and try to add a drawing from my iPad. My two questions are: Is this feature officially supported for development use? And in general, is it possible to trace what programs are called when I invoke a feature through the macOS GUI?
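On the first question: a Mac app can surface the same "Insert from iPhone or iPad" items (Take Photo, Scan Documents, Add Sketch) by putting an object that adopts NSServicesMenuRequestor into the responder chain, which is the pattern Apple documents for Continuity Camera on the Mac. A minimal sketch, assuming a hypothetical CanvasView that simply stores the incoming image:

```swift
import AppKit

// Sketch: a view that accepts photos/sketches delivered by Continuity Camera.
// The view class and the importedImage property are illustrative assumptions.
final class CanvasView: NSView, NSServicesMenuRequestor {
    var importedImage: NSImage?

    override var acceptsFirstResponder: Bool { true }

    // Advertise that this responder can receive image data from the Continuity service.
    override func validRequestor(forSendType sendType: NSPasteboard.PasteboardType?,
                                 returnType: NSPasteboard.PasteboardType?) -> Any? {
        if let returnType, NSImage.imageTypes.contains(returnType.rawValue) {
            return self
        }
        return super.validRequestor(forSendType: sendType, returnType: returnType)
    }

    // Called with the photo or sketch captured on the iPhone/iPad.
    func readSelection(from pasteboard: NSPasteboard) -> Bool {
        guard pasteboard.canReadItem(withDataConformingToTypes: NSImage.imageTypes),
              let image = NSImage(pasteboard: pasteboard) else { return false }
        importedImage = image
        needsDisplay = true
        return true
    }
}
```

With an object like this in the responder chain, the Continuity items should show up in the view's context menu when a paired iPhone or iPad is available.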
0 replies · 0 boosts · 339 views · Jun ’24
How to prevent Continuity Camera from correcting orientation?
I am using an AVCaptureSession in a macOS application for real-time scan/image feature detection. Since most iPhone cameras are better than most Mac cameras, I also support Continuity Camera. This generally works very well. However, when the user points the iPhone/Continuity camera down at a document on the desk, as is most likely the case, Continuity Camera flips and 'corrects' the video orientation to landscape. How can I prevent that? I am using more or less the exact architecture that Apple provides in the Supporting Continuity Camera sample app.
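Not a confirmed fix, but one knob to try is pinning the rotation on the output's video connection instead of leaving it automatic; whether Continuity Camera honors this when the iPhone is pointed downward is an assumption. A sketch:

```swift
import AVFoundation

// Sketch: lock the capture connection's rotation so the session, not the device,
// decides the frame orientation. The angle/orientation values chosen are illustrative.
func lockOrientation(on output: AVCaptureVideoDataOutput) {
    guard let connection = output.connection(with: .video) else { return }
    if #available(macOS 14.0, *) {
        // videoRotationAngle replaces videoOrientation on macOS 14 and later
        if connection.isVideoRotationAngleSupported(0) {
            connection.videoRotationAngle = 0
        }
    } else if connection.isVideoOrientationSupported {
        connection.videoOrientation = .landscapeRight
    }
}
```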
0 replies · 0 boosts · 399 views · Mar ’24
My iPhone 14 Pro Max urgently needs a 24-megapixel mode
A 24-megapixel mode is what gets used most for everyday photography; the 48-megapixel mode is really only for landscapes or occasional shots, since its files take up too much storage. The biggest problem with the 14 Pro Max right now is that its photography falls short: the 12-megapixel output has lagged far behind Android phones for a long time. Adding a 24-megapixel mode would be far more valuable than another iOS update, and it would immediately double the experience.
0 replies · 0 boosts · 434 views · Feb ’24
The lack of a 24-megapixel mode is the iPhone 14 Pro Max's biggest shortcoming
The iPhone 14 Pro Max's 12-megapixel photos are sharpened so aggressively that they are hard to look at, while 48-megapixel shots take up too much space; the base model starts at 128 GB, which doesn't leave much room once you also shoot video. The biggest problem at present is that there is no 24-megapixel mode, which greatly affects the everyday shooting experience. People around me who use the 14 Pro series agree that the lack of a 24-megapixel option is its biggest problem right now.
0 replies · 0 boosts · 392 views · Feb ’24