Core Video


Process digital video using a pipeline-based API, with support for both Metal and OpenGL.

Core Video Documentation

Posts under Core Video tag

29 Posts
Post not yet marked as solved
1 Reply
458 Views
I am porting some video decoding code from Intel to M1 and I'm seeing a very strange pixelFormat. The setup is pretty basic, essentially just setting kCVPixelBufferMetalCompatibilityKey to true. But I am at a complete loss as to how to interpret this pixelFormat; looking through CVPixelBuffer.h, I don't see any constant even close. (Using Xcode 12.5.1.) This is the beginning of the debug description of the imageBuffer:

```
CVPixelBuffer 0x6000eea7bf60 width=320 height=480 pixelFormat=&8v0 iosurface=0x6000e4c87ff0 planes=2 poolName=decode
```
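For context when reading output like the above: `pixelFormat` is a FourCC (an `OSType`, i.e. a `UInt32` holding four character bytes), and debug descriptions can render it oddly when a byte is non-printable. A small helper can decode the value returned by `CVPixelBufferGetPixelFormatType` directly; this is an illustrative sketch (`fourCCString` is a made-up name), and the `'420v'` example is only the common biplanar video-range format, not necessarily the one in this post.

```swift
// Decode a FourCC pixel-format code (OSType / UInt32) into a readable
// four-character string. Non-printable bytes are shown as \xNN escapes,
// which explains garbled-looking values in debug descriptions.
func fourCCString(_ code: UInt32) -> String {
    let bytes: [UInt8] = [
        UInt8((code >> 24) & 0xFF),
        UInt8((code >> 16) & 0xFF),
        UInt8((code >> 8) & 0xFF),
        UInt8(code & 0xFF),
    ]
    return bytes.map { b in
        (32...126).contains(b) ? String(UnicodeScalar(b)) : String(format: "\\x%02x", b)
    }.joined()
}

// Usage sketch: inspect the buffer's format directly rather than
// parsing the debug description.
// let format = CVPixelBufferGetPixelFormatType(imageBuffer)
// print(fourCCString(format))
// e.g. "420v" == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
```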
Post not yet marked as solved
0 Replies
817 Views
I am developing an app that sends pixel buffers from a Broadcast Upload Extension to OpenTok. When I run my broadcast extension, it hits its memory limit within seconds. I have been looking for ways to reduce the size and scale of the CMSampleBuffers, and ended up first converting them to CIImage, then scaling them, then converting them to CVPixelBuffers to send to the OpenTok servers. Unfortunately, the extension still crashes, even though I tried to shrink the pixel buffer.

My code follows. First I convert the CMSampleBuffer to a CVPixelBuffer in the processSampleBuffer function of my SampleHandler, then pass the CVPixelBuffer to my function along with its timestamp. There I convert the CVPixelBuffer to a CIImage and scale it using a CIFilter (CILanczosScaleTransform). After that, I generate a pixel buffer from the CIImage using a CVPixelBufferPool and a CIContext, and then send the new buffer to the OpenTok servers using videoCaptureConsumer.

```swift
func processPixelBuffer(pixelBuffer: CVPixelBuffer, timeStamp ts: CMTime) {
    guard let ciImage = self.scaleFilterImage(inputImage: pixelBuffer.cmIImage,
                                              withAspectRatio: 1.0,
                                              scale: CGFloat(kVideoFrameScaleFactor)) else { return }
    if self.pixelBufferPool == nil || self.pixelBuffer?.size != pixelBuffer.size {
        self.destroyPixelBuffers()
        self.updateBufferPool(newWidth: Int(ciImage.extent.size.width),
                              newHeight: Int(ciImage.extent.size.height))
        guard CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                 self.pixelBufferPool,
                                                 &self.pixelBuffer) == kCVReturnSuccess else { return }
    }
    context?.render(ciImage, to: pixelBuffer)
    self.videoCaptureConsumer?.consumeImageBuffer(pixelBuffer,
                                                  orientation: .up,
                                                  timestamp: ts,
                                                  metadata: nil)
}
```

If the pixelBufferPool is nil, or there is a change in the size of the pixel buffer, I update the pool.
```swift
private func updateBufferPool(newWidth: Int, newHeight: Int) {
    let pixelBufferAttributes: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: UInt(self.videoFormat),
        kCVPixelBufferWidthKey as String: newWidth,
        kCVPixelBufferHeightKey as String: newHeight,
        kCVPixelBufferIOSurfacePropertiesKey as String: [:]
    ]
    CVPixelBufferPoolCreate(nil, nil, pixelBufferAttributes as NSDictionary?, &pixelBufferPool)
}
```

This is the function I use to scale the CIImage:

```swift
func scaleFilterImage(inputImage: CIImage, withAspectRatio aspectRatio: CGFloat, scale: CGFloat) -> CIImage? {
    scaleFilter?.setValue(inputImage, forKey: kCIInputImageKey)
    scaleFilter?.setValue(scale, forKey: kCIInputScaleKey)
    scaleFilter?.setValue(aspectRatio, forKey: kCIInputAspectRatioKey)
    return scaleFilter?.outputImage
}
```

My question is: why does it still keep crashing, and is there another way to reduce the CVPixelBuffer size without hitting the memory limit? I would appreciate any help on this. Swift or Objective-C, I am open to all suggestions.
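For reference, here is a minimal sketch of the pool-based downscale path described in the post. This is an illustration, not the poster's code: `downscale`, the shared `ciContext`, and the 0.5 scale factor are assumed names and values. Two details that tend to matter under an extension's memory cap are wrapping each frame's work in `autoreleasepool` (so Core Image temporaries don't accumulate between callbacks) and rendering into the buffer drawn from the pool rather than the incoming one.

```swift
import CoreImage
import CoreVideo

// Shared context; disabling intermediate caching reduces Core Image's
// working set, which matters under a broadcast extension's memory cap.
let ciContext = CIContext(options: [.cacheIntermediates: false])

// Sketch: scale one incoming frame into a buffer from `pool`.
// Assumes `pool` was created for the scaled output dimensions.
func downscale(_ input: CVPixelBuffer, into pool: CVPixelBufferPool) -> CVPixelBuffer? {
    return autoreleasepool {
        let image = CIImage(cvPixelBuffer: input)
            .transformed(by: CGAffineTransform(scaleX: 0.5, y: 0.5))
        var output: CVPixelBuffer?
        guard CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &output) == kCVReturnSuccess,
              let outBuffer = output else { return nil }
        // Render into the pool buffer, and hand *that* buffer on.
        ciContext.render(image, to: outBuffer)
        return outBuffer
    }
}
```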
Post marked as solved
1 Reply
545 Views
My code (the mediaURL.path is obtained from UIImagePickerControllerDelegate):

```swift
guard UIVideoEditorController.canEditVideo(atPath: mediaURL.path) else { return }
let editor = UIVideoEditorController()
editor.delegate = self
editor.videoPath = mediaURL.path
editor.videoMaximumDuration = 10
editor.videoQuality = .typeMedium
self.parentViewController.present(editor, animated: true)
```

The error description on the console is as below:

```
Video export failed for asset <AVURLAsset: 0x283c71940, URL = file:///private/var/mobile/Containers/Data/PluginKitPlugin/7F7889C8-20DB-4429-9A67-3304C39A0725/tmp/trim.EECE5B69-0EF5-470C-B371-141CE1008F00.MOV>: Error Domain=AVFoundationErrorDomain Code=-11800
```

It doesn't call func videoEditorController(_ editor: UIVideoEditorController, didFailWithError error: Error). After showing the error on the console, the UIVideoEditorController automatically dismisses itself. Am I doing something wrong, or is this a bug? Thank you in advance.
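One detail worth noting in the error above: the failing asset lives in the picker's PluginKitPlugin tmp directory, which may no longer be readable by the time the editor exports. A commonly suggested mitigation (not confirmed by this post) is to copy the picked file into the app's own temporary directory first and hand that path to the editor. A hedged sketch, where `copyToAppTemp` is an illustrative name:

```swift
import Foundation

// Copy a picker-provided video into the app's own tmp directory so the
// editor's export reads from a location the app container owns.
func copyToAppTemp(_ mediaURL: URL) throws -> URL {
    let dest = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension(mediaURL.pathExtension)
    try FileManager.default.copyItem(at: mediaURL, to: dest)
    return dest
}

// Usage sketch: editor.videoPath = try copyToAppTemp(mediaURL).path
```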
Post not yet marked as solved
20 Replies
13k Views
Hi! I recently bought the new iPhone 12 Pro Max. I have noticed that when I shoot videos in the dark (with the lights on in the house), there is some kind of flickering visible in the video. Apparently, because some lights flicker very fast, slow-mo videos can make visible a flicker that you cannot see with the naked eye. However, I have this problem with normal videos as well. I have compared it with the videos on my iPhone X, and it is definitely worse in my iPhone 12 videos. I noticed that this happens while recording video in HD (or 4K) at 60 fps; if you switch to 30 fps, it doesn't happen. Anyone else with this problem? It happens on iOS 14.2.1 and iOS 14.3 beta 2. Thanks!
Post not yet marked as solved
5 Replies
1.7k Views
I have a background process which is updating an IOSurface-backed CVPixelBuffer at 30 fps. I want to render a preview of that pixel buffer in my window, scaled to the size of the NSView that's displaying it. I get a callback every time the pixel buffer/IOSurface is updated. I've tried using a custom layer-backed NSView and setting the layer contents to the IOSurface, which works when the view is created, but it's never updated unless the window is resized or another window is in front of it. I've tried calling setNeedsDisplay() on both my view and my layer, I've tried changing the layerContentsRedrawPolicy to .onSetNeedsDisplay, and I've made sure all my content and update code is happening on the UI thread, but I can't get it to update dynamically. Is there a way to bind my layer or view to the IOSurface once and then just have it reflect the updates as they happen, or, if not, at least mark the layer as dirty each frame when it changes? I've pored over the docs, but I don't see much about the relationship between IOSurface and CALayer.contents, or about when in the lifecycle to mark things dirty (especially when updates are happening outside the view). Here's example code:

```swift
class VideoPreviewThumbnail: NSView, VideoFeedConsumer {
    let testCard = TestCardHelper()

    override var wantsUpdateLayer: Bool {
        get { return true }
    }

    required init?(coder decoder: NSCoder) {
        super.init(coder: decoder)
        self.wantsLayer = true
        self.layerContentsRedrawPolicy = .onSetNeedsDisplay

        /* Scale the incoming data to the size of the view */
        self.layer?.transform = CATransform3DMakeScale(
            (self.layer?.contentsScale)! * self.frame.width / CGFloat(VideoSettings.width),
            (self.layer?.contentsScale)! * self.frame.height / CGFloat(VideoSettings.height),
            CGFloat(1))

        /* Register us with the content provider */
        VideoFeedBrowser.instance.registerConsumer(self)
    }

    deinit {
        VideoFeedBrowser.instance.deregisterConsumer(self)
    }

    override func updateLayer() {
        /* ideally we wouldn't need to do this */
        updateLayer(pixelBuffer: VideoFeedBrowser.instance.renderer.pixelBuffer)
    }

    /* This gets called every time our pixelbuffer is updated (30fps) */
    @objc
    func updateFrame(pixelBuffer: CVPixelBuffer) {
        updateLayer(pixelBuffer: pixelBuffer)
    }

    func updateLayer(pixelBuffer: CVPixelBuffer) {
        guard let surface = CVPixelBufferGetIOSurface(pixelBuffer)?.takeUnretainedValue() else {
            print("pixelbuffer isn't IOSurface backed! noooooo!")
            return
        }
        /* these don't have any effect */
        // self.layer?.setNeedsDisplay()
        // self.setNeedsDisplay(invalidRect: self.visibleRect)
        self.layer?.contents = surface
    }
}
```
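One likely explanation for the symptom above: Core Animation notices a change to `layer.contents` only when the assigned object changes, so re-assigning the same IOSurface whose pixels were mutated in place is effectively a no-op. A common workaround is double-buffering, where the producer alternates between two surfaces so each assignment hands the layer a different object. A hedged sketch, with `surfaceA`/`surfaceB`/`currentIsA` as illustrative names:

```swift
import AppKit

// Present the most recently written of two surfaces. Alternating
// objects makes each `contents` assignment an actual change that
// Core Animation will commit.
func present(surfaceA: IOSurface, surfaceB: IOSurface,
             currentIsA: inout Bool, on layer: CALayer) {
    CATransaction.begin()
    CATransaction.setDisableActions(true)   // no implicit animation/fade
    layer.contents = currentIsA ? surfaceA : surfaceB
    CATransaction.commit()
    currentIsA.toggle()                     // producer writes the other one next
}
```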
Post not yet marked as solved
3 Replies
1k Views
So my timeline is this: I got a MacBook Pro 16" in March with these graphics options: AMD Radeon Pro 5500M 4 GB and Intel UHD Graphics 630 1536 MB. Up until 10.15.5 came out, I had zero problems/crashes, and I always have the laptop closed and an external display connected with an official Apple A/V adapter using HDMI. As soon as I installed 10.15.5, the panics started happening:

```
Reason: (1 monitored services unresponsive): checkin with service: WindowServer returned not alive with context: unresponsive work processor(s): WindowServer main thread
40 seconds since last successful checkin
```

Literally right after the update finished, I didn't touch the laptop for some time; the external monitor went to sleep and the laptop panicked and rebooted. I installed apps like Caffeine to prevent the external monitor from going to sleep and managed to continue working. Some days later, the crashes started happening even when the monitor was not going to sleep, usually when using apps that put some strain on the GPU, such as video conferencing apps, and they became more frequent. The display would freeze for about 2 minutes, the laptop would get very warm while the fans would not speed up; then the fans would go into turbo mode for about 1 second and the laptop would reboot.

After this I reverted to 10.15.4, reset the SMC, etc. The panics when the display goes to sleep are gone, but the crashes while I'm using the computer continue. I tried ditching the adapter and using a USB-C DisplayPort cable, but the problem remained. As a final test, I unplugged everything from the laptop and disabled "automatic graphics switching" to force the AMD GPU to be used even with no external display. Sure enough, I was able to reproduce the issue. So it seems not related to an external display, but to the AMD card itself (which is always used when an external display is connected). Sad times.