All the examples I've found of using AVVideoCompositionCoreAnimationTool with AVAssetExportSession set the session's properties to match a preexisting video track passed as the asset in the session's init method. I need to export a CALayer with animations without a preexisting video track. I know I need to use `AVVideoCompositionCoreAnimationTool(additionalLayer: ..., asTrackID: ...)`, though I don't know what I'm supposed to do with the track ID, or where I'm supposed to get it. I figured out I need to initialize the session with an AVMutableComposition (which took hours to figure out), and I expect I need to add a video track to that, but again I don't know what to do with the track ID.

So I'm still stuck when I begin exporting, with this error:

session error = Error Domain=AVFoundationErrorDomain Code=-11822 "Cannot Open" UserInfo={NSLocalizedFailureReason=This media format is not supported., NSLocalizedDescription=Cannot Open, NSUnderlyingError=0x604000259c20 {Error Domain=NSOSStatusErrorDomain Code=-16976 "(null)"}}

The output file is never created. I tried fiddling with the session's presets, but I haven't found one that produces a different error yet.

Is there a recipe somewhere for how to use these two classes together, without a preexisting video background track? I'd like my users to be able to set the frame size and frame rate (perhaps from a popup list of predetermined options, if necessary). I'm starting with macOS.
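Here's roughly the setup I've been attempting, as a sketch rather than working code. My understanding (which may be wrong) is that the track ID passed to `asTrackID:` is simply a number I invent that must not collide with any real track's ID, and the composition still needs a placeholder video track to give the export a nonzero duration; the empty-time-range trick below is my own guess and may itself be the source of the -11822:

```swift
import AVFoundation
import QuartzCore

// "animationTrackID" is an arbitrary ID I invented; it just has to avoid
// colliding with the ID of any real track in the composition.
let animationTrackID = CMPersistentTrackID(2)
let duration = CMTime(seconds: 5, preferredTimescale: 600)

let composition = AVMutableComposition()
// Placeholder track so the composition has a duration; some posts suggest
// inserting a real blank clip here instead of an empty range.
let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                             preferredTrackID: kCMPersistentTrackID_Invalid)
videoTrack?.insertEmptyTimeRange(CMTimeRange(start: .zero, duration: duration))

let animationLayer = CALayer()
animationLayer.frame = CGRect(x: 0, y: 0, width: 1280, height: 720)
// ...add animated sublayers here...

let videoComposition = AVMutableVideoComposition()
videoComposition.renderSize = CGSize(width: 1280, height: 720)   // user-selectable frame size
videoComposition.frameDuration = CMTime(value: 1, timescale: 30) // user-selectable frame rate
videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(additionalLayer: animationLayer,
                                                                     asTrackID: animationTrackID)

// An instruction whose layer instruction references the invented track ID,
// which (I believe) is how the animation tool's layer gets composited.
let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRange(start: .zero, duration: duration)
let layerInstruction = AVMutableVideoCompositionLayerInstruction()
layerInstruction.trackID = animationTrackID
instruction.layerInstructions = [layerInstruction]
videoComposition.instructions = [instruction]

let session = AVAssetExportSession(asset: composition, presetName: AVAssetExportPreset1280x720)
session?.videoComposition = videoComposition
session?.outputURL = URL(fileURLWithPath: "/tmp/animation.mp4") // placeholder path
session?.outputFileType = .mp4
session?.exportAsynchronously {
    // inspect session?.status and session?.error here
}
```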
I'd like to have a Core Data store in a shared app group container, accessible to both my app and an extension. Since iOS 8.2, file coordination supports that file setup, but I don't see any classes in the Core Data stack that conform to the NSFilePresenter protocol or own an NSFileCoordinator. While most of the time I expect the app to be dormant while the extension is running, there will be brief moments of important crossover where a user starts in the extension but has to pull up the app to continue.

Or do Core Data stores already handle cross-process syncing in some other way?
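For reference, here's how I'm standing up the stack in both processes. The group identifier, model name, and store filename are placeholders, and the comment reflects my current (unconfirmed) understanding of how the SQLite store copes with two processes:

```swift
import CoreData

// My current understanding (unconfirmed): the SQLite store type doesn't use
// NSFilePresenter/NSFileCoordinator; file-level safety comes from SQLite's
// own locking and WAL journaling (the default journal mode).
let groupID = "group.com.example.myapp"   // placeholder app group identifier
let modelURL = Bundle.main.url(forResource: "Model", withExtension: "momd")!
let model = NSManagedObjectModel(contentsOf: modelURL)!
let coordinator = NSPersistentStoreCoordinator(managedObjectModel: model)

let storeURL = FileManager.default
    .containerURL(forSecurityApplicationGroupIdentifier: groupID)!
    .appendingPathComponent("Shared.sqlite")
try! coordinator.addPersistentStore(ofType: NSSQLiteStoreType,
                                    configurationName: nil,
                                    at: storeURL,
                                    options: nil)

// Both the app and the extension build their own stack against the same URL.
// Neither sees the other's in-memory changes, so after the user hops from the
// extension to the app, the app's contexts would need to be refreshed/refetched.
```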
My goal is to write out a video file with custom-drawn content in each frame, using CGContext. I'm having an issue where, after I hit input.markCurrentPassAsFinished(), I get a crash:

[AVAssetWriterInputMediaDataRequester delegate]: message sent to deallocated instance 0x60c00025fa40

But AVAssetWriterInputMediaDataRequester is not a documented class, so I don't know what object isn't being kept around long enough.

Code:

```swift
import Foundation
import AVFoundation
import CoreMedia
import Cocoa

class VideoRenderer {
    //my demo data model
    let title:String
    let outputUrl:URL
    var progress:Progress
    let writer : AVAssetWriter
    let completion:(Bool)->()

    init(title:String, outputUrl:URL, completion:@escaping(Bool)->()) throws {
        self.title = title
        self.outputUrl = outputUrl
        self.completion = completion
        progress = Progress(parent: nil, userInfo: nil)
        writer = try AVAssetWriter(outputURL: outputUrl, fileType: .mp4)
        /*
        guard let pixelPool = CVPixelBufferPool.create(width: 1280, height: 720) else {
            throw PoolCreatorError.cantEven
        }
        self.pixelPool = pixelPool
        */
    }

    func export() {
        //must specify AVVideoCodecKey, AVVideoWidthKey, and AVVideoHeightKey
        //AVVideoCodecKey = AVVideoCodecTypeH264
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey : AVVideoCodecType.h264,
            AVVideoWidthKey : 1280,
            AVVideoHeightKey : 720,
            AVVideoCompressionPropertiesKey : [
                AVVideoAverageBitRateKey : 2600000 as NSNumber,
                AVVideoExpectedSourceFrameRateKey : 30.0 as NSNumber
                // AVVideoAverageNonDroppableFrameRateKey : 30 as NSNumber //not supported for H.264
            ]
            //lots of additional keys related to colors
        ])
        input.mediaTimeScale = 30000 //to support 29.97 for NTSC
        writer.add(input)
        inputWriter = input
        let bufferAdapter = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input, sourcePixelBufferAttributes: [
            kCVPixelBufferCGBitmapContextCompatibilityKey as String : true,
            kCVPixelBufferWidthKey as String : 1280 as CFNumber,
            kCVPixelBufferHeightKey as String : 720 as CFNumber,
            kCVPixelBufferPixelFormatTypeKey as String : kCVPixelFormatType_32ARGB
        ])
        pixelBufferAdapter = bufferAdapter
        if !writer.startWriting() {
            completion(false)
            return
        }
        writer.startSession(atSourceTime: CMTime(seconds: 0.0, preferredTimescale: 30) /*CMTime(value: 0, timescale: 30)*/)
        input.respondToEachPassDescription(on: DispatchQueue.global(qos: .userInitiated)) {
            guard let timeRange:CMTimeRange = input.currentPassDescription?.sourceTimeRanges.first?.timeRangeValue else {
                //we're done
                self.writer.finishWriting(completionHandler: {
                    self.didComplete()
                    //self.pixelBufferAdapter = nil
                })
                return
            }
            self.lastGeneratedTime = timeRange.start.seconds
            input.requestMediaDataWhenReady(on: DispatchQueue.global(qos: .userInitiated)) {
                while input.isReadyForMoreMediaData {
                    //get time
                    let thisTimeStamp = CMTime(seconds: self.lastGeneratedTime, preferredTimescale: input.mediaTimeScale)
                    if thisTimeStamp.seconds >= 2.0 { //compare to end of timeRange
                        input.markCurrentPassAsFinished()
                        return
                    }
                    let nextTime:Double = self.lastGeneratedTime + (1/30.0)
                    //print("thisTime = \(thisTimeStamp.seconds)")
                    //if over, cancel render
                    guard let buffer:CVPixelBuffer = bufferAdapter.pixelBufferDrawing({ context in
                        //TODO: draw frame
                        context.setFillColor(NSColor.blue.cgColor)
                        context.fill(CGRect(x: 0.0, y: 0.0, width: 1280.0, height: 720.0))
                    }) else {
                        continue
                    }
                    if !bufferAdapter.append(buffer, withPresentationTime: thisTimeStamp) {
                        print("didn't append")
                    }
                    self.lastGeneratedTime = nextTime
                }
            }
        }
        //endsession?
    }

    //var pixelPool:CVPixelBufferPool

    func didComplete() {
        if writer.status != .completed, let error = writer.error {
            print("writer error = \(error)")
        }
        completion(writer.status == .completed)
    }

    var lastGeneratedTime:Double = 0.0
    var pixelBufferAdapter:AVAssetWriterInputPixelBufferAdaptor?
    var inputWriter:AVAssetWriterInput?
}

extension AVAssetWriterInputPixelBufferAdaptor {
    func pixelBufferDrawing(_ work:(CGContext)->()) -> CVPixelBuffer? {
        var bufferOrNil:CVPixelBuffer?
        guard let pool = pixelBufferPool,
              CVPixelBufferPoolCreatePixelBuffer(nil, pool, &bufferOrNil) == kCVReturnSuccess,
              let buffer = bufferOrNil else {
            return nil
        }
        let width:Int = CVPixelBufferGetWidth(buffer)
        let height:Int = CVPixelBufferGetHeight(buffer)
        CVPixelBufferLockBaseAddress(buffer, [])
        //create a cg graphics context using the pixel buffer's data bytes
        let data = CVPixelBufferGetBaseAddress(buffer)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        guard let context = CGContext(data: data, width: width, height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                      space: rgbColorSpace,
                                      bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue) else {
            return nil
        }
        work(context)
        CVPixelBufferUnlockBaseAddress(buffer, [])
        return buffer
    }
}
```
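One guess I have is that nothing in my calling code retains the VideoRenderer while the export runs asynchronously, so the writer/input (and whatever private object AVFoundation hangs off them) could be torn down mid-flight. This is how I'd keep it alive until completion; the holder array and names are just illustrative, and I'm not certain this is the actual cause of the crash:

```swift
// Illustrative sketch: keep a strong reference to the renderer for the whole
// lifetime of the asynchronous export, releasing it only in the completion.
var activeRenderers = [VideoRenderer]()   // e.g. a property on a controller

func startExport(to url: URL) throws {
    var renderer: VideoRenderer? = nil
    renderer = try VideoRenderer(title: "demo", outputUrl: url) { success in
        print("finished: \(success)")
        // drop the strong reference only after the writer has finished
        if let r = renderer {
            activeRenderers.removeAll { $0 === r }
        }
    }
    activeRenderers.append(renderer!)
    renderer?.export()
}
```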
I have a custom-rendered CGImage, placed in an NSImage, placed in an NSImageView, which is a subview of a custom NSView subclass returned as the column's view for a view-based NSTableView. When I use the visual debug inspector (not the view debugger), I get the image I expect when I preview the CGImage, the NSImage, and the NSImageView, but when I preview the cell I get a "ghosted" image: the dark blacks appear as medium grays, and the saturated colors are pastels. I have checked the alphas, looked for hidden overlapping subviews, double-checked that my columns are not zero-width, and ensured my layout constraints make my image views the full size of my cell. What am I missing that could be causing the image to become faded out like that?
On iOS, I need to use an image resource as a mask to create a tinted image. I cannot use CALayer or UIView tinting or masking capacities. I'm trying to use CGImage.masking(_ mask:CGImage)->CGImage?; however, with the following code in an iOS Swift Playground, all I get from the mask is the same square solid-color image.

I have already tried changing the "maskUiImage" from a black image with an alpha mask to both a white image on a black background with no alpha and a white image on a black background; none of these changes affected the output of the final image, though they do affect the output of "uiMask".

```swift
import UIKit
import CoreGraphics

extension CGImage {
    public class func make(size:CGSize, scale:CGFloat, drawing:(CGContext)->()) -> CGImage {
        UIGraphicsBeginImageContextWithOptions(size, false, scale)
        let bitmapContext:CGContext = UIGraphicsGetCurrentContext()!
        drawing(bitmapContext)
        let image = bitmapContext.makeImage()!
        UIGraphicsEndImageContext()
        return image
    }
}

CGBitmapInfo.alphaInfoMask.rawValue

let maskUiImage:UIImage = #imageLiteral(resourceName: "si_bolt_solid_opaque_black@3x.png") //provide your own masking image literal
let maskImage:CGImage = maskUiImage.cgImage!
let imageMask:CGImage = CGImage(maskWidth: maskImage.width,
                                height: maskImage.height,
                                bitsPerComponent: maskImage.bitsPerComponent,
                                bitsPerPixel: maskImage.bitsPerPixel,
                                bytesPerRow: maskImage.bytesPerRow,
                                provider: maskImage.dataProvider!,
                                decode: nil,
                                shouldInterpolate: false)!
let uiMask = UIImage(cgImage:imageMask, scale:maskUiImage.scale, orientation:.up)

let unmaskedImage:CGImage = CGImage.make(size: maskUiImage.size, scale: maskUiImage.scale) { (context) in
    context.setFillColor(UIColor.green.cgColor)
    context.fill(CGRect(origin:.zero, size:maskUiImage.size))
}
let uiUnmaskedImage = UIImage(cgImage: unmaskedImage, scale:maskUiImage.scale, orientation:.up)

guard let maskedImage:CGImage = unmaskedImage.masking(imageMask) else {
    fatalError("new image was nil")
}
let uiMaskedImage:UIImage = UIImage(cgImage:maskedImage, scale:maskUiImage.scale, orientation:.up)
```

The problem is that "uiUnmaskedImage" and "uiMaskedImage" are identical. "uiUnmaskedImage" is, as expected, a green square the size of my mask image. However, unexpectedly and undesirably, "uiMaskedImage" is also a green square. I want the masked image to be transparent where I don't provide the passthrough color in the original masking image, and my color selected in the "CGImage.make(size:..." function where I do. I can ask our designer to configure the original mask image any way it needs to be, but I've already tried several combinations with no impact on the final result.
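My current suspicion, based on reading the CGImage(maskWidth:...) docs, is that the initializer interprets the provider's bytes as grayscale mask samples, so handing it an RGBA PNG's data provider doesn't produce a usable mask. This untested sketch is the workaround I'm planning to try: flatten the art into a plain 8-bit grayscale context (no alpha) first, then wrap that buffer as the mask (the function name is my own):

```swift
import UIKit

// Sketch: build a real image mask from arbitrary art by redrawing it as
// 8-bit grayscale without alpha. For a CGImage mask, sample value 0 (black)
// lets the source show through and 1 (white) blocks it, as I understand it.
func makeImageMask(from source: CGImage) -> CGImage? {
    let width = source.width
    let height = source.height
    guard let grayContext = CGContext(data: nil, width: width, height: height,
                                      bitsPerComponent: 8, bytesPerRow: 0,
                                      space: CGColorSpaceCreateDeviceGray(),
                                      bitmapInfo: CGImageAlphaInfo.none.rawValue)
    else { return nil }
    // white background so transparent regions of the art end up masked out
    grayContext.setFillColor(gray: 1, alpha: 1)
    grayContext.fill(CGRect(x: 0, y: 0, width: width, height: height))
    grayContext.draw(source, in: CGRect(x: 0, y: 0, width: width, height: height))

    guard let grayImage = grayContext.makeImage(),
          let provider = grayImage.dataProvider else { return nil }
    // now the provider really does hold one grayscale byte per pixel
    return CGImage(maskWidth: width, height: height,
                   bitsPerComponent: 8, bitsPerPixel: 8,
                   bytesPerRow: grayImage.bytesPerRow,
                   provider: provider, decode: nil, shouldInterpolate: false)
}
```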