Post marked as unsolved
671 Views

What objects manage AVAssetWriterInputMediaDataRequester? I get a crash after markCurrentPassAsFinished because the call to its delegate property references a deallocated instance.

My goal is to write out a video file with custom-drawn content in each frame, using CGContext. After I call input.markCurrentPassAsFinished(), I get a crash:

[AVAssetWriterInputMediaDataRequester delegate]: message sent to deallocated instance 0x60c00025fa40

But AVAssetWriterInputMediaDataRequester is not a documented class, so I don't know which object isn't being kept around long enough.

Code:

```swift
import Foundation
import AVFoundation
import CoreMedia
import Cocoa

class VideoRenderer {
    // my demo data model
    let title: String
    let outputUrl: URL
    var progress: Progress
    let writer: AVAssetWriter
    let completion: (Bool) -> ()

    init(title: String, outputUrl: URL, completion: @escaping (Bool) -> ()) throws {
        self.title = title
        self.outputUrl = outputUrl
        self.completion = completion
        progress = Progress(parent: nil, userInfo: nil)
        writer = try AVAssetWriter(outputURL: outputUrl, fileType: .mp4)
        /*
        guard let pixelPool = CVPixelBufferPool.create(width: 1280, height: 720) else {
            throw PoolCreatorError.cantEven
        }
        self.pixelPool = pixelPool
        */
    }

    func export() {
        // must specify AVVideoCodecKey, AVVideoWidthKey, and AVVideoHeightKey
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720,
            AVVideoCompressionPropertiesKey: [
                AVVideoAverageBitRateKey: 2600000 as NSNumber,
                AVVideoExpectedSourceFrameRateKey: 30.0 as NSNumber
                // AVVideoAverageNonDroppableFrameRateKey: 30 as NSNumber // not supported for H.264
            ]
            // lots of additional keys related to colors
        ])
        input.mediaTimeScale = 30000 // to support 29.97 for NTSC
        writer.add(input)
        inputWriter = input
        let bufferAdapter = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input, sourcePixelBufferAttributes: [
            kCVPixelBufferCGBitmapContextCompatibilityKey as String: true,
            kCVPixelBufferWidthKey as String: 1280 as CFNumber,
            kCVPixelBufferHeightKey as String: 720 as CFNumber,
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB
        ])
        pixelBufferAdapter = bufferAdapter
        if !writer.startWriting() {
            completion(false)
            return
        }
        writer.startSession(atSourceTime: CMTime(seconds: 0.0, preferredTimescale: 30) /* CMTime(value: 0, timescale: 30) */)
        input.respondToEachPassDescription(on: DispatchQueue.global(qos: .userInitiated)) {
            guard let timeRange: CMTimeRange = input.currentPassDescription?.sourceTimeRanges.first?.timeRangeValue else {
                // we're done
                self.writer.finishWriting(completionHandler: {
                    self.didComplete()
                    // self.pixelBufferAdapter = nil
                })
                return
            }
            self.lastGeneratedTime = timeRange.start.seconds
            input.requestMediaDataWhenReady(on: DispatchQueue.global(qos: .userInitiated)) {
                while input.isReadyForMoreMediaData {
                    // get time
                    let thisTimeStamp = CMTime(seconds: self.lastGeneratedTime, preferredTimescale: input.mediaTimeScale)
                    if thisTimeStamp.seconds >= 2.0 { // compare to end of timeRange
                        input.markCurrentPassAsFinished()
                        return
                    }
                    let nextTime: Double = self.lastGeneratedTime + (1 / 30.0)
                    // print("thisTime = \(thisTimeStamp.seconds)")
                    // if over, cancel render
                    guard let buffer: CVPixelBuffer = bufferAdapter.pixelBufferDrawing({ context in
                        // TODO: draw frame
                        context.setFillColor(NSColor.blue.cgColor)
                        context.fill(CGRect(x: 0.0, y: 0.0, width: 1280.0, height: 720.0))
                    }) else {
                        continue
                    }
                    if !bufferAdapter.append(buffer, withPresentationTime: thisTimeStamp) {
                        print("didn't append")
                    }
                    self.lastGeneratedTime = nextTime
                }
            }
        }
        // end session?
    }

    // var pixelPool: CVPixelBufferPool

    func didComplete() {
        if writer.status != .completed, let error = writer.error {
            print("writer error = \(error)")
        }
        completion(writer.status == .completed)
    }

    var lastGeneratedTime: Double = 0.0
    var pixelBufferAdapter: AVAssetWriterInputPixelBufferAdaptor?
    var inputWriter: AVAssetWriterInput?
}

extension AVAssetWriterInputPixelBufferAdaptor {
    func pixelBufferDrawing(_ work: (CGContext) -> ()) -> CVPixelBuffer? {
        var bufferOrNil: CVPixelBuffer?
        guard let pool = pixelBufferPool,
              CVPixelBufferPoolCreatePixelBuffer(nil, pool, &bufferOrNil) == kCVReturnSuccess,
              let buffer = bufferOrNil else {
            return nil
        }
        let width: Int = CVPixelBufferGetWidth(buffer)
        let height: Int = CVPixelBufferGetHeight(buffer)
        CVPixelBufferLockBaseAddress(buffer, [])
        // create a CG graphics context using the pixel buffer's data bytes
        let data = CVPixelBufferGetBaseAddress(buffer)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        guard let context = CGContext(data: data,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                      space: rgbColorSpace,
                                      bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue) else {
            // unlock before bailing out so the buffer isn't left locked
            CVPixelBufferUnlockBaseAddress(buffer, [])
            return nil
        }
        work(context)
        CVPixelBufferUnlockBaseAddress(buffer, [])
        return buffer
    }
}
```
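For comparison, here is the single-pass variant I would fall back to. It avoids respondToEachPassDescription (and therefore the private requester machinery) entirely, using markAsFinished() instead of markCurrentPassAsFinished(). This is a sketch, not a confirmed fix: it reuses the pixelBufferDrawing extension above and assumes the writer, input, and adaptor are all held strongly (e.g. as properties) until the completion handler runs.

```swift
import AVFoundation

// Sketch: single-pass writing. `drawFrame` is a hypothetical callback I
// introduced for illustration; `pixelBufferDrawing` is the extension from
// the post. Callers must keep writer/input/bufferAdapter alive until
// `completion` fires.
func exportSinglePass(writer: AVAssetWriter,
                      input: AVAssetWriterInput,
                      bufferAdapter: AVAssetWriterInputPixelBufferAdaptor,
                      drawFrame: @escaping (CGContext, Double) -> Void,
                      completion: @escaping (Bool) -> Void) {
    guard writer.startWriting() else { completion(false); return }
    writer.startSession(atSourceTime: CMTime(seconds: 0.0, preferredTimescale: 30))
    var time = 0.0
    input.requestMediaDataWhenReady(on: DispatchQueue.global(qos: .userInitiated)) {
        while input.isReadyForMoreMediaData {
            if time >= 2.0 {
                // single-pass: markAsFinished, not markCurrentPassAsFinished
                input.markAsFinished()
                writer.finishWriting {
                    completion(writer.status == .completed)
                }
                return
            }
            let pts = CMTime(seconds: time, preferredTimescale: input.mediaTimeScale)
            if let buffer = bufferAdapter.pixelBufferDrawing({ context in
                drawFrame(context, time)
            }) {
                if !bufferAdapter.append(buffer, withPresentationTime: pts) {
                    print("didn't append")
                }
            }
            time += 1.0 / 30.0
        }
    }
}
```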
Post marked as unsolved
193 Views

NSImageView appears ghostly inside view-based NSTableView.

I have a custom-rendered CGImage, placed in an NSImage, placed in an NSImageView, which is a subview of a custom NSView subclass returned as the column's view for a view-based NSTableView. When I use the visual debug inspector (not the view debugger), I get the image I expect when I preview the CGImage, the NSImage, and the NSImageView, but when I preview the cell I get a "ghosted" image: the dark blacks appear as medium grays, and the saturated colors are pastels. I have checked the alphas, looked for secretly overlapping subviews, double-checked that my columns are not zero-width, and ensured my layout constraints make my image views the full size of my cell. What am I missing that could be causing the image to become faded out like that?
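To rule things out systematically, this is the kind of debugging walk I'd run over the cell's view tree (a sketch of my own, not code from the project). One guess worth checking: NSImageView is an NSControl, and a disabled control can draw its content dimmed.

```swift
import Cocoa

// Debugging sketch: recursively print the properties that commonly cause
// washed-out rendering (view alpha, hidden state, control enablement,
// backing-layer opacity).
func dumpFadeSuspects(_ view: NSView, indent: String = "") {
    var extras = ""
    if let control = view as? NSControl {
        extras += " isEnabled=\(control.isEnabled)"
    }
    if let layer = view.layer {
        extras += " layerOpacity=\(layer.opacity)"
    }
    print("\(indent)\(type(of: view)) alphaValue=\(view.alphaValue) hidden=\(view.isHidden)\(extras)")
    for subview in view.subviews {
        dumpFadeSuspects(subview, indent: indent + "  ")
    }
}
```

Calling this with the cell view returned for the column would show whether any ancestor or sibling is introducing the fade.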
Post marked as unsolved
744 Views

How do I properly use CGImage.masking(_ mask: CGImage) -> CGImage??

On iOS, I need to use an image resource as a mask to create a tinted image. I cannot use CALayer or UIView tinting or masking capabilities. I'm trying to use CGImage.masking(_ mask: CGImage) -> CGImage?; however, with the following code in an iOS Swift Playground, all I get from the mask is the same square solid-color image.

I have already tried changing maskUiImage from a black image with an alpha mask to a white image on a black background with no alpha, and to a white image on a black background; none of these changes affected the output of the final image, though they do affect the output of uiMask.

```swift
import UIKit
import CoreGraphics

extension CGImage {
    public class func make(size: CGSize, scale: CGFloat, drawing: (CGContext) -> ()) -> CGImage {
        UIGraphicsBeginImageContextWithOptions(size, false, scale)
        let bitmapContext: CGContext = UIGraphicsGetCurrentContext()!
        drawing(bitmapContext)
        let image = bitmapContext.makeImage()!
        UIGraphicsEndImageContext()
        return image
    }
}

// provide your own masking image literal
let maskUiImage: UIImage = #imageLiteral(resourceName: "si_bolt_solid_opaque_black@3x.png")
let maskImage: CGImage = maskUiImage.cgImage!
let imageMask: CGImage = CGImage(maskWidth: maskImage.width,
                                 height: maskImage.height,
                                 bitsPerComponent: maskImage.bitsPerComponent,
                                 bitsPerPixel: maskImage.bitsPerPixel,
                                 bytesPerRow: maskImage.bytesPerRow,
                                 provider: maskImage.dataProvider!,
                                 decode: nil,
                                 shouldInterpolate: false)!
let uiMask = UIImage(cgImage: imageMask, scale: maskUiImage.scale, orientation: .up)

let unmaskedImage: CGImage = CGImage.make(size: maskUiImage.size, scale: maskUiImage.scale) { context in
    context.setFillColor(UIColor.green.cgColor)
    context.fill(CGRect(origin: .zero, size: maskUiImage.size))
}
let uiUnmaskedImage = UIImage(cgImage: unmaskedImage, scale: maskUiImage.scale, orientation: .up)

guard let maskedImage: CGImage = unmaskedImage.masking(imageMask) else {
    fatalError("new image was nil")
}
let uiMaskedImage: UIImage = UIImage(cgImage: maskedImage, scale: maskUiImage.scale, orientation: .up)
```

The problem is that uiUnmaskedImage and uiMaskedImage are identical. uiUnmaskedImage is as expected: a green square the size of my mask image. However, unexpectedly and undesirably, uiMaskedImage is also a green square. I want the masked image to be transparent where I don't provide the passthrough color in the original, and my color selected in the CGImage.make(size:...) function where I do provide the passthrough color in the original masking image. I can ask our designer to configure the original mask image any way it needs to be, but I've already tried several combinations with no impact on the final result.
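One avenue I'd explore (a guess, not a confirmed diagnosis): building CGImage(maskWidth:...) directly from an RGBA image's dataProvider may yield a malformed mask, since Quartz image masks are expected to be grayscale with no alpha. Redrawing the mask artwork into an 8-bit DeviceGray, alpha-free context first is one way to get a well-formed mask. The function name below is mine, introduced for illustration.

```swift
import UIKit

// Sketch: redraw any source image into an 8-bit grayscale, alpha-free
// bitmap, then build a proper Quartz image mask from that.
func grayscaleMask(from source: CGImage) -> CGImage? {
    let width = source.width
    let height = source.height
    guard let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: width,
                                  space: CGColorSpaceCreateDeviceGray(),
                                  bitmapInfo: CGImageAlphaInfo.none.rawValue) else {
        return nil
    }
    context.draw(source, in: CGRect(x: 0, y: 0, width: width, height: height))
    guard let gray = context.makeImage() else { return nil }
    // In a Quartz image mask, black (0) lets the masked image show through
    // and white blocks it.
    return CGImage(maskWidth: gray.width,
                   height: gray.height,
                   bitsPerComponent: gray.bitsPerComponent,
                   bitsPerPixel: gray.bitsPerPixel,
                   bytesPerRow: gray.bytesPerRow,
                   provider: gray.dataProvider!,
                   decode: nil,
                   shouldInterpolate: false)
}
```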
Post marked as solved
788 Views

AVAudioSession missing from iOS 10 SDK?

Using Xcode 8 GM, I wrote a small iOS app for iOS 10 which makes use of AVAudioSession. I built it for the simulator and it works as expected. However, when I try building for device (iOS 10.0.1), the compiler tells me that it doesn't recognize AVAudioSession. AVAudioSession is listed in the documentation, and I don't see a deprecation or removal notice. However, when I looked for it in the frameworks, AVFoundation does not list it. Drilling into AVFoundation.AVFAudio does list an AVFoundation.AVFAudio.AVAudioSession, but clicking into that merely shows:

```swift
import AVFoundation
import Foundation
```

I looked in plain old AVFoundation for it, but didn't find an audio session. I switched back to building for the simulator, and AVFoundation.AVFAudio.AVAudioSession now contains a class for AVAudioSession. I don't see a "macOS only" availability listed, so I'm wondering what gives.

I removed the AVAudioSession code and ran the app. It "worked", but not as well as it could have if I had been able to set the correct session category and mode. It's as if the class is present in the simulator SDK but not in the actual device SDK.

Is anyone else seeing this problem? Is it a bug, or did AVAudioSession actually get removed? Do I maybe have a bad download?
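For reference, the kind of call that fails to compile for the device target is just standard session setup along these lines (the category and mode here are examples, not necessarily the ones from my app):

```swift
import AVFoundation

// Swift 3-style AVAudioSession configuration.
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(AVAudioSessionCategoryPlayback, with: [])
    try session.setMode(AVAudioSessionModeDefault)
    try session.setActive(true)
} catch {
    print("audio session configuration failed: \(error)")
}
```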
Post marked as solved
425 Views

CTFrameGetLineOrigins useless in Swift 3?

Implicitly-unwrapped optionals are enough of an anti-pattern that Swift 3 implicitly converts them to other types. UnsafePointers are a throwback to an older language. Yet CTFrameGetLineOrigins requires an UnsafeMutablePointer<CGPoint>!, which leaves a big question: how do I even create an implicitly unwrapped unsafe pointer?

Obviously, the need to have an unsafe pointer in the first place is an anti-pattern as well. So what I'd want is either

var lineOrigins: [CGPoint] // computed property

or

func lineOrigins(inRange: CFRange = default) -> [CGPoint]

But as it stands, is this API basically useless, or am I missing something? I spent an hour trying to figure out how to create the argument and haven't figured it out. Frankly, I don't think I should have to.
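It turns out a Swift array can be passed where an UnsafeMutablePointer is expected by prefixing it with &, so one way to call this (a sketch, assuming a CTFrame named frame already exists) is:

```swift
import CoreText

// Sketch: obtaining all line origins from an existing CTFrame (`frame`).
let lineCount = CFArrayGetCount(CTFrameGetLines(frame))
var origins = [CGPoint](repeating: .zero, count: lineCount)
// A zero-length range means "all lines"; `&origins` bridges the array
// to the UnsafeMutablePointer<CGPoint>! parameter.
CTFrameGetLineOrigins(frame, CFRange(location: 0, length: 0), &origins)
```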
Post marked as unsolved
329 Views

Does la_solve not attempt least-squares solutions?

According to the LinearAlgebra documentation for la_solve:

"If the matrix is not square, we return a least-squares solution computed by performing a QR factorization of the matrix. If the number of rows of the matrix does not match the number of rows of the right hand side object, the returned object has status LA_DIMENSION_MISMATCH_ERROR."

However, my tests indicate that la_solve returns LA_DIMENSION_MISMATCH_ERROR even when set up for a least-squares solution (i.e. the number of rows of the matrix and the vector match). Please provide working sample code for a least-squares computation, preferably in Swift 3.
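For reference, this is the kind of setup I'm testing: an overdetermined system with matching row counts, which per the documentation should be resolved via QR least squares (a sketch; the data values are arbitrary):

```swift
import Accelerate

// Overdetermined system A·x ≈ b: 3 equations, 2 unknowns, row-major storage.
let aData: [Double] = [1, 1,
                       1, 2,
                       1, 3]          // 3x2 matrix
let bData: [Double] = [6, 0, 0]       // 3x1 right-hand side
let A = la_matrix_from_double_buffer(aData, 3, 2, 2,
                                     la_hint_t(LA_NO_HINT),
                                     la_attribute_t(LA_DEFAULT_ATTRIBUTES))
let b = la_matrix_from_double_buffer(bData, 3, 1, 1,
                                     la_hint_t(LA_NO_HINT),
                                     la_attribute_t(LA_DEFAULT_ATTRIBUTES))
let x = la_solve(A, b)
// In my tests this status comes back as LA_DIMENSION_MISMATCH_ERROR
// instead of LA_SUCCESS.
print("status: \(la_status(x))")
var solution = [Double](repeating: 0, count: 2)
la_matrix_to_double_buffer(&solution, 1, x)
print(solution)
```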