How do I use the SW API on my iPhone to get an instant image from an external USB webcam?
Hey guys, I tried to follow the (rather confusing) doc on this, but no luck yet:
https://developer.apple.com/av-foundation/Incorporating-HDR-video-with-Dolby-Vision-into-your-apps.pdf
I have code that uses AVAssetReader and AVAssetReaderTrackOutput to pull frames directly from a video, but the colors come out wrong for HDR Dolby Vision videos. What I want is to extract frames from an HDR Dolby Vision video as images and have those images come out with correct colors. I don't care if they are only 8 bits per color instead of 10 and miss all the new stuff; the closest that old-fashioned 8-bit-per-channel supports is fine.
I added the statement marked // added for dolby hdr (spread across several lines) per the doc above, but no luck, still bad colors.
Any hints on what I am missing?
NSMutableDictionary *dictionary = [[NSMutableDictionary alloc] init];
dictionary[(NSString *)kCVPixelBufferPixelFormatTypeKey] = @(kCVPixelFormatType_32BGRA);
// added for dolby hdr
dictionary[AVVideoColorPropertiesKey] = @{
    AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_709_2,
    AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_709_2,
    AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_709_2
};
AVAssetReaderTrackOutput *asset_reader_output = [[AVAssetReaderTrackOutput alloc] initWithTrack:video_track outputSettings:dictionary];
[asset_reader addOutput:asset_reader_output];
// from here we get sample buffers like this
CMSampleBufferRef buffer2 = [asset_reader_output copyNextSampleBuffer];
// then the pixel buffer
CVPixelBufferRef inputPixelBuffer = CMSampleBufferGetImageBuffer(buffer2);
// then a CIImage
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:inputPixelBuffer]; // one video frame
From there we use the standard CIContext machinery to convert that to a CGImage/UIImage.
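For reference, the same approach can be sketched in Swift: asking AVAssetReaderTrackOutput for 8-bit BGRA output with BT.709 color properties, so the reader tone-maps HDR frames down to SDR. This is a minimal sketch, not a confirmed fix; the file path is a placeholder assumption.

```swift
import AVFoundation
import CoreImage
import CoreVideo

// Sketch: read frames from an HDR video as SDR (BT.709) CIImages.
func extractSDRFrames(from url: URL) throws -> [CIImage] {
    let asset = AVURLAsset(url: url)
    let reader = try AVAssetReader(asset: asset)
    guard let track = asset.tracks(withMediaType: .video).first else { return [] }

    // Request 8-bit BGRA frames converted into BT.709 (SDR) color.
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        AVVideoColorPropertiesKey: [
            AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_709_2,
            AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_709_2,
            AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_709_2
        ]
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    reader.startReading()

    var frames: [CIImage] = []
    while let sample = output.copyNextSampleBuffer(),
          let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
        frames.append(CIImage(cvPixelBuffer: pixelBuffer)) // one SDR video frame
    }
    return frames
}
```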
I created a style transfer model using CreateML and cannot save the generated stylized image to the temporary directory. I'm not sure whether it has to do with the way I create the pixel buffer (below):
import Vision
import CoreML
import CoreVideo
import CoreImage

let model = style1()

// input size of the model
let modelInputSize = CGSize(width: 512, height: 512)

// create a CVPixelBuffer
var pixelBuffer: CVPixelBuffer?
let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
             kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary
CVPixelBufferCreate(kCFAllocatorDefault,
                    Int(modelInputSize.width),
                    Int(modelInputSize.height),
                    kCVPixelFormatType_32BGRA,
                    attrs,
                    &pixelBuffer)

// render the source image into the pixel buffer
let context = CIContext()
let argPathUrl = "file:///pathhere"
let modelImageUrl = URL(string: argPathUrl)!
guard let ciImageData = CIImage(contentsOf: modelImageUrl) else { return }
context.render(ciImageData, to: pixelBuffer!)

// predict the stylized image
let output = try? model.prediction(image: pixelBuffer!)
let predImage = CIImage(cvPixelBuffer: (output?.stylizedImage)!)

// write it to the temporary directory as a PNG
let context2 = CIContext()
let format = CIFormat.RGBA16 // Swift spelling of kCIFormatRGBA16
try! context2.writePNGRepresentation(of: predImage,
                                     to: FileManager.default.temporaryDirectory.appendingPathComponent("testcgi.png"),
                                     format: format,
                                     colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!,
                                     options: [:])
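One thing the code above never does is check whether the pixel buffer was actually created: CVPixelBufferCreate can fail silently and the later force-unwraps would then crash or misbehave. A hedged sketch of creating the buffer with the CVReturn status checked (the helper name is mine, not part of the original code):

```swift
import CoreVideo

// Hypothetical helper: create a BGRA pixel buffer, checking the
// CVReturn status instead of force-unwrapping later.
func makePixelBuffer(width: Int, height: Int) -> CVPixelBuffer? {
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_32BGRA, attrs, &buffer)
    guard status == kCVReturnSuccess else {
        print("CVPixelBufferCreate failed with status \(status)")
        return nil
    }
    return buffer
}
```

Similarly, wrapping the prediction and writePNGRepresentation calls in do/catch instead of try?/try! would surface the actual error that prevents the save.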
Thread 0 name: Dispatch queue: com.apple.main-thread
Thread 0 Crashed:
0 CoreFoundation 0x000000019d415ec8 0x19d371000 + 675528
1 CoreVideo 0x00000001a5a3d38c 0x1a5a2f000 + 58252
2 CoreVideo 0x00000001a5a3e498 0x1a5a2f000 + 62616
3 MirAIe 0x0000000103d7620c 0x102968000 + 21029388
4 MirAIe 0x0000000103beb76c 0x102968000 + 19412844
5 libdispatch.dylib 0x000000019d085a84 0x19d083000 + 10884
6 libdispatch.dylib 0x000000019d08781c 0x19d083000 + 18460
7 libdispatch.dylib 0x000000019d095c70 0x19d083000 + 76912
8 CoreFoundation 0x000000019d414340 0x19d371000 + 668480
9 CoreFoundation 0x000000019d40e218 0x19d371000 + 643608
10 CoreFoundation 0x000000019d40d308 0x19d371000 + 639752
11 GraphicsServices 0x00000001b4a90734 0x1b4a8d000 + 14132
12 UIKitCore 0x000000019fe8b75c 0x19f2c1000 + 12363612
13 UIKitCore 0x000000019fe90fcc 0x19f2c1000 + 12386252
14 MirAIe 0x00000001029818a4 0x102968000 + 104612
15 libdyld.dylib 0x000000019d0c9cf8 0x19d0c8000 + 7416
Thread 0 crashed with ARM Thread State (64-bit):
x0: 0x0000000281da30c0 x1: 0x0000000000000000 x2: 0x0000000281da30c0 x3: 0x00000001acafa188
x4: 0x00000000000062dc x5: 0x00000000fffffffe x6: 0x000000016d495f34 x7: 0x000000016d495f28
x8: 0x0000000000000000 x9: 0x0000000100000053 x10: 0x00006e0105ac30c0 x11: 0x007ffffffffffff8
x12: 0x0000000000000055 x13: 0x0000000106992330 x14: 0x00000000f781f800 x15: 0x0000000104bb5a00
x16: 0x00006e0105ac30c0 x17: 0x0000000105ac30c0 x18: 0x0000000110530abb x19: 0x0000000281da30c0
x20: 0x0000000000000000 x21: 0x0000000283614040 x22: 0x00000002839b9080 x23: 0x0000000000000114
x24: 0x0000000000000000 x25: 0x000000010572f9a0 x26: 0x000000000000000f x27: 0x0000000000000000
x28: 0x0000000002ffffff fp: 0x000000016d496970 lr: 0xbf283781a5a3d38c
sp: 0x000000016d496970 pc: 0x000000019d415ec8 cpsr: 0x20000000
esr: 0xf200c472 Address size fault
Hello, I hope you are well.
I am developing a hybrid application. The app itself is web-based, and the problem I have is that iOS does not display its videos in Safari, although Google Chrome does. When I build the app for iOS, it does not display the videos either; I don't know whether that is the same problem as in Safari. To make the application hybrid I am using Capacitor (capacitor/core).
I am developing a hybrid app in JavaScript and HTML, and I use Capacitor to build it in Xcode. The problem is that my app includes videos, and I cannot block the native iOS fullscreen player; I want playback to stay inline. I found this:
webView.configuration.allowsInlineMediaPlayback = true
The problem is that it only works on iPad, not on iPhone.
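Two details matter here, as far as I can tell: allowsInlineMediaPlayback lives on WKWebViewConfiguration and must be set before the WKWebView is created (changing it on an existing web view has no effect), and on iPhone the page's video elements also need the playsinline HTML attribute. A hedged sketch:

```swift
import WebKit

// allowsInlineMediaPlayback must be set on the configuration
// *before* the WKWebView is created; setting it afterwards is ignored.
let configuration = WKWebViewConfiguration()
configuration.allowsInlineMediaPlayback = true
// Optionally let videos start without a user gesture:
configuration.mediaTypesRequiringUserActionForPlayback = []

let webView = WKWebView(frame: .zero, configuration: configuration)
// Note: the page's <video> tags also need the `playsinline`
// attribute, or iPhone will still open the fullscreen player.
```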
I have read that as of macOS 10.14, setting setAllowsConcurrentViewDrawing: to true on an NSWindow and setCanDrawConcurrently: to true on its views is no longer supported for performing drawing outside the main thread.
All the documentation I find on the internet strongly advises programmers to clean up their main loops so that only drawing and user-input handling happen there. But what else is left to do when that means multiple programmer-years of work?
I used to draw a CGImage wrapped in an NSImage through an NSGraphicsContext on a separate looping thread, calling NSView's setNeedsDisplay: and then display, to get smooth animations.
I also picked up that NSOpenGLView has been deprecated.
With all that said, what would be the best way to perform threaded drawing in an NSView?
Thanks
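One commonly suggested pattern, offered here only as a sketch and not the definitive answer: do the expensive rendering off the main thread into a CGImage, and hop back to the main thread only for the cheap step of assigning it to a layer-backed view's layer contents. The sizes and fill are placeholder assumptions.

```swift
import AppKit

final class AnimationView: NSView {
    private let renderQueue = DispatchQueue(label: "render", qos: .userInteractive)

    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        wantsLayer = true // layer-backed so we can swap layer contents
    }

    required init?(coder: NSCoder) { fatalError("not supported") }

    func renderNextFrame() {
        renderQueue.async {
            // Expensive drawing happens here, off the main thread.
            let size = CGSize(width: 640, height: 480)
            guard let ctx = CGContext(data: nil,
                                      width: Int(size.width), height: Int(size.height),
                                      bitsPerComponent: 8, bytesPerRow: 0,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue)
            else { return }
            ctx.setFillColor(NSColor.systemBlue.cgColor)
            ctx.fill(CGRect(origin: .zero, size: size))
            guard let image = ctx.makeImage() else { return }

            // Only the cheap handoff touches the main thread.
            DispatchQueue.main.async {
                self.layer?.contents = image
            }
        }
    }
}
```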
Hi, I work on an app that should generate videos. I don't know which frameworks/libraries can be used to do this. From my inputs (images, positions, ...) I want to generate a video. The code should be compatible with SwiftUI, so that I can have something like a live preview, as in iMovie or Clips. (My current code is flexible, so it doesn't matter if I have to restructure some small things.)
Thanks in advance
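AVFoundation's AVAssetWriter is the usual starting point for composing a video from individual images. A minimal, hedged sketch (codec, size, frame rate, and the naive backpressure loop are all placeholder assumptions; the caller is responsible for producing the pixel buffers):

```swift
import AVFoundation
import CoreVideo

// Sketch: append a sequence of pixel buffers as H.264 video frames.
func writeVideo(from pixelBuffers: [CVPixelBuffer], to url: URL,
                size: CGSize, fps: Int32 = 30) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height)
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for (index, buffer) in pixelBuffers.enumerated() {
        while !input.isReadyForMoreMediaData { usleep(1000) } // naive backpressure
        let time = CMTime(value: CMTimeValue(index), timescale: fps)
        adaptor.append(buffer, withPresentationTime: time)
    }
    input.markAsFinished()
    writer.finishWriting { /* completion: file is ready at `url` */ }
}
```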
I have a DLP-Link 3D projector which I'd like to make use of by means of a hand-made player.
So far in my project: a class MovieView: NSView inside an NSWindow, with stub drawing code.
I know that if I place drawing code in draw(_:), then NSGraphicsContext.current will be set up for me to use. But I'm drawing from a high-priority thread (the display link), so I obviously have to set it up myself.
How should I do that in Swift?
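A hedged sketch of setting up NSGraphicsContext manually around a CGContext on a non-main thread; the CGContext here is a stand-in for whatever drawing surface the player actually renders into, and the red rectangle is only illustrative.

```swift
import AppKit

// Wrap an arbitrary CGContext in an NSGraphicsContext so that
// AppKit drawing APIs can be used off the main thread.
func drawFrame(into cgContext: CGContext) {
    let nsContext = NSGraphicsContext(cgContext: cgContext, flipped: false)
    NSGraphicsContext.saveGraphicsState()
    NSGraphicsContext.current = nsContext

    // Any AppKit drawing now targets cgContext.
    NSColor.red.setFill()
    NSBezierPath(rect: CGRect(x: 0, y: 0, width: 100, height: 100)).fill()

    NSGraphicsContext.restoreGraphicsState()
}
```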