Core Graphics


Harness the power of Quartz technology to perform lightweight 2D rendering with high-fidelity output using Core Graphics.


Posts under Core Graphics tag

61 Posts
Post marked as solved
1 Reply
145 Views
We use the CGDisplayStreamCreateWithDispatchQueue function to create a display stream, and sometimes the application crashes on macOS Ventura. It works normally on macOS Monterey/Big Sur/Catalina. I can see the same bug in other applications that also use this function (through Console → Crash Report). The relevant code is as follows:

```objc
CGDisplayStreamRef _displayStream;
CGDirectDisplayID displayID = CGMainDisplayID();
unsigned long modeWidth;
unsigned long modeHeight;
modeWidth = CGDisplayPixelsWide(displayID);
modeHeight = CGDisplayPixelsHigh(displayID);
CGDisplayModeRef displayModeRef = CGDisplayCopyDisplayMode(displayID);
if (displayModeRef) {
    modeWidth = CGDisplayModeGetPixelWidth(displayModeRef);
    modeHeight = CGDisplayModeGetPixelHeight(displayModeRef);
}
CGDisplayModeRelease(displayModeRef);
NSLog(@"enable stream, displayID[0x%x], mode[%lu x %lu]\n", displayID, modeWidth, modeHeight);

const int var_prop = 2;
const void* keys[var_prop] = { kCGDisplayStreamDestinationRect, kCGDisplayStreamShowCursor };
const void* values[var_prop] = { CGRectCreateDictionaryRepresentation(CGRectMake(0, 0, modeWidth, modeHeight)), kCFBooleanTrue };
CFDictionaryRef properties = CFDictionaryCreate(NULL, keys, values, var_prop, NULL, NULL);

_displayStream = CGDisplayStreamCreateWithDispatchQueue(displayID, modeWidth, modeHeight, 'BGRA', properties, dispatchQueue,
    ^(CGDisplayStreamFrameStatus status, uint64_t displayTime, IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
        if (status == kCGDisplayStreamFrameStatusStopped) {
            NSLog(@"kCGDisplayStreamFrameStatusStopped is received!");
            bIsStreamStatusStopped = true;
            return;
        }
        if (status == kCGDisplayStreamFrameStatusFrameComplete && frameSurface) {
            CFRetain(frameSurface);
            IOSurfaceIncrementUseCount(frameSurface);
            CFRetain(updateRef);
            [[NSOperationQueue mainQueue] addOperationWithBlock:^{
                if (!bIsStreamStatusStopped) {
                    self.view.layer.contents = (__bridge id _Nullable)(frameSurface);
                }
            }];
            CFRelease(updateRef);
            IOSurfaceDecrementUseCount(frameSurface);
            CFRelease(frameSurface);
        }
    });

if (_displayStream) {
    CGError err = CGDisplayStreamStart(_displayStream);
    bIsStreamStatusStopped = false;
    if (err != CGDisplayNoErr) {
        NSLog(@"Error %u starting display stream", (unsigned)err);
        CFRelease(_displayStream);
        _displayStream = 0;
    }
} else {
    NSLog(@"create stream failed.\n");
}
```

The crash log is as follows:

```
0  CoreFoundation      0x189889e48 CFGetTypeID + 148
1  CoreFoundation      0x1898ac970 __CFPropertyListIsValidAux + 60
2  CoreFoundation      0x1898aedd4 __CFPropertyListIsDictPlistAux + 188
3  CoreFoundation      0x1898eab40 __CFDictionaryApplyFunction_block_invoke + 28
4  CoreFoundation      0x1898b2c38 CFBasicHashApply + 148
5  CoreFoundation      0x1898a4814 CFDictionaryApplyFunction + 320
6  CoreFoundation      0x1898acb9c __CFPropertyListIsValidAux + 616
7  CoreFoundation      0x1898beea8 CFPropertyListWrite + 92
8  CoreFoundation      0x1898de418 CFPropertyListCreateData + 144
9  SkyLight            0x18e4aba04 CGSPropertyListCreateSerializedData + 72
10 SkyLight            0x18e4b72ac CGSPropertyListCreateSerializedBytes + 68
11 SkyLight            0x18e699e14 CGSPropertyListPerformWithSerializedBytes + 64
12 SkyLight            0x18e5c25c4 SLDisplayStreamCreate + 296
13 SkyLight            0x18e5c3008 SLDisplayStreamCreateWithDispatchQueue + 52
14 macOS InstantView   0x1043617f8 0x104328000 + 235512
15 macOS InstantView   0x1043619fc 0x104328000 + 236028
16 macOS InstantView   0x104361444 0x104328000 + 234564
17 SkyLight            0x18e4b9f8c displayConfigFinalizedProc + 276
18 SkyLight            0x18e4b1558 CGSPostLocalNotification + 172
19 SkyLight            0x18e4b1148 (anonymous namespace)::notify_datagram_handler(unsigned int, CGSDatagramType, void*, unsigned long, void*) + 116
20 SkyLight            0x18e7d3bec CGSDatagramReadStream::dispatchMainQueueDatagrams() + 228
21 SkyLight            0x18e7d3ae8 invocation function for block in CGSDatagramReadStream::mainQueueWakeup() + 28
22 libdispatch.dylib   0x189680a48 _dispatch_call_block_and_release + 32
23 libdispatch.dylib   0x189682570 _dispatch_client_callout + 20
24 libdispatch.dylib   0x189690d28 _dispatch_main_queue_drain + 928
25 libdispatch.dylib   0x189690978 _dispatch_main_queue_callback_4CF + 44
26 CoreFoundation      0x18992a77c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
27 CoreFoundation      0x1898e82d0 __CFRunLoopRun + 2036
28 CoreFoundation      0x1898e7388 CFRunLoopRunSpecific + 612
29 HIToolbox           0x192f06a68 RunCurrentEventLoopInMode + 292
30 HIToolbox           0x192f068ac ReceiveNextEventCommon + 672
31 HIToolbox           0x192f065f4 _BlockUntilNextEventMatchingListInModeWithFilter + 72
32 AppKit              0x18cb2621c _DPSNextEvent + 632
33 AppKit              0x18cb253ac -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 728
34 AppKit              0x18cb1972c -[NSApplication run] + 464
35 AppKit              0x18caf0a24 NSApplicationMain + 880
36 macOS InstantView   0x1043546cc 0x104328000 + 181964
37 dyld
```
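One thing worth checking, as an assumption rather than a confirmed cause: the crash is in CFPropertyList serialization of the properties dictionary, and that dictionary is built with CFDictionaryCreate(NULL, ..., NULL, NULL), i.e. without the standard CF type callbacks, so it does not retain its keys and values (and the rect representation is never released). A minimal Swift sketch of a safer construction, where a bridged dictionary gets the standard callbacks automatically (makeStreamProperties is a hypothetical helper name):

```swift
import CoreGraphics

// Build the display-stream properties via bridging so the resulting
// CFDictionary retains its keys/values while SkyLight serializes it.
func makeStreamProperties(width: Int, height: Int) -> CFDictionary {
    let rect = CGRect(x: 0, y: 0, width: width, height: height)
    let properties: [String: Any] = [
        kCGDisplayStreamDestinationRect as String: rect.dictionaryRepresentation,
        kCGDisplayStreamShowCursor as String: true,
    ]
    return properties as CFDictionary
}
```

The Objective-C equivalent would be passing &kCFTypeDictionaryKeyCallBacks and &kCFTypeDictionaryValueCallBacks to CFDictionaryCreate, and releasing the CGRectCreateDictionaryRepresentation result once the dictionary owns it.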
Posted by Christal. Last updated.
Post not yet marked as solved
2 Replies
477 Views
Hi team, We have an iOS app. Since July 15, 2022, our users have been hitting a crash caused by an invalid memory fetch; that is when Apple officially released the iOS 16 beta. After September 12, when Apple released iOS 16 officially, the crash count increased drastically. The crash backtrace is as follows:

```
Thread 14 Crashed:
0  libsystem_platform.dylib 0x00000001f8810930 _platform_memmove + 96
1  CoreGraphics             0x00000001adb64104 CGDataProviderCreateWithCopyOfData + 20
2  CoreGraphics             0x00000001adb4cdb4 CGBitmapContextCreateImage + 172
3  VisionKitCore            0x00000001ed813f10 -[VKCRemoveBackgroundResult _createCGImageFromBGRAPixelBuffer:cropRect:] + 348
4  VisionKitCore            0x00000001ed813cc0 -[VKCRemoveBackgroundResult createCGImage] + 156
5  VisionKitCore            0x00000001ed8ab6f8 __vk_cgImageRemoveBackgroundWithDownsizing_block_invoke + 64
6  VisionKitCore            0x00000001ed881474 __63-[VKCRemoveBackgroundRequestHandler performRequest:completion:]_block_invoke.5 + 436
7  MediaAnalysisServices    0x00000001eec58968 __92-[MADService performRequests:onPixelBuffer:withOrientation:andIdentifier:completionHandler:]_block_invoke.38 + 400
8  CoreFoundation           0x00000001abff0a14 __invoking___ + 148
9  CoreFoundation           0x00000001abf9cf2c -[NSInvocation invoke] + 428
10 Foundation               0x00000001a6464d38 __NSXPCCONNECTION_IS_CALLING_OUT_TO_REPLY_BLOCK__ + 16
11 Foundation               0x00000001a64362fc -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:] + 520
12 Foundation               0x00000001a6a10f44 __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_5 + 188
13 libxpc.dylib             0x00000001f89053e4 _xpc_connection_reply_callout + 124
14 libxpc.dylib             0x00000001f88f8580 _xpc_connection_call_reply_async + 88
15 libdispatch.dylib        0x00000001b340205c _dispatch_client_callout3 + 20
16 libdispatch.dylib        0x00000001b341ff58 _dispatch_mach_msg_async_reply_invoke + 344
17 libdispatch.dylib        0x00000001b340956c _dispatch_lane_serial_drain + 376
18 libdispatch.dylib        0x00000001b340a214 _dispatch_lane_invoke + 436
19 libdispatch.dylib        0x00000001b3414e10 _dispatch_workloop_worker_thread + 652
20 libsystem_pthread.dylib  0x00000001f88a4df8 _pthread_wqthread + 288
21 libsystem_pthread.dylib  0x00000001f88a4b98 start_wqthread + 8
```

Last but not least: the users who hit this kind of crash are all on iOS 16+, so we think this crash is related to the iOS 16 SDK. We would appreciate any clues about how to fix this kind of crash.
Posted by feiyz. Last updated.
Post marked as solved
1 Reply
183 Views
I have a macOS package dependency (FontPens) which defines a class:

```swift
import Foundation
...
public class BoundsPen: Pen {
    var bounds = CGRect.null
    private var currentPoint = CGPoint.zero
    ...
```

After upgrading Xcode to 14.1, both lines throw the errors Type 'CGRect' has no member 'null' and Type 'CGPoint' has no member 'zero'. Calling CGPoint.zero and CGRect.null from an app works fine if Foundation is imported. Is there a way to solve this problem without changing the package source?
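For reference, the usual explanation is that newer SDKs no longer guarantee that Foundation re-exports CoreGraphics in every configuration, so the package itself has to import it. A minimal sketch, assuming the package source (or a fork of it) can be patched:

```swift
import Foundation
import CoreGraphics // explicit import; CGRect/CGPoint members live here

public class BoundsPen: Pen {
    var bounds = CGRect.null          // resolves again with the explicit import
    private var currentPoint = CGPoint.zero
}
```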
Posted by ludzik. Last updated.
Post not yet marked as solved
13 Replies
8k Views
Hi There, I just bought an IIYAMA G-MASTER GB3461WQSU-B1, which has a native resolution of 3440x1440, but my MacBook Pro (Retina, 15-inch, Mid 2014) doesn't recognise the monitor and I can't run it at its full resolution. It is currently recognised as a PL3461WQ 34.5-inch (2560 x 1440). Is there anything I can do to get it sorted, or will I have to wait until this monitor is added to the Big Sur list? Thanks
Posted. Last updated.
Post not yet marked as solved
0 Replies
149 Views
CAEmitterLayer emits multiple CAEmitterCells along specified trajectories. When a trajectory cell reaches the end of its lifetime, it should turn into a particle image displayed by its child CAEmitterCells, which is then scattered through 360°. With a single trajectory cell the animation completes correctly, but with two or more cells the trajectory particle images turn white. Adding the boomEmitterCell causes the cells built in getDogLeftDownEmitterWithImageName and getRedLeftEmitterWithImageName to stop displaying their images and show white blocks instead.

```objc
CAEmitterLayer *emitterLayer = [[CAEmitterLayer alloc] init];
emitterLayer.emitterPosition = CGPointMake(self.view.layer.bounds.size.width - 100, self.view.layer.bounds.size.height - 100);
emitterLayer.emitterSize = CGSizeMake(50, 100.f);
emitterLayer.emitterShape = kCAEmitterLayerLine;
emitterLayer.emitterMode = kCAEmitterLayerOutline;
emitterLayer.renderMode = kCAEmitterLayerOldestLast;
CAEmitterCell *dogleftEmitterCell = [self getRedLeftEmitterWithImageName:[imageArray objectAtIndex:0]];
CAEmitterCell *redLeftEmitterCell = [self getRedLeftEmitterWithImageName:[imageArray objectAtIndex:0]];
emitterLayer.emitterCells = @[dogleftEmitterCell, redLeftEmitterCell];
[self.view.layer addSublayer:emitterLayer];

// dog head, lower left
- (CAEmitterCell *)getDogLeftDownEmitterWithImageName:(NSString *)imageName {
    CAEmitterCell *emitterCell = [[CAEmitterCell alloc] init];
    emitterCell.name = @"左下狗头";
    emitterCell.contents = (__bridge id _Nullable)[UIImage imageNamed:@"emoji_6"].CGImage;
    emitterCell.birthRate = 1;            // emission frequency
    emitterCell.lifetime = 0.6;           // lifetime
    emitterCell.velocity = 100;           // speed
    emitterCell.xAcceleration = -1000.f;  // simulates gravity
    emitterCell.scale = 0.2;
    emitterCell.scaleSpeed = 0.25;
    emitterCell.emissionLongitude = M_PI_2;
    // emitterCell.emissionRange = M_PI_4;
    emitterCell.emitterCells = @[[self boomEmitterCell]];
    return emitterCell;
}

// red packet, upper left
- (CAEmitterCell *)getRedLeftEmitterWithImageName:(NSString *)imageName {
    CAEmitterCell *emitterCell = [[CAEmitterCell alloc] init];
    emitterCell.name = @"红包";
    emitterCell.contents = (__bridge id _Nullable)[UIImage imageNamed:@"emoji_7"].CGImage;
    emitterCell.birthRate = 10;           // emission frequency
    emitterCell.lifetime = 0.6;           // lifetime
    // emitterCell.beginTime = self.beginTime;
    emitterCell.velocity = 100;           // speed
    emitterCell.yAcceleration = -1000.f;  // simulates gravity
    emitterCell.scale = 0.2;
    // emitterCell.scaleRange = 0.06;
    emitterCell.scaleSpeed = 0.25;
    emitterCell.emissionLongitude = M_PI;
    // CAEmitterCell *emitterCell = [self boomEmitterCell];
    emitterCell.emitterCells = @[[self boomEmitterCell]];
    return emitterCell;
}

- (CAEmitterCell *)boomEmitterCell {
    // explosion
    CAEmitterCell *explodeCell = [CAEmitterCell emitterCell];
    explodeCell.name = @"explodeCell";
    explodeCell.birthRate = 2.f;
    explodeCell.lifetime = 0.6f;
    // explodeCell.velocity = 0.f;
    // explodeCell.scale = 1.0;
    // explodeCell.redSpeed = -1.5;   // change color on explosion
    // explodeCell.blueRange = 1.5;   // change color on explosion
    // explodeCell.greenRange = 1.f;  // change color on explosion
    // explodeCell.birthRate = 1.0;
    // explodeCell.velocity = 0;
    // explodeCell.scale = 2.5;
    // explodeCell.redSpeed = -1.5;
    // explodeCell.blueSpeed = +1.5;
    // explodeCell.greenSpeed = +1.0;
    // explodeCell.lifetime = 0.35;
    explodeCell.contents = (__bridge id _Nullable)[[UIImage imageNamed:@"allStart"] CGImage];

    // spark
    // CAEmitterCell *sparkCell = [CAEmitterCell emitterCell];
    // sparkCell.name = @"sparkCell";
    // sparkCell.birthRate = 3.f;
    // sparkCell.lifetime = 3.f;
    // sparkCell.velocity = 125.f;
    // sparkCell.yAcceleration = 75.f;      // simulates gravity
    // sparkCell.emissionRange = M_PI * 2;  // 360 degrees
    // sparkCell.scale = 1.2f;
    // sparkCell.contents = (id)[[UIImage imageNamed:@"star_white_stroke"] CGImage];
    // sparkCell.redSpeed = 0.4;
    // sparkCell.greenSpeed = -0.1;
    // sparkCell.blueSpeed = -0.1;
    // sparkCell.alphaSpeed = -0.25;
    // explodeCell.emitterCells = @[sparkCell];
    return explodeCell;
}
```
Posted. Last updated.
Post not yet marked as solved
3 Replies
217 Views
Sometimes, while drawing on the main thread, I hit error conditions that I need to relay to the user, often requiring interaction (enter text, press a button, etc.), and then I need to CONTINUE the main-thread drawing process that was interrupted. However, the main thread doesn't like it when you try to draw while already drawing (I think it even throws an exception if you try). Is there a canonical approach to handling errors that occur during drawing and require more sophisticated handling?
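There is no single canonical API for this, but one common pattern is to never run UI from inside the draw pass: record the error, let the pass finish, present the interaction on the next run-loop turn, and then invalidate so drawing resumes with the user's answer. A sketch under those assumptions (renderContent(in:) is a hypothetical stand-in for the real drawing routine):

```swift
import AppKit

class CanvasView: NSView {
    private var pendingDrawError: Error?

    override func draw(_ dirtyRect: NSRect) {
        do {
            try renderContent(in: dirtyRect) // hypothetical drawing routine
        } catch {
            pendingDrawError = error
            // Defer all interaction until the current draw pass has finished.
            DispatchQueue.main.async { [weak self] in self?.handleDrawError() }
        }
    }

    private func handleDrawError() {
        guard let error = pendingDrawError else { return }
        pendingDrawError = nil
        NSAlert(error: error).runModal() // gather the user's input...
        needsDisplay = true              // ...then draw again with the new state
    }

    private func renderContent(in rect: NSRect) throws { /* ... */ }
}
```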
Posted. Last updated.
Post not yet marked as solved
1 Reply
235 Views
What version of Xcode are you using? Xcode Version 14.0 (14A309)
Did you see an error message? Yes
What was the error? CoreGraphics PDF has logged an error. Set environment variable "CG_PDF_VERBOSE" to learn more. /* com.apple.actool.compilation-results */
Please describe the issue: After compiling an xcasset containing certain PDFs, actool produces the message "CoreGraphics PDF has logged an error", and the UIImage for the PDF fails to render.
Please list the steps you took to reproduce the issue:
1. Create a new iOS project
2. Drag the PDF (attachment named ***.pdf) into the xcasset
3. Build
4. Check the compile log
What did you expect to happen? The PDF image renders as expected.
Attachment: https://github.com/PhilCai1993/Xcode14_PDF_Xcassets_Issue
Posted by caizheren. Last updated.
Post not yet marked as solved
5 Replies
4.5k Views
The DDC/CI application works well on MacBook Pro/Mac Pro (Big Sur), but it doesn't work on M1 Macs (both macOS 11.0.1 and 11.1). The M1's graphics hardware is Apple's own, not Intel or AMD. Is this incompatibility related to the new graphics hardware or to a kernel change? Is there any alternative solution for M1?
Posted. Last updated.
Post not yet marked as solved
5 Replies
250 Views
Dear Experts, I create a UIImage for an SF Symbol using [UIImage systemImageNamed:], get its CGImage, and look at the sizes of each:

```objc
UIImageConfiguration *config = [UIImageSymbolConfiguration configurationWithPointSize:64
                                                                               weight:UIImageSymbolWeightLight
                                                                                scale:UIImageSymbolScaleMedium];
UIImage *img = [UIImage systemImageNamed:@"chevron.compact.down" withConfiguration:config];
CGImageRef c = [img CGImage];
printf("UIImage size %f x %f, CGImage size %zu x %zu\n",
       img.size.width, img.size.height, CGImageGetWidth(c), CGImageGetHeight(c));
```

(Consider that pseudo-code; it's not an exact copy-paste.) Results: the UIImage is 70.3333 x 25.6667 and the CGImage is 163 x 43, so the aspect ratios (W/H) are 2.74 and 3.79 respectively. That can't be right! I don't expect the UIImage and the CGImage dimensions to be the same, because of the UIImage's scale (which is 3 in this case), but that should apply equally to both dimensions. The effect is most pronounced with symbols whose aspect ratio is far from 1, e.g. recordingtape, ellipsis, and this chevron.compact.down. I believe the CGImage aspect ratios are the correct ones. What is going on here?
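Not an authoritative answer, but one factor worth ruling out: UIImage.size is measured in points and reflects the symbol's layout metrics (including alignment-rect padding), whereas the CGImage is just the rendered bitmap in pixels, so the two ratios are not guaranteed to match. A quick Swift diagnostic that prints the extra metrics:

```swift
import UIKit

let config = UIImage.SymbolConfiguration(pointSize: 64, weight: .light, scale: .medium)
if let img = UIImage(systemName: "chevron.compact.down", withConfiguration: config),
   let cg = img.cgImage {
    print("UIImage size (points):", img.size, "scale:", img.scale)
    print("alignmentRectInsets:", img.alignmentRectInsets) // layout padding baked into size
    print("CGImage size (pixels):", cg.width, "x", cg.height)
}
```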
Posted by endecotp. Last updated.
Post not yet marked as solved
1 Replies
317 Views
Hello, I want to programmatically generate a random patterned image; think wallpaper. The method of generation is yet to be established and is unimportant for this question. The output will be either a UIImage or a CIImage object. Then I want to take a random area of that pattern, a crop effectively, and make one of any number of alterations to it. A non-exhaustive list might include: rotation, swapping colours, other colour effects, shift or scroll. The area affected might also be one of any number of shapes. A non-exhaustive list might include: square, rectangle, triangle, circle, oval. Essentially, I want to take a uniform pattern and break or disrupt it in random ways. I've been Googling how I might do this, and everything seems to point to the Core Image framework, particularly the filters, though it is less clear to me how to do transformations, crops, and composite images. My question simply is: am I on the right path pursuing Core Image, or are there other ways to achieve these effects? Thanks Jim
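Core Image can in fact cover all three of those operations; crops, affine transforms, and compositing are methods on CIImage itself, with CIFilter needed only for the colour effects. A minimal sketch (the pattern CIImage is assumed to exist already, and the patch size is arbitrary):

```swift
import CoreImage

func disrupt(_ pattern: CIImage) -> CIImage {
    // Crop a random square region out of the pattern.
    let side: CGFloat = 100
    let origin = CGPoint(x: CGFloat.random(in: 0...(pattern.extent.width - side)),
                         y: CGFloat.random(in: 0...(pattern.extent.height - side)))
    let region = CGRect(origin: origin, size: CGSize(width: side, height: side))
    let patch = pattern.cropped(to: region)

    // Alter the patch: rotate it 90° about its own centre...
    let center = CGPoint(x: region.midX, y: region.midY)
    let rotation = CGAffineTransform(translationX: center.x, y: center.y)
        .rotated(by: .pi / 2)
        .translatedBy(x: -center.x, y: -center.y)
    let rotated = patch.transformed(by: rotation).cropped(to: region)

    // ...and composite it back over the original pattern.
    return rotated.composited(over: pattern)
}
```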
Posted by dcjams. Last updated.
Post not yet marked as solved
0 Replies
210 Views
Hi, I'm trying to make an application that sends keypresses to the system, using the Core Graphics framework. It works fine on macOS Mojave but not on Monterey (on M1 hardware). On each system, I allowed the application to control the computer in Privacy (System Preferences). I'm using the following code:

```objc
void postKeyCode(int keyCode, bool down) {
    CGEventRef keyboardEvent = CGEventCreateKeyboardEvent(NULL, keyCode, down);
    CGEventPost(kCGHIDEventTap, keyboardEvent);
    CFRelease(keyboardEvent);
}
```

Are there any additional requirements to allow the application?
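A couple of things worth checking (assumptions, not a confirmed diagnosis): posting synthetic events still requires the Accessibility grant on Monterey, the sandbox blocks it regardless of that grant, and a commonly reported workaround after rebuilding the binary is removing and re-adding the app under Privacy & Security → Accessibility, since the TCC grant is tied to the code signature. A Swift sketch that verifies the grant programmatically before posting:

```swift
import ApplicationServices

// Prompt the user for Accessibility access if it has not been granted yet.
let promptKey = kAXTrustedCheckOptionPrompt.takeUnretainedValue() as NSString
let trusted = AXIsProcessTrustedWithOptions([promptKey: true] as CFDictionary)

if trusted {
    // Post a key-down for virtual key 0 (kVK_ANSI_A on ANSI keyboards).
    let keyDown = CGEvent(keyboardEventSource: nil, virtualKey: CGKeyCode(0), keyDown: true)
    keyDown?.post(tap: .cghidEventTap)
} else {
    print("Accessibility access not granted yet; events will be dropped.")
}
```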
Posted by cglinel. Last updated.
Post marked as solved
1 Reply
338 Views
I am trying to create a simple bitmap image. I was able to create one in an iOS-based (UIKit) playground, but I am unable to do it in a macOS (Quartz) playground. I think those are the right terms ... I'm a newbie at this! My code, below, is needlessly verbose, but I wanted to understand every part. The key is the CGContext call at the bottom: on every iteration it has returned NULL and fails. I have tried it with data: nil and with a data: &Data reference for the image. Both attempts fail. I must be doing (or assuming) something incorrectly. Here's the code:

```swift
import PlaygroundSupport
import Cocoa
import CoreGraphics

let tftWidthPixels = 240
let tftHeightPixels = 135
let tftSize = CGSize(width: tftWidthPixels, height: tftHeightPixels)

// Border for guideline extensions
let topBorderPixels = 200
let leftBorderPixels = 200

let pixelSizeWidth = 10             // This seems to work well
let pixelSizeHeight = pixelSizeWidth // Square
let pixelSize = CGSize(width: pixelSizeWidth, height: pixelSizeHeight)

// This should (must?) be odd
let lineWidth = 3

// We outline every tft pixel
let imageVerticalLines = tftWidthPixels + 1
let imageHorizontalLines = tftHeightPixels + 1
let imageHeightPixels = tftHeightPixels * pixelSizeHeight + topBorderPixels
let imageWidthPixels = tftWidthPixels * pixelSizeWidth + leftBorderPixels

let bitsPerByte = 8
let bytesPerPixel = 4
let bitsPerPixel = bytesPerPixel * bitsPerByte
let bytesPerRow = bytesPerPixel * imageWidthPixels
let imageSize = CGSize(width: imageWidthPixels, height: imageHeightPixels)
let imageSizeBytes = imageWidthPixels * imageHeightPixels * bytesPerPixel

let startRow = (lineWidth % 2 + Int(lineWidth/2) - 1) + topBorderPixels     // Allow for pixel overlap
let startColumn = (lineWidth % 2 + Int(lineWidth/2) - 1) + leftBorderPixels

let lineColorNormal = CGColor(gray: 0.5, alpha: 1.0)
let lineColorDark = CGColor(gray: 0.25, alpha: 1.0)
let borderLineColor = CGColor(gray: 0.1, alpha: 1.0)
let pageFillColor = CGColor.white

let imageFormat = CIFormat.ARGB8
let imageColorSpace = CGColorSpace(name: CGColorSpace.sRGB)!
let imageBitMapInfo = CGBitmapInfo(rawValue: CGBitmapInfo.byteOrder32Big.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
let imageIntent = CGColorRenderingIntent.defaultIntent

// Basically this is the cell size to draw, every n lines will be lineColorDark
let majorGridVertical = 10
let majorGridHorizontal = 10

let bitmapInfo = CGBitmapInfo.byteOrder32Big.rawValue | CGImageAlphaInfo.none.rawValue
print("Byte order mask=\(CGBitmapInfo.byteOrderMask)")

let imageCount = imageHeightPixels * imageWidthPixels * bitsPerPixel / bitsPerByte
print("Image count=\(imageCount)")
print("Context parameters: width=\(imageWidthPixels), height=\(imageHeightPixels), bytesPerRow=\(bytesPerRow), \ncolorSpace=\(imageColorSpace), \nbitMapInfo=\(bitmapInfo)")

guard var graphContext = CGContext(data: nil,
                                   width: imageWidthPixels,
                                   height: imageHeightPixels,
                                   bitsPerComponent: bitsPerByte,
                                   bytesPerRow: 0,
                                   space: imageColorSpace,
                                   bitmapInfo: bitmapInfo) else {
    fatalError("Unable to create the CGContext")
}
// All is well, so far. What a wonder!
```
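The likely culprit (an assumption from Core Graphics' supported-pixel-format rules, not a verified run of this playground): the bitmapInfo actually passed to CGContext is built from CGImageAlphaInfo.none, and 32-bit RGB bitmap contexts require an alpha mode such as premultipliedFirst or noneSkipFirst, so the call returns nil. The imageBitMapInfo constant defined earlier in the same code is the valid combination. A minimal sketch of the fix:

```swift
import CoreGraphics

let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)!
// Use premultipliedFirst (or noneSkipFirst) instead of CGImageAlphaInfo.none.
let validBitmapInfo = CGBitmapInfo.byteOrder32Big.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue

guard let graphContext = CGContext(data: nil,
                                   width: 2600, height: 1550, // the sizes computed in the post
                                   bitsPerComponent: 8,
                                   bytesPerRow: 0,             // let Core Graphics pick the stride
                                   space: colorSpace,
                                   bitmapInfo: validBitmapInfo) else {
    fatalError("Unable to create the CGContext")
}
print("Created context:", graphContext)
```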
Posted. Last updated.
Post not yet marked as solved
1 Reply
553 Views
I created an AX observer for kAXMainWindowChangedNotification. In the callback function I call CGWindowListCopyWindowInfo with the kCGWindowListOptionOnScreenOnly option to get the ID of the current main window (windows are filtered by PID because I am only interested in the frontmost app). What I experienced is that the returned window order reflects the previous state (before the window switch). If I delay the call to CGWindowListCopyWindowInfo by a few milliseconds, the order is correct, but that does not seem to be a stable solution. Is there any other way to wait until every API has been notified about the changes, and only then call CGWindowListCopyWindowInfo to get the most recent window information?

Subscription:

```objc
AXUIElementRef appElem = AXUIElementCreateApplication(processId.intValue);
CFArrayRef windows;
AXError copyResult = AXUIElementCopyAttributeValues(appElem, kAXWindowsAttribute, 0, 1, &windows);
AXError createResult = AXObserverCreate(processId.intValue, windowSwitchedCallback, &observer);
if (copyResult != kAXErrorSuccess || createResult != kAXErrorSuccess) { // bail out if either call failed
    return;
}
AXObserverAddNotification(observer, appElem, kAXMainWindowChangedNotification, (__bridge void *)(self));
CFRunLoopAddSource([[NSRunLoop currentRunLoop] getCFRunLoop], AXObserverGetRunLoopSource(observer), kCFRunLoopDefaultMode);
```

Callback:

```objc
void windowSwitchedCallback(AXObserverRef observer, AXUIElementRef element, CFStringRef notificationName, void *refCon) {
    if (CFStringCompare(notificationName, kAXMainWindowChangedNotification, 0) == kCFCompareEqualTo) {
        NSTimeInterval delayInMSec = 10;
        dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delayInMSec * NSEC_PER_MSEC));
        dispatch_after(popTime, dispatch_get_main_queue(), ^(void) {
            NSDictionary *activeWindow = [(__bridge ActiveWindowObserver *)(refCon) getActiveWindow];
            NSLog(@"%@ windowChanged", activeWindow);
        });
    }
}
```

ActiveWindowObserver:

```objc
- (NSDictionary *)getActiveWindow {
    CFArrayRef windowsRef = CGWindowListCopyWindowInfo(kCGWindowListOptionOnScreenOnly, kCGNullWindowID);
    NSArray *windowsArray = (NSArray *)CFBridgingRelease(windowsRef);
    NSPredicate *pIdPredicate = [NSPredicate predicateWithFormat:@"kCGWindowOwnerPID == %@ && kCGWindowLayer == 0", processId];
    NSArray *filteredWindows = [windowsArray filteredArrayUsingPredicate:pIdPredicate];
    id activeWindow = filteredWindows.count > 0 ? filteredWindows.firstObject : nil;
    return activeWindow;
}
```
Posted by Spi96. Last updated.
Post not yet marked as solved
0 Replies
327 Views
I am trying to iterate over images in the Photo Library and extract faces using CIDetector. The images are required to keep their original resolutions. To do so, I take the following steps:

1- Getting assets for a given date interval (usually more than a year):

```swift
func loadAssets(from fromDate: Date, to toDate: Date, completion: @escaping ([PHAsset]) -> Void) {
    fetchQueue.async {
        let authStatus = PHPhotoLibrary.authorizationStatus()
        if authStatus == .authorized || authStatus == .limited {
            let options = PHFetchOptions()
            options.predicate = NSPredicate(format: "creationDate >= %@ && creationDate <= %@", fromDate as CVarArg, toDate as CVarArg)
            options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
            let result: PHFetchResult = PHAsset.fetchAssets(with: .image, options: options)
            var _assets = [PHAsset]()
            result.enumerateObjects { object, count, stop in
                _assets.append(object)
            }
            completion(_assets)
        } else {
            completion([])
        }
    }
}
```

where:

```swift
let fetchQueue = DispatchQueue.global(qos: .background)
```

2- Extracting faces:

```swift
func detectFaces(in image: UIImage, accuracy: String = CIDetectorAccuracyLow, completion: @escaping ([UIImage]) -> Void) {
    faceDetectionQueue.async {
        var faceImages = [UIImage]()
        let outputImageSize: CGFloat = 200.0 / image.scale
        guard let ciImage = CIImage(image: image),
              let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: [CIDetectorAccuracy: accuracy]) else {
            completion(faceImages)
            return
        }
        let faces = faceDetector.features(in: ciImage) // Crash happens here
        let group = DispatchGroup()
        for face in faces {
            group.enter()
            if let face = face as? CIFaceFeature {
                let faceBounds = face.bounds
                let offset: CGFloat = floor(min(faceBounds.width, faceBounds.height) * 0.2)
                let inset = UIEdgeInsets(top: -offset, left: -offset, bottom: -offset, right: -offset)
                let rect = faceBounds.inset(by: inset)
                let croppedFaceImage = ciImage.cropped(to: rect)
                let scaledImage = croppedFaceImage
                    .transformed(by: CGAffineTransform(scaleX: outputImageSize / croppedFaceImage.extent.width,
                                                       y: outputImageSize / croppedFaceImage.extent.height))
                faceImages.append(UIImage(ciImage: scaledImage))
                group.leave()
            } else {
                group.leave()
            }
        }
        group.notify(queue: self.faceDetectionQueue) {
            completion(faceImages)
        }
    }
}
```

where:

```swift
private let faceDetectionQueue = DispatchQueue(label: "face detection queue",
                                               qos: .background,
                                               attributes: [],
                                               autoreleaseFrequency: .workItem,
                                               target: nil)
```

I use the following extension to get the image from assets:

```swift
extension PHAsset {
    var image: UIImage {
        autoreleasepool {
            let manager = PHImageManager.default()
            let options = PHImageRequestOptions()
            var thumbnail = UIImage()
            let rect = CGRect(x: 0, y: 0, width: pixelWidth, height: pixelHeight)
            options.isSynchronous = true
            options.deliveryMode = .highQualityFormat
            options.resizeMode = .exact
            options.normalizedCropRect = rect
            options.isNetworkAccessAllowed = true
            manager.requestImage(for: self, targetSize: rect.size, contentMode: .aspectFit, options: options) { result, info in
                if let result = result {
                    thumbnail = result
                } else {
                    thumbnail = UIImage()
                }
            }
            return thumbnail
        }
    }
}
```

The code works fine for a few (usually fewer than 50) assets, but with more images it crashes at:

```swift
let faces = faceDetector.features(in: ciImage) // Crash happens here
```

I get this error:

```
validateComputeFunctionArguments:858: failed assertion `Compute Function(ciKernelMain): missing sampler binding at index 0 for [0].'
```

If I reduce the size of the image fed to detectFaces(in:), e.g. to 400 px, I can analyze a few hundred images (usually fewer than 1000), but as I mentioned, using the asset's image at its original size is a requirement. My guess is that it has something to do with memory when I extract faces with CIDetector. Any idea what this error is about and how I can fix the issue?
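Not a confirmed diagnosis, but the "missing sampler binding" assertion comes from Metal setup inside Core Image, and this code creates a brand-new CIDetector (with a nil, hence implicit, CIContext) for every single image on a background queue. One mitigation sketch, under the assumption that the per-call allocations are the problem, is to create the context and detector once and reuse them across assets:

```swift
import CoreImage

final class FaceDetector {
    // One Metal-backed context and one detector, shared across all images.
    private let context = CIContext()
    private lazy var detector = CIDetector(ofType: CIDetectorTypeFace,
                                           context: context,
                                           options: [CIDetectorAccuracy: CIDetectorAccuracyLow])!

    func faceFeatures(in image: CIImage) -> [CIFaceFeature] {
        detector.features(in: image).compactMap { $0 as? CIFaceFeature }
    }
}
```

On iOS 11 and later, Vision's VNDetectFaceRectanglesRequest is also worth considering, since CIDetector is effectively in maintenance mode.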
Posted by Asteroid. Last updated.
Post not yet marked as solved
0 Replies
243 Views
My app uses CGLayerRef to do some of its drawing. With Ventura, this doesn't work reliably. I'm wondering if there have been changes to CGLayerRef that might account for this new undesired behavior? I did not see anything in the release notes regarding it. To be more specific, sometimes some of the drawing I do does not seem to make it to the layer. So when the layer is later drawn it appears that some of my window content is missing. I would replicate this in a sample app if I could, but it is not possible.
Posted by hecht. Last updated.
Post marked as solved
3 Replies
428 Views
There is no API to create an intersection between two CGPaths; however, Core Graphics knows how to do it behind the scenes. When calling CGContextClip (link), it will intersect the current and clipping paths and store the result in the clipping path. I was thinking of utilizing this to perform intersections between paths I have. The problem is I cannot find a way to retrieve the clipping path back from the CGContext. Am I correct that such an API does not exist, or did I miss something?
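One thing worth checking before building a workaround (worth confirming against the SDK you target): starting with iOS 16 and macOS 13, CGPath gained boolean operations, including an intersection method, so on those systems the clipping trick should be unnecessary. A minimal sketch:

```swift
import CoreGraphics

let square = CGPath(rect: CGRect(x: 0, y: 0, width: 100, height: 100), transform: nil)
let circle = CGPath(ellipseIn: CGRect(x: 50, y: 50, width: 100, height: 100), transform: nil)

// Available on macOS 13 / iOS 16 and later.
let overlap = square.intersection(circle, using: .winding)
```

On earlier systems, rendering both paths into bitmap contexts or using a third-party path-clipping library remains the usual fallback.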
Posted by artium. Last updated.
Post not yet marked as solved
0 Replies
345 Views
When I try to create a CGContext from a CVImageBuffer with the code below:

```swift
CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
          width: width,
          height: height,
          bitsPerComponent: 8,
          bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
          space: CGColorSpaceCreateDeviceRGB(),
          bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue).union(.byteOrder32Little).rawValue)
```

the CGContext is nil, with this error:

```
CGBitmapContextCreate: invalid data bytes/row: should be at least 8640 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
```

Note: this happens only with specific image buffer sizes, like 4K; 1080p and 720p work fine. Appreciate your help in advance!
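A guess worth verifying (the error text implies Core Graphics sees a bytes-per-row smaller than width × 4): at certain resolutions the capture pipeline may hand over a planar buffer (e.g. 4:2:0) rather than BGRA, in which case plane 0's stride is roughly the width, not width × 4. A diagnostic sketch that locks the buffer and checks the format before wrapping it:

```swift
import CoreVideo
import CoreGraphics

// Assumes the failing buffers are not 32BGRA; also note the base address
// should only be read while the buffer is locked.
func makeContext(from pixelBuffer: CVPixelBuffer) -> CGContext? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let format = CVPixelBufferGetPixelFormatType(pixelBuffer)
    guard format == kCVPixelFormatType_32BGRA else {
        print("Unexpected pixel format:", format) // convert to BGRA before wrapping
        return nil
    }
    return CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                     width: CVPixelBufferGetWidth(pixelBuffer),
                     height: CVPixelBufferGetHeight(pixelBuffer),
                     bitsPerComponent: 8,
                     bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                     space: CGColorSpaceCreateDeviceRGB(),
                     bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)
}
```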
Posted. Last updated.