Allow users to browse, edit, and save images, with slideshows and Core Image filters, using Quartz.

Quartz Documentation

Posts under Quartz tag

13 Posts
Post not yet marked as solved
0 Replies
115 Views
We are getting a crash on the user side and are not able to reproduce it on our side. Please check the crash log. Xcode version: 13.2.1 (13C100), platform: iOS.

Crashed: com.apple.main-thread
0  libsystem_kernel.dylib  0x6bbc  __pthread_kill + 8
1  libsystem_pthread.dylib 0xd854  pthread_kill + 208
2  libsystem_c.dylib       0x1f6ac abort + 124
3  QuartzCore 0xce57c CA::Render::Encoder::grow(unsigned long) + 236
4  QuartzCore 0xb179c CA::Render::Layer::encode(CA::Render::Encoder*) const + 104
5  QuartzCore 0xbcbb0 CA::Render::encode_set_object(CA::Render::Encoder*, unsigned long, unsigned int, CA::Render::Object*, unsigned int) + 192
6  QuartzCore 0x47138 invocation function for block in CA::Context::commit_transaction(CA::Transaction*, double, double*) + 280
7  QuartzCore 0x348f4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 388
8  QuartzCore 0x34878 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 264
9  QuartzCore 0x34878 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 264
10 QuartzCore 0x34878 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 264
11 QuartzCore 0x34878 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 264
12 QuartzCore 0x457fc CA::Context::commit_transaction(CA::Transaction*, double, double*) + 6204
13 QuartzCore 0x4cc64 CA::Transaction::commit() + 708
14 UIKitCore 0x18a3f4 _UIApplicationFlushRunLoopCATransactionIfTooLate + 80
15 UIKitCore 0x15bb24 __processEventQueue + 7292
16 UIKitCore 0x160c54 __eventFetcherSourceCallback + 168
17 CoreFoundation 0xb34ec __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
18 CoreFoundation 0xc361c __CFRunLoopDoSource0 + 204
19 CoreFoundation 0x5880  __CFRunLoopDoSources0 + 348
20 CoreFoundation 0xaef8  __CFRunLoopRun + 768
21 CoreFoundation 0x1e240 CFRunLoopRunSpecific + 572
22 GraphicsServices 0x1988 GSEventRunModal + 160
23 UIKitCore 0x4e541c -[UIApplication _run] + 1080
24 UIKitCore 0x27eb88 UIApplicationMain + 336
25 TCPApp 0xa34528 main + 39 (AppDelegate.swift:39)
26 ??? 0x104bd83d0 (Missing)
Posted by
Post not yet marked as solved
0 Replies
145 Views
Greetings. Recently I wanted to make an application capable of rotating my external display. I found these related APIs: CGBeginDisplayConfiguration and CGDisplayRotation. The first lets me change the height and width of the display, but not the rotation angle, and the second only reports the current rotation angle. Is it possible to change the rotation of a display through an API? Thank you, Kuroame
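For reference, the read side is straightforward. A minimal sketch of querying the current rotation (assuming the main display; note that CoreGraphics has no documented counterpart for *setting* the rotation, so this only reads it):

```swift
import CoreGraphics

// Query the rotation of a display, in degrees (0, 90, 180, or 270).
// CGDisplayRotation is a public, read-only API.
let displayID = CGMainDisplayID()
let angle = CGDisplayRotation(displayID)
print("Display \(displayID) is rotated \(angle) degrees")
```

To enumerate external displays instead of the main one, CGGetActiveDisplayList can be used to obtain the display IDs first.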
Posted by
Post not yet marked as solved
0 Replies
189 Views
Please refer to the Crashlytics stack trace below. We are seeing many crashes in Firebase Crashlytics for the latest build.

Crashed: com.apple.main-thread
0  libsystem_kernel.dylib 0x2d328 __abort_with_payload + 8
1  libsystem_kernel.dylib 0x2fc34 abort_with_payload_wrapper_internal + 104
2  libsystem_kernel.dylib 0x2fbcc abort_with_payload_wrapper_internal + 30
3  libobjc.A.dylib 0x2d0d4 _objc_fatalv(unsigned long long, unsigned long long, char const*, char*) + 116
4  libobjc.A.dylib 0x2d060 _objc_fatalv(unsigned long long, unsigned long long, char const*, char*) + 30
5  libobjc.A.dylib 0x5a60 weak_register_no_lock + 392
6  libobjc.A.dylib 0xb3e0 objc_initWeak + 400
7  UIKitCore 0x4f93f0 -[UITableView _configureCellPrefetchingHandlers] + 288
8  UIKitCore 0x17b500 -[UITableView layoutSubviews] + 296
9  UIKitCore 0x18b844 -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 2592
10 QuartzCore 0x401c0 CA::Layer::layout_if_needed(CA::Transaction*) + 532
11 QuartzCore 0x325fc CA::Layer::layout_and_display_if_needed(CA::Transaction*) + 136
12 QuartzCore 0x46f70 CA::Context::commit_transaction(CA::Transaction*, double, double*) + 452
13 QuartzCore 0x4fe78 CA::Transaction::commit() + 704
14 QuartzCore 0x31d7c CA::Transaction::flush_as_runloop_observer(bool) + 88
15 UIKitCore 0x53d9d8 _UIApplicationFlushCATransaction + 72
16 UIKitCore 0x7d8084 _UIUpdateSequenceRun + 84
17 UIKitCore 0xe5dcb0 schedulerStepScheduledMainSection + 144
18 UIKitCore 0xe5d478 runloopSourceCallback + 92
19 CoreFoundation 0xbbf04 __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28
20 CoreFoundation 0xccc90 __CFRunLoopDoSource0 + 208
21 CoreFoundation 0x6184 __CFRunLoopDoSources0 + 268
22 CoreFoundation 0xbb4c __CFRunLoopRun + 828
23 CoreFoundation 0x1f6b8 CFRunLoopRunSpecific + 600
24 GraphicsServices 0x1374 GSEventRunModal + 164
25 UIKitCore 0x513e88 -[UIApplication _run] + 1100
26 UIKitCore 0x2955ec UIApplicationMain + 364
27 libswiftUIKit.dylib 0x30ecc UIApplicationMain(_:_:_:_:) + 104
28 AbhiBus 0x81fc main + 12 (main.swift:12)
29 ??? 0x10211dce4 (Missing)
Posted by
Post not yet marked as solved
0 Replies
196 Views
I used to have a project that used Quartz Composer and OpenGL, but Xcode 13 has deprecated these two components, so I can no longer get off-screen images during video production. The previous code to create the OpenGL context was as follows:

- (id)initOffScreenOpenGLPixelsWide:(unsigned)width pixelsHigh:(unsigned)height
{
    // Check parameters - rendering at sizes smaller than 16x16 will likely produce garbage
    if ((width < 16) || (height < 16)) {
        [self release];
        return nil;
    }
    self = [super init];
    if (self != nil) {
        NSOpenGLPixelFormatAttribute pixattributes[] = {
            NSOpenGLPFADoubleBuffer,
            NSOpenGLPFANoRecovery,
            NSOpenGLPFAAccelerated,
            NSOpenGLPFADepthSize, 24,
            (NSOpenGLPixelFormatAttribute)0
        };
        _pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:pixattributes];

        // Create the OpenGL context to render with (with color and depth buffers)
        _openGLContext = [[NSOpenGLContext alloc] initWithFormat:_pixelFormat shareContext:nil];
        if (_openGLContext == nil) {
            DDLogInfo(@"Cannot create OpenGL context");
            [self release];
            return nil;
        }

        // Create the OpenGL pixel buffer to render into
        NSOpenGLPixelBuffer *glPixelBuffer = [[NSOpenGLPixelBuffer alloc] initWithTextureTarget:GL_TEXTURE_RECTANGLE_EXT textureInternalFormat:GL_RGBA textureMaxMipMapLevel:0 pixelsWide:width pixelsHigh:height];
        if (glPixelBuffer == nil) {
            DDLogInfo(@"Cannot create OpenGL pixel buffer");
            [self release];
            return nil;
        }
        [_openGLContext setPixelBuffer:glPixelBuffer cubeMapFace:0 mipMapLevel:0 currentVirtualScreen:[_openGLContext currentVirtualScreen]];

        // Destroy the OpenGL pixel buffer
        [glPixelBuffer release];

        NSMutableDictionary *attributes = [NSMutableDictionary dictionary];
        [attributes setObject:[NSNumber numberWithUnsignedInt:k32BGRAPixelFormat] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
        [attributes setObject:[NSNumber numberWithUnsignedInt:width] forKey:(NSString *)kCVPixelBufferWidthKey];
        [attributes setObject:[NSNumber numberWithUnsignedInt:height] forKey:(NSString *)kCVPixelBufferHeightKey];

        // Create buffer pool to hold our frames
        OSErr theError = CVPixelBufferPoolCreate(kCFAllocatorDefault, NULL, (CFDictionaryRef)attributes, &_bufferPool);
        if (theError != kCVReturnSuccess) {
            DDLogInfo(@"CVPixelBufferPoolCreate() failed with error %i", theError);
            [self release];
            return nil;
        }
    }

    // A context is current on a per-thread basis. Multiple threads must serialize calls into the same context object.
    [self.openGLContext makeCurrentContext];
    return self;
}

This worked by creating an NSOpenGLPixelBuffer object and setting it as the pixel buffer of the NSOpenGLContext, but in Xcode 13 the NSOpenGLPixelBuffer can no longer be created successfully. The documentation recommends using GL_EXT_framebuffer_object instead.
So I tried the following code:

// RGBA renderbuffer, 24-bit depth renderbuffer
glGenFramebuffersEXT(1, &fb);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);

// Create and attach a color buffer
glGenRenderbuffersEXT(1, &color_rb);
// We must bind color_rb before we call glRenderbufferStorageEXT
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, color_rb);
// The storage format is RGBA
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA, width, height);
// Attach color buffer to FBO
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_RENDERBUFFER_EXT, color_rb);

// Create and attach a depth buffer
glGenRenderbuffersEXT(1, &depth_rb);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depth_rb);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT24, width, height);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depth_rb);

// Does the GPU support the current FBO configuration?
GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
switch (status) {
    case GL_FRAMEBUFFER_COMPLETE_EXT:
        DDLogInfo(@"gl no problem");
        break;
    default:
        DDLogInfo(@"error");
        break;
}

// And now you can render to the FBO (also called a renderbuffer)
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);

When running the program I get the "gl no problem" log. However, when reading off-screen image data, although glGetError does not return an error code, I can only read a black image. In previous versions, an image rendered by QCRenderer could be obtained successfully.
Reading off-screen images is implemented as follows:

- (CVPixelBufferRef)readPixelBuffer
{
    // Create pixel buffer from pixel buffer pool
    CVPixelBufferRef bufferRef;
    OSErr theError = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, _bufferPool, &bufferRef);
    if (theError) {
        DDLogInfo(@"CVPixelBufferPoolCreatePixelBuffer() failed with error %i", theError);
        return nil;
    }
    theError = CVPixelBufferLockBaseAddress(bufferRef, 0);
    if (theError) {
        DDLogInfo(@"CVPixelBufferLockBaseAddress() failed with error %i", theError);
        return nil;
    }
    void *bufferPtr = CVPixelBufferGetBaseAddress(bufferRef);
    size_t width = CVPixelBufferGetWidth(bufferRef);
    size_t height = CVPixelBufferGetHeight(bufferRef);
    size_t bufferRowBytes = CVPixelBufferGetBytesPerRow(bufferRef);
    CGLContextObj cgl_ctx = [_openGLContext CGLContextObj];
    CGLLockContext(cgl_ctx);

    // Read pixels back from the OpenGL pixel buffer in ARGB 32-bit format.
    // For extra safety, we save / restore the OpenGL states we change.
    GLint save;
    glGetIntegerv(GL_PACK_ROW_LENGTH, &save);
    glPixelStorei(GL_PACK_ROW_LENGTH, (int)bufferRowBytes / 4);
    glReadPixels(0, 0, (GLsizei)width, (GLsizei)height, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, bufferPtr);
    flipImage(bufferPtr, width, height, bufferRowBytes);
    glPixelStorei(GL_PACK_ROW_LENGTH, save);
    CGLUnlockContext(cgl_ctx);

    GLenum code = glGetError();
    if (code)
        return nil;
    CVPixelBufferUnlockBaseAddress(bufferRef, 0);
    return bufferRef;
}

Could an expert advise on how to solve this problem?
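One thing worth checking (an assumption, since the full rendering code isn't shown): with an FBO, glReadPixels reads from the currently bound framebuffer's read buffer, and without a visible drawable the default GL_BACK read buffer yields black. A hedged sketch of the readback setup, where `fb` is assumed to be the framebuffer object created earlier:

```swift
import OpenGL.GL

// Hypothetical sketch: before calling glReadPixels, make sure the FBO is
// bound and the read buffer targets its color attachment rather than the
// (non-existent) back buffer of an off-screen context.
func bindFBOForReadback(_ fb: GLuint) {
    glBindFramebufferEXT(GLenum(GL_FRAMEBUFFER_EXT), fb)
    // Direct reads at the FBO's color attachment instead of GL_BACK.
    glReadBuffer(GLenum(GL_COLOR_ATTACHMENT0_EXT))
    // Ensure all rendering commands have completed before reading back.
    glFlush()
}
```

The rendering itself must also happen after the FBO is bound; if QCRenderer was created against the context before the FBO existed, it may still be drawing into the default framebuffer.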
Posted by
Post not yet marked as solved
1 Reply
262 Views
I wonder if anyone else misses Quartz Composer on the Mac. It was a really powerful, native, and free app for creatives, with a small but devoted user base that would only have grown if Apple had shown any support. Upvote this post if you agree Apple should bring back support for Quartz Composer.
Posted by
Post not yet marked as solved
0 Replies
268 Views
If I enlarge the image 8x with a pinch-out gesture, the displayed area of the subimage and its touchable area no longer match; it looks as if the touchable area has been shifted from the image area toward the top left by about 20dp. Please guide me on how to make the displayed area and the touchable area of the subimage on the image view match exactly. I set the autoresizesSubviews property of the image view to true, and the subimages are scaled with CATransform3DMakeScale when pinching out.
Posted by
Post not yet marked as solved
2 Replies
745 Views
When I open a video in HTML, the app crashes as soon as I start playing it. Here is my app crash report:

0  CoreFoundation __HALT + 2
1  QuartzCore CA::Layer::setter(unsigned int, _CAValueType, void const*) + 252
2  QuartzCore -[CALayer setBackgroundColor:] + 56
3  QuartzCore -[CAStateSetValue apply:] + 620
4  QuartzCore -[CAStateController setState:ofLayer:transitionSpeed:] + 1364
5  AVKit -[AVMicaPackage transitionToStateWithName:onLayer:] + 144
6  AVKit -[AVMicaPackage transitionToStateWithName:] + 92
7  AVKit -[AVMicaPackage _setState:] + 108
8  AVKit -[AVMicaPackage setState:color:] + 116
9  AVKit -[AVPlaybackControlsRoutePickerView updateButtonAppearance] + 204
10 AVKit -[AVRoutePickerView layoutSubviews] + 632
11 UIKitCore -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 2500
12 QuartzCore -[CALayer layoutSublayers] + 296
13 QuartzCore CA::Layer::layout_if_needed(CA::Transaction*) + 524
14 QuartzCore CA::Layer::layout_and_display_if_needed(CA::Transaction*) + 144
15 QuartzCore CA::Context::commit_transaction(CA::Transaction*, double, double*) + 416
16 QuartzCore CA::Transaction::commit() + 732
17 QuartzCore CA::Transaction::observer_callback(__CFRunLoopObserver*, unsigned long, void*) + 96
18 CoreFoundation __CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION__ + 36
19 CoreFoundation __CFRunLoopDoObservers + 576
20 CoreFoundation __CFRunLoopRun + 1056
21 CoreFoundation CFRunLoopRunSpecific + 600
22 GraphicsServices GSEventRunModal + 164
23 UIKitCore -[UIApplication _run] + 1072
24 UIKitCore UIApplicationMain + 168
25 AXNews main.m, line 15, main + 15
26 libdyld.dylib start + 4
Posted by
Post not yet marked as solved
2 Replies
1.5k Views
On iOS, I'm trying to convert the contents of a view produced by snapshotView(afterScreenUpdates:) into a UIImage. At the point where I'm trying to do this, I no longer have the underlying objects that were rendered into the snapshot - I only have the snapshot view. Although this snapshot view displays correctly on the device screen at all times, I seem to be unable to convert it to an image to save to disk.

For example, say I do something like:

let ssView = someView.snapshotView(afterScreenUpdates: true)

And then later, I want to convert ssView to a UIImage. I have tried all the usual patterns that work with other (non-snapshot) views, such as:

UIGraphicsBeginImageContext(ssView.frame.size)
ssView.drawHierarchy(in: ssView.bounds, afterScreenUpdates: true)
let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

or:

let renderer = UIGraphicsImageRenderer(size: ssView.frame.size)
let image = renderer.image { context in
    return ssView.layer.render(in: context.cgContext)
}

And many other variations... I've also tried to render the hierarchy from superviews of the snapshot view, but no matter what I try, it doesn't work. Attempting to capture the snapshot view directly produces a blank image, and if the parent view is captured, the portion of the screen where the snapshot view resides is blank.

Does anyone have any idea how to make this work?
Thanks, David
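One commonly suggested workaround - hedged, since it assumes the source view is still alive at capture time - is to render the original view rather than the snapshot view, because snapshot views are composited by the render server and have no backing store of their own for drawHierarchy or layer.render to read:

```swift
import UIKit

// Sketch: capture a UIImage from the *source* view directly, instead of
// from a snapshot view produced by snapshotView(afterScreenUpdates:).
func captureImage(of sourceView: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: sourceView.bounds)
    return renderer.image { _ in
        // afterScreenUpdates: true waits for pending layout and display.
        _ = sourceView.drawHierarchy(in: sourceView.bounds,
                                     afterScreenUpdates: true)
    }
}
```

If the source objects really must be discarded, the image has to be captured at the moment the snapshot is taken, and the UIImage kept instead of (or alongside) the snapshot view.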
Posted by
Post not yet marked as solved
2 Replies
1.7k Views
Hi, in our app we render PDF pages to screen with a CATiledLayer. We have occasional crashes, mostly on iPads. Every time it crashes there are two or three concurrent threads rendering the page. I'm not able to reproduce it. Any idea how to fix this?

The code looks like this:

override func draw(_ layer: CALayer, in ctx: CGContext) {
    guard let page = self.page else { return }
    ctx.setFillColor(red: 1.0, green: 1.0, blue: 1.0, alpha: 1.0)
    ctx.fill(ctx.boundingBoxOfClipPath)
    let layerSize = layer.bounds.size
    // Flip the context so that the PDF page is rendered right side up.
    ctx.translateBy(x: 0.0, y: layerSize.height)
    ctx.scaleBy(x: 1.0, y: -1.0)
    let pageMediaBox = page.getBoxRect(.mediaBox)
    let fillWithPDFPageVerticalProportion = layerSize.height / pageMediaBox.height
    let fillWithPDFPageHorizontalProportion = layerSize.width / pageMediaBox.width
    let fillWithPDFPageProportion = min(fillWithPDFPageVerticalProportion, fillWithPDFPageHorizontalProportion)
    ctx.translateBy(
        x: -(pageMediaBox.width * fillWithPDFPageProportion - layerSize.width) / 2,
        y: -(pageMediaBox.height * fillWithPDFPageProportion - layerSize.height) / 2
    )
    ctx.scaleBy(x: fillWithPDFPageProportion, y: fillWithPDFPageProportion)
    ctx.drawPDFPage(page)
}

This is a stack trace from the crashing thread:

#8. Crashed: com.apple.root.default-qos
0  libobjc.A.dylib 0x185a8c1a0 objc_retain + 16
1  ImageIO 0x1888dd5cc IIOImageRead::IIOImageRead(CGDataProvider*, bool) + 88
2  ImageIO 0x1888df93c CGImageReadCreateWithProvider + 196
3  ImageIO 0x18874ac6c IIOImageSource::IIOImageSource(CGDataProvider*, IIODictionary*) + 96
4  ImageIO 0x18874ec50 CGImageSourceCreateWithDataProvider + 172
5  CoreGraphics 0x18803a464 CGImageCreateWithJPEGDataProvider3 + 84
6  CoreGraphics 0x187fb7264 create_image_for_image + 188
7  CoreGraphics 0x187fb713c CGPDFImageCreateImage + 184
8  CoreGraphics 0x1880256f8 CGPDFDrawingContextDrawImage + 40
9  CoreGraphics 0x1880de03c op_Do + 104
10 CoreGraphics 0x1882e6684 pdf_scanner_handle_xname + 144
11 CoreGraphics 0x1882e5f0c CGPDFScannerScan + 368
12 CoreGraphics 0x1882f0170 CGPDFDrawingContextDrawPage + 516
13 CoreGraphics 0x1880c0a04 pdf_page_draw_in_context + 132
14 CoreGraphics 0x187fa763c CGContextDrawPDFPageWithDrawingCallbacks + 76
15 CoreGraphics 0x187fa7310 CGContextDrawPDFPage + 32
16 myApp 0x1009e8cf0 specialized PDFTiledView.draw(CALayer, in: CGContext) -> () (PDFTiledView.swift:59)
17 myApp 0x1009e88ec @objc PDFTiledView.draw(CALayer, in: CGContext) -> () (PDFTiledView.swift)
18 QuartzCore 0x18a86b51c -[CALayer drawInContext:] + 296
19 QuartzCore 0x18a79c780 tiled_layer_render(_CAImageProvider*, unsigned int, unsigned int, unsigned int, unsigned int, void*) + 1532
20 QuartzCore 0x18a836448 CAImageProviderThread(unsigned int*, bool) + 812
21 libdispatch.dylib 0x1861a6a14 _dispatch_client_callout + 16
22 libdispatch.dylib 0x1861adbc8 _dispatch_queue_override_invoke$VARIANT$mp + 716
23 libdispatch.dylib 0x1861b3cf4 _dispatch_root_queue_drain + 600
24 libdispatch.dylib 0x1861b3a38 _dispatch_worker_thread3 + 120
25 libsystem_pthread.dylib 0x18644f06c _pthread_wqthread + 1268
26 libsystem_pthread.dylib 0x18644eb6c start_wqthread + 4

There are two threads with this stack trace:

#9. com.apple.root.default-qos
0  libsystem_kernel.dylib 0x18633c138 __psynch_mutexwait + 8
1  libsystem_pthread.dylib 0x186453660 _pthread_mutex_lock_wait + 96
2  libsystem_pthread.dylib 0x1864535a4 _pthread_mutex_lock_slow$VARIANT$mp + 264
3  CoreGraphics 0x187fb70ac CGPDFImageCreateImage + 40
4  CoreGraphics 0x1880256f8 CGPDFDrawingContextDrawImage + 40
5  CoreGraphics 0x1880de03c op_Do + 104
6  CoreGraphics 0x1882e6684 pdf_scanner_handle_xname + 144
7  CoreGraphics 0x1882e5f0c CGPDFScannerScan + 368
8  CoreGraphics 0x1882f0170 CGPDFDrawingContextDrawPage + 516
9  CoreGraphics 0x1880c0a04 pdf_page_draw_in_context + 132
10 CoreGraphics 0x187fa763c CGContextDrawPDFPageWithDrawingCallbacks + 76
11 CoreGraphics 0x187fa7310 CGContextDrawPDFPage + 32
12 myApp 0x1009e8cf0 specialized PDFTiledView.draw(CALayer, in: CGContext) -> () (PDFTiledView.swift:59)
13 myApp 0x1009e88ec @objc PDFTiledView.draw(CALayer, in: CGContext) -> () (PDFTiledView.swift)
14 QuartzCore 0x18a86b51c -[CALayer drawInContext:] + 296
15 QuartzCore 0x18a79c780 tiled_layer_render(_CAImageProvider*, unsigned int, unsigned int, unsigned int, unsigned int, void*) + 1532
16 QuartzCore 0x18a836448 CAImageProviderThread(unsigned int*, bool) + 812
17 libdispatch.dylib 0x1861a6a14 _dispatch_client_callout + 16
18 libdispatch.dylib 0x1861adbc8 _dispatch_queue_override_invoke$VARIANT$mp + 716
19 libdispatch.dylib 0x1861b3cf4 _dispatch_root_queue_drain + 600
20 libdispatch.dylib 0x1861b3a38 _dispatch_worker_thread3 + 120
21 libsystem_pthread.dylib 0x18644f06c _pthread_wqthread + 1268
22 libsystem_pthread.dylib 0x18644eb6c start_wqthread + 4

Thanks, Snert
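A pattern several people have used for this class of crash (an assumption here, since it depends on how the page object is shared): CATiledLayer invokes draw(_:in:) concurrently on multiple background threads, and drawing pages of a single CGPDFDocument from several threads at once is not reliably safe, so the drawPDFPage calls can be serialized with a lock shared by everything rendering that document. The name `pdfDrawLock` is introduced here purely for illustration:

```swift
import Foundation
import CoreGraphics

// Sketch: serialize CGPDFPage drawing across CATiledLayer's concurrent
// draw callbacks. `pdfDrawLock` would be shared by every view/layer that
// renders pages of the same CGPDFDocument.
let pdfDrawLock = NSLock()

func drawPageSerialized(_ page: CGPDFPage, in ctx: CGContext) {
    pdfDrawLock.lock()
    defer { pdfDrawLock.unlock() }
    ctx.drawPDFPage(page)
}
```

An alternative, if memory allows, is to open a separate CGPDFDocument per rendering thread so no CGPDF state is shared at all.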
Posted by
Post not yet marked as solved
3 Replies
1.4k Views
Ok, maybe I'm hung up on Mac OS X design patterns; just tell me if that's the case. The problem: I need to draw a line on an existing CGImage. There is no lockFocus method for CGImage, there is no obvious documented way to get at an already-created context of a bitmap in the CGImage, and there is no obvious or explicit code explaining the procedure. I have the reference to the image, I have the two points that make up the line, and I know all of the code to draw that line... I just cannot make the context associated with that image the current context. What gives? What is the design pattern? Please tell me where I can read about this.
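For what it's worth, the usual Quartz pattern - there is no context to make current, because a CGImage is immutable - is to draw the image into a new bitmap context, draw the line on top, and snapshot a new CGImage from that context. A sketch (the function name and stroke settings are illustrative):

```swift
import CoreGraphics

// Sketch: "drawing on a CGImage" means drawing it into a fresh bitmap
// context, drawing the line on top, then making a new CGImage.
func image(byDrawingLineOn source: CGImage,
           from p0: CGPoint, to p1: CGPoint) -> CGImage? {
    guard let ctx = CGContext(data: nil,
                              width: source.width,
                              height: source.height,
                              bitsPerComponent: 8,
                              bytesPerRow: 0,
                              space: CGColorSpace(name: CGColorSpace.sRGB)!,
                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }
    // Draw the original image as the backdrop.
    ctx.draw(source, in: CGRect(x: 0, y: 0,
                                width: source.width, height: source.height))
    // Draw the line on top (red, 2pt, as an example).
    ctx.setStrokeColor(red: 1, green: 0, blue: 0, alpha: 1)
    ctx.setLineWidth(2)
    ctx.move(to: p0)
    ctx.addLine(to: p1)
    ctx.strokePath()
    // Snapshot a new, immutable CGImage from the context.
    return ctx.makeImage()
}
```

Note that Quartz's coordinate origin is the bottom-left, so the line's points may need flipping if they come from a top-left (UIKit-style) coordinate space.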
Posted by