Core Graphics


Harness the power of Quartz technology to perform lightweight 2D rendering with high-fidelity output using Core Graphics.

Core Graphics Documentation

Posts under Core Graphics tag

51 Posts
Post not yet marked as solved
0 Replies
72 Views
I have this snippet:

    extension UIView {
        func takeSnapshot(rect: CGRect? = CGRect.zero) -> UIImage? {
            let renderer = UIGraphicsImageRenderer(size: frame.size)
            var image = renderer.image { _ in
                drawHierarchy(in: bounds, afterScreenUpdates: true)
            }
            if let imageRect = rect, imageRect != CGRect.zero {
                let screenshotFrame = CGRect(
                    x: imageRect.origin.x * UIScreen.main.scale,
                    y: imageRect.origin.y * UIScreen.main.scale,
                    width: imageRect.size.width * UIScreen.main.scale,
                    height: imageRect.size.height * UIScreen.main.scale)
                let imageRef = image.cgImage!.cropping(to: screenshotFrame)
                image = UIImage(cgImage: imageRef!, scale: image.scale, orientation: image.imageOrientation)
            }
            // Note: the original had a stray UIGraphicsEndImageContext() here; it is
            // unnecessary (and a mismatch) because UIGraphicsImageRenderer manages its own context.
            return image
        }
    }

which was working fine until I updated from macOS 14.2.1 to 14.4.1 and from Xcode 15.0 to 15.3.

The issue: from my Mac Catalyst app, if I try to take a screenshot of an image view, the screenshot comes out brighter than the original. This method seems to work instead:

    func takeSnapshotWithoutScale() -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(self.frame.size, false, 0)
        if let currentContext = UIGraphicsGetCurrentContext() {
            self.layer.render(in: currentContext)
        }
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage
    }
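One thing worth trying (this is my assumption, not something confirmed in the post): newer OS versions may give `UIGraphicsImageRenderer` an extended-range context by default, which could account for the brightness shift. Forcing a standard-range format is a minimal sketch of that idea; the method name `takeSnapshotStandardRange` is mine:

```swift
#if canImport(UIKit)
import UIKit

extension UIView {
    /// Sketch: render the snapshot with an explicit standard-range format,
    /// on the assumption that the brightness shift comes from the renderer
    /// choosing an extended-range (EDR) context by default on newer OS versions.
    func takeSnapshotStandardRange() -> UIImage? {
        let format = UIGraphicsImageRendererFormat.preferred()
        format.preferredRange = .standard // force SDR instead of extended range
        let renderer = UIGraphicsImageRenderer(size: bounds.size, format: format)
        return renderer.image { _ in
            drawHierarchy(in: bounds, afterScreenUpdates: true)
        }
    }
}
#endif
```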
Posted Last updated
.
Post not yet marked as solved
0 Replies
104 Views
I am trying to create a near real-time drawing of waveform data from within a SwiftUI app. The data is streaming in from the hardware and I've verified that the draw(in ctx: CGContext) override in my custom CALayer is getting called. I have added this custom CALayer class as a sublayer to a UIView instance that I am making available via the UIViewRepresentable protocol. The only time I see updated output from the CALayer is when I rotate the device and layout happens (I assume). How can I force SwiftUI to update every time I render new data in my CALayer? More Info: I'm porting an app from the Windows desktop. Previously, I tried to make this work by simply generating a new UIImage from a CGContext every time I wanted to update the display. I quickly exhausted memory with that technique because a new context is being created every time I call UIGraphicsImageRenderer(size:).image { context in }. What I really wanted was something equivalent to a GDI WritableBitmap. Apparently this animal allows a programmer to continuously update and re-use the contents. I could not figure out how to do this in Swift without dropping down to the old CGBitmapContext stuff written in C and even then I wasn't sure if that would give me a reusable context that I could output in SwiftUI each time I refreshed it. CALayer seemed like the answer. I welcome any feedback on a better way to do what I'm trying to accomplish.
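A `CALayer` only redraws when it is explicitly invalidated, which would explain why the output only updates on rotation. One pattern that addresses both halves of the question is below; it is a sketch under my own names (`SampleBuffer`, `WaveformLayer` are not from the post): a fixed-capacity ring buffer gives you reusable storage, playing a role similar to a GDI WritableBitmap's backing store, and each batch of new samples ends with `setNeedsDisplay()` on the main thread so `draw(in:)` runs again.

```swift
import Foundation
#if canImport(UIKit)
import UIKit
#endif

/// Fixed-capacity ring buffer: reusable storage for streaming samples,
/// so no new context or image is allocated per update.
struct SampleBuffer {
    private var storage: [Float]
    private var head = 0
    private(set) var count = 0

    init(capacity: Int) { storage = Array(repeating: 0, count: capacity) }

    mutating func append(_ sample: Float) {
        storage[head] = sample
        head = (head + 1) % storage.count
        count = min(count + 1, storage.count)
    }

    /// Samples in arrival order, oldest first.
    var samples: [Float] {
        guard count == storage.count else { return Array(storage[0..<count]) }
        return Array(storage[head...] + storage[..<head])
    }
}

#if canImport(UIKit)
final class WaveformLayer: CALayer {
    var buffer = SampleBuffer(capacity: 4096)

    func push(_ newSamples: [Float]) {
        for s in newSamples { buffer.append(s) }
        // The crucial step: invalidate the layer so draw(in:) is called again.
        DispatchQueue.main.async { self.setNeedsDisplay() }
    }

    override func draw(in ctx: CGContext) {
        // Stroke buffer.samples as a CGPath here.
    }
}
#endif
```

In the `UIViewRepresentable` wrapper, `updateUIView` would simply call `push(_:)` (or `setNeedsDisplay()`) on the layer whenever new data arrives.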
Posted Last updated
.
Post marked as solved
1 Replies
174 Views
I have written some code for an interactive canvas here and it all compiles and works correctly:

    import SwiftUI

    extension CGPoint: Hashable {
        public func hash(into hasher: inout Hasher) {
            hasher.combine(x)
            hasher.combine(y)
        }
    }

    struct Line {
        var points = [CGPoint]()
        var color: Color = .red
        var lineWidth: Double = 10.0
    }

    struct CharacterCanvas: View {
        @State private var currentLine = Line()
        @State private var lines: [Line] = []

        var body: some View {
            Canvas(opaque: false, colorMode: .linear, rendersAsynchronously: false) { context, size in
                for line in lines {
                    var path = Path()
                    path.addLines(line.points)
                    context.stroke(path, with: .color(line.color), lineWidth: line.lineWidth)
                }
            }
            .gesture(DragGesture(minimumDistance: 0, coordinateSpace: .local)
                .onChanged({ value in
                    let newPoint = value.location
                    currentLine.points.append(newPoint)
                    self.lines.append(currentLine)
                })
                .onEnded({ value in
                    self.currentLine = Line()
                })
            )
            .frame(minWidth: UIScreen.main.bounds.size.width, minHeight: UIScreen.main.bounds.size.width)
            .border(.red)
            .padding()
            Button("Clear") {
                currentLine = Line()
                lines = []
            }
            ScrollView {
                Text("Screen Size: \(UIScreen.main.bounds.size.width)")
                VStack {
                    if !lines.isEmpty {
                        ForEach(lines.last!.points, id: \.self) { point in
                            Text("\(point.x), \(point.y)")
                        }
                    }
                }
            }
        }
    }

    #Preview {
        CharacterCanvas()
    }

I now want to find 10 equally spaced points for each Line struct, based on its points array, so I can feed them into a Core ML model to classify the line type. How would I go about finding these 10 equally spaced points? I might also need to generate additional points if there are fewer than 10 points in the points array. Thanks, Jesse
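One way to get N equally spaced points is to walk the polyline by cumulative arc length and linearly interpolate; this also generates extra points when the input has fewer than N. A sketch under my own names (`resample`, and a local `Pt` struct standing in for `CGPoint`); mapping it onto `Line.points` is a one-line change:

```swift
import Foundation

struct Pt: Equatable {
    var x: Double
    var y: Double
}

/// Resample a polyline into `count` points spaced equally by arc length.
/// Degenerate inputs (one point, or zero total length) repeat the first point.
func resample(_ points: [Pt], count: Int) -> [Pt] {
    guard let first = points.first, count > 0 else { return [] }
    guard count > 1 else { return [first] }
    guard points.count > 1 else { return Array(repeating: first, count: count) }

    // Cumulative distance along the polyline at each vertex.
    var cumulative: [Double] = [0]
    for i in 1..<points.count {
        let dx = points[i].x - points[i - 1].x
        let dy = points[i].y - points[i - 1].y
        cumulative.append(cumulative[i - 1] + (dx * dx + dy * dy).squareRoot())
    }
    let total = cumulative.last!
    guard total > 0 else { return Array(repeating: first, count: count) }

    var result: [Pt] = []
    var segment = 1
    for i in 0..<count {
        // Target distance of the i-th sample, from 0 to total inclusive.
        let target = total * Double(i) / Double(count - 1)
        while segment < points.count - 1 && cumulative[segment] < target {
            segment += 1
        }
        let d0 = cumulative[segment - 1], d1 = cumulative[segment]
        let t = d1 > d0 ? (target - d0) / (d1 - d0) : 0
        let a = points[segment - 1], b = points[segment]
        result.append(Pt(x: a.x + (b.x - a.x) * t, y: a.y + (b.y - a.y) * t))
    }
    return result
}
```

For example, `resample(linePoints, count: 10)` yields exactly ten points regardless of how many the drag gesture recorded.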
Posted
by jcovin293.
Last updated
.
Post marked as solved
2 Replies
199 Views
Hello, the documentation says CGDisplayCreateImage() is deprecated. Is there an equivalent that can be used instead of CGDisplayCreateImage()? (Any function that implements the same functionality.) Thank you for the help, Pavel
Posted
by __Pavel__.
Last updated
.
Post not yet marked as solved
3 Replies
622 Views
Hello, I'm wondering if there is a way to programmatically write a series of UIImages into an APNG, similar to what the code below does for GIFs (credit: https://github.com/AFathi/ARVideoKit/tree/swift_5). I've tried implementing a similar solution but it doesn't seem to work. My code is included below. I've also done a lot of searching and have found lots of code for displaying APNGs, but have had no luck with code for writing them. Any hints or pointers would be appreciated.

    func generate(gif images: [UIImage], with delay: Float, loop count: Int = 0, _ finished: ((_ status: Bool, _ path: URL?) -> Void)? = nil) {
        currentGIFPath = newGIFPath
        gifQueue.async {
            let gifSettings = [kCGImagePropertyGIFDictionary as String: [kCGImagePropertyGIFLoopCount as String: count]]
            let imageSettings = [kCGImagePropertyGIFDictionary as String: [kCGImagePropertyGIFDelayTime as String: delay]]
            guard let path = self.currentGIFPath else { return }
            guard let destination = CGImageDestinationCreateWithURL(path as CFURL, __UTTypeGIF as! CFString, images.count, nil) else {
                finished?(false, nil)
                return
            }
            //logAR.message("\(destination)")
            CGImageDestinationSetProperties(destination, gifSettings as CFDictionary)
            for image in images {
                if let imageRef = image.cgImage {
                    CGImageDestinationAddImage(destination, imageRef, imageSettings as CFDictionary)
                }
            }
            if !CGImageDestinationFinalize(destination) {
                finished?(false, nil)
            } else {
                finished?(true, path)
            }
        }
    }

My adaptation of the above code for APNGs (this didn't work and produced an empty file; note that, unlike the GIF version, it never called CGImageDestinationFinalize, which is the call that actually writes the data to disk):

    func generateAPNG(images: [UIImage], delay: Float, count: Int = 0) {
        let apngSettings = [kCGImagePropertyPNGDictionary as String: [kCGImagePropertyAPNGLoopCount as String: count]]
        let imageSettings = [kCGImagePropertyPNGDictionary as String: [kCGImagePropertyAPNGDelayTime as String: delay]]
        guard let destination = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.png.identifier as CFString, images.count, nil) else {
            fatalError("Failed")
        }
        CGImageDestinationSetProperties(destination, apngSettings as CFDictionary)
        for image in images {
            if let imageRef = image.cgImage {
                CGImageDestinationAddImage(destination, imageRef, imageSettings as CFDictionary)
            }
        }
        // Missing from the original adaptation: finalize the destination to flush the APNG to disk.
        if !CGImageDestinationFinalize(destination) {
            fatalError("Finalize failed")
        }
    }
Posted
by wmk.
Last updated
.
Post not yet marked as solved
0 Replies
225 Views
We have started seeing a bunch of crashes in my app with the following crash log. It seems to happen inconsistently in the app and we are not able to replicate the crash locally. Does anyone have any idea what the crash might be caused by? Is it a bug in iOS 17?

    Exception Type:    EXC_BAD_ACCESS (SIGSEGV)
    Exception Subtype: KERN_INVALID_ADDRESS at 0x0000000000000000
    Exception Codes:   0x0000000000000001, 0x0000000000000000
    VM Region Info: 0 is not in any region.  Bytes before following region: 4343709696
          REGION TYPE          START - END      [ VSIZE] PRT/MAX SHRMOD  REGION DETAIL
          UNUSED SPACE AT START
    ---> __TEXT         102e7c000-102e8c000    [   64K] r-x/r-x SM=COW  ...ea.app/MyApp
    Termination Reason: SIGNAL 11 Segmentation fault: 11
    Terminating Process: exc handler [71670]

    Triggered by Thread: 0

    Thread 0 name:
    Thread 0 Crashed:
    0   libsystem_platform.dylib  0x000000022147ced4 _platform_memmove + 52
    1   QuartzCore        0x00000001b9a66864 CA::Render::InterpolatedFunction::encode(CA::Render::Encoder*) const + 248 (render-function.cpp:591)
    2   QuartzCore        0x00000001b9a66684 CA::Render::GradientLayer::encode(CA::Render::Encoder*) const + 44 (render-gradient-layer.cpp:658)
    3   QuartzCore        0x00000001b995eb6c CA::Render::Layer::encode(CA::Render::Encoder*) const + 284 (render-layer.cpp:5504)
    4   QuartzCore        0x00000001b995ea0c CA::Render::encode_set_object(CA::Render::Encoder*, unsigned long, unsigned int, CA::Render::Object*, unsigned int) + 196 (render-coding.cpp:2822)
    5   QuartzCore        0x00000001b995be3c invocation function for block in CA::Context::commit_transaction(CA::Transaction*, double, double*) + 244 (CAContextInternal.mm:3657)
    6   QuartzCore        0x00000001b995bce4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 368 (CALayer.mm:2786)
    7   QuartzCore        0x00000001b995bc70 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252 (CALayer.mm:2772)
    8   QuartzCore        0x00000001b995bca4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 304 (CALayer.mm:2779)
    9   QuartzCore        0x00000001b995bc70 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252 (CALayer.mm:2772)
    10  QuartzCore        0x00000001b99a0334 CA::Context::commit_transaction(CA::Transaction*, double, double*) + 11192 (CAContextInternal.mm:3662)
    11  QuartzCore        0x00000001b9996c3c CA::Transaction::commit() + 648 (CATransactionInternal.mm:432)
    12  QuartzCore        0x00000001b99968e4 CA::Transaction::flush_as_runloop_observer(bool) + 88 (CATransactionInternal.mm:942)
    13  UIKitCore         0x00000001ba5f7228 _UIApplicationFlushCATransaction + 52 (UIApplication.m:3158)
    14  UIKitCore         0x00000001ba5f6d40 _UIUpdateSequenceRun + 84 (_UIUpdateSequence.mm:119)
    15  UIKitCore         0x00000001ba5f6430 schedulerStepScheduledMainSection + 144 (_UIUpdateScheduler.m:1037)
    16  UIKitCore         0x00000001ba5f64ec runloopSourceCallback + 92 (_UIUpdateScheduler.m:1186)
    17  CoreFoundation    0x00000001b8370acc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28 (CFRunLoop.c:1957)
    18  CoreFoundation    0x00000001b836fd48 __CFRunLoopDoSource0 + 176 (CFRunLoop.c:2001)
    19  CoreFoundation    0x00000001b836e4fc __CFRunLoopDoSources0 + 244 (CFRunLoop.c:2038)
    20  CoreFoundation    0x00000001b836d238 __CFRunLoopRun + 828 (CFRunLoop.c:2955)
    21  CoreFoundation    0x00000001b836ce18 CFRunLoopRunSpecific + 608 (CFRunLoop.c:3420)
    22  GraphicsServices  0x00000001fae315ec GSEventRunModal + 164 (GSEvent.c:2196)
    23  UIKitCore         0x00000001ba77b2fc -[UIApplication _run] + 888 (UIApplication.m:3690)
    24  UIKitCore         0x00000001ba77a938 UIApplicationMain + 340 (UIApplication.m:5275)
    25  UIKitCore         0x00000001ba99e44c UIApplicationMain(_:_:_:_:) + 104 (UIKit.swift:539)
    26  MyApp             0x0000000102e8da4c specialized static UIApplicationDelegate.main() + 28 (<compiler-generated>:27)
    27  MyApp             0x0000000102e8da4c static AppDelegate.$main() + 28 (AppDelegate.swift:0)
    28  MyApp             0x0000000102e8da4c main + 120
    29  dyld              0x00000001dab57d44 start + 2104 (dyldMain.cpp:1269)
Posted Last updated
.
Post not yet marked as solved
0 Replies
253 Views
Problem: I need to import a video, process it, and then export the video with alpha. I noticed the video gets a lot grayer/loses quality compared to the original. I don't need any compression. Sidenote: I need to export videos with transparency enabled, that's why I use AVAssetExportPresetHEVCHighestQualityWithAlpha. It seems that that is causing the problem, since AVAssetExportPresetHighestQuality looks good.

These are side-by-side frames of the original and a processed video. The left is the original frame, the right is a processed video: https://i.stack.imgur.com/ORqfz.png

This is another example where the bottom is exported and the top is the original. You can see at the bar where the YouTube NL is displayed that the top one is almost fully black, while the bottom one (exported) is really gray: https://i.stack.imgur.com/s8lCn.png

As far as I know, I don't do anything special, I just load the video and directly export it. It still loses quality. How can I prevent this?

Reproduction path: You can either clone the repository, or see the code below. The repository is available here: https://github.com/Jasperav/VideoCompression/tree/main/VideoCompressionTests. After you have cloned it, run the only unit test and check the logging for where the output of the video is stored. You can then observe that temp.mov is a lot grayer than the original video.

The code for importing and exporting the video is here. As far as I can see, I just import and directly export the movie without modifying it. What's the problem?

    import AppKit
    import AVFoundation
    import Foundation
    import Photos
    import QuartzCore
    import OSLog

    let logger = Logger()

    class VideoEditor {
        func export(url: URL, outputDir: URL) async {
            let asset = AVURLAsset(url: url)
            let extract = try! await extractData(videoAsset: asset)
            try! await exportVideo(outputPath: outputDir, asset: asset, videoComposition: extract)
        }

        private func exportVideo(outputPath: URL, asset: AVAsset, videoComposition: AVMutableVideoComposition) async throws {
            let fileExists = FileManager.default.fileExists(atPath: outputPath.path())
            logger.debug("Output dir: \(outputPath), exists: \(fileExists), render size: \(String(describing: videoComposition.renderSize))")
            if fileExists {
                do {
                    try FileManager.default.removeItem(atPath: outputPath.path())
                } catch {
                    logger.error("remove file failed")
                }
            }
            let dir = outputPath.deletingLastPathComponent().path()
            logger.debug("Will try to create dir: \(dir)")
            try? FileManager.default.createDirectory(atPath: dir, withIntermediateDirectories: true)
            var isDirectory = ObjCBool(false)
            guard FileManager.default.fileExists(atPath: dir, isDirectory: &isDirectory), isDirectory.boolValue else {
                logger.error("Could not create dir, or dir is a file")
                fatalError()
            }
            guard let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHEVCHighestQualityWithAlpha) else {
                logger.error("generate export failed")
                fatalError()
            }
            exporter.outputURL = outputPath
            exporter.outputFileType = .mov
            exporter.shouldOptimizeForNetworkUse = false
            exporter.videoComposition = videoComposition
            await exporter.export()
            logger.debug("Status: \(String(describing: exporter.status)), error: \(exporter.error)")
            if exporter.status != .completed {
                fatalError()
            }
        }

        private func extractData(videoAsset: AVURLAsset) async throws -> AVMutableVideoComposition {
            guard let videoTrack = try await videoAsset.loadTracks(withMediaType: .video).first else {
                fatalError()
            }
            let composition = AVMutableComposition(urlAssetInitializationOptions: nil)
            guard let compositionVideoTrack = composition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: videoTrack.trackID) else {
                fatalError()
            }
            let duration = try await videoAsset.load(.duration)
            try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: duration), of: videoTrack, at: CMTime.zero)
            let naturalSize = try await videoTrack.load(.naturalSize)
            let preferredTransform = try await videoTrack.load(.preferredTransform)
            let mainInstruction = AVMutableVideoCompositionInstruction()
            mainInstruction.timeRange = CMTimeRange(start: CMTime.zero, end: duration)
            let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
            let videoComposition = AVMutableVideoComposition()
            let frameRate = try await videoTrack.load(.nominalFrameRate)
            videoComposition.frameDuration = CMTimeMake(value: 1, timescale: Int32(frameRate))
            mainInstruction.layerInstructions = [layerInstruction]
            videoComposition.instructions = [mainInstruction]
            videoComposition.renderSize = naturalSize
            return videoComposition
        }
    }
Posted
by Jasperav.
Last updated
.
Post not yet marked as solved
0 Replies
247 Views
I’m showing a PDF page in a UIView’s subview using PDFKit, along with some UILabel and UIImageView instances. At a time I’m only showing one page in the PDFView. Users can change the size and position of this PDFView.

    class ResizablePDFView: PDFView {
        override func draw(_ rect: CGRect) {
            super.draw(rect)
        }

        override func draw(_ layer: CALayer, in ctx: CGContext) {
            let isPDF = !UIGraphicsGetPDFContextBounds().isEmpty
            if isPDF {
                if let document = self.document, document.pageCount > 0, let page = document.page(at: 0) {
                    ctx.saveGState()
                    ctx.scaleBy(x: 1, y: -1)
                    ctx.translateBy(x: 0, y: -bounds.size.height)
                    let pageBounds = page.bounds(for: .mediaBox)
                    ctx.scaleBy(
                        x: bounds.size.width / pageBounds.size.width,
                        y: bounds.size.height / pageBounds.size.height)
                    ctx.translateBy(x: -pageBounds.origin.x, y: -pageBounds.origin.y)
                    page.draw(with: .mediaBox, to: ctx)
                    ctx.restoreGState()
                }
            } else {
                super.draw(layer, in: ctx)
            }
        }
    }

    class ResizableLabelView: UILabel {
        func setup() {
            self.font = UIFont.systemFont(ofSize: 20)
            self.textColor = UIColor.systemBlue
        }

        override func draw(_ rect: CGRect) {
            super.draw(rect)
        }

        override func draw(_ layer: CALayer, in ctx: CGContext) {
            let isPDF = !UIGraphicsGetPDFContextBounds().isEmpty
            if isPDF {
                draw(bounds)
            } else {
                super.draw(layer, in: ctx)
            }
        }
    }

Canvas view setup:

    class ViewController: UIViewController {
        var canvasView: UIView!
        var pdfView: ResizablePDFView!
        var label: ResizableLabelView!

        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view.
            self.view.backgroundColor = UIColor.gray
            self.canvasView = UIView(frame: CGRect(origin: CGPoint.zero, size: CGSize(width: 400, height: 573)))
            self.canvasView.center = self.view.center
            self.canvasView.backgroundColor = UIColor.white
            self.view.addSubview(self.canvasView)
            self.setupPDF()
            self.setupLabel()
        }

        func setupPDF() {
            self.pdfView = ResizablePDFView(frame: CGRect(origin: .zero, size: self.canvasView.frame.size))
            self.pdfView.backgroundColor = UIColor.clear
            self.canvasView.addSubview(self.pdfView)
            self.pdfView.autoScales = false
            self.pdfView.displayMode = .singlePage
            self.pdfView.displaysPageBreaks = false
            self.pdfView.pageBreakMargins = UIEdgeInsets.zero
            self.pdfView.pageShadowsEnabled = false
            if let file = Bundle.main.url(forResource: "sample_pdf", withExtension: "pdf") {
                if let pdfDocument = PDFDocument(url: file) {
                    let pageNumber: Int = 0
                    if let page = pdfDocument.page(at: pageNumber) {
                        let pageDocument = PDFDocument()
                        pageDocument.insert(page, at: 0)
                        self.pdfView.document = pageDocument
                        self.pdfView.minScaleFactor = self.pdfView.scaleFactorForSizeToFit
                        self.pdfView.maxScaleFactor = self.pdfView.scaleFactorForSizeToFit
                        self.pdfView.scaleFactor = self.pdfView.scaleFactorForSizeToFit
                    }
                }
            }
        }

        func setupLabel() {
            self.label = ResizableLabelView(frame: CGRect(x: 10, y: 10, width: 200, height: 50))
            self.label.setup()
            self.label.text = "Sample Text"
            self.label.sizeToFit()
            self.canvasView.addSubview(self.label)
        }
    }

Now I'm creating the PDF from canvasView:

    @IBAction func exportButtonAction(_ sender: UIButton) {
        let filePath = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.appendingPathComponent("Exported_PDF.pdf")
        UIGraphicsBeginPDFContextToFile(filePath.path, .zero, [kCGPDFContextCreator as String: "PDF Export Demo App"])
        guard let canvas = self.canvasView else { return }
        UIGraphicsBeginPDFPageWithInfo(canvas.bounds, nil)
        guard let context = UIGraphicsGetCurrentContext() else { return }
        canvas.setNeedsDisplay()
        canvas.layer.render(in: context)
        UIGraphicsEndPDFContext()
        print(filePath)
    }

This renders the UILabel and UIImageView into the PDF properly, without rasterization and with selectable text, but it does not draw the PDFView like the original PDF, with working links.
What am I doing wrong here? How can I debug this issue? https://drive.google.com/drive/folders/1qLcGNGWxoXbWJnr6xBxueKjPHnfxYS6D?usp=drive_link
Posted
by BV-OB.
Last updated
.
Post not yet marked as solved
1 Replies
510 Views
If someone in Apple WWDR sees this, please take the feedback to heart and report it up the chain: When you announce that a technology is being deprecated — such as CGDisplayStream — and also publish WWDC sessions about the intended replacement — ScreenCaptureKit — then you also need to give third-party developers a clear deadline by which this technology will be deprecated so that they can plan engineering efforts around implementing the new feature, and have ample time to communicate this to their customers. If it's important for third-party developers to get on board with this change, you should use every available means to communicate this to them, including multiple email alerts to their registered email address. Additionally, if you plan to make a BREAKING change in a framework that results in a wildly different user experience, you should probably hold that off until the summer release for the next major OS. What you should definitely NOT do is roll out a new privacy prompt in a mid-year release of macOS; or give your developers, customers, and AppleSeed program participants zero advance notice that this alert is coming, ignore your own Human Interface Guidelines when designing said prompt, and perform no user experience design testing (aka "putting on your customer hat") during a presumed internal alpha testing cycle to refine the experience and still find the most effective and least annoying way to present this additional prompt and spur change with your third-party developers. Oh, wait, you've done exactly all those things the wrong way with respect to ScreenCaptureKit. Right now, a host of Apple device administrators and client platform engineers are sending mountains of feedback to you, and they're also scrambling to contact third-party developers to let them know this is coming. Most of the vendors being discussed in private forums are said to be caught off guard by this change. 
We anticipate that users are not going to like this, and there is no way we can manage it with MDM or configuration profiles. In short, the current experience is a ghastly mess. WE, the administrators, will get blamed for this, not the third-party developers. WE will have to explain to our leadership why this experience is terrible and cannot be managed. Engineers need deadlines to help plan their work and prioritize tasks. In this case, vendors have had no firm deadline for this effort. There's already precedent for Apple announcing estimated deadlines for deprecations and feature removals. You do your developers and customers a great disservice by not communicating schedules to them. Please do better. P.S.: Feedback filed as FB13619326.
Posted Last updated
.
Post not yet marked as solved
2 Replies
635 Views
I'm trying to resize NSImages on macOS. I'm doing so with an extension like this:

    extension NSImage {
        // MARK: Resizing

        /// Resize the image to the given size.
        ///
        /// - Parameter size: The size to resize the image to.
        /// - Returns: The resized image.
        func resized(toSize targetSize: NSSize) -> NSImage? {
            let frame = NSRect(x: 0, y: 0, width: targetSize.width, height: targetSize.height)
            guard let representation = self.bestRepresentation(for: frame, context: nil, hints: nil) else {
                return nil
            }
            let image = NSImage(size: targetSize, flipped: false, drawingHandler: { (_) -> Bool in
                return representation.draw(in: frame)
            })
            return image
        }
    }

The problem is, as far as I can tell, the image that comes out of the drawing handler has lost the color profile of the original image rep. I'm testing it with a wide color gamut image, attached. This becomes pure red when examining the resulting image. If this were UIKit, I guess I'd use UIGraphicsImageRenderer and select the right UIGraphicsImageRendererFormat.Range, so I'm suspecting I need to use NSGraphicsContext here to do the rendering, but I can't see what I would set on it to make it use wide color, or how I'd use it.
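One approach (a sketch under my own assumptions, not a confirmed fix: the method name `resizedWideGamut` and the choice of Display P3 and 16 bits per sample are mine) is to draw into an explicitly wide-gamut `NSBitmapImageRep` via `NSGraphicsContext(bitmapImageRep:)`, so the destination color space is under your control rather than whatever the drawing-handler image picks:

```swift
#if canImport(AppKit)
import AppKit

extension NSImage {
    /// Resize by drawing into an explicitly wide-gamut (Display P3, 16-bit) bitmap,
    /// so the result is not clamped to an sRGB destination.
    func resizedWideGamut(to targetSize: NSSize) -> NSImage? {
        guard let bitmap = NSBitmapImageRep(bitmapDataPlanes: nil,
                                            pixelsWide: Int(targetSize.width),
                                            pixelsHigh: Int(targetSize.height),
                                            bitsPerSample: 16,
                                            samplesPerPixel: 4,
                                            hasAlpha: true,
                                            isPlanar: false,
                                            colorSpaceName: .calibratedRGB,
                                            bytesPerRow: 0,
                                            bitsPerPixel: 0)?
            .retagging(with: .displayP3) else { return nil }
        bitmap.size = targetSize

        NSGraphicsContext.saveGraphicsState()
        NSGraphicsContext.current = NSGraphicsContext(bitmapImageRep: bitmap)
        defer { NSGraphicsContext.restoreGraphicsState() }
        // Draw the receiver scaled into the wide-gamut bitmap.
        draw(in: NSRect(origin: .zero, size: targetSize),
             from: .zero,
             operation: .copy,
             fraction: 1.0)

        let result = NSImage(size: targetSize)
        result.addRepresentation(bitmap)
        return result
    }
}
#endif
```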
Posted Last updated
.
Post not yet marked as solved
1 Replies
374 Views
The CGImageSourceCreateThumbnailAtIndex function isn't generating a CGImage for the majority of images on iOS 17.4. It works if I pass in the option kCGImageSourceThumbnailMaxPixelSize, but doesn't work if that key is missing. This function works both with and without kCGImageSourceThumbnailMaxPixelSize on stable OS versions. Is this a new change in the iOS 17.4 beta versions?
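For reference, a minimal sketch of the workaround described above — always supplying kCGImageSourceThumbnailMaxPixelSize (the helper names `thumbnailOptions` and `makeThumbnail` are mine, not from the post):

```swift
#if canImport(ImageIO)
import Foundation
import ImageIO

/// Thumbnail options that include kCGImageSourceThumbnailMaxPixelSize,
/// which the post reports is required on iOS 17.4 betas.
func thumbnailOptions(maxPixelSize: Int) -> [CFString: Any] {
    [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize,
    ]
}

/// Decode a thumbnail for the first image in the file at `url`.
func makeThumbnail(at url: URL, maxPixelSize: Int = 512) -> CGImage? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    return CGImageSourceCreateThumbnailAtIndex(source, 0,
                                               thumbnailOptions(maxPixelSize: maxPixelSize) as CFDictionary)
}
#endif
```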
Posted Last updated
.
Post not yet marked as solved
0 Replies
265 Views
Hello everyone, I found a few interesting places in the BadgeSymbol.swift file in this tutorial which looked like they could be improved:

Under GeometryReader, path.addLines just uses all of the points defined to create the shape. The code is written as an instruction to start and end at the same point, which is redundant, since a shape can be drawn starting at any point and the start and end point will always be the same.

path.move is not needed, since path.addLines always defines a new shape based on the points in the new array. It doesn't reference back to a previous path.addLines block.

The screen references only work in portrait mode. This is because middle is half of the smaller of the screen width (left to right) and height (top to bottom). So in landscape mode, middle references the height to compute the left-right position, which makes no sense. A cleaner way to do this is to reference the shape to the top center of the screen, with middle always halfway between the left and right edges of the screen and spacing as the distance from the top of the screen. And instead of using spacing to define the bottom left/right corners, the left/right positions of these points should be referenced to the middle of the shape.

The shape will always fit the screen as long as its size is constrained by let width = min(geometry.size.width, geometry.size.height).

Anyway, here is my code, with comments for where changes have been made:

    import SwiftUI

    struct BadgeSymbol: View {
        static let symbolColor = Color(red: 79.0 / 255, green: 79.0 / 255, blue: 191.0 / 255)

        var body: some View {
            GeometryReader { geometry in
                Path { path in
                    let width = min(geometry.size.width, geometry.size.height)
                    let height = width * 0.75
                    let spacing = width * 0.030
                    //let middle = width * 0.5 //this reference is wrong in landscape mode where width = geometry.size.height
                    let middle = geometry.size.width * 0.5 //this correctly references halfway between the left and right edges of the screen
                    let topWidth = width * 0.226
                    let topHeight = height * 0.488

                    path.addLines([
                        CGPoint(x: middle, y: spacing),
                        CGPoint(x: middle - topWidth, y: topHeight - spacing),
                        CGPoint(x: middle, y: topHeight / 2 + spacing),
                        CGPoint(x: middle + topWidth, y: topHeight - spacing) //remove comma for last one
                        /* CGPoint(x: middle, y: spacing) this is not needed...Core Graphics closes the shape */
                    ])

                    /* path.move(to: CGPoint(x: middle, y: topHeight / 2 + spacing * 3)) this is not needed...Core Graphics starts a new shape at the new path.addLines statement */

                    path.addLines([
                        CGPoint(x: middle - topWidth, y: topHeight + spacing),
                        /* CGPoint(x: spacing, y: height - spacing),
                           CGPoint(x: width - spacing, y: height - spacing), reference the middle of the figure, not the edge of the screen! */
                        CGPoint(x: middle - topWidth * 2.0, y: height - spacing),
                        CGPoint(x: middle + topWidth * 2.0, y: height - spacing), //now referenced to middle
                        CGPoint(x: middle + topWidth, y: topHeight + spacing),
                        CGPoint(x: middle, y: topHeight / 2 + spacing * 3)
                    ])
                }
                .fill(Self.symbolColor)
            }
        }
    }

    #Preview {
        BadgeSymbol()
    }

I'm just learning, so if I got any of this wrong please let me know!
Posted
by odonov8.
Last updated
.
Post not yet marked as solved
0 Replies
334 Views
The error is: "Error Domain=MTLLibraryErrorDomain Code=3 ":1:10: fatal error: cannot open file './metal_types': Operation not permitted #include "metal_types"". On my Mac mini (with an Intel chip), I run the Flutter application in the VS Code LLDB debugger and get this error; the Flutter application cannot draw its UI and shows a blank white window. My Xcode version is the latest, 15.2. The Flutter application runs normally on a Mac mini M1 in the VS Code LLDB debugger, and runs normally without the debugger on the Intel Mac mini. In the Metal framework and Core Graphics framework locations, there is no file named "metal_types". This didn't happen before: I could run normally in the VS Code LLDB debugger on both the Intel Mac mini and the M1. If anyone knows anything, please comment. Thank you!
Posted
by naminh.
Last updated
.
Post marked as solved
2 Replies
378 Views
My app depends on the user granting Accessibility access (Allow this application to control your computer). There’s no formal permissions API (that I know of) for this, it just happens implicitly when I use the API for the first time. I get an error if the user hasn’t granted permission. If the user grants permission and I'm able to successfully register my CGEventTap (a modifier key event tap), but then later revokes it, key responsiveness goes awry. I don’t get any kind of error to my callback, but I do get tapDisabledByTimeout events periodically. I believe something is causing significant delays (but not preventing) in delivering events to my tap. Upon receiving this, I'm considering attempting to register another tap as a way to test permission, and disabling the real one if I no longer have permission. Does anyone have any better ideas? For Apple: see FB13533901.
Posted
by JetForMe.
Last updated
.
Post not yet marked as solved
2 Replies
2.1k Views
I really love Quartz Composer from Apple, which is quite an old app, not updated for years. It works well on my mid-2015 MacBook Pro, but not on a new M1 iMac. Does anyone know how to run this great app on my new machine? Thank you!
Posted Last updated
.
Post not yet marked as solved
2 Replies
353 Views
Hey guys, I am writing a little Swift application (MouseServer), which runs on my Mac that is connected to the TV. To control the mouse I use my own smart-home app, in which I have implemented a touchpad like the MacBook's. If the user triggers a tap/drag event, a UDP message is sent to MouseServer, which listens for UDP messages and moves the mouse when the mouse-move command is received. Everything works very well. But with the following mouse-move code, I can't open the Apple menu bar at the top of the screen when I move the mouse to the top (for example, when the browser is in fullscreen mode). I hope you know what I mean: in a fullscreen window, the top menu bar with the Apple icon disappears, and when you move the cursor to the top, the menu bar appears again. This doesn't work with my code. I've tried many different approaches, but can't get it to work. Do you have any idea?

    func moveMouse(x: Int, y: Int) {
        // show cursor
        NSCursor.setHiddenUntilMouseMoves(false)
        NSCursor.unhide()

        // generate the CGPoint object for the click event
        lastPoint = CGPoint(x: x, y: y)
        print("X: \(x), Y: \(y)")

        // --- Variant 1 ---
        // move the cursor to the desired position
        CGDisplayMoveCursorToPoint(CGMainDisplayID(), lastPoint)
        CGDisplayShowCursor(CGMainDisplayID())

        // --- Variant 2 ---
        //if let eventSource = CGEventSource(stateID: .hidSystemState) {
        //    let mouseEvent = CGEvent(mouseEventSource: eventSource, mouseType: .mouseMoved, mouseCursorPosition: lastPoint, mouseButton: .left)
        //    mouseEvent?.post(tap: .cghidEventTap)
        //}

        // --- Variant 3 ---
        //moveMouseWithAppleScript(x: x, y: y)
    }

    func moveMouseWithAppleScript(x: Int, y: Int) {
        let script = """
        tell application "System Events"
            set mousePos to {\(x), \(y)}
            do shell script "caffeinate -u -t 0.1"
            do shell script "osascript -e \\"tell application \\\\\\"System Events\\\\\\" set position of mousePos to {item 1 of mousePos, item 2 of mousePos} end tell\\""
        end tell
        """
        let appleScript = NSAppleScript(source: script)
        var error: NSDictionary?
        appleScript?.executeAndReturnError(&error)
        if let error = error {
            print("Error executing AppleScript: \(error)")
        }
    }

Best regards, Robin11
Posted
by Robin11.
Last updated
.
Post not yet marked as solved
0 Replies
327 Views
When zooming in, CATiledLayer works very well: it shows the previous layer while rendering the next one, so I don't notice the rendering at all. When zooming out, it's much worse: it leaves the area blank while rendering the smaller-scale layer. How can I solve this? For example, when downscaling, should I move the drawing in draw(_:) to the main thread?
Posted Last updated
.
Post not yet marked as solved
4 Replies
690 Views
Yesterday, my code ran just fine under the previous Xcode version. Today, some print() statements seem to come with extra lines. I know it sounds stupid, but my code did not change in the meantime. It doesn't appear to come from anything I control, almost like some Apple code emits an extra line feed somewhere. It's just a Swift Mac App I built to make my digital art; otherwise, nothing else is incorrect, just these odd lines. It's not as simple as just making a test case with a few print("***") statements, it seems to require other code to run in between calls to print. Most of my app is using CoreGraphics. It has no UI. It's like when you see spurious Apple debugging info in the console sometimes, but it's only a blank line this time. It's not a big issue, just annoying.
Posted
by bigidle.
Last updated
.
Post not yet marked as solved
1 Replies
428 Views
I am trying to generate a PDF file with certain components drawn with spot colours. Spot colours are used for printing, and I am not clear on how one would do that, but I think I need to create either a custom colour space with a specific name, or a colour that has a specific name: our printer looks for the name Spot1 and substitutes the colour green. Can anyone shed any light on how I might be able to do this? For reference, I have attached two PDF files with two different spot colours in them. I need to be able to create similar output using CGContext and CGPDFDocument. I can already generate PDF documents using CMYK colours, but don't know how I can create the equivalent "spot" colours.

At the moment I am loading the page from these attached PDF files and scaling it to fill the page, to get a background with the spot colour. This works fine, but I also need to draw text and lines using the same spot colour, and I am not clear how I could do that with the Core Graphics APIs. My guess is I need to create a custom colour space containing a single colour and then draw with that colour. The only 'custom' option for creating a colour space seems to be the CGColorSpace(propertyListPlist:) constructor, but there does not appear to be any documentation on what needs to go in the property list, nor can I find any examples of it. Any pointers would be appreciated. Regards
Posted
by duncang.
Last updated
.