PencilKit


Capture touch input as an opaque drawing and turn it into high-quality images that can be displayed on iOS and macOS using PencilKit.

PencilKit Documentation

Posts under PencilKit tag

49 Posts
Post not yet marked as solved
0 Replies
247 Views
Hi. My goal is to convert a PKDrawing to an NSImage on a Mac server. For this I use the Vapor framework and receive a POST request with a PKDrawing's base64 string (from PKDrawing.dataRepresentation()). This task succeeds on the local device but fails on the remote Mac server (with the exact same logic and data). In the failing cases, the input base64 string is successfully converted to a PKDrawing, but converting the PKDrawing to an NSImage fails (the NSImage object is located at memory address 0x000000). Are there any hardware dependencies for converting a PKDrawing to an NSImage? Please give me some help.

let base64String = "d3Jk8AEACAASEAAAAAAAAAAAAAAAAAAAAAASECTanLurR0TesQ6DbJmhcB4aBggAEAAYABoGCAEQARgBIi4KFA39/Hw/FcXERD4dhYSEPiXNzEw/EhRjb20uYXBwbGUuaW5rLm1hcmtlchgDKuMHChDT0UH2JnhFKJXSKOjPQ4MBEgYIABABGAEaBggAEAEYACAAKp8HChD+PS4PhSlOD4NNWagXDeasEXgNipZotcNBGCwg4wMoHDIIAABcQegDAAA68AZww4BBYPKVQgAAAADNABSQwlXQOFhkkEEAiZhCaJFtPSkB0o7CVQhBMDaaQcBDlkLn+6k9OAHOjsJVX0IQuaRBUDOSQs3MzD04Ac6OwlVfQvA7r0HQS41CQmDlPTABzo7CVV9CsPq2QYC7iEKPwvU9JgHOjsJVqUGArrtBYPaFQgAAAD4mAc6OwlXCQOCNv0GQCINCkxgEPiYBzo7CVcJAQG3DQSDkf0InMQg+JwHOjsJV2kAIcMdBwAh5QrpJDD4qAc6OwlUoQWCeykEA3HFCTmIQPi4Bzo7CVX1BGD/NQeBdakLhehQ+MwHOjsJV80HYkNBBQJphQpqZGT45Ac6OwlV5QoDP0UEgHFpCLbIdPj0Bzo7CVdBCkDHTQQCeUkLByiE+QAHOjsJVGEM4cNRB4MJKQlTjJT5BAc6OwlU/Q0jS1UHARENC5/spPkMBzo7CVVtD6F/WQUB1O0J7FC4+QwHOjsJVW0MAc9hBQAMyQjMzMz5DAc6OwlVbQ6ix2UEghSpCx0s3PkMBzo7CVVtDuBPbQQAHI0JaZDs+QwHOjsJVW0NoA91BQNobQu58Pz5DAc6OwlVbQyCk30GAChVCgZVDPkMBzo7CVVtDeNLiQYCADkIUrkc+QwHOjsJVW0M4JOZBYAIHQs3MTD5DAc6OwlVbQ1hV7UFA2/lB9P1UPkMBzo7CVVtDgDf1QUD35kEbL10+QwHOjsJVW0PYGwFCwM7OQfp+aj5DAamPWVVbQ8y1BkKA0r9BtMh2PkMBJJD/VFtDkJYMQgC+tEFKDII+QwGIkMZUW0OwxxNCgDOwQXE9ij4/AeCQtFRbQ6jwG0IAebFBqvGSPjgB/ZC0VAdDyCEjQkDqvUGamZk+OAH9kLRUW0JsqypCACzOQXe+nz44Af2QtFRbQtzsL0JA49lBCtejPjgB/ZC0VFtCHHU1QoCa5UGwcqg+OAH9kLRUbUKMtjpCQAzwQUSLrD4+Af2QtFTqQnifP0KAfvhB16OwPkYB/ZC0VJ1DaBdHQoDJAULHS7c+UQH9kLRUoUQ0qU1CIL0EQqRwvT5XAf2QtFQtRXziU0JADgZCkxjEPlkB/ZC0VFdFgEtfQkCxBUJg5dA+WQH9kLRUV0XQNWZC4BoCQj0K1z7zAP2QtFRXWRdbELAIPtB46XbPkwA/ZC0VD4tMhQNAACgQBUAAGBBHQAAhEIlAACMQkDAn9OfkwU6BggAEAAYAEIQxjieV95cTuaAtuVXCPxhZA=="
let drawingData: PKDrawing = try! .init(data: Data(base64Encoded: base64String)!)
let image = drawingData.image(from: drawingData.bounds, scale: 2) // -> fails
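A minimal, Foundation-only sketch (the helper name decodeDrawingPayload is my own, not from the post): before force-unwrapping, it can help to validate that the base64 payload actually decodes, so a malformed request fails gracefully instead of crashing. The PencilKit calls are shown as comments because they require an Apple platform.

```swift
import Foundation

// Hypothetical helper (not from the post): validate the payload before
// force-unwrapping. Data(base64Encoded:) returns nil for malformed input,
// which would otherwise crash the force-unwrap in the snippet above.
func decodeDrawingPayload(_ base64String: String) -> Data? {
    guard let data = Data(base64Encoded: base64String) else { return nil }
    // On an Apple platform the next steps would be (PencilKit required):
    // let drawing = try PKDrawing(data: data)
    // let image = drawing.image(from: drawing.bounds, scale: 2)
    return data
}
```

Separating the decode step this way also makes it easier to log which stage fails on the remote server.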
Posted
by
Post not yet marked as solved
0 Replies
235 Views
I have a drawing app that I created and have sold on the App Store since 2018. It requires an Apple Pencil. My app uses the azimuth feature to orient the brush pattern that is drawn on the screen. A user just contacted me and said his azimuth feature is not working. Now it works fine for me, and I have not heard complaints from other users, so this leads me to believe it could be a problem with his Pencil. I wanted to show my code just to rule out that the problem is on my end and also get suggestions on what I should tell this user to do. Should he just contact Apple about it?

// calculate the pencil direction
let vector1 = touch.azimuthUnitVector(in: selectedCanvas)
let angle = atan2(vector1.dy, vector1.dx)
let azimu = angle - CGFloat(Double.pi) / 2.0
// adjust for wonky azimuth rotation translation
if azimu >= -4.5 && azimu <= 0 {
    rot = abs(azimu)
} else {
    rot = 6.2 - azimu
}

The azimu is then used as a point rotation in a basic Bézier curve and drawn on the screen. If it works on one iPad, it should work on all of them, right? He's using an iPad Pro and so am I. I asked if he was using an Apple Pencil and he said yes.
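A side note on the range check above: a UIKit-free sketch (the function brushRotation and the use of Double instead of CGFloat are my own, not from the post) that wraps the offset angle into [0, 2π) instead of relying on the hard-coded -4.5 and 6.2 bounds:

```swift
import Foundation

// Hypothetical helper: given the azimuth unit vector's components,
// compute the brush rotation wrapped into [0, 2π).
// atan2 returns a value in (-π, π]; after subtracting π/2 the result
// can be negative, so wrap it rather than special-casing ranges.
func brushRotation(dx: Double, dy: Double) -> Double {
    let angle = atan2(dy, dx) - Double.pi / 2  // same offset as the post
    let twoPi = 2 * Double.pi
    return (angle.truncatingRemainder(dividingBy: twoPi) + twoPi)
        .truncatingRemainder(dividingBy: twoPi)
}
```

Wrapping with truncatingRemainder keeps the result in one well-defined range for any atan2 output, which makes it easier to rule out the math and isolate a hardware difference between Pencils.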
Posted
by
Post not yet marked as solved
0 Replies
286 Views
I have a 1st gen Apple Pencil. After connecting once, it shows "Not Connected". Then when I tap to connect, it tries for a long time and fails with "out of range" or "make sure the accessory is turned on". I can't understand what the problem is. How can I repair it?
Posted
by
Post not yet marked as solved
2 Replies
458 Views
Hi all, I am developing a document viewer for a specific API. I download the relevant files to a custom directory and open them using a QLPreviewController in SwiftUI. I built this with a UIViewControllerRepresentable. Everything is working fine except saving files modified with the pencil markup in the preview. Here is the error: https://pastebin.com/TRnfduE5 This is how my controller looks:

struct PreviewController: UIViewControllerRepresentable {
    let url: URL
    @Binding var isPresented: Bool

    func makeUIViewController(context: Context) -> UINavigationController {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
        controller.navigationItem.leftBarButtonItem = UIBarButtonItem(
            barButtonSystemItem: .done, target: context.coordinator,
            action: #selector(context.coordinator.dismiss)
        )
        let navigationController = UINavigationController(rootViewController: controller)
        return navigationController
    }

    func makeCoordinator() -> Coordinator {
        return Coordinator(parent: self)
    }

    func updateUIViewController(_ uiViewController: UINavigationController, context: Context) {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
    }

    class Coordinator: QLPreviewControllerDataSource {
        let parent: PreviewController

        init(parent: PreviewController) {
            self.parent = parent
        }

        @objc func dismiss() {
            parent.isPresented = false
        }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
            return 1
        }

        func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
            return parent.url as QLPreviewItem
        }

        func previewController(_ controller: QLPreviewController, editingModeFor previewItem: QLPreviewItem) -> QLPreviewItemEditingMode {
            .createCopy
        }

        func previewController(_: QLPreviewController, didUpdateContentsOf: QLPreviewItem) {
            print("Updated.")
        }

        func previewController(_: QLPreviewController, didSaveEditedCopyOf: QLPreviewItem, at: URL) {
            print("Saved: " + at.path)
        }
    }
}

Does anyone know what the problem is here? I made another observation: as you can see, I'm currently only logging the actions, and nothing is logged after just the first edit. The error above is only thrown after the first edit, i.e. if I edit something again and tap Done or the pencil icon again. Is this expected? Thanks for any help or advice!
Posted
by
Post not yet marked as solved
0 Replies
339 Views
Hello! I am working on an app that uses a native iOS app as a base to show a 3D room in Unity. I achieve this by using https://docs.unity3d.com/Manual/UnityasaLibrary-iOS.html. I want to use native iOS because of the PencilKit support. I am now looking for the best option to share data between the two instances. I am not sure whether it is better to just let the native Swift app talk to a Realm and send the data I want to store from Unity to Swift, or whether it's possible to let both apps talk to the same database. The goal is to create a 3D object in Unity (that has some properties like coordinates) and assign a PKDrawing file to it. Thank you for all your help, Jakob
Posted
by
Post not yet marked as solved
0 Replies
224 Views
How can I remove or hide the thumbnail bar from QLPreviewController in iOS 13? Based on our requirements, we have to block the option to delete PDF pages. So I tried to hide or remove the thumbnail PDF bar, but I didn't find any solution. Basically, I need all the markup tool features (like drawing, signature, add text, and add shape) for editing, without allowing page deletion. Please guide me on how to achieve this. Thanks & Regards, Ponlingam S
Posted
by
Post not yet marked as solved
0 Replies
247 Views
In the WWDC20 session Inspect, modify, and construct PencilKit drawings, Will Thimbleby said: "Spline-based recognition can make use of maskedPathRanges to provide a sensible interpretation of masked strokes, and this is what we do for handwriting recognition in Notes." If I have a PKCanvasView and make a PKDrawing with Apple Pencil, how can I convert those PKStrokePaths into str = "Hello, World!"? Can this be done using Apple frameworks? Cheers!
Posted
by
Post not yet marked as solved
3 Replies
606 Views
I am starting to work with PencilKit and have a small sample app set up with Swift. When I draw a line that is too long, it crashes from com.apple.pencilkit.renderer with the following:

-[MTLDebugCommandBuffer lockPurgeableObjects]:2103: failed assertion `MTLResource 0x280584a10 (label: (null)), referenced in cmd buffer 0x107019400 (label: Live rendering command buffer) is in volatile or empty purgeable state at commit'

I can draw lots of small lines without issue. It doesn't seem to matter what size the canvas is, nor any other specific attribute I can determine. It's a quick sample, so there's really not much going on, and this appears to be coming from PencilKit and Metal under the hood. If it helps, I am using an iPad Pro, iOS 15.1, and a Pencil 2; drawing with a finger produces the same crash.
Posted
by
Post not yet marked as solved
2 Replies
385 Views
Hi, I have used PKCanvasView in my app. I write something in my PKCanvasView and then convert it into a PDF, adding a header for each page of the PDF using the following code:

let printable: CGRect = CGRect(x: 0, y: 50, width: 595, height: 841)
render.setValue(NSValue(cgRect: page), forKey: "paperRect")
render.setValue(NSValue(cgRect: printable), forKey: "printableRect")

// 4. Create PDF context and draw
let pdfData = NSMutableData()
UIGraphicsBeginPDFContextToData(pdfData, CGRect(x: 0, y: 0, width: 595.2, height: 841), nil)
for i in 1...render.numberOfPages {
    UIGraphicsBeginPDFPage()
    let bounds = UIGraphicsGetPDFContextBounds()
    render.drawPage(at: i - 1, in: bounds)
}
UIGraphicsEndPDFContext()

But while rendering, the PKCanvasView output is broken. I think PKCanvasView rendering is not working properly in iOS 15.0, 15.0.1, and 15.0.2. I have attached the following screenshot,
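A side note on the magic numbers: the snippet mixes 595/841 and 595.2/841. If those are meant to be A4, the exact point size can be derived instead of hard-coded (this helper is my own sketch, not from the post):

```swift
import Foundation

// A4 is 210 mm x 297 mm; a PDF point is 1/72 inch, and 1 inch = 25.4 mm.
// Deriving the size avoids mixing rounded constants like 595 and 595.2.
func a4SizeInPoints() -> (width: Double, height: Double) {
    let pointsPerMillimetre = 72.0 / 25.4
    return (width: 210.0 * pointsPerMillimetre,
            height: 297.0 * pointsPerMillimetre)
}
```

Using one consistent page rect for paperRect, printableRect, and the PDF context rules out size mismatches as the cause of a broken render.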
Posted
by
Post not yet marked as solved
2 Replies
589 Views
I would like to adjust the PKStrokePoints of the canvas drawing with the PencilKit API. The code below should generate the same strokes, but the drawing is not reproduced correctly depending on the type of pen selected.

func generate_sameDrawing(drawing: PKDrawing) -> PKDrawing {
    var newDrawingStrokes: [PKStroke] = []
    for stroke in drawing.strokes {
        var newPoints: [PKStrokePoint] = []
        stroke.path.forEach { point in
            let newPoint = PKStrokePoint(location: point.location,
                                         timeOffset: point.timeOffset,
                                         size: point.size,
                                         opacity: point.opacity,
                                         force: point.force,
                                         azimuth: point.azimuth,
                                         altitude: point.altitude)
            newPoints.append(newPoint)
        }
        let newPath = PKStrokePath(controlPoints: newPoints, creationDate: Date())
        let newStroke = PKStroke(ink: PKInk(stroke.ink.inkType, color: stroke.ink.color), path: newPath)
        newDrawingStrokes.append(newStroke)
    }
    return PKDrawing(strokes: newDrawingStrokes)
}

Here is an image showing the result. The pen and marker generate exactly the same drawing, but with the pencil type, the thickness and opacity of the stroke change, and the stroke location also moves a little. If you find any bugs/mistakes, do let me know.
Posted
by
Post not yet marked as solved
2 Replies
518 Views
I'm currently using PencilKit in an app that is forced to run in dark mode (UIUserInterfaceStyle == Dark in Info.plist). But for one UIViewController only, I'm using light mode for note taking with PencilKit. I change the ToolPicker style by using toolPicker.overrideUserInterfaceStyle = .light, and it works great that way. Unfortunately, if I want a custom color and I tap on the ColorPicker icon, it seems to still be in dark mode, because when I choose white, it draws black, and if I choose black, it draws black. This doesn't happen if I tap the black color in the predefined palette (not the color picker). Any idea how to force the color picker to respect the tool picker's userInterfaceStyle?
Posted
by
Post not yet marked as solved
0 Replies
419 Views
I am working on an app that allows users to take notes, and I want to support inline editing, meaning users can use PencilKit features and edit text within a single note, just like the Notes app. Is there a good way to achieve this using SwiftUI?
Posted
by
Post marked as solved
3 Replies
896 Views
There is an error when launching an app which uses PencilKit. When I try it with the iOS 15 simulator or an Xcode editor preview, it crashes. Here is the error message:

Library not loaded: /usr/lib/swift/libswiftPencilKit.dylib
Reason: tried: '/Users/.../Products/Debug-iphonesimulator/libswiftPencilKit.dylib' (no such file)

There is no problem with the iOS 14.5 simulator.
Posted
by
Post not yet marked as solved
0 Replies
384 Views
I have done the same thing in SwiftUI using UIViewRepresentable, but the toolPicker doesn't show, so I checked the isFirstResponder property and found that it was still false after I called canvas.becomeFirstResponder(). Check this out:

struct NoteCanvasView: UIViewRepresentable {
    func makeUIView(context: Context) -> PKCanvasView {
        let canvas = PKCanvasView()
        canvas.drawingPolicy = .anyInput
        canvas.delegate = context.coordinator.self

        let toolPicker = PKToolPicker()
        toolPicker.setVisible(true, forFirstResponder: canvas)
        toolPicker.addObserver(canvas)
        print(canvas.canBecomeFirstResponder)
        canvas.becomeFirstResponder()
        print(canvas.isFirstResponder)
        return canvas
    }

    func updateUIView(_ canvas: PKCanvasView, context: Context) {
        canvas.becomeFirstResponder()
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject {
        var parent: NoteCanvasView
        init(_ parent: NoteCanvasView) {
            self.parent = parent
        }
    }
}

I found that canvas.canBecomeFirstResponder returns true but canvas.isFirstResponder always returns false. Is this a bug in the current version of SwiftUI?
Posted
by
Post not yet marked as solved
0 Replies
406 Views
Hi, I was watching https://developer.apple.com/videos/play/wwdc2020/10148/ and I read the demo project code, but I don't understand how Apple knows that when you write "W", the text field should get a "W" from the drawing area. I understand that we separate every character from the drawing data, but how is it matched with the text field's text? This part is confusing me.
Posted
by
Post not yet marked as solved
0 Replies
484 Views
Hi! Hope you are all doing well. I already asked this question but no one gave me a solution. I watched the video Inspecting, Modifying, and Constructing PencilKit Drawings. It cleared up all my concepts, but I have one issue: I want to draw on numbers like the alphabets, but there is no numbers file. The code includes only uppercase and lowercase drawing files, and these files don't open. Firstly, could you kindly tell me how to open these files, and then how to draw on numbers like the alphabets? Your solutions are appreciated. I am waiting for your solution. Thanks.
Posted
by
Post not yet marked as solved
0 Replies
534 Views
I'm implementing a SwiftUI image view overlaid by a canvas view with PencilKit. Because PencilKit is UIKit-based, I created another class for the gesture recognizers. Thankfully it works, so I can zoom or pan on the canvas view and the underlying image view responds. But unlike using Gesture and GestureState in SwiftUI, it causes a memory issue: when I just zoom the image, it takes almost 1 GB. Here is my code:

class GestureDelegate: NSObject, ObservableObject, UIGestureRecognizerDelegate {
    var zoomScale: CGFloat { fixedZoomScale * gestureZoomScale }
    var fixedZoomScale: CGFloat = 1
    @Published var gestureZoomScale: CGFloat = 1

    private(set) lazy var pinchGestureRecognizer: UIPinchGestureRecognizer = {
        let pinchGesture = UIPinchGestureRecognizer(target: self, action: #selector(pinchImage(_:)))
        pinchGesture.delegate = self
        return pinchGesture
    }()

Here is the canvas view in a UIViewRepresentable. There is a pan gesture recognizer as well, but there is no problem with the pan gesture:

struct BlurMaskView: UIViewRepresentable {
    private let canvas: PKCanvasView
    private let gestureDelegate: GestureDelegate

    func makeUIView(context: Context) -> PKCanvasView {
        canvas.drawingPolicy = .anyInput
        canvas.tool = tool
        canvas.backgroundColor = .clear
        canvas.addGestureRecognizer(gestureDelegate.pinchGestureRecognizer)
        canvas.addGestureRecognizer(gestureDelegate.panGestureRecognizer)
        return canvas
    }
Posted
by
Post not yet marked as solved
0 Replies
509 Views
Hi! Hope you are all doing well. I watched the video Inspecting, Modifying, and Constructing PencilKit Drawings. It cleared up all my concepts, but I have one issue: I want to draw on numbers like the alphabets, but there is no numbers file. The code includes only uppercase and lowercase drawing files, and these files don't open. Firstly, could you kindly tell me how to open these files, and then how to draw on numbers like the alphabets? Your solutions are appreciated. I am waiting for your solution. Thanks.
Posted
by
Post not yet marked as solved
0 Replies
399 Views
On iOS, I want to add undo/redo and a close button. On iPadOS, I only need to add a close button. What's your experience with adding a close button to the ToolPicker? Or at least getting the position of its window, so I can add an overlapping box (even when it's floating)?