Post not yet marked as solved
It would feel more natural if the other end of the Pencil could be used as an eraser, instead of having to choose the eraser from the toolbar (or even double-tapping the Pencil).
Post not yet marked as solved
I am starting to work with PencilKit and have a small sample app set up in Swift. When I draw a line that is too long, it crashes in com.apple.pencilkit.renderer with the following:
-[MTLDebugCommandBuffer lockPurgeableObjects]:2103: failed assertion `MTLResource 0x280584a10 (label: (null)), referenced in cmd buffer 0x107019400 (label: Live rendering command buffer) is in volatile or empty purgeable state at commit'
I can draw lots of small lines without issue. It doesn't seem to matter what size the canvas is, and I can't identify any other specific attribute that triggers it. It's a quick sample, so there's really not much going on, and this appears to be coming from PencilKit and Metal under the hood.
I am using an iPad Pro on iOS 15.1 with a 2nd-generation Apple Pencil; drawing with a finger produces the same crash, if that helps.
Post not yet marked as solved
In the WWDC20 session "Inspect, modify, and construct PencilKit drawings", Will Thimbleby said:
Spline-based recognition can make use of maskedPathRanges to provide a sensible interpretation of masked strokes, and this is what we do for handwriting recognition in Notes.
If I have a PKCanvas and make a PKDrawing with Apple Pencil, how can I convert these PKStrokePaths:
into
str = "Hello, World!"
Can this be done using Apple frameworks?
Cheers!
Post not yet marked as solved
How can I remove or hide the thumbnail bar from QLPreviewController in iOS 13?
Based on our requirements, we have to block the option to delete PDF pages. I tried to hide or remove the PDF thumbnail bar, but couldn't find a solution. Basically, I need all the markup-tool features (drawing, signature, adding text, and adding shapes) for editing, without allowing page deletion.
Please guide me on how to achieve this.
Thanks & Regards
Ponlingam S
Post not yet marked as solved
Hello!
I am working on an app that uses a native iOS app as a base to show a 3D room in Unity. I achieve this by using:
https://docs.unity3d.com/Manual/UnityasaLibrary-iOS.html
I want to use native iOS because of the PencilKit support.
I am now looking for the best option to share data between the two instances. I am not sure whether it is better to let the native Swift app talk to a Realm and just send the data I want to store from Unity to Swift, or whether it is possible to let both sides talk to the same database.
The goal is to create a 3D object in Unity (with some properties, like coordinates) and assign a PKDrawing file to it.
Thank you for all your help,
Jakob
Post not yet marked as solved
Hi everyone,
I am developing a document viewer for a specific API. I download the relevant files to a custom directory and open them using a QLPreviewController in SwiftUI. I built this with a UIViewControllerRepresentable.
Everything works fine except saving files modified with the pencil markup in the preview.
Here is the error:
https://pastebin.com/TRnfduE5
This is how my controller looks:
struct PreviewController: UIViewControllerRepresentable {
    let url: URL
    @Binding var isPresented: Bool

    func makeUIViewController(context: Context) -> UINavigationController {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
        controller.navigationItem.leftBarButtonItem = UIBarButtonItem(
            barButtonSystemItem: .done, target: context.coordinator,
            action: #selector(context.coordinator.dismiss)
        )
        let navigationController = UINavigationController(rootViewController: controller)
        return navigationController
    }

    func makeCoordinator() -> Coordinator {
        return Coordinator(parent: self)
    }

    func updateUIViewController(_ uiViewController: UINavigationController, context: Context) {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
    }

    // Coordinator must subclass NSObject to satisfy the @objc
    // QLPreviewControllerDataSource protocol and to act as a target.
    class Coordinator: NSObject, QLPreviewControllerDataSource {
        let parent: PreviewController

        init(parent: PreviewController) {
            self.parent = parent
        }

        @objc func dismiss() {
            parent.isPresented = false
        }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
            return 1
        }

        func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
            // NSURL (not URL) conforms to QLPreviewItem.
            return parent.url as NSURL
        }

        func previewController(_ controller: QLPreviewController, editingModeFor previewItem: QLPreviewItem) -> QLPreviewItemEditingMode {
            .createCopy
        }

        func previewController(_: QLPreviewController, didUpdateContentsOf: QLPreviewItem) {
            print("Updated.")
        }

        func previewController(_: QLPreviewController, didSaveEditedCopyOf: QLPreviewItem, at: URL) {
            print("Saved: " + at.path)
        }
    }
}
Does anyone know what the problem is here?
One more observation: as you can see, I'm currently only logging the outputs/actions, and nothing is logged if I make just one edit. The error above is only thrown after the first edit, i.e. when I edit something again and tap Done or the pencil icon again...
Is this expected behavior?
Thanks for any help or advice!
Post not yet marked as solved
I have a 1st-generation Apple Pencil. After connecting once, it shows "Not Connected". When I tap to connect, it tries for a long time and then fails with "out of range" or "make sure the accessory is turned on". I can't figure out what the problem is. How can I fix it?
Post not yet marked as solved
I have a drawing app that I created and have sold on the App Store since 2018. It requires an Apple Pencil. My app uses the azimuth feature to orient the brush pattern that is drawn on the screen. A user just contacted me and said his azimuth feature is not working. Now it works fine for me, and I have not heard complaints from other users, so this leads me to believe it could be a problem with his Pencil. I wanted to show my code just to rule out that the problem is on my end and also get suggestions on what I should tell this user to do. Should he just contact Apple about it?
// Calculate the pencil direction.
let vector1 = touch.azimuthUnitVector(in: selectedCanvas)
let angle = atan2(vector1.dy, vector1.dx)
let azimu = angle - CGFloat(Double.pi) / 2.0

// Adjust for the wonky azimuth-to-rotation translation
// (6.2 is roughly 2π).
if azimu >= -4.5 && azimu <= 0 {
    rot = abs(azimu)
} else {
    rot = 6.2 - azimu
}
The azimu value is then used as a point rotation in a basic Bézier curve and drawn on the screen.
If it works on one iPad, it should work on all of them, right? He’s using an iPad Pro and so am I. I asked if he was using an Apple Pencil and he said yes.
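For what it's worth, the two branches in the snippet above (with the magic numbers -4.5 and 6.2 ≈ 2π) both approximate rot = (-azimu) mod 2π, so they can be replaced by a standard normalization into [0, 2π). A Foundation-only sketch (the function names are mine, not from the original code), which might make it easier to rule out a math issue versus a hardware one:

```swift
import Foundation

/// Normalizes any angle in radians into the half-open interval [0, 2π).
func normalizedAngle(_ angle: Double) -> Double {
    let twoPi = 2.0 * Double.pi
    var a = angle.truncatingRemainder(dividingBy: twoPi)
    if a < 0 { a += twoPi }
    return a
}

/// Mirrors the original calculation: take the azimuth vector's angle,
/// shift it by -π/2, then negate and normalize instead of special-casing.
func brushRotation(dx: Double, dy: Double) -> Double {
    let azimu = atan2(dy, dx) - Double.pi / 2.0
    return normalizedAngle(-azimu)
}
```

With this form, a non-working azimuth on the user's device would show up as a constant (or zero) azimuthUnitVector rather than anything the rotation math could cause.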
Post not yet marked as solved
Hi.
My goal is to convert a PKDrawing to an NSImage on a Mac server.
For this, I use the Vapor framework and receive a POST request containing the PKDrawing's base64 string (from PKDrawing.dataRepresentation()).
However, this task succeeds on the local device but fails on the remote Mac server (with the exact same logic and data).
In the failing case, the input base64 string is successfully converted to a PKDrawing, but converting the PKDrawing to an NSImage fails (the NSImage object is located at memory address 0x000000).
Are there any hardware dependencies for converting a PKDrawing to an NSImage?
Any help would be appreciated.
let base64String = "d3Jk8AEACAASEAAAAAAAAAAAAAAAAAAAAAASECTanLurR0TesQ6DbJmhcB4aBggAEAAYABoGCAEQARgBIi4KFA39/Hw/FcXERD4dhYSEPiXNzEw/EhRjb20uYXBwbGUuaW5rLm1hcmtlchgDKuMHChDT0UH2JnhFKJXSKOjPQ4MBEgYIABABGAEaBggAEAEYACAAKp8HChD+PS4PhSlOD4NNWagXDeasEXgNipZotcNBGCwg4wMoHDIIAABcQegDAAA68AZww4BBYPKVQgAAAADNABSQwlXQOFhkkEEAiZhCaJFtPSkB0o7CVQhBMDaaQcBDlkLn+6k9OAHOjsJVX0IQuaRBUDOSQs3MzD04Ac6OwlVfQvA7r0HQS41CQmDlPTABzo7CVV9CsPq2QYC7iEKPwvU9JgHOjsJVqUGArrtBYPaFQgAAAD4mAc6OwlXCQOCNv0GQCINCkxgEPiYBzo7CVcJAQG3DQSDkf0InMQg+JwHOjsJV2kAIcMdBwAh5QrpJDD4qAc6OwlUoQWCeykEA3HFCTmIQPi4Bzo7CVX1BGD/NQeBdakLhehQ+MwHOjsJV80HYkNBBQJphQpqZGT45Ac6OwlV5QoDP0UEgHFpCLbIdPj0Bzo7CVdBCkDHTQQCeUkLByiE+QAHOjsJVGEM4cNRB4MJKQlTjJT5BAc6OwlU/Q0jS1UHARENC5/spPkMBzo7CVVtD6F/WQUB1O0J7FC4+QwHOjsJVW0MAc9hBQAMyQjMzMz5DAc6OwlVbQ6ix2UEghSpCx0s3PkMBzo7CVVtDuBPbQQAHI0JaZDs+QwHOjsJVW0NoA91BQNobQu58Pz5DAc6OwlVbQyCk30GAChVCgZVDPkMBzo7CVVtDeNLiQYCADkIUrkc+QwHOjsJVW0M4JOZBYAIHQs3MTD5DAc6OwlVbQ1hV7UFA2/lB9P1UPkMBzo7CVVtDgDf1QUD35kEbL10+QwHmjsJVW0PYGwFCwM7OQfp+aj5DAamPWVVbQ8y1BkKA0r9BtMh2PkMBJJD/VFtDkJYMQgC+tEFKDII+QwGIkMZUW0OwxxNCgDOwQXE9ij4/AeCQtFRbQ6jwG0IAebFBqvGSPjgB/ZC0VAdDyCEjQkDqvUGamZk+OAH9kLRUW0JsqypCACzOQXe+nz44Af2QtFRbQtzsL0JA49lBCtejPjgB/ZC0VFtCHHU1QoCa5UGwcqg+OAH9kLRUbUKMtjpCQAzwQUSLrD4+Af2QtFTqQnifP0KAfvhB16OwPkYB/ZC0VJ1DaBdHQoDJAULHS7c+UQH9kLRUoUQ0qU1CIL0EQqRwvT5XAf2QtFQtRXziU0JADgZCkxjEPlkB/ZC0VFdFgEtfQkCxBUJg5dA+WQH9kLRUV0XQNWZC4BoCQj0K1z7zAP2QtFRXRWRdbELAIPtB46XbPkwA/ZC0VD4tMhQNAACgQBUAAGBBHQAAhEIlAACMQkDAn9OfkwU6BggAEAAYAEIQxjieV95cTuaAtuVXCPxhZA=="
let drawingData: PKDrawing = try! .init(data: Data(base64Encoded: base64String)!)
let image = drawingData.image(from: drawingData.bounds, scale: 2) // -> fails
Post not yet marked as solved
I've got a question about PKPoint processing. In my application I have to get both the image and the points that generated it. However, I've found that the points have already been interpolated, so there's no way to get the raw points (whose count I would expect to be roughly the display's refresh rate times the number of seconds spent drawing).
Is there a way to get the raw (unprocessed), non-interpolated points?
Post not yet marked as solved
I thought PencilKit's PKPoint force was confined to some interval. However, I can't find a maximum value for this pressure property anywhere.
Is there a max value? If not, how can I determine the approximate maximum pressure level?
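As far as I know (worth verifying against the UITouch documentation), PKStrokePoint.force mirrors UITouch.force, where 1.0 represents the force of an average touch and the device-specific cap is exposed as UITouch.maximumPossibleForce, so there is no fixed universal maximum. If you need values in [0, 1], one option is to normalize by the device's reported maximum; a Foundation-only sketch (the function name is mine):

```swift
import Foundation

/// Maps a raw force value into [0, 1] given the device's reported
/// maximum (e.g. UITouch.maximumPossibleForce), clamping out-of-range input.
func normalizedForce(_ force: Double, maximumPossibleForce: Double) -> Double {
    // Devices without force support report a maximum of 0.
    guard maximumPossibleForce > 0 else { return 0 }
    return min(max(force / maximumPossibleForce, 0), 1)
}
```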
Post not yet marked as solved
I'm developing an app for viewing PDFs (sheet music). I'm using PDFKit to display the PDF, and I need PencilKit functionality while viewing it (the same way you can mark up a book you are reading in iBooks by tapping the "Markup" button). I am able to add a canvas on top of the PDF view; however, I cannot save the canvas so that it is merged with the PDF. QuickLook offers a solution, because PencilKit appears to be built into it and works perfectly, just the way I need it to; however, I don't need the other things that come with QuickLook. I desperately need your help getting PencilKit functionality into my custom PDF-viewing app. At this point, I can't even find a solution for filling the "Markup" button (pencil.tip.crop.circle) when it's tapped. Thank you in advance!
Post not yet marked as solved
It seems PKCanvasView overrides the delegate property inherited from UIScrollView, retyping it as PKCanvasViewDelegate, and does not provide access to the underlying UIScrollViewDelegate.
To implement zooming, I added a PKCanvasView to my own UIScrollView and implemented the viewForZooming delegate method to return the PKCanvasView.
But the drawing in the PKCanvasView becomes blurred when zooming or scaling. How can I re-render the drawing after zooming so that it has a reasonable stroke width and stays sharp?
Some related code:
let canvasView = PKCanvasView()
let scrollView = UIScrollView()

override func viewDidLoad() {
    super.viewDidLoad()
    self.view.addSubview(scrollView)
    scrollView.addSubview(canvasView)
    scrollView.delegate = self
    scrollView.minimumZoomScale = 0.5
    scrollView.maximumZoomScale = 2.5
}

func viewForZooming(in scrollView: UIScrollView) -> UIView? {
    return canvasView
}
Some solutions I had tried:
1: Reset PKCanvasView contentScaleFactor
func scrollViewDidEndZooming(_ scrollView: UIScrollView, with view: UIView?, atScale scale: CGFloat) {
    if let canvas = view {
        let contentScale = scale * UIScreen.main.scale
        canvas.contentScaleFactor = contentScale
    }
}
That didn't work.
2: Re-rendering the PKStrokes:
func reRender(_ scale: CGFloat) {
    let newStrokeWidth = strokeWidth * scale
    var newDrawingStrokes: [PKStroke] = []
    for stroke in canvasView.drawing.strokes {
        canvasView.tool = PKInkingTool(.pen, color: .red, width: newStrokeWidth)
        var newPoints = [PKStrokePoint]()
        stroke.path.forEach { (point) in
            let newPoint = PKStrokePoint(location: point.location,
                                         timeOffset: point.timeOffset,
                                         size: CGSize(width: newStrokeWidth, height: newStrokeWidth),
                                         opacity: CGFloat(1), force: point.force,
                                         azimuth: point.azimuth, altitude: point.altitude)
            newPoints.append(newPoint)
        }
        let newPath = PKStrokePath(controlPoints: newPoints, creationDate: Date())
        let newStroke = PKStroke(ink: PKInk(.pen, color: UIColor.red), path: newPath)
        newDrawingStrokes.append(newStroke)
    }
    let newDrawing = PKDrawing(strokes: newDrawingStrokes)
    canvasView.drawing = newDrawing
}
That didn't work either. It was still blurred; it just changed the stroke width by multiplying by the scale.
3: I tried resetting the PKDrawing or PKStroke transform using the scroll view's scale. The drawing's position then became disordered, and it was still blurred.
Please help me.
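For what it's worth, the re-render attempt above scales only the stroke widths; scaling a drawing's geometry consistently means scaling the point locations by the same factor as well. A Foundation-only sketch of that math, with a hypothetical StrokePoint type standing in for PKStrokePoint (the blurriness itself is a rendering-resolution issue, which this alone won't solve):

```swift
import Foundation

/// Minimal stand-in for the location/size part of PKStrokePoint.
struct StrokePoint: Equatable {
    var x: Double
    var y: Double
    var size: Double   // stroke width at this point
}

/// Scales both the locations and the widths of a stroke's points,
/// so the drawing's geometry stays self-consistent under zoom.
func scaled(_ points: [StrokePoint], by scale: Double) -> [StrokePoint] {
    points.map { p in
        StrokePoint(x: p.x * scale, y: p.y * scale, size: p.size * scale)
    }
}
```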
Post not yet marked as solved
I created a document based app with SwiftUI and a PKCanvasView.
Now I want to save the drawings of the PKCanvasView inside of the file.
How can I do this?
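One approach (a sketch, not the only way): PKDrawing.dataRepresentation() gives you a Data blob, and PKDrawing(data:) reconstructs the drawing from it, so you can store that blob in your document's file format — for example inside a Codable wrapper if your FileDocument writes JSON. A Foundation-only sketch with the PencilKit part left as Data (the NoteDocument type is hypothetical):

```swift
import Foundation

/// Hypothetical document model: stores the PKDrawing's serialized bytes
/// (from drawing.dataRepresentation()) alongside other document fields.
struct NoteDocument: Codable, Equatable {
    var title: String
    var drawingData: Data   // fill from PKDrawing.dataRepresentation()
}

/// Serializes the whole document to bytes your FileDocument can write.
func encodeDocument(_ doc: NoteDocument) throws -> Data {
    try JSONEncoder().encode(doc)
}

/// Reads the document back when the file is opened; pass drawingData
/// to PKDrawing(data:) to restore the canvas.
func decodeDocument(_ data: Data) throws -> NoteDocument {
    try JSONDecoder().decode(NoteDocument.self, from: data)
}
```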
Post not yet marked as solved
The title is my question.
Post not yet marked as solved
Many users have reported that my app uses too much energy. I profiled the app and found that the problem comes from PencilKit.
It only happens on a canvas with more than 5,000 strokes: when the user draws something with the pen or any ink tool and then lifts the pencil off the screen, CPU usage immediately jumps to 100-165% for about 3 seconds, then drops back to 0%.
I time-profiled the CPU and found that 90% of the work comes from CoreHandwriting.
I suspect that each time the user draws something, PencilKit tries to recognize text from the handwriting in the background, which leads to the high CPU usage.
If this is true, I hope there will be an option to disable CoreHandwriting.
Otherwise, I hope PencilKit will be updated to consume less energy; my app has received many bad reviews because of this problem and there's nothing I can do about it.
Post not yet marked as solved
When I use PencilKit's pencil or highlighter in a debug build and write quickly, it crashes. It's fine in release builds.
the error message:
-[MTLDebugCommandBuffer lockPurgeableObjects]:2103: failed assertion `MTLResource 0x2816cd0a0 (label: (null)), referenced in cmd buffer 0x11a041400 (label: Live rendering command buffer) is in volatile or empty purgeable state at commit'
Is this a bug, or am I doing something wrong?
Post not yet marked as solved
For some reason I am unable to set the PKInk color to black.
It works with yellow, red, blue, etc., but black is simply ignored and the stroke comes out white.
This is really weird and I don't know why.
I tried setting black on a PKInkingTool, and even creating a stroke manually and setting the PKInk color to black.
Do you have any ideas?
Thanks.
Post not yet marked as solved
Every time I resize the PKCanvasView on a button tap, the canvas misbehaves: sometimes a few strokes later, sometimes on the very first stroke after the resize, all strokes except the one currently being drawn disappear as soon as I start to draw. Then, when I lift the pencil off the screen, all previous strokes become visible again except the one I just drew. On the very next stroke, the drawing updates correctly, with all strokes visible.
This is how I update the view when it's resized:
UIView.animate(withDuration: 0.3, animations: {
    self.descriptionPane.isHidden = self.canvasToggle.isSelected
    (self.descriptionPane.superview as! UIStackView).layoutIfNeeded()
    self.currentMaxZoomScale = self.canvasToggle.isSelected ? (self.canvasView.bounds.width / oldCanvasRect.width) : 1
    self.canvasView.maximumZoomScale = self.currentMaxZoomScale
    self.canvasView.setZoomScale(self.currentMaxZoomScale, animated: false)
    self.canvasView.setContentOffset(CGPoint(x: 0, y: 0), animated: false)
}, completion: { isComplete in
    self.canvasToggle.isUserInteractionEnabled = true
    self.canvasView.isUserInteractionEnabled = true
})
I more often than not see the following warning when this problem occurs.
[Stroke Generator] Missed updates at end of stroke: 2 (total points: 69)
Could anyone guide me on how I should approach this problem?
Is it even a good idea to resize the canvas?
Post not yet marked as solved
I tried to ask this as a comment in another thread:
https://developer.apple.com/forums/thread/650386?answerId=628394022#reply-to-this-question
But I can't leave a comment there for some reason (is the thread locked?). I'm asking exactly the same question, now for iOS 15: has anything changed in this area?
When selecting a stroke path for an object on a PKCanvas, the option "Snap to Shape" appears.
I understand this function is still in beta and has not been made available natively to other PencilKit apps. Is there a way, using the Stroke API, to call this function directly after the user holds the pencil for half a second when a stroke is finished, just like it behaves in native apps?