[tags:machine learning,vision]

102 results found

Post not yet marked as solved
2 Replies
You can take a look at this article; I hope it helps: Barcode detection using Vision framework https://cornerbit.tech/barcode-detection-using-vision-framework/
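For quick reference, here is a rough sketch of what Vision barcode detection typically looks like; the function name and symbology list below are illustrative and not taken from the article:

```swift
import Vision
import UIKit

// A rough sketch of barcode detection with Vision; the function name
// and the symbology list are illustrative.
func detectBarcodes(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNDetectBarcodesRequest { request, error in
        guard let observations = request.results as? [VNBarcodeObservation] else { return }
        for barcode in observations {
            // payloadStringValue is nil for symbologies without a string payload.
            print(barcode.symbology, barcode.payloadStringValue ?? "<no string payload>")
        }
    }
    // Optionally restrict the symbologies to search for (iOS 15+ spellings).
    request.symbologies = [.qr, .ean13, .code128]

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    do {
        try handler.perform([request])
    } catch {
        print("Barcode detection failed: \(error)")
    }
}
```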
Post not yet marked as solved
1 Reply
Here are all the sounds supported in the beta version of Xcode: list of sounds
Post not yet marked as solved
1 Reply
If you are asking where to get the code snippets associated with that video, they are available in the Apple Developer app. Navigate to the video in the app, and select the Code tab to see the snippets.
Post not yet marked as solved
1 Reply
We haven't released that demo as a full sample project; however, the code snippets showing exactly how the regressor was trained are included with the video in the Developer app.
Post marked as solved
2 Replies
956 Views
Hi, I have two questions regarding the ActionAndVision sample application. After setting up the live AVCaptureSession, how exactly, and in which function, is the sample buffer given to a vision request handler to perform a Vision request (e.g., the getLastThrowType request)? When and how is the captureOutput(...) func in the CameraViewController called? (line 268 ff) I appreciate any help, thank you very much.
Posted by konsti_q. Last updated.
Post not yet marked as solved
2 Replies
[Testx](https://developer.apple.com/forums/content/attachment/ac12ee3b-4fb7-419d-899a-713ce85b7a75) https://www.example.com/
Post not yet marked as solved
1 Reply
I found the code in the Apple Developer app. Thank you!
Post not yet marked as solved
1 Reply
594 Views
I am trying to follow Frank's demo on analyzing a punch card using contour detection and am stuck with contoursObservation.normalizedPath. I'm trying to draw the detected path on top of my UIImage. Any sample code or reference on converting the normalized path to the UIImage's context would help me greatly.
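For anyone hitting the same wall, here is a hedged sketch of one common way to do this conversion (the function name is illustrative). Vision returns the path in a normalized space with the origin at the bottom-left, so it has to be scaled and flipped into UIKit's coordinate system:

```swift
import Vision
import UIKit

// A sketch of rendering a normalized contour path on top of a UIImage.
// Vision's normalized space runs from 0 to 1 with the origin at the
// bottom-left, so the path is scaled up and flipped into UIKit coordinates.
func draw(_ observation: VNContoursObservation, over image: UIImage) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: image.size)
    return renderer.image { context in
        image.draw(at: .zero)

        let cgContext = context.cgContext
        // Flip vertically and scale from [0, 1] to the image's point size.
        cgContext.translateBy(x: 0, y: image.size.height)
        cgContext.scaleBy(x: image.size.width, y: -image.size.height)

        cgContext.addPath(observation.normalizedPath)
        cgContext.setStrokeColor(UIColor.red.cgColor)
        // The stroke width is transformed too, so keep it small in normalized units.
        cgContext.setLineWidth(0.003)
        cgContext.strokePath()
    }
}
```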
Post not yet marked as solved
1 Reply
You can actually find the mentioned kernel in the Code section for this video, but only in the Apple Developer App, not on the web. If you navigate to this video in the Developer App, click the Code tab, and scroll down to the 23:05 mark, you will see the kernel!
Post not yet marked as solved
1 Reply
1k Views
In the WWDC Explore Computer Vision APIs presentation, it was said (at 23:30) that the Core Image kernel for the optical flow would be made available in the slide attachments. Does anyone know where to find it?
Posted by Stoneage. Last updated.
Post marked as solved
2 Replies
I have already found the answer: the method is called periodically, and essentially automatically, whenever a new video frame is received via the delegate.
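For future readers, a minimal sketch of that delegate mechanism, with a placeholder request type (ActionAndVision's actual requests differ):

```swift
import AVFoundation
import Vision

// A minimal sketch of the mechanism: AVFoundation invokes
// captureOutput(_:didOutput:from:) on the registered delegate for every
// captured frame, and each frame's pixel buffer is passed to a
// VNImageRequestHandler. The request type here is a placeholder.
class FrameHandler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([VNDetectHumanBodyPoseRequest()]) // placeholder request
    }
}

// Registering the delegate is what makes captureOutput fire per frame
// (keep a strong reference to the delegate object for the session's lifetime):
// let videoOutput = AVCaptureVideoDataOutput()
// videoOutput.setSampleBufferDelegate(frameHandler, queue: DispatchQueue(label: "video.queue"))
```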
Post not yet marked as solved
1 Reply
827 Views
Demo Classifying Images with Vision and Core ML is crashing: The demo app Classifying Images with Vision and Core ML - https://developer.apple.com/documentation/vision/classifying_images_with_vision_and_core_ml has gone out of date, requiring conversion to the newest Swift version (with the converter working incorrectly). Furthermore, the demo app crashes, raising this exception (right at the AppDelegate declaration):

Thread 1: Your application has presented a UIAlertController () of style UIAlertControllerStyleActionSheet from Vision_ML_Example.ImageClassificationViewController (). The modalPresentationStyle of a UIAlertController with this style is UIModalPresentationPopover. You must provide location information for this popover through the alert controller's popoverPresentationController. You must provide either a sourceView and sourceRect or a barButtonItem. If this information is not known when you present the alert controller, you may provide it in the UIPopoverPresentationControllerDelegate method.
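For reference, a hedged sketch of the usual fix for this exception: on iPad, an action-sheet UIAlertController is presented as a popover and needs an anchor. The names below are illustrative, not from the sample project:

```swift
import UIKit

// A sketch of the common fix (names are illustrative, not from the sample):
// give the popover an anchor before presenting the action sheet.
func presentSourcePicker(from viewController: UIViewController, anchoredTo sender: UIView) {
    let alert = UIAlertController(title: nil, message: nil, preferredStyle: .actionSheet)
    alert.addAction(UIAlertAction(title: "Choose Photo", style: .default))
    alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))

    // Without an anchor, presenting an action sheet on iPad raises the
    // exception quoted above.
    alert.popoverPresentationController?.sourceView = sender
    alert.popoverPresentationController?.sourceRect = sender.bounds

    viewController.present(alert, animated: true)
}
```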