[tags:machine learning,vision]


Post marked as Apple Recommended
Vision can detect poses such as push-ups and sit-ups, although I encourage you to experiment using your dataset as many factors can affect the quality of detection. Check out Build an Action Classifier with Create ML - https://developer.apple.com/videos/play/wwdc2020/10043/ for some good practices on capturing training data and Detect Body and Hand Pose with Vision - https://developer.apple.com/videos/play/wwdc2020/10653/ for details on using Vision to detect poses.
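As a starting point, the pose-detection side of this can be sketched with Vision's body-pose request. This is a minimal sketch, not the sample's implementation: it assumes you already have a `cgImage` frame to analyze, and the confidence threshold of 0.3 is an arbitrary choice for illustration.

```swift
import Vision

// Minimal sketch: run a body-pose request on a single frame and read
// the recognized joints. Assumes `cgImage` is a frame you already have.
func detectBodyPose(in cgImage: CGImage) {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    do {
        try handler.perform([request])
        guard let observation = request.results?.first as? VNHumanBodyPoseObservation else {
            return
        }
        // Points come back in normalized image coordinates, each with a
        // confidence value; filter out low-confidence joints.
        let points = try observation.recognizedPoints(.all)
        for (joint, point) in points where point.confidence > 0.3 {
            print(joint.rawValue, point.location)
        }
    } catch {
        print("Body pose request failed: \(error)")
    }
}
```

The joint points you collect this way are what an action classifier (such as one built with Create ML) consumes frame by frame.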
Post not yet marked as solved
2 Replies
Generally, upright poses work well, and horizontal poses are typically fine too. Upside-down poses may be more difficult. But you still need to verify against your specific use cases.
Post not yet marked as solved
1 Reply
596 Views
I enjoyed learning about this app. I wanted to play with the preview mode and went online to look for some cornhole boards. I tried adding a few to see what the model could pick out. I attached one that I was certain would work, but I got a message about "Unable to Compile Model". Each one gave me this error. What is the issue? first image - https://www.icloud.com/iclouddrive/0QzETuxqPr-UcTHH5QQ0RBnmw#********-850x556 second image - https://www.icloud.com/iclouddrive/0r4NSUkUJ4aenU1k2n7KFOf9g#********-820x410
Posted
by
Post not yet marked as solved
1 Reply
What error did you see? What is the version of your Xcode? Also, what is the version of your macOS?
Post not yet marked as solved
0 Replies
1.1k Views
In an ARKit project, inside func session(_ session: ARSession, didUpdate frame: ARFrame), I get the observation with guard let observation = handPoseRequest.results?.first as? VNRecognizedPointsObservation else { return }, but when I call let thumb = try! observation.recognizedPoints(forGroupKey: .handLandmarkRegionKeyThumb), a segmentation fault 11 pops up. Is this a bug, or did I make a mistake?
Post not yet marked as solved
1 Reply
You can take a look at how it can be done in Detecting Hand Poses with Vision - https://developer.apple.com/documentation/vision/detecting_hand_poses_with_vision sample code. The CameraView class has an overlayLayer property that displays points detected by VNDetectHumanHandPoseRequest - https://developer.apple.com/documentation/vision/vndetecthumanhandposerequest and converted to UIKit coordinates. In the sample we only display two points corresponding to the thumb and index finger tips. However, VNRecognizedPointsObservation - https://developer.apple.com/documentation/vision/vnrecognizedpointsobservation provides points for all supported joints. In your code you can use these points to build UIBezierPath - https://developer.apple.com/documentation/uikit/uibezierpath that consists of circles and line segments representing a hand skeleton.
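The skeleton-drawing idea can be sketched roughly as below. This is a hedged sketch, not the sample's code: it assumes an `observation` obtained from VNDetectHumanHandPoseRequest and a `viewSize` for the view you draw into, and it only traces the thumb chain; the other fingers follow the same pattern with their own joint names.

```swift
import UIKit
import Vision

// Sketch: build a UIBezierPath skeleton (circles plus line segments) for
// one finger from a hand-pose observation. Assumes `observation` came
// from a VNDetectHumanHandPoseRequest.
func thumbSkeletonPath(for observation: VNHumanHandPoseObservation,
                       in viewSize: CGSize) throws -> UIBezierPath {
    let path = UIBezierPath()
    // Joint chain from the wrist out to the thumb tip.
    let joints: [VNHumanHandPoseObservation.JointName] =
        [.wrist, .thumbCMC, .thumbMP, .thumbIP, .thumbTip]
    var previous: CGPoint?
    for joint in joints {
        let point = try observation.recognizedPoint(joint)
        // Skip low-confidence joints and break the line segment there.
        guard point.confidence > 0.3 else { previous = nil; continue }
        // Vision returns normalized coordinates with a bottom-left origin;
        // flip the y-axis for UIKit's top-left origin.
        let converted = CGPoint(x: point.location.x * viewSize.width,
                                y: (1 - point.location.y) * viewSize.height)
        path.append(UIBezierPath(arcCenter: converted, radius: 4,
                                 startAngle: 0, endAngle: .pi * 2,
                                 clockwise: true))
        if let previous = previous {
            path.move(to: previous)
            path.addLine(to: converted)
        }
        previous = converted
    }
    return path
}
```

You could assign the resulting path to a CAShapeLayer's path to render the skeleton over the camera preview.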
Post marked as solved
5 Replies
972 Views
Trying to run the demo project for wwdc20-10099 in the simulator using the supplied sample.mov, the app shows the "Locating board" overlay the entire time instead of finding the board before the bean bags begin to be tossed. Is this due to the environment? Has anybody gotten the demo video to work? Environment: 2018 Mac mini, 3.0GHz, 6 core, 8GB memory Big Sur 11.0 beta 20A4299v Xcode 12.0 beta 12A6159 default iOS simulator in Xcode 12 (an iPhone SE on iOS 14)
Post not yet marked as solved
5 Replies
A little more info. The simulator shows "The VNCoreMLTransform request failed" during the detectBoard() routine in SetupViewController.swift, with an NSUnderlyingError of domain com.apple.CoreML, code 0. Any clues what to do about it? This is with the demo project unchanged other than the bundle ID and Team assigned.
Post marked as solved
5 Replies
Hi Eugene, You might be running into issues because the app doesn't run properly on Simulator. The 2nd section of the sample’s README - https://developer.apple.com/documentation/vision/building_a_feature-rich_app_for_sports_analysis says: Configure the Project and Prepare Your Environment You must run the sample app on a physical device with an A12 processor or later, running iOS 14. The sample relies on hardware features that aren't available in Simulator, such as the Apple Neural Engine. If you have a physical device that meets those requirements, try running the sample there and let us know how it goes. 🙂
Post not yet marked as solved
5 Replies
Thanks for the reply. I had (obviously) missed that in the README. :(