Hand Tracking in Reality Composer or Xcode

Hey all.

Wondering if there is a way to do hand tracking in Reality Composer, or if anyone has a tutorial on working with hand tracking in Xcode. Would love to take my projects to the next level, and this seems possible on most other platforms.

Thanks for the help.

I recommend that you have a look at this sample code (https://developer.apple.com/documentation/vision/detecting_hand_poses_with_vision) and its associated WWDC video (https://developer.apple.com/videos/play/wwdc2020/10653/), which explains how you can make use of Vision's hand pose detection.
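To give you a feel for what that sample covers, here is a minimal sketch of Vision's `VNDetectHumanHandPoseRequest`. It assumes you already have a `CGImage` from your camera pipeline (e.g. an `AVCaptureSession` or `ARFrame` buffer), and it just reads the index fingertip; the full sample shows how to wire this into a live capture session.

```swift
import Vision
import CoreGraphics

// Minimal sketch: run Vision's hand pose detection on a single frame.
// `cgImage` is assumed to come from your own camera pipeline.
func detectHandPose(in cgImage: CGImage) throws {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2 // track up to two hands

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    guard let observations = request.results else { return }
    for observation in observations {
        // Each observation exposes named joints; here we read the index fingertip.
        let indexTip = try observation.recognizedPoint(.indexTip)
        // Coordinates are normalized (0–1) with a lower-left origin;
        // the confidence check filters out weak detections.
        if indexTip.confidence > 0.3 {
            print("Index tip at \(indexTip.location)")
        }
    }
}
```

Note this is Vision running on 2D frames, not true 3D hand tracking, so for RealityKit content you would map the normalized points into your scene yourself.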

Hand tracking is not available in Reality Composer. I recommend that you file an enhancement request using Feedback Assistant to request that feature.