Hey Niantic!
I’ve been having a lot of fun working on some demos using palm detection. There is, however, one thing I’d really like to have: the orientation of a given hand.
At the moment we can estimate this very roughly by looking at the width-to-height ratio of a detected hand and cross-checking that against the depth map and human segmentation layer, but it’s still unreliable.
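To illustrate the kind of rough estimate I mean, here’s a minimal sketch of the aspect-ratio heuristic (the function name and the cosine assumption are mine, not anything from the ARDK API):

```python
import math

def estimate_hand_tilt(bbox_width: float, bbox_height: float) -> float:
    """Very rough tilt estimate (degrees) from a palm bounding box.

    Assumption (mine, for illustration): an upright open hand is taller
    than it is wide, and as the hand tilts toward the camera the box
    approaches square, so the aspect ratio is treated as the cosine of
    the tilt angle. This obviously breaks for sideways hands, which is
    part of why the approach is so unreliable.
    """
    if bbox_height <= 0:
        raise ValueError("bbox_height must be positive")
    # Clamp: a ratio above 1 would mean the hand is wider than tall.
    ratio = min(bbox_width / bbox_height, 1.0)
    return math.degrees(math.acos(ratio))

# A box half as wide as it is tall reads as roughly a 60-degree tilt.
print(round(estimate_hand_tilt(0.5, 1.0)))  # 60
```

As you can see, this only gives a single tilt angle with heavy ambiguity, which is why a proper orientation output from the SDK would be so much better.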
So, in short, my question is: are there any plans to get hand orientation, or even gestures, working in ARDK?