If you haven’t heard of the Leap Motion, you’ve got some catching up to do. It’s an $80 motion sensing peripheral, designed to sit on your desk, that boasts ultra low latency and ultra high accuracy for tracking hands and other objects. It seems like a natural fit for the Oculus Rift and virtual reality input, but so far we’ve seen much more development happening with the Razer Hydra than the Leap.

Here’s a video showing an early implementation of the Leap in Unity combined with the Oculus Rift and positional webcam tracking.

Harley Witt is the author of this prototype, and he tells me that it’s built in Unity with models from the Asset Store. The positional head tracking is done with the webcam, which is likely part of the reason that it’s so jumpy.

“I took a sample Unity project that had simple hand and finger tracking, and with Unity’s Mecanim feature it has an inverse kinematic system for the head, hands, and feet. It was fairly simple to pick a relative pointable object (Finger) and map it to a hand position for the IK,” said Witt.

“It would seem that the Leap API is not yet terribly adapted for virtual reality input… there are still big issues with the way Leap Motion implements its pointable objects. In a nutshell, there is no persistence of object order and you can’t rotate your hand beyond sideways. Unity only has tracking for the palm, not the fingers anyway. When the coding started getting more complicated I stopped (to hopefully let Unity and Leap Motion software catch up).”
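The mapping step Witt describes — taking a Leap pointable position and turning it into an IK hand target — can be sketched roughly as below. This is a minimal, hypothetical example, not his code: it assumes Leap’s millimeter, right-handed coordinates (y up, z toward the user) must be scaled to meters and z-flipped for Unity’s left-handed world space, and the `hand_anchor` offset is an invented stand-in for wherever the rig’s chest sits.

```python
def leap_to_ik_target(leap_pos_mm, hand_anchor=(0.0, 1.2, 0.3)):
    """Convert a Leap fingertip position (mm, right-handed) into a
    world-space IK target (m, left-handed), offset from a rig anchor.
    Both the scale factor and the anchor are illustrative assumptions."""
    scale = 0.001  # Leap reports millimeters; Unity works in meters
    x, y, z = leap_pos_mm
    local = (x * scale, y * scale, -z * scale)  # flip z for left-handed space
    # Offset the local hand position by the rig anchor to get a world target
    return tuple(a + o for a, o in zip(hand_anchor, local))

# A fingertip 200 mm up and 100 mm forward of the controller:
target = leap_to_ik_target((50.0, 200.0, -100.0))
```

In Unity itself this target would then be fed to Mecanim each frame via the animator’s IK callbacks (e.g. setting the hand goal position inside `OnAnimatorIK`).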