Leap Motion aims to help computers understand the human hand

X-ray of a hand. Credit: go_nils via Flickr

Meet the new Bone API

Replicating the motions of the human body, especially our hands, is one of the hardest tasks in computing. Our finger joints (and most other joints in our body) don't move like a perfect hinge or ball-and-socket. Depending on the position of the rest of the hand, the bones shift and glide past one another in subtle ways.

These shifts look natural to our eyes and brains, but they are very difficult to describe accurately to a computer. Startup Leap Motion, which makes a desktop sensor that captures hand motions and turns them into a controller, is in the process of releasing more software to bring those joints to life.

Leap Motion in action.

Users make gestures over Leap Motion's sensor and the application renders the appropriate action on the screen. In many applications, a set of replica human hands appears on screen to orient the user's motions. Last December the company released sculpting software in which a piece of digital clay spins on-screen and users shape it with their hands in the air above the sensor.
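For a sense of what developers work with, here is a minimal sketch of reading palm positions from the sensor, assuming the Python bindings shipped with the Leap Motion SDK (Leap.Controller, Leap.Listener, hand.palm_position); treat the exact names as approximate rather than definitive.

```python
# Minimal sketch: print palm positions as tracking frames arrive.
# Assumes the Leap Motion SDK's Python bindings; names are from memory.
import sys
import Leap


class HandPrinter(Leap.Listener):
    def on_frame(self, controller):
        frame = controller.frame()        # the latest tracking frame
        for hand in frame.hands:
            pos = hand.palm_position      # millimeters, relative to the sensor
            print("palm at x=%.1f y=%.1f z=%.1f" % (pos.x, pos.y, pos.z))


def main():
    listener = HandPrinter()
    controller = Leap.Controller()
    controller.add_listener(listener)     # callbacks run on the SDK's thread
    print("Move a hand over the sensor; press Enter to quit.")
    sys.stdin.readline()
    controller.remove_listener(listener)


if __name__ == "__main__":
    main()
```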

This market is crowding quickly as platforms provide tools to move gaming and applications away from input controllers and toward the motion sensors of a holodeck. Today Myo released a new design of its armband sensor to go along with the Myo API for customizing apps. And giants like Nintendo's Wii and Microsoft's Kinect have dominated the market.

Leap Motion CTO David Holz argues the limits of applications today aren't size or speed, but how we interact with them. "The depth and complexity of software is limited by the interface," he says.

One aspect of that is letting the developer community run with the sensors. Painting applications and games abound in Leap Motion's store. There is also an integration with Google Earth to surf the globe with hand motions.

One Reddit user built a program with Leap Motion's sensor to browse articles with one-finger swipes left and right, or upvote articles with two-finger motions. Though, watching his video, it's tough to feel a huge improvement over using a mouse and keyboard beyond the novelty. On the other hand, it's hard not to think of an extremely rudimentary version of the screens in Minority Report. Time will tell.
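That kind of navigation can be sketched with the SDK's built-in swipe gesture, assuming the Python bindings' Leap.Gesture.TYPE_SWIPE and SwipeGesture types (the gesture has to be enabled on the controller first); the article-navigation functions below are hypothetical stand-ins, not part of the Reddit user's code.

```python
# A hedged sketch of swipe-to-browse, assuming the SDK's built-in swipe gesture.
# Enable it first with controller.enable_gesture(Leap.Gesture.TYPE_SWIPE).
import Leap


def next_article():
    print("next article")        # hypothetical placeholder for real navigation


def previous_article():
    print("previous article")    # hypothetical placeholder for real navigation


def handle_swipes(frame):
    for gesture in frame.gestures():
        if gesture.type != Leap.Gesture.TYPE_SWIPE:
            continue
        swipe = Leap.SwipeGesture(gesture)
        if swipe.direction.x > 0:    # rightward swipe
            next_article()
        else:                        # leftward swipe
            previous_article()
```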

Leap Motion has been pushing the next generation of its tracking software, improving the tracking of motions like touching a finger to the other hand or pinching two fingers together. Those seem like simple motions. But if a game requires a player to, say, pick up a ball and throw it, the sensor needs incredible precision to know the exact moment the player's hand would have wrapped around and gripped the sphere -- not to mention when it has been released.
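To illustrate why that grip moment is tricky, here is a hedged sketch of one way an app might decide when a virtual ball has been grabbed and released, assuming the V2 tracking attribute hand.grab_strength (a float from 0.0 for an open hand to 1.0 for a closed fist); the thresholds and the hysteresis are illustrative choices, not Leap Motion's own logic.

```python
# Illustrative thresholds with hysteresis so the grip doesn't flicker on and off.
GRAB_ON = 0.8    # treat the hand as closed once grab_strength rises above this
GRAB_OFF = 0.3   # treat it as open again once grab_strength drops below this


def update_holding(frame, holding):
    """Return True while the virtual ball should be considered held."""
    if frame.hands.is_empty:
        return holding                   # no tracking data: keep the last state
    hand = frame.hands[0]
    if not holding and hand.grab_strength > GRAB_ON:
        return True                      # the fist just closed around the ball
    if holding and hand.grab_strength < GRAB_OFF:
        return False                     # the hand opened: the ball is released
    return holding
```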

"I don't think we've fully seen the potential of that yet," Holz says.

This is where Leap Motion wants its recently released Bone API to fit in. Using information about the dimensions of each finger -- down to the individual metacarpals and phalanges -- and the mechanics of the joints, the software tries to calculate the finger's position and path of motion. Hands come in all proportions and thicknesses, so the sensor has to adjust for every user. This makes it easier for developers to build apps with a better understanding of how the hand works.
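A rough sketch of the per-bone data this exposes, assuming the V2 Python bindings in which finger.bone() returns a Bone with prev_joint, next_joint, length and width; the attribute names are from memory and may differ slightly between SDK versions.

```python
import Leap

# The four bones the Bone API reports for each finger, from palm to fingertip.
BONE_TYPES = [
    (Leap.Bone.TYPE_METACARPAL, "metacarpal"),
    (Leap.Bone.TYPE_PROXIMAL, "proximal phalanx"),
    (Leap.Bone.TYPE_INTERMEDIATE, "intermediate phalanx"),
    (Leap.Bone.TYPE_DISTAL, "distal phalanx"),
]


def describe_hand(hand):
    """Print the length and joint endpoints of every bone in every finger."""
    for finger in hand.fingers:
        for bone_type, name in BONE_TYPES:
            bone = finger.bone(bone_type)      # joint positions in millimeters
            print("%-22s %5.1f mm, %s -> %s" % (
                name, bone.length, bone.prev_joint, bone.next_joint))
```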

There's also a critical issue with having a fixed sensor resting on the desk. If the palms face upward and the fingers curl in, the sensor can't see what's happening above the hand; the fingertips are hidden. In fact, for a lot of motions, one hand will block the sensor's view of the other. So Leap is continually trying to provide predictive guidance on those positions so developers can hone their applications.
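One way an application might cope with those blind spots is to check how sure the tracker is before acting, assuming the V2 attribute hand.confidence (0.0 when the model is mostly guessing, 1.0 when the hand is clearly visible); the threshold below is an arbitrary example, not a Leap Motion recommendation.

```python
def should_act_on(hand, min_confidence=0.6):
    """Ignore gestures while too much of the hand is occluded from the sensor."""
    return hand.confidence >= min_confidence
```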

Then there's how we perceive digital digits. Unnatural motions mess with our heads, and even skin movement is insanely complicated. As you ball a fist, the tendons, veins and hair shift around, and the little skin folds and shifts in skin tone all pass muster with your brain.

But that's very hard for a computer to predict. When any of these factors loses its delicate balance, our brains sound the alarm -- an intruder in the Uncanny Valley.

Holz advises against trying to make hands too lifelike for now. For instance, render hands in grey rather than fleshy colors; or, if the hands need skin-tone textures, only show joints or segments.

"We're still 100X away from where it could be," he says.
