Gesture-driven robotic arm narrows the gap for 3D virtual user interfaces

Right now the Xbox 360’s Kinect is easily one of the hottest consumer technologies on the market, but the Hoshino Biological Cybernetics Lab at Tsukuba University has managed to take things a step further with its concept at the 3D Expo. The system uses a pair of cameras to see and recognize the shape and position of a person’s hands, and in turn drives a robotic arm to mimic the movements. It doesn’t just copy the movements, either: the system can estimate precise finger shapes more than 100 times per second, so the motions are surgically precise and respond in near real time. From here, the team at Hoshino Labs is planning to refine the system’s finger-movement recognition and adapt it for future 3D display devices, which will demand gesture-based interaction for user input.

“This system uses two cameras to recognize how the person’s hands and arms move and what shape they’ve taken. Based on the results, signals are output to the robot, and the robot moves in the same way as the person. There’s hardly any time lag, and the movements are extremely accurate. Human hands vary greatly among individuals, and we’ve created a database that includes a huge number of different people’s hands. We’ve developed technology that can find the hands that most closely resemble those in front of the cameras, at high speed, with high precision. Most gesture recognition systems detect the position, orientation, and movement of hands. But our system can detect and recognize what shapes hands take as well.”
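The matching step Hoshino describes, comparing the live camera image against a large database of stored hand shapes and picking the closest one, is essentially a nearest-neighbour search. Here is a minimal Python sketch of that idea; the feature extraction, database contents, and joint-angle outputs are hypothetical stand-ins, not the lab’s actual code.

```python
# Conceptual sketch only: nearest-neighbour lookup over a database of
# pre-computed hand-shape descriptors. The features, database, and joint
# angles below are hypothetical placeholders, not Hoshino Lab's pipeline.
import numpy as np

# Pretend database: each row is a feature vector describing one stored
# hand image, paired with the finger joint angles recorded for that pose.
db_features = np.random.rand(20000, 64)      # 20,000 stored hand shapes
db_joint_angles = np.random.rand(20000, 15)  # 15 joint angles per entry

def extract_features(camera_frame: np.ndarray) -> np.ndarray:
    """Placeholder: reduce a camera frame to a 64-dim shape descriptor."""
    return np.resize(camera_frame.astype(np.float64), 64)

def estimate_hand_pose(camera_frame: np.ndarray) -> np.ndarray:
    """Find the stored hand shape closest to the current frame and
    return its joint angles, which would then drive the robot hand."""
    query = extract_features(camera_frame)
    # Brute-force Euclidean distance to every database entry; a real
    # system would need an indexed search to hit 100+ estimates/second.
    distances = np.linalg.norm(db_features - query, axis=1)
    best_match = np.argmin(distances)
    return db_joint_angles[best_match]

if __name__ == "__main__":
    frame = np.random.rand(240, 320)      # stand-in for one camera image
    print(estimate_hand_pose(frame))      # joint angles sent to the robot
```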

This is just the start of things to come; could robot fighting, or even microscopic robotic doctors, really be that far away?

[DigiInfo via Hoshino Cybernetics Lab]

