SIGGRAPH 2008 > For Attendees > New Tech Demos > MeisterGRIP

MeisterGRIP: Cylindrical Interface for Intuitional Robot Operation

Theme: SIGGRAPH Core
Hall H

MeisterGRIP independently measures the contact points and grasping-force vectors of the five fingers and the palm by vision-based tracking of markers embedded in an elastic body. Whatever the user's grasping posture or position, the device relays the user's grasp to the robot.
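The sensing principle can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a simple linear-elastic model in which each tracked marker's displacement inside the elastic body maps to a force vector, and the stiffness constant and marker coordinates below are made up for the example.

```python
import numpy as np

# Assumed linear-elastic model: force on a marker is proportional to its
# displacement from the rest position. The stiffness value is illustrative.
STIFFNESS = 2.5  # N per mm of marker displacement (assumed)

def estimate_force_vectors(rest_positions, current_positions, stiffness=STIFFNESS):
    """Estimate one force vector per tracked marker from its displacement.

    rest_positions, current_positions: (N, 3) arrays of marker coordinates (mm).
    Returns an (N, 3) array of force vectors (N).
    """
    displacements = np.asarray(current_positions) - np.asarray(rest_positions)
    return stiffness * displacements

# Example: two markers; the first is pressed 2 mm inward along -z.
rest = np.array([[0.0, 0.0, 10.0], [5.0, 0.0, 10.0]])
now = np.array([[0.0, 0.0, 8.0], [5.0, 0.0, 10.0]])
forces = estimate_force_vectors(rest, now)
```

In the actual device the markers are observed by a camera through the elastic body, so marker tracking replaces the hand-written coordinates used here.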

Enhanced Life
MeisterGRIP enables intuitive manipulation of a pair of robotic hands and arms by acquiring and transmitting the user's grasping conditions. Because the user simply grasps the device, setup is simple and the user is only minimally constrained. The system provides universal manipulation that tolerates individual differences in hand size and grasping posture.

Goal
To achieve an intuitive, dexterous, and simple robot-manipulation system that can be used by amateurs, not only skilled professionals.

Innovations
The device measures the user's grasping conditions as a distribution of force vectors produced by the grasping hand, and recognizes the five fingers and the palm as six input positions from the pattern of that distribution. This gives it significant advantages over conventional robot-hand controllers. Individual differences in users' hands and variations in grasping posture are irrelevant, since the sensing points are determined at the initial grip. Users can grasp and release the device as easily as a mouse, in interactions so intuitive that no practice time is required. The method can be applied to any multipoint manipulation in real or virtual environments, especially manipulation of a dexterous robot hand.
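The idea of fixing the sensing points at the initial grip can be sketched as follows. This is a hypothetical illustration, not the published method: it assumes the six strongest contacts at the first grasp define the sensing points, and that later force readings are simply assigned to the nearest sensing point; the threshold and helper names are invented for the example.

```python
import numpy as np

def calibrate_contacts(initial_positions, initial_forces):
    """At the initial grip, take the six strongest contacts as sensing points.

    initial_positions, initial_forces: (N, 3) arrays (marker locations, forces).
    Returns a (6, 3) array of sensing-point locations, assumed to correspond
    to the five fingers and the palm.
    """
    magnitudes = np.linalg.norm(initial_forces, axis=1)
    strongest = np.argsort(magnitudes)[-6:]  # indices of the six largest forces
    return np.asarray(initial_positions)[strongest]

def six_point_input(sensing_points, positions, forces, threshold=0.5):
    """Aggregate marker forces into six input vectors, one per sensing point.

    Each marker pressed harder than `threshold` (N, assumed) contributes its
    force vector to the nearest sensing point. Returns a (6, 3) array.
    """
    out = np.zeros((6, 3))
    magnitudes = np.linalg.norm(forces, axis=1)
    for pos, force, mag in zip(positions, forces, magnitudes):
        if mag < threshold:  # skip markers that are not being pressed
            continue
        nearest = np.argmin(np.linalg.norm(sensing_points - pos, axis=1))
        out[nearest] += force
    return out
```

Because the assignment is recalibrated at each new grip, this scheme is indifferent to hand size and grasping posture, which is the property the paragraph above describes.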

Vision
In the future, it will be possible to travel to a foreign country, interact with objects in dangerous places, and communicate with people in remote locations by manipulating a robot through MeisterGRIP, all while sitting on a sofa at home.

Contributors
Shuji Komeiji
Katsunari Sato
Kouta Minamizawa
Hideaki Nii
Naoki Kawakami
Susumu Tachi
The University of Tokyo