The video demonstrates an experiment carried out at the Nazarbayev University School of Science and Technology as part of the "Inertial Motion Capture Based Tele-Operation of a Mobile Robot Manipulator" project.
The project aims to develop a human-robot interface system that combines a mobile manipulator (KUKA youBot) with a full-body inertial motion capture system (XSENS MVN). The system enables intuitive robot tele-operation: the mobile platform follows the motion of the human operator, while the manipulator mimics the motion of the user's hand (the right hand in the video).
An effective human-robot interface is enabled through a gesture recognition technique. In the video, the operator issues various control commands such as "Manipulator On/Off", "Mobile Platform On/Off", "Manipulator Pause/Resume", "Select", "Exit" and "Standstill", each associated with a left-hand gesture. Dimensionality reduction with Principal Component Analysis (PCA) followed by Linear Discriminant Analysis (LDA) classification achieved 100% gesture recognition accuracy in the recorded experimental session.
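A minimal sketch of such a PCA-then-LDA gesture classifier, using scikit-learn. The data here is synthetic and the gesture labels and feature dimensions are illustrative assumptions; in the actual project the feature vectors would come from the left-hand pose data reported by the XSENS MVN suit.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for motion-capture feature vectors: one
# well-separated Gaussian cluster per gesture class (illustrative only).
rng = np.random.default_rng(0)
n_per_class, n_features = 30, 20
gestures = ["ManipulatorOnOff", "PlatformOnOff", "PauseResume",
            "Select", "Exit", "Standstill"]
X = np.vstack([rng.normal(loc=3.0 * i, scale=0.5,
                          size=(n_per_class, n_features))
               for i in range(len(gestures))])
y = np.repeat(gestures, n_per_class)

# PCA reduces the raw feature dimensionality; LDA then classifies
# the projected samples into one of the gesture commands.
clf = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
clf.fit(X, y)
print(clf.score(X, y))
```

On data this cleanly separated the pipeline reaches perfect training accuracy, mirroring the 100% recognition rate reported for the experimental session.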