Project Details

DEXMAN: Improving robot’s DEXterous MANipulability by learning stiffness-based human motor skills and visuo-tactile exploration

Applicants Qiang Li, Ph.D.; Professor Dr. Jianwei Zhang, since 12/2020
Subject Area Automation, Mechatronics, Control Systems, Intelligent Technical Systems, Robotics
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term from 2019 to 2023
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 410916101
 
Roboticists have made great efforts to mimic the human hand, not only in its form but also in its function. However, robustly grasping and manipulating an unknown object or tool remains an open problem. In this project, we will investigate a human-motor-skill-extraction-based approach to achieve robust dexterous grasping and in-hand manipulation on a robotic arm/hand system:

- A novel framework of augmented dynamic movement primitives (DMPs) that embeds perception information for human skill extraction and generalization to new tasks (see the sketch below)
- Reconstruction and tracking of an unknown object by exploiting interactive manipulation and multi-modal feedback
- A multi-sensor-fusion-based adaptive grasping and manipulation control framework enhanced by extracted human motor skills

To represent human motor skills during interaction with objects and tools that differ in size, shape and stiffness, we will create an augmented hierarchical primitive-based library covering human hand/arm stiffness, motion and finger gaiting. With online perception feedback, this primitive library will provide the knowledge basis for transferring generalizable skills from humans to a robotic hand-arm system through a hierarchical adaptive grasping and manipulation control method based on multi-modal sensor fusion. Skill generalization will be achieved across different object and tool sizes, shapes and stiffnesses. Object properties will be modelled online through a visuo-tactile exploration control method. Multi-modal perception feedback will also serve as input to the primitive library to generalize motion, stiffness and gaiting trajectories. We will demonstrate the proposed grasping and manipulation approach on a typical daily-life task such as grasping a knife and cutting a fruit. Three institutes with proven records in human motor skill learning (SCUT), visuo-tactile recognition and interaction (UNIBI), and visuo-tactile adaptive grasping and dexterous manipulation with multi-fingered robotic hands (DLR) will cooperate closely towards this aim.
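As background to the primitive representation named above, the following is a minimal sketch of a standard one-dimensional discrete DMP (canonical system, transformation system and a forcing term learned from one demonstration, in the usual Ijspeert-style formulation). It illustrates only the underlying primitive formalism and its goal generalization, not the project's augmented, perception-embedded or stiffness-aware primitives; the class name DMP1D, the parameter values and the example trajectory are assumptions made for this illustration.

```python
import numpy as np

class DMP1D:
    """Minimal one-dimensional discrete dynamic movement primitive (DMP)."""

    def __init__(self, n_basis=20, alpha_z=25.0, beta_z=6.25, alpha_x=3.0):
        self.alpha_z = alpha_z              # spring gain of the transformation system
        self.beta_z = beta_z                # critically damped when alpha_z = 4 * beta_z
        self.alpha_x = alpha_x              # decay rate of the canonical (phase) system
        # Basis-function centres spread over the phase variable x in (0, 1].
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))
        self.h = n_basis / self.c ** 2      # basis widths
        self.w = np.zeros(n_basis)          # forcing-term weights, learned from a demo

    def _psi(self, x):
        """Gaussian basis activations for phase value(s) x."""
        x = np.atleast_1d(x)
        return np.exp(-self.h * (x[:, None] - self.c) ** 2)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        tau = (len(y_demo) - 1) * dt
        y0, g = y_demo[0], y_demo[-1]
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(len(y_demo)) * dt / tau)   # phase over time
        # Forcing term that would exactly reproduce the demonstration.
        f_target = tau ** 2 * ydd - self.alpha_z * (self.beta_z * (g - y_demo) - tau * yd)
        xi = x * (g - y0)
        psi = self._psi(x)                                              # shape (T, n_basis)
        # Locally weighted regression: one weight per basis function.
        self.w = (psi * (xi * f_target)[:, None]).sum(0) / ((psi * xi[:, None] ** 2).sum(0) + 1e-10)
        self.y0, self.g, self.tau = y0, g, tau
        return self

    def rollout(self, g=None, tau=None, dt=0.01):
        """Integrate the DMP; pass a new goal g or duration tau to generalize."""
        g = self.g if g is None else g
        tau = self.tau if tau is None else tau
        y, z, x = self.y0, 0.0, 1.0
        traj = []
        for _ in range(int(round(tau / dt))):
            psi = self._psi(x)[0]
            f = psi @ self.w * x * (g - self.y0) / (psi.sum() + 1e-10)
            zd = (self.alpha_z * (self.beta_z * (g - y) - z) + f) / tau
            yd = z / tau
            z += zd * dt
            y += yd * dt
            x += -self.alpha_x * x / tau * dt                           # canonical system
            traj.append(y)
        return np.array(traj)


# Usage: learn a smooth 0 -> 1 reach from one demonstration, then replay it
# toward a new goal (1.5) to show the goal generalization that DMPs provide.
t = np.linspace(0.0, 1.0, 200)
demo = 0.5 * (1.0 - np.cos(np.pi * t))
dmp = DMP1D().fit(demo, dt=t[1] - t[0])
generalized = dmp.rollout(g=1.5)
```

In this standard form, changing the goal g or the time constant tau rescales the learned trajectory shape, which is the basic mechanism that allows the kind of generalization to new object sizes and task layouts that the project's augmented primitives are intended to extend with perception feedback and stiffness profiles.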
DFG Programme Research Grants
International Connection China
Cooperation Partner Professor Dr. Chenguang Yang
Former Applicant Professor Dr. Zhaopeng Chen, until 12/2020
 
 
