Data-Driven Imitation Learning of Human Motion Style for Robotic Motion Control
Published in DDCLS, 2025
This paper presents a data-driven framework, based on Generative Adversarial Imitation Learning (GAIL), for learning reusable skills and imitating human motion styles in robotic motion control. The framework addresses the challenge of enabling robots to learn and internalize motion skills from human demonstrations while respecting their physical constraints. By leveraging large-scale, unstructured motion data, robots can extract reusable skills compatible with their own physical limitations, without complex manual annotation or editing. The learned skills can then be applied to a variety of real-world tasks, improving the adaptability and versatility of robotic systems.
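The core GAIL mechanism the abstract refers to can be sketched in a few lines: a discriminator is trained to separate expert demonstration pairs from the policy's own state-action pairs, and its output is converted into an imitation reward for a reinforcement-learning policy optimizer. The sketch below is a minimal, hypothetical illustration with a linear discriminator on synthetic data, not the paper's actual implementation; one common sign convention (D → 1 on expert data, reward −log(1 − D)) is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy stand-ins: expert (state, action) features from motion demonstrations
# vs. features from the current policy's rollouts (both hypothetical).
expert = rng.normal(loc=1.0, scale=0.3, size=(256, 4))
policy = rng.normal(loc=-1.0, scale=0.3, size=(256, 4))

# Discriminator D(s, a) = sigmoid(w·x + b), trained so that D -> 1 on
# expert samples and D -> 0 on policy samples.
w = np.zeros(4)
b = 0.0
lr = 0.5
for _ in range(200):
    d_exp = sigmoid(expert @ w + b)
    d_pol = sigmoid(policy @ w + b)
    # Gradient ascent on  E[log D(expert)] + E[log(1 - D(policy))].
    grad_w = expert.T @ (1 - d_exp) / len(expert) - policy.T @ d_pol / len(policy)
    grad_b = (1 - d_exp).mean() - d_pol.mean()
    w += lr * grad_w
    b += lr * grad_b

def imitation_reward(x):
    """Reward fed to the policy optimizer (e.g. PPO): large where the
    discriminator believes the sample looks expert-like."""
    return -np.log(1.0 - sigmoid(x @ w + b) + 1e-8)

# Expert-like samples should now receive higher imitation reward.
print(imitation_reward(expert).mean() > imitation_reward(policy).mean())
```

In the full method, the discriminator update alternates with a policy-gradient step that maximizes this reward, so the policy is gradually pushed toward the demonstrated motion style while the physics simulation enforces the robot's constraints.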
Recommended citation: Coming soon
Download Paper
