Dancing-to-Music Character Animation

Takaaki Shiratori, Atsushi Nakazawa, Katsushi Ikeuchi

  • In Computer Graphics Forum, Vol. 25, No. 3 (Proc. Eurographics 2006), pp. 449-458, September 2006 [paper]
  • In Proc. IEEE International Conference on Robotics and Automation (ICRA 2006), May 2006 [paper]


Abstract

In computer graphics, considerable research has been conducted on realistic human motion synthesis. However, most of it does not consider the emotional aspects that often strongly affect human motion. This paper presents a new approach for synthesizing dance performance matched to input music, based on the emotional aspects of dance performance. Our method consists of motion analysis, music analysis, and motion synthesis based on the extracted features. In the analysis steps, motion and music feature vectors are acquired: motion vectors are derived from motion rhythm and intensity, while music vectors are derived from musical rhythm, structure, and intensity. To synthesize a dance performance, we first find candidate sets of motion segments whose rhythm features match those of the music segments, and then select the motion segment set whose intensity best matches that of the music segments. Our system also lets animators control the synthesis process by choosing, from these candidates, the motion segments that best suit each music segment. The experimental results indicate that our method creates dance performances as if the character were listening and expressively dancing to the music.
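
The synthesis step can be pictured as a two-stage nearest-neighbor search over segment features. Below is a minimal Python sketch of that idea, assuming each segment carries a rhythm feature vector and a scalar intensity; the Segment class, the Euclidean rhythm distance, and the candidate count k are illustrative assumptions (music structure is omitted here), not the authors' actual features or implementation.

    # Two-stage matching sketch: rhythm candidates first, then intensity.
    from dataclasses import dataclass
    from typing import List
    import math

    @dataclass
    class Segment:
        rhythm: List[float]   # per-segment rhythm feature vector (assumed form)
        intensity: float      # per-segment scalar intensity (assumed form)

    def rhythm_distance(a: Segment, b: Segment) -> float:
        # Euclidean distance over the overlapping part of the rhythm vectors.
        n = min(len(a.rhythm), len(b.rhythm))
        return math.sqrt(sum((a.rhythm[i] - b.rhythm[i]) ** 2 for i in range(n)))

    def synthesize(music: List[Segment], motion: List[Segment], k: int = 5) -> List[Segment]:
        # Stage 1: for each music segment, keep the k motion segments whose
        # rhythm features match best (the candidate set).
        # Stage 2: among those candidates, pick the motion segment whose
        # intensity is closest to the music segment's intensity.
        result = []
        for m in music:
            candidates = sorted(motion, key=lambda s: rhythm_distance(s, m))[:k]
            result.append(min(candidates, key=lambda s: abs(s.intensity - m.intensity)))
        return result

In the full system, the candidate stage is also where an animator can intervene, choosing among the well-matched motion segments instead of accepting the automatic minimum.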

Try the web application with your own music!

Figure: Feature matching result for the music "Tonite." Yellow and light blue lines represent the motion and music rhythm components, and blue and red lines represent the motion and music intensity components.


Acknowledgement

The motion and music data used in this project were obtained from the CMU Motion Capture Database.


