Data-driven Autocompletion for Keyframe Animation
(Best Paper Award)


MIG ’18: Motion, Interaction and Games

Xinyi Zhang     Michiel van de Panne
University of British Columbia  


Abstract

We explore the potential of learned autocompletion methods for synthesizing animated motions from input keyframes. Our model uses an autoregressive two-layer recurrent neural network that is conditioned on target keyframes. The model is trained on example motions and keyframes sampled from those motions. Given a set of desired keyframes, the trained model can then generate motion sequences that interpolate the keyframes while following the style of the examples observed in the training corpus. We demonstrate our method on a hopping lamp, using a diverse set of hops from a physics-based model as training data. The model can then synthesize new hops based on a diverse range of keyframes. We discuss the strengths and weaknesses of this type of approach in some detail.
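To make the architecture concrete, here is a minimal illustrative sketch of an autoregressive two-layer recurrent rollout conditioned on a target keyframe: at each step the previous pose and the target keyframe are fed back into the network, which predicts the next pose. This is not the paper's actual model or trained weights; the layer sizes, random initialization, and function names are all hypothetical stand-ins.

```python
import math
import random

random.seed(0)

POSE_DIM, HIDDEN = 3, 8  # hypothetical dimensions, not the paper's

def make_layer(n_in, n_out):
    # Random weights stand in for parameters learned from the motion corpus.
    return [[random.gauss(0.0, 0.1) for _ in range(n_in)] for _ in range(n_out)]

def apply(W, x):
    # Plain matrix-vector product.
    return [sum(w * v for w, v in zip(row, x)) for row in W]

# Two recurrent layers plus a linear output head.
W1 = make_layer(POSE_DIM * 2 + HIDDEN, HIDDEN)  # input: [prev pose, target keyframe, h1]
W2 = make_layer(HIDDEN * 2, HIDDEN)             # input: [h1, h2]
Wo = make_layer(HIDDEN, POSE_DIM)               # maps h2 to the next pose

def rollout(start_pose, target_key, n_steps):
    """Autoregressively generate a pose sequence toward a target keyframe."""
    h1, h2 = [0.0] * HIDDEN, [0.0] * HIDDEN
    pose, poses = list(start_pose), []
    for _ in range(n_steps):
        h1 = [math.tanh(v) for v in apply(W1, pose + list(target_key) + h1)]
        h2 = [math.tanh(v) for v in apply(W2, h1 + h2)]
        pose = apply(Wo, h2)  # predicted next pose, fed back on the next step
        poses.append(pose)
    return poses

motion = rollout([0.0] * POSE_DIM, [1.0] * POSE_DIM, n_steps=24)
```

In the paper's setting, the conditioning input would be the next target keyframe (and time-to-target), and the network would be trained so that the generated in-betweens both reach the keyframes and reflect the style of the training motions; the sketch above only shows the autoregressive, keyframe-conditioned data flow.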

Paper
PDF
Video
Bibtex
@inproceedings{2018-MIG-autoComplete,
  title     = {Data-driven Autocompletion for Keyframe Animation},
  author    = {Xinyi Zhang and Michiel van de Panne},
  booktitle = {MIG '18: Motion, Interaction and Games},
  year      = {2018}
}
Acknowledgements
We thank the anonymous reviewers for their helpful feedback. This research was funded in part by an NSERC Discovery Grant (RGPIN-2015-04843).