Max-Planck-Institut für Informatik
Computer Graphics Group
Saarbrücken, Germany
Character Animation from a Motion Capture Database
Master Thesis in Computer Science
Computer Science Department
University of Saarland
Edilson de Aguiar
Supervisors: Dipl. Inf. Christian Theobalt
Prof. Dr. Hans-Peter Seidel
Max-Planck-Institut für Informatik
Computer Graphics Group
Saarbrücken, Germany
Begin: June 1, 2003
End: November 26, 2003
Eidesstattliche Erklärung (Declaration in Lieu of an Oath)

I hereby declare in lieu of an oath that I have written this Master's thesis independently and without outside assistance. I have used no aids other than those cited, and I have marked all passages taken from other sources as such.

Saarbrücken, November 26, 2003
Edilson de Aguiar
Abstract
Character Animation from a Motion Capture Database
Edilson de Aguiar
Master Thesis in Computer Science
Computer Science Department
University of Saarland
This thesis discusses methods that use information contained in a motion capture database to assist in the creation of a realistic character animation. Starting with an animation sketch, where only a small number of keyframes for some degrees of freedom are set, the motion capture data is used to improve the initial motion quality. First, the multiresolution filtering technique is presented and it is shown how this method can be used as a building block for character animation. Then, the hierarchical fragment method is introduced, which uses signal processing techniques, the skeleton hierarchy information and a simple matching algorithm applied to data fragments to synthesize missing degrees of freedom in a character animation from a motion capture database. In a third technique, a principal component model is fitted to the motion capture database and it is demonstrated that, using the motion principal components, a character animation can be edited and enhanced after it has been created. After comparing these methods, a hybrid approach combining the individual techniques' advantages is proposed, which uses a pipeline to create the character animation in a simple and intuitive way. Finally, the methods and results are reviewed and approaches for future improvements are mentioned.
Acknowledgements
First I want to thank my supervisors: Dipl. Inf. Christian Theobalt and
Prof. Dr. Hans-Peter Seidel for their help and advice during the development
of this thesis. In addition, I thank all my friends in the IMPRS and Kerstin Meyer-Ross for her help here in Germany. To all my colleagues of the Computer Graphics group at MPI, thank you, especially Volker Blanz for his help with the PCA theory, and Thomas Annen and Grzegorz Krawczyk for helping me with LaTeX.
I also wish to thank my family, who always supported me, encouraging me
and making me never give up, despite the distance. Thank you Mom, Dad, Raquel,
Rose and Enoc.
Edilson de Aguiar
Contents
1 Introduction 1
1.1 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Goals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.3 Thesis Outline . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
2 Fundamentals of Character Animation 5
2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.2 Keyframing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.3 Physical Simulation . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.4 Motion Capture . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.5 Project Implementation Aspects . . . . . . . . . . . . . . . . . . 10
2.5.1 Motion Capture Database . . . . . . . . . . . . . . 11
2.5.2 Skeleton Model . . . . . . . . . . . . . . . . . . . . . . . 12
3 Multiresolution Filtering Method 15
3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.2 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.2.1 Signal Processing Methods . . . . . . . . . . . . . . . . . 16
3.2.2 Multiresolution Methods . . . . . . . . . . . . . . . . . . 16
3.3 Multiresolution Filtering Method . . . . . . . . . . . . . . . . . . 16
3.4 Multiresolution filtering on motion data . . . . . . . . . . . . . . 18
3.5 Application to Character Animation . . . . . . . . . . . . . . . 19
3.6 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
3.7 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
4 Fragment Based Methods 25
4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
4.2 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
4.3 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
4.4 Motion Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
4.4.1 Motion Phases . . . . . . . . . . . . . . . . . . . . . . . 31
4.4.2 Frequency Analysis . . . . . . . . . . . . . . . . . . . . . 33
4.4.3 Correlation . . . . . . . . . . . . . . . . . . . . . . . . . 33
4.5 Motion Synthesis and Texture . . . . . . . . . . . . . . . . . . . 34
4.5.1 Fragmentation . . . . . . . . . . . . . . . . . . . . . . . 34
4.5.2 Matching . . . . . . . . . . . . . . . . . . . . . . . . . . 35
4.5.3 Joining . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
4.5.4 Smoothing . . . . . . . . . . . . . . . . . . . . . . . . . 39
4.6 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
4.7 Hierarchical Fragment Method . . . . . . . . . . . . . . . . . . . 41
4.7.1 Skeleton Hierarchy and Correlation . . . . . . . . . . . . 43
4.7.2 Method . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
4.7.3 Experiments . . . . . . . . . . . . . . . . . . . . . . . . 47
4.8 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
5 Principal Component Analysis 53
5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
5.2 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
5.3 Principal Component Analysis . . . . . . . . . . . . . . . . . . . 55
5.3.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . 55
5.3.2 PCA Theory . . . . . . . . . . . . . . . . . . . . . . . . 55
5.3.3 Data Compression . . . . . . . . . . . . . . . . . . . . . 57
5.4 PCA for motion synthesis . . . . . . . . . . . . . . . . . . . . . . 58
5.4.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . 58
5.4.2 Motion Synthesis . . . . . . . . . . . . . . . . . . . . . . 58
5.5 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
5.6 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
6 Hybrid Approach 67
6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
6.2 Hybrid Approach . . . . . . . . . . . . . . . . . . . . . . . . . . 68
6.3 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
6.4 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
7 Conclusion and Future Work 77
List of Figures
2.1 Example of the motion capture session and the equipment used to capture the motions in the database. (a) shows the camera setup and (b) the subject performing the motion. Images from http://www.e-motek.com. . . . . . . . . . . . . . . . . . . . . . . 12
2.2 Example of the motion data in the database. The figure shows the z-angle values for the pelvis, hip, clavicle, forearm and knee joints, respectively. . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.3 Joints and bones forming the skeleton model used in the project. . 14
2.4 The skeleton joint hierarchy. The right side shows the lower kinematic sub-chain and the left side the upper kinematic sub-chain. 14
3.1 Generation of the Gaussian pyramid. The value of each node in the next row, g(k+1), is computed as a weighted average of a sub-array of nodes in g(k). In this example a sub-array of length five is used. Adapted from [BA83]. . . . . . . . . . . . . . . . . . . . 17
3.2 Visualization of different frequency bands of a Gaussian pyramid (shown only for the first 40 frames). The band g0 corresponds to the original signal. The low-pass bands g1 and g2 correspond to the high frequencies, g3 and g4 to the middle frequencies, and g5 and g6 to the low frequencies. . . . . . . . . . . . . . . . . . . . . 20
3.3 Visualization of different frequency bands of a Laplacian pyramid (shown only for the first 40 frames). The band-pass bands l0 and l1 correspond to the high frequencies, l2 and l3 to the middle frequencies, and l4 and l5 to the low frequencies. . . . . . . . . . 21
3.4 Using multiresolution to increase the gain in the middle frequencies 23
3.5 Using multiresolution to decrease the gain in the middle frequencies 24
4.1 The input for the general fragment based method: (a) keyframed and motion capture data are decomposed into frequency bands; (b) animators set the general method parameters. Driven and master joints are chosen to guide the method, and a particular frequency band of the master joint is chosen to guide the fragmentation step. Joints shown in black will be textured; joints in blue and red will be synthesized. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
4.2 A fragment based method consists of four steps: fragmentation (a), matching (b), joining (c) and smoothing. At the end, the original keyframed character animation is enhanced by synthesis and texturing (d). . . . . . . . . . . . . . . . . . . . . . . . . . . 30
4.3 Example of a walking animation: (a) a set of four phases during a human walking cycle; (b) the right hip z-angle values are plotted, where the respective phases can be seen. . . . . . . . . . . . . . . 32
4.4 Plot of the pelvis joint angle against the hip joint angle for all ex-
amples in the database. The shape shown in red demonstrates a
good correlation between these joints. . . . . . . . . . . . . . . . 33
4.5 Example of the fragmentation step: (a) original degree of freedom;
(b) fragments created at locations where the first derivative changes
its sign. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
4.6 Considering one driven fragment (a), in the matching step all data fragments are compared with the driven fragment (b) and are stretched or compressed appropriately (c). At the end, a number of good fragments are found (d). . . . . . . . . . . . . . . . . . . . . 36
4.7 In the joining step the good fragments found in the matching step are concatenated or blended (a). Three different criteria were tested and compared: (b) best fragment; (c) cost matrix; and (d) best animation. . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
4.8 In the smoothing step, the discontinuity magnitude (top left) is multiplied with a smoothing function (top right) and the result is added back to the original motion signal. In this way, the continuous version shown on the bottom left is generated. Adapted from [AF02]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
[...] case that an animator wants to use a number of existing motion sequences, for instance stored in a motion capture database, to generate new motions. The idea is to use the style and life-like qualities of the motions in the database to add details and a particular style to an initial keyframed animation. Then, a different approach to create a character animation is proposed: the animator starts the animation [...]

2.5.1 Motion Capture Database
The motion capture data used in this work was obtained from MOTEK, a motion capture company that provides a set of motion sequences to the research community. The company uses a VICON 8 optical motion capture system with 8 to 24 cameras for data acquisition. Using cameras placed around the capture space to track the positions of markers attached to [...]

[...] fragment based methods, showing that they can be successfully applied to character animation. After decomposing keyframed and captured data into frequency bands, motion phases are used to divide the motion capture data into small pieces, which are used to improve the original keyframed animation (a small fragmentation sketch follows later in this section). In chapter 5 a principal component model is fitted to the motion capture database and it is shown that using motion [...]

[...] platform. In order to facilitate the skeleton and animation manipulation, the free open-source character animation library CAL3D is used in the project. CAL3D is coded in C++ and uses STL containers to store the data. It provides basic data structures for skeleton-based character animation: sequencing and blending of animations, handling of bones, skeletons, materials [...] (a brief usage sketch is given at the end of this section).

[...] to achieve this goal. Liu and Popovic [LP02] presented a method for prototyping realistic character motion using a constraint detection method that automatically generates the constraints by analyzing the input motion. Tanco and Hilton [TH00] presented a system that synthesizes motion sequences from a database of motion capture examples using a statistical model created from the captured data. Pullen and [...]

[...] reviewed and approaches for future improvements are mentioned.

Chapter 2: Fundamentals of Character Animation
2.1 Introduction
In this chapter the three main methods by which character animations are created will be briefly described: keyframe interpolation, physical simulation and motion capture. Each of these methods has its advantages and disadvantages, and they are appropriate in different situations. In [...]

[...] such as a humanoid character, usually has at least 50 degrees of freedom. In keyframing, an animator must animate all these DOFs, one at a time. To construct a more realistic model, its complexity is increased and the animator must keyframe even more degrees of freedom. Constraints, like the position of legs and arms at specific times, are always a problem [...]

[...] adaptation of motion data by amplifying or attenuating important frequencies of the motion data. This technique provides high-level motion control and makes it easier to reuse and adapt existing motions of an articulated character (see the sketch directly after these excerpts). Ultimately, it can serve as a building block for high-level character motion processing. Multiresolution filtering is not usable as a stand-alone technique for character [...]

[...] is to create sufficient detail in order to generate a character animation with a realistic appearance. Achieving detail in a keyframed animation is extremely labor intensive. However, with motion capture the details are immediately present; in other words, the data contains the motion signature. The main problem with motion capture is the lack of flexibility. For instance, after collecting the data it is [...]
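The excerpt above describes multiresolution filtering as adapting motion data by amplifying or attenuating selected frequency bands of a joint-angle curve. The following is a minimal C++ sketch of that idea for a single degree of freedom; it is an illustration only, assuming a simple three-tap smoothing kernel in place of the full Gaussian/Laplacian pyramid of Chapter 3, and all function names are hypothetical.

```cpp
#include <cstddef>
#include <vector>

// Smooth a 1D joint-angle curve with a small binomial kernel (1/4, 1/2, 1/4).
// This stands in for one low-pass step of a Gaussian pyramid (no subsampling).
std::vector<double> lowpass(const std::vector<double>& s) {
    std::vector<double> out(s.size());
    for (std::size_t i = 0; i < s.size(); ++i) {
        double prev = s[i > 0 ? i - 1 : i];
        double next = s[i + 1 < s.size() ? i + 1 : i];
        out[i] = 0.25 * prev + 0.5 * s[i] + 0.25 * next;
    }
    return out;
}

// Decompose the signal into band-pass levels plus a residual low-pass band,
// scale each band by a user-chosen gain, and sum everything back up.
// gains.size() == numBands + 1; gains.back() scales the residual band.
std::vector<double> multiresolutionFilter(const std::vector<double>& signal,
                                          const std::vector<double>& gains) {
    std::size_t numBands = gains.size() - 1;
    std::vector<double> g = signal;                    // current low-pass level g(k)
    std::vector<double> result(signal.size(), 0.0);
    for (std::size_t k = 0; k < numBands; ++k) {
        std::vector<double> gNext = lowpass(g);        // g(k+1)
        for (std::size_t i = 0; i < signal.size(); ++i)
            result[i] += gains[k] * (g[i] - gNext[i]); // band-pass l(k) = g(k) - g(k+1)
        g = gNext;
    }
    for (std::size_t i = 0; i < signal.size(); ++i)
        result[i] += gains[numBands] * g[i];           // residual low-pass band
    return result;
}
```

With all gains equal to one the original curve is reproduced exactly; raising or lowering the gains of the middle bands exaggerates or damps the corresponding frequencies of the motion, in the spirit of Figures 3.4 and 3.5.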
[...] The animation is started with keyframing, a method that the animator is familiar with, to create some degrees of freedom. After that, the motion capture database is used interactively to enhance the initial keyframed animation. In the end, using the strengths of both keyframing and motion capture, the character performs realistic motion while preserving the keyframed style and incorporating the details of the motions in the database.
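The enhancement of a keyframed animation from the database is carried out by the fragment based methods mentioned in the excerpts above. As a rough illustration, the sketch below splits a degree-of-freedom curve into fragments at the points where its first derivative changes sign (the rule given in Figure 4.5) and measures a simple distance between a keyframed fragment and a database fragment. The names and the distance measure are illustrative assumptions, not the thesis implementation.

```cpp
#include <cstddef>
#include <vector>

// A fragment is a contiguous piece of one degree-of-freedom curve.
using Fragment = std::vector<double>;

// Split a joint-angle curve into fragments wherever the first derivative
// (finite difference) changes its sign, as in Figure 4.5(b).
std::vector<Fragment> fragmentCurve(const std::vector<double>& dof) {
    std::vector<Fragment> fragments;
    Fragment current;
    double prevDelta = 0.0;
    for (std::size_t i = 0; i < dof.size(); ++i) {
        if (i > 0) {
            double delta = dof[i] - dof[i - 1];
            if (delta * prevDelta < 0.0) {      // sign change: start a new fragment
                fragments.push_back(current);
                current.clear();
            }
            if (delta != 0.0) prevDelta = delta;
        }
        current.push_back(dof[i]);
    }
    if (!current.empty()) fragments.push_back(current);
    return fragments;
}

// Illustrative distance between a keyframed (driven) fragment and a database
// fragment: the database fragment is linearly resampled to the driven length
// and compared by summed squared differences. Both fragments must be non-empty.
double fragmentDistance(const Fragment& driven, const Fragment& candidate) {
    double sum = 0.0;
    for (std::size_t i = 0; i < driven.size(); ++i) {
        double t = (driven.size() > 1)
                       ? static_cast<double>(i) * (candidate.size() - 1) / (driven.size() - 1)
                       : 0.0;
        std::size_t j = static_cast<std::size_t>(t);
        double frac = t - static_cast<double>(j);
        double value = (j + 1 < candidate.size())
                           ? (1.0 - frac) * candidate[j] + frac * candidate[j + 1]
                           : candidate[j];
        double diff = driven[i] - value;
        sum += diff * diff;
    }
    return sum;
}
```

A matching step would evaluate this distance between each driven fragment and the database fragments of the corresponding joint, keep the best candidates, and join and smooth them as outlined in Figures 4.6 to 4.8.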
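Finally, the excerpts mention the free open-source library CAL3D, used in the project for skeleton handling and for sequencing and blending animations. For orientation only, a minimal loading-and-blending sketch in the style of the public CAL3D C++ API is given below; the file names are placeholders and the exact class and method signatures should be checked against the library version actually used.

```cpp
#include <cal3d/cal3d.h>

int main() {
    // The core model holds the shared skeleton and animation data.
    CalCoreModel coreModel("character");
    if (!coreModel.loadCoreSkeleton("skeleton.csf"))       // placeholder file name
        return 1;
    int walkId = coreModel.loadCoreAnimation("walk.caf");  // placeholder file name
    if (walkId == -1)
        return 1;

    // An instance of the core model that can be animated independently.
    CalModel model(&coreModel);

    // Blend the walk cycle in over 0.3 seconds at full weight,
    // then advance the animation state by roughly one frame (~33 ms).
    model.getMixer()->blendCycle(walkId, 1.0f, 0.3f);
    model.update(0.033f);

    // Fade the cycle out again.
    model.getMixer()->clearCycle(walkId, 0.3f);
    return 0;
}
```

A model instance like this would typically be updated once per frame, with the resulting bone transformations read from its skeleton to pose the character.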