Abstract
This paper presents a novel technique for interpolating between inner and outer lip facial animation parameters (FAPs) within the framework of the MPEG-4 standard, enabling realistic animation at very low bit rates. The proposed technique applies maximum a posteriori (MAP) estimation to interpolate the inner lip FAPs given the outer lip FAPs, or vice versa. The vectors of lip FAPs, together with their first- and second-order temporal derivatives over frames, and the conditional probabilities of these vectors are modeled as Gaussian distributions. As a result, the MAP estimate of the inner lip FAPs is obtained as a linear transformation of the outer lip FAPs, or vice versa. Once the parameters learned during training are stored at the receiver, the inner (or outer) lip FAPs can be interpolated from the outer (or inner) lip FAPs, which are all that need to be transmitted. Thus, without any increase in the amount of transmitted information, a more realistic talking animation that provides more visual information for speech reading can be achieved. In our experiments, the proposed technique reduces the Percentage Normalized Mean Error (PNME) by half relative to the standard technique.
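The core idea in the abstract can be sketched numerically: if the outer- and inner-lip FAP vectors are jointly Gaussian, the MAP estimate of the inner FAPs given the outer FAPs is the conditional mean, which is a linear transformation of the outer FAPs. The sketch below uses synthetic data and omits the paper's first- and second-order temporal derivative terms; the dimensions, data, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: rows are frames, columns are lip FAPs.
# The dimensions here are illustrative only (MPEG-4 defines specific
# sets of inner- and outer-lip FAPs).
n_frames, n_outer, n_inner = 500, 10, 10
outer = rng.normal(size=(n_frames, n_outer))
# Make the inner FAPs a noisy linear function of the outer FAPs so the
# joint distribution is Gaussian, matching the paper's modeling assumption.
A_true = rng.normal(size=(n_outer, n_inner))
inner = outer @ A_true + 0.1 * rng.normal(size=(n_frames, n_inner))

# Training: fit a joint Gaussian to the stacked [outer, inner] vectors.
joint = np.hstack([outer, inner])
mu = joint.mean(axis=0)
cov = np.cov(joint, rowvar=False)
mu_o, mu_i = mu[:n_outer], mu[n_outer:]
S_oo = cov[:n_outer, :n_outer]   # outer-outer covariance
S_io = cov[n_outer:, :n_outer]   # inner-outer cross-covariance

# MAP estimate under a jointly Gaussian model = conditional mean:
#   inner_hat = mu_i + S_io @ inv(S_oo) @ (outer_obs - mu_o)
# i.e. a linear transformation of the observed outer-lip FAPs.
W = S_io @ np.linalg.inv(S_oo)

def interpolate_inner(outer_obs):
    """Interpolate inner-lip FAPs from transmitted outer-lip FAPs."""
    return mu_i + (outer_obs - mu_o) @ W.T

# Receiver side: only the outer-lip FAPs need to be transmitted;
# mu_o, mu_i, and W come from training and are stored at the receiver.
inner_hat = interpolate_inner(outer)
```

The same machinery runs in the other direction (outer given inner) by swapping the partition of the mean and covariance, which is why the paper can interpolate either set from the other.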
Original language | English (US) |
---|---|
Article number | 123 |
Pages (from-to) | 1077-1085 |
Number of pages | 9 |
Journal | Proceedings of SPIE - The International Society for Optical Engineering |
Volume | 5685 |
Issue number | PART 2 |
State | Published - Jul 21 2005 |
Event | Proceedings of SPIE-IS&T Electronic Imaging: Image and Video Communications and Processing 2005, San Jose, CA, United States. Duration: Jan 18 2005 → Jan 20 2005 |
Keywords
- Animation
- FAP
- Interpolation
- MPEG-4
ASJC Scopus subject areas
- Electronic, Optical and Magnetic Materials
- Condensed Matter Physics
- Computer Science Applications
- Applied Mathematics
- Electrical and Electronic Engineering