Abstract
This paper proposes a novel probabilistic variational method with deterministic annealing for maximum a posteriori (MAP) estimation in complex stochastic systems. Because MAP estimation generally requires global optimization, it is very difficult to achieve, and most probabilistic inference algorithms instead compute either exact or approximate posterior distributions rather than the MAP estimate. Our method constrains the mean field variational distribution to be a multivariate Gaussian, and a deterministic annealing scheme is incorporated into the mean field fixed-point iterations to obtain the MAP estimate. The approach rests on the observation that, as the covariance of the variational Gaussian approaches zero, the minimizer of the Kullback-Leibler (KL) divergence between the variational Gaussian and the true posterior coincides with the maximizer of the true posterior. Although global optimality cannot be guaranteed, extensive experiments on synthetic and real data demonstrate the effectiveness and efficiency of the proposed method.
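To make the annealing idea in the abstract concrete, the following is a minimal sketch, not the authors' algorithm: it approximates a posterior p(x) ∝ exp(-E(x)) with a Gaussian q(x) = N(mu, sigma²I), updates mu to reduce KL(q‖p) (with sigma fixed, this is equivalent to reducing the expected energy under q), and anneals sigma toward zero so that mu drifts toward a MAP estimate. The toy energy function, annealing schedule, step size, and sample counts are all illustrative assumptions.

```python
# Hypothetical illustration of annealed Gaussian variational MAP estimation.
# Not the paper's mean-field fixed-point algorithm; a generic sketch of the
# "shrink the variational covariance toward zero" idea from the abstract.
import numpy as np

def energy(x):
    """Negative log posterior (up to a constant) of a toy target; global minimum near x = [2, 2]."""
    return 0.5 * np.sum((x - 2.0) ** 2) - 2.0 * np.exp(-0.5 * np.sum((x + 1.0) ** 2))

def numerical_grad(f, x, h=1e-5):
    """Central-difference gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x); d[i] = h
        g[i] = (f(x + d) - f(x - d)) / (2 * h)
    return g

def smoothed_energy_grad(mu, sigma, n_samples=256, rng=None):
    """Monte Carlo estimate of d/dmu E_q[E(x)] for q = N(mu, sigma^2 I) via reparameterization."""
    rng = np.random.default_rng() if rng is None else rng
    xs = mu + sigma * rng.standard_normal((n_samples, mu.size))
    return np.mean([numerical_grad(energy, x) for x in xs], axis=0)

def annealed_variational_map(mu0, sigma0=2.0, decay=0.9, n_outer=40, n_inner=20, step=0.1, seed=0):
    """Update the Gaussian mean while annealing sigma toward zero."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.array(mu0, dtype=float), sigma0
    for _ in range(n_outer):
        for _ in range(n_inner):
            # With sigma fixed, minimizing KL(q || p) over mu reduces to minimizing E_q[E(x)].
            mu -= step * smoothed_energy_grad(mu, sigma, rng=rng)
        sigma *= decay  # deterministic annealing of the covariance scale
    return mu           # as sigma -> 0, mu approaches a MAP estimate

if __name__ == "__main__":
    mu_map = annealed_variational_map(mu0=[-3.0, -3.0])
    print("estimated MAP point:", mu_map, "energy:", energy(mu_map))
```

Starting the mean with a large sigma smooths the energy landscape, which is what gives annealing its robustness to shallow local optima; as in the paper's caveat, this still does not guarantee a global optimum.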
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 1747-1761 |
| Number of pages | 15 |
| Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Volume | 27 |
| Issue number | 11 |
| DOIs | |
| State | Published - Nov 2005 |
Funding
This work was supported in part by US National Science Foundation Grants IIS-0347877 and IIS-0308222, Northwestern faculty startup funds for Ying Wu, and a Walter P. Murphy Fellowship for Gang Hua.
Keywords
- Deterministic annealing
- Graphical model
- Markov network
- Maximum a posteriori estimation
- Mean field variational analysis
ASJC Scopus subject areas
- Software
- Artificial Intelligence
- Applied Mathematics
- Computer Vision and Pattern Recognition
- Computational Theory and Mathematics