The Missing Information Principle in Computer Vision

J. Hornegger, H. Niemann

Abstract


Central problems in the field of computer vision are learning object models from examples, classification, and localization of objects. In this paper we motivate the use of a classical statistical approach to these problems: the missing information principle. Based on this general technique we derive the Expectation Maximization algorithm and deduce statistical methods for learning object models from invariant features using Hidden Markov Models and from non-invariant features using Gaussian mixture density functions. The derived training algorithms also cover the problem of learning 3D objects from two-dimensional views. Furthermore, it is shown how the position and orientation of a three-dimensional object can be computed. The paper concludes with experimental results.
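The abstract's central idea, treating the unobserved quantities as "missing information" and estimating parameters by Expectation Maximization, can be sketched for the simplest case it mentions: a Gaussian mixture density, where the missing data are the component labels. This is an illustrative sketch only, not the paper's actual training scheme; the two-component 1-D setting and all variable names are assumptions for illustration.

```python
# Minimal EM sketch for a two-component 1-D Gaussian mixture.
# The component label of each sample is the "missing information":
# the E-step computes its posterior, the M-step re-estimates parameters.
import numpy as np

def em_gmm(x, n_iter=50):
    """Fit a two-component 1-D Gaussian mixture to samples x by EM."""
    # Simple deterministic initialisation: one mean at each data extreme.
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each (missing) label.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from expected sufficient statistics.
        n_k = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n_k
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k
        pi = n_k / len(x)
    return mu, var, pi

# Usage: two well-separated clusters; EM recovers means near -3 and 3.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)])
mu, var, pi = em_gmm(x)
```

Each iteration is guaranteed not to decrease the data likelihood, which is the property the missing information principle exploits; the paper's Hidden Markov Model training follows the same E/M alternation with sequences of hidden states instead of single labels.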


Keywords


Expectation Maximization algorithm, Hidden Markov Models, statistical object recognition


This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.
