EN2520 Pattern Recognition And Computer Vision

Spring Semester, 2010

 

Professor David B. Cooper, Office: Barus & Holley 318, Tel: 863-2601,

E-mail: cooper@lems.brown.edu

 

First meeting: Wednesday, 9:00 AM in Barus & Holley 159. 

I would like to change the meeting times to twice a week.

 

Text: Pattern Classification, 2nd edition, by R. Duda, P. Hart, and D. Stork.

Good Reference: Pattern Recognition and Machine Learning, by C. Bishop.

 

This course is a solid, reasonably broad treatment of pattern recognition, with some application to computer vision. The pattern recognition topics in this course are fundamental to speech analysis, computer vision, machine learning, statistical signal processing, human perception and cognition, data mining, medical diagnosis by computer, etc. These are problems that involve making reliable decisions based on large, complicated data sets in the presence of considerable uncertainty and noise. Roughly 10% of the course will be devoted to applying the theory and techniques to estimating free-form 3D shapes and recognizing them based on unorganized, messy 3D data or on images taken from multiple views. The emphasis here is on how to cast a complex application problem within the pattern recognition framework.

 

Topics

 

1.      Bayesian Decision Theory:  A solid treatment of classification theory in terms of Bayesian costs, decision functions and the geometry of decision regions for continuous and discrete random variables.  Classification error probabilities and bounds; missing features; Bayesian belief networks.

2.      Maximum-Likelihood And Bayesian Parameter Estimation, and Bayesian Recognition Using A Priori Partially Unknown Distributions:  General theory; Sufficient statistics; Large sample behavior for arbitrary distributions; Principal component analysis and discriminants; EM algorithm.

3.      Nonparametric Recognition:  Parzen-window classifiers; K-Nearest-Neighbor classifiers.

4.      Support Vector Machines.

5.      Multilayer Neural Networks:  Introduction to feedforward operation and classification; Backpropagation algorithm; Behavior considerations.

6.      Decision Trees: CART (classification and regression trees).

7.      Algorithm-Independent Machine Learning:  Resampling for estimating statistics and classifier accuracy (the bootstrap); Boosting.

8.      Unsupervised Learning And Clustering:  Mixture densities and identifiability; K-Means clustering; Unsupervised Bayesian learning; Decision-directed approximation; Hierarchical clustering; Minimum spanning trees.

9.      Applications to estimation and recognition of 3D geometry from 3D range data or from multi-view images.

 

Course Requirements

A midterm and a final examination, six homework assignments, and a few MATLAB assignments.

 

Prerequisite: 

A solid one-semester undergraduate course in statistics or probability theory, or the equivalent, and some knowledge of linear algebra.

 

Useful Books on Reserve for Course EN2520

1.  Probability, Random Variables and Stochastic Processes, 4th edition, by A. Papoulis and S. U. Pillai, McGraw-Hill, 2002. ISBN 0-07-366011-6. QA273.P2 2002.

2.  Statistical Signal Processing: Detection, Estimation, and Time Series Analysis, by L. L. Scharf, Addison-Wesley, 1990. ISBN 0-201-19038-9. TK5102.5.S3528 1990.

3.  Introduction to Statistical Pattern Recognition, 2nd edition, by K. Fukunaga, Academic Press, 1990. ISBN 0-12-269851-7.

4.  The Elements of Statistical Learning: Data Mining, Inference, and Prediction, by T. Hastie, R. Tibshirani, and J. Friedman, Springer, 2001. ISBN 0-387-95284-5. Q325.75.F75 2001.