Fall 2000 276A Pattern Analysis and Machine Intelligence Boelter Hall 9436 TH 2-3:50
Instructor: Allen Klinger, Professor, Computer Science <klinger@cs.ucla.edu> 310 825-7695 direct; Office: 3531-H Boelter Hall
Secretary: Cynthia Bautista; <cynthia@cs.ucla.edu> 310 825-2660; Office: 3731-G Boelter Hall
Textbooks
Bishop, C., Neural Networks For Pattern Recognition, Oxford Univ. Press, 1995; also see Nilsson below.
Duda, R. and Hart, P., Pattern Classification and Scene Analysis, Wiley, 1973; see below.
Breiman, L., Classification and Regression Trees, Wadsworth, 1984; also see Devijver or Fukunaga below.
Coward, L. Andrew, Pattern Thinking, Praeger, 1990; also see Adams below.
Devlin, Keith, Mathematics: The Science of Patterns, W. H. Freeman and Co., 1994; also see Polya below.
Description
Pattern Analysis and Machine Intelligence (PAMI) involves three main areas: Statistical Pattern Recognition, Learning Processes, and Structural Methods. This course introduces them along with several application-oriented subtopics, both theoretical (Feature Extraction and Selection, Classification Rules) and practical (Vision, Image Processing, Understanding, Speech/Character Recognition, Motion, etc.). The course involves text and research readings. The mathematics presented in the course is self-contained; previous experience is not required, though a prior probability course would be helpful.
Requirements
Midterm examination, class presentations based on recent PAMI research (twenty- to thirty-minute talks), and a term project: written report and project-review talk. The term-project written report is due at the last class meeting.
Organization
Course lectures, supplemental readings, and student talks are fundamental; "Textbooks" act as a resource, not the final word. Topics focus on fundamental problems. Specifics on pattern recognition (PR) and statistical learning come from notes and the Duda/Hart and Nilsson books. Related mathematical issues are presented in class. More than one class presentation is required of all students. Interaction and discussion are encouraged.
Course grades are based on student presentations, in-class comments, examinations, and written submissions. The major written submission is the term-project report, due at the end of the tenth week, December 3, and accepted only until December 5. The best policy is to confer with the instructor in office hours about the report content, papers read, etc., at least twice in the ten-week term.
The general course outline on the next page is a structured framework for the course. Individual needs, class interests, and the many fields that involve patterns will govern the actual nature of the course. Its purpose is to introduce students to a challenging research realm.
The last item listed for each week is a reading assignment in Bishop or Duda/Hart (italic font), given in chapter (Roman numeral): page (Arabic numerals) format. Items in Bishop or Duda/Hart suitable for student presentation in class have pages underlined. Other items suitable for presentations may be taken from books (see below), IEEE Transactions (e.g., Pattern Analysis and Machine Intelligence - see below), and other sources. A term paper based on student-selected items, including research papers, is a course requirement. At least one paper read/presented to the class must be from a refereed journal.
Week Dates/Class Meetings Topic Assignment
1-3 9/28-10/17 Statistical Pattern Recognition Basics
1 9/28, 10/3 Overview, Decisions I: 1-26
2 10/5, 10/10 Gaussian, Discriminant Models II: 33-42; III: 77-89
3 10/12, 10/17 Risk and Loss Models, Bayes I: 26-28; II: 42-46
4-6 10/19-11/7 Statistical Pattern Recognition Advanced Topics
4 10/19, 10/24 Nonparametric Decisions II: 47-73
5 10/26, 10/31 Perceptrons III: 89-105
6 11/2, 11/7 Fisher Discriminant III: 105-112, V: 138-172
7-9 11/9-11/30 Structural Methods Conclude, Report
7 11/9, 11/14 Multi-Layer and Clustering IV: 116-133, VI: 211-225
8 11/16, 11/21 Higher-Order, Back-Prop IV: 133-148
9 11/23 Representation Notes, IX: 327-378
10 12/5, 12/7 Feature Selection VIII: 304-14, Report 12/3
11 12/15 Scheduled Final Tuesday 11:30 - 2:30 PM.
Books, Handbooks, Paper Reprint Volumes, and Monographs
Nilsson, N., The Mathematical Foundations of Learning Machines, with introduction by T. Sejnowski and H. White, San Mateo CA: Morgan Kaufmann Publishers, 1990. Previously published as Learning Machines, 1965.
Sebestyen, G., Decision-Making Processes in Pattern Recognition, NY: Macmillan, 1962.
Fukunaga, K., Introduction to Statistical Pattern Recognition, NY: Academic Press, 1972.
Zurada, J., Introduction to Artificial Neural Systems, NY: West Publishing Company, 1992.
Devijver, P. and Kittler, J., Pattern Recognition: A Statistical Approach, Englewood Cliffs, NJ: Prentice-Hall, 1982.
Grenander, U., Lectures in Pattern Theory, Vol. 1, 2, and 3, NY: Springer-Verlag, 1976.
Fu, K-S., Syntactic Methods in Pattern Recognition, NY: Academic Press, 1974.
Ahuja, N. and Schachter, B., Pattern Models, NY: Wiley, 1983.
Agrawala, A. (ed.), Machine Recognition of Patterns, NY: IEEE Press, 1977.
Young, T. and Fu, K-S. (eds.), Handbook of Pattern Recognition and Image Processing, NY: Academic Press, 1986.
Duda, Richard O., Hart, Peter E., and Stork, David G., Pattern Classification, 2nd edition, NY: John Wiley & Sons, ISBN 0471056693, $100.00, Nov 2000, to appear.
Polya, George, Mathematics and Plausible Reasoning (Induction and Analogy in Mathematics, Vol. I, Patterns of Plausible Inference, Vol. II), Princeton, New Jersey: Princeton University Press, 1954.
Adams, James, Conceptual Blockbusting, NY, W.W. Norton & Co., 1979.
Key Journals and Papers
IEEE Transactions on Pattern Analysis and Machine Intelligence
Pattern Recognition
Minsky, M. "Steps Toward Artificial Intelligence", Proc. IRE, 8-30, Jan. 1961, also in Feigenbaum, E. and Feldman, J. (eds.), Computers and Thought, NY: McGraw-Hill, 1963.
IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, 1997 (TK 7882 P3 I11)
Aug 19/8 ...Fast Pattern Matching... Uenohara, Kanade
...Bayesian Belief Networks... van Engelen
July 19/7 ...Object Representation Moghaddam, Pentland
June 19/6 ...Rejection Rule Ha
...Pattern Classifiers Hsu/Way
May 19/5 ...Decision Trees Esposito, Malerba, Semeraro
Apr 19/4 ...Fingerprint Verification Jain, Hong, Bolle
Mar 19/3 ...Nearest Neighbor Classifier Djouadi, Bouktache
Feb 19/2 ...User Sketches Del Bimbo, Pala
...Cluster-Based Templates Hochberg, Kelly, Thomas, Kerns
...Linear Discriminant ... Aladjem
Jan 19/1 ...Algebraic Invariants Mickel, Nandhakumar, Velter
....Nearest Neighbor... Hamamoto, Uchimura, Tomita
July 22/7 Papers on Object Recognition, Shape Extraction and Analysis, Shape Extraction and Representation; Short Papers on Clustering, Image Segmentation and Feature Extraction; Key Words: Invariant(s), Stereo Matching, Occlusion Detection, Genetic Algorithms
June 22/6 Papers on Texture Analysis, Face and Gesture Recognition, Computer Vision and Image Analysis, Object Recognition; Short Papers on Feature Space Reduction and Visualization, Syntactic and Structural Pattern Analysis, Document Image Analysis, Shape Extraction; Key Words: Monte Carlo, Markov Chain, Evolutionary Pursuit, Image Retrieval, Spatial Point Patterns, Video Images
We use the term probability to mean several things, but the simplest notion is the ratio of the number of favorable cases to the total number of cases. I.e., if a six-sided cube is tossed, the probability of a "1" face is 1/6.
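As an aside, the favorable-cases/total-cases ratio is easy to check by direct enumeration. The short Python sketch below (illustrative only, not part of the course materials) counts outcomes for a fair six-sided die:

```python
from fractions import Fraction

# All equally likely faces of a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]

# Favorable cases: the "1" face.
favorable = [face for face in outcomes if face == 1]

# Probability as the ratio of favorable cases to total cases.
p = Fraction(len(favorable), len(outcomes))
print(p)  # 1/6
```

Using exact fractions rather than floating point keeps the ratio definition visible in the result.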
(The following background material presents intuitively-understandable probability examples. One way to view this is as leading to numeric pattern recognition cases - see Devlin.)
Easily-understood probability examples come from games like dice and poker. For example,
... approximate odds against getting each hand. Odds are based on a five card hand as dealt to one player....Odds against a rare "royal flush" - ace, king, queen, jack, ten, all of one suit - are 650,000 to one.
One pair         1 1/3 to 1      Flush            500 to 1
Two pairs        20 to 1         Full house       690 to 1
Three of a kind  45 to 1         Four of a kind   4,000 to 1
Straight         250 to 1        Straight flush   72,000 to 1
[from Smithsonian, Aug 97, p.60.]
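The magazine's figures are rounded; the exact odds follow from the same favorable-cases/total-cases ratio applied to the C(52,5) = 2,598,960 possible five-card hands. The Python sketch below (standard combinatorial counts, not taken from the syllabus) recovers them:

```python
import math

total = math.comb(52, 5)  # 2,598,960 possible five-card hands

# Number of five-card hands of each type (standard combinatorial counts).
counts = {
    "royal flush":     4,
    "straight flush":  36,                                           # excludes royal flushes
    "four of a kind":  13 * 48,
    "full house":      13 * math.comb(4, 3) * 12 * math.comb(4, 2),
    "flush":           (math.comb(13, 5) - 10) * 4,                  # excludes straight flushes
    "straight":        10 * 4**5 - 40,                               # excludes straight flushes
    "three of a kind": 13 * math.comb(4, 3) * math.comb(12, 2) * 4 * 4,
    "two pairs":       math.comb(13, 2) * math.comb(4, 2)**2 * 44,
    "one pair":        13 * math.comb(4, 2) * math.comb(12, 3) * 4**3,
}

for hand, n in counts.items():
    odds_against = (total - n) / n  # unfavorable cases : favorable cases
    print(f"{hand:15s} {odds_against:9,.0f} to 1")
```

The exact values (e.g., 4,164 to 1 against four of a kind, 649,739 to 1 against a royal flush) agree with the rounded odds quoted above.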
Likewise, rolling dice (solid symmetrical objects such as cubes, tetrahedrons, and dodecahedrons) yields equally probable outcomes.
Exercises:
Exercises 1. and 2. from Chernyak, Yuri B., and Robert M. Rose, The Chicken from Minsk, NY: HarperCollins BasicBooks, 1995:
1. Two siblings are born naturally on the same date, in the same year, to the same mother and the same father. However, they are NOT twins - neither fraternal nor identical. Is this possible or impossible?
2. Family Fun with Mathematicians