Subject description - BE5B33RPZ

BE5B33RPZ Pattern Recognition and Machine Learning
Extent of teaching: 2+2c
Guarantors: Matas J.
Roles: P, PV
Language of teaching: EN
Teachers: Drbohlav O., Matas J.
Completion: Z, ZK
Responsible Department: 13133
Credits: 6
Semester: Z

Annotation:

The basic formulations of the statistical decision problem are presented. The necessary knowledge about the (statistical) relationship between observations and classes of objects is acquired by learning on the training set. The course covers both well-established and advanced classifier learning methods, such as the Perceptron, AdaBoost, Support Vector Machines, and neural networks.

Study targets:

To teach the student to formalize statistical decision making problems, to use machine learning techniques and to solve pattern recognition problems with the most popular classifiers (SVM, AdaBoost, neural net, nearest neighbour).
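The labs themselves use MATLAB and the STPR toolbox; purely as an illustration of the simplest classifier named above, here is a minimal nearest-neighbour sketch in plain Python on toy data (not course material):

```python
import math

def nearest_neighbour(train, query):
    """Classify `query` by the label of the closest training point
    (Euclidean distance). `train` is a list of (point, label) pairs."""
    best_label, best_dist = None, math.inf
    for point, label in train:
        dist = math.dist(point, query)
        if dist < best_dist:
            best_dist, best_label = dist, label
    return best_label

# Two toy classes in the plane
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(nearest_neighbour(train, (0.2, 0.1)))  # closest training point is class "a"
```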

Course outlines:

1. The pattern recognition problem. Overview of the Course. Basic notions.
2. The Bayesian decision-making problem, i.e. minimization of expected loss.
3. Non-Bayesian decision problems.
4. Parameter estimation. The maximum likelihood method.
5. The nearest neighbour classifier.
6. Linear classifiers. Perceptron learning.
7. The AdaBoost method.
8. Learning as a quadratic optimization problem. SVM classifiers.
9. Feed-forward neural nets. The backpropagation algorithm.
10. Decision trees.
11. Logistic regression.
12. The EM (Expectation Maximization) algorithm.
13. Sequential decision-making (Wald's sequential test).
14. Recap.
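As a small illustration of the topic of lecture 6, a minimal perceptron learning sketch in plain Python (a constant coordinate absorbs the bias; toy data, not course material):

```python
def perceptron_train(samples, epochs=100):
    """Perceptron learning: add each misclassified sample, scaled by its
    label, to the weight vector. `samples` is a list of (features, label)
    with labels in {-1, +1}; a constant 1 is appended to each feature
    vector so the bias is learned as an ordinary weight."""
    w = [0.0] * (len(samples[0][0]) + 1)
    for _ in range(epochs):
        errors = 0
        for x, y in samples:
            xh = list(x) + [1.0]                       # homogeneous coordinates
            score = sum(wi * xi for wi, xi in zip(w, xh))
            if y * score <= 0:                         # misclassified (or on boundary)
                w = [wi + y * xi for wi, xi in zip(w, xh)]
                errors += 1
        if errors == 0:                                # data separated: stop early
            break
    return w

def perceptron_predict(w, x):
    xh = list(x) + [1.0]
    return 1 if sum(wi * xi for wi, xi in zip(w, xh)) > 0 else -1

# A linearly separable toy set
data = [((0.0, 0.0), -1), ((0.0, 1.0), -1), ((2.0, 2.0), 1), ((3.0, 1.0), 1)]
w = perceptron_train(data)
print([perceptron_predict(w, x) for x, _ in data])  # → [-1, -1, 1, 1]
```

On linearly separable data the perceptron convergence theorem guarantees the loop terminates; the epoch cap only matters for non-separable inputs.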

Exercises outline:

Students solve four or five pattern recognition problems, for instance a simplified version of OCR (optical character recognition), face detection, or spam detection, using either classical methods or trained classifiers.
1. Introduction to MATLAB and the STPR toolbox; a simple recognition experiment.
2. The Bayes recognition problem
3. Non-Bayesian problems I: the Neyman-Pearson problem.
4. Non-Bayesian problems II: the minimax problem.
5. Maximum likelihood estimates.
6. Non-parametric estimates, Parzen windows.
7. Linear classifiers, the perceptron algorithm.
8. AdaBoost.
9. Support Vector Machines I.
10. Support Vector Machines II.
11. The EM algorithm I.
12. The EM algorithm II.
13. Submission of reports. Discussion of results.
14. Submission of reports. Discussion of results.
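As a small illustration of the topic of exercise 5, the maximum likelihood estimates for a one-dimensional Gaussian (the sample mean, and the variance normalized by n, which is the ML estimator) in plain Python:

```python
def gaussian_mle(samples):
    """ML estimates for a 1-D Gaussian: sample mean, and the
    n-normalized (biased) sample variance."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return mu, var

mu, var = gaussian_mle([1.0, 2.0, 3.0, 4.0])
print(mu, var)  # → 2.5 1.25
```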

Literature:

1. Duda, Hart, Stork: Pattern Classification, 2001.
2. Bishop: Pattern Recognition and Machine Learning, 2006.
3. Schlesinger, Hlavac: Ten Lectures on Statistical and Structural Pattern Recognition, 2002.

Requirements:

Knowledge of linear algebra, mathematical analysis and probability and statistics.

Note:

https://cw.felk.cvut.cz/doku.php/courses/ae4b33rpz/lectures/start

Webpage:

http://cw.felk.cvut.cz/doku.php/courses/ae4b33rpz/start

Keywords:

pattern recognition, statistical decision-making, machine learning, classification

Subject is included into these academic programs:

Program | Branch | Role | Recommended semester
BEECS | Common courses | PV | 5
BPOI_BO_2016 | Common courses | P |
BPOI1_2016 | Computer and Information Science | P |
BPOI2_2016 | Internet of Things | P |
BPOI3_2016 | Software | P |
BPOI4_2016 | Computer Games and Graphics | P |
BPOI_BO_2018 | Common courses | P |
BPOI4_2018 | Computer Games and Graphics | P |
BPOI3_2018 | Software | P |
BPOI2_2018 | Internet of Things | P |
BPOI1_2018 | Computer and Information Science | P |
BPEECS_2018 | Common courses | PV | 5


Page updated 11.12.2018 17:49:49, semester: Z,L/2020-1, L/2017-8, L/2019-20, Z,L/2018-9, Z/2019-20.
Send comments about the content to the Administrators of the Academic Programs. Proposal and Realization: I. Halaška (K336), J. Novák (K336).