Posted 2014-01-16

Stat 231 / CS 276A: Pattern Recognition and Machine Learning

MW 3:30-4:45 PM, Fall 2013, Math Science 5147
www.stat.ucla.edu/~sczhu/Courses/UCLA/Stat_231/Stat_231.html


Course Description
This course introduces fundamental concepts, theories, and algorithms for pattern recognition and machine learning,
which are used in computer vision, speech recognition, data mining, statistics, information retrieval, and bioinformatics.
Topics include: Bayesian decision theory, parametric and non-parametric learning, data clustering, component analysis,
boosting techniques, kernel methods and support vector machine, and fast nearest neighbor indexing and hashing.
Prerequisites
  • Math 33A Linear Algebra and Its Applications, Matrix Analysis
  • Stat 100B Intro to Mathematical Statistics,
  • CS 180 Intro to Algorithms and Complexity.
Textbook
  • R. Duda, P. Hart, D. Stork, "Pattern Classification", second edition, 2000. [Good for CS students]
  • T. Hastie, R. Tibshirani, and J.H. Friedman, "The Elements of Statistical Learning: Data Mining, Inference, and Prediction", Springer Series in Statistics, 2001. [Good for Statistics students]
Instructors

Grading Plan: 4 units, letter grades
  • Two homework assignments: 20%
  • Three projects: 15% + 15% + 15% = 45%
  • Midterm exam: none (0%)
  • Final exam: 35% (Dec 10, Tuesday 11:30AM-2:30PM; only two hours, 12:15-14:15, are needed; closed-book exam)

Grading policy
  • Homework policy:
    Homework must be finished independently. Do not discuss it with classmates.
  • Project policy:
    You are encouraged to work and discuss in a group, but each person must finish his/her own project. Hand in
    (i) a brief description of the experiment in hard copy, (ii) results and plots in hard copy, and (iii) your code in e-copy to the reader.
  • Late policy:
    You have a total of three late days for the class; once those three days are used, no credit will be given for late homework/projects.
Tentative Schedule for 2013
(columns: Lecture | Date | Topics | Reading Materials | Handouts)

1 | 09-30 | Introduction to Pattern Recognition [problems, applications, examples, and project introduction] | Ch 1 | syllabus.pdf, Lect1.pdf

2 | 10-02 | Bayesian Decision Theory I [Bayes rule, discriminant functions] | Ch 2.1-2.6 | Lect2.pdf

3 | 10-07 | Bayesian Decision Theory II [loss functions and Bayesian error analysis] | Ch 2.1-2.6 | Lect3.pdf
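The schedule only names topics, but the Bayes decision rule of Lectures 2-3 is easy to illustrate. A minimal NumPy sketch (not course code; the two 1-D Gaussian class models and priors below are made-up numbers) classifies x by the largest posterior p(c|x) ∝ p(x|c)p(c):

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Hypothetical two-class problem: class-conditional densities and priors.
priors = {0: 0.6, 1: 0.4}
params = {0: (0.0, 1.0), 1: (2.0, 1.0)}   # (mean, std) per class

def bayes_classify(x):
    # Bayes rule: pick the class maximizing p(x|c) * p(c)
    # (the evidence p(x) is the same for both classes, so it can be dropped).
    posteriors = {c: gaussian_pdf(x, *params[c]) * priors[c] for c in priors}
    return max(posteriors, key=posteriors.get)
```

A point far to the left falls to class 0, far to the right to class 1; the decision boundary sits where the two weighted densities cross.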

4 | 10-09 | Component Analysis and Dimension Reduction I [principal component analysis (PCA), face modeling] [Explanation of Project 1: code and data format] | Ch 3.8.1, Ch 10.13.1; Project 1 | HW1, Lect4-5.pdf
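The PCA step of Lecture 4 can be sketched in a few lines of NumPy (illustrative only; the toy data and names below are mine, not Project 1's code or data format):

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components.

    Center the data, take the SVD, and keep the k leading
    right-singular vectors (the eigenvectors of the sample covariance).
    """
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                  # (k, d) principal directions
    return Xc @ components.T, components

rng = np.random.default_rng(0)
# 200 points stretched along the direction (3, 1) with tiny noise:
# the first principal component should align with that direction.
X = rng.normal(size=(200, 1)) @ np.array([[3.0, 1.0]]) + 0.01 * rng.normal(size=(200, 2))
Z, comps = pca(X, 1)
```

On face data the same projection onto the leading eigenvectors gives the familiar "eigenface" representation.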

5 | 10-14 | Component Analysis and Dimension Reduction II [Fisher linear discriminant] [multi-dimensional scaling (MDS)] | Ch 3.8.2, Ch 10.14 | FisherFace.pdf, Lect5-6.pdf

6 | 10-16 | Component Analysis and Dimension Reduction III [local linear embedding (LLE), intrinsic dimension] | paper | LLE paper

7 | 10-21 | Boosting Techniques I [perceptron, backpropagation, and AdaBoost] | Ch 9.5 | Lect7-9.pdf
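For the perceptron named in Lecture 7, here is a minimal sketch of the classic mistake-driven update (toy data, learning rate, and epoch count are hypothetical):

```python
import numpy as np

def perceptron(X, y, epochs=20, lr=1.0):
    """Train a perceptron on labels y in {-1, +1}.

    Whenever sign(w.x + b) disagrees with y, nudge the weights
    toward the misclassified point; on linearly separable data
    this converges to a separating hyperplane.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:      # misclassified (or on the boundary)
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Linearly separable toy data: class +1 sits above the line x1 + x2 = 0.
X = np.array([[2.0, 2.0], [1.0, 3.0], [-2.0, -1.0], [-3.0, -2.0]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
preds = np.sign(X @ w + b)
```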

8 | 10-23 | Boosting Techniques II [RealBoost and example on face detection] [Explanation of Project 2] | Tutorial | Handout 1, Handout 2

9 | 10-28 | Boosting Techniques III [probabilistic analysis, LogitBoost]

10 | 10-30 | Non-metric Methods I [tree-structured classification: principle and example] | Ch 8.1-8.3 | Lect10.pdf

11 | 11-04 | Non-metric Methods II [syntactic pattern recognition and example on human parsing] | Ch 8.5-8.8 | Lect11.pdf

12 | 11-06 | Support Vector Machines I [kernel-induced feature space] | Tutorial paper | Lect12-15.pdf
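The "kernel-induced feature space" of Lecture 12 comes down to computing a Gram matrix: the kernel gives inner products in an implicit feature space without ever constructing the mapping. A minimal sketch with the RBF kernel (my illustration; the gamma value and data are hypothetical):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

X = np.array([[0.0, 0.0], [1.0, 0.0]])
K = rbf_kernel(X, X)    # unit diagonal; off-diagonal entries decay with distance
```

A valid kernel must make K symmetric positive semi-definite (Mercer's condition); that is what lets the SVM dual work entirely with K.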

-- | 11-11 | Veterans Day holiday (no lecture)

13 | 11-13 | Support Vector Machines II [support vector classifier] [Explanation of Project 3] | Ch 5.11

14 | 11-18 | Support Vector Machines III [loss functions, latent SVM, neural networks and DeepNet]

15 | 11-20 | Parametric Learning [maximum likelihood estimation (MLE)] [sufficient statistics and maximum entropy] | Ch 3.1-3.6 | Lect16.pdf
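For the MLE topic of Lecture 15, the univariate Gaussian case has the familiar closed form: the sample mean and the biased (1/n) sample variance. A tiny sketch on made-up data:

```python
import numpy as np

def gaussian_mle(x):
    """Maximum likelihood estimates for a univariate Gaussian."""
    mu_hat = x.mean()
    sigma2_hat = ((x - mu_hat) ** 2).mean()   # note 1/n, not the unbiased 1/(n-1)
    return mu_hat, sigma2_hat

x = np.array([1.0, 2.0, 3.0, 4.0])
mu_hat, sigma2_hat = gaussian_mle(x)
```

Here mu_hat = 2.5 and sigma2_hat = 1.25; setting the derivative of the log-likelihood to zero yields exactly these two statistics.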

16 | 11-25 | Non-parametric Learning I [Parzen window and k-NN classifier] | Ch 4.1-4.5 | Lect17.pdf
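The k-NN classifier of Lectures 16-17 needs no training phase at all: classify a query by majority vote among its k nearest training points. A minimal sketch (toy data and k are hypothetical):

```python
import numpy as np

def knn_classify(X_train, y_train, x, k=3):
    """Majority vote among the k nearest training points (Euclidean)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]           # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Two well-separated 1-D clusters with labels 0 and 1.
X_train = np.array([[0.0], [0.5], [1.0], [5.0], [5.5], [6.0]])
y_train = np.array([0, 0, 0, 1, 1, 1])
```

The brute-force distance scan here is O(n) per query, which is exactly what the KD-tree and hashing methods of Lecture 18 are designed to speed up.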

17 | 11-27 | Non-parametric Learning II [k-NN classifier and error analysis] | Ch 4.6, handout | Lect18.pdf

18 | 12-02 | Non-parametric Learning III [fast approximate k-NN computing: KD-tree and hashing] | paper1, paper2 | Lect 19.pdf

19 | 12-04 | Data Clustering and Bi-clustering [k-means clustering, EM clustering by MLE, provable 2-step EM, mean-shift and landscape] | Ch 10.1-10.4, Handout | Lect 20.pdf
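The k-means topic of the final lecture is Lloyd's algorithm: alternate between assigning each point to its nearest centroid and moving each centroid to the mean of its cluster. A minimal sketch (toy blobs are mine; initialization with the first k points is a simplification, real code would use random or k-means++ seeding):

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Lloyd's algorithm for k-means clustering."""
    centers = X[:k].copy()                    # naive init: first k points
    for _ in range(iters):
        # Assignment step: squared distance from every point to every center.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        # Update step: each center moves to the mean of its cluster.
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated blobs; k-means should recover them.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
              [10.0, 10.0], [10.0, 11.0], [11.0, 10.0]])
labels, centers = kmeans(X, 2)
```

This is a special case of the EM clustering on the same lecture's list: hard assignments with isotropic Gaussians of equal weight recover exactly the k-means updates.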

