6. Nonparametric techniques

1  6. Nonparametric techniques

2  Motivation
Problem: how to decide on a suitable model (e.g. which type of Gaussian)
Idea: just use the original data (lazy learning)

3  Idea 1: each data point represents a piece of probability P(x)
$x_1, x_2, x_3, \dots, x_n$  →  Parzen Window Method

4  Idea 2: ignore probabilities, just measure the distance to the training data
Consider a two-class problem  →  k-nearest neighbor classifier

6  6.1. Density Estimation

7  Goal
Determine the probability density P(x)
Given: training data $x_1, x_2, \dots, x_n$
Consider a region R
Q: should I continue from here as a white board lecture?

8  Estimate the probability P inside a region
Probability of x being in R: $P = \int_R P(x)\,dx$
Suppose k training vectors are inside R, out of a total of n training vectors
What is an estimate for P?  $P \approx k/n$

9  Limit of an infinite number of training samples
Consider sequences $V_n$ (volume of the region) and $k_n$ for $n = 1, 2, \dots$
Resulting sequence of probability estimates: $P_n(x) = \dfrac{k_n}{n\,V_n}$
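
A small worked instance of this estimate (numbers chosen for illustration, not from the slides): with $n = 1000$ training samples, of which $k_n = 30$ fall into a region of volume $V_n = 0.1$, the density estimate at x is

$$P_n(x) = \frac{k_n}{n\,V_n} = \frac{30}{1000 \cdot 0.1} = 0.3$$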

10  Expanding Number of Samples / Shrinking Volume
[Figure: k-nearest-neighbour estimation for n = 1, 4, 9, 16, 100. From Duda+Hart: Pattern Classification]

11  Necessary conditions for convergence
Convergence means $P_n(x) \to P(x)$ for $n \to \infty$
P(x) is a local property: $V_n \to 0$ for $n \to \infty$
Reliability of the estimate: $k_n \to \infty$ for $n \to \infty$
$k_n / n \to 0$ for $n \to \infty$: otherwise the volume cannot shrink to zero

12  Possible choice for $k_n$
$k_n = \sqrt{n}$, and pick $V_n$ such that it includes exactly $k_n$ samples

13  Expanding Number of Samples / Shrinking Volume
[Figure: k-nearest-neighbour estimation for n = 1, 4, 9, 16, 100. From Duda+Hart: Pattern Classification]

14  6.2. Parzen Windows

15  Introduction
Each piece of training data contributes its own bit of probability distribution
Possible choices: cubes, spheres, normal distribution
For the beginning, start with cubes

16  Volume of d-dimensional cubes
Length of edge: $h_n$
Volume of the cube: $V_n = h_n^d$, where d is the dimension of the feature space
Homework: volume of a d-dimensional sphere?

17  Introduce a Window Function
Goal: generalize and formalize the method
$\varphi(\mathbf{x}) = \begin{cases} 1 & \text{if } |x_j| \le \tfrac{1}{2} \text{ for all } j = 1, \dots, d \\ 0 & \text{else} \end{cases}$, where $x_j$ is the j-th component of $\mathbf{x}$
This is the unit cube centered at the origin
Draw a unit cube for d = 1 and d = 2
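
A minimal sketch of this unit-cube window in Python (not from the slides; the name `phi_cube` and the NumPy-based setup are illustrative assumptions):

```python
import numpy as np

def phi_cube(u):
    """Unit hypercube window: 1 if every component of u lies in [-1/2, 1/2], else 0."""
    u = np.atleast_1d(u)
    return 1.0 if np.all(np.abs(u) <= 0.5) else 0.0

# d = 1: 0.3 lies inside the unit cube centered at the origin, 0.7 does not
print(phi_cube(0.3), phi_cube(0.7))    # 1.0 0.0
# d = 2
print(phi_cube([0.2, -0.4]))           # 1.0
```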

18  Shift and scale the unit cube
What is the window function for a cube centered at $\mathbf{x}_i$ with length of edge $h_n$?
$\varphi\!\left(\dfrac{\mathbf{x} - \mathbf{x}_i}{h_n}\right)$

19  Number of samples at a point x inside the volume $V_n$
Express the number of data points that contribute in terms of window functions:
If $\varphi\!\left(\dfrac{\mathbf{x} - \mathbf{x}_i}{h_n}\right) = 1$, then $\mathbf{x}_i$ is in the volume $V_n$ and hence contributes to $k_n$
$k_n(\mathbf{x}) = \sum_{i=1}^{n} \varphi\!\left(\dfrac{\mathbf{x} - \mathbf{x}_i}{h_n}\right)$

20  Estimate of the probability density using Section 6.1
$P_n(\mathbf{x}) = \dfrac{k_n(\mathbf{x})}{n\,V_n} = \dfrac{1}{n} \sum_{i=1}^{n} \dfrac{1}{V_n}\, \varphi\!\left(\dfrac{\mathbf{x} - \mathbf{x}_i}{h_n}\right)$
This formula also works for other window functions
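
A hedged sketch of this estimate in Python, using the cube window from above; all names are illustrative and the tiny data set is invented for the example:

```python
import numpy as np

def phi_cube(u):
    """Unit hypercube window centered at the origin."""
    return 1.0 if np.all(np.abs(np.atleast_1d(u)) <= 0.5) else 0.0

def parzen_estimate(x, samples, h_n, window=phi_cube):
    """P_n(x) = (1/n) * sum_i (1/V_n) * window((x - x_i)/h_n), with V_n = h_n^d."""
    samples = np.atleast_2d(samples)      # shape (n, d)
    x = np.atleast_1d(x)
    n, d = samples.shape
    V_n = h_n ** d
    return sum(window((x - x_i) / h_n) for x_i in samples) / (n * V_n)

# 1-d example: two of the four samples fall inside the cube of edge 0.5 around x = 1.0
data = np.array([[0.9], [1.1], [1.3], [3.0]])
print(parzen_estimate([1.0], data, h_n=0.5))   # 2 / (4 * 0.5) = 1.0
```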

21  Other Window Functions
Normal (Gaussian) distribution (covariance matrix is the unit matrix): $\varphi(\mathbf{x}) = \dfrac{1}{(2\pi)^{d/2}}\, e^{-\frac{1}{2}\mathbf{x}^t \mathbf{x}}$
Sphere: $\varphi(\mathbf{x}) = \begin{cases} 1 & \text{if } \|\mathbf{x}\| \le 1 \\ 0 & \text{else} \end{cases}$
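
A sketch of these two windows in Python; the $(2\pi)^{-d/2}$ normalisation of the Gaussian is the standard one and an assumption on my part, and the sphere window is left unnormalised as on the slide:

```python
import numpy as np

def phi_gauss(u):
    """Gaussian window with identity covariance: (2*pi)^(-d/2) * exp(-u.u / 2)."""
    u = np.atleast_1d(u)
    return np.exp(-0.5 * (u @ u)) / (2 * np.pi) ** (u.size / 2)

def phi_sphere(u):
    """Unit-sphere window: 1 if ||u|| <= 1, else 0."""
    return 1.0 if np.linalg.norm(np.atleast_1d(u)) <= 1.0 else 0.0

print(phi_gauss([0.0, 0.0]))    # peak of the 2-d standard normal, about 0.159
print(phi_sphere([0.6, 0.7]))   # ||u|| is about 0.92 <= 1, so 1.0
```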

22  Gaussian Parzen Window in 1-d
[Figure: estimates for $h_1 = 1$, $h_1 = 0.5$, $h_1 = 0.1$, with $h_n = h_1 / \sqrt{n}$]

23  Gaussian Parzen Window in 2-d

24  Gaussian Parzen Window in 2-d
Converges for all h

25  Gaussian Parzen Window in 1-d

26  Classification example
In classifiers based on Parzen-window estimation:
Estimate the probability density using a given window
Pick a suitable h
Classify using the Bayes decision rule
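
A minimal sketch of this recipe, assuming equal class priors so that the Bayes rule reduces to picking the class with the largest estimated density; the Gaussian window, bandwidth, and toy data are illustrative assumptions:

```python
import numpy as np

def parzen_gauss_density(x, samples, h):
    """Parzen estimate at x with a Gaussian window of width h (identity covariance)."""
    samples = np.atleast_2d(samples)
    x = np.atleast_1d(x)
    n, d = samples.shape
    u = (samples - x) / h
    kernels = np.exp(-0.5 * np.sum(u ** 2, axis=1)) / ((2 * np.pi) ** (d / 2) * h ** d)
    return kernels.mean()

def parzen_classify(x, class_samples, h=0.5):
    """Equal priors: return the index of the class with the highest estimated density."""
    return int(np.argmax([parzen_gauss_density(x, s, h) for s in class_samples]))

class_0 = np.array([[0.0, 0.0], [0.2, 0.1], [-0.1, 0.3]])
class_1 = np.array([[2.0, 2.0], [1.8, 2.2], [2.1, 1.9]])
print(parzen_classify([0.1, 0.2], [class_0, class_1]))   # 0
print(parzen_classify([2.0, 2.1], [class_0, class_1]))   # 1
```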

27  Classification example
[Figure: decision regions for small h vs. large h]

28  6.3. k-Nearest Neighbor Estimation

29  Basic idea
Find the k most similar cases to the test sample x and claim that x is like the majority of these cases.
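
A minimal sketch of this idea with brute-force Euclidean distances and a majority vote; the data and names are illustrative:

```python
import numpy as np
from collections import Counter

def knn_classify(x, X_train, y_train, k=3):
    """Label x with the majority class among its k nearest training samples."""
    distances = np.linalg.norm(X_train - np.atleast_1d(x), axis=1)
    nearest = np.argsort(distances)[:k]
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [1.2, 0.9]])
y_train = np.array(['A', 'A', 'B', 'B'])
print(knn_classify([0.2, 0.1], X_train, y_train, k=3))   # 'A'
```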

30  Other names for similar/related methods
Instance-Based Methods (IBM), or Instance-Based Learning (IBL)
Memory-Based Methods (MBM), Case-Based Methods (CBM), Case-Based Reasoning (CBR), Memory-Based Reasoning (MBR), Similarity-Based Reasoning (SBR), Similarity-Based Methods (SBM)

31  Estimate the probability in the nearest-neighbor case
$x_1, x_2, x_3$: training data; x: point where we want the probability density P(x)
$V = 2\,|x - x_2|$, the interval around x reaching out to its nearest neighbor $x_2$
$P(x) \approx \dfrac{1}{n\,V} = \dfrac{1}{2 n\,|x - x_2|}$
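
A small worked instance of this one-dimensional estimate (values invented for illustration): with training points $x_1 = 1$, $x_2 = 2$, $x_3 = 4$ ($n = 3$) and query point $x = 2.5$, the nearest sample is $x_2$, so

$$V = 2\,|x - x_2| = 2 \cdot 0.5 = 1, \qquad P(x) \approx \frac{1}{n\,V} = \frac{1}{3 \cdot 1} \approx 0.33$$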

32  kNN Estimation in 1 Dimension
[Figure from Duda+Hart: Pattern Classification]

33  Estimating the Posterior $P(\omega_i \mid x)$
$k_i$: number of samples of class $\omega_i$ in the volume; $k = \sum_{i=1}^{c} k_i$
n: total number of samples; V: volume under consideration
$P_n(x, \omega_i) = \dfrac{k_i}{n\,V}$
$P_n(x) = \sum_{j=1}^{c} P_n(x, \omega_j) = \dfrac{k}{n\,V}$
$P_n(\omega_i \mid x) = \dfrac{P_n(x, \omega_i)}{P_n(x)} = \dfrac{k_i}{k}$
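
A small worked instance (numbers invented for illustration): if the cell around x contains $k = 10$ samples, of which $k_1 = 7$ belong to $\omega_1$ and $k_2 = 3$ to $\omega_2$, then

$$P_n(\omega_1 \mid x) = \frac{k_1}{k} = \frac{7}{10} = 0.7, \qquad P_n(\omega_2 \mid x) = \frac{k_2}{k} = 0.3$$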

34  6.4. Nearest-Neighbor Rule

35  Voronoi Tessellation
See white board

36  Voronoi Cells in 2 Dimensions
[Figure from Duda+Hart: Pattern Classification]

37  Voronoi Cells in 3 Dimensions
[Figure from Duda+Hart: Pattern Classification]

38  6.5. Error of the Nearest Neighbor Rule

39  Error rate of the Nearest-Neighbour Classifier (NN)
The NN classifier can be as good as Bayes; in the worst case it is twice as bad
[Figure: NN error rate vs. error rate of the Bayes classifier. From Duda+Hart: Pattern Classification]

40 k-nearest-neghbour-classfer From: Duda+Hart: 40 Patter Classfcato

41  Error of the k-Nearest-Neighbour Classifier
[Figure from Duda+Hart: Pattern Classification]

42  Classification Error Rate
[Figure: misclassification rate vs. number of neighbours k. From Hastie et al.: Statistical Learning]

43  Decision boundary for a nearest-neighbour classifier in a simulation (probability distribution given)
[Figure from Hastie et al.: Statistical Learning]

44  Decision boundary for a k-nearest-neighbour classifier in a simulation (probability distribution given), k = 15
[Figure from Hastie et al.: Statistical Learning]

45  Decision boundaries of the Bayes classifier for the known probabilities
[Figure from Hastie et al.: Statistical Learning]

46  Other popular distance functions
$L_a$ distance from 0: $D_a(X, 0) = \left(\sum_{i=1}^{d} |X_i|^a\right)^{1/a}$, e.g. a = 1/2, 1, 2, and 10
Manhattan distance or $L_1$ norm: $D(X, Y) = \sum_{i=1}^{d} |X_i - Y_i|$
Euclidean distance or $L_2$ norm: $D(X, Y) = \sqrt{\sum_{i=1}^{d} (X_i - Y_i)^2}$
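
A sketch of these distances in Python; the general $L_a$ form covers the Manhattan ($a = 1$) and Euclidean ($a = 2$) cases, and the function name is illustrative:

```python
import numpy as np

def l_a_distance(x, y, a):
    """L_a distance: (sum_i |x_i - y_i|^a)^(1/a)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.sum(np.abs(x - y) ** a) ** (1.0 / a))

x, y = [1.0, 2.0, 3.0], [2.0, 0.0, 3.0]
print(l_a_distance(x, y, 1))    # Manhattan / L1 norm: 3.0
print(l_a_distance(x, y, 2))    # Euclidean / L2 norm: ~2.236
print(l_a_distance(x, y, 10))   # large a approaches the largest component difference
```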

47  Summary
Parzen method
k-Nearest Neighbour classifier
