MIMA Group. Chapter 4: Non-Parametric Estimation. School of Computer Science and Technology, Shandong University. Xin-Shun, SDU


1 MIMA Group. Chapter 4: Non-Parametric Estimation. Xin-Shun, SDU. School of Computer Science and Technology, Shandong University.

2 Contents: Introduction; Parzen Windows; k-Nearest-Neighbor Estimation; Classification Techniques: the Nearest-Neighbor rule (1-NN) and the k-Nearest-Neighbor rule (k-NN); Distance Metrics.

3 Bayes Rule for Classification. To compute the posterior probability P(ω_i | x) = p(x | ω_i) P(ω_i) / p(x), we need to know the prior probability P(ω_i) and the likelihood p(x | ω_i). Case I: p(x | ω_i) has a certain parametric form, handled by Maximum-Likelihood Estimation or Bayesian Parameter Estimation. Problem: the assumed parametric form may not fit the ground-truth density encountered in practice, e.g., assumed parametric form: unimodal; ground truth: multimodal.

4 Non-Parametric Estimation. Case II: p(x | ω_i) does not have a parametric form. How? Let the data speak for themselves! Two approaches: Parzen Windows and k_n-Nearest-Neighbor.

5 Goals. Estimate class-conditional densities p(x | ω_i); estimate posterior probabilities P(ω_i | x).

6 Density Estimation. Assume p(x) is continuous and R is small. Fundamental fact: the probability that a vector x falls into a region R is P_R = P(x ∈ R) = ∫_R p(x') dx' ≈ p(x) V_R, where V_R is the volume of R. Given n i.i.d. examples {x_1, x_2, ..., x_n}, let K denote the random variable representing the number of samples falling into R; K follows a binomial distribution: K ~ B(n, P_R), i.e., P(K = k) = C(n, k) P_R^k (1 - P_R)^(n-k).

7 Density Estimation. Since E[K] = n P_R, we have P_R = E[K]/n, and combining this with P_R ≈ p(x) V_R gives p(x) ≈ E[K]/(n V_R). Letting k_R denote the actual number of samples observed in R, the estimate becomes p(x) ≈ k_R/(n V_R).

8 Density Estimation. Use the subscript n to take the sample size into account: p_n(x) = (k_n/n)/V_n. We hope that lim_{n→∞} p_n(x) = p(x). To achieve this, we should have lim_{n→∞} V_n = 0, lim_{n→∞} k_n = ∞, and lim_{n→∞} k_n/n = 0.

9 Density Estimation. In p_n(x) = (k_n/n)/V_n, which terms can be controlled, and how? Fix V_n and determine k_n: Parzen Windows. Fix k_n and determine V_n: k_n-Nearest-Neighbor.

10 Parzen Windows. p_n(x) = (k_n/n)/V_n: fix V_n and determine k_n. Assume R_n is a d-dimensional hypercube whose edge length is h_n, so V_n = h_n^d. Determine k_n with a window function, a.k.a. kernel function or potential function. (Emanuel Parzen, 1929-)

11 Window function. φ(u) = 1 if |u_j| ≤ 1/2 for j = 1, ..., d, and 0 otherwise. It defines a unit hypercube centered at the origin.

12 Window function. φ((x - x_i)/h_n) = 1 means that x_i falls within the hypercube of volume V_n centered at x, and 0 otherwise. Hence k_n, the number of samples inside the hypercube centered at x, is k_n = Σ_{i=1}^{n} φ((x - x_i)/h_n).
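A minimal sketch of this hypercube count, assuming NumPy arrays of samples; the function names and toy data are illustrative, not from the slides:

```python
import numpy as np

def hypercube_window(u):
    """phi(u): 1 if every component of u lies in [-1/2, 1/2], else 0."""
    return float(np.all(np.abs(u) <= 0.5))

def count_in_window(x, samples, h):
    """k_n = sum_i phi((x - x_i)/h): samples inside the hypercube of edge h centered at x."""
    return sum(hypercube_window((x - xi) / h) for xi in samples)

# toy usage: 2-d samples, hypercube of edge 1.0 centered at the origin
rng = np.random.default_rng(0)
samples = rng.normal(size=(100, 2))
print(count_in_window(np.zeros(2), samples, h=1.0))
```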

13 Parzen Window Estimation. Substituting k_n = Σ_{i=1}^{n} φ((x - x_i)/h_n) into p_n(x) = (k_n/n)/V_n gives the Parzen pdf p_n(x) = (1/n) Σ_{i=1}^{n} (1/V_n) φ((x - x_i)/h_n). φ is not limited to the hypercube window function defined previously; it could be any pdf, i.e., any φ with φ(u) ≥ 0 and ∫ φ(u) du = 1.

14 Parzen Window Estimation. Is p_n(x) = (1/n) Σ_i (1/V_n) φ((x - x_i)/h_n) itself a pdf? Yes: setting u = (x - x_i)/h_n, ∫ p_n(x) dx = (1/n) Σ_i (1/V_n) ∫ φ((x - x_i)/h_n) dx = (1/n) Σ_i (h_n^d/V_n) ∫ φ(u) du = 1, since V_n = h_n^d and φ is a pdf. The ingredients: window function φ (itself a pdf), window width h_n, training data x_i; the result is the Parzen pdf.
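A short sketch of the Parzen estimate with a Gaussian window, p_n(x) = (1/n) Σ_i (1/h^d) φ((x - x_i)/h). The Gaussian choice, NumPy, and all names are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def gaussian_window(u):
    """phi(u): standard d-dimensional Gaussian pdf (a valid window, since it integrates to 1)."""
    d = u.shape[-1]
    return np.exp(-0.5 * np.sum(u * u, axis=-1)) / (2 * np.pi) ** (d / 2)

def parzen_estimate(x, samples, h):
    """p_n(x) = (1/n) * sum_i (1/h^d) * phi((x - x_i)/h)."""
    n, d = samples.shape
    u = (x - samples) / h                    # shape (n, d)
    return np.mean(gaussian_window(u) / h ** d)

# toy usage: N(0, I) samples in 2-d; the true density at the origin is 1/(2*pi), about 0.159
rng = np.random.default_rng(1)
samples = rng.normal(size=(500, 2))
print(parzen_estimate(np.zeros(2), samples, h=0.5))
```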

15 Parzen Window Estimation. p_n(x) is a superposition (叠加) of interpolations (插值): each x_i contributes to p_n(x) according to its distance from x. Writing δ_n(x) = (1/V_n) φ(x/h_n), we have p_n(x) = (1/n) Σ_{i=1}^{n} δ_n(x - x_i). What is the effect of h_n (the window width) on the Parzen pdf?

16 Parzen Window Estimation. The effect of h_n: since V_n = h_n^d and δ_n(x) = (1/h_n^d) φ(x/h_n), h_n affects both the width (horizontal scale) and the amplitude (vertical scale) of δ_n.

17 Parzen Window Estimation. Suppose φ(·) is a 2-d Gaussian pdf. (Figure: the shape of δ_n(x) for decreasing values of h_n.)

18 Parzen Window Estimation. When h_n is very large, δ_n(x) is broad with small amplitude, so p_n(x) is the superposition of broad, slowly changing functions, i.e., smooth with low resolution. When h_n is very small, δ_n(x) is sharp with large amplitude, so p_n(x) is the superposition of sharp pulses, i.e., variable/unstable with high resolution.
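This effect can be seen numerically by evaluating the hypothetical parzen_estimate sketch above, with its toy samples, for a very small and a very large window width:

```python
# Evaluate the Parzen estimate on a grid for two window widths.
xs = np.linspace(-3, 3, 7)
for h in (0.05, 2.0):                        # very small vs. very large window width
    p = [parzen_estimate(np.array([x, 0.0]), samples, h) for x in xs]
    print(f"h={h}: " + " ".join(f"{v:.3f}" for v in p))
# Small h: spiky, unstable estimates; large h: broad, over-smoothed estimates.
```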

19 Parzen Window Estimation. (Figure: Parzen window estimates p_n(x) = (1/n) Σ_i δ_n(x - x_i) for five samples, supposing φ(·) is a 2-d Gaussian pdf.)

20 Parzen Window Estimation: convergence conditions. To ensure convergence, i.e., lim_{n→∞} E[p_n(x)] = p(x) and lim_{n→∞} Var[p_n(x)] = 0, we need the following additional constraints: sup_u φ(u) < ∞; lim_{‖u‖→∞} φ(u) Π_{i=1}^{d} u_i = 0; lim_{n→∞} V_n = 0; lim_{n→∞} n V_n = ∞.

21 Illustrations. One-dimensional case: φ(u) = (1/√(2π)) e^{-u²/2}, p_n(x) = (1/n) Σ_{i=1}^{n} (1/h_n) φ((x - x_i)/h_n), with h_n = h_1/√n and x ~ N(0, 1). (Figure.)

22 Illustrations. One-dimensional case, continued: the same estimator with φ(u) = (1/√(2π)) e^{-u²/2} and h_n = h_1/√n. (Figure.)

23 Illustrations. Two-dimensional case: p_n(x) = (1/n) Σ_{i=1}^{n} (1/h_n²) φ((x - x_i)/h_n), with h_n = h_1/√n. (Figure.)

24 Classification Example. (Figure: decision regions obtained with a smaller window vs. a larger window.)

25 Choosing the Window Function. V_n must approach zero as n → ∞, but at a rate slower than 1/n, e.g., V_n = V_1/√n. The value of the initial volume V_1 is important. In some cases, a cell volume that is proper for one region may be unsuitable in a different region.
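A one-line numerical illustration of this shrinking-volume schedule (the value V_1 = 1 is an arbitrary assumption):

```python
V1 = 1.0
for n in (1, 10, 100, 1000, 10000):
    print(n, V1 / n ** 0.5, 1.0 / n)   # V_n = V_1/sqrt(n) shrinks, but more slowly than 1/n
```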

26 k_n-Nearest Neighbor. p_n(x) = (k_n/n)/V_n: fix k_n and then determine V_n. To estimate p(x), we can center a cell about x and let it grow until it captures k_n samples, where k_n is some specified function of n, e.g., k_n = √n. Principled rule for choosing k_n: lim_{n→∞} k_n = ∞ and lim_{n→∞} V_n = 0.
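A minimal sketch of the k_n-nearest-neighbor density estimate p_n(x) = (k_n/n)/V_n, assuming the growing cell is a Euclidean ball and using NumPy; the names and toy data are illustrative:

```python
import numpy as np
from math import gamma, pi

def ball_volume(radius, d):
    """Volume of a d-dimensional Euclidean ball of the given radius."""
    return pi ** (d / 2) / gamma(d / 2 + 1) * radius ** d

def knn_density(x, samples, k):
    """p_n(x) = (k/n) / V_n, with V_n the volume of the smallest ball around x holding k samples."""
    n, d = samples.shape
    dists = np.sort(np.linalg.norm(samples - x, axis=1))
    radius = dists[k - 1]                    # grow the cell until it captures k samples
    return (k / n) / ball_volume(radius, d)

rng = np.random.default_rng(2)
samples = rng.normal(size=(400, 2))
k = int(np.sqrt(len(samples)))               # e.g., k_n = sqrt(n)
print(knn_density(np.zeros(2), samples, k))
```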

27 k_n-Nearest Neighbor. (Figure: eight points in one dimension (n = 8, d = 1), red curve: k_n = 3, black curve: k_n = 5; thirty-one points in two dimensions (n = 31, d = 2), black surface: k_n = 5.)

28 Estimation of A Posteriori Probability. P_n(ω_i | x) = ? Place a cell of volume V around x that captures k samples, k_i of which belong to class ω_i. Then p_n(x, ω_i) = (k_i/n)/V, and P_n(ω_i | x) = p_n(x, ω_i) / Σ_{j=1}^{c} p_n(x, ω_j) = (k_i/(nV)) / (Σ_{j=1}^{c} k_j/(nV)) = k_i / k.

29 Estimation of A Posteriori Probability. As above, P_n(ω_i | x) = p_n(x, ω_i) / Σ_{j=1}^{c} p_n(x, ω_j) = k_i / k. The value of V (or k) can be determined based on the Parzen-window or the k_n-nearest-neighbor technique.
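A sketch of the resulting posterior estimate P_n(ω_i | x) = k_i / k, i.e., the fraction of the k nearest training samples belonging to class ω_i; NumPy, Euclidean distance, and the toy two-class data are all illustrative assumptions:

```python
import numpy as np

def knn_posterior(x, samples, labels, k):
    """Return {class: k_i / k} computed over the k nearest training samples to x."""
    dists = np.linalg.norm(samples - x, axis=1)
    nearest = labels[np.argsort(dists)[:k]]
    classes, counts = np.unique(nearest, return_counts=True)
    return dict(zip(classes.tolist(), (counts / k).tolist()))

# toy usage: two Gaussian blobs, one per class
rng = np.random.default_rng(3)
samples = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
labels = np.array([0] * 50 + [1] * 50)
print(knn_posterior(np.array([0.8, 0.8]), samples, labels, k=7))
```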

30 Nearest Neighbor Classifier. Store all training examples. Given a new example x to be classified, search for the training example (x_i, y_i) whose x_i is most similar to (closest to) x, and predict y_i (lazy learning). This amounts to picking the class that maximizes the estimated posterior P_n(ω_i | x) = k_i / Σ_{j=1}^{c} k_j.
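A minimal 1-NN prediction sketch, reusing the toy samples and labels from the previous sketch (Euclidean distance assumed):

```python
def nearest_neighbor_predict(x, samples, labels):
    """Predict the label of the single closest training example (1-NN, lazy learning)."""
    dists = np.linalg.norm(samples - x, axis=1)
    return labels[np.argmin(dists)]

print(nearest_neighbor_predict(np.array([0.8, 0.8]), samples, labels))
```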

31 Decision Boundaries. The Voronoi diagram: given a set of points, a Voronoi diagram describes the areas that are nearest to any given point. These areas can be viewed as zones of control.

32 Decision Boundaries. The decision boundary is formed by retaining only those line segments that separate different classes. The more training examples we have stored, the more complex the decision boundaries can become.

33 Decision Boundaries. With a large number of examples and noise in the labels, the decision boundary can become nasty! Note the islands in this figure: they are formed by noisy examples. If the nearest neighbor happens to be a noisy point, the prediction will be incorrect. How do we deal with this?

34 Effect of k. Different k values give different results: a large k produces smoother boundaries, since the impact of class-label noise is canceled out. What happens when k is too large? Oversimplified boundaries; e.g., with k = N we always predict the majority class.
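A k-NN majority-vote sketch generalizing the 1-NN rule above (again reusing the illustrative toy samples and labels):

```python
from collections import Counter

def knn_predict(x, samples, labels, k):
    """Predict the majority class among the k nearest training examples to x."""
    dists = np.linalg.norm(samples - x, axis=1)
    nearest = labels[np.argsort(dists)[:k]]
    return Counter(nearest.tolist()).most_common(1)[0][0]

print(knn_predict(np.array([0.8, 0.8]), samples, labels, k=5))
```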

35 How to Choose k? Can we choose k to minimize the mistakes we make on training examples (the training error)? What is the training error of the nearest-neighbor rule? Can we choose k to minimize the mistakes we make on test examples (the test error)?

36 How to Choose k? How do training error and test error change as we change the value of k? (Figure.)

37 Model Selection. Choosing k for k-NN is just one of the many model selection problems we face in machine learning. Model selection means choosing among different models, e.g., linear regression vs. quadratic regression, or k-NN vs. a decision tree. It is heavily studied in machine learning and of crucial importance in practice. If we use training error to select models, we will always choose more complex ones.


39 Model Selection. We can keep part of the labeled data apart as validation data: evaluate different k values based on prediction accuracy on the validation data, and choose the k that minimizes the validation error. Validation can be viewed as another name for testing, but the name "testing" is typically reserved for the final evaluation, whereas validation is mostly used for model selection.
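A sketch of selecting k on a held-out validation split, using the hypothetical knn_predict helper and toy data from the sketches above:

```python
def validation_error(k, train_x, train_y, val_x, val_y):
    """Fraction of validation examples misclassified by k-NN using the training split."""
    wrong = sum(knn_predict(x, train_x, train_y, k) != y for x, y in zip(val_x, val_y))
    return wrong / len(val_y)

# hold out 30 of the 100 toy points for validation
perm = rng.permutation(len(samples))
tr, va = perm[:70], perm[70:]
errors = {k: validation_error(k, samples[tr], labels[tr], samples[va], labels[va])
          for k in (1, 3, 5, 7, 9)}
best_k = min(errors, key=errors.get)         # the k with the smallest validation error
print(errors, best_k)
```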

40 Model Selection. The impact of validation set size: if we reserve only one point for the validation set, should we trust the validation error as a reliable estimate of the classifier's performance? The larger the validation set, the more reliable our model selection choices are. When the total labeled set is small, we might not be able to get a big enough validation set, leading to unreliable model selection decisions.

41 Model Selection: K-fold Cross-Validation. Split the data into K subsets and perform learning/testing K times; each time, reserve one subset as the validation set and train on the rest. Special case: leave-one-out cross-validation.
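A manual K-fold cross-validation sketch (it relies on the hypothetical validation_error helper and toy data above; NumPy assumed):

```python
def cross_validation_error(k, X, y, n_folds=5):
    """Average k-NN validation error over n_folds folds."""
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, n_folds)
    errs = []
    for i in range(n_folds):
        val = folds[i]                                   # reserve one subset for validation
        tr = np.concatenate([folds[j] for j in range(n_folds) if j != i])
        errs.append(validation_error(k, X[tr], y[tr], X[val], y[val]))
    return float(np.mean(errs))

print({k: cross_validation_error(k, samples, labels) for k in (1, 3, 5)})
```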

42 Other Issues of kNN. It can be computationally expensive to find the nearest neighbors! Speed up the computation by using smart data structures to quickly search for (approximate) solutions. For large data sets it requires a lot of memory; remove unimportant examples.
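One common smart data structure is a k-d tree. A brief illustration with SciPy's cKDTree (the library choice and the reuse of the toy samples and labels are assumptions, not from the slides):

```python
import numpy as np
from scipy.spatial import cKDTree

tree = cKDTree(samples)                              # build once over the stored training examples
dists, idx = tree.query(np.array([0.8, 0.8]), k=5)   # 5 nearest neighbors without a linear scan
print(labels[idx])
```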

43 Final Words on kNN. kNN is what we call lazy learning (vs. eager learning). Lazy: learning only occurs when you see the test example. Eager: learn a model before you see the test example; training examples can be thrown away after learning. Advantages: conceptually simple, easy to understand and explain; very flexible decision boundaries; not much learning at all! Disadvantages: it can be hard to find a good distance measure; irrelevant features and noise can be very detrimental; it typically cannot handle more than 30 attributes; computational cost: it requires a lot of computation and memory.

44 Distance Metrics. The distance measure is an important factor for a nearest-neighbor classifier, e.g., for achieving invariant pattern recognition and data mining results. (Figure: the effect of changing units.)


46 Properties of a Distance Metric. Nonnegativity: D(a, b) ≥ 0. Reflexivity: D(a, b) = 0 iff a = b. Symmetry: D(a, b) = D(b, a). Triangle inequality: D(a, b) + D(b, c) ≥ D(a, c).

47 Minkowski Metric (L_p Norm). L_p(a, b) = (Σ_{i=1}^{d} |a_i - b_i|^p)^{1/p}. 1. L_1 norm, the Manhattan or city-block distance: L_1(a, b) = Σ_{i=1}^{d} |a_i - b_i|. 2. L_2 norm, the Euclidean distance: L_2(a, b) = (Σ_{i=1}^{d} (a_i - b_i)²)^{1/2}. 3. L_∞ norm, the chessboard distance: L_∞(a, b) = max_i |a_i - b_i|.
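A small self-contained sketch of these three distances (NumPy assumed; names illustrative):

```python
import numpy as np

def minkowski(a, b, p):
    """L_p(a, b) = (sum_i |a_i - b_i|^p)^(1/p); p = 1 gives Manhattan, p = 2 Euclidean."""
    return np.sum(np.abs(a - b) ** p) ** (1.0 / p)

def chessboard(a, b):
    """L_inf(a, b) = max_i |a_i - b_i|."""
    return np.max(np.abs(a - b))

a, b = np.array([1.0, 2.0, 3.0]), np.array([4.0, 0.0, 3.0])
print(minkowski(a, b, 1), minkowski(a, b, 2), chessboard(a, b))
```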


49 Summary. Basic setting for non-parametric techniques: let the data speak for themselves. No parametric form is assumed for the class-conditional pdf; the class-conditional pdf is estimated from training examples, and predictions are made based on Bayes' theorem. Fundamental result in density estimation: p_n(x) = (k_n/n)/V_n.

50 Summary: Parzen Windows. Fix V_n and then determine k_n: p_n(x) = (1/n) Σ_{i=1}^{n} (1/V_n) φ((x - x_i)/h_n), where φ is the window function (itself a pdf), h_n the window width, and x_i the training data; p_n is the Parzen pdf.

51 Summary: k_n-Nearest-Neighbor. Fix k_n and then determine V_n. To estimate p(x), we center a cell about x and let it grow until it captures k_n samples, where k_n is some specified function of n, e.g., k_n = √n. Principled rule for choosing k_n: lim_{n→∞} k_n = ∞ and lim_{n→∞} V_n = 0.

52 MIMA Group. Any questions?
