Nonparametric Density Estimation Intro

1 Nonparametric Density Estimation Intro: Parzen Windows

2 Non-Parametric Methods. Neither the probability distribution nor the discriminant function is known, which happens quite often. All we have is labeled data (e.g. salmon, bass, salmon, salmon, ...), and we estimate the probability distribution from that labeled data. When a lot is known about the problem, estimation is easier; when little is known, it is harder.

3 Nonparametric Techniques: Introduction. In previous lectures we assumed that either: 1. someone gives us the density p(x | c_j); in pattern recognition applications this never happens; or 2. someone gives us p(x | θ, c_j); this does happen sometimes, but we are likely to suspect whether the given p(x | θ) models the data well. Most parametric densities are unimodal (have a single local maximum), whereas many practical problems involve multi-modal densities.

4 Nonparametric Techniques: Introduction. Nonparametric procedures can be used with arbitrary distributions and without any assumption about the forms of the underlying densities. There are two types of nonparametric methods: Parzen windows, which estimate the likelihood p(x | c_j), and Nearest Neighbors, which bypass the likelihood and go directly to posterior estimation P(c_j | x).

5 Nonparametric Techniques: Introduction. Nonparametric techniques attempt to estimate the underlying density functions from the training data. Idea: the more data in a region, the larger the density function p(x) there. Pr[X ∈ R] = ∫_R f(x) dx. (Figure: a density over salmon length.)

6 Nonparametric Techniques: Introduction. How can we approximate Pr[X ∈ R_1] and Pr[X ∈ R_2]? With 6 of the 20 samples falling in each region, Pr[X ∈ R_1] ≈ 6/20 and Pr[X ∈ R_2] ≈ 6/20. Should the density curves above R_1 and R_2 be equally high? No: since R_1 is smaller than R_2 and Pr[X ∈ R] = ∫_R f(x) dx, to get the density we must normalize by the region size. (Figure: two regions R_1 and R_2 over salmon length.)

7 Nonparametric Techniques: Introduction. Assuming f(x) is basically flat inside R, (# of samples in R) / (total # of samples) ≈ Pr[X ∈ R] = ∫_R f(y) dy ≈ f(x) · Volume(R). Thus the density at a point x inside R can be approximated as f(x) ≈ (# of samples in R) / (total # of samples · Volume(R)). Now let's derive this formula more formally.

8 Binomial Random Variable. Let us flip a coin n times (each flip is called a trial). The probability of heads is ρ, the probability of tails is 1 − ρ. The binomial random variable K counts the number of heads in n trials: P(K = k) = C(n, k) ρ^k (1 − ρ)^(n−k), where C(n, k) = n! / (k! (n − k)!). The mean is E(K) = nρ and the variance is var(K) = nρ(1 − ρ).
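
As a quick sanity check of these formulas, the following sketch simulates the binomial variable and compares the empirical mean and variance with nρ and nρ(1 − ρ); it assumes NumPy, and the values n = 20, ρ = 0.3 are arbitrary illustration choices.

import numpy as np

n, rho = 20, 0.3
rng = np.random.default_rng(0)
K = rng.binomial(n, rho, size=200_000)   # many independent realizations of K
print("empirical mean:", K.mean(), "  theory n*rho:", n * rho)
print("empirical var: ", K.var(), "  theory n*rho*(1-rho):", n * rho * (1 - rho))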

9 Density Estimation: Basic Issues. From the definition of a density function, the probability ρ that a vector x will fall in region R is ρ = Pr[x ∈ R] = ∫_R p(x') dx'. Suppose we have n samples x_1, x_2, ..., x_n drawn from the distribution p(x). The probability that k points fall in R is then given by the binomial distribution: Pr[K = k] = C(n, k) ρ^k (1 − ρ)^(n−k). Suppose that k points fall in R; we can use MLE to estimate the value of ρ. The likelihood function is p(x_1, ..., x_n | ρ) = C(n, k) ρ^k (1 − ρ)^(n−k).

10 Density Estimation: Basic Issues. The likelihood p(x_1, ..., x_n | ρ) = C(n, k) ρ^k (1 − ρ)^(n−k) is maximized at ρ = k/n, so the MLE is ρ̂ = k/n. Assume that p(x) is continuous and that the region R is so small that p(x) is approximately constant in R: ∫_R p(x') dx' ≈ p(x) V, where x is in R and V is the volume of R. Recall from the previous slide that ρ = ∫_R p(x') dx' ≈ k/n. Thus p(x) can be approximated as p(x) ≈ (k/n) / V.
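
The maximization step, written out (a standard computation, not spelled out on the slide): take the log of the likelihood and set its derivative with respect to ρ to zero.

$$\ln L(\rho) = \ln\binom{n}{k} + k \ln\rho + (n-k)\ln(1-\rho), \qquad
\frac{d \ln L}{d\rho} = \frac{k}{\rho} - \frac{n-k}{1-\rho} = 0
\;\Longrightarrow\; \hat\rho = \frac{k}{n}.$$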

11 Density Estimation: Basic Issues. This is exactly what we had before: p(x) ≈ (k/n) / V, where x is inside some region R, k is the number of samples inside R, n is the total number of samples, and V is the volume of R. Our estimate will always be the average of the true density over R: p(x) ≈ (k/n) / V = ρ̂ / V ≈ (1/V) ∫_R p(x') dx'. Ideally, p(x) should be constant inside R.

12 Density Estimation: Histogram. p(x) ≈ (k/n) / V. If the regions R_i do not overlap, we have a histogram. (Figure: three non-overlapping bins R_1, R_2, R_3, each with its own constant density estimate.)
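
A minimal sketch of this histogram estimator in 1D, assuming NumPy; the sample values and the bin edges below are placeholders chosen for illustration.

import numpy as np

def histogram_density(x, samples, edges):
    """p(x) = (k/n) / V, using the non-overlapping bin that contains x."""
    samples = np.asarray(samples, dtype=float)
    n = len(samples)
    for lo, hi in zip(edges[:-1], edges[1:]):
        if lo <= x < hi:
            k = np.sum((samples >= lo) & (samples < hi))   # samples inside the bin
            return (k / n) / (hi - lo)                     # normalize by bin volume
    return 0.0   # x falls outside all bins

samples = [2, 3, 4, 8, 10, 11, 12]   # hypothetical 1D data
edges = [0, 5, 10, 15]               # three non-overlapping regions
print([histogram_density(x, samples, edges) for x in (3, 8, 11)])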

13 Density Estimation: Accuracy. How accurate is the density approximation p(x) ≈ (k/n) / V? We have made two approximations: 1. ρ̂ = k/n: as n increases, this estimate becomes more accurate. 2. ∫_R p(x') dx' ≈ p(x) V: as R grows smaller, this approximation becomes more accurate. As we shrink R we have to make sure it contains samples, otherwise our estimated p(x) = 0 for all x in R. Thus, in theory, if we have an unlimited number of samples, we get convergence as we simultaneously increase the number of samples and shrink region R, but not too much, so that R still contains a lot of samples.

14 Density Estimation: Accuracy. p(x) ≈ (k/n) / V. In practice, the number of samples n is always fixed. Thus the only available option for increasing the accuracy is decreasing the size of R (V gets smaller). But if V is too small, p(x) = 0 for most x, because most regions will contain no samples. Thus we have to find a compromise for V: not too small, so that it contains enough samples, but also not too large, so that p(x) is approximately constant inside it.

15 Density Estimation: Two Approaches. p(x) ≈ (k/n) / V. 1. Parzen Windows: choose a fixed value for the volume V and determine the corresponding k from the data. 2. k-Nearest Neighbors: choose a fixed value for k and determine the corresponding volume V from the data. Under appropriate conditions, and as the number of samples goes to infinity, both methods can be shown to converge to the true p(x).

16 Parzen Windows. p(x) ≈ (k/n) / V, where x is inside some region R, k is the number of samples inside R, n is the total number of samples, and V is the volume of R. To estimate the density at point x, simply center the region R at x, count the number of samples in R, and substitute everything into our formula. (Figure: with n = 6 samples and 3 of them inside R, p(x) ≈ (3/6) / V.)

17 Parzen Windows. In the Parzen-window approach to estimating densities we fix the size and shape of the region R. Let us assume that the region R is a d-dimensional hypercube with side length h; thus its volume is h^d. (Figure: R as an interval, a square, and a cube of side h in 1, 2, and 3 dimensions.)

18 Parzen Windows. Let u = [u_1, u_2, ..., u_d] and define a window function φ(u) = 1 if |u_j| ≤ 1/2 for all j = 1, ..., d, and φ(u) = 0 otherwise. In 1 dimension, φ(u) is 1 inside the interval [−1/2, 1/2] and 0 outside; in 2 dimensions, φ(u) is 1 inside the unit square centered at the origin.
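
A minimal sketch of this hypercube window in code (assuming NumPy; the name phi is just for illustration):

import numpy as np

def phi(u):
    """Hypercube window: 1 if every coordinate satisfies |u_j| <= 1/2, else 0."""
    return 1.0 if np.all(np.abs(np.atleast_1d(u)) <= 0.5) else 0.0

print(phi(0.3))          # 1.0: inside the unit interval centered at 0
print(phi([0.3, 0.6]))   # 0.0: the second coordinate falls outside the unit square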

19 Parzen Windows. Recall we have n d-dimensional samples x_1, x_2, ..., x_n. Let x_ij be the jth coordinate of sample x_i. Then φ((x − x_i)/h) = 1 if |x_j − x_ij| ≤ h/2 for all j = 1, ..., d, and 0 otherwise. In other words, φ((x − x_i)/h) = 1 if x_i is inside the hypercube with width h centered at x, and 0 otherwise.

20 Parzen Windows. How do we count the total number of sample points x_1, x_2, ..., x_n which are inside the hypercube with side h centered at x? k = Σ_{i=1..n} φ((x − x_i)/h). Recall p(x) ≈ (k/n) / V with V = h^d. Thus we get the desired analytical expression for the estimate of the density p(x): p(x) = (1/n) Σ_{i=1..n} (1/h^d) φ((x − x_i)/h).

21 Parzen Windows. p(x) = (1/n) Σ_i (1/h^d) φ((x − x_i)/h). Let's make sure p(x) is in fact a density: 1. p(x) ≥ 0 for all x, since φ(u) ≥ 0; 2. ∫ p(x) dx = (1/n) Σ_i (1/h^d) ∫ φ((x − x_i)/h) dx = (1/n) Σ_i (1/h^d) h^d = 1, because ∫ φ((x − x_i)/h) dx = h^d, the volume of the hypercube.

22 Parzen Windows: Example in 1D. p(x) = (1/n) Σ_i (1/h) φ((x − x_i)/h). Suppose we have 7 samples D = {2, 3, 4, 8, 10, 11, 12}. Let the window width be h = 3 and estimate the density at x = 1: only x_1 = 2 satisfies |1 − x_i| ≤ 3/2, so p(1) = (1/7)(1/3) = 1/21.
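
A self-contained sketch of the hypercube Parzen estimate p(x) = (1/n) Σ_i (1/h^d) φ((x − x_i)/h), assuming NumPy; it reproduces the hand computation above for D = {2, 3, 4, 8, 10, 11, 12}, h = 3, x = 1.

import numpy as np

def phi_box(u):
    # hypercube window: 1 if all coordinates satisfy |u_j| <= 1/2, else 0
    return 1.0 if np.all(np.abs(u) <= 0.5) else 0.0

def parzen_box_density(x, samples, h):
    """Parzen estimate with a hypercube window of side h."""
    samples = np.asarray(samples, dtype=float).reshape(len(samples), -1)
    x = np.asarray(x, dtype=float).reshape(-1)
    n, d = samples.shape
    k = sum(phi_box((x - xi) / h) for xi in samples)   # samples inside the cube at x
    return k / (n * h**d)

D = [2, 3, 4, 8, 10, 11, 12]
print(parzen_box_density([1.0], D, h=3.0))   # 1/21 ≈ 0.0476, matching the slide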

23 Parzen Windows: Sum of Functions. Now let's look at our density estimate p(x) again: p(x) = (1/n) Σ_i (1/h^d) φ((x − x_i)/h), where φ((x − x_i)/h) = 1 if x is inside the hypercube of side h centered at x_i, and 0 otherwise. Thus p(x) is just a sum of box-like functions, each of height 1/(n h^d).

24 Parzen Windows: Example in 1D. Let's come back to our example: 7 samples D = {2, 3, 4, 8, 10, 11, 12}, h = 3. p(x) = (1/7) Σ_i (1/3) φ((x − x_i)/3). To see what the function looks like, we need to generate 7 boxes and add them up. Each box has width h = 3 and height 1/(n h) = 1/21.

25 Parzen Windows: Interpolation. In essence, the window function is used for interpolation: each sample x_i contributes to the resulting density at x if x is close enough to x_i.

26 Parzen Windows: Drawbacks of the Hypercube. As long as sample point x_i and x are in the same hypercube, the contribution of x_i to the density at x is constant, regardless of how close x_i is to x. As a result, the density p(x) is not smooth: it has discontinuities.

27 Parzen Windows: General φ. p(x) = (1/n) Σ_i (1/h^d) φ((x − x_i)/h). We can use a general window φ as long as the resulting p(x) is a legitimate density, i.e.: 1. p(x) ≥ 0, satisfied if φ(u) ≥ 0; 2. ∫ p(x) dx = 1, satisfied if ∫ φ(u) du = 1. Indeed, ∫ p(x) dx = (1/n) Σ_i (1/h^d) ∫ φ((x − x_i)/h) dx = (1/n) Σ_i (1/h^d) h^d ∫ φ(u) du = 1, changing coordinates to u = (x − x_i)/h, so that dx = h^d du.

28 Parzen Windows: General φ. p(x) = (1/n) Σ_i (1/h^d) φ((x − x_i)/h). Notice that with a general window we are no longer counting the number of samples inside R: we are computing a weighted average over potentially every single sample point (although only those within distance roughly h of x have any significant weight). With an infinite number of samples, and under appropriate conditions, it can still be shown that p_n(x) converges to p(x).

29 Parzen Windows: Gaussian φ. p(x) = (1/n) Σ_i (1/h^d) φ((x − x_i)/h). A popular choice for φ is the N(0,1) density, φ(u) = (1/√(2π)) exp(−u²/2). This solves both drawbacks of the box window: points x which are close to the sample point x_i receive higher weight, and the resulting density p(x) is smooth.
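
A minimal 1D sketch of the Parzen estimate with a Gaussian window (assuming NumPy; the function and variable names are illustrative):

import numpy as np

def parzen_gauss_density(x, samples, h):
    """1D estimate p(x) = (1/(n*h)) * sum_i N(0,1)-window((x - x_i)/h)."""
    samples = np.asarray(samples, dtype=float)
    u = (x - samples) / h
    return np.sum(np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)) / (len(samples) * h)

D = [2, 3, 4, 8, 10, 11, 12]   # the slides' 1D sample set
print(parzen_gauss_density(9.0, D, h=1.0))   # smooth, nonzero even between samples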

30 Parzen Windows: Example with General φ. Let's come back to our example: 7 samples D = {2, 3, 4, 8, 10, 11, 12}, now with h = 1. p(x) = (1/7) Σ_{i=1..7} φ(x − x_i). p(x) is the sum of 7 Gaussians, each centered at one of the sample points and each scaled by 1/7.

31 Parzen Windows: Did We Solve the Problem? Let's test whether we solved the problem: 1. draw samples from a known distribution; 2. use our density approximation method and compare with the true density. We will vary the number of samples n and the window size h, and experiment with 2 distributions: N(0,1), and a mixture of a triangle and a uniform.
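
A sketch of this experiment for the N(0,1) case, assuming NumPy and a Gaussian window; the grid-based mean absolute error used below is just one convenient proxy for closeness to the true density.

import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(-4, 4, 201)
true_density = np.exp(-0.5 * grid**2) / np.sqrt(2.0 * np.pi)

def parzen_gauss(xs, samples, h):
    # evaluate the Gaussian-window Parzen estimate at every point of xs
    u = (xs[:, None] - samples[None, :]) / h
    return np.mean(np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi), axis=1) / h

for n in (10, 100, 1000):            # vary the number of samples
    samples = rng.standard_normal(n)
    for h in (1.0, 0.5, 0.1):        # vary the window width
        err = np.mean(np.abs(parzen_gauss(grid, samples, h) - true_density))
        print(f"n={n:4d}  h={h:.2f}  mean abs error={err:.4f}")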

32 Parzen Windows: True Density is N(0,1). (Figure: Parzen estimates for a small number of samples with window widths h = 1, h = 0.5, and h = 0.1.)

33 Parzen Windows: True Density is N(0,1), n = 100. (Figure: Parzen estimates with window widths h = 1, h = 0.5, and h = 0.1.)

34 Parzen Windows: True Density is a Mixture of Uniform and Triangle. (Figure: Parzen estimates for a small number of samples with window widths h = 1, h = 0.5, and h = 0.2.)

35 Parzen Windows: True Density is a Mixture of Uniform and Triangle, with more samples. (Figure: Parzen estimates for several window widths.)

36 Parzen Windows: Effect of Window Width h. By choosing h we are guessing the region where the density is approximately constant. Without knowing anything about the distribution, it is really hard to guess where the density is approximately constant.

37 Parzen Windows: Effect of Window Width h. If h is small, we superimpose sharp pulses centered at the data: each sample point influences too small a range of x, and with too little smoothing the result looks noisy and not smooth enough. If h is large, we superimpose broad, slowly changing functions: each sample point influences too large a range of x, and with too much smoothing the result looks oversmoothed or "out of focus". Finding the best h is challenging, and indeed no single h may work well; we may need to adapt h for different sample points. However, we can try to learn the best h to use from our labeled data.

38 Learning Window Width h From Labeled Data. Divide the labeled data into a training set, a validation set, and a test set. For a range of different values of h (possibly using binary search), construct the density estimate p(x) using Parzen windows. Test the classification performance on the validation set for each value of h you tried. For the final density estimate, choose the h giving the smallest error on the validation set. Now you can test the performance of the classifier on the test set. Notice we need the validation set to find the best parameter h; we can't use the test set for this, because the test set cannot be used for training. In general, we need a validation set whenever our classifier has tunable parameters.
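
A sketch of this procedure for a two-class 1D problem, assuming NumPy and a Gaussian window; the data split, the class priors, and the candidate h values below are made-up placeholders, not values from the slides.

import numpy as np

def parzen_density(x, samples, h):
    # 1D Parzen estimate with a Gaussian window
    u = (x - np.asarray(samples, dtype=float)) / h
    return np.sum(np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)) / (len(samples) * h)

def classify(x, train_by_class, priors, h):
    # maximum posterior: pick the class with the largest prior * class-conditional density
    scores = {c: priors[c] * parzen_density(x, s, h) for c, s in train_by_class.items()}
    return max(scores, key=scores.get)

def error_rate(xs, ys, train_by_class, priors, h):
    return np.mean([classify(x, train_by_class, priors, h) != y for x, y in zip(xs, ys)])

# hypothetical training / validation / test split for classes 0 and 1
train_by_class = {0: [1.0, 1.5, 2.2, 2.8], 1: [5.1, 5.9, 6.4, 7.0]}
priors = {0: 0.5, 1: 0.5}
val_x, val_y = [1.8, 2.5, 5.5, 6.8], [0, 0, 1, 1]
test_x, test_y = [1.2, 6.1], [0, 1]

candidates = (0.1, 0.3, 1.0, 3.0)   # range of window widths to try
best_h = min(candidates, key=lambda h: error_rate(val_x, val_y, train_by_class, priors, h))
print("chosen h:", best_h)
print("test error:", error_rate(test_x, test_y, train_by_class, priors, best_h))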

39 Parzen Windows: Classification Example. In classifiers based on Parzen-window estimation, we estimate the densities for each category and classify a test point by the label corresponding to the maximum posterior. The decision region for a Parzen-window classifier depends upon the choice of window function, as illustrated in the following figure.
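
Written out, the decision rule described here, with the Parzen estimate used for each class-conditional density (n_j denotes the number of training samples in class c_j):

$$\hat{c}(x) \;=\; \arg\max_{j}\; \hat{p}(x \mid c_j)\,P(c_j),
\qquad
\hat{p}(x \mid c_j) \;=\; \frac{1}{n_j h^d}\sum_{x_i \in c_j} \varphi\!\left(\frac{x - x_i}{h}\right).$$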

40 Parzen Windows: Classification Example. For a small enough window size h the classification on the training data is perfect. However, the decision boundaries are complex and this solution is not likely to generalize well to novel data. For a larger window size h, classification on the training data is not perfect. However, the decision boundaries are simpler and this solution is more likely to generalize well to novel data.

41 Parzen Windows: Summary. Advantages: can be applied to data from any distribution; in theory, can be shown to converge as the number of samples goes to infinity. Disadvantages: the number of training samples is limited in practice, so choosing the appropriate window size h is difficult; may need a large number of samples for accurate estimates; computationally heavy, since to classify one point we have to compute p(x) = (1/n) Σ_i (1/h^d) φ((x − x_i)/h), which potentially depends on all n samples. And we need a lot of samples for accurate density estimation!
