Lecture 3: Principal Components Analysis (PCA)

1 Lecture 3: Principal Components Analysis (PCA) Reading: Sections 6.3.1, 10.1, 10.2, 10.4 STATS 202: Data mining and analysis Jonathan Taylor, 9/28 Slide credits: Sergio Bacallado 1 / 24

2 The bias-variance decomposition The inputs $x_1, \dots, x_n$ are fixed, and a test point $x_0$ is also fixed. $y_i = f(x_i) + \varepsilon_i$, with the $\varepsilon_i$ i.i.d. with mean 0. A regression method fit to $(x_1, y_1), \dots, (x_n, y_n)$ produces the estimate $\hat{f}$. Then, the Mean Squared Error at $x_0$ satisfies: $\mathrm{MSE}(x_0) = E(y_0 - \hat{f}(x_0))^2 = \mathrm{Var}(\hat{f}(x_0)) + [\mathrm{Bias}(\hat{f}(x_0))]^2 + \mathrm{Var}(\varepsilon)$. 2 / 24

3 The bias-variance decomposition The inputs $x_1, \dots, x_n$ are fixed, and a test point $x_0$ is also fixed. $y_i = f(x_i) + \varepsilon_i$, with the $\varepsilon_i$ i.i.d. with mean 0. A regression method fit to $(x_1, y_1), \dots, (x_n, y_n)$ produces the estimate $\hat{f}$. Then, the Mean Squared Error at $x_0$ satisfies: $\mathrm{MSE}(x_0) = E(y_0 - \hat{f}(x_0))^2 = \mathrm{Var}(\hat{f}(x_0)) + [\mathrm{Bias}(\hat{f}(x_0))]^2 + \mathrm{Var}(\varepsilon)$. Both variance and squared bias are always positive, so to minimize the MSE, you must reach a tradeoff between bias and variance. 2 / 24
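
The decomposition can be checked numerically. Below is a minimal NumPy sketch (not from the slides): the true function, the noise level, and the polynomial estimator are illustrative assumptions, and the three terms are estimated by Monte Carlo at a fixed test point $x_0$.

```python
import numpy as np

rng = np.random.default_rng(0)

f = lambda x: np.sin(2 * x)      # assumed "true" regression function
sigma = 0.3                      # noise standard deviation, Var(eps) = sigma^2
x = np.linspace(0, 3, 30)        # fixed training inputs x_1, ..., x_n
x0 = 1.5                         # fixed test point
degree = 3                       # flexibility of the assumed fitting method

preds = []
for _ in range(5000):            # refit on fresh noise many times
    y = f(x) + rng.normal(0, sigma, size=x.size)
    coefs = np.polyfit(x, y, degree)        # least-squares polynomial fit
    preds.append(np.polyval(coefs, x0))     # prediction f_hat(x0)
preds = np.array(preds)

variance = preds.var()                      # Var(f_hat(x0))
bias_sq = (preds.mean() - f(x0)) ** 2       # [Bias(f_hat(x0))]^2
noise = sigma ** 2                          # Var(eps)
print(variance + bias_sq + noise)           # sum of the three terms

y0 = f(x0) + rng.normal(0, sigma, size=preds.size)
print(np.mean((y0 - preds) ** 2))           # direct Monte Carlo estimate of the MSE
```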

4 [Figure: MSE, Bias, and Var plotted against Flexibility in three panels: squiggly f with high noise; linear f with high noise; squiggly f with low noise.] 3 / 24

5 Classification problems In a classification setting, the output takes values in a discrete set. For example, if we are predicting the brand of a car based on a number of variables, the function f takes values in the set {Ford, Toyota, Mercedes-Benz, ... }. 4 / 24

6 Classification problems In a classification setting, the output takes values in a discrete set. For example, if we are predicting the brand of a car based on a number of variables, the function f takes values in the set {Ford, Toyota, Mercedes-Benz, ... }. The model $Y = f(X) + \varepsilon$ becomes insufficient, as $Y$ is not necessarily real-valued. 4 / 24

8 Classification problems In a classification setting, the output takes values in a discrete set. For example, if we are predicting the brand of a car based on a number of variables, the function f takes values in the set {Ford, Toyota, Mercedes-Benz, ... }. We will use slightly different notation: $P(X, Y)$: joint distribution of $(X, Y)$; $P(Y \mid X)$: conditional distribution of $Y$ given $X$; $\hat{y}_i$: prediction for $x_i$. 4 / 24

9 Loss function for classification There are many ways to measure the error of a classification prediction. One of the most common is the 0-1 loss: $E(\mathbf{1}(y_0 \neq \hat{y}_0))$. 5 / 24

10 Loss function for classification There are many ways to measure the error of a classification prediction. One of the most common is the 0-1 loss: $E(\mathbf{1}(y_0 \neq \hat{y}_0))$. Like the MSE, this quantity can be estimated from training and test data by taking a sample average: $\frac{1}{n}\sum_{i=1}^{n}\mathbf{1}(y_i \neq \hat{y}_i)$. 5 / 24
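
As a concrete illustration, the sample average of the 0-1 loss is simply the misclassification rate. A minimal sketch, with made-up labels and predictions:

```python
import numpy as np

# Hypothetical true labels and predictions for 5 test cars.
y_true = np.array(["Ford", "Toyota", "Ford", "Mercedes-Benz", "Toyota"])
y_pred = np.array(["Ford", "Ford",   "Ford", "Mercedes-Benz", "Toyota"])

# Sample average of the 0-1 loss: the fraction of points with y_i != y_hat_i.
error_rate = np.mean(y_true != y_pred)
print(error_rate)   # 0.2
```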

11 Bayes classifier [Figure 2.13; axes: X1, X2.] In practice, we never know the joint probability P. However, we can assume that it exists. 6 / 24

12 Bayes classifier [Figure 2.13; axes: X1, X2.] In practice, we never know the joint probability P. However, we can assume that it exists. The Bayes classifier assigns: $\hat{y}_i = \operatorname*{argmax}_j P(Y = j \mid X = x_i)$. It can be shown that this is the best classifier under the 0-1 loss. 6 / 24
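
A minimal sketch of the Bayes rule for a case in which the joint distribution is known by construction; the priors and the Gaussian class-conditional densities below are illustrative assumptions, not part of the slides.

```python
import numpy as np
from scipy.stats import norm

# Assumed generative model, so P(X, Y) is known exactly:
#   P(Y = 0) = 0.6,  X | Y = 0 ~ N(-1, 1)
#   P(Y = 1) = 0.4,  X | Y = 1 ~ N(+1, 1)
priors = np.array([0.6, 0.4])
means = np.array([-1.0, 1.0])

def bayes_classifier(x):
    # The posterior P(Y = j | X = x) is proportional to prior_j * density_j(x);
    # the Bayes classifier picks the j with the largest posterior.
    posteriors = priors * norm.pdf(x, loc=means, scale=1.0)
    return int(np.argmax(posteriors))

print(bayes_classifier(0.3))   # class with the highest posterior at x = 0.3
```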

13 Principal Components Analysis This is the most popular unsupervised procedure ever. Invented by Karl Pearson (1901). Developed by Harold Hotelling (1933). 7 / 24

14 Principal Components Analysis This is the most popular unsupervised procedure ever. Invented by Karl Pearson (1901). Developed by Harold Hotelling (1933). Stanford pride! 7 / 24

15 Principal Components Analysis This is the most popular unsupervised procedure ever. Invented by Karl Pearson (1901). Developed by Harold Hotelling (1933). What does it do? It provides a way to visualize high-dimensional data, summarizing the most important information. 7 / 24

16 What is PCA good for? [Figure: the four variables Murder, Assault, UrbanPop, and Rape.] 8 / 24

17 What is PCA good for? [Figure: biplot on the first two principal components, showing U.S. states together with the variables Murder, Assault, UrbanPop, and Rape. Axes: First Principal Component, Second Principal Component.] 8 / 24

18 What is the first principal component? It is the vector which passes the closest to a cloud of samples, in terms of squared Euclidean distance. [Figure: scatterplot of Ad Spending vs. Population.] 9 / 24

19 i.e. the green direction minimizes the average squared length of the dotted lines. [Figure: Ad Spending vs. Population with the 1st Principal Component and 2nd Principal Component directions.] 10 / 24
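
This least-squares view can be checked directly. Below is a small sketch on assumed synthetic two-variable data: the first principal component computed from the SVD attains the smallest average squared orthogonal distance among a dense grid of candidate directions through the origin.

```python
import numpy as np

rng = np.random.default_rng(1)
# Assumed synthetic 2-D data (think "Population" vs. "Ad Spending"), then centered.
X = rng.multivariate_normal([20.0, 10.0], [[9.0, 6.0], [6.0, 9.0]], size=200)
X = X - X.mean(axis=0)

def avg_sq_orth_dist(X, phi):
    # Average squared distance from each point to the line spanned by unit vector phi.
    proj = X @ phi
    return np.mean(np.sum(X**2, axis=1) - proj**2)

# First principal component direction from the SVD of the centered data.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
phi1 = Vt[0]

# Compare against many candidate unit directions.
angles = np.linspace(0.0, np.pi, 500)
candidates = np.column_stack([np.cos(angles), np.sin(angles)])
best_candidate = min(avg_sq_orth_dist(X, v) for v in candidates)

print(avg_sq_orth_dist(X, phi1) <= best_candidate + 1e-9)   # True
```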

20 What does this look like with 3 variables? The first two principal components span a plane which is closest to the data. [Figure: 3-D point cloud with the plane spanned by the first principal component and the second principal component.] 11 / 24

21 A second interpretation The projection onto the first principal component is the one with the highest variance. [Figure: Ad Spending vs. Population with the 1st Principal Component and 2nd Principal Component directions.] 12 / 24

22 How do we say this in math? Let X be a data matrix with n samples and p variables. 13 / 24

23 How do we say this in math? Let X be a data matrix with n samples and p variables. From each variable, we subtract the mean of the column; i.e. we center the variables. 13 / 24

24 How do we say this in math? Let X be a data matrix with n samples and p variables. From each variable, we subtract the mean of the column; i.e. we center the variables. To find the first principal component $\phi_1 = (\phi_{11}, \dots, \phi_{p1})$, we solve the following optimization. 13 / 24

25 How do we say this in math? Let X be a data matrix with n samples and p variables. From each variable, we subtract the mean of the column; i.e. we center the variables. To find the first principal component $\phi_1 = (\phi_{11}, \dots, \phi_{p1})$, we solve the following optimization:
$$\max_{\phi_{11},\dots,\phi_{p1}} \; \frac{1}{n}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p}\phi_{j1}x_{ij}\Big)^{2} \quad \text{subject to} \quad \sum_{j=1}^{p}\phi_{j1}^{2} = 1.$$
13 / 24

26 How do we say this in math? Let X be a data matrix with n samples and p variables. From each variable, we subtract the mean of the column; i.e. we center the variables. To find the first principal component $\phi_1 = (\phi_{11}, \dots, \phi_{p1})$, we solve the following optimization:
$$\max_{\phi_{11},\dots,\phi_{p1}} \; \frac{1}{n}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p}\phi_{j1}x_{ij}\Big)^{2} \quad \text{subject to} \quad \sum_{j=1}^{p}\phi_{j1}^{2} = 1.$$
The inner sum $\sum_{j=1}^{p}\phi_{j1}x_{ij}$ is the projection of the ith sample onto $\phi_1$, also known as the score $z_{i1}$.
13 / 24

27 How do we say this in math? Let X be a data matrix with n samples and p variables. From each variable, we subtract the mean of the column; i.e. we center the variables. To find the first principal component $\phi_1 = (\phi_{11}, \dots, \phi_{p1})$, we solve the following optimization:
$$\max_{\phi_{11},\dots,\phi_{p1}} \; \frac{1}{n}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p}\phi_{j1}x_{ij}\Big)^{2} \quad \text{subject to} \quad \sum_{j=1}^{p}\phi_{j1}^{2} = 1.$$
The objective is the variance of the n samples projected onto $\phi_1$.
13 / 24
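
A minimal NumPy sketch of this recipe on assumed random data: center the columns, take the first right singular vector as $\phi_1$, and check that it has unit norm and that the objective equals the variance of the scores.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))        # assumed data: n = 100 samples, p = 4 variables
X = X - X.mean(axis=0)               # center each variable (column)

U, S, Vt = np.linalg.svd(X, full_matrices=False)
phi1 = Vt[0]                         # first principal component (loading vector)
z1 = X @ phi1                        # scores z_{i1}: projections onto phi1

print(np.isclose(np.sum(phi1**2), 1.0))              # constraint: sum_j phi_{j1}^2 = 1
print(np.isclose(np.mean(z1**2), S[0]**2 / len(X)))  # objective = (1/n) * Sigma_11^2
```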

28 How do we say this in math? To find the second principal component $\phi_2 = (\phi_{12}, \dots, \phi_{p2})$, we solve the following optimization. 14 / 24

29 How do we say this in math? To find the second principal component $\phi_2 = (\phi_{12}, \dots, \phi_{p2})$, we solve the following optimization:
$$\max_{\phi_{12},\dots,\phi_{p2}} \; \frac{1}{n}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p}\phi_{j2}x_{ij}\Big)^{2} \quad \text{subject to} \quad \sum_{j=1}^{p}\phi_{j2}^{2} = 1 \ \text{ and } \ \sum_{j=1}^{p}\phi_{j1}\phi_{j2} = 0.$$
14 / 24

30 How do we say this in math? To find the second principal component $\phi_2 = (\phi_{12}, \dots, \phi_{p2})$, we solve the following optimization:
$$\max_{\phi_{12},\dots,\phi_{p2}} \; \frac{1}{n}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p}\phi_{j2}x_{ij}\Big)^{2} \quad \text{subject to} \quad \sum_{j=1}^{p}\phi_{j2}^{2} = 1 \ \text{ and } \ \sum_{j=1}^{p}\phi_{j1}\phi_{j2} = 0.$$
First and second principal components must be orthogonal.
14 / 24

31 How do we say this in math? To find the second principal component $\phi_2 = (\phi_{12}, \dots, \phi_{p2})$, we solve the following optimization:
$$\max_{\phi_{12},\dots,\phi_{p2}} \; \frac{1}{n}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p}\phi_{j2}x_{ij}\Big)^{2} \quad \text{subject to} \quad \sum_{j=1}^{p}\phi_{j2}^{2} = 1 \ \text{ and } \ \sum_{j=1}^{p}\phi_{j1}\phi_{j2} = 0.$$
First and second principal components must be orthogonal. Equivalent to saying that the scores $(z_{11},\dots,z_{n1})$ and $(z_{12},\dots,z_{n2})$ are uncorrelated.
14 / 24
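
A small sketch (again on assumed random data) that checks both constraints numerically: the two loading vectors are orthogonal and the corresponding score vectors are uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))
X = X - X.mean(axis=0)                   # centered data matrix

_, _, Vt = np.linalg.svd(X, full_matrices=False)
phi1, phi2 = Vt[0], Vt[1]                # first two principal components
z1, z2 = X @ phi1, X @ phi2              # first two score vectors

print(np.isclose(phi1 @ phi2, 0.0))                        # orthogonal loadings
print(np.isclose(np.cov(z1, z2)[0, 1], 0.0, atol=1e-8))    # uncorrelated scores
```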

32 Solving the optimization This optimization is fundamental in linear algebra. It is satisfied by either: 15 / 24

33 Solving the optimization This optimization is fundamental in linear algebra. It is satisfied by either: The singular value decomposition (SVD) of X: $X = U\Sigma\Phi^{T}$, where the ith column of $\Phi$ is the ith principal component $\phi_i$, and the ith column of $U\Sigma$ is the ith vector of scores $(z_{1i}, \dots, z_{ni})$. 15 / 24

34 Solving the optimization This optimization is fundamental in linear algebra. It is satisfied by either: The singular value decomposition (SVD) of X: $X = U\Sigma\Phi^{T}$, where the ith column of $\Phi$ is the ith principal component $\phi_i$, and the ith column of $U\Sigma$ is the ith vector of scores $(z_{1i}, \dots, z_{ni})$. The eigendecomposition of $X^{T}X$: $X^{T}X = \Phi\Sigma^{2}\Phi^{T}$. 15 / 24
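
A sketch verifying on assumed random data that the two routes agree: the eigenvalues of $X^TX$ equal the squared singular values, and the eigenvectors match the right singular vectors up to sign.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 4))
X = X - X.mean(axis=0)                          # centered data matrix

# Route 1: SVD of X. The columns of Vt.T are the principal components;
# the columns of U * Sigma are the score vectors.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S

# Route 2: eigendecomposition of X^T X = Phi Sigma^2 Phi^T.
eigvals, eigvecs = np.linalg.eigh(X.T @ X)      # ascending order
order = np.argsort(eigvals)[::-1]               # reorder to match the SVD (descending)
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print(np.allclose(eigvals, S**2))               # Sigma^2 on the diagonal
# Eigenvectors equal the right singular vectors up to a sign flip per column.
print(np.allclose(np.abs(np.sum(Vt * eigvecs.T, axis=1)), 1.0))
```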

35 PCA in practice: The biplot [Figure: biplot on the first two principal components, showing U.S. states together with the variables Murder, Assault, UrbanPop, and Rape. Axes: First Principal Component, Second Principal Component.] 16 / 24

36 Scaling the variables Most of the time, we don't care about the absolute numerical value of a variable. 17 / 24

37 Scaling the variables Most of the time, we don't care about the absolute numerical value of a variable. We care about its value relative to the spread observed in the sample. Before PCA, in addition to centering each variable, we also multiply it by a constant to make its variance equal to 1. 17 / 24
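
A small sketch of this preprocessing step on assumed data whose variables have very different scales; after dividing each centered column by its standard deviation, no single variable dominates the first loading vector.

```python
import numpy as np

rng = np.random.default_rng(4)
# Assumed data with very different scales (one variable in the hundreds,
# the others much smaller), mimicking the crime-data example.
X = np.column_stack([
    rng.normal(8, 4, size=50),       # small-scale variable
    rng.normal(170, 80, size=50),    # large-scale variable
    rng.normal(65, 15, size=50),
    rng.normal(20, 9, size=50),
])

X = X - X.mean(axis=0)               # center each variable
X_scaled = X / X.std(axis=0)         # scale each variable to variance 1

for name, M in [("unscaled", X), ("scaled", X_scaled)]:
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    # Unscaled: the first loading is dominated by the high-variance column.
    # Scaled: the loadings are far more balanced.
    print(name, np.round(Vt[0], 2))
```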

38 Example: scaled vs. unscaled PCA [Figure: two biplots, Scaled and Unscaled, each showing the variables Murder, Assault, UrbanPop, and Rape on the first two principal components. Axes: First Principal Component, Second Principal Component.] 18 / 24

39 Scaling the variables In special cases, we have variables measured in the same units; e.g. gene expression levels for different genes. 19 / 24

40 Scaling the variables In special cases, we have variables measured in the same units; e.g. gene expression levels for different genes. Therefore, we care about the absolute value of the variables and we can perform PCA without scaling. 19 / 24

41 How many principal components are enough? [Figure: the four variables Murder, Assault, UrbanPop, and Rape.] 20 / 24

42 How many principal components are enough? [Figure: biplot on the first two principal components, showing U.S. states together with the variables Murder, Assault, UrbanPop, and Rape.] We said 2 principal components capture most of the relevant information. But how can we tell? 20 / 24

43 The proportion of variance explained We can think of the top principal components as directions in space in which the data vary the most. 21 / 24

44 The proportion of variance explained We can think of the top principal components as directions in space in which the data vary the most. The ith score vector $(z_{1i}, \dots, z_{ni})$ can be interpreted as a new variable. The variance of this variable decreases as we take i from 1 to p. 21 / 24

45 The proportion of variance explained We can think of the top principal components as directions in space in which the data vary the most. The ith score vector $(z_{1i}, \dots, z_{ni})$ can be interpreted as a new variable. The variance of this variable decreases as we take i from 1 to p. However, the total variance of the score vectors is the same as the total variance of the original variables:
$$\sum_{i=1}^{p}\frac{1}{n}\sum_{j=1}^{n} z_{ji}^{2} \;=\; \sum_{k=1}^{p}\mathrm{Var}(x_k).$$
21 / 24
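
This identity is easy to verify numerically; a sketch on assumed random data, using the $1/n$ convention for the variance as in the slides:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(60, 5))
X = X - X.mean(axis=0)                        # centered data matrix

_, _, Vt = np.linalg.svd(X, full_matrices=False)
Z = X @ Vt.T                                  # all p score vectors as columns

total_score_var = np.sum(np.mean(Z**2, axis=0))     # sum_i (1/n) sum_j z_{ji}^2
total_original_var = np.sum(np.mean(X**2, axis=0))  # sum_k Var(x_k), 1/n convention
print(np.isclose(total_score_var, total_original_var))   # True
```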

46 The proportion of variance explained We can think of the top principal components as directions in space in which the data vary the most. The ith score vector $(z_{1i}, \dots, z_{ni})$ can be interpreted as a new variable. The variance of this variable decreases as we take i from 1 to p. However, the total variance of the score vectors is the same as the total variance of the original variables:
$$\sum_{i=1}^{p}\frac{1}{n}\sum_{j=1}^{n} z_{ji}^{2} \;=\; \sum_{k=1}^{p}\mathrm{Var}(x_k).$$
We can quantify how much of the variance is captured by the first m principal components/score variables.
21 / 24

47 The proportion of variance explained The variance of the mth score variable is: $\frac{1}{n}\sum_{i=1}^{n} z_{im}^{2}$. 22 / 24

48 The proportion of variance explained The variance of the mth score variable is:
$$\frac{1}{n}\sum_{i=1}^{n} z_{im}^{2} \;=\; \frac{1}{n}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p}\phi_{jm}x_{ij}\Big)^{2}.$$
22 / 24

49 The proportion of variance explained The variance of the mth score variable is:
$$\frac{1}{n}\sum_{i=1}^{n} z_{im}^{2} \;=\; \frac{1}{n}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p}\phi_{jm}x_{ij}\Big)^{2} \;=\; \frac{1}{n}\Sigma_{mm}^{2}.$$
22 / 24

50 The proportion of variance explained The variance of the mth score variable is:
$$\frac{1}{n}\sum_{i=1}^{n} z_{im}^{2} \;=\; \frac{1}{n}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p}\phi_{jm}x_{ij}\Big)^{2} \;=\; \frac{1}{n}\Sigma_{mm}^{2}.$$
[Figure: Proportion of Variance Explained and Cumulative Proportion of Variance Explained plotted against the Principal Component index.]
22 / 24

51 The proportion of variance explained The variance of the mth score variable is:
$$\frac{1}{n}\sum_{i=1}^{n} z_{im}^{2} \;=\; \frac{1}{n}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p}\phi_{jm}x_{ij}\Big)^{2} \;=\; \frac{1}{n}\Sigma_{mm}^{2}.$$
[Figure: Proportion of Variance Explained and Cumulative Proportion of Variance Explained plotted against the Principal Component index; the plot of the proportion of variance explained is known as the scree plot.]
22 / 24
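
A sketch computing the proportion of variance explained from the singular values of the centered data (assumed synthetic here); these are the quantities plotted in the scree plot and its cumulative version.

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(50, 4)) @ np.diag([3.0, 2.0, 1.0, 0.5])   # assumed data
X = X - X.mean(axis=0)                    # center (and scale first, if appropriate)

_, S, _ = np.linalg.svd(X, full_matrices=False)
var_m = S**2 / len(X)                     # variance of the m-th score variable
pve = var_m / var_m.sum()                 # proportion of variance explained

print(np.round(pve, 3))                   # heights of the scree plot bars
print(np.round(np.cumsum(pve), 3))        # cumulative proportion of variance explained
```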

52 Generalizations of PCA PCA works under a Euclidean geometry in the space of variables. Often, the natural geometry is different: 23 / 24

53 Generalizations of PCA PCA works under a Euclidean geometry in the space of variables. Often, the natural geometry is different: We expect some variables to be closer to each other than to other variables. 23 / 24

54 Generalizations of PCA PCA works under a Euclidean geometry in the space of variables. Often, the natural geometry is different: We expect some variables to be closer to each other than to other variables. Some correlations between variables would be more surprising than others. 23 / 24

55 Generalizations of PCA PCA works under a Euclidean geometry in the space of variables. Often, the natural geometry is different: We expect some variables to be closer to each other than to other variables. Some correlations between variables would be more surprising than others. Examples: Variables are pixel values, samples are different images of the brain. We expect neighboring pixels to have stronger correlations. 23 / 24

56 Generalizations of PCA PCA works under a Euclidean geometry in the space of variables. Often, the natural geometry is different: We expect some variables to be closer to each other than to other variables. Some correlations between variables would be more surprising than others. Examples: Variables are pixel values, samples are different images of the brain. We expect neighboring pixels to have stronger correlations. Variables are rainfall measurements at different regions. We expect neighboring regions to have higher correlations. 23 / 24

57 Generalizations of PCA There are ways to include this knowledge in a PCA. See: 1. Susan Holmes. Multivariate Analysis, the French way. (2006). 2. Omar de la Cruz and Susan Holmes. An introduction to the duality diagram. (2011). 3. Stéphane Dray and Thibaut Jombart. Revisiting Guerry's data: Introducing spatial constraints in multivariate analysis. (2011). 4. Genevera Allen, Logan Grosenick, and Jonathan Taylor. A Generalized Least Squares Matrix Decomposition. (2011). 24 / 24
