15-381/781 Bayesian Nets & Probabilistic Inference


1 15-381/781 Bayesian Nets & Probabilistic Inference. Emma Brunskill (this time), Ariel Procaccia. With thanks to Dan Klein (Berkeley), Percy Liang (Stanford), and past instructors for some slide content, and Russell & Norvig.

2 What You Should Know: Define probabilistic inference. How to define a Bayes Net given a real example. How the joint can be used to answer any query. Complexity of exact inference. Approximate inference (direct sampling, likelihood weighting, Gibbs): be able to implement and run each algorithm, and compare the benefits and limitations of each.

3 Bayesian Network: a compact representation of the joint distribution in which conditional independence relationships are explicit. Each variable is conditionally independent of all its non-descendants in the graph given the value of its parents.

4 Joint Distribution. Ex. variables: Cloudy, Sprinkler, Rain, Wet Grass; domain of each variable: 2 (true or false). The joint encodes the probability of every combination of variables & values, e.g. P(Cloudy=false & Sprinkler=true & Rain=false & WetGrass=true). The slide's joint table begins:
  +c +s +r +w   .01
  +c +s +r -w   .01
  +c +s -r +w   .05
  +c +s -r -w   .1
with the remaining 12 rows (the other value combinations) shown only as # placeholders.

5 Joint as Product of Conditionals (Chain rule). The same 16-row joint table, written as a product:
= P(WetGrass | Cloudy, Sprinkler, Rain) * P(Rain | Cloudy, Sprinkler) * P(Sprinkler | Cloudy) * P(Cloudy)

6 Joint as Product of Conditionals. The same factorization,
= P(WetGrass | Cloudy, Sprinkler, Rain) * P(Rain | Cloudy, Sprinkler) * P(Sprinkler | Cloudy) * P(Cloudy),
shown alongside a network over Cloudy, Sprinkler, Rain, and Wet Grass... but there may be additional conditional independencies.

7 What if some variables are conditionally independent? [Slide shows two network structures over Cloudy, Sprinkler, Rain, and Wet Grass.] The graph explicitly shows any conditional independencies.

8 Conditional Independencies. The 16-row joint table factors into the network's CPTs (Cloudy → Sprinkler, Cloudy → Rain, {Sprinkler, Rain} → Wet Grass):
  P(C):        +c .5 | -c .5
  P(S | C):    +c +s .1 | +c -s .9 | -c +s .5 | -c -s .5
  P(R | C):    +c +r .8 | +c -r .2 | -c +r .2 | -c -r .8
  P(W | S,R):  +s +r +w .99 | +s +r -w .01 | +s -r +w .90 | +s -r -w .10 | -s +r +w .90 | -s +r -w .10 | -s -r +w 0 | -s -r -w 1.0

9 Bayesian Network: a compact representation of the joint distribution in which conditional independence relationships are explicit. It still represents the joint, so it can be used to answer any probabilistic query.

10 Probabilistic Inference: compute the probability of a query variable (or variables) taking on a value (or set of values) given some evidence: Pr[Q | E_1=e_1, ..., E_k=e_k]

11 Using the Joint To Answer Queries. The joint distribution is sufficient to answer any probabilistic inference question involving the variables described in the joint. We can take a Bayes Net, construct the full joint, and then look up the entries where the evidence variables take on the specified values.
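A minimal sketch of this enumeration idea, using the CPT values given on slide 8 (the variable names and the `joint` helper below are illustrative, not from the slides):

```python
# Sketch: answer P(Sprinkler=+s | Cloudy=+c, Rain=+r) by summing entries of the full joint.
# CPT values from slide 8.
P_C = 0.5
P_S = {True: 0.1, False: 0.5}                      # P(S=+s | C)
P_R = {True: 0.8, False: 0.2}                      # P(R=+r | C)
P_W = {(True, True): 0.99, (True, False): 0.90,    # P(W=+w | S, R)
       (False, True): 0.90, (False, False): 0.0}

def joint(c, s, r, w):
    """One entry of the full joint: the product of CPT entries (chain rule + BN independencies)."""
    p = P_C if c else 1 - P_C
    p *= P_S[c] if s else 1 - P_S[c]
    p *= P_R[c] if r else 1 - P_R[c]
    p *= P_W[(s, r)] if w else 1 - P_W[(s, r)]
    return p

# Sum the joint entries consistent with the evidence, then normalize.
num = sum(joint(True, True, True, w) for w in (True, False))                      # C=+c, S=+s, R=+r
den = sum(joint(True, s, True, w) for s in (True, False) for w in (True, False))  # C=+c, R=+r
print(num / den)  # 0.1 exactly: S is independent of R given C in this network
```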

12 But Constructing the Joint Is Expensive & Exact Inference Is NP-Hard

13 Solution: Approximate Inference. Use samples to approximate the posterior distribution Pr[Q | E_1=e_1, ..., E_k=e_k]. Last time: direct sampling, likelihood weighting. Today: Gibbs sampling.

14 Poll: Which Algorithm? Evidence: Cloudy=+c, Rain=+r. Query variable: Sprinkler, i.e. P(Sprinkler | Cloudy=+c, Rain=+r). Samples: (+c,+s,+r,-w), (+c,-s,-r,-w), (+c,+s,-r,+w), (+c,-s,+r,-w). What algorithm could've generated these samples? 1) Direct sampling 2) Likelihood weighting 3) Both 4) No clue

15 Direct Sampling Recap. Algorithm: 1. Create a topological order of the variables in the Bayes Net.

16 Topological Order: any ordering of the nodes of a directed acyclic graph in which a node can only appear after all of its ancestors in the graph. E.g., for the network Cloudy → {Sprinkler, Rain} → WetGrass: Cloudy, Sprinkler, Rain, WetGrass, or Cloudy, Rain, Sprinkler, WetGrass.
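As a small illustrative sketch (not part of the slides), Python's standard library can compute such an ordering directly from the parent sets of this network:

```python
from graphlib import TopologicalSorter

# Map each node to its set of parents; static_order() emits parents before children.
parents = {
    "Cloudy": set(),
    "Sprinkler": {"Cloudy"},
    "Rain": {"Cloudy"},
    "WetGrass": {"Sprinkler", "Rain"},
}
print(list(TopologicalSorter(parents).static_order()))
# e.g. ['Cloudy', 'Sprinkler', 'Rain', 'WetGrass'] (Sprinkler and Rain may appear in either order)
```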

17 Direct Sampling Recap. Algorithm: 1. Create a topological order of the variables in the Bayes Net. 2. Sample each variable conditioned on the values of its parents. 3. Use the samples which match the evidence variable values to estimate the probability of the query variable, e.g. P(Sprinkler=+s | Cloudy=+c, Rain=+r) ≈ (# samples with +s, +c, +r) / (# samples with +c, +r). Consistent in the limit of infinite samples. Inefficient (why?)
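A minimal runnable sketch of this recipe for the Cloudy/Sprinkler/Rain/WetGrass network (helper names are illustrative; CPT values are those from slide 8). It also makes the inefficiency visible: every sample that fails to match the evidence is thrown away.

```python
import random

# CPT values from slide 8
P_C = 0.5
P_S = {True: 0.1, False: 0.5}
P_R = {True: 0.8, False: 0.2}
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.0}

def prior_sample():
    """Sample each variable in topological order, conditioned on its parents."""
    c = random.random() < P_C
    s = random.random() < P_S[c]
    r = random.random() < P_R[c]
    w = random.random() < P_W[(s, r)]
    return c, s, r, w

def direct_estimate(n=100_000):
    """Estimate P(Sprinkler=+s | Cloudy=+c, Rain=+r) from samples matching the evidence."""
    matching = with_s = 0
    for _ in range(n):
        c, s, r, w = prior_sample()
        if c and r:              # keep only samples consistent with the evidence
            matching += 1
            with_s += s
    return with_s / matching if matching else float("nan")

print(direct_estimate())  # approaches 0.1; roughly 60% of the samples are discarded here
```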

18 Consistency: in the limit of infinite samples, the estimated Pr[Q | E_1=e_1, ..., E_k=e_k] will converge to the true posterior probability. A desirable property (otherwise we would always have some error).

19 Likelihood Weighting Recap
1. Create array TotalWeights; initialize the value of each array element to 0.
2. For j = 1:N
   a. w_tmp = 1
   b. Set the evidence variables in sample z = <z_1, ..., z_n> to their observed values
   c. For each variable Z_i in topological order:
      - If Z_i is an evidence variable: w_tmp = w_tmp * P(Z_i = e_i | Parents(Z_i) = z(Parents(Z_i)))
      - Else: sample z_i conditioned on the values of its parents
   d. Update the weight of the resulting sample: TotalWeights[z] = TotalWeights[z] + w_tmp
3. Use the weights to compute the probability of the query variable, e.g.
   P(Sprinkler=+s | Cloudy=+c, Rain=+r) ≈ Sum_{c,r,w} TotalWeights(+s,c,r,w) / Sum_{s,c,r,w} TotalWeights(s,c,r,w)
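A hedged sketch of the same procedure, specialized to the query P(Sprinkler=+s | Cloudy=+c, Rain=+r) and the slide-8 CPTs (helper names are illustrative). Since evidence variables are fixed rather than sampled, no sample is discarded; each one simply carries a weight.

```python
import random

# CPT values from slide 8
P_C = 0.5
P_S = {True: 0.1, False: 0.5}
P_R = {True: 0.8, False: 0.2}
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.0}

def weighted_sample(evidence):
    """Walk the variables in topological order: fix each evidence variable (multiplying its
    likelihood into the weight) and sample everything else given its parents."""
    w = 1.0
    if "C" in evidence:
        c = evidence["C"]; w *= P_C if c else 1 - P_C
    else:
        c = random.random() < P_C
    if "S" in evidence:
        s = evidence["S"]; w *= P_S[c] if s else 1 - P_S[c]
    else:
        s = random.random() < P_S[c]
    if "R" in evidence:
        r = evidence["R"]; w *= P_R[c] if r else 1 - P_R[c]
    else:
        r = random.random() < P_R[c]
    if "W" in evidence:
        wet = evidence["W"]; w *= P_W[(s, r)] if wet else 1 - P_W[(s, r)]
    else:
        wet = random.random() < P_W[(s, r)]
    return (c, s, r, wet), w

def lw_estimate(n=100_000):
    """Estimate P(S=+s | C=+c, R=+r) as (total weight of samples with +s) / (total weight)."""
    total = {True: 0.0, False: 0.0}
    for _ in range(n):
        (c, s, r, wet), w = weighted_sample({"C": True, "R": True})
        total[s] += w
    return total[True] / (total[True] + total[False])

print(lw_estimate())  # again approaches 0.1, but without discarding any samples
```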

20 LW Consistency. Consider the probability of getting a sample (z, e), where z is a set of values for the non-evidence variables and e is the values of the evidence variables: this is the sampling distribution for a weighted sample (WS). Is it the true posterior distribution P(z | e)? No. Why? It doesn't consider evidence that is not an ancestor of the sampled variables. The weights fix this!

21 Weighted Probability. The procedure samples each non-evidence variable Z_i according to its conditional given its parents; the weight of a sample is the product of the likelihoods of the evidence variables given their parents; and the weighted probability of a sample is the product of the two, which equals the joint by the chain rule & conditional independencies. (The slide's equations are reproduced below.)
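The equations themselves are in the slide figure; in the standard notation (following Russell & Norvig, with Z_i the non-evidence variables and E_j the evidence variables) they read:

```latex
% Sampling distribution of a weighted sample
S_{WS}(z, e) = \prod_{i} P\big(z_i \mid \mathrm{parents}(Z_i)\big)

% Weight attached to that sample
w(z, e) = \prod_{j} P\big(e_j \mid \mathrm{parents}(E_j)\big)

% Weighted probability of a sample: by the chain rule and the BN independencies
S_{WS}(z, e)\, w(z, e)
  = \prod_{i} P\big(z_i \mid \mathrm{parents}(Z_i)\big)\,
    \prod_{j} P\big(e_j \mid \mathrm{parents}(E_j)\big)
  = P(z, e)
```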

22 Does Likelihood Weighting Produce Consistent Estimates? Yes.
P(X = x | e) = P(X = x, e) / P(e), so P(X = x | e) ∝ P(X = x, e), where X is the query var(s), E the evidence var(s), and Y the non-query, non-evidence vars.
The likelihood-weighting estimate is
  P̂(X = x | e) ∝ Sum_y N_WS(x, y, e) w(x, y, e)     (N_WS = # of samples where query variables = x, non-query = y, evidence = e)
              ≈ n Sum_y S_WS(x, y, e) w(x, y, e)     (as the # of samples n → infinity)
              = n Sum_y P(x, y, e)
              = n P(x, e)  ∝  P(X = x | e)

23 Example. When sampling S and R, the evidence W=t is ignored, so we can get samples with S=f and R=f even though the evidence rules this out; the weight makes up for the difference (for such a sample the weight above would be 0). If we have 100 samples with R=t and total weight 1, and 400 samples with R=f and total weight 2, what is the estimate of R=t? 1 / (1 + 2) = 1/3.

24 Limitations of Likelihood Weighting: poor performance if the evidence variables occur late in the ordering. Why? The evidence is not being used to influence the samples, which yields samples with low weights.

25 Markov Chain Monte Carlo Methods. The prior methods generate each new sample from scratch; MCMC generates each new sample by making a random change to the preceding sample. We can view the algorithm as being in a particular state (an assignment of values to each variable).

26 Review: Markov Blanket. A variable's Markov blanket consists of its parents, its children, and its children's parents. A variable is conditionally independent of all other nodes given its Markov Blanket.

27 Gibbs Sampling: Compute P(X | e). [Slide shows the Gibbs sampling pseudocode from Russell & Norvig; mb(Z_i) = Markov Blanket of Z_i.]
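The pseudocode on the slide is a figure from Russell & Norvig; here is a minimal sketch of the same idea for the running query Pr(R | S=t, W=t) on the slide-8 network (helper names are illustrative). Each non-evidence variable is resampled from its conditional given its Markov blanket, computed here by normalizing the joint over that variable's two values.

```python
import random

# CPT values from slide 8
P_C = 0.5
P_S = {True: 0.1, False: 0.5}
P_R = {True: 0.8, False: 0.2}
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.0}

def joint(c, s, r, w):
    """Unnormalized joint P(c, s, r, w) as the product of CPT entries."""
    p = P_C if c else 1 - P_C
    p *= P_S[c] if s else 1 - P_S[c]
    p *= P_R[c] if r else 1 - P_R[c]
    p *= P_W[(s, r)] if w else 1 - P_W[(s, r)]
    return p

def gibbs_rain(n=100_000, burn_in=1_000):
    """Estimate P(R=+r | S=t, W=t) by repeatedly resampling C and R from their conditionals."""
    s, w = True, True                                       # evidence stays fixed
    c, r = random.random() < 0.5, random.random() < 0.5     # random initialization
    hits = kept = 0
    for t in range(n):
        # Resample C given its Markov blanket (proportional to the joint at C=t vs C=f).
        pc = joint(True, s, r, w)
        c = random.random() < pc / (pc + joint(False, s, r, w))
        # Resample R given its Markov blanket.
        pr = joint(c, s, True, w)
        r = random.random() < pr / (pr + joint(c, s, False, w))
        if t >= burn_in:
            kept += 1
            hits += r
    return hits / kept

print(gibbs_rain())  # roughly 0.32 for these CPT values
```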

28 Gibbs Sampling Example. Want Pr(R | S=t, W=t). The non-evidence variables are C & R. Initialize randomly: C=t and R=f, so the initial state is (C,S,R,W) = (t,t,f,t). Sample C given the current values of its Markov Blanket. (Network and CPTs as on slide 8.)

29 Gibbs Sampling Example. Want Pr(R | S=t, W=t). The non-evidence variables are C & R; initialize randomly: C=t and R=f, so the initial state is (C,S,R,W) = (t,t,f,t). Sample C given the current values of its Markov Blanket. The Markov blanket is the parents, children and children's parents; for C that is S & R. So sample C from P(C | S=t, R=f). First we have to compute P(C | S=t, R=f); use exact inference to do this. (Network and CPTs as on slide 8.)

30 Exercise: compute P(C=t | S=t, R=f). Quick refresher: sum rule, product/chain rule, Bayes rule. (Network and CPTs as on slide 8.)

31 Exact Inference Exercise: P(C | S=t, R=f). What is the probability P(C=t | S=t, R=f)?
= P(C=t, S=t, R=f) / P(S=t, R=f), which is proportional to P(C=t, S=t, R=f).
Use the normalization trick & compute the numerator for both C=t and C=f:
  P(C=t, S=t, R=f) = P(C=t) P(S=t | C=t) P(R=f | C=t, S=t)   (product rule)
                   = P(C=t) P(S=t | C=t) P(R=f | C=t)        (BN independencies)
                   = 0.5 * 0.1 * 0.2 = 0.01
  P(C=f, S=t, R=f) = P(C=f) P(S=t | C=f) P(R=f | C=f) = 0.5 * 0.5 * 0.8 = 0.2
  P(S=t, R=f) = P(C=f, S=t, R=f) + P(C=t, S=t, R=f) = 0.21    (sum rule)
  P(C=t | S=t, R=f) = 0.01 / 0.21 ≈ 0.048
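A quick check of the arithmetic, using only the slide's numbers:

```python
# Normalization trick: P(C=t | S=t, R=f) = P(C=t,S=t,R=f) / (P(C=t,S=t,R=f) + P(C=f,S=t,R=f))
p_ct = 0.5 * 0.1 * 0.2   # P(C=t) P(S=t|C=t) P(R=f|C=t) = 0.01
p_cf = 0.5 * 0.5 * 0.8   # P(C=f) P(S=t|C=f) P(R=f|C=f) = 0.2
print(p_ct / (p_ct + p_cf))   # 0.0476..., i.e. 0.01 / 0.21
```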

32 Gibbs Sampling Example. Want Pr(R | S=t, W=t). The non-evidence variables are C & R; initialize randomly: C=t and R=f, giving initial state (C,S,R,W) = (t,t,f,t). Sample C given the current values of its Markov Blanket (its parents, children and children's parents, i.e. S & R): exactly compute P(C | S=t, R=f) and sample C from it. Suppose we get C=f. New state: (f,t,f,t).

33 Gibbs Sampling Example. Want Pr(R | S=t, W=t). Initialize the non-evidence variables (C and R) randomly to t and f; initial state (C,S,R,W) = (t,t,f,t). Sample C given the current values of its Markov Blanket, P(C | S=t, R=f); suppose the result is C=f. New state: (f,t,f,t). Next, sample Rain given its MB. What is its Markov blanket?

34 Gibbs Sampling Example. Want Pr(R | S=t, W=t). Initialize the non-evidence variables (C and R) randomly to t and f; initial state (C,S,R,W) = (t,t,f,t). Sample C given the current values of its Markov Blanket, P(C | S=t, R=f); suppose the result is C=f, giving new state (f,t,f,t). Then sample Rain given its MB, P(R | C=f, S=t, W=t); suppose the result is R=t. New state: (f,t,t,t).

35 Poll: Gibbs Sampling Ex. Want Pr(R | S=t, W=t); the non-evidence variables (C and R) were initialized randomly to t and f, and the current state is (f,t,t,t). What is not a possible next state? 1. (f,t,t,t) 2. (t,t,t,t) 3. (f,t,f,t) 4. (f,f,t,t) (inconsistent w/ evidence) 5. Not sure

36 Gibbs Sampling. This involves inference! [Same Gibbs pseudocode from Russell & Norvig; mb(Z_i) = Markov Blanket of Z_i.]

37 Poll: Are Gibbs samples independent? 1. Yes 2. No 3. Not sure. [Same Gibbs pseudocode from Russell & Norvig; mb(Z_i) = Markov Blanket of Z_i.]

38 Markov Blanket Sampling. We want to show that P(Z_i | mb(Z_i)) is the same as P(Z_i | all other variables), which implies conditional independence of Z_i from the rest of the network given its Markov Blanket. Then derive an equation for computing P(Z_i | mb(Z_i)).

39 Probability Given Markov Blanket
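The derivation on this slide is a figure; the standard result it arrives at (as in Russell & Norvig) is that a variable's conditional given its Markov blanket needs only its own CPT row and those of its children:

```latex
P\big(z_i \mid \mathrm{mb}(Z_i)\big)
  \;\propto\;
  P\big(z_i \mid \mathrm{parents}(Z_i)\big)
  \prod_{Y_j \in \mathrm{children}(Z_i)} P\big(y_j \mid \mathrm{parents}(Y_j)\big)
```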

40 Why is Gibbs Consistent? The sampling process settles into a stationary distribution where the long-term fraction of time spent in each state is exactly equal to its posterior probability → this implies that if we draw enough samples from this stationary distribution, we get a consistent estimate, because we are sampling from the true posterior.

41 Markov Chain. Let P(x → x') be the probability that the sampling process makes a transition from x (some state) to x' (some other state), e.g. (t,t,f,t) → (t,f,f,t). Run sampling for t steps; P_t(x) is the probability the system is in state x at time t. Next state: P_{t+1}(x') = Sum_x P_t(x) P(x → x')

42 Stationary Distribution. Let P(x → x') be the probability the process makes a transition from x to x', and P_t(x) the probability the system is in state x at time t, so P_{t+1}(x') = Sum_x P_t(x) P(x → x'). We have reached the stationary distribution when P_{t+1}(x) = P_t(x). Call the stationary distribution π; it must satisfy π(x') = Sum_x π(x) P(x → x') for all x'. If P(x → x') is ergodic, there is exactly one such π for any given P(x → x').

43 Detailed Balance. Let P(x → x') be the probability the process makes a transition from x to x', and P_t(x) the probability the system is in state x at time t. A stationary distribution π satisfies π(x') = Sum_x π(x) P(x → x') for all x'. Detailed balance (inflow = outflow): π(x) P(x → x') = π(x') P(x' → x) for all x, x'. Detailed balance implies stationarity, since
  Sum_x π(x) P(x → x') = Sum_x π(x') P(x' → x) = π(x') Sum_x P(x' → x) = π(x').

44 Proof on board

45 Proving Gibbs Samples from the True Posterior. General Gibbs: sample the value of a new variable conditioned on all the other variables. One can prove that this version of Gibbs satisfies the detailed balance equation with stationary distribution P(X | e), and then use the prior result that, for Bayes Nets, sampling conditioned on all variables is equivalent to sampling given the Markov Blanket. See the text for a recap.

46 Gibbs Sampling. Samples are valid once we reach the stationary distribution. When do we reach the stationary distribution? Unclear.

47 What You Should Know: Define probabilistic inference. How to define a Bayes Net given a real example. How the joint can be used to answer any query. Complexity of exact inference. Approximate inference (direct sampling, likelihood weighting, Gibbs): be able to implement and run each algorithm, and compare the benefits and limitations of each.
