Learning a Class from Examples


Alpaydin Chapter 2, Mitchell Chapter 7. Alpaydin slides are in turquoise. Ethem Alpaydin, copyright: The MIT Press, 2010. alpaydin@boun.edu.tr, http://www.cmpe.boun.edu.tr/~ethem/i2ml. All other slides are based on Mitchell.

Learning a Class from Examples

Class C of a family car. Prediction: Is car x a family car? Knowledge extraction: What do people expect from a family car? Output: positive (+) and negative (-) examples. Input representation: $x_1$: price, $x_2$: engine power.

Training Set X

$X = \{\mathbf{x}^t, r^t\}_{t=1}^{N}$, where $r^t = 1$ if $\mathbf{x}^t$ is positive and $r^t = 0$ if $\mathbf{x}^t$ is negative, and $\mathbf{x} = [x_1, x_2]^T$.

Class C is the rectangle $(p_1 \le \text{price} \le p_2)$ AND $(e_1 \le \text{engine power} \le e_2)$.
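To make the rectangle class concrete, here is a minimal Python sketch; the function name, toy prices, and labels are illustrative assumptions, not from the slides.

```python
# A minimal sketch of the rectangle hypothesis class for the family-car
# example; all names and numbers here are illustrative, not from the slides.
def h(x, p1, p2, e1, e2):
    """1 if x = (price, engine power) falls inside the rectangle, else 0."""
    price, power = x
    return 1 if (p1 <= price <= p2) and (e1 <= power <= e2) else 0

# Toy training set X = {(x^t, r^t)}: r = 1 for family cars, 0 otherwise.
X = [((16_000, 110), 1), ((22_000, 150), 1), ((9_000, 70), 0), ((45_000, 300), 0)]
print([h(x, 15_000, 25_000, 100, 160) for x, r in X])  # -> [1, 1, 0, 0]
```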

Hypothesis Class H

$h(\mathbf{x}) = 1$ if $h$ says $\mathbf{x}$ is positive; $h(\mathbf{x}) = 0$ if $h$ says $\mathbf{x}$ is negative.

Error of $h$ on $X$: $E(h \mid X) = \sum_{t=1}^{N} \mathbf{1}\big(h(\mathbf{x}^t) \ne r^t\big)$ (a small code sketch follows below).

S, G, and the Version Space

The most specific hypothesis is S; the most general hypothesis is G. Any $h \in H$ between S and G is consistent, and together these hypotheses make up the version space (Mitchell, 1997).

Computational Learning Theory (from Mitchell Chapter 7)

Theoretical characterization of the difficulties and capabilities of learning algorithms. What general laws constrain inductive learning? We seek theory to relate: the probability of successful learning; the number of training examples; the complexity of the hypothesis space; the accuracy to which the target concept is approximated; and the manner in which training examples are presented.

Questions: What are the conditions for successful/unsuccessful learning? What are the conditions of success for particular algorithms?

Two frameworks: the Probably Approximately Correct (PAC) framework characterizes classes of hypotheses that can be learned and bounds the training set size in terms of the complexity of the hypothesis space; the mistake bound framework counts the number of training errors made before the correct hypothesis is determined.
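As a companion to the error definition above, this sketch (toy data and a made-up rectangle hypothesis, as before) counts the training instances that h misclassifies; a hypothesis with zero error on X is consistent, i.e. lies in the version space between S and G.

```python
# Sketch: empirical error E(h|X) = sum over t of 1(h(x^t) != r^t).
# Toy data and hypothesis are illustrative assumptions.
def empirical_error(h, X):
    return sum(1 for x, r in X if h(x) != r)

X = [((16_000, 110), 1), ((22_000, 150), 1), ((9_000, 70), 0), ((45_000, 300), 0)]
h = lambda x: 1 if (15_000 <= x[0] <= 25_000 and 100 <= x[1] <= 160) else 0
print(empirical_error(h, X))  # -> 0: h is consistent, hence in the version space
```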

Specific Questions

Sample complexity: How many training examples are needed for a learner to converge? Computational complexity: How much computational effort is needed for a learner to converge? Mistake bound: How many training examples will the learner misclassify before converging?

Sample Complexity

How many training examples are sufficient to learn the target concept? Issues: When do we say it was successful? How are the inputs acquired?
1. If the learner proposes instances, as queries to the teacher: the learner proposes instance x, the teacher provides c(x).
2. If the teacher (who knows c) provides training examples: the teacher provides a sequence of examples of the form <x, c(x)>.
3. If some random process (e.g., nature) proposes instances: instance x is generated randomly, the teacher provides c(x).

True Error of a Hypothesis

[Figure: instance space X with the + and - regions of target c and hypothesis h, shaded where c and h disagree.]

Definition: The true error (denoted $\mathrm{error}_D(h)$) of hypothesis h with respect to target concept c and distribution D is the probability that h will misclassify an instance drawn at random according to D:
$\mathrm{error}_D(h) \equiv \Pr_{x \in D}[c(x) \ne h(x)]$

Two Notions of Error

Training error of hypothesis h with respect to target concept c: how often $h(x) \ne c(x)$ over the training instances. True error of hypothesis h with respect to c: how often $h(x) \ne c(x)$ over future random instances.

Our concern: Can we bound the true error of h given the training error of h? First consider the case where the training error of h is zero (i.e., $h \in VS_{H,D}$).
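The two notions of error can be contrasted numerically: with a known target c and distribution D, the true error can be estimated by sampling. The target, hypothesis, and uniform distribution below are assumptions made up for the illustration.

```python
# Sketch: training error vs. (estimated) true error. Target c, hypothesis h,
# and distribution D = Uniform(0, 1) are illustrative assumptions.
import random

random.seed(0)
c = lambda x: x > 0.5   # target concept
h = lambda x: x > 0.6   # hypothesis: disagrees with c exactly on (0.5, 0.6]

train = [random.random() for _ in range(20)]
train_error = sum(h(x) != c(x) for x in train) / len(train)

# error_D(h) = Pr_{x ~ D}[c(x) != h(x)], estimated by Monte Carlo sampling:
draws = [random.random() for _ in range(100_000)]
true_error_est = sum(h(x) != c(x) for x in draws) / len(draws)
print(train_error, true_error_est)  # the estimate is close to 0.1
```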

Exhausting the Version Space

[Figure: hypothesis space H containing the version space $VS_{H,D}$; hypotheses are annotated with training error r and true error, e.g. error = .3, r = .1; error = .1, r = .2; error = .2, r = 0; error = .1, r = 0; error = .3, r = .4; error = .2, r = .3 (r = training error, error = true error).]

Definition: The version space $VS_{H,D}$ is said to be ε-exhausted with respect to c and D if every hypothesis h in $VS_{H,D}$ has error less than ε with respect to c and D:
$(\forall h \in VS_{H,D})\ \mathrm{error}_D(h) < \epsilon$

How many examples will ε-exhaust the VS?

Theorem [Haussler, 1988]: If the hypothesis space H is finite, and D is a sequence of $m \ge 1$ independent random examples of some target concept c, then for any $0 \le \epsilon \le 1$, the probability that the version space with respect to H and D is not ε-exhausted (with respect to c) is less than
$|H| e^{-\epsilon m}$

This bounds the probability that any consistent learner will output a hypothesis h with $\mathrm{error}(h) \ge \epsilon$. If we want this probability to be below δ, then $|H| e^{-\epsilon m} \le \delta$, which gives
$m \ge \frac{1}{\epsilon}\big(\ln|H| + \ln(1/\delta)\big)$

Proof of ε-Exhausting Theorem

Theorem: The probability of $VS_{H,D}$ not being ε-exhausted is at most $|H| e^{-\epsilon m}$.

Proof: Let $h_i \in H$ (i = 1..k) be those hypotheses that have true error greater than ε with respect to c ($k \le |H|$). We fail to ε-exhaust the VS iff at least one $h_i$ is consistent with all m sample training instances (note: they have true error greater than ε). The probability that a single hypothesis with error > ε is consistent with one random sample is at most $(1 - \epsilon)$. The probability of that hypothesis being consistent with m samples is $(1 - \epsilon)^m$. The probability that at least one of the k hypotheses with error > ε is consistent with m samples is at most $k(1 - \epsilon)^m$. Since $k \le |H|$, and for $0 \le \epsilon \le 1$, $(1 - \epsilon) \le e^{-\epsilon}$:
$k(1 - \epsilon)^m \le |H|(1 - \epsilon)^m \le |H| e^{-\epsilon m}$

PAC Learning

Consider a class C of possible target concepts defined over a set of instances X of length n, and a learner L using hypothesis space H.

Definition: C is PAC-learnable by L using H if for all $c \in C$, distributions D over X, ε such that $0 < \epsilon < 1/2$, and δ such that $0 < \delta < 1/2$, learner L will with probability at least $(1 - \delta)$ output a hypothesis $h \in H$ such that $\mathrm{error}_D(h) \le \epsilon$, in time that is polynomial in 1/ε, 1/δ, n, and size(c).
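The bound $m \ge \frac{1}{\epsilon}(\ln|H| + \ln(1/\delta))$ is easy to evaluate directly; the sketch below plugs in made-up values for |H|, ε, and δ to show the required sample size.

```python
# Sketch: sample size from the bound m >= (1/eps)(ln|H| + ln(1/delta)).
# Inputs are illustrative assumptions.
from math import ceil, log

def sample_complexity(H_size, eps, delta):
    """Examples sufficient to eps-exhaust the VS with probability >= 1 - delta."""
    return ceil((log(H_size) + log(1.0 / delta)) / eps)

# E.g. |H| = 2^20 hypotheses, eps = 0.1, delta = 0.05:
print(sample_complexity(2 ** 20, eps=0.1, delta=0.05))  # -> 169
```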

Agnostic Learning

So far, we assumed that $c \in H$. What if that is not the case? The agnostic learning setting does not assume $c \in H$. What do we want then? The hypothesis h that makes the fewest errors on the training data. What is the sample complexity in this case?
$m \ge \frac{1}{2\epsilon^2}\big(\ln|H| + \ln(1/\delta)\big)$
derived from Hoeffding bounds:
$\Pr[\mathrm{error}_D(h) > \mathrm{error}_{\mathrm{train}}(h) + \epsilon] \le e^{-2m\epsilon^2}$

Shattering a Set of Instances

Definition: A dichotomy of a set S is a partition of S into two disjoint subsets.

Definition: A set of instances S is shattered by hypothesis space H if and only if for every dichotomy of S there exists some hypothesis in H consistent with this dichotomy.

Three Instances Shattered

[Figure: instance space X with three points; each closed contour indicates one dichotomy.] What kind of hypothesis space H can shatter the instances?

The Vapnik-Chervonenkis Dimension

Definition: The Vapnik-Chervonenkis dimension, VC(H), of hypothesis space H defined over instance space X is the size of the largest finite subset of X shattered by H. If arbitrarily large finite sets of X can be shattered by H, then $VC(H) \equiv \infty$. Note that |H| can be infinite while VC(H) is finite!
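For a small finite S, shattering can be checked by brute force: enumerate every dichotomy and test whether some hypothesis realizes it. The sketch below does this for the interval class a < x < b, which reappears in the example on the next slide; the helper name and approach are assumptions made for illustration.

```python
# Sketch: brute-force check whether intervals a < x < b shatter a finite S.
# The function name and endpoint-candidate trick are illustrative.
from itertools import product

def intervals_shatter(S):
    pts = sorted(S)
    # Candidate endpoints: outside the extremes and between adjacent points.
    cuts = [pts[0] - 1] + [(u + v) / 2 for u, v in zip(pts, pts[1:])] + [pts[-1] + 1]
    for labels in product([0, 1], repeat=len(pts)):          # every dichotomy
        if not any(all((a < x < b) == bool(l) for x, l in zip(pts, labels))
                   for a in cuts for b in cuts):
            return False
    return True

print(intervals_shatter([3.1, 5.7]))        # True: all four dichotomies realizable
print(intervals_shatter([1.0, 2.0, 3.0]))   # False, so VC(intervals) = 2
```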

VC Dimension of Linear Decision Surfaces

[Figure: two point sets (a) and (b); (a) can be shattered, (b) cannot be.]

When H is the set of lines and S a set of points in the plane, VC(H) = 3. (a) can be shattered, but (b) cannot be. However, it is enough that at least one subset of size 3 can be shattered. No set of size 4 can be shattered, for any combination of points (think of an XOR-like situation).

VC Dimension: Another Example

S = {3.1, 5.7}, and the hypothesis space consists of intervals a < x < b. Dichotomies: both, none, only 3.1, or only 5.7. Are there intervals that cover all of the above dichotomies? What about $S = \{x_0, x_1, x_2\}$ for arbitrary $x_i$? (cf. collinear points).

Sample Complexity from VC Dimension

How many randomly drawn examples suffice to ε-exhaust $VS_{H,D}$ with probability at least $(1 - \delta)$?
$m \ge \frac{1}{\epsilon}\big(4\log_2(2/\delta) + 8\,VC(H)\log_2(13/\epsilon)\big)$
(a code sketch follows below). VC(H) is directly related to the sample complexity: a more expressive H needs more samples; more samples are needed for an H with more tunable parameters.

Mistake Bounds

So far: how many examples are needed to learn? What about: how many mistakes before convergence? This is an interesting question because some learning systems may need to start operating while still learning. Consider a setting similar to PAC learning: instances are drawn at random from X according to distribution D, and the learner must classify each instance before receiving the correct classification from the teacher. Can we bound the number of mistakes the learner makes before converging?
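The VC-based bound above can be evaluated just like the finite-|H| bound; the inputs below are illustrative assumptions, using VC(H) = 3 for lines in the plane.

```python
# Sketch: the VC-based bound m >= (1/eps)(4 log2(2/delta) + 8 VC(H) log2(13/eps)).
# Inputs are illustrative assumptions.
from math import ceil, log2

def vc_sample_complexity(vc, eps, delta):
    return ceil((4 * log2(2 / delta) + 8 * vc * log2(13 / eps)) / eps)

# For lines in the plane, VC(H) = 3 (see above):
print(vc_sample_complexity(3, eps=0.1, delta=0.05))  # -> 1899
```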

Mistake Bounds: Halving Algorithm

Consider the Halving Algorithm: learn the concept using the version space Candidate-Elimination or List-Then-Eliminate algorithm (no need to know the details of these algorithms), and classify new instances by majority vote of the version space members. How many mistakes before converging to the correct h... in the worst case? ...in the best case? (A code sketch follows at the end of this section.)

Mistake Bound of Halving Algorithm

Start with version space = H. A mistake is made when more than half of the hypotheses in the version space misclassify the instance. In that case, all of those misclassifying hypotheses are eliminated, so the version space is reduced to at most half its current size; that is, each mistake reduces the VS by at least half. Initially |VS| = |H|, and each mistake halves the VS, so it takes at most $\log_2|H|$ mistakes to reduce the VS to 1. The actual worst-case bound is $\lfloor \log_2|H| \rfloor$.

Optimal Mistake Bounds

Let $M_A(C)$ be the maximum number of mistakes made by algorithm A to learn concepts in C (maximum over all possible $c \in C$ and all possible training sequences):
$M_A(C) \equiv \max_{c \in C} M_A(c)$

Definition: Let C be an arbitrary non-empty concept class. The optimal mistake bound for C, denoted Opt(C), is the minimum over all possible learning algorithms A of $M_A(C)$:
$Opt(C) \equiv \min_{A \in \text{learning algorithms}} M_A(C)$

Mistake Bounds and VC Dimension

Littlestone (1987) showed:
$VC(C) \le Opt(C) \le M_{Halving}(C) \le \log_2(|C|)$
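A direct implementation over a small finite H makes the $\log_2|H|$ bound visible. The threshold hypothesis class, target, and instance stream below are illustrative assumptions, not from the slides.

```python
# Sketch of the Halving Algorithm over a finite H; the threshold hypothesis
# class and the instance stream are illustrative assumptions.
def halving(H, stream):
    VS, mistakes = list(H), 0
    for x, label in stream:
        votes_pos = sum(h(x) for h in VS)
        prediction = 1 if 2 * votes_pos > len(VS) else 0   # majority vote of VS
        mistakes += prediction != label
        VS = [h for h in VS if h(x) == label]              # drop inconsistent h
    return mistakes, VS

H = [lambda x, t=t: 1 if x >= t else 0 for t in range(11)]  # |H| = 11 thresholds
target = H[7]
stream = [(x, target(x)) for x in [5, 9, 6, 7, 8]]
m, VS = halving(H, stream)
print(m, len(VS))  # -> 2 1: mistakes <= log2(11) ~ 3.46, VS reduced to the target
```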

Noise and Model Complexity

Use the simpler model because it is: simpler to use (lower computational complexity); easier to train (lower space complexity); easier to explain (more interpretable); and it generalizes better (lower variance; Occam's razor).

Multiple Classes, $C_i$, i = 1, ..., K

$X = \{\mathbf{x}^t, \mathbf{r}^t\}_{t=1}^{N}$, where $r_i^t = 1$ if $\mathbf{x}^t \in C_i$ and $r_i^t = 0$ if $\mathbf{x}^t \in C_j$, $j \ne i$.

Train hypotheses $h_i(\mathbf{x})$, i = 1, ..., K, such that $h_i(\mathbf{x}^t) = 1$ if $\mathbf{x}^t \in C_i$ and $h_i(\mathbf{x}^t) = 0$ if $\mathbf{x}^t \in C_j$, $j \ne i$.

Regression

$X = \{x^t, r^t\}_{t=1}^{N}$ with $r^t \in \mathbb{R}$ and $r^t = f(x^t) + \varepsilon$.

Empirical error: $E(g \mid X) = \frac{1}{N}\sum_{t=1}^{N}\big[r^t - g(x^t)\big]^2$

Linear model: $g(x) = w_1 x + w_0$, with error $E(w_1, w_0 \mid X) = \frac{1}{N}\sum_{t=1}^{N}\big[r^t - (w_1 x^t + w_0)\big]^2$ (a fitting sketch follows below).

Quadratic model: $g(x) = w_2 x^2 + w_1 x + w_0$

Model Selection & Generalization

Learning is an ill-posed problem; the data are not sufficient to find a unique solution. Hence the need for inductive bias: assumptions about H. Generalization: how well a model performs on new data. Overfitting: H more complex than C or f. Underfitting: H less complex than C or f.
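The linear model above has a closed-form least-squares fit; the sketch below computes $w_1$ and $w_0$ on made-up data and reports the empirical error $E(w_1, w_0 \mid X)$.

```python
# Sketch: closed-form least squares for g(x) = w1*x + w0, minimizing
# E(w1, w0 | X) = (1/N) sum_t (r^t - (w1 x^t + w0))^2. Data are illustrative.
xs = [1.0, 2.0, 3.0, 4.0]
rs = [2.1, 3.9, 6.2, 7.8]

N = len(xs)
mx, mr = sum(xs) / N, sum(rs) / N
w1 = sum((x - mx) * (r - mr) for x, r in zip(xs, rs)) / sum((x - mx) ** 2 for x in xs)
w0 = mr - w1 * mx
E = sum((r - (w1 * x + w0)) ** 2 for x, r in zip(xs, rs)) / N
print(w1, w0, E)  # -> slope ~1.94, intercept ~0.15, small training error
```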

Triple Trade-Off

There is a trade-off between three factors (Dietterich, 2003): 1. the complexity of H, c(H); 2. the training set size, N; 3. the generalization error, E, on new data. As N increases, E decreases. As c(H) increases, E first decreases and then increases.

Cross-Validation

To estimate generalization error, we need data unseen during training. We split the data into a training set (50%), a validation set (25%), and a test ("publication") set (25%). Use resampling when there is little data.

Dimensions of a Supervised Learner

1. Model: $g(\mathbf{x} \mid \theta)$
2. Loss function: $E(\theta \mid X) = \sum_t L\big(r^t, g(\mathbf{x}^t \mid \theta)\big)$
3. Optimization procedure: $\theta^* = \arg\min_\theta E(\theta \mid X)$
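The three dimensions line up one-to-one with code. In the sketch below, the model, loss, toy data, and a naive grid-search optimizer are all illustrative assumptions; in practice the argmin would be found analytically or by gradient methods.

```python
# Sketch: the three dimensions of a supervised learner, with naive grid
# search standing in for the optimization procedure. All values illustrative.
def g(x, theta):                      # 1. model g(x | theta)
    return theta * x

def L(r, y):                          # 2. loss L(r, g(x | theta))
    return (r - y) ** 2

X = [(1.0, 1.1), (2.0, 1.9), (3.0, 3.2)]   # toy {(x^t, r^t)}

def E(theta):                         # E(theta | X) = sum_t L(r^t, g(x^t | theta))
    return sum(L(r, g(x, theta)) for x, r in X)

# 3. optimization: theta* = argmin_theta E(theta | X), here over a coarse grid
theta_star = min((k / 100 for k in range(-300, 301)), key=E)
print(theta_star)  # -> 1.04, close to the analytic least-squares slope
```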
