Neural Network (Basic Ideas) Hung-yi Lee


1 Neural Network (Basic Ideas) Hung-yi Lee

2 Learning = Looking for a Function. Speech Recognition: f(audio) = 你好. Handwritten Recognition: f(image) = "2". Weather forecast: f(weather today) = sunny tomorrow. Play video games: f(positions and number of enemies) = fire.

3 Framework. x: function input; ŷ: function output. Model (Hypothesis Function Set): f1, f2, ... Training: pick the best function f*. Testing: f*(x) = y. Training Data: (x^1, ŷ^1), (x^2, ŷ^2), ...

4 Outline. 1. What is the model (function hypothesis set)? 2. What is the best function? 3. How to pick the best function?

5 Task Considered Today: Classification. Binary Classification: only two classes. Input object → Class A (yes) or Class B (no). Spam filtering: is an e-mail spam or not? Recommendation systems: recommend the product to the customer or not? Malware detection: is the software malicious or not? Stock prediction: will the future value of a stock increase or not?

6 Task Considered Today: Classification. Binary Classification: only two classes (input object → Class A (yes) or Class B (no)). Multi-class Classification: more than two classes (input object → Class A, Class B, or Class C).

7 Multi-class Classification. Handwriting Digit Classification: Input: image; Classes: 0, 1, ..., 9 (10 classes). Image Recognition: Input: image; Classes: dog, cat, book, ... (thousands of classes).

8 Multi-class Classification. Real speech recognition is not multi-class classification. Input: audio; Classes: "hi", "how are you", "I am sorry", ... — they cannot be enumerated. The HW is multi-class classification: Input: one frame of audio; Classes: the phonemes (e.g. /ε/) — which phoneme does the frame belong to?

9 1. What is the model?

10 What is the function we are looking for? Classification: y = f(x), f: R^N → R^M. x: input object to be classified; y: class. Assume both x and y can be represented as fixed-size vectors: x is a vector with N dimensions, and y is a vector with M dimensions.

11 What is the function we are looking for? Handwriting Digit Classification: f: R^N → R^M. x: image; y: class. For a 16 x 16 image, each pixel corresponds to an element in the vector: 1 for ink, 0 otherwise; 16 x 16 = 256 dimensions. 10 dimensions for digit recognition: "1" or not, "2" or not, "3" or not, ...

12 1. What is the model? A Layer of Neurons

13 Single Neuron. f: R^N → R. z = w1 x1 + w2 x2 + ... + wN xN + b. Activation function (sigmoid): σ(z) = 1 / (1 + e^(-z)). Output: y = σ(z).


15 Single Neuron. f: R^N → R. A single neuron can only do binary classification; it cannot handle multi-class classification. y ≥ 0.5: the image is "2"; y < 0.5: the image is not "2".
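The single neuron above can be sketched in a few lines of Python (a minimal illustration; the weights and bias here are arbitrary values, not from the slides):

```python
import math

def neuron(x, w, b):
    """Single neuron: z = w1*x1 + ... + wN*xN + b, then y = sigmoid(z)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))   # sigmoid activation

# Binary decision: class "yes" if y >= 0.5, else "no"
y = neuron([1.0, 0.0], w=[2.0, -1.0], b=-0.5)   # z = 1.5
is_yes = y >= 0.5
```

The sigmoid squashes z into (0, 1), so thresholding at 0.5 turns the neuron into a yes/no classifier.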

16 A Layer of Neurons. f: R^N → R^M. Handwriting digit classification: Classes: 0, 1, ..., 9 (10 classes) → 10 neurons. y1: "1" or not; y2: "2" or not; y3: "3" or not; ... If y2 is the max, then the image is "2".
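A layer of M neurons is M copies of the single neuron sharing the same input, and the predicted class is the index of the largest output. A sketch with toy, untrained weights (the specific numbers are illustrative assumptions):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer(x, W, b):
    """One layer of M neurons: y_i = sigmoid(sum_j W[i][j]*x[j] + b[i])."""
    return [sigmoid(sum(wij * xj for wij, xj in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

# 10 outputs for digit classification; predicted digit = index of the max y_i
W = [[0.1 * i] * 4 for i in range(10)]   # toy weights, not trained
b = [0.0] * 10
y = layer([1.0, 0.5, 0.0, 1.0], W, b)
digit = max(range(10), key=lambda i: y[i])
```

With these monotonically growing toy weights the largest output is the last neuron's, so `digit` is 9; a trained network would instead learn weights that make the correct neuron's output largest.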

17 1. What is the model? Limitation of a Single Layer

18 Limitation of a Single Layer. z = w1 x1 + w2 x2 + b; output yes if z ≥ threshold, no if z < threshold. Desired input/output (XOR): x1=0, x2=0 → No; x1=0, x2=1 → Yes; x1=1, x2=0 → Yes; x1=1, x2=1 → No. Can we?

19 Limitation of a Single Layer. No, we can't: no single line can separate the two "Yes" points from the two "No" points.

20 Limitation of a Single Layer. z = w1 x1 + w2 x2 + b. The XOR table (0,0 → No; 0,1 → Yes; 1,0 → Yes; 1,1 → No) can be decomposed: XOR(x1, x2) = AND( NOT AND(x1, x2), OR(x1, x2) ).

21 Neural Network. Hidden neurons: a1 = NOT AND(x1, x2), a2 = OR(x1, x2); output neuron: AND(a1, a2). A network with one hidden layer implements XOR.

22 With sigmoid neurons, the hidden layer maps the four input points to new coordinates (a1, a2): roughly (0.73, 0.05), (0.27, 0.27), and (0.05, 0.73).

23 In the (a1, a2) space, the points (0.73, 0.05), (0.27, 0.27), and (0.05, 0.73) are linearly separable, so the output neuron can classify them.
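The NOT-AND / OR / AND decomposition above can be checked directly. For clarity this sketch uses a hard threshold in place of the sigmoid on the slides, and the weights are one hand-picked choice that realizes the three gates:

```python
def step(z):
    """Hard threshold activation: fire if z >= 0."""
    return 1 if z >= 0 else 0

def xor_net(x1, x2):
    # Hidden neuron a1: NOT(x1 AND x2), i.e. NAND
    a1 = step(-x1 - x2 + 1.5)
    # Hidden neuron a2: x1 OR x2
    a2 = step(x1 + x2 - 0.5)
    # Output neuron: a1 AND a2
    return step(a1 + a2 - 1.5)

table = {(x1, x2): xor_net(x1, x2) for x1 in (0, 1) for x2 in (0, 1)}
```

No choice of weights for a single `step` neuron can produce this table, but two hidden neurons plus one output neuron can, which is the whole point of the hidden layer.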

24 1. What is the model? Neural Network

25 Neural Network as a Model. f: R^N → R^M. Input x (vector) → Layer 1 → Layer 2 → ... → Layer L → Output y (vector). Input Layer, Hidden Layers, Output Layer. Fully connected feedforward network. Deep Neural Network: many hidden layers.

26 Notation. Layer l-1 has N_(l-1) nodes; layer l has N_l nodes. a_i^l: the output of neuron i at layer l. a^l: the output of one layer (a vector).

27 Notation. w_ij^l: the weight from neuron j (layer l-1) to neuron i (layer l). W^l: the weight matrix from layer l-1 to layer l, with entries w_ij^l.

28 Notation. b_i^l: the bias for neuron i at layer l. b^l: the biases for the neurons in layer l (a vector).

29 Notation. z_i^l: the input of the activation function for neuron i at layer l: z_i^l = sum_j w_ij^l a_j^(l-1) + b_i^l. z^l: the inputs of the activation function for the neurons in layer l (a vector).

30 Notation — Summary. a_i^l: output of a neuron; a^l: output of a layer (a vector). w_ij^l: a weight; W^l: a weight matrix. b_i^l: a bias; b^l: a bias vector. z_i^l: input of an activation function; z^l: inputs of the activation functions for layer l (a vector).

31 Relations between Layer Outputs. z_i^l = sum_j w_ij^l a_j^(l-1) + b_i^l, and a_i^l = σ(z_i^l).

32 Relations between Layer Outputs. In matrix form: z^l = W^l a^(l-1) + b^l.

33 Relations between Layer Outputs. a^l = σ(z^l), where σ is applied element-wise.

34 Relations between Layer Outputs. Putting the two steps together: a^l = σ(W^l a^(l-1) + b^l).

35 Function of Neural Network. Input x → Layer 1 (W^1, b^1) → Layer 2 (W^2, b^2) → ... → Layer L (W^L, b^L) → Output y. a^1 = σ(W^1 x + b^1); a^2 = σ(W^2 a^1 + b^2); ...; y = σ(W^L a^(L-1) + b^L).

36 Function of Neural Network. y = f(x) = σ(W^L ... σ(W^2 σ(W^1 x + b^1) + b^2) ... + b^L).
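The nested expression above is just a loop over layers. A minimal sketch of the full forward pass, with small arbitrary (untrained) parameters standing in for W^l and b^l:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, Ws, bs):
    """y = sigmoid(W^L(... sigmoid(W^1 x + b^1) ...) + b^L)."""
    a = x
    for W, b in zip(Ws, bs):          # layer 1 .. layer L
        a = [sigmoid(sum(wij * aj for wij, aj in zip(row, a)) + bi)
             for row, bi in zip(W, b)]
    return a

# Two layers: R^2 -> R^3 -> R^2 (arbitrary untrained parameters)
Ws = [[[1.0, -1.0], [0.5, 0.5], [-1.0, 1.0]],
      [[1.0, 1.0, 1.0], [-1.0, 0.0, 1.0]]]
bs = [[0.0, 0.0, 0.0], [0.0, 0.0]]
y = forward([1.0, 0.0], Ws, bs)
```

Each pass through the loop computes one a^l = σ(W^l a^(l-1) + b^l); after L passes, `a` is the network output y.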

37 2. What is the best function?

38 Best Function = Best Parameters. y = f(x) = σ(W^L ... σ(W^1 x + b^1) ... + b^L) defines a function set f(x; θ), θ = {W^1, b^1, ..., W^L, b^L}, because different parameters W and b lead to different functions. A formal way to define a function set: a parameter set θ. Picking the best function f* = picking the best parameter set θ*.

39 Cost Function. Define a function C(θ) for a parameter set θ: C(θ) evaluates how bad a parameter set is. The best parameter set θ* is the one that minimizes C(θ): θ* = arg min_θ C(θ). C(θ) is called a cost/loss/error function. If you instead define the goodness of the parameter set by another function O(θ), O(θ) is called an objective function.

40 Cost Function. Given training data (x^1, ŷ^1), (x^2, ŷ^2), ..., (x^R, ŷ^R), e.g. handwriting digit classification: C(θ) = sum_(r=1..R) || f(x^r; θ) − ŷ^r || (sum over training examples; minimize the distance between the network output and the target).
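The cost is a plain sum over training examples of the distance between output and target. A sketch, using the Euclidean distance and an identity "network" just so the arithmetic is easy to follow (both choices are illustrative assumptions):

```python
import math

def cost(f, data):
    """C(theta) = sum over examples of || f(x^r) - y_hat^r ||."""
    total = 0.0
    for x, y_hat in data:
        y = f(x)
        total += math.sqrt(sum((yi - ti) ** 2 for yi, ti in zip(y, y_hat)))
    return total

# Toy "network": the identity function, so cost measures raw data mismatch
data = [([0.0, 1.0], [0.0, 1.0]),   # perfect match: contributes 0
        ([1.0, 0.0], [0.0, 0.0])]   # off by 1 in one coordinate
c = cost(lambda x: x, data)
```

In practice `f` would be the network `forward` pass with parameters θ, and training searches for the θ that drives this sum down.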

41 3. How to pick the best function? Gradient Descent

42 Statement of the Problem. There is a function C(θ), where θ represents a parameter set θ = {θ1, θ2, θ3, ...}. Find the θ* that minimizes C(θ). Brute force? Enumerate all possible θ. Calculus? Find θ* such that ∂C(θ*)/∂θ1 = 0, ∂C(θ*)/∂θ2 = 0, ...

43 Gradient Descent — Idea. For simplicity, first consider a θ with only one variable. Drop a ball somewhere on the curve of C; when the ball stops, we have found a local minimum.

44 Gradient Descent. η is called the learning rate. For simplicity, first consider a θ with only one variable. Randomly start at θ^0. Compute dC(θ^0)/dθ; update θ^1 = θ^0 − η dC(θ^0)/dθ. Compute dC(θ^1)/dθ; update θ^2 = θ^1 − η dC(θ^1)/dθ. ...

45 Gradient Descent. Suppose θ has two variables {θ1, θ2}. Randomly start at θ^0 = [θ1^0, θ2^0]. Compute the gradient of C(θ) at θ^0: ∇C(θ^0) = [∂C(θ^0)/∂θ1, ∂C(θ^0)/∂θ2]. Update the parameters: θ^1 = θ^0 − η ∇C(θ^0). Compute the gradient of C(θ) at θ^1: ∇C(θ^1); update θ^2 = θ^1 − η ∇C(θ^1).

46 Gradient Descent. Start at position θ^0. Compute the gradient at θ^0; move to θ^1 = θ^0 − η ∇C(θ^0). Compute the gradient at θ^1; move to θ^2 = θ^1 − η ∇C(θ^1). ... Gradient: ∇C(θ); Movement: −η ∇C(θ).
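The update loop above can be sketched on a simple two-variable cost whose minimum is known in closed form (the quadratic here is an illustrative stand-in for C(θ)):

```python
# Gradient descent on C(t1, t2) = (t1 - 3)^2 + (t2 + 1)^2,
# whose unique minimum is at (3, -1). eta is the learning rate.
def grad_C(t1, t2):
    return (2 * (t1 - 3), 2 * (t2 + 1))

t1, t2, eta = 0.0, 0.0, 0.1          # starting point theta^0
for _ in range(100):                  # theta^{i+1} = theta^i - eta * grad C
    g1, g2 = grad_C(t1, t2)
    t1, t2 = t1 - eta * g1, t2 - eta * g2
```

After a hundred steps the iterate is numerically at the minimum (3, -1); on a non-convex neural-network cost the same loop only guarantees reaching a stationary point.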

47 Formal Derivation of Gradient Descent. Suppose θ has two variables {θ1, θ2}. Given a point on the surface of C(θ), how can we easily find the nearby point with the smallest value?

48 Formal Derivation of Gradient Descent. Taylor series: let h(x) be infinitely differentiable around x = x0. h(x) = sum_(k=0..∞) h^(k)(x0)/k! (x − x0)^k = h(x0) + h'(x0)(x − x0) + h''(x0)/2! (x − x0)^2 + ... When x is close to x0: h(x) ≈ h(x0) + h'(x0)(x − x0).

49 E.g. the Taylor series for h(x) = sin(x) around x0 = π/4: sin(x) = sin(π/4) + cos(π/4)(x − π/4) − ... The approximation is good around π/4.
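The claim that the first-order approximation is "good around π/4" is easy to check numerically (the evaluation points 0.1 and 1.5 away from x0 are arbitrary choices):

```python
import math

x0 = math.pi / 4

def taylor1(x):
    """First-order Taylor approximation of sin around x0 = pi/4:
    sin(x) ~ sin(x0) + cos(x0) * (x - x0)."""
    return math.sin(x0) + math.cos(x0) * (x - x0)

near = abs(taylor1(x0 + 0.1) - math.sin(x0 + 0.1))   # accurate near pi/4
far = abs(taylor1(x0 + 1.5) - math.sin(x0 + 1.5))    # poor far from pi/4
```

The error close to x0 is on the order of (x − x0)^2, which is exactly why gradient descent only trusts the linear approximation inside a small neighborhood.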

50 Multivariable Taylor series: h(x, y) = h(x0, y0) + ∂h(x0, y0)/∂x (x − x0) + ∂h(x0, y0)/∂y (y − y0) + something related to (x − x0)^2 and (y − y0)^2 + ... When x and y are close to x0 and y0: h(x, y) ≈ h(x0, y0) + ∂h(x0, y0)/∂x (x − x0) + ∂h(x0, y0)/∂y (y − y0).

51 Formal Derivation of Gradient Descent. Based on the Taylor series: if the red circle centered at (a, b) is small enough, then inside the red circle C(θ) ≈ s + u(θ1 − a) + v(θ2 − b), where s = C(a, b), u = ∂C(a, b)/∂θ1, v = ∂C(a, b)/∂θ2.

52 Formal Derivation of Gradient Descent. Find the θ1 and θ2 in the circle yielding the smallest value of C(θ) ≈ s + u(θ1 − a) + v(θ2 − b): take (θ1 − a, θ2 − b) in the direction opposite to (u, v), i.e. [θ1, θ2] = [a, b] − η [u, v] = [a, b] − η [∂C(a, b)/∂θ1, ∂C(a, b)/∂θ2]. The value of η depends on the radius of the circle. This is how gradient descent updates the parameters.
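The derivation can be sanity-checked numerically: a small step opposite the gradient (u, v) must decrease C. The quadratic below and the point (a, b) are arbitrary illustrative choices:

```python
# Numeric check of the derivation: inside a small circle around (a, b),
# moving opposite the gradient (u, v) decreases C.
def C(t1, t2):
    return t1 ** 2 + 3 * t2 ** 2 + t1 * t2

a, b = 1.0, 2.0
u = 2 * a + b          # dC/dtheta1 at (a, b)
v = 6 * b + a          # dC/dtheta2 at (a, b)
eta = 0.01             # small step: stays inside the "red circle"
decreased = C(a - eta * u, b - eta * v) < C(a, b)
```

If η were made large the linear approximation would no longer hold and the step could overshoot, which is the same reason the learning rate must be chosen carefully later in the slides.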

53 Gradient Descent for a Neural Network. Starting parameters θ^0 = {W^1, b^1, ..., W^L, b^L}. Compute the gradient ∇C: ∂C/∂w for every weight w and ∂C/∂b for every bias b; update θ^1 = θ^0 − η ∇C(θ^0), θ^2 = θ^1 − η ∇C(θ^1), ... There are millions of parameters. To compute the gradients efficiently, we use backpropagation.

54 Stuck at a local minimum? A saddle point? See "Who is Afraid of Non-Convex Loss Functions?" (videolectures.net/eml07_lecun_wia/) and "Deep Learning: Theoretical Motivations" (videolectures.net/deeplearning2015_bengio_theoretical_motivations/).

55 3. How to pick the best function? Practical Issues for neural network

56 Practical Issues for neural network: Parameter Initialization; Learning Rate; Stochastic gradient descent and Mini-batch; Recipe for Learning.

57 Parameter Initialization. For gradient descent, we need to pick the initial parameters θ^0. The initial parameters have some influence on the training; we will come back to this issue in the future. Suggestion for today: do not set all the parameters in θ^0 equal; set the parameters in θ^0 randomly.

58 Learning Rate. Set the learning rate η carefully. On the error surface (cost vs. number of parameter updates): a very large η makes the cost blow up; a large η makes the cost get stuck; so just make η small enough.

59 Learning Rate — Toy Example. A single linear neuron: z = wx + b, y = z; fit w and b to the data. Training Data (20 examples): x = [0.0, 0.5, 1.0, ..., 9.5], y = [0.1, 0.4, 0.9, ..., 8.9, 9.5] (roughly y ≈ x).

60 Learning Rate — Toy Example. Error surface C(w, b), showing the start point and the target (the minimum).

61 Learning Rate — Toy Example. Different learning rates η lead to very different numbers of updates before convergence (on the order of thousands of updates for a small η versus hundreds for a well-chosen one).
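The toy experiment can be reproduced in miniature. The 20 points below lie exactly on y = x as a stand-in for the slide's noisy data, and the two learning rates are illustrative choices (one stable, one past the divergence threshold for this data):

```python
# Gradient descent on the toy model y = w*x + b with a squared-error cost,
# comparing a workable learning rate with one that is too large.
xs = [0.5 * i for i in range(1, 21)]   # 0.5, 1.0, ..., 10.0
ys = list(xs)                          # targets on the line y = x

def run(eta, steps):
    """Return the mean squared error after `steps` gradient updates."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        gw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        gb = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w, b = w - eta * gw, b - eta * gb
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / n

good = run(eta=0.01, steps=200)   # cost shrinks toward zero
bad = run(eta=0.1, steps=20)      # cost blows up
```

With η = 0.01 the cost decays toward zero, while with η = 0.1 every update overshoots the minimum by a growing margin, matching the "very large η" curve on the previous slide.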

62 Stochastic Gradient Descent and Mini-batch. Gradient Descent: C(θ) = sum_(r=1..R) C^r(θ) with C^r(θ) = || f(x^r; θ) − ŷ^r ||, so ∇C = sum_(r=1..R) ∇C^r (all R examples per update). Stochastic Gradient Descent: pick one example x^r and update with ∇C^r alone. If each example x^r has an equal probability of being picked, E[∇C^r] = (1/R) ∇C. Faster! Better!

63 Stochastic Gradient Descent and Mini-batch. When using stochastic gradient descent, starting at θ^0 with training data (x^1, ŷ^1), ..., (x^R, ŷ^R): pick x^1, θ^1 = θ^0 − η ∇C^1(θ^0); pick x^2, θ^2 = θ^1 − η ∇C^2(θ^1); ...; pick x^R, θ^R = θ^(R−1) − η ∇C^R(θ^(R−1)). What is an epoch? Having seen all the examples once is one epoch.

64 Stochastic Gradient Descent and Mini-batch — Toy Example. Gradient descent: sees all 20 examples, then updates once (1 update per epoch). Stochastic gradient descent: sees only one example per update, so with 20 examples it updates 20 times in one epoch.

65 Stochastic Gradient Descent and Mini-batch. Gradient Descent: ∇C = sum_r ∇C^r over all examples. Stochastic Gradient Descent: pick one example x^r (shuffle your data first). Mini-batch Gradient Descent: pick B examples as a batch b (B is the batch size) and average the gradients of the examples in the batch: ∇C^b = (1/B) sum_(x^r in b) ∇C^r.
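The shuffle-then-batch bookkeeping can be sketched independently of any particular model (the `update` callback is a placeholder where the averaged-gradient step would go):

```python
import random

def minibatch_epochs(data, batch_size, epochs, update):
    """Shuffle the data each epoch, split it into batches of size B,
    and call update() once per batch."""
    data = list(data)
    n_updates = 0
    for _ in range(epochs):
        random.shuffle(data)                   # shuffle your data
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            update(batch)                      # one parameter update per batch
            n_updates += 1
    return n_updates

# With 20 examples and B = 5, one epoch performs 4 updates
count = minibatch_epochs(range(20), batch_size=5, epochs=3,
                         update=lambda batch: None)
```

Setting B = 1 recovers stochastic gradient descent and B = R recovers full gradient descent, so this one loop covers the whole spectrum on the slide.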

66 Stochastic Gradient Descent and Mini-batch. Handwriting digit classification: comparing different batch sizes; batch size = R (the whole training set) corresponds to gradient descent.

67 Stochastic Gradient Descent and Mini-batch. Why is mini-batch faster than stochastic gradient descent? Stochastic gradient descent computes z^1 = W x^1 and z^2 = W x^2 one matrix-vector product at a time. Mini-batch stacks the examples into a matrix and computes [z^1 z^2] = W [x^1 x^2] in one matrix-matrix product. Practically, which one is faster? (Matrix operations are heavily optimized and parallelizable.)
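The equivalence of the two computations can be checked with a small hand-rolled sketch (pure Python here, so it only demonstrates that the results match; the speed advantage comes from optimized BLAS/GPU libraries doing the batched product):

```python
# Per-example products z^1 = W x^1, z^2 = W x^2 versus one batched
# product Z = W X, where the x^r are stacked as columns of X.
def matvec(W, x):
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in W]

def matmul(W, X):
    cols = list(zip(*X))                     # columns of X are the examples
    Z_cols = [matvec(W, list(c)) for c in cols]
    return [list(r) for r in zip(*Z_cols)]   # stack results back as columns

W = [[1.0, 2.0], [3.0, 4.0]]
x1, x2 = [1.0, 0.0], [0.0, 1.0]
X = [[x1[0], x2[0]], [x1[1], x2[1]]]         # x^1, x^2 as columns of X
Z = matmul(W, X)
same = (matvec(W, x1) == [Z[0][0], Z[1][0]] and
        matvec(W, x2) == [Z[0][1], Z[1][1]])
```

Column r of Z equals W x^r, so nothing changes numerically; a library can simply execute the single large product far more efficiently than many small ones.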

68 Recipe for Learning. Data provided in the homework: Training Data (x, ŷ), used to pick the best function f*; Testing Data, split into Validation (x, ŷ) and Real Testing (x only).

69 Recipe for Learning. Validation: you immediately know the accuracy. Real Testing: you do not know the accuracy until the deadline (and that is what really counts).

70 Recipe for Learning. Do I get good results on the training set? If no, modify your training process: Your code has a bug. You cannot find a good function (stuck at local minima or saddle points): change the training strategy. Bad model (there is no good function in the hypothesis function set): probably you need a bigger network.

71 Recipe for Learning. Do I get good results on the training set? Yes → Do I get good results on the validation set? Yes → done. No → modify your training process to prevent overfitting. (Your code usually does not have a bug in this situation.)

72 Recipe for Learning — Overfitting. You pick the best parameter set θ* so that on the training data (x^r, ŷ^r): f(x^r; θ*) ≈ ŷ^r. However, for testing data x^u it can happen that f(x^u; θ*) is far from ŷ^u: the training data and the testing data have different distributions.

73 Recipe for Learning — Overfitting. Panacea: have more training data. You can do that in a real application, but you can't do that in the homework. We will come back to this issue in the future.

74 Concluding Remarks. 1. What is the model (function hypothesis set)? A neural network. 2. What is the best function? Defined by the cost function. 3. How to pick the best function? Gradient descent: parameter initialization, learning rate, stochastic gradient descent and mini-batch, recipe for learning.

75 Acknowledgement. Thanks to 余朗祺 for correcting spelling errors on the slides during class, to 吳柏瑜 for correcting notation errors on the slides, and to Yes Hung for correcting typos on the slides.


More information

Decomposition of Boolean Function Sets for Boolean Neural Networks

Decomposition of Boolean Function Sets for Boolean Neural Networks Decomposton of Boolen Functon Sets for Boolen Neurl Netorks Romn Kohut,, Bernd Stenbch Freberg Unverst of Mnng nd Technolog Insttute of Computer Scence Freberg (Schs), Germn Outlne Introducton Boolen Neuron

More information

MACHINE APPLIED MACHINE LEARNING LEARNING. Gaussian Mixture Regression

MACHINE APPLIED MACHINE LEARNING LEARNING. Gaussian Mixture Regression 11 MACHINE APPLIED MACHINE LEARNING LEARNING MACHINE LEARNING Gaussan Mture Regresson 22 MACHINE APPLIED MACHINE LEARNING LEARNING Bref summary of last week s lecture 33 MACHINE APPLIED MACHINE LEARNING

More information

COS 511: Theoretical Machine Learning. Lecturer: Rob Schapire Lecture #16 Scribe: Yannan Wang April 3, 2014

COS 511: Theoretical Machine Learning. Lecturer: Rob Schapire Lecture #16 Scribe: Yannan Wang April 3, 2014 COS 511: Theoretcal Machne Learnng Lecturer: Rob Schapre Lecture #16 Scrbe: Yannan Wang Aprl 3, 014 1 Introducton The goal of our onlne learnng scenaro from last class s C comparng wth best expert and

More information

Multiple view geometry

Multiple view geometry EECS 442 Computer vson Multple vew geometry Perspectve Structure from Moton - Perspectve structure from moton prolem - mgutes - lgerc methods - Fctorzton methods - Bundle djustment - Self-clrton Redng:

More information

Math 497C Sep 17, Curves and Surfaces Fall 2004, PSU

Math 497C Sep 17, Curves and Surfaces Fall 2004, PSU Mth 497C Sep 17, 004 1 Curves nd Surfces Fll 004, PSU Lecture Notes 3 1.8 The generl defnton of curvture; Fox-Mlnor s Theorem Let α: [, b] R n be curve nd P = {t 0,...,t n } be prtton of [, b], then the

More information

SOLVING SYSTEMS OF EQUATIONS, ITERATIVE METHODS

SOLVING SYSTEMS OF EQUATIONS, ITERATIVE METHODS ELM Numericl Anlysis Dr Muhrrem Mercimek SOLVING SYSTEMS OF EQUATIONS, ITERATIVE METHODS ELM Numericl Anlysis Some of the contents re dopted from Lurene V. Fusett, Applied Numericl Anlysis using MATLAB.

More information

U.C. Berkeley CS294: Beyond Worst-Case Analysis Luca Trevisan September 5, 2017

U.C. Berkeley CS294: Beyond Worst-Case Analysis Luca Trevisan September 5, 2017 U.C. Berkeley CS94: Beyond Worst-Case Analyss Handout 4s Luca Trevsan September 5, 07 Summary of Lecture 4 In whch we ntroduce semdefnte programmng and apply t to Max Cut. Semdefnte Programmng Recall that

More information

Coalgebra, Lecture 15: Equations for Deterministic Automata

Coalgebra, Lecture 15: Equations for Deterministic Automata Colger, Lecture 15: Equtions for Deterministic Automt Julin Slmnc (nd Jurrin Rot) Decemer 19, 2016 In this lecture, we will study the concept of equtions for deterministic utomt. The notes re self contined

More information

Boostrapaggregating (Bagging)

Boostrapaggregating (Bagging) Boostrapaggregatng (Baggng) An ensemble meta-algorthm desgned to mprove the stablty and accuracy of machne learnng algorthms Can be used n both regresson and classfcaton Reduces varance and helps to avod

More information

Abhilasha Classes Class- XII Date: SOLUTION (Chap - 9,10,12) MM 50 Mob no

Abhilasha Classes Class- XII Date: SOLUTION (Chap - 9,10,12) MM 50 Mob no hlsh Clsses Clss- XII Dte: 0- - SOLUTION Chp - 9,0, MM 50 Mo no-996 If nd re poston vets of nd B respetvel, fnd the poston vet of pont C n B produed suh tht C B vet r C B = where = hs length nd dreton

More information

Statistics 423 Midterm Examination Winter 2009

Statistics 423 Midterm Examination Winter 2009 Sttstcs 43 Mdterm Exmnton Wnter 009 Nme: e-ml: 1. Plese prnt your nme nd e-ml ddress n the bove spces.. Do not turn ths pge untl nstructed to do so. 3. Ths s closed book exmnton. You my hve your hnd clcultor

More information

Non-Linear Data for Neural Networks Training and Testing

Non-Linear Data for Neural Networks Training and Testing Proceedngs of the 4th WSEAS Int Conf on Informton Securty, Communctons nd Computers, Tenerfe, Spn, December 6-8, 005 (pp466-47) Non-Lner Dt for Neurl Networks Trnng nd Testng ABDEL LATIF ABU-DALHOUM MOHAMMED

More information

CSC 411 / CSC D11 / CSC C11

CSC 411 / CSC D11 / CSC C11 18 Boostng s a general strategy for learnng classfers by combnng smpler ones. The dea of boostng s to take a weak classfer that s, any classfer that wll do at least slghtly better than chance and use t

More information

7.1 Integral as Net Change and 7.2 Areas in the Plane Calculus

7.1 Integral as Net Change and 7.2 Areas in the Plane Calculus 7.1 Integrl s Net Chnge nd 7. Ares in the Plne Clculus 7.1 INTEGRAL AS NET CHANGE Notecrds from 7.1: Displcement vs Totl Distnce, Integrl s Net Chnge We hve lredy seen how the position of n oject cn e

More information

Demand. Demand and Comparative Statics. Graphically. Marshallian Demand. ECON 370: Microeconomic Theory Summer 2004 Rice University Stanley Gilbert

Demand. Demand and Comparative Statics. Graphically. Marshallian Demand. ECON 370: Microeconomic Theory Summer 2004 Rice University Stanley Gilbert Demnd Demnd nd Comrtve Sttcs ECON 370: Mcroeconomc Theory Summer 004 Rce Unversty Stnley Glbert Usng the tools we hve develoed u to ths ont, we cn now determne demnd for n ndvdul consumer We seek demnd

More information

Semi-supervised Learning

Semi-supervised Learning Semi-supervised Learning Introduction Supervised learning: x r, y r R r=1 E.g.x r : image, y r : class labels Semi-supervised learning: x r, y r r=1 R, x u R+U u=r A set of unlabeled data, usually U >>

More information

Learning Theory: Lecture Notes

Learning Theory: Lecture Notes Learnng Theory: Lecture Notes Lecturer: Kamalka Chaudhur Scrbe: Qush Wang October 27, 2012 1 The Agnostc PAC Model Recall that one of the constrants of the PAC model s that the data dstrbuton has to be

More information

10-701/ Machine Learning, Fall 2005 Homework 3

10-701/ Machine Learning, Fall 2005 Homework 3 10-701/15-781 Machne Learnng, Fall 2005 Homework 3 Out: 10/20/05 Due: begnnng of the class 11/01/05 Instructons Contact questons-10701@autonlaborg for queston Problem 1 Regresson and Cross-valdaton [40

More information

set is not closed under matrix [ multiplication, ] and does not form a group.

set is not closed under matrix [ multiplication, ] and does not form a group. Prolem 2.3: Which of the following collections of 2 2 mtrices with rel entries form groups under [ mtrix ] multipliction? i) Those of the form for which c d 2 Answer: The set of such mtrices is not closed

More information

CMPSCI 250: Introduction to Computation. Lecture #31: What DFA s Can and Can t Do David Mix Barrington 9 April 2014

CMPSCI 250: Introduction to Computation. Lecture #31: What DFA s Can and Can t Do David Mix Barrington 9 April 2014 CMPSCI 250: Introduction to Computtion Lecture #31: Wht DFA s Cn nd Cn t Do Dvid Mix Brrington 9 April 2014 Wht DFA s Cn nd Cn t Do Deterministic Finite Automt Forml Definition of DFA s Exmples of DFA

More information

Introduction to the Introduction to Artificial Neural Network

Introduction to the Introduction to Artificial Neural Network Introducton to the Introducton to Artfcal Neural Netork Vuong Le th Hao Tang s sldes Part of the content of the sldes are from the Internet (possbly th modfcatons). The lecturer does not clam any onershp

More information

Graphical rules for SU(N)

Graphical rules for SU(N) M/FP/Prours of Physque Théorque Invrnes n physs nd group theory Grph rues for SU(N) In ths proem, we de wth grph nguge, whh turns out to e very usefu when omputng group ftors n Yng-Ms fed theory onstruted

More information

IV. Performance Optimization

IV. Performance Optimization IV. Performance Optmzaton A. Steepest descent algorthm defnton how to set up bounds on learnng rate mnmzaton n a lne (varyng learnng rate) momentum learnng examples B. Newton s method defnton Gauss-Newton

More information

CISE 301: Numerical Methods Lecture 5, Topic 4 Least Squares, Curve Fitting

CISE 301: Numerical Methods Lecture 5, Topic 4 Least Squares, Curve Fitting CISE 3: umercl Methods Lecture 5 Topc 4 Lest Squres Curve Fttng Dr. Amr Khouh Term Red Chpter 7 of the tetoo c Khouh CISE3_Topc4_Lest Squre Motvton Gven set of epermentl dt 3 5. 5.9 6.3 The reltonshp etween

More information

Homework Assignment 6 Solution Set

Homework Assignment 6 Solution Set Homework Assignment 6 Solution Set PHYCS 440 Mrch, 004 Prolem (Griffiths 4.6 One wy to find the energy is to find the E nd D fields everywhere nd then integrte the energy density for those fields. We know

More information

Minnesota State University, Mankato 44 th Annual High School Mathematics Contest April 12, 2017

Minnesota State University, Mankato 44 th Annual High School Mathematics Contest April 12, 2017 Minnesot Stte University, Mnkto 44 th Annul High School Mthemtics Contest April, 07. A 5 ft. ldder is plced ginst verticl wll of uilding. The foot of the ldder rests on the floor nd is 7 ft. from the wll.

More information

Numerical Analysis: Trapezoidal and Simpson s Rule

Numerical Analysis: Trapezoidal and Simpson s Rule nd Simpson s Mthemticl question we re interested in numericlly nswering How to we evlute I = f (x) dx? Clculus tells us tht if F(x) is the ntiderivtive of function f (x) on the intervl [, b], then I =

More information

Lecture 2e Orthogonal Complement (pages )

Lecture 2e Orthogonal Complement (pages ) Lecture 2e Orthogonl Complement (pges -) We hve now seen tht n orthonorml sis is nice wy to descrie suspce, ut knowing tht we wnt n orthonorml sis doesn t mke one fll into our lp. In theory, the process

More information

Which Separator? Spring 1

Which Separator? Spring 1 Whch Separator? 6.034 - Sprng 1 Whch Separator? Mamze the margn to closest ponts 6.034 - Sprng Whch Separator? Mamze the margn to closest ponts 6.034 - Sprng 3 Margn of a pont " # y (w $ + b) proportonal

More information

Dennis Bricker, 2001 Dept of Industrial Engineering The University of Iowa. MDP: Taxi page 1

Dennis Bricker, 2001 Dept of Industrial Engineering The University of Iowa. MDP: Taxi page 1 Denns Brcker, 2001 Dept of Industrl Engneerng The Unversty of Iow MDP: Tx pge 1 A tx serves three djcent towns: A, B, nd C. Ech tme the tx dschrges pssenger, the drver must choose from three possble ctons:

More information

Bases for Vector Spaces

Bases for Vector Spaces Bses for Vector Spces 2-26-25 A set is independent if, roughly speking, there is no redundncy in the set: You cn t uild ny vector in the set s liner comintion of the others A set spns if you cn uild everything

More information

Ensemble Methods: Boosting

Ensemble Methods: Boosting Ensemble Methods: Boostng Ncholas Ruozz Unversty of Texas at Dallas Based on the sldes of Vbhav Gogate and Rob Schapre Last Tme Varance reducton va baggng Generate new tranng data sets by samplng wth replacement

More information

Linear Feature Engineering 11

Linear Feature Engineering 11 Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19

More information