Linear Discriminant Functions
Jacob Hays, Amit Pillay, James DeFelice
10/2/2008, Sections 5.8, 5.9, 5.

Minimum Squared Error
- Previous methods only worked on linearly separable cases, looking at misclassified samples to correct the error.
- MSE looks at all samples, using linear equations to find an estimate.
Minimum Squared Error
- x space is mapped to y space: for each sample x_i in dimension d, there is a y_i of dimension d^.
- Find a vector a making all a^T y_i > 0.
- Stack all samples y_i into a matrix Y of dimension n x d^, and ask for Ya = b, where b is a vector of positive constants: b is our margin for error.

    [ y_10  y_11 ... y_1d^ ] [ a_0  ]   [ b_1 ]
    [ y_20  y_21 ... y_2d^ ] [ a_1  ]   [ b_2 ]
    [  ...             ...  ] [ ...  ] = [ ... ]
    [ y_n0  y_n1 ... y_nd^ ] [ a_d^ ]   [ b_n ]

Minimum Squared Error
- Y is rectangular (n x d^), so it has no inverse with which to solve Ya = b directly.
- Ya - b = e gives the error; minimize the squared error ||e||^2:

    J_s(a) = ||Ya - b||^2 = sum_{i=1}^{n} (a^T y_i - b_i)^2

- Take the gradient and set it to zero:

    grad J_s = sum_{i=1}^{n} 2 (a^T y_i - b_i) y_i = 2 Y^T (Ya - b) = 0,   so   Y^T Y a = Y^T b
Minimum Squared Error
- Y^T Y a = Y^T b gives a = (Y^T Y)^{-1} Y^T b.
- (Y^T Y)^{-1} Y^T is the pseudoinverse Y+ of Y, of dimension d^ x n; it satisfies Y+ Y = I (although Y Y+ is not I in general).
- a = Y+ b gives us a solution, with b acting as a margin.
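As a quick illustration (not part of the original slides; the toy data and variable names are my own), the pseudoinverse solution can be computed directly with NumPy:

```python
import numpy as np

# Hypothetical two-class toy data in R^2.
X1 = np.array([[1.0, 2.0], [2.0, 0.0], [2.0, 1.0]])       # class 1
X2 = np.array([[-1.0, -1.0], [0.0, -2.0], [-2.0, -1.0]])  # class 2

# Build Y: prepend a bias coordinate and negate the class-2 rows,
# so a separating vector a satisfies Y a > 0 for every row.
Y = np.vstack([
    np.hstack([np.ones((len(X1), 1)), X1]),
    -np.hstack([np.ones((len(X2), 1)), X2]),
])

b = np.ones(len(Y))          # margin vector of positive constants
a = np.linalg.pinv(Y) @ b    # a = (Y^T Y)^{-1} Y^T b when Y^T Y is invertible

print(np.round(Y @ a, 3))    # Ya should come close to the requested margins b
```

Note that `np.linalg.pinv` computes the Moore-Penrose pseudoinverse even when Y^T Y is singular, so the same line works in the degenerate case too.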
Fisher's Linear Discriminant
- Based on projection of d-dimensional data onto a line.
- Loses a lot of information, but some orientation of the line might give a good split.
- y = w^T x, with ||w|| = 1: y_i is the projection of x_i onto the line w.
- Goal: find the best w to separate the classes. Highly overlapping data performs poorly.

Fisher's Linear Discriminant
- Mean of each class D_i:  m_i = (1/n_i) sum_{x in D_i} x
- A first guess:  w = (m_1 - m_2) / ||m_1 - m_2||
Fisher's Linear Discriminant
- Scatter matrices:

    S_i = sum_{x in D_i} (x - m_i)(x - m_i)^T,    S_W = S_1 + S_2

- Fisher's direction:  w = S_W^{-1} (m_1 - m_2)

Fisher's Relation to MSE
- MSE and Fisher are equivalent for a specific choice of margin: b_i = n / n_i for the samples of class i (with 1_i a column vector of n_i ones).
- Writing Y = [ 1_1  X_1 ; -1_2  -X_2 ] and a = [ w_0 ; w ] and plugging into Y^T Y a = Y^T b yields

    w = alpha S_W^{-1} (m_1 - m_2)

  for some scalar alpha, i.e. the MSE weight vector points in Fisher's direction.
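A minimal NumPy sketch of Fisher's direction (an illustration only; the synthetic Gaussian data and function name are my own):

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher's linear discriminant direction w = S_W^{-1} (m_1 - m_2)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter S_W = S_1 + S_2, S_i = sum (x - m_i)(x - m_i)^T
    S1 = (X1 - m1).T @ (X1 - m1)
    S2 = (X2 - m2).T @ (X2 - m2)
    return np.linalg.solve(S1 + S2, m1 - m2)

rng = np.random.default_rng(0)
X1 = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))
X2 = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(50, 2))
w = fisher_direction(X1, X2)

# Projecting onto w should separate these two well-separated classes cleanly.
print((X1 @ w).min() > (X2 @ w).max())
```

Solving S_W x = (m_1 - m_2) with `np.linalg.solve` avoids forming the explicit inverse, which is the usual numerical practice.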
Relation to the Optimal Discriminant
- If you set b = 1_n (all ones), the MSE solution approaches the optimal Bayes discriminant g_0 as the number of samples approaches infinity (see 5.8.3):

    g_0(x) = P(omega_1 | x) - P(omega_2 | x)

  where g(x) is the MSE estimate of g_0(x).

Widrow-Hoff / LMS
- LMS: Least Mean Squared. Still solves the problem when Y^T Y is singular.
- Algorithm:

    initialize a, b, threshold theta, step sizes eta(.), k = 0
    begin
        do  k = (k + 1) mod n
            a = a + eta(k) (b_k - a^T y_k) y_k
        until  ||eta(k) (b_k - a^T y_k) y_k|| < theta
        return a
    end
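The LMS loop above can be sketched as follows (a hedged illustration: the decreasing step schedule, the stopping threshold, and the toy data are my own choices, not from the slides):

```python
import numpy as np

def lms(Y, b, eta0=0.1, theta=1e-6, max_iter=20000):
    """Widrow-Hoff / LMS: single-sample updates a += eta(k) (b_k - a^T y_k) y_k."""
    n = len(Y)
    a = np.zeros(Y.shape[1])
    for it in range(max_iter):
        k = it % n
        eta = eta0 / (it + 1)                 # eta(k) ~ eta(1)/k keeps updates stable
        step = eta * (b[k] - a @ Y[k]) * Y[k]
        a = a + step
        if np.linalg.norm(step) < theta:      # corrections have become negligible
            break
    return a

# Rows of Y: augmented samples, with class-2 rows already negated.
Y = np.array([[ 1.0, 2.0,  1.0],
              [ 1.0, 1.0,  2.0],
              [-1.0, 1.0,  1.0],
              [-1.0, 2.0, -1.0]])
b = np.ones(len(Y))
a = lms(Y, b)
print(np.round(a, 3))
```

Because each update touches only one sample, LMS never forms Y^T Y, which is why it still behaves sensibly when Y^T Y is singular.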
Widrow-Hoff / LMS
- LMS is not guaranteed to converge to a separating plane, even if one exists.
- Procedural differences:
  - Perceptron, relaxation: if the samples are linearly separable we can find a solution; otherwise we do not converge to a solution.
  - MSE: always yields a weight vector, but it may not be the best solution, and it is not guaranteed to be a separating vector.
Choosing b
- With an arbitrary b, MSE minimizes ||Ya - b||^2.
- If the data are linearly separable, we can choose b more smartly.
- Define â and ß such that Yâ = ß > 0 (every component of ß is positive).
- Modified MSE: J_s(a, b) = ||Ya - b||^2, with both a and b allowed to vary, subject to b > 0.
- The minimum of J_s is zero, and the a that achieves it is a separating vector.
Ho-Kashyap / Descent Procedure
- Gradients of J_s(a, b) = ||Ya - b||^2:

    grad_a J_s = 2 Y^T (Ya - b),    grad_b J_s = -2 (Ya - b)

- For any b, a = Y+ b makes grad_a J_s = 0. So grad_a J_s = 0 and we're done? No:
  - must avoid b = 0
  - must avoid b < 0

Ho-Kashyap / Descent Procedure
- Pick a positive b and never allow a reduction of b's components: set all positive components of grad_b J_s to zero before stepping.
- b(k+1) = b(k) - eta c, where

    c_j = (grad_b J_s)_j  if (grad_b J_s)_j <= 0,  and  c_j = 0 otherwise,
    i.e.  c = (1/2) (grad_b J_s - |grad_b J_s|)
Ho-Kashyap / Descent Procedure
- With grad_b J_s = -2(Ya - b) = -2e, the update b_{k+1} = b_k - eta * (1/2)(grad_b J_s - |grad_b J_s|) becomes

    e_k = Y a_k - b_k,   e+_k = (1/2)(e_k + |e_k|),   b_{k+1} = b_k + 2 eta(k) e+_k,   a_{k+1} = Y+ b_{k+1}

Ho-Kashyap Algorithm

    begin initialize a, b, eta(.) < 1, threshold b_min, k_max
        do  k = (k + 1) mod n
            e = Ya - b
            e+ = (1/2)(e + abs(e))
            b = b + 2 eta(k) e+
            a = Y+ b
            if abs(e) <= b_min then return a, b and exit
        until k = k_max
        print NO SOLUTION
    end

- When e(k) = 0 we have a solution.
- When e(k) <= 0 (no positive component), the samples are not linearly separable.
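A direct NumPy transcription of the algorithm (an illustrative sketch; the toy data are chosen, hypothetically, so that Ya = b is exactly solvable and the loop terminates quickly):

```python
import numpy as np

def ho_kashyap(Y, eta=0.5, b_min=1e-6, k_max=5000):
    """Ho-Kashyap: b <- b + 2*eta*e+, a <- Y+ b, with e = Ya - b."""
    Y_pinv = np.linalg.pinv(Y)
    b = np.ones(len(Y))            # start from a strictly positive margin vector
    a = Y_pinv @ b
    for _ in range(k_max):
        e = Y @ a - b
        if np.all(np.abs(e) <= b_min):
            return a, b, True      # converged: Ya = b > 0, a separates the data
        e_plus = 0.5 * (e + np.abs(e))  # zero out the negative components of e
        b = b + 2.0 * eta * e_plus      # b's components never decrease
        a = Y_pinv @ b
    return a, b, False             # no solution found within k_max steps

# Separable toy data (class-2 rows negated, bias column first).
Y = np.array([[ 1.0, 2.0, 1.0],
              [ 1.0, 1.0, 2.0],
              [-1.0, 0.5, 0.5],
              [ 1.0, 1.5, 1.5]])
a, b, ok = ho_kashyap(Y)
print(ok, np.all(Y @ a > 0))
```

Keeping `e_plus` instead of the full error is exactly the "never reduce b" rule: components of b can only grow, which rules out the degenerate b = 0 minimum.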
Convergence (separable case)
- If 0 < eta < 1 and the samples are linearly separable:
  - a solution vector exists
  - we will find it in a finite number of steps k
- Two possibilities: e(k) = 0 for some finite k_0, or e(k) is never zero.
- If e(k_0) = 0: a(k), b(k), e(k) stop changing, and Ya(k) = b(k) > 0 for all k > k_0. If we reach such a k_0, the algorithm terminates with a solution vector.

Convergence (separable)
- Suppose e(k) is never zero for finite k. If the samples are linearly separable, there exist â and ß with Yâ = ß > 0.
- Because ß is positive, e(k) is either zero or has a positive component; since e(k) cannot be zero (first bullet), it must have positive components.
Convergence (separable)
- One can show

    (1/4)(||e_k||^2 - ||e_{k+1}||^2) = eta (1 - eta) ||e+_k||^2 + eta^2 e+_k^T Y Y+ e+_k

- Y Y+ is symmetric and positive semi-definite, and 0 < eta < 1.
- Therefore ||e_k|| > ||e_{k+1}|| whenever e+_k != 0: e eventually converges to zero, and a eventually converges to a solution vector.

Convergence (non-separable)
- If the data are not linearly separable, we may obtain a nonzero error vector with no positive components.
- We still have

    (1/4)(||e_k||^2 - ||e_{k+1}||^2) = eta (1 - eta) ||e+_k||^2 + eta^2 e+_k^T Y Y+ e+_k

  so the limiting e cannot be zero: it converges to a non-zero value.
- Summary: e+_k = 0 for some finite k in the separable case; e+_k converges to zero while e stays bounded away from zero in the non-separable case.
Support Vector Machines (SVMs)
- By representing the data in a higher-dimensional space, an SVM constructs a separating hyperplane in that space, one which maximizes the margin between the two data sets.
Applications
- Face detection, verification, and recognition
- Object detection and recognition
- Handwritten character and digit recognition
- Text detection and categorization
- Speech and speaker verification, recognition
- Information and image retrieval

Formalization
- We are given some training data, a set of points of the form

    D = { (x_i, y_i) : x_i in R^d, y_i in {-1, +1}, i = 1, ..., n }

- Equation of the separating hyperplane:  w . x - b = 0
- The vector w is a normal vector; the parameter b / ||w|| determines the offset of the hyperplane from the origin along the normal vector.
Formalization (cont.)
- Define two hyperplanes:

    H1:  w . x - b = 1
    H2:  w . x - b = -1

- These hyperplanes are defined in such a way that no points lie between them. To prevent data points from falling between them, two constraints are imposed:

    w . x_i - b >= 1    for x_i of the first class (y_i = +1)
    w . x_i - b <= -1   for x_i of the second class (y_i = -1)

Formulation (cont.)
- This can be rewritten as:  y_i (w . x_i - b) >= 1  for all i.
- So the formulation of the optimization problem is: choose w, b to minimize ||w|| subject to y_i (w . x_i - b) >= 1.
SVM Hyperplane Example
[figure: example of a maximum-margin separating hyperplane]

SVM Training
- Lagrange optimization problem. The reformulated optimization problem is:

    L_P = (1/2) ||w||^2 - sum_{i=1}^{n} alpha_i [ y_i (w . x_i - b) - 1 ]

- Thus the new optimization problem is to minimize L_P w.r.t. w and b, subject to alpha_i >= 0.
SVM Training (cont.)
- Dual of the Lagrange formulation: at the optimum, the gradient of L_P with respect to w and b vanishes, giving

    w = sum_i alpha_i y_i x_i,    sum_i alpha_i y_i = 0

- Substituting back, we have the dual:

    L_D = sum_i alpha_i - (1/2) sum_i sum_j alpha_i alpha_j y_i y_j (x_i . x_j)

- The optimization problem w.r.t. the dual is to maximize L_D subject to alpha_i >= 0 and sum_i alpha_i y_i = 0.
- This shows that the solution depends on the input points only through inner products.
- Most of the points have alpha_i = 0; the points for which alpha_i is not zero are the closest points to the separating hyperplane. These points are called support vectors.
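The dual above is normally solved with quadratic programming (e.g. SMO). As a self-contained sketch only, here is a primal subgradient method (Pegasos-style) substituted for the QP, on hypothetical toy data; points whose margin y_i (w . x_i) lands near 1 play the role of the support vectors:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, iters=2000):
    """Subgradient descent on lam/2 ||w||^2 + (1/n) sum max(0, 1 - y_i w.x_i).
    The bias is folded into w via a constant feature appended to X."""
    Xa = np.hstack([X, np.ones((len(X), 1))])   # augment with a bias feature
    n, d = Xa.shape
    w = np.zeros(d)
    for t in range(1, iters + 1):
        eta = 1.0 / (lam * t)                   # Pegasos step-size schedule
        viol = y * (Xa @ w) < 1                 # points violating the margin
        grad = lam * w - (y[viol, None] * Xa[viol]).sum(axis=0) / n
        w -= eta * grad
    return w, Xa

X = np.array([[ 2.0, 0.0], [ 3.0,  1.0],
              [-2.0, 0.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, Xa = train_linear_svm(X, y)

margins = y * (Xa @ w)
print(np.round(margins, 2))   # the two closest points end up nearest margin 1
```

This is only a loose stand-in for the dual solver: it recovers a similar maximum-margin direction, but it does not produce the alpha_i directly.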
Advantages & Disadvantages of SVM
- Advantages:
  - Gives high generalization performance.
  - The complexity of the SVM classifier is characterized by the number of support vectors rather than the dimensionality of the transformed space.
- Disadvantages:
  - The training time scales somewhere between quadratic and cubic with respect to the number of training samples.

Recognition of 3D Objects
- The experiment involved recognition of 3D objects from the COIL database.
- Each COIL image is transformed into an eight-bit vector of 32 x 32 = 1024 components.
References
- Pontil, M.; Verri, A., "Support vector machines for 3D object recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 20, Issue 6, June 1998.
- Christopher J. C. Burges, "A tutorial on support vector machines for pattern recognition" (1998).
- R. O. Duda, P. E. Hart and D. G. Stork, Pattern Classification (2nd Edition), Wiley, 2001.
- R. J. Schalkoff (1992), Pattern Recognition: Statistical, Structural, and Neural Approaches, Wiley.