Support-Vector Machines
- Ella Townsend
- 5 years ago
Introduction

The support vector machine (SVM) is a linear machine with some very nice properties (Haykin, chapter 6; see Alpaydin, chapter 13, for similar content). Note: part of this lecture drew material from Ricardo Gutierrez-Osuna's Pattern Analysis lectures.

The basic idea of the SVM is to construct a separating hyperplane such that the margin of separation between positive and negative examples is maximized.

Principled derivation (structural risk minimization): the error rate is bounded by (1) the training error rate and (2) the VC dimension of the model. The SVM makes (1) become zero and minimizes (2).

Optimal Hyperplane

For linearly separable patterns $\{(x_i, d_i)\}_{i=1}^N$ with $d_i \in \{+1, -1\}$, the separating hyperplane is $w^T x + b = 0$, with

$w^T x_i + b \ge 0$ for $d_i = +1$
$w^T x_i + b < 0$ for $d_i = -1$

Distance to the Optimal Hyperplane

Let $w_o$ be the weight vector of the optimal hyperplane and $b_o$ the optimal bias. For a point $x_i$ on the hyperplane, $w_o^T x_i = -b_o$, so the distance from the origin to the hyperplane is

$d = \|x_i\| \cos(x_i, w_o) = -b_o / \|w_o\|$

since $w_o^T x_i = \|w_o\| \|x_i\| \cos(w_o, x_i) = -b_o$.
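The distance formula above can be checked numerically. The sketch below uses NumPy with an illustrative hyperplane (the values are not from the lecture):

```python
import numpy as np

def signed_distance(x, w, b):
    """Signed distance r = (w^T x + b) / ||w|| from point x to the
    hyperplane w^T x + b = 0; positive on the side w points toward."""
    return (w @ x + b) / np.linalg.norm(w)

# Hyperplane x1 + x2 - 1 = 0, i.e. w = [1, 1], b = -1.
w = np.array([1.0, 1.0])
b = -1.0

# The origin lies at distance |b|/||w|| = 1/sqrt(2), on the negative side.
r0 = signed_distance(np.array([0.0, 0.0]), w, b)
print(r0)  # → -0.7071...
```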
Distance to the Optimal Hyperplane (cont'd)

The distance $r$ from an arbitrary point $x$ to the hyperplane can be calculated as follows.

When $x$ is in the positive region:

$r = \|x\| \cos(x, w_o) - d = \frac{x^T w_o}{\|w_o\|} + \frac{b_o}{\|w_o\|} = \frac{w_o^T x + b_o}{\|w_o\|}$

When $x$ is in the negative region:

$r = d - \|x\| \cos(x, w_o) = -\frac{x^T w_o}{\|w_o\|} - \frac{b_o}{\|w_o\|} = -\frac{w_o^T x + b_o}{\|w_o\|}$

Support Vectors

Support vectors: the input points closest to the separating hyperplane. Margin of separation $\rho$: the distance between the separating hyperplane and the closest input point.

Optimal Hyperplane and Support Vectors (cont'd)

The optimal hyperplane is supposed to maximize the margin of separation $\rho$. With that requirement, we can write the conditions that $w_o$ and $b_o$ must meet:

$w_o^T x_i + b_o \ge +1$ for $d_i = +1$
$w_o^T x_i + b_o \le -1$ for $d_i = -1$

Note the $+1$ and $-1$ on the right-hand sides: support vectors are those $x^{(s)}$ for which equality holds (i.e., $w_o^T x^{(s)} + b_o = +1$ or $-1$).

Since $r = (w_o^T x + b_o)/\|w_o\|$, for a support vector

$r = 1/\|w_o\|$ if $d^{(s)} = +1$, and $r = -1/\|w_o\|$ if $d^{(s)} = -1$.

The margin of separation between the two classes is therefore

$\rho = 2r = \frac{2}{\|w_o\|}.$

Thus, maximizing the margin of separation between the two classes is equivalent to minimizing the Euclidean norm of the weight vector $w$!
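A quick numeric illustration of $\rho = 2/\|w\|$, using hand-picked values (not from the lecture): the hyperplane below puts its two support vectors exactly on the $\pm 1$ level sets.

```python
import numpy as np

# Illustrative hyperplane w = [0.5, 0.5], b = -1 with support vectors
# x+ = (2, 2) and x- = (0, 0) lying exactly on w^T x + b = +1 / -1.
w = np.array([0.5, 0.5])
b = -1.0
x_pos, x_neg = np.array([2.0, 2.0]), np.array([0.0, 0.0])

assert w @ x_pos + b == 1.0 and w @ x_neg + b == -1.0

rho = 2.0 / np.linalg.norm(w)               # margin of separation 2/||w||
# Same number as the gap between the support vectors projected onto w.
gap = (w @ (x_pos - x_neg)) / np.linalg.norm(w)
assert np.isclose(rho, gap)
```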
Primal Problem: Constrained Optimization

For the training set $T = \{(x_i, d_i)\}_{i=1}^N$, find $w$ and $b$ that minimize a cost (proportional to $1/\rho$) while satisfying a constraint (all examples are correctly classified):

Constraint: $d_i(w^T x_i + b) \ge 1$ for $i = 1, 2, \ldots, N$.
Cost function: $\Phi(w) = \frac{1}{2} w^T w$.

This problem can be solved using the method of Lagrange multipliers.

Mathematical Aside: Lagrange Multipliers

Lagrange multipliers turn a constrained optimization problem into an unconstrained one by absorbing the constraints into the cost function, weighted by the multipliers.

Example: find the point on the circle $x^2 + y^2 = 1$ closest to the point $(2, 2)$ (adapted from Ballard, An Introduction to Natural Computation, 1997). Minimize $F(x, y) = (x-2)^2 + (y-2)^2$ subject to the constraint $x^2 + y^2 - 1 = 0$. Absorb the constraint into the cost function, after multiplying by the Lagrange multiplier $\alpha$:

$F(x, y, \alpha) = (x-2)^2 + (y-2)^2 + \alpha(x^2 + y^2 - 1)$

We must find the $x$, $y$, $\alpha$ that minimize $F(x, y, \alpha)$. Set the partial derivatives to 0 and solve the system of equations:

$\partial F/\partial x = 2(x-2) + 2\alpha x = 0$
$\partial F/\partial y = 2(y-2) + 2\alpha y = 0$
$\partial F/\partial \alpha = x^2 + y^2 - 1 = 0$

Solve for $x$ and $y$ in the first two equations and plug them into the third:

$x = y = \frac{2}{1+\alpha}$, so $\left(\frac{2}{1+\alpha}\right)^2 + \left(\frac{2}{1+\alpha}\right)^2 = 1$,

from which we get $\alpha = 2\sqrt{2} - 1$. Thus, $(x, y) = (1/\sqrt{2}, 1/\sqrt{2})$.

Primal Problem: Constrained Optimization (cont'd)

Putting the constrained optimization problem into Lagrangian form (utilizing the Kuhn-Tucker theorem), we get

$J(w, b, \alpha) = \frac{1}{2} w^T w - \sum_{i=1}^N \alpha_i \left[ d_i(w^T x_i + b) - 1 \right].$

From $\partial J(w, b, \alpha)/\partial w = 0$:  $w = \sum_{i=1}^N \alpha_i d_i x_i$.
From $\partial J(w, b, \alpha)/\partial b = 0$:  $\sum_{i=1}^N \alpha_i d_i = 0$.
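The toy Lagrange-multiplier problem above can be verified numerically (pure NumPy; the brute-force scan at the end is just a sanity check):

```python
import numpy as np

# Point on the unit circle x^2 + y^2 = 1 closest to (2, 2).
alpha = 2 * np.sqrt(2) - 1       # from 2 * (2/(1+alpha))^2 = 1
x = y = 2 / (1 + alpha)          # stationarity: 2(x-2) + 2*alpha*x = 0

assert np.isclose(x, 1 / np.sqrt(2))
assert np.isclose(x**2 + y**2, 1.0)          # constraint satisfied

# Sanity check against a brute-force scan over the circle.
theta = np.linspace(0, 2 * np.pi, 100000)
dist2 = (np.cos(theta) - 2) ** 2 + (np.sin(theta) - 2) ** 2
best = theta[np.argmin(dist2)]
assert np.isclose(np.cos(best), x, atol=1e-3)
```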
Primal Problem: Constrained Optimization (cont'd)

Note that when the optimal solution is reached, the following condition (the Karush-Kuhn-Tucker complementary condition) must hold for all $i = 1, 2, \ldots, N$:

$\alpha_i \left[ d_i(w^T x_i + b) - 1 \right] = 0.$

Thus, a non-zero $\alpha_i$ can be attained only when $d_i(w^T x_i + b) - 1 = 0$, i.e., when $\alpha_i$ is associated with a support vector $x^{(s)}$! The other conditions include $\alpha_i \ge 0$.

Plugging $w = \sum_{i=1}^N \alpha_i d_i x_i$ and $\sum_{i=1}^N \alpha_i d_i = 0$ back into $J(w, b, \alpha)$, we get the dual problem:

$J(w, b, \alpha) = \frac{1}{2} w^T w - \sum_{i=1}^N \alpha_i d_i w^T x_i - b \sum_{i=1}^N \alpha_i d_i + \sum_{i=1}^N \alpha_i$

Noting that $w^T w = \sum_{i=1}^N \alpha_i d_i w^T x_i$, and using $\sum_{i=1}^N \alpha_i d_i = 0$:

$J(w, b, \alpha) = -\frac{1}{2} \sum_{i=1}^N \sum_{j=1}^N \alpha_i \alpha_j d_i d_j x_i^T x_j + \sum_{i=1}^N \alpha_i = Q(\alpha).$

So $J(w, b, \alpha) = Q(\alpha)$, with $\alpha_i \ge 0$. This results in the dual problem.

Dual Problem

Given the training sample $\{(x_i, d_i)\}_{i=1}^N$, find the Lagrange multipliers $\{\alpha_i\}_{i=1}^N$ that maximize the objective function

$Q(\alpha) = \sum_{i=1}^N \alpha_i - \frac{1}{2} \sum_{i=1}^N \sum_{j=1}^N \alpha_i \alpha_j d_i d_j x_i^T x_j$

subject to the constraints

$\sum_{i=1}^N \alpha_i d_i = 0$, and $\alpha_i \ge 0$ for all $i = 1, 2, \ldots, N$.

The problem is stated entirely in terms of the training data $(x_i, d_i)$, and the dot products $x_i^T x_j$ play a key role.

Solution to the Optimization Problem

Once all the optimal Lagrange multipliers $\alpha_{o,i}$ are found (using sequential minimal optimization, etc.), $w_o$ and $b_o$ can be found as follows:

$w_o = \sum_{i=1}^N \alpha_{o,i} d_i x_i$

and, from $w_o^T x_i + b_o = d_i$ when $x_i$ is a support vector,

$b_o = d^{(s)} - w_o^T x^{(s)}.$

Note: evaluating the final estimated function does not need any explicit calculation of $w_o$, since it can be computed from dot products between the input vectors:

$w_o^T x = \sum_{i=1}^N \alpha_{o,i} d_i \, x_i^T x.$
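For a small dataset, the dual can be solved directly with a general-purpose optimizer instead of SMO. The sketch below uses SciPy's SLSQP on an illustrative four-point problem (dedicated QP/SMO solvers are what production libraries actually use):

```python
import numpy as np
from scipy.optimize import minimize

# Tiny linearly separable training set (illustrative).
X = np.array([[2.0, 2.0], [3.0, 3.0], [0.0, 0.0], [-1.0, 0.0]])
d = np.array([1.0, 1.0, -1.0, -1.0])
N = len(d)

# H_ij = d_i d_j x_i^T x_j, so Q(a) = sum(a) - 0.5 a^T H a.
H = (d[:, None] * d[None, :]) * (X @ X.T)

def neg_Q(a):                    # maximize Q  <=>  minimize -Q
    return 0.5 * a @ H @ a - a.sum()

res = minimize(neg_Q, np.zeros(N),
               bounds=[(0, None)] * N,                      # alpha_i >= 0
               constraints=[{"type": "eq", "fun": lambda a: a @ d}])
alpha = res.x

w = (alpha * d) @ X              # w_o = sum_i alpha_i d_i x_i
s = np.argmax(alpha)             # index of a support vector
b = d[s] - w @ X[s]              # b_o = d^(s) - w_o^T x^(s)

# Every training point satisfies d_i (w^T x_i + b) >= 1 (up to solver tol).
assert np.all(d * (X @ w + b) >= 1 - 1e-3)
```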
Margin of Separation in SVM and VC Dimension

Statistical learning theory shows that it is desirable to reduce both the error (empirical risk) and the VC dimension of the classifier. Vapnik (1995, 1998) showed the following. Let $D$ be the diameter of the smallest ball containing all input vectors $x_i$. The set of optimal hyperplanes defined by $w_o^T x + b_o = 0$ has a VC dimension $h$ bounded from above as

$h \le \min\left\{ \left\lceil \frac{D^2}{\rho^2} \right\rceil, m_0 \right\} + 1$

where $\lceil \cdot \rceil$ is the ceiling, $\rho$ is the margin of separation (equal to $2/\|w_o\|$), and $m_0$ is the dimensionality of the input space. The implication is that the VC dimension can be controlled independently of $m_0$, by choosing an appropriate (large) $\rho$!

Soft-Margin Classification

Some problems can violate the condition $d_i(w^T x_i + b) \ge 1$: points may fall inside the margin, either correctly or incorrectly classified. We can introduce a new set of variables $\{\xi_i\}_{i=1}^N$:

$d_i(w^T x_i + b) \ge 1 - \xi_i$

where $\xi_i$ is called a slack variable.

Soft-Margin Classification (cont'd)

We want to find a separating hyperplane that minimizes the number of margin violations:

$\Phi(\xi) = \sum_{i=1}^N I(\xi_i - 1)$, where $I(\xi) = 0$ if $\xi \le 0$ and $1$ otherwise.

Solving the above is NP-complete, so we instead solve an approximation:

$\Phi(\xi) = \sum_{i=1}^N \xi_i.$

Soft-Margin Classification: Solution

Following a similar route involving Lagrange multipliers, with the more restrictive condition $0 \le \alpha_i \le C$, we get the solution

$w_o = \sum_{i=1}^{N_s} \alpha_{o,i} d_i x_i$, with $b_o = d_i(1 - \xi_i) - w_o^T x_i$ for a support vector $x_i$,

where $N_s$ is the number of support vectors. The full cost function combines the margin term and the error term, with a control parameter $C$:

$\Phi(w, \xi) = \underbrace{\frac{1}{2} w^T w}_{\text{controls VC dim}} + \underbrace{C \sum_{i=1}^N \xi_i}_{\text{controls error}}$
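The dual-solver sketch extends to the soft-margin case by capping the multipliers at $C$. The data below are illustrative: the negative point at $(2.5, 2.5)$ lies on the segment between the two positives, so no hyperplane can classify it correctly, and its multiplier gets pushed to the bound $C$:

```python
import numpy as np
from scipy.optimize import minimize

# One point deliberately placed inside the positive cluster.
X = np.array([[2.0, 2.0], [3.0, 3.0], [0.0, 0.0], [2.5, 2.5]])
d = np.array([1.0, 1.0, -1.0, -1.0])   # last label is unsatisfiable
C = 1.0
N = len(d)
H = (d[:, None] * d[None, :]) * (X @ X.T)

res = minimize(lambda a: 0.5 * a @ H @ a - a.sum(),
               np.zeros(N),
               bounds=[(0, C)] * N,     # soft margin: 0 <= alpha_i <= C
               constraints=[{"type": "eq", "fun": lambda a: a @ d}])
alpha = res.x

assert res.success
assert np.all(alpha <= C + 1e-8)
# KKT: a point with positive slack has its multiplier at the cap C.
assert np.isclose(alpha[3], C, atol=1e-2)
```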
Nonlinear SVM

The input $x$ is mapped from the input space to a feature-space vector $\phi(x)$. Two steps:

1. Nonlinear mapping of an input vector to a high-dimensional feature space (exploiting Cover's theorem).
2. Construction of an optimal hyperplane for separating the features identified in the first step.

Inner-Product Kernel

With the weight $w$ (including the bias $b$), the decision surface in the feature space becomes (assuming $\phi_0(x) = 1$):

$w^T \phi(x) = 0.$

Using the steps in the linear SVM, we get $w = \sum_{i=1}^N \alpha_i d_i \phi(x_i)$. Combining the two, we get the decision surface

$\sum_{i=1}^N \alpha_i d_i \, \phi^T(x_i) \phi(x) = 0.$

Inner-Product Kernel (cont'd)

The inner product $\phi^T(x) \phi(x_i)$ is between two vectors in the feature space. The calculation of this inner product can be simplified by use of an inner-product kernel $K(x, x_i)$:

$K(x, x_i) = \phi^T(x) \phi(x_i) = \sum_{j=0}^{m_1} \phi_j(x) \phi_j(x_i)$

where $m_1$ is the dimension of the feature space. (Note: $K(x, x_i) = K(x_i, x)$.)

Mercer's theorem states that any $K(x, x_i)$ satisfying certain conditions (continuous, symmetric, positive semi-definite) can be expressed as an inner product in a nonlinearly mapped feature space. The kernel function $K(x, x_i)$ thus allows us to calculate the inner product $\phi^T(x) \phi(x_i)$ in the mapped feature space without any explicit calculation of the mapping function $\phi(\cdot)$. So the optimal hyperplane becomes:

$\sum_{i=1}^N \alpha_i d_i K(x, x_i) = 0.$
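Mercer's positive semi-definiteness condition can be spot-checked on a sample: for a valid kernel, the Gram matrix on any point set must be symmetric PSD. A minimal sketch with the RBF kernel ($\sigma = 1$, random points):

```python
import numpy as np

# Mercer's condition in miniature: build the Gram matrix
# K_ij = K(x_i, x_j) on random points and check symmetry and PSD-ness.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
sigma = 1.0

sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # ||x_i - x_j||^2
K = np.exp(-sq / (2 * sigma**2))                      # RBF Gram matrix

assert np.allclose(K, K.T)                     # symmetric
assert np.linalg.eigvalsh(K).min() >= -1e-10   # PSD up to round-off
```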
Examples of Kernel Functions

Linear: $K(x, x_i) = x^T x_i$.
Polynomial: $K(x, x_i) = (x^T x_i + 1)^p$.
RBF: $K(x, x_i) = \exp\left(-\frac{1}{2\sigma^2} \|x - x_i\|^2\right)$.
Two-layer perceptron: $K(x, x_i) = \tanh(\beta_0 x^T x_i + \beta_1)$ (for some $\beta_0$ and $\beta_1$).

Expanding-Kernel Example

$K(x, x_i) = (1 + x^T x_i)^2$ with $x = [x_1, x_2]^T$ and $x_i = [x_{i1}, x_{i2}]^T$:

$K(x, x_i) = 1 + 2 x_1 x_{i1} + 2 x_2 x_{i2} + x_1^2 x_{i1}^2 + 2 x_1 x_2 x_{i1} x_{i2} + x_2^2 x_{i2}^2$
$= [1,\ x_1^2,\ \sqrt{2}\, x_1 x_2,\ x_2^2,\ \sqrt{2}\, x_1,\ \sqrt{2}\, x_2]\,[1,\ x_{i1}^2,\ \sqrt{2}\, x_{i1} x_{i2},\ x_{i2}^2,\ \sqrt{2}\, x_{i1},\ \sqrt{2}\, x_{i2}]^T = \phi(x)^T \phi(x_i),$

where $\phi(x) = [1,\ x_1^2,\ \sqrt{2}\, x_1 x_2,\ x_2^2,\ \sqrt{2}\, x_1,\ \sqrt{2}\, x_2]^T$.

Nonlinear SVM: Solution

The solution is basically the same as in the linear case, with $x_i^T x_j$ replaced by $K(x_i, x_j)$, and with the additional constraint $\alpha_i \le C$ added.

Nonlinear SVM: Summary

Project the input to a high-dimensional space to turn the problem into a linearly separable one. Issues with a projection to a higher-dimensional feature space:

Statistical problem: danger of invoking the curse of dimensionality and a higher chance of overfitting. Use large margins to reduce the VC dimension.
Computational problem: computational overhead for calculating the mapping $\phi(\cdot)$. Solved by using the kernel trick.
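The expansion above can be verified numerically: the kernel value and the explicit feature-space inner product agree for arbitrary inputs (a sketch in NumPy):

```python
import numpy as np

def phi(x):
    # Explicit feature map for the degree-2 polynomial kernel (1 + x.x')^2
    x1, x2 = x
    return np.array([1.0, x1**2, np.sqrt(2) * x1 * x2, x2**2,
                     np.sqrt(2) * x1, np.sqrt(2) * x2])

rng = np.random.default_rng(1)
for _ in range(100):
    x, xi = rng.normal(size=2), rng.normal(size=2)
    # Kernel trick: no explicit mapping needed on the left-hand side.
    assert np.isclose((1 + x @ xi) ** 2, phi(x) @ phi(xi))
```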
More informationComputational modeling techniques
Cmputatinal mdeling techniques Lecture 11: Mdeling with systems f ODEs In Petre Department f IT, Ab Akademi http://www.users.ab.fi/ipetre/cmpmd/ Mdeling with differential equatins Mdeling strategy Fcus
More informationLead/Lag Compensator Frequency Domain Properties and Design Methods
Lectures 6 and 7 Lead/Lag Cmpensatr Frequency Dmain Prperties and Design Methds Definitin Cnsider the cmpensatr (ie cntrller Fr, it is called a lag cmpensatr s K Fr s, it is called a lead cmpensatr Ntatin
More informationSupport Vector Machines: Maximum Margin Classifiers
Support Vector Machines: Maximum Margin Classifiers Machine Learning and Pattern Recognition: September 16, 2008 Piotr Mirowski Based on slides by Sumit Chopra and Fu-Jie Huang 1 Outline What is behind
More informationENGI 1313 Mechanics I
ENGI 1313 Mechanics I Lecture 11: 2D and 3D Particle Equilibrium Shawn Kenny, Ph.D., P.Eng. Assistant Prfessr aculty f Engineering and Applied Science Memrial University f Newfundland spkenny@engr.mun.ca
More informationA.H. Helou Ph.D.~P.E.
1 EVALUATION OF THE STIFFNESS MATRIX OF AN INDETERMINATE TRUSS USING MINIMIZATION TECHNIQUES A.H. Helu Ph.D.~P.E. :\.!.\STRAC'l' Fr an existing structure the evaluatin f the Sti"ffness matrix may be hampered
More informationA Scalable Recurrent Neural Network Framework for Model-free
A Scalable Recurrent Neural Netwrk Framewrk fr Mdel-free POMDPs April 3, 2007 Zhenzhen Liu, Itamar Elhanany Machine Intelligence Lab Department f Electrical and Cmputer Engineering The University f Tennessee
More informationMathematics and Computer Sciences Department. o Work Experience, General. o Open Entry/Exit. Distance (Hybrid Online) for online supported courses
SECTION A - Curse Infrmatin 1. Curse ID: 2. Curse Title: 3. Divisin: 4. Department: 5. Subject: 6. Shrt Curse Title: 7. Effective Term:: MATH 70S Integrated Intermediate Algebra Natural Sciences Divisin
More informationLecture 5: Equilibrium and Oscillations
Lecture 5: Equilibrium and Oscillatins Energy and Mtin Last time, we fund that fr a system with energy cnserved, v = ± E U m ( ) ( ) One result we see immediately is that there is n slutin fr velcity if
More informationKinetic Model Completeness
5.68J/10.652J Spring 2003 Lecture Ntes Tuesday April 15, 2003 Kinetic Mdel Cmpleteness We say a chemical kinetic mdel is cmplete fr a particular reactin cnditin when it cntains all the species and reactins
More informationDead-beat controller design
J. Hetthéssy, A. Barta, R. Bars: Dead beat cntrller design Nvember, 4 Dead-beat cntrller design In sampled data cntrl systems the cntrller is realised by an intelligent device, typically by a PLC (Prgrammable
More informationI.S. 239 Mark Twain. Grade 7 Mathematics Spring Performance Task: Proportional Relationships
I.S. 239 Mark Twain 7 ID Name: Date: Grade 7 Mathematics Spring Perfrmance Task: Prprtinal Relatinships Directins: Cmplete all parts f each sheet fr each given task. Be sure t read thrugh the rubrics s
More informationA New Evaluation Measure. J. Joiner and L. Werner. The problems of evaluation and the needed criteria of evaluation
III-l III. A New Evaluatin Measure J. Jiner and L. Werner Abstract The prblems f evaluatin and the needed criteria f evaluatin measures in the SMART system f infrmatin retrieval are reviewed and discussed.
More information