The Solution Path of the Slab Support Vector Machine


CCCG 2008 (20th Canadian Conference on Computational Geometry), Montréal, Québec, August 3–5, 2008

The Solution Path of the Slab Support Vector Machine

Michael Eigensatz   Joachim Giesen   Madhusudan Manjunath

Abstract

Given a set of points in a Hilbert space that can be separated from the origin, the slab support vector machine (slab SVM) is an optimization problem that aims at finding a slab (two parallel hyperplanes whose distance, the slab width, is essentially fixed) that encloses the points and is maximally separated from the origin. Extreme cases of the slab SVM include the smallest enclosing ball problem and an interpolation problem that was used (as the slab SVM itself) in surface reconstruction with radial basis functions. Here we show that the path of solutions of the slab SVM, i.e., the solution parametrized by the slab width, is piecewise linear.

1 Introduction

Data structures used in fields like graphics, visualization and learning often have many free parameters. In most cases a good choice of these parameters is not obvious. Computational geometry has faced similar problems: for example, when using alpha shapes [Ede95] for surface reconstruction, or in bio-geometric modeling, the question arises as to what value to choose for alpha. Computational geometry [Ede95, ELZ02, GCPZ06] gave an answer to this question that can be seminal also for the aforementioned areas of computer science, namely: do not compute the solution for a fixed, more or less well chosen value of the parameter, but compute the whole spectrum of structures and then look for good solutions in this spectrum. One method to determine a good structure is topological persistence, pioneered by Edelsbrunner, Letscher and Zomorodian [ELZ02]. Here we investigate an optimization problem that has its roots in machine learning and was also applied in various forms to the surface reconstruction problem.
The problem is called slab support vector machine (slab SVM) [SGS04]. It takes as input a set of data points in a Hilbert space that can be separated from the origin, and aims at finding a slab (two parallel hyperplanes whose width is essentially fixed as δ > 0) that encloses the points and is maximally separated from the origin.

(Affiliations: Applied Geometry Group, ETH Zürich, eigensatz@inf.ethz.ch; Institut für Informatik, Friedrich-Schiller-Universität Jena, giesen@minet.uni-jena.de; Max-Planck-Institut für Informatik, manjun@mpi-inf.mpg.de.)

The slab SVM has found applications in surface reconstruction [SGS04], and in quantile estimation and novelty detection [SS02]. In these applications the data points reside in d-dimensional Euclidean space but are mapped by a feature map into another (often infinite dimensional) Hilbert space. The structure of the slab SVM is such that the feature map does not have to be given explicitly, but only implicitly through a positive kernel: the dual optimization problem of the slab SVM depends only on the pairwise inner products of the data points. A positive kernel can be used to replace these inner products without changing the nature (convex quadratic program) of the optimization problem.

The parameter we are interested in is δ, which essentially fixes the width of the slab. In the applications it is difficult to tell beforehand what a good choice of δ is. Hence, in the spirit of the computational geometry approach, we want to compute the solution to the slab SVM for all values of δ. Once we have this spectrum of solutions, other methods can be employed to find good choices for δ. Here we do not want to discuss what such methods could look like, but focus on computing the solution spectrum. We show that the solution path of the slab SVM, i.e., the solution parametrized by δ, is piecewise linear. Our arguments provide a complete geometric characterization of the turning points (nodes) of the solution path. Our results are similar in spirit to results of Hastie et al. [HRTZ04], who obtained the piecewise linearity of the solution path of the classification support vector machine [SS02]. Though both results give piecewise linear solution paths, the parameters are different in nature and so are the means to establish the results. Our proof is of geometric nature, whereas Hastie et al. use algebraic arguments.

2 The slab SVM

Given are data points X = {x_1, ..., x_n} ⊂ H, where H is a Hilbert space with inner product ⟨·,·⟩, such that the data points can be separated from the origin by a hyperplane, i.e., there exist w ∈ H\{0} and ρ > 0 such that ⟨w, x_i⟩ ≥ ρ for all i = 1, ..., n. The distance of the hyperplane {x ∈ H : ⟨w, x⟩ = ρ} to the origin of H is given as ρ/‖w‖, where the norm of w in H is defined as usual by ‖w‖ = √⟨w, w⟩. The slab SVM is the following convex quadratic optimization problem that aims at finding the slab (the space between two parallel hyperplanes) with width δ/‖w‖ that contains all the data points and minimizes ½‖w‖² − ρ, i.e., essentially maximizes the distance of the slab to the origin (see also Figure 1):

  min_{w,ρ}  ½‖w‖² − ρ
  s.t.  ρ ≤ ⟨w, x_i⟩ ≤ ρ + δ  for all i = 1, ..., n.

Figure 1: The geometric set-up for the slab SVM (the two slab boundaries lie at distances ρ/‖w‖ and (ρ+δ)/‖w‖ from the origin).

Note that the slab SVM problem is always feasible since (w, ρ) = (0, 0) is always contained in the constraint polytope. The Lagrangian dual to this problem can be derived from the saddle point condition for the Lagrangian

  L(w, ρ, α, β) = ½‖w‖² − ρ − Σ_{i=1}^n α_i (⟨w, x_i⟩ − ρ) + Σ_{i=1}^n β_i (⟨w, x_i⟩ − ρ − δ),

where α_i, β_i ≥ 0. The saddle point condition gives ∂L/∂w = 0, which implies w = Σ_{i=1}^n (α_i − β_i) x_i, and ∂L/∂ρ = 0, which implies Σ_{i=1}^n (α_i − β_i) = 1, from which the dual follows:

  min_{α,β}  ½ Σ_{i,j=1}^n (α_i − β_i)(α_j − β_j) ⟨x_i, x_j⟩ + δ Σ_{i=1}^n β_i
  s.t.  Σ_{i=1}^n (α_i − β_i) = 1,  α_i, β_i ≥ 0  for all i = 1, ..., n.

In most applications [SS02] the data points are obtained from applying a feature map φ to input data points y_1, ..., y_n ∈ R^d, i.e., x_i = φ(y_i) ∈ H, where the feature map is not given explicitly, but implicitly in form of a positive kernel function k : R^d × R^d → R, i.e., ⟨x_i, x_j⟩ = ⟨φ(y_i), φ(y_j)⟩ = k(y_i, y_j), and H is the kernel reproducing Hilbert space. Since the dual of the slab SVM only depends on the inner products of the data points, we can replace ⟨x_i, x_j⟩ by k(y_i, y_j). A popular positive kernel is the Gaussian

  k(y_i, y_j) = exp( −‖y_i − y_j‖² / (2σ²) ),

which is an example of a so-called radial basis function kernel, i.e., a kernel that only depends on the distance ‖y_i − y_j‖. For the Gaussian kernel the data points x_i = φ(y_i) are linearly independent and the associated Gram matrix ( k(y_i, y_j) ) is positive, i.e., it has full rank and thus is invertible. In the following we always assume that the data points x_i are linearly independent.

3 Two extreme cases

The objective function of the slab SVM might look somewhat arbitrary at a first glance. Considering the extreme cases δ = ∞ and δ = 0 helps to get a better understanding of the geometry behind it.

3.1 The open slab SVM

We denote as open slab SVM the special case δ = ∞ of the slab SVM; see Figure 2 for the geometric set-up. The open slab SVM optimization problem reads as

  min_{w,ρ}  ½‖w‖² − ρ
  s.t.  ρ ≤ ⟨w, x_i⟩  for all i = 1, ..., n.

Figure 2: The geometric set-up for the open slab SVM (the single lower boundary lies at distance ρ/‖w‖ from the origin).

As in the general case, we can derive the Lagrangian dual of the open slab SVM and get the following optimization problem in the dual variables α_i, i = 1, ..., n:

  min_α  ½ Σ_{i,j=1}^n α_i α_j ⟨x_i, x_j⟩
  s.t.  Σ_{i=1}^n α_i = 1,  α_i ≥ 0  for all i = 1, ..., n.
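The duals above are small convex quadratic programs, so for any fixed δ the slab SVM can be solved with off-the-shelf software. A minimal numerical sketch, assuming a NumPy/SciPy environment; the helper name `slab_svm_dual` and the toy data are illustrative, and SciPy's generic SLSQP routine merely stands in for a dedicated QP solver:

```python
import numpy as np
from scipy.optimize import minimize

def slab_svm_dual(X, delta):
    """Solve the slab SVM dual for a fixed slab parameter delta (illustrative).

    Minimizes 1/2 sum_{i,j} (a_i - b_i)(a_j - b_j) <x_i, x_j> + delta * sum_i b_i
    subject to sum_i (a_i - b_i) = 1 and a_i, b_i >= 0.
    Returns (alpha, beta, w) with w = sum_i (a_i - b_i) x_i.
    """
    n = len(X)
    K = X @ X.T  # Gram matrix of pairwise inner products

    def objective(z):
        t = z[:n] - z[n:]  # t_i = alpha_i - beta_i
        return 0.5 * t @ K @ t + delta * z[n:].sum()

    cons = [{"type": "eq", "fun": lambda z: (z[:n] - z[n:]).sum() - 1.0}]
    z0 = np.concatenate([np.full(n, 1.0 / n), np.zeros(n)])
    res = minimize(objective, z0, method="SLSQP",
                   bounds=[(0.0, None)] * (2 * n), constraints=cons)
    alpha, beta = res.x[:n], res.x[n:]
    return alpha, beta, (alpha - beta) @ X

# Two toy points in the plane, separated from the origin.  For a very large
# delta the upper constraints stay inactive (open slab SVM) and w becomes the
# point of the convex hull of the data that is closest to the origin.
X = np.array([[1.0, 0.0], [0.0, 1.0]])
alpha, beta, w = slab_svm_dual(X, delta=10.0)
print(w)  # close to [0.5, 0.5]
```

Sweeping δ over a grid and recording w approximates the solution path studied in this paper; the sketch makes no attempt at an exact path-following computation.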

Observe that the optimal value of the dual of the open slab SVM is (half the square of) the distance of the origin to the convex hull of the data points x_1, ..., x_n. The saddle point condition ∂L/∂w = 0 implies

  w = Σ_{i=1}^n α_i x_i.

Hence the optimal vector w is the shortest vector from the origin to the convex hull of the data points, and ‖w‖ is the distance of the origin to this convex hull. If the data points x_i ∈ H are obtained from data points y_i ∈ R^d that were mapped to H by a feature map implicitly given by a radial basis function kernel r(·), i.e., by replacing inner products ⟨y_i, y_j⟩ by r(‖y_i − y_j‖), then the open slab SVM is equivalent to computing the smallest enclosing ball of the data points x_i, see [SS02].

3.2 The zero slab SVM

As zero slab SVM we denote the slab SVM for the case δ = 0; see Figure 3 for the geometric set-up. The zero slab SVM optimization problem reads as

  min_{w,ρ}  ½‖w‖² − ρ
  s.t.  ⟨w, x_i⟩ = ρ  for all i = 1, ..., n.

Figure 3: The geometric set-up for the zero slab SVM (all data points lie on a single hyperplane at distance ρ/‖w‖ from the origin).

The optimization problem reads in dual variables α_i as

  min_α  ½ Σ_{i,j=1}^n α_i α_j ⟨x_i, x_j⟩
  s.t.  Σ_{i=1}^n α_i = 1.

Note that the optimal value of this problem is (half the square of) the distance of the origin to the affine hull of the data points x_1, ..., x_n. Again we have w = Σ_{i=1}^n α_i x_i, and thus w is the shortest vector from the origin to the affine hull of the data points, and ‖w‖ is the distance of the origin to this affine hull.

It is worthwhile to note that the dual optimization problem boils down to solving a linear system. We derive from the saddle point condition

  0 = ∂L(w, ρ, α)/∂α_j = ⟨w, x_j⟩ − ρ  for all j = 1, ..., n,

which together with w = Σ_{i=1}^n α_i x_i implies

  Σ_{i=1}^n α_i ⟨x_i, x_j⟩ = ρ  for all j = 1, ..., n,

a linear system with unknown right-hand side ρ for the dual variables α_i. Substituting γ_i = α_i/ρ gives the linear system

  Σ_{i=1}^n γ_i ⟨x_i, x_j⟩ = 1  for all j = 1, ..., n,

which can be solved for the γ_i (assuming the Gram matrix (⟨x_i, x_j⟩) has full rank). From the γ_i we can compute ρ as ρ = (Σ_{i=1}^n γ_i)^{-1}, all the α_i as α_i = ργ_i, and finally w as

  w = Σ_{i=1}^n α_i x_i = ρ Σ_{i=1}^n γ_i x_i.

The linear system for the γ_i was studied extensively in computer graphics for implicit surface reconstruction [CBC+01].

4 Surface reconstruction

Figure 4: An example surface reconstruction (Max-Planck Head: 2022 points) using the slab SVM for a fixed (small) value of δ.

Let us briefly recapitulate how the slab SVM can be used directly for surface reconstruction [SGS04]. Given are sample points y_1, ..., y_n ∈ R³ from a smooth surface embedded into R³. These sample points are mapped into the feature space associated with the Gaussian kernel. The reconstruction is given implicitly as f⁻¹(0), where f : R³ → R is the kernel expansion

  f(x) = ⟨w, φ(x)⟩ − ρ = Σ_{i=1}^n (α_i − β_i) exp( −‖y_i − x‖² / (2σ²) ) − ρ,

where x ∈ R³, φ(·) is the feature map associated with the Gaussian kernel, and α and β are the solutions to the dual slab SVM. Note that ρ can also be computed from the solution to the slab SVM (or its dual). See Figure 4 for an example, and also note that, especially in the presence of noise, one probably does not want an interpolating solution (as one gets from the zero slab SVM and the related method proposed in [CBC+01]), but would like to allow small slack in terms of a small value of δ > 0. Note that the slab SVM works the same for surface reconstruction in dimensions beyond three.

5 States and events

For a given value of δ ∈ (0, ∞) let (w, ρ) be the optimal solution of the slab SVM. We associate states with the data points x_i, i = 1, ..., n:

(1) lower supporting, if ⟨w, x_i⟩ = ρ
(2) upper supporting, if ⟨w, x_i⟩ = ρ + δ
(3) non-supporting, if neither lower nor upper supporting

An event occurs when, while decreasing δ, the state of any data point changes. We distinguish two types of events: a supporting data point becomes non-supporting, or a non-supporting data point becomes supporting. We call the first type of event a loss event and the second type a gain event.

6 The Solution Path

From the constraints Σ_{i=1}^n (α_i − β_i) = 1 and α_i, β_i ≥ 0 of the dual of the slab SVM we can conclude that there exists an α_i > 0. This in turn allows us to conclude, using the Karush-Kuhn-Tucker condition α_i (⟨w, x_i⟩ − ρ) = 0, that for any δ there always exists a lower supporting data point.

For a given δ′, let x_i be a lower supporting data point. The continuous dependence of the coefficient α_i on the parameter δ implies that α_i > 0 for some neighborhood U(δ′) ⊂ (0, ∞). Hence x_i is a lower supporting data point for all δ ∈ U(δ′). We use this insight to locally, i.e., for δ ∈ U(δ′), transform the slab SVM into an equivalent distance problem. Note that we have ρ = ⟨w, x_i⟩. Thus we can write the objective function of the slab SVM as

  ½‖w‖² − ρ = ½‖w‖² − ⟨w, x_i⟩ = ½‖w − x_i‖² − ½‖x_i‖².

Since ½‖x_i‖² is constant, i.e., does not depend on w or ρ, we can drop it from the objective function. This gives, if we set w′ = w − x_i and reformulate the constraints in the new variable w′ accordingly, the following version of the slab SVM:

  min_{w′}  ½‖w′‖²
  s.t.  0 ≤ ⟨w′, x_j − x_i⟩ + ⟨x_i, x_j⟩ − ‖x_i‖² ≤ δ  for j ≠ i.

This problem asks for the shortest vector w′ in the constraint polytope, or equivalently for the distance of the constraint polytope to the origin. Note that this distance problem is also always feasible, i.e., the constraint polytope does not become empty: to see this, observe that w′ = −x_i is always contained in the polytope. The gain and loss events can be nicely illustrated for the distance problem, see Figure 5.

Figure 5: The lower (non-moving) constraints are shown by thick solid lines and the upper (moving) constraints by thin solid lines. On the left: when the moving constraint hits w′ this constraint becomes binding (gain event) and the solution is no longer stationary. On the right: once the moving constraint becomes orthogonal to w′ we lose the non-moving constraint (loss event).

The formulation of the slab SVM as a distance problem allows us to make some observations.

Lemma 1 The solution to the slab SVM is unique.

Proof. There is always a unique point in the convex constraint polytope of an equivalent distance problem that realizes the distance of the polytope to the origin.

Lemma 2 There exists a δ₀ such that for all δ > δ₀ the solution to the slab SVM is stationary, i.e., does not vary with δ.

Proof. The proof is via the distance problem. Let x_i be one of the (lower) supporting data points of the open slab SVM. We use this x_i to formulate the distance

problem. The solution of the distance problem at δ = ∞ is finite (we can conclude this from the properties of the open slab SVM). Coming from small values of δ, the constraint polytopes of the distance problem for these values of δ sweep the constraint polytope of the distance problem at δ = ∞. Since the solution to the latter is finite, the sweep needs to hit the point that realizes this finite distance at some finite value δ₀ of δ. That is, for all δ > δ₀ the point x_i is lower supporting for the slab SVM, and we can conclude that the solution of the slab SVM can be derived from this stationary solution of the distance problem as w = w′ + x_i and ρ = ⟨w, x_i⟩.

Lemma 3 For all 0 < δ < δ₀ the slab SVM has an upper supporting data point.

Proof. By the proof of Lemma 2, at δ₀ the slab SVM needs to have an upper supporting data point, because only the upper constraints sweep the constraint polytope of the distance problem at δ = ∞. Assume there exists 0 < δ < δ₀ such that at δ the slab SVM has no upper supporting data point. Let ∆ be the set of all δ with this property and let δ̄ = sup ∆. At δ̄ the slab SVM needs to have an upper supporting data point. To see this, note that there exists a data point x_j that is upper supporting at δ̄ + ε for all sufficiently small ε > 0. At δ̄ we can derive a distance problem that is equivalent to the slab SVM for some neighborhood of δ̄. The data point x_j needs to be upper supporting also for this distance problem at δ̄ + ε for all sufficiently small ε > 0. The constraint hyperplane given by

  ⟨w′, x_j − x_i⟩ + ⟨x_i, x_j⟩ − ‖x_i‖² = δ̄   (1)

for the data point x_j has all the constraint hyperplanes given by

  ⟨w′, x_j − x_i⟩ + ⟨x_i, x_j⟩ − ‖x_i‖² = δ̄ + ε   (2)

on one side. The latter hyperplanes all contain a point that realizes the solution of the corresponding distance problem. By the continuity of the distance problem in δ, any sequence in the latter point set converges to the solution of the distance problem at δ̄. Hence this solution needs to be contained in the constraint hyperplane given by Equation (1), and x_j is an upper supporting data point for both the distance and the slab SVM problem at δ̄. By our assumption there needs to exist some neighborhood U of δ̄ such that the distance problem does not have an upper supporting data point for all δ ∈ U ∩ (0, δ̄). This means that the family of hyperplanes given by Equation (2) sweeps, as ε → 0, i.e., at δ̄, out of the constraint polytope given by the constraints

  ⟨w′, x_j − x_i⟩ + ⟨x_i, x_j⟩ − ‖x_i‖² ≥ 0.

But this can only happen if the constraint polytope of the distance problem grows while sweeping the hyperplane given by Equation (2) from δ̄ + ε to δ̄ − ε, which is a contradiction.

Corollary 1 For all 0 < δ < δ₀ the solution to the slab SVM is non-stationary.

We can conclude that the solution path of the slab SVM is piecewise linear, since w′, the point that realizes the distance of the constraint polytope to the origin, traces a piecewise linear curve parametrized by δ.

Theorem 4 The solution path of the slab SVM, i.e., the optimal coefficients α_i and β_i (in the dual) and w and ρ (in the primal), are piecewise linear functions of δ.

Corollary 2 The optimal solution w to the slab SVM is a piecewise linear path that connects the point closest to the origin on the convex hull of the data points (solution at δ = ∞) with the point closest to the origin on the affine hull of the data points (solution at δ = 0).

7 Conclusions

Theorem 4 characterizes the solution path, but does not immediately suggest an algorithm to compute it. Algorithms for parametrized convex quadratic programs (such as the slab SVM) are known, however; see for example [Rit81]. Another interesting open question is the complexity of the solution path, i.e., the number of bends. We conjecture that this complexity can be exponential in the number of data points.

Acknowledgments. Joachim Giesen wants to thank Edgar Ramos and Bardia Sadri for valuable discussions on the slab SVM.

References

[CBC+01] Jonathan C. Carr, Richard K. Beatson, Jon B. Cherrie, Tim J. Mitchell, W. Richard Fright, Bruce C. McCallum, and Tim R. Evans. Reconstruction and representation of 3D objects with radial basis functions. In SIGGRAPH, pages 67–76, 2001.

[Ede95] Herbert Edelsbrunner. The union of balls and its dual shape. Discrete & Computational Geometry, 13:415–440, 1995.

[ELZ02] Herbert Edelsbrunner, David Letscher, and Afra Zomorodian. Topological persistence and simplification. Discrete & Computational Geometry, 28(4):511–533, 2002.

[GCPZ06] Joachim Giesen, Frédéric Cazals, Mark Pauly, and Afra Zomorodian. The conformal alpha shape filtration. The Visual Computer, 22(8):531–540, 2006.

[HRTZ04] Trevor Hastie, Saharon Rosset, Robert Tibshirani, and Ji Zhu. The entire regularization path for the support vector machine. Journal of Machine Learning Research, 5:1391–1415, 2004.

[Rit81] Klaus Ritter. On parametric linear and quadratic programming problems. In Mathematical Programming: Proceedings of the International Congress on Mathematical Programming, 1981.

[SGS04] Bernhard Schölkopf, Joachim Giesen, and Simon Spalinger. Kernel methods for implicit surface modeling. In NIPS, 2004.

[SS02] Bernhard Schölkopf and Alex Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization and Beyond. MIT Press, Cambridge, MA, 2002.
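As a small numerical companion to the zero slab SVM of Section 3.2: the dual reduces to the linear system Σ_i γ_i ⟨x_i, x_j⟩ = 1, from which ρ, the α_i and w are recovered in closed form. A minimal sketch, assuming NumPy; the function name and the two toy points are illustrative:

```python
import numpy as np

def zero_slab_svm(X):
    """Solve the zero slab SVM (delta = 0) via its linear system (illustrative).

    Solves K gamma = 1 with K the Gram matrix <x_i, x_j>, then recovers
    rho = 1 / sum(gamma), alpha = rho * gamma and w = sum_i alpha_i x_i.
    Assumes the x_i are linearly independent, so K has full rank.
    """
    K = X @ X.T
    gamma = np.linalg.solve(K, np.ones(len(X)))
    rho = 1.0 / gamma.sum()
    alpha = rho * gamma
    w = alpha @ X
    return w, rho, alpha

X = np.array([[1.0, 0.0], [0.0, 1.0]])
w, rho, alpha = zero_slab_svm(X)
# Every data point lies on the hyperplane <w, x> = rho, and w is the point of
# the affine hull of the data that is closest to the origin.
print(w, rho)
```

With a radial basis function kernel in place of the plain Gram matrix, the same few lines give the interpolation scheme of [CBC+01].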


More information

A Matrix Representation of Panel Data

A Matrix Representation of Panel Data web Extensin 6 Appendix 6.A A Matrix Representatin f Panel Data Panel data mdels cme in tw brad varieties, distinct intercept DGPs and errr cmpnent DGPs. his appendix presents matrix algebra representatins

More information

T Algorithmic methods for data mining. Slide set 6: dimensionality reduction

T Algorithmic methods for data mining. Slide set 6: dimensionality reduction T-61.5060 Algrithmic methds fr data mining Slide set 6: dimensinality reductin reading assignment LRU bk: 11.1 11.3 PCA tutrial in mycurses (ptinal) ptinal: An Elementary Prf f a Therem f Jhnsn and Lindenstrauss,

More information

Module 4: General Formulation of Electric Circuit Theory

Module 4: General Formulation of Electric Circuit Theory Mdule 4: General Frmulatin f Electric Circuit Thery 4. General Frmulatin f Electric Circuit Thery All electrmagnetic phenmena are described at a fundamental level by Maxwell's equatins and the assciated

More information

CONSTRUCTING STATECHART DIAGRAMS

CONSTRUCTING STATECHART DIAGRAMS CONSTRUCTING STATECHART DIAGRAMS The fllwing checklist shws the necessary steps fr cnstructing the statechart diagrams f a class. Subsequently, we will explain the individual steps further. Checklist 4.6

More information

Figure 1a. A planar mechanism.

Figure 1a. A planar mechanism. ME 5 - Machine Design I Fall Semester 0 Name f Student Lab Sectin Number EXAM. OPEN BOOK AND CLOSED NOTES. Mnday, September rd, 0 Write n ne side nly f the paper prvided fr yur slutins. Where necessary,

More information

initially lcated away frm the data set never win the cmpetitin, resulting in a nnptimal nal cdebk, [2] [3] [4] and [5]. Khnen's Self Organizing Featur

initially lcated away frm the data set never win the cmpetitin, resulting in a nnptimal nal cdebk, [2] [3] [4] and [5]. Khnen's Self Organizing Featur Cdewrd Distributin fr Frequency Sensitive Cmpetitive Learning with One Dimensinal Input Data Aristides S. Galanpuls and Stanley C. Ahalt Department f Electrical Engineering The Ohi State University Abstract

More information

The standards are taught in the following sequence.

The standards are taught in the following sequence. B L U E V A L L E Y D I S T R I C T C U R R I C U L U M MATHEMATICS Third Grade In grade 3, instructinal time shuld fcus n fur critical areas: (1) develping understanding f multiplicatin and divisin and

More information

Distributions, spatial statistics and a Bayesian perspective

Distributions, spatial statistics and a Bayesian perspective Distributins, spatial statistics and a Bayesian perspective Dug Nychka Natinal Center fr Atmspheric Research Distributins and densities Cnditinal distributins and Bayes Thm Bivariate nrmal Spatial statistics

More information

Chapter 9 Vector Differential Calculus, Grad, Div, Curl

Chapter 9 Vector Differential Calculus, Grad, Div, Curl Chapter 9 Vectr Differential Calculus, Grad, Div, Curl 9.1 Vectrs in 2-Space and 3-Space 9.2 Inner Prduct (Dt Prduct) 9.3 Vectr Prduct (Crss Prduct, Outer Prduct) 9.4 Vectr and Scalar Functins and Fields

More information

You need to be able to define the following terms and answer basic questions about them:

You need to be able to define the following terms and answer basic questions about them: CS440/ECE448 Sectin Q Fall 2017 Midterm Review Yu need t be able t define the fllwing terms and answer basic questins abut them: Intr t AI, agents and envirnments Pssible definitins f AI, prs and cns f

More information

Thermodynamics Partial Outline of Topics

Thermodynamics Partial Outline of Topics Thermdynamics Partial Outline f Tpics I. The secnd law f thermdynamics addresses the issue f spntaneity and invlves a functin called entrpy (S): If a prcess is spntaneus, then Suniverse > 0 (2 nd Law!)

More information

ENGI 4430 Parametric Vector Functions Page 2-01

ENGI 4430 Parametric Vector Functions Page 2-01 ENGI 4430 Parametric Vectr Functins Page -01. Parametric Vectr Functins (cntinued) Any nn-zer vectr r can be decmpsed int its magnitude r and its directin: r rrˆ, where r r 0 Tangent Vectr: dx dy dz dr

More information

A Few Basic Facts About Isothermal Mass Transfer in a Binary Mixture

A Few Basic Facts About Isothermal Mass Transfer in a Binary Mixture Few asic Facts but Isthermal Mass Transfer in a inary Miture David Keffer Department f Chemical Engineering University f Tennessee first begun: pril 22, 2004 last updated: January 13, 2006 dkeffer@utk.edu

More information

Building to Transformations on Coordinate Axis Grade 5: Geometry Graph points on the coordinate plane to solve real-world and mathematical problems.

Building to Transformations on Coordinate Axis Grade 5: Geometry Graph points on the coordinate plane to solve real-world and mathematical problems. Building t Transfrmatins n Crdinate Axis Grade 5: Gemetry Graph pints n the crdinate plane t slve real-wrld and mathematical prblems. 5.G.1. Use a pair f perpendicular number lines, called axes, t define

More information

Math Foundations 20 Work Plan

Math Foundations 20 Work Plan Math Fundatins 20 Wrk Plan Units / Tpics 20.8 Demnstrate understanding f systems f linear inequalities in tw variables. Time Frame December 1-3 weeks 6-10 Majr Learning Indicatrs Identify situatins relevant

More information

Optimization Programming Problems For Control And Management Of Bacterial Disease With Two Stage Growth/Spread Among Plants

Optimization Programming Problems For Control And Management Of Bacterial Disease With Two Stage Growth/Spread Among Plants Internatinal Jurnal f Engineering Science Inventin ISSN (Online): 9 67, ISSN (Print): 9 676 www.ijesi.rg Vlume 5 Issue 8 ugust 06 PP.0-07 Optimizatin Prgramming Prblems Fr Cntrl nd Management Of Bacterial

More information

Midwest Big Data Summer School: Machine Learning I: Introduction. Kris De Brabanter

Midwest Big Data Summer School: Machine Learning I: Introduction. Kris De Brabanter Midwest Big Data Summer Schl: Machine Learning I: Intrductin Kris De Brabanter kbrabant@iastate.edu Iwa State University Department f Statistics Department f Cmputer Science June 24, 2016 1/24 Outline

More information

This section is primarily focused on tools to aid us in finding roots/zeros/ -intercepts of polynomials. Essentially, our focus turns to solving.

This section is primarily focused on tools to aid us in finding roots/zeros/ -intercepts of polynomials. Essentially, our focus turns to solving. Sectin 3.2: Many f yu WILL need t watch the crrespnding vides fr this sectin n MyOpenMath! This sectin is primarily fcused n tls t aid us in finding rts/zers/ -intercepts f plynmials. Essentially, ur fcus

More information

Computational modeling techniques

Computational modeling techniques Cmputatinal mdeling techniques Lecture 11: Mdeling with systems f ODEs In Petre Department f IT, Ab Akademi http://www.users.ab.fi/ipetre/cmpmd/ Mdeling with differential equatins Mdeling strategy Fcus

More information

What is Statistical Learning?

What is Statistical Learning? What is Statistical Learning? Sales 5 10 15 20 25 Sales 5 10 15 20 25 Sales 5 10 15 20 25 0 50 100 200 300 TV 0 10 20 30 40 50 Radi 0 20 40 60 80 100 Newspaper Shwn are Sales vs TV, Radi and Newspaper,

More information

Lim f (x) e. Find the largest possible domain and its discontinuity points. Why is it discontinuous at those points (if any)?

Lim f (x) e. Find the largest possible domain and its discontinuity points. Why is it discontinuous at those points (if any)? THESE ARE SAMPLE QUESTIONS FOR EACH OF THE STUDENT LEARNING OUTCOMES (SLO) SET FOR THIS COURSE. SLO 1: Understand and use the cncept f the limit f a functin i. Use prperties f limits and ther techniques,

More information

CAUSAL INFERENCE. Technical Track Session I. Phillippe Leite. The World Bank

CAUSAL INFERENCE. Technical Track Session I. Phillippe Leite. The World Bank CAUSAL INFERENCE Technical Track Sessin I Phillippe Leite The Wrld Bank These slides were develped by Christel Vermeersch and mdified by Phillippe Leite fr the purpse f this wrkshp Plicy questins are causal

More information

Stats Classification Ji Zhu, Michigan Statistics 1. Classification. Ji Zhu 445C West Hall

Stats Classification Ji Zhu, Michigan Statistics 1. Classification. Ji Zhu 445C West Hall Stats 415 - Classificatin Ji Zhu, Michigan Statistics 1 Classificatin Ji Zhu 445C West Hall 734-936-2577 jizhu@umich.edu Stats 415 - Classificatin Ji Zhu, Michigan Statistics 2 Examples f Classificatin

More information

Hubble s Law PHYS 1301

Hubble s Law PHYS 1301 1 PHYS 1301 Hubble s Law Why: The lab will verify Hubble s law fr the expansin f the universe which is ne f the imprtant cnsequences f general relativity. What: Frm measurements f the angular size and

More information

IN a recent article, Geary [1972] discussed the merit of taking first differences

IN a recent article, Geary [1972] discussed the merit of taking first differences The Efficiency f Taking First Differences in Regressin Analysis: A Nte J. A. TILLMAN IN a recent article, Geary [1972] discussed the merit f taking first differences t deal with the prblems that trends

More information

CHAPTER 4 DIAGNOSTICS FOR INFLUENTIAL OBSERVATIONS

CHAPTER 4 DIAGNOSTICS FOR INFLUENTIAL OBSERVATIONS CHAPTER 4 DIAGNOSTICS FOR INFLUENTIAL OBSERVATIONS 1 Influential bservatins are bservatins whse presence in the data can have a distrting effect n the parameter estimates and pssibly the entire analysis,

More information

Kinematic transformation of mechanical behavior Neville Hogan

Kinematic transformation of mechanical behavior Neville Hogan inematic transfrmatin f mechanical behavir Neville Hgan Generalized crdinates are fundamental If we assume that a linkage may accurately be described as a cllectin f linked rigid bdies, their generalized

More information

Admin. MDP Search Trees. Optimal Quantities. Reinforcement Learning

Admin. MDP Search Trees. Optimal Quantities. Reinforcement Learning Admin Reinfrcement Learning Cntent adapted frm Berkeley CS188 MDP Search Trees Each MDP state prjects an expectimax-like search tree Optimal Quantities The value (utility) f a state s: V*(s) = expected

More information

CHAPTER 24: INFERENCE IN REGRESSION. Chapter 24: Make inferences about the population from which the sample data came.

CHAPTER 24: INFERENCE IN REGRESSION. Chapter 24: Make inferences about the population from which the sample data came. MATH 1342 Ch. 24 April 25 and 27, 2013 Page 1 f 5 CHAPTER 24: INFERENCE IN REGRESSION Chapters 4 and 5: Relatinships between tw quantitative variables. Be able t Make a graph (scatterplt) Summarize the

More information

Floating Point Method for Solving Transportation. Problems with Additional Constraints

Floating Point Method for Solving Transportation. Problems with Additional Constraints Internatinal Mathematical Frum, Vl. 6, 20, n. 40, 983-992 Flating Pint Methd fr Slving Transprtatin Prblems with Additinal Cnstraints P. Pandian and D. Anuradha Department f Mathematics, Schl f Advanced

More information

Revisiting the Socrates Example

Revisiting the Socrates Example Sectin 1.6 Sectin Summary Valid Arguments Inference Rules fr Prpsitinal Lgic Using Rules f Inference t Build Arguments Rules f Inference fr Quantified Statements Building Arguments fr Quantified Statements

More information

Revision: August 19, E Main Suite D Pullman, WA (509) Voice and Fax

Revision: August 19, E Main Suite D Pullman, WA (509) Voice and Fax .7.4: Direct frequency dmain circuit analysis Revisin: August 9, 00 5 E Main Suite D Pullman, WA 9963 (509) 334 6306 ice and Fax Overview n chapter.7., we determined the steadystate respnse f electrical

More information

the results to larger systems due to prop'erties of the projection algorithm. First, the number of hidden nodes must

the results to larger systems due to prop'erties of the projection algorithm. First, the number of hidden nodes must M.E. Aggune, M.J. Dambrg, M.A. El-Sharkawi, R.J. Marks II and L.E. Atlas, "Dynamic and static security assessment f pwer systems using artificial neural netwrks", Prceedings f the NSF Wrkshp n Applicatins

More information

Particle Size Distributions from SANS Data Using the Maximum Entropy Method. By J. A. POTTON, G. J. DANIELL AND B. D. RAINFORD

Particle Size Distributions from SANS Data Using the Maximum Entropy Method. By J. A. POTTON, G. J. DANIELL AND B. D. RAINFORD 3 J. Appl. Cryst. (1988). 21,3-8 Particle Size Distributins frm SANS Data Using the Maximum Entrpy Methd By J. A. PTTN, G. J. DANIELL AND B. D. RAINFRD Physics Department, The University, Suthamptn S9

More information

Slide04 (supplemental) Haykin Chapter 4 (both 2nd and 3rd ed): Multi-Layer Perceptrons

Slide04 (supplemental) Haykin Chapter 4 (both 2nd and 3rd ed): Multi-Layer Perceptrons Slide04 supplemental) Haykin Chapter 4 bth 2nd and 3rd ed): Multi-Layer Perceptrns CPSC 636-600 Instructr: Ynsuck Che Heuristic fr Making Backprp Perfrm Better 1. Sequential vs. batch update: fr large

More information

SPH3U1 Lesson 06 Kinematics

SPH3U1 Lesson 06 Kinematics PROJECTILE MOTION LEARNING GOALS Students will: Describe the mtin f an bject thrwn at arbitrary angles thrugh the air. Describe the hrizntal and vertical mtins f a prjectile. Slve prjectile mtin prblems.

More information

Contents. This is page i Printer: Opaque this

Contents. This is page i Printer: Opaque this Cntents This is page i Printer: Opaque this Supprt Vectr Machines and Flexible Discriminants. Intrductin............. The Supprt Vectr Classifier.... Cmputing the Supprt Vectr Classifier........ Mixture

More information

Introduction to Smith Charts

Introduction to Smith Charts Intrductin t Smith Charts Dr. Russell P. Jedlicka Klipsch Schl f Electrical and Cmputer Engineering New Mexic State University as Cruces, NM 88003 September 2002 EE521 ecture 3 08/22/02 Smith Chart Summary

More information

ENSC Discrete Time Systems. Project Outline. Semester

ENSC Discrete Time Systems. Project Outline. Semester ENSC 49 - iscrete Time Systems Prject Outline Semester 006-1. Objectives The gal f the prject is t design a channel fading simulatr. Upn successful cmpletin f the prject, yu will reinfrce yur understanding

More information

3.4 Shrinkage Methods Prostate Cancer Data Example (Continued) Ridge Regression

3.4 Shrinkage Methods Prostate Cancer Data Example (Continued) Ridge Regression 3.3.4 Prstate Cancer Data Example (Cntinued) 3.4 Shrinkage Methds 61 Table 3.3 shws the cefficients frm a number f different selectin and shrinkage methds. They are best-subset selectin using an all-subsets

More information

Medium Scale Integrated (MSI) devices [Sections 2.9 and 2.10]

Medium Scale Integrated (MSI) devices [Sections 2.9 and 2.10] EECS 270, Winter 2017, Lecture 3 Page 1 f 6 Medium Scale Integrated (MSI) devices [Sectins 2.9 and 2.10] As we ve seen, it s smetimes nt reasnable t d all the design wrk at the gate-level smetimes we just

More information

A solution of certain Diophantine problems

A solution of certain Diophantine problems A slutin f certain Diphantine prblems Authr L. Euler* E7 Nvi Cmmentarii academiae scientiarum Petrplitanae 0, 1776, pp. 8-58 Opera Omnia: Series 1, Vlume 3, pp. 05-17 Reprinted in Cmmentat. arithm. 1,

More information

On Boussinesq's problem

On Boussinesq's problem Internatinal Jurnal f Engineering Science 39 (2001) 317±322 www.elsevier.cm/lcate/ijengsci On Bussinesq's prblem A.P.S. Selvadurai * Department f Civil Engineering and Applied Mechanics, McGill University,

More information

Aerodynamic Separability in Tip Speed Ratio and Separability in Wind Speed- a Comparison

Aerodynamic Separability in Tip Speed Ratio and Separability in Wind Speed- a Comparison Jurnal f Physics: Cnference Series OPEN ACCESS Aerdynamic Separability in Tip Speed Rati and Separability in Wind Speed- a Cmparisn T cite this article: M L Gala Sants et al 14 J. Phys.: Cnf. Ser. 555

More information

Internal vs. external validity. External validity. This section is based on Stock and Watson s Chapter 9.

Internal vs. external validity. External validity. This section is based on Stock and Watson s Chapter 9. Sectin 7 Mdel Assessment This sectin is based n Stck and Watsn s Chapter 9. Internal vs. external validity Internal validity refers t whether the analysis is valid fr the ppulatin and sample being studied.

More information

making triangle (ie same reference angle) ). This is a standard form that will allow us all to have the X= y=

making triangle (ie same reference angle) ). This is a standard form that will allow us all to have the X= y= Intrductin t Vectrs I 21 Intrductin t Vectrs I 22 I. Determine the hrizntal and vertical cmpnents f the resultant vectr by cunting n the grid. X= y= J. Draw a mangle with hrizntal and vertical cmpnents

More information

A Simple Set of Test Matrices for Eigenvalue Programs*

A Simple Set of Test Matrices for Eigenvalue Programs* Simple Set f Test Matrices fr Eigenvalue Prgrams* By C. W. Gear** bstract. Sets f simple matrices f rder N are given, tgether with all f their eigenvalues and right eigenvectrs, and simple rules fr generating

More information

We can see from the graph above that the intersection is, i.e., [ ).

We can see from the graph above that the intersection is, i.e., [ ). MTH 111 Cllege Algebra Lecture Ntes July 2, 2014 Functin Arithmetic: With nt t much difficulty, we ntice that inputs f functins are numbers, and utputs f functins are numbers. S whatever we can d with

More information

A finite steps algorithm for solving convex feasibility problems

A finite steps algorithm for solving convex feasibility problems J Glb Optim (2007) 38:143 160 DOI 10.1007/s10898-006-9088-y ORIGINAL ARTICLE A finite steps algrithm fr slving cnvex feasibility prblems M. Ait Rami U. Helmke J. B. Mre Received: 18 February 2006 / Accepted:

More information

SOLUTIONS TO EXERCISES FOR. MATHEMATICS 205A Part 4. Function spaces

SOLUTIONS TO EXERCISES FOR. MATHEMATICS 205A Part 4. Function spaces SOLUTIONS TO EXERCISES FOR MATHEMATICS 205A Part 4 Fall 2008 IV. Functin spaces IV.1 : General prperties (Munkres, 45 47) Additinal exercises 1. Suppse that X and Y are metric spaces such that X is cmpact.

More information

On small defining sets for some SBIBD(4t - 1, 2t - 1, t - 1)

On small defining sets for some SBIBD(4t - 1, 2t - 1, t - 1) University f Wllngng Research Online Faculty f Infrmatics - Papers (Archive) Faculty f Engineering and Infrmatin Sciences 992 On small defining sets fr sme SBIBD(4t -, 2t -, t - ) Jennifer Seberry University

More information

Kinetic Model Completeness

Kinetic Model Completeness 5.68J/10.652J Spring 2003 Lecture Ntes Tuesday April 15, 2003 Kinetic Mdel Cmpleteness We say a chemical kinetic mdel is cmplete fr a particular reactin cnditin when it cntains all the species and reactins

More information

Elements of Machine Intelligence - I

Elements of Machine Intelligence - I ECE-175A Elements f Machine Intelligence - I Ken Kreutz-Delgad Nun Vascncels ECE Department, UCSD Winter 2011 The curse The curse will cver basic, but imprtant, aspects f machine learning and pattern recgnitin

More information

k-nearest Neighbor How to choose k Average of k points more reliable when: Large k: noise in attributes +o o noise in class labels

k-nearest Neighbor How to choose k Average of k points more reliable when: Large k: noise in attributes +o o noise in class labels Mtivating Example Memry-Based Learning Instance-Based Learning K-earest eighbr Inductive Assumptin Similar inputs map t similar utputs If nt true => learning is impssible If true => learning reduces t

More information