Support Vector Machines. More Generally Kernel Methods


1 Support Vector Machines. More Generally, Kernel Methods

2 Important Concepts. Seek a linear decision boundary, with two new concepts: optimization based on maximizing margins (not gradient descent), and kernel mapping for better linear separation ("massaging" the data).

3 Two-Class Problem: Linearly Separable Case. [Figure: points of Class 1 and Class 2.] Many decision boundaries can separate these two classes. Which one should we choose?

4 Examples of Bad Decision Boundaries. [Figure: separating lines that pass too close to the Class 1 or Class 2 points.]

5 Linear classifiers: Which Hyperplane? Lots of possible solutions for a, b, c. Some methods find a separating hyperplane, but not the optimal one [according to some criterion of expected goodness], e.g. perceptron, gradient descent. The Support Vector Machine (SVM) finds an optimal solution: it maximizes the distance between the hyperplane and the difficult points close to the decision boundary. One intuition: if there are no points near the decision surface, then there are no very uncertain classification decisions. [Figure: the line ax + by - c = 0 represents the decision boundary.]

6 Another intuition. If you have to place a fat separator between the classes, you have fewer choices, and so the capacity of the model has been decreased.

7 Support Vector Machine (SVM). SVMs maximize the margin around the separating hyperplane (a.k.a. large margin classifiers). The decision function is fully specified by a subset of the training samples, the support vectors. Finding this hyperplane is a quadratic programming problem. Seen by many as the most successful current text classification method. [Figure: support vectors on the margin; maximize the margin.]

8 SVM with Kernel Mapping. The original feature space might not be well conditioned; map x -> φ(x) in a higher-dimensional space.

9 Non-linear SVMs. Datasets that are linearly separable with some noise work out great. But what are we going to do if the dataset is just too hard? How about mapping the data to a higher-dimensional space? [Figure: 1D data on a line, then lifted to 2D as (x, x²).]

10 Non-linear SVMs: Feature spaces. General idea: the original feature space can always be mapped to some higher-dimensional feature space where the training set is likely separable: Φ: x -> φ(x). [Figure: a non-separable input space mapped to a separable feature space.]
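
To make the idea concrete, a minimal sketch (not from the slides; the data and the mapping x -> (x, x²) are invented for illustration) in which 1D points that no single threshold can separate become linearly separable after the mapping:

```python
import numpy as np

# Toy 1D data: class +1 at the extremes, class -1 in the middle.
# No threshold on x alone separates them.
x = np.array([-3.0, -2.0, -0.5, 0.0, 0.5, 2.0, 3.0])
y = np.array([1, 1, -1, -1, -1, 1, 1])

# Map each point x -> phi(x) = (x, x^2). In this 2D feature space
# the horizontal line x^2 = 2 separates the classes.
phi = np.column_stack([x, x ** 2])

w, b = np.array([0.0, 1.0]), -2.0   # hyperplane: x2 - 2 = 0
pred = np.sign(phi @ w + b)
print(pred == y)                    # all True: separable in feature space
```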

11 Transformation to Feature Space. Problem: a high computational burden due to the high dimensionality, and it is hard to get a good estimate. Kernel trick for efficient computation: only the inner products of feature vectors are used, and the mapping is never explicitly computed, giving efficiency in time and space. [Figure: input space vs. feature space.]

12 Kernel-Induced Feature Spaces. Recall the Perceptron learning rule: the weight is a combination of the feature vectors, namely those that are difficult to handle or classified wrongly: w_{k+1} = w_k + c·y_i·x_i if x_i is misclassified, and w_{k+1} = w_k otherwise.

13 Kernel-Induced Feature Spaces. So the decision rule involves only inner products of features, not the features themselves: h(x) = sign(w·x + b) = sign(Σ_i α_i y_i (x_i·x) + b). The same holds true for the mapped space; the dimension is not important (we may not even know the mapping), since the inner products can be evaluated in the low-dimensional space: h(x) = sign(Σ_i α_i y_i (φ(x_i)·φ(x)) + b).

14 Kernels. A kernel is a function that returns the value of the dot product of the mappings of its two arguments: k(x, z) = φ(x)·φ(z). The same perceptron-style learning can be applied, just replacing all dot products with kernels; however, SVM has more sophisticated learning rules.
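
As an illustration of that remark, a minimal kernelized perceptron sketch (my own illustration, not the lecture's code; the data and kernel choice are assumptions): the weight vector is kept in dual form, so only kernel evaluations are needed.

```python
import numpy as np

def kernel_perceptron(X, y, k, epochs=100):
    """Perceptron in dual form: w = sum_i alpha_i * y_i * phi(x_i) is
    implicit, so predictions need only kernel values k(x_i, x)."""
    n = len(X)
    alpha = np.zeros(n)
    K = np.array([[k(X[i], X[j]) for j in range(n)] for i in range(n)])
    for _ in range(epochs):
        errors = 0
        for i in range(n):
            if y[i] * np.sum(alpha * y * K[:, i]) <= 0:  # misclassified
                alpha[i] += 1.0
                errors += 1
        if errors == 0:
            break
    return alpha

# XOR-like data with a quadratic kernel k(x, z) = (x.z + 1)^2
X = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], float)
y = np.array([-1, 1, 1, -1])
k = lambda a, b: (a @ b + 1) ** 2
alpha = kernel_perceptron(X, y, k)
preds = [np.sign(np.sum(alpha * y * np.array([k(xi, x) for xi in X]))) for x in X]
print(preds)   # matches y
```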

15 Example 1. For x ∈ R², the mapping φ(x) = (x₁², √2·x₁x₂, x₂²) into F = R³ gives the kernel k(x, z) = φ(x)·φ(z) = (x·z)².

16 More Examples. The mapping φ(x) = (x₁², x₁x₂, x₂x₁, x₂²) into F = R⁴ gives the same kernel, k(x, z) = (x·z)².

17 Even More Examples. The interpretation of the mapping is not unique, even with a single kernel function k: different mappings φ (e.g. another one into F = R³) can induce the same k.
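
A quick numerical check of slides 15-17 (my sketch): computing (x·z)² directly agrees with the explicit inner products under both mappings, so either mapping is a valid interpretation of the same kernel.

```python
import numpy as np

def phi3(x):  # R^2 -> R^3, as in Example 1
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

def phi4(x):  # R^2 -> R^4: a different mapping with the same kernel
    return np.array([x[0] ** 2, x[0] * x[1], x[1] * x[0], x[1] ** 2])

x, z = np.array([1.0, 2.0]), np.array([3.0, -1.0])
k = (x @ z) ** 2                                 # kernel computed directly
print(k, phi3(x) @ phi3(z), phi4(x) @ phi4(z))   # all three agree (= 1.0)
```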

18 Gaussian Kernel. Identical to the Radial Basis Function: k(x, z) = exp(−||x − z||²/(2σ²)). Recall that e^x = Σ_{n≥0} x^n/n!; the feature dimension is infinitely high in this case. The embedding is never explicitly computed (that would be impossible); only the inner product is needed in the SVM.
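
This can be sanity-checked in 1D (a sketch under my own choice σ = 1, not from the slides): writing k(x, z) = e^(−x²/2)·e^(−z²/2)·e^(xz) and expanding e^(xz) with the series above gives explicit features φ_n(x) = e^(−x²/2)·xⁿ/√(n!), and a truncated inner product already matches the kernel closely.

```python
import math

def rbf(x, z):                  # k(x, z) = exp(-(x - z)^2 / 2), sigma = 1
    return math.exp(-(x - z) ** 2 / 2)

def phi_n(x, n):                # n-th explicit feature from the Taylor series
    return math.exp(-x ** 2 / 2) * x ** n / math.sqrt(math.factorial(n))

x, z = 0.7, -0.3
approx = sum(phi_n(x, n) * phi_n(z, n) for n in range(30))
print(rbf(x, z), approx)        # the truncated inner product matches the kernel
```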

19 SVM Cost Function. Similar to the logistic loss, but piecewise linear: no penalty for z ≥ 1 when y = 1, and for z ≤ −1 when y = 0 (logistic-style labels).

20 Margin. For w^T x to be > 1 for y = 1 and < −1 for y = −1, ||w|| must be larger in the left figure than in the right. So a large margin implies a small ||w||. [Figure: two hyperplanes with different margin widths.]

21 Numerical Methods. Maximizing the margin (or minimizing ||w||) cannot be done by gradient descent, but by quadratic programming. DON'T try it yourself; find a readily available implementation (SVM-light, LIBSVM). The resulting classifier is h(x) = sign(Σ_{i=1..s} α_i y_i K(x_i, x) + b), where s is the number of support vectors, x_i is the i-th support vector, α_i is its weight, and b is a bias.
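
Following the slide's advice, a minimal usage sketch with scikit-learn's SVC (which wraps LIBSVM; the toy data is invented). The fitted object exposes exactly the pieces named above: the support vectors x_i, the products α_i·y_i, and the bias b.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

clf = SVC(kernel="rbf", C=1.0).fit(X, y)
print(clf.support_vectors_.shape)    # the support vectors x_i
print(clf.dual_coef_)                # the products alpha_i * y_i
print(clf.intercept_)                # the bias b
print(clf.decision_function(X[:3]))  # h(x) before taking the sign
```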

22 The Optimization Problem. Let {x₁, ..., x_n} be our data set, and let y_i ∈ {1, −1} be the class label of x_i. The decision boundary should classify all points correctly (hard margin): y_i(w^T x_i + b) ≥ 1 for all i. A constrained optimization problem: minimize (1/2)||w||² = (1/2)w^T w subject to these constraints.

23 Optimization of f(x), no constraints: at an optimum, ∇f = 0. [Figure: contours of f = x² + y² with the minimum at the center.]

24 Optimization of f(x) with an equality constraint g(x) = 0: minimize f + λg, i.e. solve ∇f + λ∇g = 0 (λ is a Lagrange multiplier). [Figure: f = x² + y² with constraint g: x + y = 1; at the solution, ∇f and ∇g are parallel.]
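
A worked instance of this condition, using the f and g from the figure (a sketch using sympy; not part of the original slides):

```python
import sympy as sp

x, y, lam = sp.symbols("x y lambda")
f = x ** 2 + y ** 2          # objective from the slide's figure
g = x + y - 1                # constraint g(x, y) = 0

L = f + lam * g              # the Lagrangian
sol = sp.solve([sp.diff(L, x), sp.diff(L, y), g], [x, y, lam])
print(sol)                   # {x: 1/2, y: 1/2, lambda: -1}
```

Solving the three equations gives x = y = 1/2 with λ = −1: the point of the constraint line closest to the origin, as the figure suggests.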

25 Optimization of f(x) with an inequality constraint h(x) ≤ 0: minimize f + λh, i.e. ∇f + λ∇h = 0. Necessary condition for optimality: ∇f = 0, or else the constraint is active. Outside the feasible region (h(x) > 0): not a solution. Inside the feasible region (h(x) < 0): a solution where h does not matter, so λ = 0. On the boundary of the feasible region (h(x) = 0): h behaves just like g, an equality constraint. For a minimization problem, ∇f must point inside and ∇h must point outside, so ∇f = −λ∇h (equivalently ∇f + λ∇h = 0) with λ > 0.

26 KKT conditions: h(x) ≤ 0, λ ≥ 0, λ·h(x) = 0, and ∇f + λ∇h = 0.

27 Lagrangian of the Original Problem. The Lagrangian is L = (1/2)w^T w − Σ_i α_i [y_i(w^T x_i + b) − 1], where the α_i ≥ 0 are Lagrange multipliers. Note that ||w||² = w^T w. Setting the gradient of L w.r.t. w and b to zero: ∂L/∂w = 0 gives w = Σ_i α_i y_i x_i, and ∂L/∂b = 0 gives Σ_i α_i y_i = 0.

28 Substituting w = Σ_i α_i y_i x_i back into the Lagrangian: L = (1/2)w^T w − Σ_i α_i [y_i(w^T x_i + b) − 1] = (1/2)w^T w − Σ_i α_i y_i w^T x_i − b·Σ_i α_i y_i + Σ_i α_i = (1/2)w^T w − w^T w + Σ_i α_i (the b term vanishes since Σ_i α_i y_i = 0) = Σ_i α_i − (1/2)Σ_i Σ_j α_i α_j y_i y_j (x_i^T x_j).

29 The Dual Optimization Problem. We can transform the problem to its dual: maximize W(α) = Σ_i α_i − (1/2)Σ_{i,j} α_i α_j y_i y_j (x_i^T x_j) subject to α_i ≥ 0 and Σ_i α_i y_i = 0. The data enter only through the dot products x_i^T x_j. This is a convex quadratic programming (QP) problem, so the global maximum of W(α) can always be found; well-established tools exist for solving this optimization problem (e.g. CPLEX). The new variables α_i are the Lagrange multipliers.
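
To make the dual concrete, a sketch that solves it with a general-purpose solver (scipy's SLSQP on invented toy data, rather than a dedicated QP package like CPLEX):

```python
import numpy as np
from scipy.optimize import minimize

# Toy linearly separable data
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
n = len(X)
Q = (y[:, None] * y[None, :]) * (X @ X.T)   # Q_ij = y_i y_j (x_i . x_j)

def neg_dual(a):                            # maximize W(a) == minimize -W(a)
    return 0.5 * a @ Q @ a - a.sum()

cons = {"type": "eq", "fun": lambda a: a @ y}   # sum_i alpha_i y_i = 0
res = minimize(neg_dual, np.zeros(n), bounds=[(0, None)] * n, constraints=cons)
alpha = res.x
w = (alpha * y) @ X                         # recover w = sum_i alpha_i y_i x_i
sv = int(np.argmax(alpha))                  # a support vector has alpha_i > 0
b = y[sv] - w @ X[sv]                       # from y_sv (w . x_sv + b) = 1
print(np.round(alpha, 3), w, b)
print(np.sign(X @ w + b))                   # recovers the labels y
```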

30 Quadratic Programming. The cost is quadratic; without constraints there is a unique minimum, and any descent search technique (e.g. gradient descent) should eventually find it. The added twist: the search is subject to constraints.

31 A Geometrical Interpretation. [Figure: α₈ = 0.6, α₆ = 1.4, α₁ = 0.8 lie on the margin; all other α_i = 0.] The support vectors are the x_i whose α values are different from zero; they "hold up" the separating plane!

32 Classification with SVMs. Given a new point x, we can score its projection onto the hyperplane normal. In 2 dims: score = w₁x₁ + w₂x₂ + b. I.e. compute score: w·x + b = Σ_i α_i y_i (x_i·x) + b. Set a confidence threshold t: score > t: yes; score < −t: no; else: don't know.
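
This scoring rule is a few lines of code (a sketch; the kernel argument defaults to a plain dot product, and the threshold logic mirrors the slide):

```python
import numpy as np

def svm_score(x, sv_x, sv_y, alpha, b, kernel=np.dot):
    # score(x) = sum_i alpha_i y_i K(x_i, x) + b
    return sum(a * yi * kernel(xi, x)
               for a, yi, xi in zip(alpha, sv_y, sv_x)) + b

def decide(score, t):
    # confidence-thresholded decision from the slide
    if score > t:
        return "yes"
    if score < -t:
        return "no"
    return "don't know"
```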

33 Soft Margin Classification. If the training set is not linearly separable, slack variables ξ_i can be added to allow misclassification of difficult or noisy examples. Allow some errors: let some points be moved to where they belong, at a cost. Still try to minimize the training-set errors, and to place the hyperplane "far" from each class (large margin). [Figure: points with slacks ξ_i, ξ_j inside the margin.]

34 Soft margin. We allow error ξ_i in classification; it is based on the output of the discriminant function w^T x + b, and Σ_i ξ_i approximates the number of misclassified samples. New objective function: minimize (1/2)||w||² + C·Σ_i ξ_i. C is a tradeoff parameter between error and margin, chosen by the user; a large C means a higher penalty on errors.

35 Linear Non-Separable Case. When the classes are not separable, introduce slack variables ξ_i ≥ 0 and relax the constraints: y_i(w^T x_i + b) ≥ 1 − ξ_i. Minimize the objective function L = (1/2)||w||² + C·Σ_i ξ_i, allowing some samples into the buffer zone but minimizing their number and the amount of protrusion (the weight C ≥ 0 is chosen by the user).

36 Wolfe Dual. Maximize L = Σ_i α_i − (1/2)Σ_{i,j} α_i α_j y_i y_j (x_i^T x_j) subject to 0 ≤ α_i ≤ C and Σ_i α_i y_i = 0; the only change from the separable case is the upper bound C on the α_i, with w = Σ_i α_i y_i x_i as before.

37 General Case. With a kernel k, maximize L = Σ_i α_i − (1/2)Σ_{i,j} α_i α_j y_i y_j k(x_i, x_j) subject to 0 ≤ α_i ≤ C and Σ_i α_i y_i = 0.

38 Example: XOR. X = {(1, 1), (1, −1), (−1, 1), (−1, −1)} with XOR labels, using the polynomial kernel k(x_i, x_j) = (x_i^T x_j + 1)². Maximize L = Σ_i α_i − (1/2)Σ_{i,j} α_i α_j y_i y_j k(x_i, x_j) subject to the usual constraints.

39 Example (cont.). Solving for the α_i yields a weight vector w in the induced feature space whose decision function correctly separates all four points: SVM can solve XOR!
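
The claim is easy to reproduce (a sketch using scikit-learn rather than the slide's hand derivation; setting gamma=1 and coef0=1 makes the library's polynomial kernel equal to (x·z + 1)²):

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])
y = np.array([-1, 1, 1, -1])                  # XOR labels

# With gamma=1, coef0=1, degree=2 the kernel is (x.z + 1)^2 as on the slide
clf = SVC(kernel="poly", degree=2, gamma=1, coef0=1, C=1e6).fit(X, y)
print(clf.predict(X))                         # [-1  1  1 -1]: XOR is solved
```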

40 Example: 5 1D data points. [Figure: value of the discriminant function along the line, with regions labeled class 1, class 2, class 1.]

41 Example: 5 1D data points. x₁ = 1, x₂ = 2, x₃ = 4, x₄ = 5, x₅ = 6, with 1, 2, 6 as class 1 and 4, 5 as class 2, i.e. y₁ = 1, y₂ = 1, y₃ = −1, y₄ = −1, y₅ = 1. We use the polynomial kernel of degree 2, K(x, z) = (xz + 1)², and C is set to 100. We first find the α_i (i = 1, ..., 5) by maximizing the dual.

42 Example (cont.). By using a QP solver we get α₁ = 0, α₂ = 2.5, α₃ = 0, α₄ = 7.333, α₅ = 4.833. Verify at home that the constraints are indeed satisfied. The support vectors are {x₂ = 2, x₄ = 5, x₅ = 6}. The discriminant function is f(z) = 0.6667z² − 5.333z + 9; b is recovered by solving f(2) = 1, or by f(5) = −1, or by f(6) = 1 (as x₂, x₄, x₅ lie on the margin, all give b = 9).
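
These numbers can be checked mechanically (a sketch; it just re-evaluates the discriminant from the stated α's and b, so small deviations from ±1 reflect the rounding of the α's):

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 5.0, 6.0])
y = np.array([1, 1, -1, -1, 1])
alpha = np.array([0, 2.5, 0, 7.333, 4.833])
b = 9.0

K = lambda u, v: (u * v + 1) ** 2            # polynomial kernel of degree 2
f = lambda z: sum(alpha * y * K(x, z)) + b   # discriminant function

print([round(f(z), 2) for z in x])   # f(2), f(5), f(6) come out near +1, -1, +1
print(sum(alpha * y))                # ~0: the equality constraint holds
```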

43 Software. A list of SVM implementations can be found online. Some implementations, such as LIBSVM, can handle multi-class classification. SVMlight is among the earliest implementations of SVM. Several Matlab toolboxes for SVM are also available.

44 Evaluation: Classic Reuters Data Set. The most (over)used data set: 21578 documents; 9603 training and 3299 test articles (ModApte split); 118 categories, and an article can be in more than one category; learn 118 binary category distinctions. Average document: about 90 types, 200 tokens. Average number of classes assigned: 1.24 for docs with at least one category. Only about 10 out of the 118 categories are large. Common categories (#train/#test): Earn (2877/1087), Acquisitions (1650/179), Money-fx (538/179), Grain (433/149), Crude (389/189), Trade (369/119), Interest (347/131), Ship (197/89), Wheat (212/71), Corn (182/56).

45 Reuters Text Categorization data set (Reuters-21578) document:
<REUTERS TOPICS="YES" LEWISSPLIT="TRAIN" CGISPLIT="TRAINING-SET" OLDID="12981" NEWID="798">
<DATE> 2-MAR-1987 16:51:43.42</DATE>
<TOPICS><D>livestock</D><D>hog</D></TOPICS>
<TITLE>AMERICAN PORK CONGRESS KICKS OFF TOMORROW</TITLE>
<DATELINE> CHICAGO, March 2 - </DATELINE><BODY>The American Pork Congress kicks off tomorrow, March 3, in Indianapolis with 160 of the nation's pork producers from 44 member states determining industry positions on a number of issues, according to the National Pork Producers Council (NPPC). Delegates to the three day Congress will be considering 26 resolutions concerning various issues, including the future direction of farm policy and the tax law as it applies to the agriculture sector. The delegates will also debate whether to endorse concepts of a national PRV (pseudorabies virus) control and eradication program, the NPPC said. A large trade show, in conjunction with the congress, will feature the latest in technology in all areas of the industry, the NPPC added. Reuter &#3;</BODY></TEXT></REUTERS>

46 New Reuters: RCV1: 810,000 docs. Top topics in Reuters RCV1. [Figure: distribution of the top topics.]

47 Good practice department: the confusion matrix. The (i, j) entry means that 53 of the docs actually in class i were put in class j by the classifier. [Table: rows are the actual class, columns the class assigned by the classifier.] In a perfect classification, only the diagonal has non-zero entries.

48 Per-class evaluation measures. Recall: fraction of docs in class i classified correctly, c_ii / Σ_j c_ij. Precision: fraction of docs assigned class i that are actually about class i, c_ii / Σ_j c_ji. Correct rate (1 − error rate): fraction of docs classified correctly, Σ_i c_ii / Σ_i Σ_j c_ij.
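
The three formulas map directly onto a confusion matrix in code (a sketch; the matrix is invented, reusing the slide's 53 as an off-diagonal entry):

```python
import numpy as np

c = np.array([[95, 3, 2],      # c[i, j]: docs actually in class i assigned class j
              [10, 80, 10],
              [5, 53, 42]])

recall = np.diag(c) / c.sum(axis=1)      # c_ii / sum_j c_ij (row-wise)
precision = np.diag(c) / c.sum(axis=0)   # c_ii / sum_j c_ji (column-wise)
correct_rate = np.trace(c) / c.sum()     # sum_i c_ii / sum_ij c_ij
print(recall, precision, correct_rate)
```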

49 Dumais et al. 1998: Reuters - Accuracy (break-even points)

             Rocchio   NBayes   Trees   LinearSVM
earn          92.9%     95.9%    97.8%    98.2%
acq           64.7%     87.8%    89.7%    92.8%
money-fx      46.7%     56.6%    66.2%    74.0%
grain         67.5%     78.8%    85.0%    92.4%
crude         70.1%     79.5%    85.0%    88.3%
trade         65.1%     63.9%    72.5%    73.5%
interest      63.4%     64.9%    67.1%    76.3%
ship          49.2%     85.4%    74.2%    78.0%
wheat         68.9%     69.7%    92.5%    89.7%
corn          48.2%     65.3%    91.8%    91.1%
Avg Top 10    64.6%     81.5%    88.4%    91.4%
Avg All Cat   61.7%     75.2%    n/a      86.4%

Recall: % labeled in category among those stories that are really in the category. Precision: % really in the category among those stories labeled in the category. Break-even: (Recall + Precision) / 2.

50 Reuters ROC - Category Grain. [Figure: precision-recall curves for LSVM, Decision Tree, Naïve Bayes, and Find Similar.] Recall: % labeled in category among those stories that are really in the category. Precision: % really in the category among those stories labeled in the category.

51 ROC for Category - Crude. [Figure: precision-recall curves for LSVM, Decision Tree, Naïve Bayes, and Find Similar.]

52 Linear Programming. Standard max: maximize c^T x subject to Ax ≤ b, x ≥ 0. Standard min: minimize b^T y subject to A^T y ≥ c, y ≥ 0.

53 Standard Max Example. x₁, x₂: #pants, #shirts; b₁, b₂: #buttons, #zippers available; c₁, c₂: profit/pant, profit/shirt; y₁, y₂: $/button, $/zipper (the shadow prices); A_ij: usage, with i ∈ {buttons, zippers} and j ∈ {pants, shirts}. Primal: max c₁x₁ + c₂x₂ subject to (buttons per pant)x₁ + (buttons per shirt)x₂ ≤ b₁, (zippers per pant)x₁ + (zippers per shirt)x₂ ≤ b₂, x₁, x₂ ≥ 0. Dual: min b₁y₁ + b₂y₂ subject to A^T y ≥ c, y₁, y₂ ≥ 0.

54 Standard Max. Max is the primal: make pants/shirts yourself; maximize your profit; stay within the available buttons and zippers. Min is the dual: sell your material to a buyer; minimize the cost for the buyer; the buyer must offer more than what you could earn by making the things yourself.

55 Standard Min Example. x₁, x₂: pounds of meat, pounds of veggies; b₁, b₂: $/pound of meat, $/pound of veggies; c₁, c₂: required calories, required protein; A_ij: amount provided, with i ∈ {meat, veggies} and j ∈ {calories, protein}; y₁, y₂: $/calorie supplement, $/protein supplement (the shadow prices). Primal: min b₁x₁ + b₂x₂ subject to (cal/lb meat)x₁ + (cal/lb veggie)x₂ ≥ c₁, (protein/lb meat)x₁ + (protein/lb veggie)x₂ ≥ c₂, x₁, x₂ ≥ 0. Dual: max c₁y₁ + c₂y₂ subject to (cal/lb meat)y₁ + (protein/lb meat)y₂ ≤ b₁, (cal/lb veggie)y₁ + (protein/lb veggie)y₂ ≤ b₂, y₁, y₂ ≥ 0.

56 Standard Min. Min is the primal: buy meat and veggies to provide calories and protein; minimize your cost; supply enough nutrients. Max is the dual: aliens sell you magic calorie and protein pills; maximize the seller's profit; the seller must offer less than what you would pay to buy the food yourself.

57 Some Facts about LP. All linear programming problems can be transformed into either the standard max or the standard min form. If the max is feasible then the min is feasible, and vice versa. The optimal max is also the optimal min (no duality gap).
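
The no-gap fact is easy to check numerically with scipy's linprog on a small instance of the pants/shirts example (a sketch; all numbers are invented):

```python
import numpy as np
from scipy.optimize import linprog

# Standard max: max c.x  s.t.  A x <= b, x >= 0  (linprog minimizes, so negate c)
A = np.array([[4.0, 2.0], [2.0, 4.0]])   # buttons/zippers per pant and per shirt
b = np.array([100.0, 80.0])              # available buttons, zippers
c = np.array([30.0, 20.0])               # profit per pant, per shirt

primal = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2)

# Standard min (the dual): min b.y  s.t.  A^T y >= c, y >= 0
dual = linprog(b, A_ub=-A.T, b_ub=-c, bounds=[(0, None)] * 2)

print(-primal.fun, dual.fun)   # both 800.0: optimal max equals optimal min
```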

58 Conversion from Primal to Dual. Start from the primal: min b₁y₁ + b₂y₂ subject to a₁₁y₁ + a₂₁y₂ ≥ c₁, a₁₂y₁ + a₂₂y₂ ≥ c₂, y₁, y₂ ≥ 0. Rewrite the constraints as c₁ − a₁₁y₁ − a₂₁y₂ ≤ 0 and c₂ − a₁₂y₁ − a₂₂y₂ ≤ 0, attach multipliers x₁, x₂ ≥ 0, and consider max_x min_y [b₁y₁ + b₂y₂ + x₁(c₁ − a₁₁y₁ − a₂₁y₂) + x₂(c₂ − a₁₂y₁ − a₂₂y₂)]. Regrouping as c₁x₁ + c₂x₂ + y₁(b₁ − a₁₁x₁ − a₁₂x₂) + y₂(b₂ − a₂₁x₁ − a₂₂x₂), and requiring the coefficients of y₁, y₂ to be nonnegative, yields the dual: max c₁x₁ + c₂x₂ subject to a₁₁x₁ + a₁₂x₂ ≤ b₁, a₂₁x₁ + a₂₂x₂ ≤ b₂, x₁, x₂ ≥ 0.
