PATTERN RECOGNITION AND IMAGE UNDERSTANDING
1 PATTERN RECOGNITION AND IMAGE UNDERSTANDING The ultimate objective of many image analysis tasks is to discover the meaning of the analysed image, e.g. to categorise the objects, provide a symbolic/semantic interpretation of the image, or reach a global understanding of the image scene. Because these tasks are application specific, no ready-made solutions are available. Hence, image analysis problems bear a strong resemblance to intelligent cognition rather than relying on the application of a ready-to-use set of standard processing techniques. IE, Technical University of Lodz
2 Image understanding Intelligence: the ability to extract meaningful information from a set of irrelevant details, the capability to learn from examples and to generalise the acquired knowledge (so that it can be useful in different processing contexts), the ability to comprehend data from incomplete information.
3 The concept of learning - example: Emergence
4 An example of a difficult pattern recognition problem: Invariance
5 Gestalt theory of mind and brain: Multistability, Reification (illusory objects)
6 Main parts of an image analysis system: analysed scene → image acquisition → preprocessing → segmentation → representation and description → recognition and interpretation → result, with knowledge processing supporting all stages.
7 Main parts of an image analysis system, grouped by processing level: low-level processing covers image acquisition and preprocessing; intermediate-level processing covers segmentation and representation and description; high-level processing covers knowledge processing and recognition and interpretation, leading from the analysed scene to the result.
8 Image analysis system A complete image analysis system comprises three major data processing modules: Low-level processing - comprises image acquisition, image pre-processing and enhancement, e.g. noise reduction, contrast improvement, filtering, zooming, etc. Intermediate-level processing - deals with the extraction and description of image components identified from a knowledge base, e.g. image segmentation, boundary and edge description, morphological processing. High-level processing - involves classification, recognition and interpretation of the image scene.
9 Patterns and pattern classes A pattern is a set of features that form a qualitative or structural description of an analysed object; in a mathematical sense a pattern is defined as a vector of features x = [x1, x2, …, xN]. A pattern class is a group of patterns that have similar feature vectors (according to some similarity measure). Pattern classes are denoted ω1, ω2, …, ωM, where M is the number of classes. Pattern recognition (alternatively termed pattern classification) is the task of assigning patterns to their respective classes. It is equivalent to establishing a mapping x → ω from the feature space X to the pattern class space Ω.
10 Pattern classification definition [Figure: a many-to-one mapping from the feature space (vectors x1, x2, x3) to the pattern class space (classes ω1, ω2, …, ω6).]
11 Pattern classification problem The mapping x → ω should point to the true class label ω. The optimum among all possible mappings x → ω is the one that generates the minimum error rate. This rate depends on the distribution characteristics of the feature vectors in the pattern space X, i.e., their distributions, the overlap between distributions, etc. If knowledge about the distributions of the patterns is unavailable, the classification system must rely on learning paradigms, e.g. artificial neural networks. In such approaches the knowledge is acquired from a sufficiently large set of training samples drawn from the pattern examples.
12 Selection of feature properties
- discrimination: features should assume considerably different values for patterns from different classes, e.g., the diameter of fruits is a good feature for classification between grapefruits and cherries,
- robustness: features should assume similar values for all patterns belonging to the same class, e.g., colour is a poor feature for apples,
- independence: features used in a classification system should be uncorrelated, e.g., the weight and size of a fruit are strongly correlated features,
- small number of features: the complexity of a data classification system grows substantially with the number of classified features, e.g., features that are strongly correlated should be eliminated.
13 Selection of feature properties Correlation coefficient between features X and Y:

γ_xy = (1/P) Σ_{i=1..P} (x_i − µ_x)(y_i − µ_y) / (σ_x σ_y)

where: P is the number of patterns, µ_x, µ_y are the average values of features X and Y correspondingly, and σ_x, σ_y are their standard deviations. The correlation coefficient assumes values in the range [−1, 1].
14 Selection of feature properties A measure of separation for feature X between classes j and k:

D̂_xjk = |µ_xj − µ_xk| / (σ_xj + σ_xk)

A large value of this measure means feature X yields good separation between the two classes.
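The two measures above are straightforward to compute directly. A minimal sketch in Python; the fruit numbers below are made-up illustrative values, not data from the lecture:

```python
# Correlation coefficient and class-separation measure for feature selection.

def correlation(xs, ys):
    """gamma_xy = (1/P) * sum_i (x_i - mu_x)(y_i - mu_y) / (sigma_x * sigma_y)."""
    P = len(xs)
    mu_x = sum(xs) / P
    mu_y = sum(ys) / P
    # Population standard deviations, matching the 1/P averaging of the slide.
    sigma_x = (sum((x - mu_x) ** 2 for x in xs) / P) ** 0.5
    sigma_y = (sum((y - mu_y) ** 2 for y in ys) / P) ** 0.5
    cov = sum((x - mu_x) * (y - mu_y) for x, y in zip(xs, ys)) / P
    return cov / (sigma_x * sigma_y)

def separation(mu_j, sigma_j, mu_k, sigma_k):
    """D_xjk = |mu_xj - mu_xk| / (sigma_xj + sigma_xk); large = good separation."""
    return abs(mu_j - mu_k) / (sigma_j + sigma_k)

weights = [100, 120, 140, 160, 180]   # grams: strongly correlated with size
sizes = [5.0, 5.9, 7.1, 8.0, 9.0]     # cm
print(correlation(weights, sizes))    # close to +1 -> redundant feature pair

# Grapefruit (mu=10 cm, sigma=1) vs cherry (mu=2 cm, sigma=0.5) diameters:
print(separation(10.0, 1.0, 2.0, 0.5))   # 8 / 1.5, a well-separated feature
```

A near-unity correlation between two features signals that one of them can be eliminated, as the slide recommends.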
15 Reduction of the number of features [Figure: the direction of the first principal component; the angle θ is chosen for the best separation of the features.]
16 Decision-theoretic methods The pattern classification problem can be approached by means of so-called decision functions. The number of these functions is equal to the number of classes. Let x = [x1, x2, …, xN]^T represent an N-dimensional pattern vector. For M pattern classes ω1, ω2, …, ωM, we search for M decision functions d1(x), d2(x), …, dM(x) with the property that, if a pattern x belongs to class ωi, then

d_i(x) > d_j(x), j = 1, 2, …, M; j ≠ i

In other words, decision function d_i(x) wins the competition for assigning the input feature vector x to the pattern class ωi.
17 Decision functions [Figure: Wiley]
18 Decision functions The decision boundary between two arbitrary classes i and j (i ≠ j) is defined by the function: d_ij(x) = d_i(x) − d_j(x) = 0. Then, for patterns of class ωi: d_ij(x) > 0, and for patterns of class ωj: d_ij(x) < 0.
19 Minimum distance classifier Assume that each pattern class is represented by a mean vector (also called a class prototype):

m_j = (1/N_j) Σ_{x∈ω_j} x, j = 1, 2, …, M

where N_j is the number of pattern vectors from class ω_j. A possible way to determine the class membership of an unknown pattern vector x is to assign it to the class of its closest prototype vector.
20 Minimum distance classifier If the Euclidean distance is used, the distance measure is of the form:

D_j(x) = ‖x − m_j‖, j = 1, 2, …, M, where ‖x‖ = (xᵀx)^{1/2}.

Feature vector x is assigned to class ω_j if D_j(x) is the smallest distance.
21 Minimum distance classifier The following decision function can be constructed:

d_j(x) = xᵀm_j − ½ m_jᵀm_j, j = 1, 2, …, M

assigning x to class ω_j if d_j(x) gives the largest value.
22 Minimum distance classifier The decision boundary between classes ω_i and ω_j for a minimum distance classifier is:

d_ij(x) = d_i(x) − d_j(x) = xᵀ(m_i − m_j) − ½ (m_i − m_j)ᵀ(m_i + m_j) = 0

The surface defined by this equation is the perpendicular bisector of the line segment joining m_i and m_j. For N = 2 the bisector is a line, for N = 3 it is a plane, and for N > 3 it is called a hyperplane.
23 Minimum distance classifier [Figure: Wiley] Decision boundary: d_12(x) = 2.8x1 + 1.0x2 − 8.92 = 0
24 Minimum distance classifier Example 1: Consider two classes denoted by ω1 and ω2 that have mean vectors m1 = (4.3, 1.3)ᵀ and m2 = (1.5, 0.3)ᵀ. The decision functions for the two classes are:

d1(x) = xᵀm1 − ½ m1ᵀm1 = 4.3x1 + 1.3x2 − 10.09
d2(x) = xᵀm2 − ½ m2ᵀm2 = 1.5x1 + 0.3x2 − 1.17

Equation of the decision boundary: d_12(x) = d1(x) − d2(x) = 2.8x1 + 1.0x2 − 8.92 = 0. The class membership of a new feature vector is decided on the basis of the sign of d_12(x).
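Example 1 can be checked directly in code. A minimal pure-Python sketch of the minimum distance decision functions with the two class means from the slide (the query points are made-up test inputs):

```python
# Minimum distance classifier: d_j(x) = x^T m_j - 0.5 * m_j^T m_j,
# assign x to the class whose decision function is largest.

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def decision(x, m):
    # d_j(x) = x^T m_j - (1/2) m_j^T m_j
    return dot(x, m) - 0.5 * dot(m, m)

m1 = (4.3, 1.3)   # class omega_1 prototype
m2 = (1.5, 0.3)   # class omega_2 prototype

def classify(x):
    return 1 if decision(x, m1) > decision(x, m2) else 2

def d12(x):
    # Boundary function: equals 2.8*x1 + 1.0*x2 - 8.92 for these means.
    return decision(x, m1) - decision(x, m2)

print(classify((4.0, 1.0)))   # near m1 -> 1
print(classify((1.0, 0.5)))   # near m2 -> 2
```

The sign of `d12` reproduces the boundary equation on the slide: positive on the ω1 side, negative on the ω2 side.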
25 Voronoi mosaic
%MATLAB
rand('state',0);
x = rand(1,10);
y = rand(1,10);
[vx, vy] = voronoi(x,y);
plot(x,y,'r+',vx,vy,'b-');
axis equal;
[Figure: the resulting Voronoi mosaic; each 'r+' marker is a class prototype.]
26 Example 2: Classification of the American Bankers Association font character set. The design of the font ensures that the waveform corresponding to each character is distinct from that of all others. There are N = 14 prototype vectors in a 10-D feature space.
27 Minimum distance classifier The minimum distance classifier works well under the following conditions: the distance between class means is large compared to the spread of each class; the classes are linearly separable (the class boundaries are lines, planes or hyperplanes).
28 Evaluation of pattern classification quality [Figure: statistical distributions of a single tested feature for classes A and B; the decision border divides the feature axis into TN, FN, FP and TP regions.]
29 Evaluation of pattern classification quality Confusion matrix (rows: classifier output, columns: true classification):

                   true Class B (wrong, ill, …)   true Class A
output Class B     TP (true positive)             FP (false positive)
output Class A     FN (false negative)            TN (true negative)
30 Evaluation of pattern classification quality

Accuracy: ACC = (TP + TN) / (TP + TN + FP + FN)
Sensitivity: SE = TP / (TP + FN)

[Figure: TP and FN together cover all class B examples.]
31 Evaluation of pattern classification quality

Specificity: SP = TN / (FP + TN)
False alarm rate: FA = 1 − SP = FP / (FP + TN)

[Figure: TN and FP together cover all class A examples.]
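The four quality measures above can be wrapped in one small helper. A sketch in Python; the confusion-matrix counts are invented for illustration:

```python
# Classification quality measures computed from confusion-matrix counts.

def metrics(TP, FN, FP, TN):
    acc = (TP + TN) / (TP + TN + FP + FN)   # accuracy
    se = TP / (TP + FN)                      # sensitivity
    sp = TN / (FP + TN)                      # specificity
    fa = 1 - sp                              # false alarm rate = FP/(FP+TN)
    return acc, se, sp, fa

# Illustrative counts: 40 true positives, 10 misses,
# 5 false alarms, 45 true negatives.
acc, se, sp, fa = metrics(40, 10, 5, 45)
print(acc, se, sp, fa)
```

For these counts the helper gives ACC = 0.85, SE = 0.8, SP = 0.9 and FA = 0.1, illustrating that a classifier can have high accuracy while still missing one class B example in five.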
32 Receiver Operating Characteristic (ROC curve) [Figure: sensitivity (0 to 1) plotted against false alarm rate (0 to 1) as the diagnostic parameter is varied; a good test bends towards the top-left corner, a poor test stays close to the diagonal.] See excellent demo explaining ROC:
33 Types of classifiers Parametric: the probability density functions are known and only the parameters of these functions are unknown (e.g., the Gaussian distribution is defined by its mean µ and standard deviation σ). The Bayes classifier is a parametric classifier, i.e., it requires knowledge about the density function distributions. Nonparametric: the probability density functions are not known and must be estimated from sufficiently large measurement data.
34 Image segmentation example
35 Colour-based segmentation {R,G,B} [Figure: statistical distributions of the R, G and B colour components for the indicated image regions.]
36 The Bayes classifier [Figure: class-conditional densities p(x) for c1 (class 1) and c2 (class 2), together with their a priori probabilities.]
37 The Bayes classifier From probability theory the following holds (Bayes rule):

p(a/b) = p(a) p(b/a) / p(b)

hence:

p(c_i/x) = P(c_i) p(x/c_i) / p(x)

where P(c_i) is the a priori probability and p(c_i/x) is the a posteriori probability.
38 Maximum likelihood estimator An image pixel is assigned to the i-th class for which:

L_i = p(c_i/x) = P(c_i) p(x/c_i) / p(x), i = 1, 2, …, N

is maximum, i.e. L_i > L_j, ∀ j ≠ i, i, j = 1, 2, …, N.
39 The Bayes classifier [Figure: posterior probabilities for c1 (class 1) and c2 (class 2); the decision border separates the region where p(c1/x) > p(c2/x) from the region where p(c2/x) > p(c1/x).]
40 The Bayes classifier Class-conditional density for a 1D Gaussian distribution:

p(x/c_i) = (1/(√(2π) σ_i)) exp(−(x − m_i)² / (2σ_i²)), i = 1, 2

A posteriori probability:

p(c_i/x) = p(x/c_i) P(c_i) / p(x), i = 1, 2
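As a sketch of the rule above, a two-class 1-D Gaussian Bayes classifier in Python; the means, deviations and priors are assumed illustrative values, not parameters from the lecture:

```python
# 1-D Gaussian Bayes classifier: pick the class maximising P(c_i) * p(x/c_i).
import math

def gauss(x, m, s):
    # p(x/c_i) for a 1-D Gaussian with mean m and standard deviation s
    return math.exp(-(x - m) ** 2 / (2 * s ** 2)) / (math.sqrt(2 * math.pi) * s)

classes = {
    "c1": {"m": 0.0, "s": 1.0, "P": 0.5},
    "c2": {"m": 3.0, "s": 1.0, "P": 0.5},
}

def classify(x):
    # The evidence p(x) is the same for every class, so it can be
    # dropped from the comparison.
    return max(classes,
               key=lambda c: classes[c]["P"] * gauss(x, classes[c]["m"], classes[c]["s"]))

print(classify(0.5))   # closer to m1 -> 'c1'
print(classify(2.8))   # closer to m2 -> 'c2'
```

With equal priors and equal σ the decision border lies midway between the two means, at x = 1.5 for these values.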
41 The Bayes classifier - example Pot A: 20 red, 20 white. Pot B: 30 red, 10 white. We pick a pot at random and then a ball at random. Question: from which pot was the ball picked?
42 The Bayes classifier - example Pot A: 20 red, 20 white. Pot B: 30 red, 10 white. Before we see the colour of the ball we picked (a priori knowledge), we assume the probability of choosing each of the pots is equal, i.e. 0.5.
43 The Bayes classifier - example Pot A: 20 red, 20 white. Pot B: 30 red, 10 white. Suppose we picked a red ball. Can we verify our first hypothesis having this a posteriori knowledge? Yes, the answer comes from the Bayes theorem.
44 The Bayes classifier - example Pot A: 20 red, 20 white. Pot B: 30 red, 10 white. That was the pot! (probably)

p(A/red) = P(A) p(red/A) / [P(A) p(red/A) + P(B) p(red/B)] = (0.5 · 0.5) / (0.5 · 0.5 + 0.5 · 0.75) = 0.4
p(B/red) = P(B) p(red/B) / [P(A) p(red/A) + P(B) p(red/B)] = (0.5 · 0.75) / (0.5 · 0.5 + 0.5 · 0.75) = 0.6
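The two posteriors in the pot example each follow from one line of Bayes' theorem; a quick numeric check in Python:

```python
# Bayes-theorem check of the two-pot example: pot A holds 20 red + 20 white
# balls, pot B holds 30 red + 10 white, and each pot is picked with
# probability 0.5.

P_A, P_B = 0.5, 0.5
p_red_A = 20 / 40   # p(red/A) = 0.5
p_red_B = 30 / 40   # p(red/B) = 0.75

evidence = P_A * p_red_A + P_B * p_red_B   # p(red)
p_A_red = P_A * p_red_A / evidence          # posterior for pot A
p_B_red = P_B * p_red_B / evidence          # posterior for pot B

print(p_A_red, p_B_red)   # 0.4 0.6
```

Seeing a red ball shifts the belief from the 0.5/0.5 prior towards pot B, which holds proportionally more red balls.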
45 Multivariate Gaussian distribution Class-conditional density for a 2D Gaussian distribution, xᵀ = [x1 x2]:

p(x/c_i) = (1 / (2π |Σ_i|^{1/2})) exp(−½ (x − m_i)ᵀ Σ_i⁻¹ (x − m_i)), i = 1, 2, …, N

where i is the class index of the N given classes and the covariance matrix is:

Σ_i = [ σ_x1x1  σ_x1x2 ; σ_x2x1  σ_x2x2 ]
46 2D Gaussian distribution - example Let m1 = m2 = 0 and Σ1 = Σ2 = Σ. Then:

p(x) = (1 / (2π |Σ|^{1/2})) exp(−½ xᵀ Σ⁻¹ x) = A exp(B)

with A = 1 / (2π |Σ|^{1/2}) and B = −½ xᵀ Σ⁻¹ x.
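A sketch of evaluating p(x) = A·exp(B) for a 2-D zero-mean Gaussian in Python; the covariance used below (the identity matrix) is an assumed stand-in, not the numeric covariance of the slide's example:

```python
# Evaluate the zero-mean 2-D Gaussian density p(x) = A * exp(B).
import math

def gauss2d(x, cov):
    (a, b), (c, d) = cov
    det = a * d - b * c
    # Inverse of the 2x2 covariance matrix.
    inv = ((d / det, -b / det), (-c / det, a / det))
    A = 1.0 / (2 * math.pi * math.sqrt(det))
    # q = x^T Sigma^{-1} x, so B = -q/2 (means m1 = m2 = 0).
    q = (x[0] * (inv[0][0] * x[0] + inv[0][1] * x[1])
         + x[1] * (inv[1][0] * x[0] + inv[1][1] * x[1]))
    return A * math.exp(-0.5 * q)

identity = ((1.0, 0.0), (0.0, 1.0))
print(gauss2d((0.0, 0.0), identity))   # peak value A = 1/(2*pi) ≈ 0.159
```

At the mean the exponent B vanishes, so the density equals the normalising factor A; it decays as x moves away from the origin.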
47 Colour-based segmentation {R,G,B} [Figure: the colour distribution of each indicated region modelled by a multivariate Gaussian in the (R, G, B) space.]
48 Selection of the feature vector Feature vector: θ = {x, y, R, G, B, u, v}, combining coordinates (x, y), colour (R, G, B) and optical flow (u, v), where u = dx/dt and v = dy/dt.
49 Optical flow features Optical-flow-based segmentation: {u,v} is the optical flow vector field.
50 k-nearest Neghbours Classfer (k-nn) x x In each pont of the feature space, a number of k- neghbours s counted. The pont s assgned to the class from whch there s the largest number of ponts n the neghbourhood. x 1
51 Image segmentation examples
52 Image segmentation examples
53 Fingerprint recognition FBI image database of fingerprints (1992)
54 Iris recognition (biometry) The patented Daugman algorithm
55 Analysis of biomedical images Mammography
56 Image database querying Idea of the search: an image (or a copy of the image) is matched against the database (a hit) using the DWT. C.E. Jacobs, A. Finkelstein, D.H. Salesin, Fast multiresolution image querying, 1995
57 Analysis of X-ray images of grains P. Strumillo, J. Niewczas, P. Szczypiński, P. Makowski, W. Wozniak, Computer system for analysis of X-ray images of wheat grains, International Agrophysics, 1999, vol. 13, No. 1, pp.
58 Automatic text detection
More informationLinear Regression Analysis: Terminology and Notation
ECON 35* -- Secton : Basc Concepts of Regresson Analyss (Page ) Lnear Regresson Analyss: Termnology and Notaton Consder the generc verson of the smple (two-varable) lnear regresson model. It s represented
More informationEEE 241: Linear Systems
EEE : Lnear Systems Summary #: Backpropagaton BACKPROPAGATION The perceptron rule as well as the Wdrow Hoff learnng were desgned to tran sngle layer networks. They suffer from the same dsadvantage: they
More informationLecture 3: Probability Distributions
Lecture 3: Probablty Dstrbutons Random Varables Let us begn by defnng a sample space as a set of outcomes from an experment. We denote ths by S. A random varable s a functon whch maps outcomes nto the
More informationERROR RATES STABILITY OF THE HOMOSCEDASTIC DISCRIMINANT FUNCTION
ISSN - 77-0593 UNAAB 00 Journal of Natural Scences, Engneerng and Technology ERROR RATES STABILITY OF THE HOMOSCEDASTIC DISCRIMINANT FUNCTION A. ADEBANJI, S. NOKOE AND O. IYANIWURA 3 *Department of Mathematcs,
More informationTurbulence classification of load data by the frequency and severity of wind gusts. Oscar Moñux, DEWI GmbH Kevin Bleibler, DEWI GmbH
Turbulence classfcaton of load data by the frequency and severty of wnd gusts Introducton Oscar Moñux, DEWI GmbH Kevn Blebler, DEWI GmbH Durng the wnd turbne developng process, one of the most mportant
More informationFeb 14: Spatial analysis of data fields
Feb 4: Spatal analyss of data felds Mappng rregularly sampled data onto a regular grd Many analyss technques for geophyscal data requre the data be located at regular ntervals n space and/or tme. hs s
More informationCS 468 Lecture 16: Isometry Invariance and Spectral Techniques
CS 468 Lecture 16: Isometry Invarance and Spectral Technques Justn Solomon Scrbe: Evan Gawlk Introducton. In geometry processng, t s often desrable to characterze the shape of an object n a manner that
More informationConjugacy and the Exponential Family
CS281B/Stat241B: Advanced Topcs n Learnng & Decson Makng Conjugacy and the Exponental Famly Lecturer: Mchael I. Jordan Scrbes: Bran Mlch 1 Conjugacy In the prevous lecture, we saw conjugate prors for the
More informationLecture 3: Dual problems and Kernels
Lecture 3: Dual problems and Kernels C4B Machne Learnng Hlary 211 A. Zsserman Prmal and dual forms Lnear separablty revsted Feature mappng Kernels for SVMs Kernel trck requrements radal bass functons SVM
More informationChapter 12 Analysis of Covariance
Chapter Analyss of Covarance Any scentfc experment s performed to know somethng that s unknown about a group of treatments and to test certan hypothess about the correspondng treatment effect When varablty
More informationCS 2750 Machine Learning. Lecture 5. Density estimation. CS 2750 Machine Learning. Announcements
CS 750 Machne Learnng Lecture 5 Densty estmaton Mlos Hauskrecht mlos@cs.ptt.edu 539 Sennott Square CS 750 Machne Learnng Announcements Homework Due on Wednesday before the class Reports: hand n before
More informationLinear Approximation with Regularization and Moving Least Squares
Lnear Approxmaton wth Regularzaton and Movng Least Squares Igor Grešovn May 007 Revson 4.6 (Revson : March 004). 5 4 3 0.5 3 3.5 4 Contents: Lnear Fttng...4. Weghted Least Squares n Functon Approxmaton...
More informationNeural networks. Nuno Vasconcelos ECE Department, UCSD
Neural networs Nuno Vasconcelos ECE Department, UCSD Classfcaton a classfcaton problem has two types of varables e.g. X - vector of observatons (features) n the world Y - state (class) of the world x X
More informationDr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur
Analyss of Varance and Desgn of Exerments-I MODULE III LECTURE - 2 EXPERIMENTAL DESIGN MODELS Dr. Shalabh Deartment of Mathematcs and Statstcs Indan Insttute of Technology Kanur 2 We consder the models
More informationInner Product. Euclidean Space. Orthonormal Basis. Orthogonal
Inner Product Defnton 1 () A Eucldean space s a fnte-dmensonal vector space over the reals R, wth an nner product,. Defnton 2 (Inner Product) An nner product, on a real vector space X s a symmetrc, blnear,
More informationHomework 9 STAT 530/J530 November 22 nd, 2005
Homework 9 STAT 530/J530 November 22 nd, 2005 Instructor: Bran Habng 1) Dstrbuton Q-Q plot Boxplot Heavy Taled Lght Taled Normal Skewed Rght Department of Statstcs LeConte 203 ch-square dstrbuton, Telephone:
More informationCluster Validation Determining Number of Clusters. Umut ORHAN, PhD.
Cluster Analyss Cluster Valdaton Determnng Number of Clusters 1 Cluster Valdaton The procedure of evaluatng the results of a clusterng algorthm s known under the term cluster valdty. How do we evaluate
More informationIV. Performance Optimization
IV. Performance Optmzaton A. Steepest descent algorthm defnton how to set up bounds on learnng rate mnmzaton n a lne (varyng learnng rate) momentum learnng examples B. Newton s method defnton Gauss-Newton
More informationClassification. Representing data: Hypothesis (classifier) Lecture 2, September 14, Reading: Eric CMU,
Machne Learnng 10-701/15-781, 781, Fall 2011 Nonparametrc methods Erc Xng Lecture 2, September 14, 2011 Readng: 1 Classfcaton Representng data: Hypothess (classfer) 2 1 Clusterng 3 Supervsed vs. Unsupervsed
More informationBoostrapaggregating (Bagging)
Boostrapaggregatng (Baggng) An ensemble meta-algorthm desgned to mprove the stablty and accuracy of machne learnng algorthms Can be used n both regresson and classfcaton Reduces varance and helps to avod
More informationChapter 9: Statistical Inference and the Relationship between Two Variables
Chapter 9: Statstcal Inference and the Relatonshp between Two Varables Key Words The Regresson Model The Sample Regresson Equaton The Pearson Correlaton Coeffcent Learnng Outcomes After studyng ths chapter,
More informationChat eld, C. and A.J.Collins, Introduction to multivariate analysis. Chapman & Hall, 1980
MT07: Multvarate Statstcal Methods Mke Tso: emal mke.tso@manchester.ac.uk Webpage for notes: http://www.maths.manchester.ac.uk/~mkt/new_teachng.htm. Introducton to multvarate data. Books Chat eld, C. and
More informationFeature Selection & Dynamic Tracking F&P Textbook New: Ch 11, Old: Ch 17 Guido Gerig CS 6320, Spring 2013
Feature Selecton & Dynamc Trackng F&P Textbook New: Ch 11, Old: Ch 17 Gudo Gerg CS 6320, Sprng 2013 Credts: Materal Greg Welch & Gary Bshop, UNC Chapel Hll, some sldes modfed from J.M. Frahm/ M. Pollefeys,
More informationMachine learning: Density estimation
CS 70 Foundatons of AI Lecture 3 Machne learnng: ensty estmaton Mlos Hauskrecht mlos@cs.ptt.edu 539 Sennott Square ata: ensty estmaton {.. n} x a vector of attrbute values Objectve: estmate the model of
More information