Bayesian Decision Theory
1 No. 4: Bayesian Decision Theory. Hui Jiang, Department of Electrical Engineering and Computer Science, Lassonde School of Engineering, York University, Toronto, Canada.
2 Outline: Pattern classification problems. Bayesian decision theory: how do we make the optimal decision? Generative models: the maximum a posteriori (MAP) decision rule, which leads to minimum-error-rate classification. Model estimation: maximum likelihood, Bayesian learning, discriminative training, etc.
3 Pattern Classification Problem: We are given a fixed, finite set of classes ω_1, ω_2, …, ω_N and an unknown pattern/object without its class identity, BUT we can measure (observe) some features about it. x is called the feature, or observation, of the pattern; x can be a scalar, a vector, or a sequence of vectors, and it can be continuous or discrete. The pattern classification problem: determine the class identity of any pattern based on its observation (feature) x. Fundamental issues in pattern classification: how do we make an optimal classification, and in what sense is it optimal?
4 Examples of pattern classification (I). Speech recognition: Pattern: the voice spoken by a human being. Classes: the words/sentences of the language used by the speaker. Features: speech signal characteristics measured by a microphone, i.e., a sequence of feature vectors; each vector consists of continuous, high-dimensional, real-valued numbers. Natural language understanding: Pattern: written or spoken human language. Classes: all possible semantic meanings or intentions. Features: the words or word sequences (sentences) used; discrete, scalars or vectors.
5 Examples of pattern classification (II). Image understanding: Pattern: given images. Classes: all known object categories. Features: color or gray-scale values of all pixels; continuous, multiple vectors/matrices. Examples: face recognition, OCR (optical character recognition). Gene finding in bioinformatics: Pattern: a newly sequenced DNA sequence. Classes: all known genes. Features: all nucleotides in the sequence; discrete, 4 types (adenine, guanine, cytosine, thymine). Protein classification in bioinformatics: Pattern: a protein's primary (1-D) sequence. Classes: all known protein families or domains. Features: all amino acids in the sequence; discrete, 20 types.
6 Bayesian Decision Theory (I): Bayesian decision theory is a fundamental statistical approach to all pattern classification problems. The pattern classification problem is posed in probabilistic terms: the observation x is viewed as a random variable (or vector), and the class id ω is treated as a discrete random variable which can take the values ω_1, ω_2, …, ω_N. Therefore, we are interested in the joint probability distribution of x and ω, p(x, ω), which contains all the information about x and ω. If all the relevant probability values and densities in the problem are known (i.e., we have complete knowledge of the problem), Bayesian decision theory leads to the optimal classification; "optimal" means it guarantees the minimum average classification error. This minimum classification error is called the Bayes error.
7 Bayesian Decision Theory (II): In pattern classification, "all relevant probabilities" means: (a) The prior probabilities of each class, P(ω_i), i = 1, 2, …, N: how likely any pattern is to come from each class before observing any features; prior knowledge from previous experience. All priors sum to one: Σ_{i=1}^N P(ω_i) = 1. (b) The class-conditional probability density functions of the observed feature, p(x | ω_i), i = 1, 2, …, N: how the feature is distributed over all patterns belonging to one class. If x is continuous, p(x | ω_i) is a continuous p.d.f.; for every class i, ∫ p(x | ω_i) dx = 1. If x is discrete, p(x | ω_i) is a discrete probability mass function (p.m.f.); for every class i, Σ_x P(x | ω_i) = 1.
8 Example of class-conditional p.d.f.s. [Figure: two class-conditional densities p(x | ω_i); from Duda et al., Pattern Classification, John Wiley & Sons, Inc.]
9 Bayes Decision Rule: Maximum a posteriori (MAP) (I). If we do not observe any feature of an incoming unknown pattern, we classify it based on prior knowledge only: ω* = argmax_i P(ω_i), i.e., we roughly guess it as the class with the largest prior probability. If we observe some features x of the unknown pattern, we can convert the prior probability into a posterior probability via Bayes' theorem: P(ω_i | x) = p(x | ω_i) P(ω_i) / p(x), i.e., posterior = (likelihood × prior) / evidence.
10 Bayes decision rule: Maximum a posteriori (MAP) (II). Prior P(ω_i): the probability of receiving a pattern from class ω_i before observing anything (prior knowledge). Likelihood p(x | ω_i): the probability of observing feature x if we assume it comes from a pattern in class ω_i; if x is assumed given and we treat it as a function of ω_i, it is called the likelihood function. Posterior P(ω_i | x): the probability that the pattern comes from class ω_i after observing its feature x. Evidence p(x): a scaling factor that guarantees the posterior probabilities sum to one over i.
11 Bayes decision rule: Maximum a posteriori (MAP) (III). If we observe some features x of an unknown pattern, the observation converts the prior into the posterior. Intuitively, we can classify the pattern based on the posterior probabilities, resulting in the maximum a posteriori (MAP) decision rule, also called the Bayes decision rule. For an unknown pattern, after observing its features x, we classify it into the class with the largest posterior probability: ω* = argmax_i P(ω_i | x) = argmax_i p(x | ω_i) P(ω_i) / p(x) = argmax_i p(x | ω_i) P(ω_i).
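The MAP rule above can be sketched numerically. A minimal sketch, assuming a 2-class problem with 1-D Gaussian class-conditional densities; the priors, means, and standard deviations below are illustrative values, not from the slides:

```python
import numpy as np

# Illustrative assumptions: two classes with Gaussian class-conditional
# densities and these priors (must sum to one).
priors = np.array([0.6, 0.4])   # P(w1), P(w2)
means = np.array([0.0, 2.0])
stds = np.array([1.0, 1.0])

def likelihood(x, i):
    """Class-conditional p.d.f. p(x | w_i) for a 1-D Gaussian class."""
    z = (x - means[i]) / stds[i]
    return np.exp(-0.5 * z * z) / (stds[i] * np.sqrt(2.0 * np.pi))

def posterior(x):
    """P(w_i | x) via Bayes' theorem: prior * likelihood / evidence."""
    joint = np.array([likelihood(x, i) * priors[i] for i in range(2)])
    return joint / joint.sum()      # the evidence p(x) normalizes to one

def map_decide(x):
    """MAP rule: pick the class index with the largest posterior."""
    return int(np.argmax(posterior(x)))
```

Since the evidence p(x) is the same for every class, `map_decide` could equally well take the argmax of `likelihood(x, i) * priors[i]` without normalizing; the posterior is computed explicitly here only for clarity.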
12 The MAP decision rule is optimal (I). How well does the MAP decision rule behave? Optimality: assuming we have complete knowledge, including P(ω_i) and p(x | ω_i) for i = 1, 2, …, N, the MAP decision rule is optimal for classifying patterns: it achieves the lowest average classification error rate. Proof of optimality of the MAP rule: Given a pattern, if its true class id is ω_j but we classify it as ω_i, the classification error is counted as l(ω_i | ω_j) = 0 if i = j, and 1 otherwise, which is also known as the 0-1 loss function.
13 The MAP decision rule is optimal (II). Proof of optimality of the MAP rule (cont'd): For a pattern, after observing x, the posterior P(ω_j | x) is the probability that its true class id is ω_j. Thus the expected (average) classification error associated with classifying x as ω_i is R(ω_i | x) = Σ_{j=1}^N l(ω_i | ω_j) P(ω_j | x) = 1 − P(ω_i | x). The optimal classification minimizes this average classification error; i.e., after observing x, we classify it as the ω_i that minimizes R(ω_i | x), which is equivalent to maximizing P(ω_i | x). Hence the MAP decision rule is optimal: it achieves the minimum average error rate, and this minimum error is called the Bayes error.
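The 0-1 loss step above can be checked directly: with l(ω_i | ω_j) = 1 − δ_ij, the conditional risk is R(ω_i | x) = 1 − P(ω_i | x), so minimizing the risk and maximizing the posterior select the same class. The posterior vector below is an illustrative assumption:

```python
import numpy as np

# Illustrative assumption: some posterior P(w_i | x) over 3 classes.
posterior = np.array([0.1, 0.7, 0.2])       # sums to one

loss = 1.0 - np.eye(3)                      # 0-1 loss matrix l(w_i | w_j)
risk = loss @ posterior                     # R(w_i | x) = sum_j l(i|j) P(w_j|x)
# Each entry equals 1 - P(w_i | x), so argmin(risk) == argmax(posterior).
```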
14 The MAP decision rule. A general decision rule is a mapping function ω = g(x): given any observation x, it outputs a class id. If we have N classes in total, a decision rule partitions the entire feature space of x into N different regions Ω_1, Ω_2, …, Ω_N; if x is located in region Ω_i, we classify it as class ω_i. Each region Ω_i may consist of several contiguous areas. The MAP decision rule (the Bayes decision rule) is optimal among all possible decision rules in terms of minimizing the average classification error, conditional on having complete and precise knowledge of the underlying problem. [Figure: the feature space partitioned into regions for classes 1, 2, …, N.]
15 The MAP decision rule: example. [Figure: decision regions Ω_1 and Ω_2 for a two-class problem; from Duda et al., Pattern Classification, John Wiley & Sons, Inc.]
16 Classification error probability of a decision rule. Assume an N-class problem; any decision rule partitions the feature space into N regions Ω_1, Ω_2, …, Ω_N. Pr(x ∈ Ω_i, ω_j) denotes the probability that the observation of a pattern whose true class id is ω_j falls in region Ω_i. The overall classification error probability of the decision rule is p(error) = Σ_{i=1}^N Σ_{j≠i} Pr(x ∈ Ω_i, ω_j) = Σ_{i=1}^N Σ_{j≠i} ∫_{Ω_i} p(x | ω_j) P(ω_j) dx = 1 − p(correct), where p(correct) = Σ_{i=1}^N ∫_{Ω_i} p(x | ω_i) P(ω_i) dx.
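The error probability can be evaluated numerically in a simple 1-D case. Under the MAP regions, the error contribution at each x is the smaller of the two joint densities p(x | ω_i)P(ω_i), and the correct contribution is the larger. A sketch assuming equal priors and two unit-variance Gaussians (illustrative values):

```python
import numpy as np

# Illustrative assumptions: equal priors, unit-variance Gaussian classes.
priors = np.array([0.5, 0.5])
means = np.array([0.0, 2.0])

x = np.linspace(-10.0, 12.0, 20001)          # dense 1-D grid over the feature space
dx = x[1] - x[0]

def gauss(x, m):
    return np.exp(-0.5 * (x - m) ** 2) / np.sqrt(2.0 * np.pi)

joint = np.vstack([priors[i] * gauss(x, means[i]) for i in range(2)])

# Under the MAP rule, the error density at each x is min_i p(x|w_i)P(w_i),
# and the correct-decision density is max_i p(x|w_i)P(w_i).
p_error = joint.min(axis=0).sum() * dx       # Riemann-sum integration
p_correct = joint.max(axis=0).sum() * dx
```

For this symmetric configuration the MAP boundary sits at x = 1, so the exact Bayes error is Φ(−1) ≈ 0.1587, which the numerical integral reproduces.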
17 Example of error probability in the 2-class case. [Figure: the joint densities p(x | ω_1)P(ω_1) and p(x | ω_2)P(ω_2); the shaded areas ∫_{Ω_2} p(x | ω_1)P(ω_1) dx and ∫_{Ω_1} p(x | ω_2)P(ω_2) dx are the two error contributions; from Duda et al., Pattern Classification, John Wiley & Sons, Inc.]
18 Bayes Error. The Bayes error is the error probability of the Bayes (MAP) decision rule. Since the Bayes decision rule guarantees the minimum error, the Bayes error is the lower bound on all possible error probabilities. It is difficult to calculate the Bayes error, even for very simple cases, because of the discontinuous nature of the decision regions in the integral, especially in high dimensions. Some approximation methods estimate an upper bound instead: the Chernoff bound, the Bhattacharyya bound, or evaluation on an independent test set.
19 Example: Bayes decision for independent binary features. The Bayes decision rule (the MAP rule) is also applicable when the feature x is discrete. A simple case (binomial model): 2 classes ω_1, ω_2; the feature vector x = (x_1, …, x_d)^t is d-dimensional, and its components are binary-valued and conditionally independent. Let p_i = Pr(x_i = 1 | ω_1) and q_i = Pr(x_i = 1 | ω_2) for i = 1, …, d. Then p(x | ω_1) = Π_{i=1}^d p_i^{x_i} (1 − p_i)^{1−x_i} and p(x | ω_2) = Π_{i=1}^d q_i^{x_i} (1 − q_i)^{1−x_i}.
20 Example: Bayes decision for independent binary features (cont'd). The MAP decision rule: classify to ω_1 if P(ω_1 | x) > P(ω_2 | x), otherwise ω_2. Equivalently, we have the decision function g(x) = ln [p(x | ω_1) P(ω_1)] − ln [p(x | ω_2) P(ω_2)] = Σ_{i=1}^d w_i x_i + w_0, where w_i = ln [p_i (1 − q_i) / (q_i (1 − p_i))] and w_0 = Σ_{i=1}^d ln [(1 − p_i) / (1 − q_i)] + ln [P(ω_1) / P(ω_2)]. If g(x) > 0, classify to ω_1; otherwise, classify to ω_2.
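The linear discriminant above can be verified against a direct likelihood-ratio computation. A sketch with illustrative values for p_i, q_i, and the priors:

```python
import numpy as np

# Illustrative assumptions: d = 3 binary features, equal priors.
p = np.array([0.8, 0.7, 0.6])    # p_i = Pr(x_i = 1 | w_1)
q = np.array([0.3, 0.4, 0.5])    # q_i = Pr(x_i = 1 | w_2)
prior1, prior2 = 0.5, 0.5

# Weights of the linear discriminant g(x) = sum_i w_i x_i + w_0.
w = np.log(p * (1.0 - q) / (q * (1.0 - p)))
w0 = np.log((1.0 - p) / (1.0 - q)).sum() + np.log(prior1 / prior2)

def g(x):
    """Linear discriminant; classify to w_1 iff g(x) > 0."""
    return w @ x + w0

def log_posterior_ratio(x):
    """Direct log [P(w1|x) / P(w2|x)] from the product densities, as a check."""
    l1 = np.prod(p ** x * (1.0 - p) ** (1.0 - x)) * prior1
    l2 = np.prod(q ** x * (1.0 - q) ** (1.0 - x)) * prior2
    return np.log(l1 / l2)
```

Algebraically g(x) and the log posterior ratio are identical, which is why the MAP rule reduces to a linear classifier for conditionally independent binary features.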
21 Missing features/data (I). If we know the full probability structure of a problem, we can construct the optimal Bayes decision rule. In some practical situations, however, we cannot observe the full feature vector described in the probability structure for some patterns: only partial information of the feature vector is observed, and some components are missing. How do we classify such corrupted inputs to obtain the minimum average error? Let the full feature vector be x = [x_g, x_b], where x_g represents the observed (good) features and x_b the missing (bad) ones. In this case, the optimal decision rule is constructed from the posterior P(ω_i | x_g) as follows: ω* = argmax_i P(ω_i | x_g).
22 Missing features/data (II). Here the posterior is obtained by marginalizing out the missing features: P(ω_i | x_g) = p(ω_i, x_g) / p(x_g) = ∫ p(ω_i, x_g, x_b) dx_b / ∫ p(x_g, x_b) dx_b = ∫ P(ω_i | x_g, x_b) p(x_g, x_b) dx_b / ∫ p(x_g, x_b) dx_b.
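For discrete features the marginalization above becomes a sum. A sketch with a small, illustrative joint table P(ω_i, x_g, x_b) over 2 classes and two binary features (one observed, one missing):

```python
import numpy as np

# Illustrative assumption: joint[i, g, b] = P(w_i, x_g = g, x_b = b)
# for 2 classes and binary features x_g (observed) and x_b (missing).
joint = np.array([[[0.05, 0.05], [0.20, 0.15]],
                  [[0.20, 0.15], [0.10, 0.10]]])
assert np.isclose(joint.sum(), 1.0)          # a valid joint distribution

def posterior_given_good(g):
    """P(w_i | x_g = g) = sum_b P(w_i, x_g, x_b) / sum_{i,b} P(w_i, x_g, x_b)."""
    marg = joint[:, g, :].sum(axis=1)        # sum out the missing x_b
    return marg / marg.sum()

def map_with_missing(g):
    """MAP decision using only the observed (good) feature."""
    return int(np.argmax(posterior_given_good(g)))
```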
23 The optimal Bayes decision rule is not achievable in practice. The optimal Bayes decision rule is not feasible in practice: in any practical problem, we cannot have complete knowledge of the problem; e.g., the class-conditional probability density functions p(x | ω_i) are always unavailable and extremely hard to estimate. However, it is possible to collect a set of sample data (a set of feature observations) for each class in question. The sample data are always far from enough to estimate a reliable p.d.f. from the sample data alone, e.g., with nonparametric methods such as sampling densities/histograms. Question: how do we build a reasonable classifier based on a limited set of sample data, instead of the true p.d.f.s?
24 Statistical Data Modeling. For any real problem, the true p.d.f.s are always unknown: neither the functional form nor the parameters. Our approach (statistical data modeling): based on the available sample data set, choose a proper statistical model to fit to the available data. Modeling stage: once the statistical model is selected, its functional form becomes known, except for a set of model parameters that remain unknown to us. Learning (training) stage: the unknown parameters are estimated by fitting the model to the data set according to some estimation criterion; the estimated statistical model (assumed model form + estimated parameters) gives a parametric p.d.f. that approximates the real but unknown p.d.f. of each class. Decision (test) stage: the estimated p.d.f.s are plugged into the optimal Bayes decision rule in place of the real p.d.f.s, giving the plug-in MAP decision rule. It is no longer optimal, but it performs reasonably well in practice; for theoretical analysis, see statistical learning theory.
25 Data modeling. [Figure: sample data for two classes with the estimated model p.d.f. p_λ1(x) for class 1 and p_λ2(x) for class 2.]
26 Plug-in MAP decision rule. Once the statistical models are estimated, they are treated as if they were the true distributions of the data, and they are plugged into the form of the optimal Bayes (MAP) decision rule in place of the unknown true p.d.f.s. The plug-in MAP decision rule: ω* = argmax_i P̂(ω_i | x) = argmax_i p(x | ω_i, Λ_i) P(ω_i | Γ), where Λ_i denotes the estimated parameters of the class-conditional model for class ω_i and Γ denotes the estimated prior parameters.
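The three stages (modeling, learning, decision) and the plug-in MAP rule can be sketched end-to-end for 1-D Gaussian class models; the synthetic training data below is an illustrative assumption:

```python
import numpy as np

# Illustrative assumption: synthetic training samples drawn from two
# 1-D Gaussian classes (means 0 and 3, unit variance).
rng = np.random.default_rng(0)
x0 = rng.normal(0.0, 1.0, 500)           # samples labeled class w_1
x1 = rng.normal(3.0, 1.0, 500)           # samples labeled class w_2

# Learning stage: maximum-likelihood estimates of the Gaussian parameters
# (sample mean / variance) and class-frequency estimates of the priors.
mu = np.array([x0.mean(), x1.mean()])
var = np.array([x0.var(), x1.var()])
prior = np.array([len(x0), len(x1)]) / (len(x0) + len(x1))

def plug_in_map(x):
    """Decision stage: argmax_i p_hat(x | w_i) * P_hat(w_i)."""
    dens = np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)
    return int(np.argmax(dens * prior))
```

With enough data the estimated parameters approach the true ones, and the plug-in rule approaches the optimal Bayes rule; with scarce data it can deviate substantially, which is exactly the gap studied by statistical learning theory.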
27 Some useful models (I). A proper model must be chosen based on the nature of the observation data. Some useful statistical models for a variety of data types: Normal (Gaussian) distribution: unimodal, continuous feature scalars. Multivariate normal (Gaussian) distribution: unimodal, continuous feature vectors. Gaussian mixture models (GMM): continuous feature scalars/vectors with a multimodal distribution; used for speaker recognition/verification (the distribution of speech features over a large population).
28 Some useful models (II). Markov chain model: discrete sequential data; the N-gram model in language modeling. Hidden Markov models (HMM): ideal for various kinds of sequential observation data; provide better modeling capability than the simple Markov chain model. Used to model speech signals for recognition (one of the most successful stories of data modeling), to model language/text data for part-of-speech tagging, shallow language understanding, etc., and to model biological data (DNA and protein sequences: profile HMMs), among many other application domains.
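As a small illustration of the Markov chain (bigram) model mentioned above, transition probabilities can be estimated from counts and used to score sequences. The toy two-symbol corpus below is an illustrative assumption; note that unseen transitions get probability zero here, whereas real N-gram models apply smoothing:

```python
import numpy as np

# Illustrative assumption: a toy corpus over the two symbols "a" and "b".
corpus = ["a b a b a", "a b b a", "b a b a b"]

states = ["a", "b"]
idx = {s: i for i, s in enumerate(states)}

# Count bigram transitions u -> v across the corpus.
counts = np.zeros((2, 2))
for sent in corpus:
    toks = sent.split()
    for u, v in zip(toks, toks[1:]):
        counts[idx[u], idx[v]] += 1

# Maximum-likelihood transition matrix: each row normalized to sum to one.
trans = counts / counts.sum(axis=1, keepdims=True)

def log_prob(seq):
    """log P(sequence) under the bigram model (ignoring the initial state)."""
    toks = seq.split()
    return sum(np.log(trans[idx[u], idx[v]]) for u, v in zip(toks, toks[1:]))
```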
29 Some useful models (III). Markov random field: multi-dimensional spatial data; used to model image data, e.g., for OCR (an HMM is a special case of a Markov random field). Graphical models (a.k.a. Bayesian networks, belief networks): high-dimensional data, discrete or continuous; used to model very complex stochastic processes and to automatically learn dependencies from data; used widely in machine learning and data mining (an HMM is also a special case of a graphical model). Neural networks and support vector machines (SVM) DON'T fit here: they do not model the distribution (p.d.f.) of the data directly; they are discriminative methods that model the boundaries between data sets.
30 Generative vs. discriminative models. The posterior probability P(ω | x) plays the key role in pattern classification. Generative models focus on the probability distribution of the data, p(x | ω), leading to the plug-in MAP rule. Discriminative models directly model the discriminant function: g(x) ~ P(ω | x).
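The generative/discriminative contrast can be made concrete: for two equal-variance 1-D Gaussian classes, the posterior P(ω_1 | x) computed generatively via Bayes' theorem is exactly a logistic (sigmoid) discriminant function of x, which is the form a discriminative model would fit directly. The means, variance, and priors below are illustrative assumptions:

```python
import numpy as np

# Illustrative assumptions: two 1-D Gaussian classes with shared variance.
m1, m2, var, prior1 = 0.0, 2.0, 1.0, 0.5

def generative_posterior(x):
    """P(w_1 | x) from p(x|w_i)P(w_i); the shared Gaussian normalizer cancels."""
    g1 = np.exp(-0.5 * (x - m1) ** 2 / var) * prior1
    g2 = np.exp(-0.5 * (x - m2) ** 2 / var) * (1.0 - prior1)
    return g1 / (g1 + g2)

def discriminative_posterior(x):
    """Sigmoid of a linear discriminant a(x) = w*x + b; the weights below
    are the log-odds of the generative model above worked out in closed form."""
    w = (m1 - m2) / var
    b = (m2 ** 2 - m1 ** 2) / (2.0 * var) + np.log(prior1 / (1.0 - prior1))
    return 1.0 / (1.0 + np.exp(-(w * x + b)))
```

A discriminative learner would estimate w and b directly from labeled data (as in logistic regression) without ever modeling p(x | ω).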
More informationOutline. Bayesian Networks: Maximum Likelihood Estimation and Tree Structure Learning. Our Model and Data. Outline
Outlne Bayesan Networks: Maxmum Lkelhood Estmaton and Tree Structure Learnng Huzhen Yu janey.yu@cs.helsnk.f Dept. Computer Scence, Unv. of Helsnk Probablstc Models, Sprng, 200 Notces: I corrected a number
More informationCommunication with AWGN Interference
Communcaton wth AWG Interference m {m } {p(m } Modulator s {s } r=s+n Recever ˆm AWG n m s a dscrete random varable(rv whch takes m wth probablty p(m. Modulator maps each m nto a waveform sgnal s m=m
More informationLinear Classification, SVMs and Nearest Neighbors
1 CSE 473 Lecture 25 (Chapter 18) Lnear Classfcaton, SVMs and Nearest Neghbors CSE AI faculty + Chrs Bshop, Dan Klen, Stuart Russell, Andrew Moore Motvaton: Face Detecton How do we buld a classfer to dstngush
More informationExcess Error, Approximation Error, and Estimation Error
E0 370 Statstcal Learnng Theory Lecture 10 Sep 15, 011 Excess Error, Approxaton Error, and Estaton Error Lecturer: Shvan Agarwal Scrbe: Shvan Agarwal 1 Introducton So far, we have consdered the fnte saple
More informationA Quadratic Cumulative Production Model for the Material Balance of Abnormally-Pressured Gas Reservoirs F.E. Gonzalez M.S.
Natural as Engneerng A Quadratc Cumulatve Producton Model for the Materal Balance of Abnormally-Pressured as Reservors F.E. onale M.S. Thess (2003) T.A. Blasngame, Texas A&M U. Deartment of Petroleum Engneerng
More informationExperimental Study on Classification
Chapter 7. Expermental Study on Classfcaton 7.1 Characterzaton of Explosve Materals 7.1.1 Atomc effect number and densty Theoretcally, most explosves fall wthn a relatvely narrow wndow n Z eff and n densty,
More information+, where 0 x N - n. k k
CO 745, Mdterm Len Cabrera. A multle choce eam has questons, each of whch has ossble answers. A student nows the correct answer to n of these questons. For the remanng - n questons, he checs the answers
More informationENG 8801/ Special Topics in Computer Engineering: Pattern Recognition. Memorial University of Newfoundland Pattern Recognition
EG 880/988 - Specal opcs n Computer Engneerng: Pattern Recognton Memoral Unversty of ewfoundland Pattern Recognton Lecture 7 May 3, 006 http://wwwengrmunca/~charlesr Offce Hours: uesdays hursdays 8:30-9:30
More informationA Quadratic Cumulative Production Model for the Material Balance of Abnormally-Pressured Gas Reservoirs F.E. Gonzalez M.S.
Formaton Evaluaton and the Analyss of Reservor Performance A Quadratc Cumulatve Producton Model for the Materal Balance of Abnormally-Pressured as Reservors F.E. onale M.S. Thess (2003) T.A. Blasngame,
More informationClustering & (Ken Kreutz-Delgado) UCSD
Clusterng & Unsupervsed Learnng Nuno Vasconcelos (Ken Kreutz-Delgado) UCSD Statstcal Learnng Goal: Gven a relatonshp between a feature vector x and a vector y, and d data samples (x,y ), fnd an approxmatng
More informationA Bayes Algorithm for the Multitask Pattern Recognition Problem Direct Approach
A Bayes Algorthm for the Multtask Pattern Recognton Problem Drect Approach Edward Puchala Wroclaw Unversty of Technology, Char of Systems and Computer etworks, Wybrzeze Wyspanskego 7, 50-370 Wroclaw, Poland
More informationOutline. EM Algorithm and its Applications. K-Means Classifier. K-Means Classifier (Cont.) Introduction of EM K-Means EM EM Applications.
EM Algorthm and ts Alcatons Y L Deartment of omuter Scence and Engneerng Unversty of Washngton utlne Introducton of EM K-Means EM EM Alcatons Image Segmentaton usng EM bect lass Recognton n BIR olor lusterng
More informationLinear Approximation with Regularization and Moving Least Squares
Lnear Approxmaton wth Regularzaton and Movng Least Squares Igor Grešovn May 007 Revson 4.6 (Revson : March 004). 5 4 3 0.5 3 3.5 4 Contents: Lnear Fttng...4. Weghted Least Squares n Functon Approxmaton...
More information( ) 2 ( ) ( ) Problem Set 4 Suggested Solutions. Problem 1
Problem Set 4 Suggested Solutons Problem (A) The market demand functon s the soluton to the followng utlty-maxmzaton roblem (UMP): The Lagrangean: ( x, x, x ) = + max U x, x, x x x x st.. x + x + x y x,
More informationA Mathematical Theory of Communication. Claude Shannon s paper presented by Kate Jenkins 2/19/00
A Mathematcal Theory of Communcaton Claude hannon s aer resented by Kate Jenkns 2/19/00 Publshed n two arts, July 1948 and October 1948 n the Bell ystem Techncal Journal Foundng aer of Informaton Theory
More informationConservative Surrogate Model using Weighted Kriging Variance for Sampling-based RBDO
9 th World Congress on Structural and Multdsclnary Otmzaton June 13-17, 011, Shzuoka, Jaan Conservatve Surrogate Model usng Weghted Krgng Varance for Samlng-based RBDO Lang Zhao 1, K.K. Cho, Ikn Lee 3,
More informationUsing Genetic Algorithms in System Identification
Usng Genetc Algorthms n System Identfcaton Ecaterna Vladu Deartment of Electrcal Engneerng and Informaton Technology, Unversty of Oradea, Unverstat, 410087 Oradea, Româna Phone: +40259408435, Fax: +40259408408,
More informationAN ASYMMETRIC GENERALIZED FGM COPULA AND ITS PROPERTIES
Pa. J. Statst. 015 Vol. 31(1), 95-106 AN ASYMMETRIC GENERALIZED FGM COPULA AND ITS PROPERTIES Berzadeh, H., Parham, G.A. and Zadaram, M.R. Deartment of Statstcs, Shahd Chamran Unversty, Ahvaz, Iran. Corresondng
More informationError Probability for M Signals
Chapter 3 rror Probablty for M Sgnals In ths chapter we dscuss the error probablty n decdng whch of M sgnals was transmtted over an arbtrary channel. We assume the sgnals are represented by a set of orthonormal
More informationIntroduction to Hidden Markov Models
Introducton to Hdden Markov Models Alperen Degrmenc Ths document contans dervatons and algorthms for mplementng Hdden Markov Models. The content presented here s a collecton of my notes and personal nsghts
More informationThe big picture. Outline
The bg pcture Vncent Claveau IRISA - CNRS, sldes from E. Kjak INSA Rennes Notatons classes: C = {ω = 1,.., C} tranng set S of sze m, composed of m ponts (x, ω ) per class ω representaton space: R d (=
More informationKernels in Support Vector Machines. Based on lectures of Martin Law, University of Michigan
Kernels n Support Vector Machnes Based on lectures of Martn Law, Unversty of Mchgan Non Lnear separable problems AND OR NOT() The XOR problem cannot be solved wth a perceptron. XOR Per Lug Martell - Systems
More informationQUANTITATIVE RISK MANAGEMENT TECHNIQUES USING INTERVAL ANALYSIS, WITH APPLICATIONS TO FINANCE AND INSURANCE
QANTITATIVE RISK MANAGEMENT TECHNIQES SING INTERVA ANAYSIS WITH APPICATIONS TO FINANCE AND INSRANCE Slva DED Ph.D. Bucharest nversty of Economc Studes Deartment of Aled Mathematcs; Romanan Academy Insttute
More informationExpectation Maximization Mixture Models HMMs
-755 Machne Learnng for Sgnal Processng Mture Models HMMs Class 9. 2 Sep 200 Learnng Dstrbutons for Data Problem: Gven a collecton of eamples from some data, estmate ts dstrbuton Basc deas of Mamum Lelhood
More information