AI*IA 2003 Fusion of Multiple Pattern Classifiers PART III
1 AI*IA 2003 Fusion of Multiple Pattern Classifiers, PART III. AI*IA 2003 Tutorial on Fusion of Multiple Pattern Classifiers by F. Roli.
2 Methods for fusing multiple classifiers. Methods for fusing multiple classifiers can be classified according to the type of information produced by the individual classifiers (Xu et al., 1992): Abstract-level outputs: each classifier outputs a unique class label for each input pattern. Rank-level outputs: each classifier outputs a ranked list of possible classes for each input pattern. Measurement-level outputs: each classifier outputs class confidence levels for each input pattern. For each of the above categories, methods can be further subdivided into: integration vs. selection rules, and fixed rules vs. trained rules.
3 Methods for fusing multiple classifiers. [diagram: taxonomy of fusion methods along three axes: output type (abstract-level, rank-level, measurement-level), integration vs. selection, and fixed vs. trained rules]
4 Fixed Rules at the abstract level: The Majority Voting Rule. Let us consider the N abstract ("crisp") classifier outputs S^(1), ..., S^(N) associated to the pattern x. Majority rule: class label c_i is assigned to the pattern x if c_i is the most frequent label in the crisp classifier outputs. Example (N = 3, Omega = {a, b}): Classifier 1 outputs S^(1) = a, Classifier 2 outputs S^(2) = b, Classifier 3 outputs S^(3) = a; the majority voting rule outputs a.
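The rule above amounts to taking the mode of the label multiset. A minimal sketch (the tie-breaking convention, first occurrence wins, is an assumption not specified on the slide):

```python
from collections import Counter

def majority_vote(labels):
    """Return the most frequent label among the crisp classifier outputs.

    Ties are broken by first occurrence, a common but arbitrary convention."""
    counts = Counter(labels)
    top = max(counts.values())
    for lab in labels:          # keep input order among tied winners
        if counts[lab] == top:
            return lab

# The slide's example: three classifiers output a, b, a
print(majority_vote(["a", "b", "a"]))  # -> a
```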
5 Majority Vote Rule. Usually N is odd. The frequency of the winner class must be greater than N/2. If the N classifiers make independent errors and each has the same error probability e < 0.5, then it can be shown that the error E of the majority voting rule is monotonically decreasing in N (Hansen and Salamon, 1990):

lim_{N -> inf} Sum_{k > N/2} C(N, k) e^k (1 - e)^(N - k) = 0

Clearly, the performance of majority voting quickly degrades for dependent classifiers.
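The binomial sum above is easy to evaluate numerically; the sketch below (function name is my own) shows the monotone decrease for e = 0.3:

```python
from math import comb

def majority_error(e, N):
    """Error probability of majority voting over N independent classifiers,
    each with individual error probability e (two-class case, N odd):
    the probability that more than N/2 classifiers err."""
    return sum(comb(N, k) * e**k * (1 - e)**(N - k)
               for k in range(N // 2 + 1, N + 1))

# With e < 0.5 the ensemble error decreases monotonically with N
errors = [majority_error(0.3, N) for N in (1, 3, 5, 7, 9)]
print(errors)
```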
6 Majority Vote Rule vs. Classifier Dependency: An Example. Three classifiers, two-class task. Classifiers C1 and C3 exhibit correlated errors. [table: abstract-level outputs of C1, C2 and C3, the majority class and the true class for a set of patterns, showing the majority vote failing where C1 and C3 err together]
7 Trainable Rules at the abstract level: Rules based on the Bayes Approach. These trained fusion rules are based on the Bayes approach. Pattern x is assigned to the class c_i if its posterior probability satisfies

P(c_i | S^(1), S^(2), ..., S^(N)) > P(c_j | S^(1), S^(2), ..., S^(N)) for all j != i, with S^(j) in {c_1, c_2, ..., c_k}

Fusion rules based on the Bayes formula try to estimate these posterior probabilities from an independent validation set. This fusion rule coincides with the multinomial statistical classifier, that is, the optimal statistical decision rule for discrete-valued feature vectors (Raudys and Roli, 2003).
8 Behaviour Knowledge Space (BKS). In the BKS method, every possible combination of abstract-level classifier outputs is regarded as a cell in a look-up table. Each cell contains the number of validation-set samples of each class that produce that particular combination of labels. A reject option based on a threshold is used to limit the error due to ambiguous cells. [table: cells indexed by the output triples (S^(1), S^(2), S^(3)) with per-class sample counts; a cell's decision is the class whose estimated posterior P(c_i | S^(1), S^(2), S^(3)) exceeds the threshold]
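The look-up table and the thresholded reject option can be sketched as follows (a minimal illustration with hypothetical toy data, not the tutorial's actual table; function names are my own):

```python
from collections import defaultdict, Counter

def build_bks(outputs, labels):
    """BKS look-up table: keys are tuples of abstract-level outputs,
    values count the validation-set class labels falling in that cell."""
    table = defaultdict(Counter)
    for out, lab in zip(outputs, labels):
        table[tuple(out)][lab] += 1
    return table

def bks_classify(table, out, threshold=0.5, reject=None):
    cell = table.get(tuple(out))
    if not cell:
        return reject  # empty cell: no evidence, reject
    lab, count = cell.most_common(1)[0]
    # reject ambiguous cells whose empirical posterior is below the threshold
    return lab if count / sum(cell.values()) >= threshold else reject

# hypothetical validation data: 3 classifiers, 2 classes
outs = [(1, 0, 1), (1, 0, 1), (1, 0, 1), (0, 0, 1)]
labs = [1, 1, 0, 0]
table = build_bks(outs, labs)
print(bks_classify(table, (1, 0, 1), threshold=0.6))  # -> 1 (posterior 2/3)
```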
9 BKS Small Sample Size Drawback. If k is the number of classes and N is the number of combined classifiers, BKS requires estimating posterior probabilities for k^N cells. k and N are two critical parameters for the BKS rule, because the number of posterior probabilities to estimate increases very quickly with them. If this number is too high, we can have serious problems, because the number of training samples is often small. The BKS rule suffers a lot from the small sample size problem.
10 BKS Improvements. In order to avoid the small sample size problem: (1) we can try to reduce the number of parameters to estimate (Xu et al., 1992; Kang and Lee, 1999). For example, under the assumption of classifier independence given the class (Xu et al., 1992):

P(S^(1), ..., S^(N) | c_i) = Prod_j P(S^(j) | c_i)

so that, by Bayes' formula, the posterior reduces to

P(c_i | S^(1), ..., S^(N)) proportional to P(c_i) Prod_j P(S^(j) | c_i)

(2) we can use noise injection to increase the sample size of the training set (Roli, Raudys, Marcialis, 2002).
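The conditional-independence factorization above is the combiner often called "naive Bayes". A minimal sketch estimating P(c) and P(S^(j) | c) from a validation set (function names and toy data are my own):

```python
from collections import Counter, defaultdict

def fit_nb_combiner(outputs, labels):
    """Estimate class priors and per-classifier conditional output
    distributions P(S^(j) | c) from a validation set."""
    priors = Counter(labels)
    cond = defaultdict(Counter)  # (classifier index j, class c) -> label counts
    for out, c in zip(outputs, labels):
        for j, s in enumerate(out):
            cond[(j, c)][s] += 1
    return priors, cond

def nb_combine(priors, cond, out):
    """Fuse by P(c) * Prod_j P(S^(j) | c), assuming conditional independence."""
    total = sum(priors.values())
    scores = {}
    for c, nc in priors.items():
        p = nc / total
        for j, s in enumerate(out):
            cj = cond[(j, c)]
            p *= cj[s] / max(sum(cj.values()), 1)
        scores[c] = p
    return max(scores, key=scores.get)

# hypothetical toy data: 2 classifiers, 2 classes
priors, cond = fit_nb_combiner([(0, 0), (0, 1), (1, 1), (1, 0)], [0, 0, 1, 1])
print(nb_combine(priors, cond, (1, 1)))  # -> 1
```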
11 BKS Improvements. If the number of classifiers is small, BKS cells can become large and contain vectors of different classes, that is, ambiguous cells can exist. [table: output triples (S^(1), S^(2), S^(3)) with per-class counts, showing an ambiguous cell populated by samples of both classes] Raudys and Roli (2003) proposed a method to address this issue.
12 Remarks on Abstract-Level Fusers. Abstract-level methods are the most general fusion rules: they can be applied to any ensemble of classifiers, even to classifiers of different types. The majority voting rule is the simplest combining method, which allows theoretical analyses (Lam and Suen, 1997). When prior performance is not considered, the requirements of time and memory are negligible. As we proceed from simple rules to adaptive (weighted voting) and trained (BKS, Bayesian rule) ones, the demands on time and memory quickly increase. Trained rules impose heavy demands on the quality and size of the data set.
13 Rank-Level Fusion Methods. Some classifiers provide class scores, or some sort of class probabilities. In general, if Omega = {c_1, ..., c_k} is the set of classes, these classifiers can provide an ordered (ranked) list of class labels. [diagram: each classifier converts its class scores into a ranking r of the classes] This information can be used to rank each class.
14 The Borda Count Method: An Example. Let N = 3 and k = 4, Omega = {a, b, c, d}. For a given pattern, the ranked outputs of the three classifiers are as follows:

Rank value | Classifier 1 | Classifier 2 | Classifier 3
4          | c            | a            | b
3          | b            | b            | a
2          | d            | d            | c
1          | a            | c            | d
15 The Borda Count Method: An Example. So we have:

r_a = r_a^(1) + r_a^(2) + r_a^(3) = 1 + 4 + 3 = 8
r_b = r_b^(1) + r_b^(2) + r_b^(3) = 3 + 3 + 4 = 10
r_c = r_c^(1) + r_c^(2) + r_c^(3) = 4 + 1 + 2 = 7
r_d = r_d^(1) + r_d^(2) + r_d^(3) = 2 + 2 + 1 = 5

The winner class is b because it has the maximum overall rank.
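The example can be reproduced with a short Borda count sketch (function name is my own):

```python
def borda(rankings):
    """rankings: one ordered list per classifier, best class first.
    With k classes, rank values k..1 are summed per class."""
    k = len(rankings[0])
    scores = {c: 0 for c in rankings[0]}
    for ranking in rankings:
        for pos, c in enumerate(ranking):
            scores[c] += k - pos
    return scores

# the three rankings from the example above (best class first)
scores = borda([["c", "b", "d", "a"],
                ["a", "b", "d", "c"],
                ["b", "a", "c", "d"]])
print(scores)                       # a: 8, b: 10, c: 7, d: 5
print(max(scores, key=scores.get))  # -> b
```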
16 Remarks on Rank-Level Methods. Advantages over the abstract level (majority vote): ranking is suitable in problems with many classes, where the correct class may often appear near the top of the list, although not at the top. Example: word recognition with a sizeable lexicon. Advantages over the measurement level: rankings can be preferred to soft outputs to avoid lack of consistency when using different classifiers; rankings can be preferred to soft outputs to simplify the combiner design. Drawbacks: rank-level methods are not supported by clear theoretical underpinnings; results depend on the scale of numbers assigned to the choices.
17 Measurement-Level Fusion Methods. [diagram: N classifiers each output a vector of k soft scores p_i^(j)(x), i = 1, ..., k; a fusion rule combines them and the fused class is arg max_i p_i(x)] Normalization of classifier outputs is not a trivial task when combining classifiers with different output ranges and different output types (e.g., distances vs. membership values).
18 Linear Combiners. Simple and weighted averaging of classifier outputs:

p_i^ave(x) = Sum_{k=1}^{N} w_k p_i^(k)(x)

with w_k = 1/N for the simple average. Simple averaging is the optimal fuser for classifiers with the same accuracy and the same pair-wise correlations (Roli and Fumera). Weighted averaging is required for imbalanced classifiers, that is, classifiers with different accuracies and/or different pair-wise correlations (Roli and Fumera). The improvement of weighted averaging over simple averaging has been investigated theoretically and by experiments (Roli and Fumera).
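Both variants of the linear combiner above can be sketched in a few lines (function name and toy scores are my own):

```python
def average_fuse(soft_outputs, weights=None):
    """soft_outputs: one score vector per classifier (one entry per class).
    Simple average if weights is None, otherwise weighted average."""
    N = len(soft_outputs)
    k = len(soft_outputs[0])
    if weights is None:
        weights = [1.0 / N] * N
    return [sum(w * p[i] for w, p in zip(weights, soft_outputs))
            for i in range(k)]

# three classifiers, two classes (hypothetical soft outputs)
outs = [[0.6, 0.4], [0.2, 0.8], [0.7, 0.3]]
print(average_fuse(outs))                      # simple average, ~[0.5, 0.5]
print(average_fuse(outs, [0.5, 0.25, 0.25]))   # weighted average
```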
19 Bias/Variance Analysis in Linear Combiners. The added error over the Bayes error can be reduced by linear combinations of classifier outputs (Tumer and Ghosh; Roli and Fumera). [figure: estimated posteriors p_i(x) and p_j(x) near the class boundary; the obtained boundary x_b deviates from the optimum boundary x* between decision regions D_i and D_j, producing an added error region on top of the Bayes error]
20 Bias/Variance Analysis in Linear Combiners: Correlated and Unbiased Classifiers. Added error of the simple average of N classifiers (Tumer and Ghosh):

E_add^ave(N) = ((1 + delta(N - 1)) / N) E_add

The reduction of the variance component of the added error achieved by simple averaging depends on the correlation factor delta between the estimation errors. Negatively correlated estimation errors allow a greater improvement than independent errors.
21 Bias/Variance Analysis in Linear Combiners: Correlated and Biased Classifiers. Added error of the simple average of N classifiers (Tumer and Ghosh):

E_add^ave = ((1 + delta(N - 1)) / N) sigma^2 / (2s) + (s / 2) (beta_i + beta_j)^2

Simple averaging is effective for reducing the variance component, but not for the bias component. So, individual classifiers with low biases should be preferred. An analysis of bias/variance for the weighted average can be found in (Fumera and Roli).
22 Product and Order Statistics Fusers. Fusers based on order statistics operators are max, min and med:

p_i^max(x) = max_{j=1,...,N} p_i^(j)(x),  i = 1, ..., k
p_i^min(x) = min_{j=1,...,N} p_i^(j)(x),  i = 1, ..., k
p_i^med(x) = median_{j=1,...,N} p_i^(j)(x),  i = 1, ..., k (the mean of the two middle values if N is even, the middle value if N is odd)

The product fuser is

p_i^prod(x) = Prod_{j=1}^{N} p_i^(j)(x),  i = 1, ..., k

The product is obviously sensitive to classifiers outputting probability estimates close to zero ("overconfident" classifiers).
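A compact sketch of the four fusers above (function name and toy scores are my own); note how the product is dragged down by any near-zero score:

```python
import statistics

def os_fuse(soft_outputs, op):
    """Fuse per-class scores with an order-statistics operator or the product.
    op in {'max', 'min', 'med', 'prod'}."""
    k = len(soft_outputs[0])
    fns = {"max": max, "min": min, "med": statistics.median}
    fused = []
    for i in range(k):
        col = [p[i] for p in soft_outputs]  # scores for class i
        if op == "prod":
            v = 1.0
            for x in col:
                v *= x
        else:
            v = fns[op](col)
        fused.append(v)
    return fused

outs = [[0.6, 0.4], [0.2, 0.8], [0.7, 0.3]]
print(os_fuse(outs, "max"))   # [0.7, 0.8]
print(os_fuse(outs, "med"))   # [0.6, 0.4]
print(os_fuse(outs, "prod"))  # small values: one low score dominates
```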
23 A Theoretical Framework. A theoretical framework for the product, min, med and max fusers was established by J. Kittler et al. (1998). These rules can be formally derived assuming that the N individual classifiers use distinct feature vectors X_1, X_2, ..., X_N, so that the fused posterior is P(c_i | X_1, X_2, ..., X_N): the product and min rules are derived under the hypothesis that the classifiers are conditionally statistically independent; the sum, max and median rules are derived under the further hypothesis that the a posteriori probabilities estimated by the classifiers do not deviate significantly from the class prior probabilities.
24 Fusers with Weights. A natural extension of many fixed rules is the introduction of weights on the outputs of classifiers (weighted average, weighted majority vote). A simple criterion is to introduce weights proportional to the accuracy of each individual classifier. Weights related to classes can also be computed by extracting them from the confusion matrix of each classifier. Methods for robust weight estimation are a matter of ongoing research.
25 Stacked Fusion. The k soft outputs of the N individual classifiers, p_ij(x), i = 1, ..., k, j = 1, ..., N, can be considered as features of a new classification problem (the classifier-output feature space). Classifiers can be regarded as feature extractors! Another classifier can then be used as the fuser: this is the so-called stacked approach (D. H. Wolpert, 1992), also called meta-classification (Giacinto and Roli, 1997) or the brute-force approach (L. I. Kuncheva, 2000). To train the meta-classifier, the outputs of the N individual classifiers on an independent validation set must be used ("experts boasting" issue; Raudys, 2003). An independent validation set is required.
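Building the classifier-output feature space is just a concatenation of the N soft output vectors per pattern; a minimal sketch (function name and toy scores are my own, and any classifier could then be trained on the resulting k*N-dimensional vectors):

```python
def stack_features(classifier_outputs):
    """classifier_outputs: per pattern, a list of N soft output vectors
    (k scores each). Returns one k*N-dimensional meta-feature vector
    per pattern, to be fed to the meta-classifier."""
    return [sum((list(out) for out in per_pattern), [])
            for per_pattern in classifier_outputs]

# two patterns, N = 2 classifiers, k = 2 classes (hypothetical scores)
meta = stack_features([[[0.9, 0.1], [0.8, 0.2]],
                       [[0.3, 0.7], [0.4, 0.6]]])
print(meta)  # [[0.9, 0.1, 0.8, 0.2], [0.3, 0.7, 0.4, 0.6]]
```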
26 Pros and Cons of the Stacked Approach. Pros: the meta-classifier can work in an enriched feature space; no classifier dependency model is assumed. Cons: the dimensionality of the output space increases very fast with the number of classes and classifiers; the meta-classifier should be trained with a data set different from the one used for the individual classifiers (the "experts boasting" issue); the space of classifier outputs might not be well behaved.
27 Classifier Selection. Goal: for each input pattern, select the classifier, if any, able to classify it correctly. Problem formulation: a two-dimensional feature space partitioned into four regions R1, R2, R3, R4. It is easy to see that selection outperforms the individual classifiers if we are able to select the best classifier for each region. Two critical issues: (i) definition of the regions; (ii) the selection algorithm.
28 Classifier Selection. There are two main approaches to designing a classifier selection system: static vs. dynamic selection. Static selection: regions are defined before classification, and for each region a responsible classifier is identified. Regions can be defined in different ways: the histogram method (space partition into bins); clustering algorithms.
29 Dynamic Selection. Selection conditions are based on estimates of classifier accuracies in local regions of the feature space surrounding an unknown test pattern X (Neighbourhood(X)). [figure: validation patterns of two classes in the feature space (X1, X2), with the test pattern X and its neighbourhood highlighted] See the works of Woods et al. and of Giacinto and Roli.
30 Selection Condition: An Example. [figure: validation patterns around the test pattern X and its Neighbourhood(X), as in the previous slide] Woods et al. (1997) simply computed the classifier local accuracy as the percentage of correctly classified patterns in the neighbourhood. Giacinto and Roli (1997 and later) proposed probabilistic measures, e.g., for classifier j and the N_W validation patterns X_n in the neighbourhood of X*:

P_hat_j(correct | X*) = ( Sum_{n=1}^{N_W} P_hat_j(c_i | X_n) W_n ) / ( Sum_{n=1}^{N_W} W_n )

where c_i is the class assigned to X* by classifier j and the weights W_n account for the distance of X_n from X*.
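The Woods et al. style condition (local accuracy as the fraction of correctly classified neighbours) can be sketched as follows; this is a simplified illustration with my own function name and toy one-dimensional data, not the authors' implementation:

```python
import math

def local_accuracy_select(x, validation, k=3):
    """Dynamic classifier selection sketch: return the index of the
    classifier with the highest accuracy over the k validation patterns
    nearest to x. validation holds (pattern, true_label, predictions)
    triples, where predictions is one crisp label per classifier."""
    neigh = sorted(validation, key=lambda v: math.dist(v[0], x))[:k]
    n_clf = len(validation[0][2])
    accs = [sum(preds[j] == lab for _, lab, preds in neigh) / k
            for j in range(n_clf)]
    return max(range(n_clf), key=lambda j: accs[j])

# toy 1-D example: classifier 0 is accurate on the left of the feature
# space, classifier 1 on the right
val = [((0.0,), 0, [0, 1]), ((0.1,), 0, [0, 1]), ((0.2,), 0, [0, 0]),
       ((0.9,), 1, [0, 1]), ((1.0,), 1, [0, 1]), ((1.1,), 1, [1, 1])]
print(local_accuracy_select((0.05,), val))  # -> 0
print(local_accuracy_select((0.95,), val))  # -> 1
```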
31 Some Remarks on Classifier Selection. Theoretically, local fusion rules can outperform global ones. Selection can effectively handle correlated classifiers. In practice, large and representative data sets are necessary to design a good selection system. Further work is necessary to develop robust methods for estimating classifier local accuracy. Works on adaptive mixtures of local experts (M. Jordan et al.), not discussed for the sake of brevity, can be regarded as methods for dynamic classifier selection.
32 Final Remarks on Fixed vs. Trained Fusers. Fixed rules: simplicity; low memory and time requirements; well suited for ensembles of classifiers with independent or weakly correlated errors and similar performances. Trained rules: flexibility, with potentially better performance than fixed rules; trained rules are claimed to be more suitable than fixed ones for classifiers that are correlated or exhibit different performances; high memory and time requirements; heavy demands on the quality and size of the training set.
33 MCS DESIGN. Parts 2 and 3 showed that the designer of an MCS has a toolbox containing a large number of instruments for generating and fusing classifiers. Two main design approaches have been defined so far: coverage optimisation methods and decision optimisation methods. However, combinations of the two and hybrid, ad hoc methods are often used.
34 MCS DESIGN. [diagram: feature design, ensemble design, classifier design and fuser design stages, each with performance evaluation feedback] Coverage optimisation methods: a simple fuser is given without any design; the goal is to create a set of complementary classifiers that can be fused optimally (bagging, random subspace, etc.). Decision optimisation methods: a set of carefully designed and optimised classifiers is given and unchangeable; the goal is to optimise the fuser.
35 MCS DESIGN. The decision optimisation approach to MCS design is often used when previously, carefully designed classifiers are available, or when valid problem and designer knowledge is available. The coverage optimisation approach makes sense when creating carefully designed, strong classifiers is difficult or time consuming. Integration of the two basic approaches is often used. However, no design method guarantees the optimal ensemble for a given fuser or a given application (Roli and Giacinto, 2002). The best MCS can only be determined by performance evaluation.
More informationHow to Estimate Expected Shortfall When Probabilities Are Known with Interval or Fuzzy Uncertainty
How to Estimate Exected Shortfall When Probabilities Are Known with Interval or Fuzzy Uncertainty Christian Servin Information Technology Deartment El Paso Community College El Paso, TX 7995, USA cservin@gmail.com
More informationSTA 250: Statistics. Notes 7. Bayesian Approach to Statistics. Book chapters: 7.2
STA 25: Statistics Notes 7. Bayesian Aroach to Statistics Book chaters: 7.2 1 From calibrating a rocedure to quantifying uncertainty We saw that the central idea of classical testing is to rovide a rigorous
More informationInformation collection on a graph
Information collection on a grah Ilya O. Ryzhov Warren Powell October 25, 2009 Abstract We derive a knowledge gradient olicy for an otimal learning roblem on a grah, in which we use sequential measurements
More informationOne step ahead prediction using Fuzzy Boolean Neural Networks 1
One ste ahead rediction using Fuzzy Boolean eural etworks 1 José A. B. Tomé IESC-ID, IST Rua Alves Redol, 9 1000 Lisboa jose.tome@inesc-id.t João Paulo Carvalho IESC-ID, IST Rua Alves Redol, 9 1000 Lisboa
More informationNotes on Instrumental Variables Methods
Notes on Instrumental Variables Methods Michele Pellizzari IGIER-Bocconi, IZA and frdb 1 The Instrumental Variable Estimator Instrumental variable estimation is the classical solution to the roblem of
More informationJohn Weatherwax. Analysis of Parallel Depth First Search Algorithms
Sulementary Discussions and Solutions to Selected Problems in: Introduction to Parallel Comuting by Viin Kumar, Ananth Grama, Anshul Guta, & George Karyis John Weatherwax Chater 8 Analysis of Parallel
More informationDr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur
Analysis of Variance and Design of Exeriment-I MODULE II LECTURE -4 GENERAL LINEAR HPOTHESIS AND ANALSIS OF VARIANCE Dr. Shalabh Deartment of Mathematics and Statistics Indian Institute of Technology Kanur
More informationOptimal Recognition Algorithm for Cameras of Lasers Evanescent
Otimal Recognition Algorithm for Cameras of Lasers Evanescent T. Gaudo * Abstract An algorithm based on the Bayesian aroach to detect and recognise off-axis ulse laser beams roagating in the atmoshere
More informationA New Asymmetric Interaction Ridge (AIR) Regression Method
A New Asymmetric Interaction Ridge (AIR) Regression Method by Kristofer Månsson, Ghazi Shukur, and Pär Sölander The Swedish Retail Institute, HUI Research, Stockholm, Sweden. Deartment of Economics and
More informationAggregate Prediction With. the Aggregation Bias
100 Aggregate Prediction With Disaggregate Models: Behavior of the Aggregation Bias Uzi Landau, Transortation Research nstitute, Technion-srael nstitute of Technology, Haifa Disaggregate travel demand
More informationLearning Sequence Motif Models Using Gibbs Sampling
Learning Sequence Motif Models Using Gibbs Samling BMI/CS 776 www.biostat.wisc.edu/bmi776/ Sring 2018 Anthony Gitter gitter@biostat.wisc.edu These slides excluding third-arty material are licensed under
More informationOn a Markov Game with Incomplete Information
On a Markov Game with Incomlete Information Johannes Hörner, Dinah Rosenberg y, Eilon Solan z and Nicolas Vieille x{ January 24, 26 Abstract We consider an examle of a Markov game with lack of information
More informationLower Confidence Bound for Process-Yield Index S pk with Autocorrelated Process Data
Quality Technology & Quantitative Management Vol. 1, No.,. 51-65, 15 QTQM IAQM 15 Lower onfidence Bound for Process-Yield Index with Autocorrelated Process Data Fu-Kwun Wang * and Yeneneh Tamirat Deartment
More informationEstimation of component redundancy in optimal age maintenance
EURO MAINTENANCE 2012, Belgrade 14-16 May 2012 Proceedings of the 21 st International Congress on Maintenance and Asset Management Estimation of comonent redundancy in otimal age maintenance Jorge ioa
More informationAn Ant Colony Optimization Approach to the Probabilistic Traveling Salesman Problem
An Ant Colony Otimization Aroach to the Probabilistic Traveling Salesman Problem Leonora Bianchi 1, Luca Maria Gambardella 1, and Marco Dorigo 2 1 IDSIA, Strada Cantonale Galleria 2, CH-6928 Manno, Switzerland
More informationState Estimation with ARMarkov Models
Deartment of Mechanical and Aerosace Engineering Technical Reort No. 3046, October 1998. Princeton University, Princeton, NJ. State Estimation with ARMarkov Models Ryoung K. Lim 1 Columbia University,
More informationEmpirical Bayesian EM-based Motion Segmentation
Emirical Bayesian EM-based Motion Segmentation Nuno Vasconcelos Andrew Liman MIT Media Laboratory 0 Ames St, E5-0M, Cambridge, MA 09 fnuno,lig@media.mit.edu Abstract A recent trend in motion-based segmentation
More informationConvolutional Codes. Lecture 13. Figure 93: Encoder for rate 1/2 constraint length 3 convolutional code.
Convolutional Codes Goals Lecture Be able to encode using a convolutional code Be able to decode a convolutional code received over a binary symmetric channel or an additive white Gaussian channel Convolutional
More informationPrincipal Components Analysis and Unsupervised Hebbian Learning
Princial Comonents Analysis and Unsuervised Hebbian Learning Robert Jacobs Deartment of Brain & Cognitive Sciences University of Rochester Rochester, NY 1467, USA August 8, 008 Reference: Much of the material
More informationRUN-TO-RUN CONTROL AND PERFORMANCE MONITORING OF OVERLAY IN SEMICONDUCTOR MANUFACTURING. 3 Department of Chemical Engineering
Coyright 2002 IFAC 15th Triennial World Congress, Barcelona, Sain RUN-TO-RUN CONTROL AND PERFORMANCE MONITORING OF OVERLAY IN SEMICONDUCTOR MANUFACTURING C.A. Bode 1, B.S. Ko 2, and T.F. Edgar 3 1 Advanced
More informationCHAPTER 5 STATISTICAL INFERENCE. 1.0 Hypothesis Testing. 2.0 Decision Errors. 3.0 How a Hypothesis is Tested. 4.0 Test for Goodness of Fit
Chater 5 Statistical Inference 69 CHAPTER 5 STATISTICAL INFERENCE.0 Hyothesis Testing.0 Decision Errors 3.0 How a Hyothesis is Tested 4.0 Test for Goodness of Fit 5.0 Inferences about Two Means It ain't
More informationSTABILITY ANALYSIS TOOL FOR TUNING UNCONSTRAINED DECENTRALIZED MODEL PREDICTIVE CONTROLLERS
STABILITY ANALYSIS TOOL FOR TUNING UNCONSTRAINED DECENTRALIZED MODEL PREDICTIVE CONTROLLERS Massimo Vaccarini Sauro Longhi M. Reza Katebi D.I.I.G.A., Università Politecnica delle Marche, Ancona, Italy
More informationPlotting the Wilson distribution
, Survey of English Usage, University College London Setember 018 1 1. Introduction We have discussed the Wilson score interval at length elsewhere (Wallis 013a, b). Given an observed Binomial roortion
More informationThe analysis and representation of random signals
The analysis and reresentation of random signals Bruno TOÉSNI Bruno.Torresani@cmi.univ-mrs.fr B. Torrésani LTP Université de Provence.1/30 Outline 1. andom signals Introduction The Karhunen-Loève Basis
More informationBayesian Spatially Varying Coefficient Models in the Presence of Collinearity
Bayesian Satially Varying Coefficient Models in the Presence of Collinearity David C. Wheeler 1, Catherine A. Calder 1 he Ohio State University 1 Abstract he belief that relationshis between exlanatory
More informationarxiv: v2 [stat.ml] 7 May 2015
Semi-Orthogonal Multilinear PCA with Relaxed Start Qiuan Shi and Haiing Lu Deartment of Comuter Science Hong Kong Batist University, Hong Kong, China csshi@com.hkbu.edu.hk, haiing@hkbu.edu.hk arxiv:1504.08142v2
More informationOn parameter estimation in deformable models
Downloaded from orbitdtudk on: Dec 7, 07 On arameter estimation in deformable models Fisker, Rune; Carstensen, Jens Michael Published in: Proceedings of the 4th International Conference on Pattern Recognition
More informationarxiv: v3 [physics.data-an] 23 May 2011
Date: October, 8 arxiv:.7v [hysics.data-an] May -values for Model Evaluation F. Beaujean, A. Caldwell, D. Kollár, K. Kröninger Max-Planck-Institut für Physik, München, Germany CERN, Geneva, Switzerland
More informationNamed Entity Recognition using Maximum Entropy Model SEEM5680
Named Entity Recognition using Maximum Entroy Model SEEM5680 Named Entity Recognition System Named Entity Recognition (NER): Identifying certain hrases/word sequences in a free text. Generally it involves
More informationPARTIAL FACE RECOGNITION: A SPARSE REPRESENTATION-BASED APPROACH. Luoluo Liu, Trac D. Tran, and Sang Peter Chin
PARTIAL FACE RECOGNITION: A SPARSE REPRESENTATION-BASED APPROACH Luoluo Liu, Trac D. Tran, and Sang Peter Chin Det. of Electrical and Comuter Engineering, Johns Hokins Univ., Baltimore, MD 21218, USA {lliu69,trac,schin11}@jhu.edu
More informationPreconditioning techniques for Newton s method for the incompressible Navier Stokes equations
Preconditioning techniques for Newton s method for the incomressible Navier Stokes equations H. C. ELMAN 1, D. LOGHIN 2 and A. J. WATHEN 3 1 Deartment of Comuter Science, University of Maryland, College
More informationTests for Two Proportions in a Stratified Design (Cochran/Mantel-Haenszel Test)
Chater 225 Tests for Two Proortions in a Stratified Design (Cochran/Mantel-Haenszel Test) Introduction In a stratified design, the subects are selected from two or more strata which are formed from imortant
More informationAn Improved Calibration Method for a Chopped Pyrgeometer
96 JOURNAL OF ATMOSPHERIC AND OCEANIC TECHNOLOGY VOLUME 17 An Imroved Calibration Method for a Choed Pyrgeometer FRIEDRICH FERGG OtoLab, Ingenieurbüro, Munich, Germany PETER WENDLING Deutsches Forschungszentrum
More informationOutline. EECS150 - Digital Design Lecture 26 Error Correction Codes, Linear Feedback Shift Registers (LFSRs) Simple Error Detection Coding
Outline EECS150 - Digital Design Lecture 26 Error Correction Codes, Linear Feedback Shift Registers (LFSRs) Error detection using arity Hamming code for error detection/correction Linear Feedback Shift
More informationUsing the Divergence Information Criterion for the Determination of the Order of an Autoregressive Process
Using the Divergence Information Criterion for the Determination of the Order of an Autoregressive Process P. Mantalos a1, K. Mattheou b, A. Karagrigoriou b a.deartment of Statistics University of Lund
More informationSlides Prepared by JOHN S. LOUCKS St. Edward s s University Thomson/South-Western. Slide
s Preared by JOHN S. LOUCKS St. Edward s s University 1 Chater 11 Comarisons Involving Proortions and a Test of Indeendence Inferences About the Difference Between Two Poulation Proortions Hyothesis Test
More informationSingularity Issues in Fixture Fault Diagnosis for Multi-Station Assembly Processes
Yu Ding Abhishek Guta Deartment of Industrial Engineering, Texas A& University, TAU 33 College Station, TX 77843-33 Daniel W. Aley Deartment of Industrial Engineering and anagement Sciences, Northwestern
More informationTowards understanding the Lorenz curve using the Uniform distribution. Chris J. Stephens. Newcastle City Council, Newcastle upon Tyne, UK
Towards understanding the Lorenz curve using the Uniform distribution Chris J. Stehens Newcastle City Council, Newcastle uon Tyne, UK (For the Gini-Lorenz Conference, University of Siena, Italy, May 2005)
More informationFuzzy Methods. Additions to Chapter 5: Fuzzy Arithmetic. Michael Hanss.
Fuzzy Methods Additions to Chater 5: Fuzzy Arithmetic Michael Hanss Part I: A short review of the Institute of Engineering Comutational Mechanics University of Stuttgart Germany Examle : q = f( ) = 2 2
More informationSampling. Inferential statistics draws probabilistic conclusions about populations on the basis of sample statistics
Samling Inferential statistics draws robabilistic conclusions about oulations on the basis of samle statistics Probability models assume that every observation in the oulation is equally likely to be observed
More informationFault Tolerant Quantum Computing Robert Rogers, Thomas Sylwester, Abe Pauls
CIS 410/510, Introduction to Quantum Information Theory Due: June 8th, 2016 Sring 2016, University of Oregon Date: June 7, 2016 Fault Tolerant Quantum Comuting Robert Rogers, Thomas Sylwester, Abe Pauls
More informationIntroduction to Probability and Statistics
Introduction to Probability and Statistics Chater 8 Ammar M. Sarhan, asarhan@mathstat.dal.ca Deartment of Mathematics and Statistics, Dalhousie University Fall Semester 28 Chater 8 Tests of Hyotheses Based
More informationSemi-Orthogonal Multilinear PCA with Relaxed Start
Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence (IJCAI 2015) Semi-Orthogonal Multilinear PCA with Relaxed Start Qiuan Shi and Haiing Lu Deartment of Comuter Science
More information