Feature Selection. Pattern Recognition X. Michal Haindl.
Feature Selection
Pattern Recognition X

Michal Haindl
Faculty of Information Technology, KTI, Czech Technical University in Prague
Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic, Prague, Czech Republic
January 16, 2012
European Social Fund. Prague & EU: Investing in your future. (MI-ROZ)
© M. Haindl, MI-ROZ

Outline

- motivation
  - technical
    - recognition problem dimensionality reduction ↓ with class separability increase ↑
    - data compression (e.g. required communication channel capacity)
    - for a given amount of data, number of features ↓ ⇒ performance estimate accuracy ↑
  - physical
    - physical measurement (e.g. RS soil moisture, vegetation cover)
    - data enhancement
Feature Selection / Extraction

sensor → feature selector/extractor → classifier

- feature selection (FS): some information is discarded; choose the best subset Ẍ of l̃ < l original features,

    J(Ẍ) = max J(X̃)  over all candidate subsets X̃

- feature extraction (FE): all information is used, compression by mapping

    Ẍ = Φ(X),  Φ: ℝ^l → ℝ^l̃,  l̃ < l

- an effective mathematical theory exists only for linear transformations of Gaussian data
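The contrast between the two options can be illustrated in code. The following is a minimal sketch, not part of the original slides: the helper names `select_features` and `extract_features` are hypothetical, a summed-variance criterion stands in for a generic J, and PCA stands in as one concrete linear mapping Φ.

```python
import itertools
import numpy as np

def select_features(X, criterion, l_tilde):
    """Feature selection: keep the subset of l_tilde original features
    maximizing the criterion J; information in the discarded features is lost."""
    l = X.shape[1]
    best_subset, best_score = None, -np.inf
    for subset in itertools.combinations(range(l), l_tilde):
        score = criterion(X[:, subset])
        if score > best_score:
            best_subset, best_score = subset, score
    return best_subset, best_score

def extract_features(X, l_tilde):
    """Linear feature extraction (PCA as the mapping Phi): every original
    feature contributes to the new ones -- compression by mapping."""
    Xc = X - X.mean(axis=0)
    # eigen-decomposition of the sample covariance, sorted by decreasing eigenvalue
    w, V = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(w)[::-1][:l_tilde]
    return Xc @ V[:, order]          # Phi(X): R^l -> R^l_tilde
```

The exhaustive loop in `select_features` is only feasible for small l; the later slides quantify how quickly the number of subsets grows.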
Feature Selection / Extraction (cont.)

specification of:
- the feature evaluation criterion J(X)
- the dimensionality of the feature space l̃
- the optimization procedure
- for FE: the form of the mapping Φ(X) (the extractor)

J(X) is defined in terms of the unknown model characteristics P(ω_i), p(X|ω_i), so their estimates must be used.

error sources:
- suboptimal criterion functions
- suboptimal search strategies
- pdf estimation errors (small sample size)
- numerical errors
- fitting errors

FE: performance optimization and measurement cost reduction, but no direct relation with the classification error and no physical feature interpretation.

feature selection approaches:
- entropies
- feature-set search algorithms
- Monte Carlo techniques (simulated annealing, genetic algorithms)
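The point that J(X) must rest on estimates of P(ω_i) and p(X|ω_i) can be made concrete. Below is a sketch under the assumption of Gaussian class-conditional densities; the helper name `estimate_class_models` is hypothetical. With few samples per class, the covariance estimates are noisy, which is exactly the "pdf estimation errors (small sample size)" item above.

```python
import numpy as np

def estimate_class_models(X, y):
    """Estimate the unknown model characteristics P(omega_i), p(x|omega_i)
    (assumed Gaussian) from labelled training data (X, y)."""
    models = {}
    n = len(y)
    for c in np.unique(y):
        Xc = X[y == c]
        models[c] = {
            "prior": len(Xc) / n,             # estimate of P(omega_i)
            "mean": Xc.mean(axis=0),          # estimate of mu_i
            "cov": np.cov(Xc, rowvar=False),  # estimate of Sigma_i
        }
    return models
```

Any criterion J (e.g. the Gaussian distance measures on a later slide) is then evaluated on these estimated quantities rather than the true ones.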
Probabilistic Distance Measures

For K = 2 classes the Bayes error is

    P(error) = (1/2) [ 1 − ∫ |P(ω_1) p(X|ω_1) − P(ω_2) p(X|ω_2)| dX ]

and P(error) is maximal if the densities p(X|ω_i) completely overlap.

Similarly, any measure between two pdfs of the form

    J(Ẍ) = ∫ g(P(ω_i), p(X|ω_i), i = 1, 2) dX

satisfying
- J ≥ 0,
- J = 0 for completely overlapping p(X|ω_i),
- J = max for non-overlapping p(X|ω_i),

can be used for feature selection.

If J can be expressed in the form of an averaged f-divergence, i.e.

    J_F = ∫ f( P(ω_1|X) / P(ω_2|X) ) P(ω_2|X) p(X) dX

with f(s) a convex function and f_∞ = lim_{s→∞} f(s)/s, then

    P(error) < [ f(0) P(ω_2) + f_∞ P(ω_1) − J_F ] / [ f(0) + f_∞ − f(1) ].

Examples are the averaged divergence and the averaged Matusita distance J_T; for the latter

    f(s) = (√s − 1)²,  J_F = J_T²,  f(0) = 1,  f(1) = 0,  f_∞ = 1,

so the bound becomes

    P(error) ≤ (1/2) (1 − J_T²).
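The Matusita bound can be checked numerically. The sketch below (an illustration, not from the slides) assumes two 1-D Gaussian classes with equal priors and unit variance; it computes the Bayes error from the K = 2 formula and J_T² as ∫ (√(P(ω_1)p(x|ω_1)) − √(P(ω_2)p(x|ω_2)))² dx by grid integration, then verifies P(error) ≤ (1/2)(1 − J_T²).

```python
import numpy as np

def gauss(x, mu, sigma):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# two-class problem: equal priors, means -1 and +1, unit variance (assumed example)
P1, P2 = 0.5, 0.5
x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]
g1 = P1 * gauss(x, -1.0, 1.0)        # P(omega_1) p(x|omega_1)
g2 = P2 * gauss(x, 1.0, 1.0)         # P(omega_2) p(x|omega_2)

# Bayes error: P(error) = 1/2 [1 - integral |g1 - g2| dx]
p_err = 0.5 * (1.0 - np.abs(g1 - g2).sum() * dx)   # ~ 0.1587 = Phi(-1) here

# squared averaged Matusita distance J_T^2
jt2 = ((np.sqrt(g1) - np.sqrt(g2)) ** 2).sum() * dx

bound = 0.5 * (1.0 - jt2)
assert p_err <= bound
```

For this configuration the bound (about 0.30) is loose compared with the true error (about 0.16), which is typical: such distance measures rank feature sets rather than predict the error exactly.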
Entropy Measures

- observe X and compute P(ω_i|X) to determine the information gain
- if P(ω_i|X) = P(ω_j|X) for all j ≠ i, the information gain is minimal and the entropy (uncertainty) maximal
- average generalized entropy of degree α:

    J_E^α = ∫ (2^{1−α} − 1)^{−1} [ Σ_{i=1}^K P^α(ω_i|X) − 1 ] p(X) dX

- Shannon entropy (α = 1):

    J = − ∫ Σ_{i=1}^K P(ω_i|X) log_2 P(ω_i|X) p(X) dX

- the best feature set minimizes the entropy: J(Ẍ) = min_X J(X)

Gaussian Density

For Gaussian class-conditional densities p(X|ω_i) = N(μ_i, Σ_i):

- Chernoff distance, s ∈ ⟨0,1⟩:

    J_C = (1/2) s(1−s) (μ_2 − μ_1)^T [(1−s)Σ_1 + sΣ_2]^{−1} (μ_2 − μ_1)
          + (1/2) ln( |(1−s)Σ_1 + sΣ_2| / (|Σ_1|^{1−s} |Σ_2|^s) )

- Bhattacharyya distance (s = 1/2):

    J_B = (1/4) (μ_2 − μ_1)^T [Σ_1 + Σ_2]^{−1} (μ_2 − μ_1)
          + (1/2) ln( |(1/2)(Σ_1 + Σ_2)| / √(|Σ_1| |Σ_2|) )

Feature-Set Search Algorithms

For a given l̃ < l, direct search requires evaluation of the effectiveness of

    C(l, l̃) = l! / ((l − l̃)! l̃!)

feature subsets.

Probabilistic Dependence Measures

- if p(X|ω_i) = p(X), then X and ω_i are independent: nothing can be learned about ω_i from X
- the dependence between the random variable X and a realization of ω_i is measured by the distance between p(X|ω_i) and p(X); maximal J corresponds to maximal dependence
- if p(X|ω_i) ≠ p(X) for at least one class, all probabilistic distance measures can serve as overall dependence measures
- e.g. Patrick-Fisher:

    J_R = Σ_{i=1}^K P(ω_i) { ∫ (p(X|ω_i) − p(X))² dX }^{1/2}
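The Bhattacharyya formula above translates directly into code. The sketch below is an illustration (the function name is hypothetical); note that setting s = 1/2 in the Chernoff distance reproduces J_B, since (1/2)·(1/2)·(1/2) (μ_2−μ_1)^T Σ̄^{−1}(μ_2−μ_1) with Σ̄ = (Σ_1+Σ_2)/2 equals the first term of J_B.

```python
import numpy as np

def bhattacharyya_gauss(mu1, S1, mu2, S2):
    """Bhattacharyya distance J_B between two Gaussian class densities
    N(mu1, S1) and N(mu2, S2), using the closed form from the slide."""
    d = mu2 - mu1
    # (1/4) d^T [S1 + S2]^{-1} d
    term1 = 0.25 * d @ np.linalg.solve(S1 + S2, d)
    # (1/2) ln( |(S1 + S2)/2| / sqrt(|S1| |S2|) )
    Sbar = 0.5 * (S1 + S2)
    term2 = 0.5 * np.log(np.linalg.det(Sbar)
                         / np.sqrt(np.linalg.det(S1) * np.linalg.det(S2)))
    return term1 + term2
```

Identical class densities give J_B = 0, and J_B grows with the separation of the means, matching the required behaviour of a probabilistic distance measure.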
Feature-Set Search Algorithms (cont.)

The direct evaluation of all C(l, l̃) = l! / ((l − l̃)! l̃!) feature subsets is a combinatorial problem, excessive even for moderate l, l̃ (e.g. the NASA Earth Observer 1 Hyperion instrument delivers 220 spectral channels).
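The combinatorial explosion is easy to quantify; the following short sketch uses the Hyperion-scale l = 220 from the slide as an example.

```python
from math import comb

l = 220                        # number of spectral channels (Hyperion-scale)
for l_tilde in (2, 5, 10):
    # number of candidate feature subsets C(l, l_tilde) = l! / ((l - l_tilde)! l_tilde!)
    n_subsets = comb(l, l_tilde)
    print(f"C({l}, {l_tilde}) = {n_subsets}")
```

Already for l̃ = 10 the count exceeds 10^16 subsets, which is why suboptimal search strategies (sequential methods, branch-and-bound, Monte Carlo techniques) are used in practice.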