BAYESIAN METHODS FOR VARIABLE SELECTION WITH APPLICATIONS TO HIGH-DIMENSIONAL DATA


BAYESIAN METHODS FOR VARIABLE SELECTION WITH APPLICATIONS TO HIGH-DIMENSIONAL DATA
Part 3: Functional Data & Wavelets
Marina Vannucci, Rice University, USA
PASI-CIMAT 4/28-3/2
Marina Vannucci (Rice University, USA) Bayesian Variable Selection (Part 3) PASI-CIMAT 4/28-3/2 1 / 37

Part 3: Functional Data & Wavelets
- Functional data
- Brief intro to wavelets
- Curve regression models
- Curve classification
- Applications to Near Infrared spectral data from Chemometrics

Functional Data
Overall objective: methods for prediction, classification, or clustering based on functional data (multiple curves).
- Regression: a continuous response is observed.
- Classification: class membership is observed in a training set.
- Clustering: group membership needs to be uncovered.
Data are curves.

Functional Data: Motivating Examples from NIR Calibration
Experiment: 40 biscuit doughs made with variations in the quantities of fat, flour, sugar and water in the recipe. Fat 15-21%, Sugar 10-23%, Flour 44-54%, Water 10-17%.
Data: composition (by weight) of the 4 constituents, and spectral data at 700 wavelengths (from 1100nm to 2498nm in steps of 2nm) for each dough piece.
Aim: use the spectral data to predict the composition. Wavelets serve as a dimension reduction tool that preserves local features.

Functional Data
[Figure: NIR spectra for the biscuit training and validation samples.]

Functional Data: Another Example
NIR spectra are used in the analysis of food, drink and pharmaceutical products, measured at hundreds of wavelengths. Here: 3 varieties of wheat, 94 observations, NIR spectra with wavelengths from 850 to 1048nm in steps of 2nm.
Aim: use the spectral data to predict the wheat variety.

Functional Data
[Figure: absorbance versus wavelength (850-1050nm) for wheat varieties 1, 2 and 3.]

Functional Data: Our Approach
- Use wavelets as a dimension reduction tool that preserves local features. Apply the DWT to the curves. Wavelet coefficients summarise curve features in an efficient way; they are localized (unlike Fourier coefficients) and can capture noise in the data.
- Develop Bayesian methods in the wavelet domain for simultaneous feature selection and prediction or classification. We use mixture priors and Bayes methods to look for the wavelet components that best describe the variation in the responses.
- Wavelet coefficient selection is done by assigning a probability to every possible subset and then searching for subsets with high probability. Small coefficients can be important.

Wavelets: Major Milestones on Wavelets
The basic idea behind wavelets is to represent a generic function with simpler functions (building blocks) at different scales and locations.
- 1807: Fourier: orthogonal decomposition of periodic signals.
- 1946: Gabor: windowed Fourier transform (STFT).
- 1984: A. Grossmann and J. Morlet introduce the continuous wavelet transform for the analysis of seismic signals.
- 1985: Y. Meyer defines discrete orthonormal wavelet bases.
- 1989: S. Mallat links wavelets to the theory of multiresolution analysis (MRA), a framework that allows the construction of orthonormal bases. A discrete wavelet transform is defined as a simple recursive algorithm to compute the wavelet decomposition of a signal from its approximation at a given scale.
- 1989: I. Daubechies constructs wavelets with compact support and a varying degree of smoothness.
- 1992: D. Donoho and I. Johnstone use wavelets to remove noise from data.

Wavelets: Wavelets vs Fourier Representations
Given f and a basis {f_1, ..., f_n}, the series expansion is
f(x) = Σ_i a_i f_i(x),  a_i = < f, f_i > = ∫ f(x) f_i(x) dx.
Fourier transforms measure the frequency content of a signal. Wavelet transforms provide a time-scale analysis.

Wavelets: Wavelets as Small Waves
The mother wavelet ψ is an oscillatory function with zero mean: ∫_R ψ(x) dx = 0.
[Figure: the Mexican hat wavelet ψ(x) and its dilations/translations 2ψ(2x) (a=1/2, b=0), (1/√2)ψ(x/2) (a=2, b=0) and ψ(x+4) (a=1, b=4).]
Wavelets form an orthonormal basis ψ_{j,k}(x) = 2^{j/2} ψ(2^j x − k), with j, k scale and translation parameters.

Wavelets: Examples of Wavelet Families
Haar wavelets. The simplest family of wavelets, already known before the formulation of wavelet theory (Haar, 1909):
ψ(x) = 1 for 0 ≤ x < 1/2;  −1 for 1/2 ≤ x < 1;  0 otherwise.
Haar wavelets are constructed from ψ via dilations and translations. Given a generic function f, Haar wavelets approximate f with piecewise constant (not continuous) functions.
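
As a quick illustration (a minimal NumPy sketch; the grid size and the indices j, k checked are arbitrary choices), the Haar definition above can be coded directly and its dilations/translations checked for orthonormality:

```python
import numpy as np

def haar_psi(x):
    """Haar mother wavelet: 1 on [0, 1/2), -1 on [1/2, 1), 0 elsewhere."""
    x = np.asarray(x, dtype=float)
    return np.where((0 <= x) & (x < 0.5), 1.0,
                    np.where((0.5 <= x) & (x < 1.0), -1.0, 0.0))

def psi_jk(x, j, k):
    """Dilated/translated wavelet psi_{j,k}(x) = 2^{j/2} psi(2^j x - k)."""
    return 2 ** (j / 2) * haar_psi(2 ** j * np.asarray(x) - k)

# Riemann-sum check of orthonormality on a fine dyadic grid over [0, 1]
x = np.linspace(0, 1, 2 ** 16, endpoint=False)
dx = x[1] - x[0]
inner = lambda f, g: np.sum(f * g) * dx

print(inner(psi_jk(x, 1, 0), psi_jk(x, 1, 0)))  # unit norm, ~1.0
print(inner(psi_jk(x, 1, 0), psi_jk(x, 1, 1)))  # orthogonal across k, ~0.0
print(inner(psi_jk(x, 0, 0), psi_jk(x, 1, 0)))  # orthogonal across j, ~0.0
```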

Wavelets: Daubechies Wavelets
[Figure: ψ(x) and φ(x) for the Daubechies 2, 4, 7 and 10 wavelets.]
- Compact support (good time localization).
- Vanishing moments ∫ x^l ψ(x) dx = 0, l = 0, 1, ..., N−1, ensure the decay |< f, ψ_{j,k} >| ≤ C 2^{−jN} (good for compression).
- Various degrees of smoothness. For large N, φ ∈ C^{µN}, µ ≈ 0.2.

Wavelets: Properties of Wavelets
Wavelet series: f(x) = Σ_{j,k} < f, ψ_{j,k} > ψ_{j,k}(x), with coefficients
{ W_f(j,k) = < f, ψ_{j,k} > = ∫ f(x) ψ_{j,k}(x) dx }_{j,k ∈ Z}
describing features of f at different locations and scales.
Properties:
- Small waves with zero mean
- Time-frequency localization
- Good at describing non-stationarity and discontinuities
- Multi-scale decomposition of functions (MRA): sparsity, shrinkage
- Recursive relationships among coefficients: DWT

Wavelets: Wavelets in Practice (DWT)
Consider a vector Y of observations of f at n equispaced points, y_i = f(t_i), i = 1, ..., n, with n = 2^m. Discrete wavelet transforms operate via recursive filters applied to Y:
H: c_{m−1,k} = Σ_l h_{l−2k} c_{m,l},   G: d_{m−1,k} = Σ_l g_{l−2k} c_{m,l},
with the low-pass output c^{(m−1)}, ..., c^{(m−r)} filtered again at each level and the detail coefficients d^{(m−1)}, ..., d^{(m−r)} retained. In practice,
Y → d(Y) = (c^{(m−r)}, d^{(m−r)}, d^{(m−r−1)}, ..., d^{(m−1)}),
a discrete approximation of < f, ψ_{j,k} > at scales m−1, ..., m−r.
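
A minimal version of this cascade, using the Haar filters h = (1/√2, 1/√2) and g = (1/√2, −1/√2) (other wavelet families just change the filter taps; the input vector is an arbitrary example):

```python
import numpy as np

# Haar analysis filters: h (low-pass, H) and g (high-pass, G)
h = np.array([1.0, 1.0]) / np.sqrt(2)
g = np.array([1.0, -1.0]) / np.sqrt(2)

def dwt_step(c):
    """One cascade step: c_{m-1,k} = sum_l h_{l-2k} c_{m,l}, same with g for d."""
    c = np.asarray(c, dtype=float)
    approx = c[0::2] * h[0] + c[1::2] * h[1]
    detail = c[0::2] * g[0] + c[1::2] * g[1]
    return approx, detail

def dwt(y, r):
    """r-level DWT: returns (c^{(m-r)}, d^{(m-r)}, ..., d^{(m-1)})."""
    c, details = np.asarray(y, dtype=float), []
    for _ in range(r):
        c, d = dwt_step(c)
        details.append(d)
    return [c] + details[::-1]

y = np.array([4.0, 2.0, 5.0, 5.0])
coeffs = dwt(y, 2)
# An orthonormal transform preserves energy: ||y||^2 equals the sum of
# squared wavelet coefficients
print(np.allclose(sum(np.sum(b ** 2) for b in coeffs), np.sum(y ** 2)))  # True
```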

Wavelets: Example
[Figure: a noisy signal f + ε and its DWT coefficients against coefficient index.]
Data Y → d(Y) = (c^{(m−r)}, d^{(m−r)}, d^{(m−r−1)}, ..., d^{(m−1)})

Wavelets: Matrix Notation of the DWT
y_i = f(t_i), i = 1, ..., n, with n = 2^m.
DWT: Y → d(Y) = WY, with W determined by ψ and W W′ = I.
Variances and covariances of the DWT coefficients: Σ_{d(Y)} = W Σ_Y W′ for a given Σ_Y(i,j) = [γ(|i − j|)], with γ(τ) the autocovariance function of the process generating the data.
VANNUCCI & CORRADI (JRSS, Series B, 1999).
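
Both facts, W W′ = I and Σ_{d(Y)} = W Σ_Y W′, can be checked numerically by building W column by column from the transform of the canonical basis (a full-depth Haar transform and an AR(1) autocovariance γ(τ) = ρ^|τ| are used here purely as an example):

```python
import numpy as np

def haar_dwt_full(y):
    """Full-depth Haar DWT of a length-2^m vector, coarsest coefficients first."""
    c = np.asarray(y, dtype=float)
    out = []
    while len(c) > 1:
        a = (c[0::2] + c[1::2]) / np.sqrt(2)   # low-pass (H)
        d = (c[0::2] - c[1::2]) / np.sqrt(2)   # high-pass (G)
        out.insert(0, d)
        c = a
    return np.concatenate([c] + out)

n = 8
# Column i of W is the transform of the i-th canonical basis vector, so W Y = d(Y)
W = np.column_stack([haar_dwt_full(e) for e in np.eye(n)])
print(np.allclose(W @ W.T, np.eye(n)))  # True: W is orthonormal

# AR(1) autocovariance gamma(tau) = rho^|tau| for the data-generating process
rho = 0.5
Sigma_Y = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
Sigma_d = W @ Sigma_Y @ W.T  # exact var-cov matrix of the DWT coefficients
print(np.allclose(np.trace(Sigma_d), np.trace(Sigma_Y)))  # True: trace preserved
```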

Wavelets: Fields of Application
- Applied Mathematics: solution of partial/ordinary differential equations; representation of basic operators...
- Engineering: signal and image processing (compression, smoothing)...
- Statistics: smoothing of noisy data; nonparametric density and regression estimation; representation of stochastic processes; time series; functional data...
- Physics, Biology, Genetics: turbulence, DNA sequence analysis, magnetic resonance imaging...
- Music, Human Vision, Computer Graphics...

Curve Regression
The basic setup is a multivariate linear regression model, with n observations on a q-variate response and p explanatory variables:
Y = 1_n α′ + XB + E
with Y (n × q) the responses and X (n × p) the predictors. Concern is with functional predictor data, i.e. the situation where each row of X is a vector of observations of a curve x(t) at p equally spaced points. Interest is in situations where p is large, so variable selection and/or dimension reduction methods are needed.

Curve Regression: Wavelet Transformation
A wavelet transform is applied to each row of X:
Y = 1_n α′ + XW′WB + E,   W′W = I.
[Figure: reflectance spectra of three dough samples and their DWT coefficients.]
This gives Y = 1_n α′ + D B̃ + E, with B̃ = WB and D = XW′ a matrix of wavelet coefficients.
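
The identity behind this step, XB = (XW′)(WB), only needs W′W = I. A small NumPy check, using an arbitrary orthonormal matrix from a QR factorization in place of an actual DWT matrix (dimensions and data are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 5, 8

# Any orthonormal W stands in for the DWT matrix here (QR of a random matrix)
W, _ = np.linalg.qr(rng.normal(size=(p, p)))
X = rng.normal(size=(n, p))   # rows: curves observed at p points
B = rng.normal(size=(p, 1))   # regression coefficients in the original domain

D = X @ W.T        # wavelet coefficients of each curve (rows of X)
B_tilde = W @ B    # transformed regression coefficients

print(np.allclose(X @ B, D @ B_tilde))  # True: XB = (XW')(WB)
```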

Curve Regression: Prior Model
If H is the var-cov matrix of the prior on B, then H̃ = WHW′. We compute transformed covariance priors on B̃ as WHW′ using results of Vannucci and Corradi, JRSSB (1999). Priors on α and Σ are unchanged (B → B̃, H → H̃).

Curve Regression: Selection Prior
Mixture priors represent negligible coefficients. A latent binary vector γ identifies the different models:
γ = (γ_1, ..., γ_p),  γ_j = 0, 1,  γ_j = 1 ⟺ D_j in the model.
Coefficients are drawn from a mixture distribution: the selected part satisfies B̃_γ ~ N(0, H̃_γ, Σ), i.e. row-wise
B̃_j ~ (1 − γ_j) I_0 + γ_j N(0, h̃_{jj} Σ),
with I_0 a point mass at zero. The prior π(γ) is specified via independent Bernoullis, Prob(γ_j = 1) = w_j with w_j = w, j = 1, ..., p, or a Beta-Binomial prior, or priors linking related predictors.
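
A draw from the row-wise spike-and-slab mixture can be sketched as follows (p, w and the slab variance tau2 are arbitrary illustrative values, and Σ is taken to be the scalar 1 for a univariate response):

```python
import numpy as np

rng = np.random.default_rng(0)
p, w = 20, 0.2      # number of coefficients and prior inclusion probability
tau2 = 4.0          # slab variance (stand-in for h_jj; hypothetical value)

# gamma_j ~ Bernoulli(w): latent inclusion indicators
gamma = rng.random(p) < w

# Mixture draw: point mass at 0 when gamma_j = 0, N(0, tau2) when gamma_j = 1
beta = np.where(gamma, rng.normal(0.0, np.sqrt(tau2), p), 0.0)

print(int(gamma.sum()), "coefficients are in the model;",
      "excluded coefficients are exactly zero:", bool(np.all(beta[~gamma] == 0.0)))
```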

Curve Regression: Metropolis Search
MCMC is used to search the posterior g(γ) ∝ π(γ | Y, D) looking for good models. At each step the algorithm generates γ_new from γ_old by one of two possible moves:
- Add or Delete a component, by choosing at random one component in γ_old and changing its value.
- Swap two components, by choosing independently at random a 0 and a 1 in γ_old and changing both of them.
The new candidate γ_new is accepted with probability min{ g(γ_new)/g(γ_old), 1 }.
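
The search can be sketched on a toy target (the function log_g below is a made-up score standing in for the real log marginal posterior π(γ | Y, D); p, the chain length and the reward structure are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 10

def log_g(gamma):
    """Stand-in for log pi(gamma | Y, D): rewards including the first three
    variables and penalizes model size. Not a real marginal likelihood."""
    return 2.0 * gamma[:3].sum() - 0.5 * gamma.sum()

def propose(gamma):
    """One of the two slide moves: Add/Delete a component, or Swap a 0 and a 1."""
    new = gamma.copy()
    ones, zeros = np.flatnonzero(gamma), np.flatnonzero(~gamma)
    if rng.random() < 0.5 or len(ones) == 0 or len(zeros) == 0:
        j = rng.integers(p)           # Add/Delete: flip one random component
        new[j] = ~new[j]
    else:                             # Swap: exchange a random 1 with a random 0
        new[rng.choice(ones)] = False
        new[rng.choice(zeros)] = True
    return new

gamma = np.zeros(p, dtype=bool)
visits = np.zeros(p)
for _ in range(5000):
    cand = propose(gamma)
    # Accept with probability min{ g(gamma_new)/g(gamma_old), 1 }
    if np.log(rng.random()) < log_g(cand) - log_g(gamma):
        gamma = cand
    visits += gamma

incl = visits / 5000   # Monte Carlo estimates of marginal inclusion probabilities
print(incl[:3].mean() > incl[3:].mean())  # True: rewarded variables dominate
```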

Curve Regression: Prior Specification
Priors on α and Σ are vague and largely uninformative: α | Σ ~ N(α_0, hΣ) with h large, and Σ ~ IW(δ, Q).
Choices for H̃_γ:
- H̃_γ = c [(D′D)^{−1}]_γ
- H̃_γ = c [diag(D′D)^{−1}]_γ
- H̃ = cI
- H = AR(σ², ρ)
with EB estimates of the hyperparameters obtained from the marginal likelihood
L(·; D, Y) ∝ |K|^{−q/2} |Q|^{−n/2} |I_n + K^{−1} Y Q^{−1} Y′|^{−(δ+q+n−1)/2},   K = I_n + D H̃ D′.
Choice of w.

Curve Regression: Diagonal Elements of H̃ = WHW′
[Figure: prior variance of the regression coefficients plotted against wavelet coefficient index.]

Curve Regression: Posterior Inference
The stochastic search results in a list of visited models (γ^{(0)}, γ^{(1)}, ...) and their corresponding relative posterior probabilities p(γ^{(0)} | D, Y), p(γ^{(1)} | D, Y), ...
Select variables:
- in the best models, i.e. the γ's with highest p(γ | D, Y), or
- with the largest marginal posterior probabilities.
Prediction of future Y_f:
- via Bayesian model averaging (BMA), as a posterior weighted average of model predictions;
- via single-model predictions, i.e. LS/Bayes predictions on single best models, or LS/Bayes predictions with threshold models (e.g., the median model) obtained from the estimated marginal probabilities of inclusion.
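
The selection rules can be made concrete on a toy list of visited models (the models and their posterior probabilities below are made up for illustration and assumed already normalized):

```python
import numpy as np

# Hypothetical visited models (rows are gamma vectors) and posterior probabilities
models = np.array([[1, 1, 0, 0],
                   [1, 0, 0, 0],
                   [1, 1, 1, 0],
                   [0, 1, 0, 0]], dtype=float)
probs = np.array([0.5, 0.2, 0.2, 0.1])   # p(gamma | D, Y)

# Marginal inclusion probability of variable j: total mass of models containing it
marginal = probs @ models

# Highest-posterior-probability ("best") model
best = models[np.argmax(probs)]

# Median probability model: keep variables with marginal inclusion > 1/2
median_model = marginal > 0.5

print(np.round(marginal, 3))  # marginal inclusion of each variable
print(best, median_model)
```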

Application to NIR Spectral Data: Biscuit Doughs
4 parallel MCMC chains with 100,000 iterations and about 40,000 successful moves per chain. H(i,j) = σ²ρ^{|i−j|}, σ² = 254, ρ = .32.
MSEs (fat, sugar, flour, water):
- PLS: .5, .583, .375, .5
- PCR: .6, .64, .388, .6
- Stepwise MLR: .44, .88, .722, .22
- BMA (5 models, 29 coefficients): .63, .449, .348, .5
- Modal model, coefficients (%, %, %, 37%, 6%, 6%, %, 2% from coarsest to finest level): .59, .466, .35, .47

Application to NIR Spectral Data: Diagnostic Plots
[Figure: (a) number of ones and (b) log posterior probabilities across MCMC iterations.]

Application to NIR Spectral Data: Marginal Plots
[Figure: marginal posterior probabilities of inclusion against wavelet coefficient index, panels (i)-(iv).]

Application to NIR Spectral Data: IDWT to Columns of LS Estimate of B̃
[Figure: coefficient curves against wavelength for fat, sugar, flour and water, obtained by applying the inverse DWT to the columns of the LS estimate of B̃.]

Curve Classification: Probit Models for Classification
Z is an n × 1 categorical response vector (J categories), X an n × p predictor matrix; we have functional predictor data. Probit models use data augmentation and latent data to write
Y_i = α + X_i B + ε_i,  ε_i ~ N(0, Σ),  i = 1, ..., n.
The relationship between the realization z_i and the unobserved Y_i is
z_i = 0 if y_{i,j} < 0 for every j;  z_i = j if y_{i,j} = max_k { y_{i,k} } ≥ 0.
... now transform to wavelets, use the selection prior on the wavelet coefficients, and MCMC for inference in the probit model ...
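
The latent-to-label rule can be illustrated with hypothetical latent draws (here J = 3 classes, so the latent Y_i has J − 1 = 2 columns and class 0 acts as the baseline; the numbers are invented):

```python
import numpy as np

# Hypothetical latent responses y_i for three observations (columns j = 1, 2)
Y_latent = np.array([[-0.3, -1.2],
                     [ 0.8,  0.2],
                     [-0.5,  1.4]])

def classify(y):
    """z = 0 if every y_j < 0, else the (1-based) index of the largest y_j."""
    return 0 if np.all(y < 0) else int(np.argmax(y)) + 1

z = [classify(y) for y in Y_latent]
print(z)  # [0, 1, 2]
```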

Curve Classification: Results for Wheat Data
3 classes, 94 samples, NIR transmission spectra at p = 100 wavelengths from 850 to 1048nm.

Variety:      1    2    3    Total
Training:    32   22   17    71
Validation:   7    6   10    23

Fearn et al. (2001): a Bayesian decision approach that balances the cost of variables against the loss from misclassification.

Curve Classification: Mis-classification Errors

Prediction           Var. Sel. / # PCs    Error
Best model           5                    3/23
Marginal model       9                    3/23
BD (Fearn et al.)    6                    5/23
BD (Fearn et al.)    2                    3/23
LDA with PCA         8 PCs                4/23
QDA with PCA         8 PCs                7/23

Curve Classification
[Figure: marginal posterior probabilities of inclusion against wavelet coefficient index, panels (a)-(d).]

Curve Classification
[Figure: estimated coefficient curves β_1 and β_2 against wavelength, panels (a)-(d).]
Plots (a) and (c) are obtained using 4 wavelet coefficients, and plots (b) and (d) using 9 coefficients.

Curve Classification: Main References
- Mallat, S.G. (1989). Multiresolution approximations and wavelet orthonormal bases of L²(R). Transactions of the American Mathematical Society, 315(1), 69-87.
- Daubechies, I. (1992). Ten Lectures on Wavelets. SIAM.
- Brown, P.J., Fearn, T. and Vannucci, M. (2001). Bayesian wavelet regression on curves with application to a spectroscopic calibration problem. Journal of the American Statistical Association, 96, 398-408.
- Vannucci, M., Sha, N. and Brown, P.J. (2005). NIR and mass spectra classification: Bayesian methods for wavelet-based feature selection. Chemometrics and Intelligent Laboratory Systems, 77, 139-148.

Matlab Code: Code from my Website
bvsme_wav: Bayesian Variable Selection applied to wavelet coefficients
- Metropolis search
- non-diagonal selection prior
- Bernoulli priors or Beta-Binomial prior
- Predictions by LS and BMA
http://stat.rice.edu/~marina