Quality & Information Content of CHRIS Hyper-Spectral Data
B. Aiazzi, S. Baronti, P. Marcoionni, I. Pippi, M. Selva
Institute of Applied Physics "Nello Carrara" IFAC-CNR, Florence, Italy
Information-Theoretic Assessment: Summary
- Quality of hyperspectral data: the capability to fulfil the user's expectations, together with objective assessments related to the true information content of the data.
- Definition of a procedure for estimating the information content of the data, related to the noise, SNR and entropy of the digitised images.
- Modelling of the observation noise and adoption of an advanced decorrelation algorithm to accurately estimate the entropy of the data source.
- Estimation of the entropy of the noise-free data source by inverting the model of an uncorrelated non-Gaussian source with a stationary white Gaussian noise superimposed.
- Evaluation of the most significant bands to improve the efficiency of application tasks.
Information-Theoretic Assessment: Motivations
Digital sensors yield 1-D, 2-D and M-D signals measuring physical properties of interest to users. All data are affected by noise, which inflates the entropy and degrades the quality. How much of the information is due to the noise-free signal alone, regardless of the noise? What is the amount of useful information in one pixel (either scalar or vector) that can be utilized for any application task by the user? By information-theoretic assessment we can model the noise contribution and try to remove it from the entropy rate of the noisy source, so as to yield the entropy rate of the noise-free source.
Information-Theoretic Assessment: Modelling
Assume the data source is a correlated random process with superimposed correlated, signal-independent noise. The signal is split from the noise contribution:

g(i,j,k) = f(i,j,k) + n(i,j,k)

where g(i,j,k) is the recorded noisy sample at the i-th row, j-th column, k-th band; f(i,j,k) is the noise-free sample; and n(i,j,k) is the random noise process, independent of f, stationary, correlated, Gaussian-distributed, with zero mean and variance σ_n². In detail, the noise is modelled as separable:

n(i,j,k) = n_x(i) · n_y(j) · n_λ(k),  with  n_x(i) = ρ_x · n_x(i−1) + ε_x(i)

where ρ_x is the correlation coefficient in the row direction and ε_x is white Gaussian noise; n_y and n_λ are defined analogously, with coefficients ρ_y and ρ_λ in the column and spectral directions.
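The separable AR(1) noise model above can be sketched numerically. This is a minimal NumPy illustration, not the authors' code; the generator scales the innovations so each 1-D factor has unit variance, and `sigma_n` sets the overall noise standard deviation.

```python
import numpy as np

def ar1_sequence(n, rho, rng):
    """Unit-variance AR(1) sequence: x(i) = rho * x(i-1) + eps(i)."""
    eps = rng.normal(0.0, np.sqrt(1.0 - rho ** 2), n)  # scaled so Var[x] = 1
    x = np.empty(n)
    x[0] = rng.normal(0.0, 1.0)
    for i in range(1, n):
        x[i] = rho * x[i - 1] + eps[i]
    return x

def correlated_noise(shape, rho_x, rho_y, rho_l, sigma_n, seed=0):
    """Separable noise cube n(i,j,k) = sigma_n * n_x(i) * n_y(j) * n_l(k)."""
    rng = np.random.default_rng(seed)
    nx = ar1_sequence(shape[0], rho_x, rng)
    ny = ar1_sequence(shape[1], rho_y, rng)
    nl = ar1_sequence(shape[2], rho_l, rng)
    return sigma_n * nx[:, None, None] * ny[None, :, None] * nl[None, None, :]

# e.g. an 18-band cube with mild spatial and weak spectral correlation
noise = correlated_noise((64, 64, 18), 0.3, 0.3, 0.1, 2.0)
```

The lag-1 autocorrelation of each 1-D factor equals the corresponding ρ, which is what the scatterplot-based noise estimation later relies on.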
Information-Theoretic Assessment: Flowchart
Noisy image → Decorrelation (DPCM) → Noisy residuals → Generalized Gaussian PDF estimation → Variance & shape factor of noisy residuals → Deconvolution → Noise-free residuals PDF → Noise-free information estimation. In parallel: Noisy image → Noise parameters estimation → Noise parameters → Noise PDF → Deconvolution.
- Source decorrelation is obtained by applying an advanced prediction via DPCM and using entropy coding to yield a bit-rate close to the entropy rate.
- Generalized Gaussian modelling of the signal contribution.
- Noise parameter estimation via the scatterplot method.
- Deconvolution.
Predictive Decorrelation: DPCM
Differential Pulse Code Modulation (DPCM) utilizes a causal prediction to achieve a statistical decorrelation (2-D, 3-D) of the data. Prediction errors e_g, i.e., differences between original and predicted pixel values, are entropy coded:

e_g(i,j,k) = g(i,j,k) − ĝ(i,j,k)

Since f and n are assumed to be independent of each other, the relationship among the variances of the three types of prediction errors is:

σ²_eg = σ²_ef + σ²_en

Moreover, e_g is the sum of the prediction error of the noise-free signal and the prediction error of the noise: e_g = e_f + e_n. Using a linear regression predictor, σ_en can be estimated from the noise parameters:

σ²_en = σ²_n · (1 − ρ_x²)(1 − ρ_y²)(1 − ρ_λ²)
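The variance relation σ²_eg = σ²_ef + σ²_en can be checked on synthetic data. The sketch below uses a hypothetical smooth scene and a simple causal predictor (average of the left and upper neighbours) as a stand-in for the advanced DPCM predictor of the slides; because the predictor is linear and f and n are independent, the residual variances add.

```python
import numpy as np

def dpcm_residual(img):
    """Causal 2-D predictor: average of the left and upper neighbours."""
    pred = 0.5 * (np.roll(img, 1, axis=0) + np.roll(img, 1, axis=1))
    return (img - pred)[1:, 1:]  # drop the wrapped-around first row/column

rng = np.random.default_rng(0)
t = np.linspace(0.0, 8.0 * np.pi, 256)
f = 100.0 * np.sin(t)[:, None] * np.cos(t)[None, :]  # smooth stand-in scene
n = rng.normal(0.0, 3.0, f.shape)                    # white Gaussian noise
g = f + n

e_g, e_f, e_n = dpcm_residual(g), dpcm_residual(f), dpcm_residual(n)
# linear predictor: e_g = e_f + e_n exactly, and the variances add
```

For white noise this predictor gives σ²_en = 1.5 σ²_n; with correlated noise the (1 − ρ²) factors of the slide apply instead.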
Generalized Gaussian PDF
A suitable PDF model for the noise, the prediction error e_g and the prediction error of the noise-free image e_f is achieved by varying ν (shape factor) and σ (standard deviation) of the Generalized Gaussian Density (GGD):

p(x; σ, ν) = [ν · η(ν,σ) / (2 Γ(1/ν))] · exp{ −[η(ν,σ) · |x|]^ν }

with

η(ν,σ) = (1/σ) · √( Γ(3/ν) / Γ(1/ν) ),  Γ(z) = ∫₀^∞ t^(z−1) e^(−t) dt,  z > 0

where Γ is the Gamma function. The unity-variance GGD is plotted for several ν's: for ν = 1 it is the Laplacian PDF, for ν = 2 the Gaussian PDF.
ESRIN, Frascati, Italy, 21-23 March 2005
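The GGD above is straightforward to implement. A minimal sketch, normalized to standard deviation σ, which reduces to the Laplacian for ν = 1 and to the Gaussian for ν = 2 as stated on the slide:

```python
import math
import numpy as np

def ggd_pdf(x, sigma, nu):
    """Generalized Gaussian density with std dev sigma and shape factor nu."""
    eta = math.sqrt(math.gamma(3.0 / nu) / math.gamma(1.0 / nu)) / sigma
    a = nu * eta / (2.0 * math.gamma(1.0 / nu))   # normalizing constant
    return a * np.exp(-(eta * np.abs(x)) ** nu)

x = np.linspace(-10.0, 10.0, 20001)
```

Smaller ν gives a more peaked, heavy-tailed density, which is why decorrelated image residuals are typically modelled with ν < 2.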
Generalized Gaussian PDF Estimation of e_g
After the DPCM decorrelation, we know the entropy rate R_g and the standard deviation σ_g of the noisy image, but to model its PDF with a Generalized Gaussian we have to estimate ν_g. Fitting the entropy of the modelled source to that of the empirical data through the entropy function ζ, defined as:

R_g − log₂ σ_g = log₂[ 2 Γ(1/ν_g)^(3/2) / ( ν_g Γ(3/ν_g)^(1/2) ) ] + 1 / (ν_g ln 2) ≜ ζ(ν_g)

we find ν_g by inverting this last function.
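The entropy function ζ and its inversion can be sketched as follows. Since the Gaussian maximizes entropy at fixed variance, ζ peaks at ν = 2 and is monotonically increasing on (0, 2]; this sketch therefore restricts the bisection to that leptokurtic range, which is where decorrelated residuals fall.

```python
import math

def zeta(nu):
    """Differential entropy (bits) of a unit-variance generalized Gaussian."""
    return (math.log2(2.0 * math.gamma(1.0 / nu) ** 1.5
                      / (nu * math.sqrt(math.gamma(3.0 / nu))))
            + 1.0 / (nu * math.log(2.0)))

def invert_zeta(target, lo=0.05, hi=2.0, iters=80):
    """Bisection on (0, 2], where zeta is monotonically increasing in nu."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if zeta(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Given the measured bit-rate and standard deviation of the residuals, the shape factor is then `nu_g = invert_zeta(R_g - math.log2(sigma_g))`.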
Noise Parameters Estimation via Scatterplot: an Example of μ–σ Scatterplot
The noise parameters may be estimated in homogeneous areas. The estimate of σ_n is found as the y-intercept of the horizontal regression line drawn on the scatterplot of σ_g versus μ_g for points in homogeneous areas. The estimate of ρ_x is found as the slope coefficient of the regression line drawn on the scatterplot of the cross-deviation along x versus σ_g for points in homogeneous areas.
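A crude stand-in for the σ_n part of the scatterplot method: compute local standard deviations on small blocks and take a robust level (here the median) as the "y-intercept" of the horizontal regression line. This is only an illustration on a hypothetical piecewise-flat scene; the actual method regresses σ_g against the local mean μ_g over selected homogeneous areas.

```python
import numpy as np

def block_sigma_n(img, block=8):
    """Median of local (block-wise) standard deviations as a rough noise-std
    estimate; valid when most blocks are radiometrically homogeneous."""
    h = img.shape[0] - img.shape[0] % block
    w = img.shape[1] - img.shape[1] % block
    tiles = img[:h, :w].reshape(h // block, block, w // block, block)
    return float(np.median(tiles.std(axis=(1, 3))))

rng = np.random.default_rng(0)
# synthetic piecewise-constant scene + white noise of known std 2.5
scene = np.kron(rng.integers(0, 5, (8, 8)) * 50.0, np.ones((32, 32)))
noisy = scene + rng.normal(0.0, 2.5, scene.shape)
```

On real imagery the homogeneous blocks must first be selected (e.g. by thresholding local activity), since textured blocks inflate the local standard deviation.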
Deconvolution
The PDF of the residuals found by DPCM, p_eg, is given by the linear convolution of the unknown p_ef with a zero-mean Gaussian PDF (ν = 2) having variance σ²_en:

p_eg(x; σ_eg, ν_eg) = p_ef(x; σ_ef, ν_ef) * p_en(x; σ_en, 2.0)

By deconvolving the noise PDF p_en from p_eg we obtain ν_ef.
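One simple way to realize this deconvolution numerically is a grid search over candidate shape factors: convolve a candidate GGD with the known Gaussian noise kernel and pick the ν whose convolution best matches the observed residual PDF. This is a sketch of the idea, not the authors' implementation.

```python
import math
import numpy as np

def ggd(x, sigma, nu):
    """Generalized Gaussian density with std dev sigma and shape nu."""
    eta = math.sqrt(math.gamma(3.0 / nu) / math.gamma(1.0 / nu)) / sigma
    return nu * eta / (2.0 * math.gamma(1.0 / nu)) * np.exp(-(eta * np.abs(x)) ** nu)

def fit_nu_f(p_obs, x, sigma_ef, sigma_en, nus):
    """Pick nu minimizing the L1 mismatch between p_obs and
    GGD(sigma_ef, nu) convolved with the Gaussian noise kernel."""
    dx = x[1] - x[0]
    kernel = ggd(x, sigma_en, 2.0)
    best, best_err = None, np.inf
    for nu in nus:
        model = np.convolve(ggd(x, sigma_ef, nu), kernel, mode="same") * dx
        err = np.abs(model - p_obs).sum() * dx
        if err < best_err:
            best, best_err = nu, err
    return best
```

In practice σ_ef follows from σ²_ef = σ²_eg − σ²_en, so ν_ef is the only unknown in the fit.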
Noise-free Information Estimation
Processing the ν_ef found, we obtain the entropic factor c_f, and so we can measure the entropy rate of the noise-free image:

R_f = c_f + (1/2) · log₂{ σ²_eg − (1 − ρ_x²)(1 − ρ_y²)(1 − ρ_λ²) · σ²_n }
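The final rate formula is a one-liner once the estimates are in hand. A minimal sketch (the entropic factor `c_f`, derived from ν_ef, is taken as an input here):

```python
import math

def noise_free_rate(c_f, var_eg, sigma_n, rho_x, rho_y, rho_l):
    """R_f = c_f + 0.5 * log2(var(e_g) - (1-rx^2)(1-ry^2)(1-rl^2) * sigma_n^2)."""
    var_en = (1 - rho_x ** 2) * (1 - rho_y ** 2) * (1 - rho_l ** 2) * sigma_n ** 2
    var_ef = var_eg - var_en
    if var_ef <= 0.0:
        raise ValueError("estimated noise variance exceeds residual variance")
    return c_f + 0.5 * math.log2(var_ef)
```

The guard matters in practice: an overestimated σ_n can make the bracketed term non-positive, signalling inconsistent noise parameters rather than a valid rate.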
Information-Theoretic Assessment: Implementation Results
Hyperspectral input data → Decorrelation (DPCM) → Process noise std. dev. → Process noise correlation coefficients → Deconvolution → Info assessment result; past results can be retrieved from disk.
Flexible architecture:
- Run a single routine.
- Process the information assessment retrieving some data from a past elaboration.
Information-Theoretic Assessment: Evaluation on CHRIS Data over the San Rossore Test Site
MODE 3 land channel, 18 bands, resolution 17 m, FZA = 0°, 748 rows, 744 columns.
Analysis of the Processing Effect, September 2003
Evaluation of Noise Standard Deviation The calibration process reduces the amount of noise.
Evaluation of Noise CCs ρ_x and ρ_y
Filtering in the calibration process slightly increases the ρ_x value and appreciably reduces ρ_y.
Evaluation of Noise CC ρ_λ
Since the calibration process does not operate along the λ direction, it reduces the ρ_λ value.
Noise Shape Factor Evaluation
In the deconvolution:

p_eg(x; σ_eg, ν_eg) = p_ef(x; σ_ef, ν_ef) * p_en(x; σ_en, 2.0)

there is very good matching between the theoretical and the estimated values.
Information Content Evaluation
The plots show a coherent behavior between the entropy of the observed source and that of the noise-free source.
Information Assessment Comparison: R_g
The bit-rate of the noisy data is nearly the same.
Information Assessment Comparison: R_en
The entropy of the noise is lower for calibrated data.
Information Assessment Comparison: R_f
The true information content appears nearly the same.
Spatial Analysis on L1B Data
- Sea
- Land
- River 1
- River 2
Evaluation of Noise Standard Deviation at Different Landscapes
A good match between the estimated noise standard deviations has been verified in radiometrically similar sub-images.
Temporal Analysis on L1B Data: July 2003, September 2003, January 2004
Temporal Analysis on L1B Data: Results on noise
Conclusions
- The proposed method may yield an estimate of the information content of any digitized 1-D, 2-D or M-D signal.
- A procedure for information-theoretic assessment has been developed and applied to both calibrated and raw hyperspectral data from a push-broom sensor.
- Noise estimation has been performed on different landscapes.
- A preliminary analysis of multitemporal data has been carried out.
- Future work will investigate the information content and the noise model of multi-angle hyperspectral data.