A BAYESIAN APPROACH TO SHRINKAGE ESTIMATORS


Filipe das Neves RIZZO, Cristiane Alvarenga GAJO, Devanil Jaques de SOUZA, Lucas Monteiro CHAVES

ABSTRACT: Estimators obtained by shrinking the least squares estimator have become widely used since the work of Stein, in the early 60's, which presented an estimator for the mean of a multivariate normal that dominates the sample mean, and the work of Hoerl and Kennard, in the early 70's, on ridge estimators. In this work we present an approach using Bayesian and empirical Bayesian procedures to obtain some important shrinkage estimators.

KEYWORDS: Shrinkage estimators; empirical Bayes; minimum mean square estimator.

Introduction

Two landmark papers, one by James and Stein (1961), introducing the James-Stein estimators, and another by Hoerl and Kennard (1970), introducing ridge estimators, explore the idea that shrinking the usual least squares estimator, despite introducing some bias, has the advantage of decreasing the mean square error (MSE). Both works explore the idea of shrinking the estimates. This is a conservative procedure used in many areas of statistics. Nowadays it is known as biased estimation theory and is used when the least squares estimator cannot be applied, for example, by reason of multicollinearity. The theory of shrinkage estimators is a very active area of research; we can point, as examples, to the LASSO (TIBSHIRANI, 1996) and the elastic net (ZOU; HASTIE, 2005). In this short note, we present a systematic approach to shrinkage estimators using Bayesian and empirical Bayes procedures. Examples for ridge regression estimation and the James-Stein estimator are presented. These procedures are very well known and spread through the literature, but the authors could not find any similar systematic approach. Since the empirical Bayes procedure is not of common use, we review, in section 2, a classical example. We end this work with a computational simulation showing the behavior of some shrinkage estimators.

1 Shrinkage estimators

Federal University of Lavras - UFLA, Department of Exact Sciences, CP 37, CEP 37200-000, Lavras, MG, Brazil. E-mail: rizzofilipe@gmail.com; cristianegajo@yahoo.com.br; devaniljaques@dex.ufla.br; lucas@dex.ufla.br

Rev. Bras. Biom., São Paulo, v.33, n.4, p.485-601, 2015.

Unbiased estimators of a vector of parameters have the problem that, in general, $\|\hat\beta\|^2$ is not an unbiased estimator of $\|\beta\|^2$. Consider $\hat\beta = (\hat\beta_1, \ldots, \hat\beta_k)$ an unbiased estimator for the vector $\beta = (\beta_1, \ldots, \beta_k)$. Then

$$E\big[\|\hat\beta\|^2\big] = \sum_{i=1}^{k} E\big[\hat\beta_i^2\big] = \sum_{i=1}^{k}\Big(\operatorname{var}(\hat\beta_i) + \big(E[\hat\beta_i]\big)^2\Big) = \sum_{i=1}^{k}\operatorname{var}(\hat\beta_i) + \|\beta\|^2 \;\ge\; \|\beta\|^2.$$

We face here an unexpected fact: even though $\hat\beta$ is an unbiased estimator of $\beta$, $\|\hat\beta\|^2$ is a biased estimator of $\|\beta\|^2$, tending to overestimate it. This fact motivated James and Stein to obtain their famous estimator. The idea was to shrink the least squares estimator by a factor that depends on the observed value. Hoerl and Kennard, working with multiple linear regression in the presence of multicollinearity, proposed an estimator, denoted the ridge regression estimator, which is a kind of shrinkage estimator. Nowadays there is a huge theory about these estimators.

2 Empirical Bayes method

When the knowledge of the prior distribution parameters is incomplete or unknown, some information may be obtained from the data itself and, in this way, it is possible to make inference about these parameters. This is the so-called empirical Bayesian method. To illustrate the empirical Bayes methodology, consider the classical example: let $X$ be a Poisson random variable with parameter $\mu$ and prior $\pi(\mu)$ completely unknown. Given a size one sample $X = x$, the corresponding posterior distribution, according to Bayes' theorem, is

$$\pi(\mu \mid x) = \frac{\dfrac{e^{-\mu}\mu^{x}}{x!}\,\pi(\mu)}{g(x)}, \qquad \text{where } g(x) = \int_0^\infty \frac{e^{-\mu}\mu^{x}}{x!}\,\pi(\mu)\,d\mu.$$

With respect to the mean square error, the Bayes estimator $\hat\mu_B$ for $\mu$ is the mean of the posterior $\pi(\mu \mid x)$:

$$\hat\mu_B = E[\mu \mid x] = \int_0^\infty \mu\,\frac{e^{-\mu}\mu^{x}}{x!\,g(x)}\,\pi(\mu)\,d\mu = \frac{x+1}{g(x)}\underbrace{\int_0^\infty \frac{e^{-\mu}\mu^{x+1}}{(x+1)!}\,\pi(\mu)\,d\mu}_{g(x+1)} = \frac{(x+1)\,g(x+1)}{g(x)}.$$
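A minimal sketch of this Bayes rule in empirical form, with $g$ estimated by relative frequencies of past observations. The Gamma prior used here to generate $\mu$, and all numeric settings, are illustrative assumptions, not part of the classical example; the estimator itself never sees that prior.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Two-stage simulation: mu ~ Gamma(shape=2, scale=1.5) plays the role of the
# unknown prior pi(mu); then X | mu ~ Poisson(mu).
mu = rng.gamma(shape=2.0, scale=1.5, size=n)
x = rng.poisson(mu)

# g(x) estimated by relative frequencies of the past observations x_1, ..., x_n
g_hat = np.bincount(x) / n

def mu_hat_B(k: int) -> float:
    """Empirical Bayes estimate (k + 1) * g_hat(k + 1) / g_hat(k)."""
    return (k + 1) * g_hat[k + 1] / g_hat[k]

# For this particular Gamma prior the exact posterior mean is 0.6 * (k + 2),
# so the frequency-based estimate should land close to it.
for k in range(5):
    print(k, round(mu_hat_B(k), 3), round(0.6 * (k + 2), 3))
```

The point of the sketch is that the rule uses only counts of the observed data; no knowledge of the prior is required.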

Now, one can use the known past observations $x_1, \ldots, x_n$ to estimate $g(x)$ and $g(x+1)$. Since $g(x)$ is a discrete distribution, it can be estimated by the relative frequencies

$$\hat g(x) = \frac{\operatorname{card}\{i \,;\, x_i = x\}}{n}, \qquad \hat g(x+1) = \frac{\operatorname{card}\{i \,;\, x_i = x+1\}}{n},$$

where $\operatorname{card}\{A\}$ is the number of elements of the set $A$. Then, the empirical Bayes estimator is

$$\hat\mu_B = \frac{(x+1)\,\hat g(x+1)}{\hat g(x)}.$$

3 An approximation of the minimum mean squares error linear estimator for the mean

Consider independent observations $Y_i$, $i = 1, \ldots, n$, with model $Y_i = \mu + \varepsilon_i$, where $\mu$ is the parameter to be estimated and the $\varepsilon_i$'s are the associated independent errors with mean zero and variance $\sigma^2$. The least squares estimator for $\mu$ is $\hat\mu = \bar Y = \frac{1}{n}\sum_{i=1}^{n} Y_i$. Let us consider all estimators of the form $\delta_c(Y_1, \ldots, Y_n) = c\bar Y$. We want to determine the particular value of $c$ that minimizes the mean squared error. To do that we take the function

$$\mathrm{MSE}(c) = E\big[(c\bar Y - \mu)^2\big] = c^2 E[\bar Y^2] - 2c\mu E[\bar Y] + \mu^2 = c^2\Big(\frac{\sigma^2}{n} + \mu^2\Big) - 2c\mu^2 + \mu^2.$$

The minimum value of $\mathrm{MSE}(c)$ defines the estimator

$$\hat\mu_1 = \frac{\mu^2}{\dfrac{\sigma^2}{n} + \mu^2}\,\bar Y, \qquad (1)$$

which depends on the population parameters $\mu$ and $\sigma^2$, both unknown. The natural way to bypass this problem is to replace them by their natural unbiased estimators $\bar Y$ and $S^2$, resulting in the estimator

$$\hat\mu_2 = \frac{\bar Y^2}{\dfrac{S^2}{n} + \bar Y^2}\,\bar Y, \qquad (2)$$

which is an approximation of the minimum mean square error linear estimator for the mean. In section 8 we compare, via simulation, the behavior of these estimators.

4 Shrinkage estimators obtained from normality

The simplest example of a shrinkage estimator obtained from normality is the following: suppose $Y_i \sim N(\theta, 1)$ with prior $\theta \sim N(0, 1)$. If a sample $y = (y_1, \ldots, y_n)$ is taken and $\bar y$ is its mean, the posterior distribution is

$$\pi(\theta \mid y) \propto f(y \mid \theta)\,\pi(\theta) \propto \exp\Big(-\frac{n}{2}(\bar y - \theta)^2\Big)\exp\Big(-\frac{\theta^2}{2}\Big) \propto \exp\left(-\frac{n+1}{2}\Big(\theta - \frac{n\bar y}{n+1}\Big)^2\right).$$

Then the Bayes estimator is $\hat\theta_B = \dfrac{n}{n+1}\,\bar Y$, clearly a shrinkage estimator.

A more sophisticated argument is to obtain the estimator (2) using the empirical Bayes approach. If $Y_i \sim \mathrm{normal}(\mu, \sigma^2)$ with prior $\mu \sim \mathrm{normal}(0, \tau^2)$ and a sample $y_1, \ldots, y_n$ is observed, the posterior is normal (GRUBER, 2010),

$$\pi(\mu \mid y) \propto \exp\left(-\frac{\sigma^2 + n\tau^2}{2\sigma^2\tau^2}\Big(\mu - \frac{n\tau^2\bar y}{\sigma^2 + n\tau^2}\Big)^2\right).$$

Then, using the mean of the posterior as the Bayes estimator for $\mu$,

$$\hat\mu_B = \frac{n\tau^2}{\sigma^2 + n\tau^2}\,\bar Y = \frac{\tau^2}{\dfrac{\sigma^2}{n} + \tau^2}\,\bar Y.$$

We do not know $\tau^2$, and the idea is to use an empirical Bayes argument. The predictive distribution of $\bar Y$ is a normal with mean zero and variance $\frac{\sigma^2}{n} + \tau^2$ (see the Appendix). Observing that if $Z \sim \mathrm{normal}(0, \sigma_Z^2)$, then

$$\sigma_Z^2 = E[Z^2] - \big(E[Z]\big)^2 = E[Z^2],$$

by the method of moments $\sigma_Z^2$ can be estimated with only one observation, $\hat\sigma_Z^2 = Z^2$. Then we can estimate $\frac{\sigma^2}{n} + \tau^2$ by $\bar Y^2$. It is reasonable to suppose $\tau^2 \gg \frac{\sigma^2}{n}$, so we can also use $\bar Y^2$ as an estimator of $\tau^2$. Replacing the parameters in $\hat\mu_B$ by their respective estimators ($\tau^2$ by $\bar Y^2$ and $\frac{\sigma^2}{n}$ by $\frac{S^2}{n}$), we get again the estimator (2).

The estimator (2) can also be obtained using a mixture. Let us suppose that, in the model $Y_i = \mu + \varepsilon_i$, $Y_i \mid \mu \sim \mathrm{normal}(\mu, \sigma^2)$ and $\mu$ has a prior distribution $\mathrm{normal}(0, \tau^2)$. We want to compute the mean square error

$$E\big[(c\bar Y - \mu)^2\big] = E\big[c^2\bar Y^2 - 2c\bar Y\mu + \mu^2\big],$$

considering both $\bar Y$ and $\mu$ as random variables. Computing these expected values, with

$$f_{\bar Y \mid \mu}(y) = \frac{1}{\sqrt{2\pi}\,\frac{\sigma}{\sqrt n}}\exp\Big(-\frac{n}{2\sigma^2}(y - \mu)^2\Big), \qquad f(\mu) = \frac{1}{\sqrt{2\pi}\,\tau}\exp\Big(-\frac{\mu^2}{2\tau^2}\Big),$$

$$E\big[c^2\bar Y^2 - 2c\bar Y\mu + \mu^2\big] = \int\!\!\int \big(c^2 y^2 - 2cy\mu + \mu^2\big)\, f_{\bar Y \mid \mu}(y)\, f(\mu)\, d\mu\, dy.$$

Computing each integral:

$$E\big[c^2\bar Y^2\big] = c^2 \int\!\!\int y^2\, f_{\bar Y \mid \mu}(y)\, f(\mu)\, d\mu\, dy$$

$$= c^2 \int y^2\, \frac{1}{\sqrt{2\pi}\sqrt{\frac{\sigma^2}{n} + \tau^2}}\exp\left(-\frac{y^2}{2\big(\frac{\sigma^2}{n} + \tau^2\big)}\right) dy = c^2\Big(\frac{\sigma^2}{n} + \tau^2\Big),$$

$$E[\mu\bar Y] = \int\!\!\int \mu\, y\, f_{\bar Y \mid \mu}(y)\, f(\mu)\, dy\, d\mu = \int \mu\,\frac{1}{\sqrt{2\pi}\,\tau}\exp\Big(-\frac{\mu^2}{2\tau^2}\Big)\underbrace{E[\bar Y \mid \mu]}_{\mu}\, d\mu = E[\mu^2] = \operatorname{Var}[\mu] + \underbrace{\big(E[\mu]\big)^2}_{0} = \tau^2.$$

All together:

$$E\big[c^2\bar Y^2 - 2c\bar Y\mu + \mu^2\big] = c^2\Big(\frac{\sigma^2}{n} + \tau^2\Big) - 2c\tau^2 + \tau^2.$$

So, the minimum value of the mean squared error $E\big[(c\bar Y - \mu)^2\big]$ is obtained when

$$c = \frac{\tau^2}{\dfrac{\sigma^2}{n} + \tau^2}.$$

Then, the minimum MSE estimator for $\mu$ is

$$\hat\mu = \frac{\tau^2}{\dfrac{\sigma^2}{n} + \tau^2}\,\bar Y.$$

Replacing the parameters by their estimators we, again, get the estimator (2).

5 A Bayesian interpretation of ridge regression estimators

Consider the multiple linear regression

$$Y = X\beta + \varepsilon, \qquad \varepsilon \sim N(0, \sigma^2 I).$$

We will suppose $\sigma^2$ known and a prior distribution for the parameter vector $\beta$ given by a $p$-variate normal,

$$\beta \sim N_p\Big(0,\ \frac{\sigma^2}{\lambda} I\Big).$$

The posterior distribution given the data $Y$ and the design matrix $X$ is

$$f_\beta(\beta \mid \sigma^2, Y, X, \lambda) \propto f_Y(Y \mid X, \beta, \sigma^2)\, f_\beta(\beta \mid \sigma^2, \lambda)$$
$$\propto \exp\Big(-\frac{1}{2\sigma^2}(Y - X\beta)'(Y - X\beta)\Big)\exp\Big(-\frac{\lambda}{2\sigma^2}\beta'\beta\Big)$$
$$= \exp\Big(-\frac{1}{2\sigma^2}\big[(Y - X\beta)'(Y - X\beta) + \lambda\beta'\beta\big]\Big)$$
$$= \exp\Big(-\frac{1}{2\sigma^2}\big[Y'Y - 2\beta'X'Y + \beta'(X'X + \lambda I)\beta\big]\Big).$$

To make the distribution explicit it is necessary to complete the square:

$$(Y - X\beta)'(Y - X\beta) + \lambda\beta'\beta = Y'Y - 2\beta'X'Y + \beta'(X'X + \lambda I)\beta.$$

So, we need to determine $W$ such that the terms in $\beta$ match those of

$$(\beta - W)'(X'X + \lambda I)(\beta - W) = \beta'(X'X + \lambda I)\beta - 2\beta'(X'X + \lambda I)W + W'(X'X + \lambda I)W.$$

This requires $(X'X + \lambda I)W = X'Y$, that is, $W = (X'X + \lambda I)^{-1}X'Y$. Then

$$(Y - X\beta)'(Y - X\beta) + \lambda\beta'\beta = (\beta - W)'(X'X + \lambda I)(\beta - W) + Y'Y - W'(X'X + \lambda I)W.$$

Therefore,

$$f_\beta(\beta \mid \sigma^2, Y, X, \lambda) \propto \exp\Big(-\frac{1}{2\sigma^2}\big[(\beta - W)'(X'X + \lambda I)(\beta - W) + Y'Y - W'(X'X + \lambda I)W\big]\Big) \propto \exp\Big(-\frac{1}{2\sigma^2}(\beta - W)'(X'X + \lambda I)(\beta - W)\Big).$$

The posterior is a multivariate normal with mean $W = (X'X + \lambda I)^{-1}X'Y$, therefore the Bayes estimator is

$$\hat\beta(\lambda) = (X'X + \lambda I)^{-1}X'Y.$$

This is the ridge regression estimator with ridge parameter $\lambda$ (HOERL; KENNARD, 1970). For $\lambda = 0$ we have the ordinary least squares estimator and, for $\lambda > 0$, $\|\hat\beta(\lambda)\| < \|\hat\beta(0)\|$, so $\hat\beta(\lambda)$ is a shrinkage estimator.

6 The James-Stein estimator as an empirical Bayes estimator

Consider the random normal vector $X = (X_1, \ldots, X_p)$, $X \sim N_p(\theta, I)$, and suppose independent normal prior distributions for the parameters: $\theta_i \sim N(0, \sigma^2)$. If one observation $X = (x_1, \ldots, x_p)$ is taken, the Bayes estimator is

$$\hat\theta = \frac{\sigma^2}{\sigma^2 + 1}\,X.$$

As $\sigma^2$ is unknown, using an empirical Bayes procedure it can be estimated from the data $X_1, \ldots, X_p$. The predictive distribution is a multivariate independent normal with variance $(\sigma^2 + 1)I$, namely $N_p\big(0, (\sigma^2 + 1)I\big)$. Consider the random variables

$$Z_i = \frac{X_i}{\sqrt{\sigma^2 + 1}}, \quad Z_i \sim N(0, 1), \qquad \sum_{i=1}^{p} Z_i^2 = \frac{X'X}{\sigma^2 + 1}.$$

As a sum of squares of independent standard normals, it has a chi-square distribution with $p$ degrees of freedom, that is, $Y = \dfrac{X'X}{\sigma^2 + 1} \sim \chi^2(p)$. As $X'X = (\sigma^2 + 1)Y$, taking expectation,

$$E\left[\frac{1}{X'X}\right] = \frac{1}{\sigma^2 + 1}\,E\left[\frac{1}{Y}\right].$$

As

$Y \sim \mathrm{Gamma}\big(\alpha = \frac{p}{2},\ \beta = \frac{1}{2}\big)$,

$$E\left[\frac{1}{Y}\right] = \int_0^\infty \frac{1}{y}\,\underbrace{\frac{\big(\frac{1}{2}\big)^{\frac{p}{2}}}{\Gamma\big(\frac{p}{2}\big)}\, y^{\frac{p}{2}-1}e^{-\frac{y}{2}}}_{f_Y(y)}\, dy = \frac{\frac{1}{2}\,\Gamma\big(\frac{p}{2}-1\big)}{\Gamma\big(\frac{p}{2}\big)}\underbrace{\int_0^\infty \frac{\big(\frac{1}{2}\big)^{\frac{p}{2}-1}}{\Gamma\big(\frac{p}{2}-1\big)}\, y^{\frac{p}{2}-2}e^{-\frac{y}{2}}\, dy}_{\mathrm{Gamma}\left(\frac{p}{2}-1,\ \frac{1}{2}\right)\ =\ 1} = \frac{\frac{1}{2}\,\Gamma\big(\frac{p}{2}-1\big)}{\Gamma\big(\frac{p}{2}\big)} = \frac{1}{p-2}.$$

It follows that

$$E\left[\frac{1}{X'X}\right] = \frac{1}{(\sigma^2 + 1)(p - 2)} \quad\text{and, therefore,}\quad E\left[\frac{p-2}{X'X}\right] = \frac{1}{\sigma^2 + 1}.$$

By the method of moments we can estimate $\dfrac{1}{\sigma^2 + 1}$ by $\dfrac{p-2}{X'X}$. As

$$\hat\theta = \frac{\sigma^2}{\sigma^2 + 1}\,x = \Big(1 - \frac{1}{\sigma^2 + 1}\Big)x,$$

replacing $\dfrac{1}{\sigma^2 + 1}$ by $\dfrac{p-2}{X'X}$, the result is the famous James-Stein estimator (JAMES; STEIN, 1961):

$$(\hat\theta_{JS})_i = \Big(1 - \frac{p-2}{X'X}\Big)X_i, \qquad \text{in vector form,}\quad \hat\theta_{JS} = \Big(1 - \frac{p-2}{X'X}\Big)X.$$

Observe that the James-Stein estimator is a shrinkage estimator for $p > 2$.
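Minimal numerical sketches of the two estimators derived in sections 5 and 6. The design matrix, the true coefficient vector and the true mean vector are assumptions chosen only for illustration: the ridge fit shrinks the norm of the least squares estimate, and the James-Stein estimator dominates $X$ in total squared error for $p > 2$.

```python
import numpy as np

rng = np.random.default_rng(3)

# --- Ridge: beta_hat(lambda) = (X'X + lambda I)^{-1} X'y ---
def ridge(X, y, lam):
    """Posterior mean under the N(0, (sigma^2 / lambda) I) prior."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

X = rng.standard_normal((50, 4))
beta_true = np.array([1.0, -1.0, 2.0, 0.0])      # assumed true coefficients
y = X @ beta_true + rng.standard_normal(50)

b_ols = ridge(X, y, 0.0)       # lambda = 0: ordinary least squares
b_ridge = ridge(X, y, 5.0)     # lambda > 0: norm strictly smaller than OLS
print(np.linalg.norm(b_ols), np.linalg.norm(b_ridge))

# --- James-Stein: theta_JS = (1 - (p - 2) / x'x) x, for p > 2 ---
def james_stein(x):
    p = len(x)
    return (1.0 - (p - 2) / (x @ x)) * x

p = 10
theta = np.linspace(-1.0, 1.0, p)                # assumed true mean vector
reps = 20_000
Xs = theta + rng.standard_normal((reps, p))      # X ~ N_p(theta, I)

mse_mle = ((Xs - theta) ** 2).sum(axis=1).mean()          # approx. p
mse_js = np.mean([((james_stein(x) - theta) ** 2).sum() for x in Xs])
print(mse_mle, mse_js)   # mse_js comes out below mse_mle
```

The ridge comparison is deterministic given the data; the James-Stein comparison is a Monte Carlo estimate of the risks.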

7 A non-normal example

In this example (CASELLA; BERGER, 2010), suppose $X_1, \ldots, X_n$ i.i.d. $\mathrm{Bernoulli}(p)$. It is well known that the maximum likelihood and unbiased estimator for the parameter $p$ is $\hat p = \bar X = \frac{1}{n}\sum X_i$. The MSE for $\hat p$ is

$$E\big[(\hat p - p)^2\big] = \operatorname{var}[\hat p] = \operatorname{var}[\bar X] = \frac{p(1-p)}{n}.$$

To compute the Bayes estimator we consider that $Y = \sum X_i$ is $\mathrm{Binomial}(n, p)$, and use a $\mathrm{Beta}(\alpha, \beta)$ as the prior for $p$. Then, the posterior is given by:

$$\pi(p \mid y) = \frac{p^{\,y+\alpha-1}(1-p)^{\,n-y+\beta-1}}{B(y+\alpha,\ n-y+\beta)}.$$

So, the posterior is a $\mathrm{Beta}(y+\alpha,\ n-y+\beta)$ and the Bayes estimator for the parameter is its mean:

$$\hat p_B = \frac{y + \alpha}{\alpha + \beta + n}.$$

The mean squared error for this estimator is:

$$E\big[(\hat p_B - p)^2\big] = \operatorname{Var}[\hat p_B] + \big(E[\hat p_B] - p\big)^2 = \frac{np(1-p) + \big(\alpha - (\alpha+\beta)p\big)^2}{(\alpha + \beta + n)^2}.$$

Observe that, taking $\alpha = \beta = \dfrac{\sqrt n}{2}$, we get that the MSE is constant, that is, it does not depend on the parameter $p$. In this case the estimator is

$$\hat p_B = \frac{y + \frac{\sqrt n}{2}}{n + \sqrt n}$$

(MOOD et al., 1974, theorem 4, page 350), and it is minimax.
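A short sketch verifying numerically that with $\alpha = \beta = \sqrt n / 2$ the MSE of $\hat p_B$ is constant in $p$ (the sample size is an assumed example value):

```python
import numpy as np

def mse_bayes(n: int, p: float) -> float:
    """Exact MSE of p_hat_B = (Y + sqrt(n)/2) / (n + sqrt(n)), Y ~ Binomial(n, p)."""
    a = b = np.sqrt(n) / 2
    var = n * p * (1 - p) / (a + b + n) ** 2        # Var[p_hat_B]
    bias = (a + n * p) / (a + b + n) - p            # E[p_hat_B] - p
    return var + bias**2

n = 25
for p in (0.1, 0.3, 0.5, 0.9):
    print(p, mse_bayes(n, p))    # the same value for every p

print(n / (4 * (n + np.sqrt(n)) ** 2))   # the constant n / (4 (n + sqrt(n))^2)
```

Here $n = 25$ gives $n/(4(n+\sqrt n)^2) = 25/3600 \approx 0.00694$, matching every value in the loop.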

Indeed,

$$E\big[(\hat p_B - p)^2\big] = \frac{np(1-p) + n\big(\frac{1}{2} - p\big)^2}{(n + \sqrt n)^2} = \frac{n\big(p - p^2 + \frac{1}{4} - p + p^2\big)}{(n + \sqrt n)^2} = \frac{n}{4(n + \sqrt n)^2}.$$

Now, as $E\big[(\hat p - p)^2\big] = \dfrac{p(1-p)}{n}$ and $E\big[(\hat p_B - p)^2\big] = \dfrac{n}{4(n + \sqrt n)^2}$, Figure 1 helps to choose between these two estimators.

Figure 1 - Comparison of MSE of $\hat p$ (blue) and $\hat p_B$ (red) for different sample sizes.

For small values of $n$, clearly $\hat p_B$ is a better choice, except if there is a strong reason to believe that $p$ is near 0 or 1. For high values of $n$, the choice is the estimator $\hat p$, except under a strong belief that $p$ is near $1/2$.

8 Comparing the estimator $\hat\mu_2$ and the pseudo-estimator $\hat\mu_1$

To emphasize the level of complexity implied by the shrinking procedure we compare the behavior of the pseudo-estimator

$$\hat\mu_1 = \frac{\mu^2}{\dfrac{\sigma^2}{n} + \mu^2}\,\bar Y$$

and the shrinkage estimator

$$\hat\mu_2 = \frac{\bar Y^2}{\dfrac{S^2}{n} + \bar Y^2}\,\bar Y.$$

Considering the Normal, Uniform and Double Exponential distributions, we compute the mean squared error using 10000 samples of sizes 20, 40 and 100, with variances 4, 16 and 36. The results are shown in Figures 2, 3 and 4.

Figure 2 - MSE for $\bar Y$ (blue solid), $\hat\mu_1$ (red dotted) and $\hat\mu_2$ (green long dash) for the Normal distribution with different variances and sample sizes.

Figure 3 - MSE for $\bar Y$ (blue solid), $\hat\mu_1$ (red dotted) and $\hat\mu_2$ (green long dash) for the Uniform distribution with different variances and sample sizes.

Figure 4 - MSE for $\bar Y$ (blue solid), $\hat\mu_1$ (red dotted) and $\hat\mu_2$ (green long dash) for the Double Exponential distribution with different variances and sample sizes.
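A reduced sketch of this simulation, for one distribution only and a smaller grid of real means (the original used 10000 samples over three distributions; replication counts and grid values here are assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)

def mu_hat_1(y, mu, sigma2):
    """Pseudo-estimator (1): uses the true mu and sigma^2."""
    n = len(y)
    return mu**2 / (sigma2 / n + mu**2) * y.mean()

def mu_hat_2(y):
    """Shrinkage estimator (2): plug-in version using ybar and S^2."""
    n = len(y)
    ybar = y.mean()
    return ybar**2 / (y.var(ddof=1) / n + ybar**2) * ybar

def compare(mu, n=20, sigma2=4.0, reps=5_000):
    """Monte Carlo MSE of ybar, mu_hat_1 and mu_hat_2 at a given real mean."""
    Y = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
    mse_bar = ((Y.mean(axis=1) - mu) ** 2).mean()
    mse_1 = np.mean([(mu_hat_1(y, mu, sigma2) - mu) ** 2 for y in Y])
    mse_2 = np.mean([(mu_hat_2(y) - mu) ** 2 for y in Y])
    return mse_bar, mse_1, mse_2

for mu in (-4.0, -1.0, 0.0, 1.0, 4.0):      # grid of "real mean" values
    print(mu, [round(v, 4) for v in compare(mu)])
```

Near the origin both shrinkage estimators beat $\bar Y$; away from it, only the pseudo-estimator keeps its advantage, which is the behavior the figures display.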

About Figures 2, 3 and 4, observe that the pseudo-estimator $\hat\mu_1$ behaves as it is supposed to, that is, except for some fluctuation when the real mean is far from the origin, it has MSE lower than that of the estimator $\bar Y$. On the other hand, the estimator $\hat\mu_2$ has a complex behavior, and only for the real mean near the origin does it have MSE lower than that of $\bar Y$. We could not find or guess a good reason for that strange behavior.

Conclusion

The shrinkage procedure shows itself to be fully compatible with Bayesian and empirical Bayesian methods. The major problem, when shrinking an estimator, is to decide the amount of reduction to be applied. The Bayesian method can help, giving an interpretation of the necessary reduction to be used. The complexity introduced by shrinkage is far from being completely understood.

RIZZO, F. N.; GAJO, C. A.; SOUZA, D. J.; CHAVES, L. M. A Bayesian approach to shrinkage estimators. Rev. Bras. Biom., São Paulo, v.33, n.4, p.485-601, 2015.

ABSTRACT (translated from the Portuguese): Estimators obtained by shrinking the usual least squares estimator have been widely used since the works of James-Stein (1961) and of Hoerl and Kennard (1970) on ridge estimators. In this work, we present an approach to some important shrinkage estimators, using Bayesian and empirical Bayesian procedures.

KEYWORDS: Shrinkage estimator; empirical Bayes; least squares estimator.

References

CARLIN, B. P.; LOUIS, T. A. Bayes and empirical Bayes methods for data analysis. New York: Chapman & Hall/CRC, 2000.

CASELLA, G.; BERGER, R. L. Inferência estatística. 2.ed. São Paulo: Cengage Learning, 2010. 573p.

GRUBER, M. H. J. Improving efficiency by shrinkage: the James-Stein and ridge regression estimators. New York: CRC, 1998. 648p.

GRUBER, M. H. J. Regression estimators: a comparative study. Baltimore: The Johns Hopkins Press/CRC, 2010.

HOERL, A. E.; KENNARD, R. W. Biased estimation for nonorthogonal problems. Technometrics, v.12, n.1, p.55-67, 1970.

JAMES, W.; STEIN, C. Estimation with quadratic loss. In: Proc. 4th Berkeley Sympos. Math. Statist. and Prob., v.1, Berkeley: Univ. of California Press, 1961. p.361-379.

MOOD, A. M.; GRAYBILL, F. A.; BOES, D. C. Introduction to the theory of statistics. 3rd ed. Columbus: McGraw-Hill, 1974. 564p.

TIBSHIRANI, R. Regression shrinkage and selection via the Lasso. J. R. Statist. Soc. B, v.58, n.1, p.267-288, 1996.

ZOU, H.; HASTIE, T. Regularization and variable selection via the elastic net. J. R. Statist. Soc. B, v.67, n.2, p.301-320, 2005.

Received in 05/2015. Approved after revision in 08/2015.

Appendix. The predictive distribution for a normal likelihood with a normal prior on the mean:

$$f(y) = \int f(y \mid \mu)\,\pi(\mu)\,d\mu = \int \frac{1}{\sqrt{2\pi}\,\sigma}\exp\Big(-\frac{(y-\mu)^2}{2\sigma^2}\Big)\,\frac{1}{\sqrt{2\pi}\,\tau}\exp\Big(-\frac{\mu^2}{2\tau^2}\Big)\, d\mu$$

$$= \frac{1}{2\pi\sigma\tau}\int \exp\left(-\frac{\tau^2(y-\mu)^2 + \sigma^2\mu^2}{2\sigma^2\tau^2}\right) d\mu.$$

Completing the square in $\mu$,

$$\tau^2(y-\mu)^2 + \sigma^2\mu^2 = (\sigma^2 + \tau^2)\Big(\mu - \frac{\tau^2 y}{\sigma^2 + \tau^2}\Big)^2 + \frac{\sigma^2\tau^2}{\sigma^2 + \tau^2}\,y^2,$$

so

$$f(y) = \frac{1}{2\pi\sigma\tau}\exp\left(-\frac{y^2}{2(\sigma^2 + \tau^2)}\right)\underbrace{\int \exp\left(-\frac{\sigma^2 + \tau^2}{2\sigma^2\tau^2}\Big(\mu - \frac{\tau^2 y}{\sigma^2 + \tau^2}\Big)^2\right) d\mu}_{\sqrt{2\pi}\,\frac{\sigma\tau}{\sqrt{\sigma^2 + \tau^2}}} = \frac{1}{\sqrt{2\pi}\sqrt{\sigma^2 + \tau^2}}\exp\left(-\frac{y^2}{2(\sigma^2 + \tau^2)}\right).$$

That is, the predictive distribution is normal with mean zero and variance $\sigma^2 + \tau^2$. For the sample mean $\bar Y$, replacing $\sigma^2$ by $\sigma^2/n$ gives a predictive $\mathrm{normal}\big(0,\ \frac{\sigma^2}{n} + \tau^2\big)$.
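A quick Monte Carlo sketch confirming the predictive variance $\sigma^2 + \tau^2$ (the values of $\sigma$ and $\tau$ are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
sigma, tau = 2.0, 3.0
reps = 200_000

# Two-stage sampling from the predictive: mu ~ N(0, tau^2), then Y | mu ~ N(mu, sigma^2)
mu = rng.normal(0.0, tau, size=reps)
y = rng.normal(mu, sigma)

print(y.mean())   # close to 0
print(y.var())    # close to sigma^2 + tau^2 = 13
```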