ACTA UNIVERSITATIS LODZIENSIS, FOLIA OECONOMICA 3(...)

Oleksii Doronin*, Rostislav Maiboroda**

GEE ESTIMATORS IN MIXTURE MODEL WITH VARYING CONCENTRATIONS

* Ph.D. student, Department of Probability Theory, Statistics and Actuarial Mathematics, Faculty of Mechanics and Mathematics, Taras Shevchenko National University of Kyiv.
** Ph.D., Department of Probability Theory, Statistics and Actuarial Mathematics, Faculty of Mechanics and Mathematics, Taras Shevchenko National University of Kyiv.

Abstract. We discuss a semiparametric mixture model where some components are parameterized with a common Euclidean parameter and others are fully unknown. We introduce the GEE (generalized estimating equations) approach and an adaptive GEE-based approach for parameter estimation. The derived estimators are consistent and asymptotically normal, and they are optimized in terms of their dispersion matrices. The proposed techniques are tested on simulated samples.

Keywords: mixture model, semiparametric estimation, GEE.

1. INTRODUCTION

The cumulative distribution function (CDF) of one observation in a mixture model is expressed by a linear combination of some CDFs $F_1, \dots, F_M$ with probabilities $p_1, \dots, p_M$, $\sum_{k=1}^{M} p_k = 1$, i.e. $F(x) = \sum_{k=1}^{M} p_k F_k(x)$. Note that $F_k$ is called the CDF of the $k$-th mixture component, and $p_k$ the component concentration. In a mixture model with varying concentrations $p_k$ depends on the observation index $j$: $p_k = p_k^j$, $j = 1, \dots, N$. Thus,

$$F_j(x) = \sum_{k=1}^{M} p_k^j F_k(x), \qquad \sum_{k=1}^{M} p_k^j = 1.$$

We consider the case when some parametric model is known for the first $m$ components: $F_k(x) = F_k(x; t)$, $k = 1, \dots, m$. The parameter $t$ is assumed to be Euclidean: $t \in \Theta \subseteq \mathbb{R}^d$. The true value of $t$ is designated by $\theta$ and assumed to be unknown. The CDFs of the last $M - m$ mixture components are assumed to be fully unknown. We also assume that the concentrations $p_k^j$ are known. Our goal is to estimate $\theta$. To do this, we derive consistent and asymptotically normal estimators and optimize them in terms of their dispersion matrices.
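To make the data-generating mechanism concrete, here is a minimal simulation sketch in Python (the paper itself uses Wolfram Mathematica only for its plots). The three Gaussian components and the recipe for the concentrations mirror the study in Section 7; the function names, the random seed and the sample size below are our own choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_concentrations(N, M=3):
    """Concentrations p_k^j, each row summing to one, generated as in
    Section 7: p_k^j = s_k^j / (s_1^j + ... + s_M^j), s_k^j ~ U[0, 1]."""
    s = rng.uniform(size=(N, M))
    return s / s.sum(axis=1, keepdims=True)

def sample_mixture(p, means=(-3.0, 3.0, 0.0), sds=(2.0, 2.0, 2.0)):
    """Draw xi_j from F_j(x) = sum_k p_k^j F_k(x) with Gaussian components."""
    N, M = p.shape
    comps = np.array([rng.choice(M, p=p[j]) for j in range(N)])
    return rng.normal(np.take(means, comps), np.take(sds, comps))

N = 1000
p = make_concentrations(N)      # known concentrations p_k^j, shape (N, 3)
xi = sample_mixture(p)          # observed sample xi_1, ..., xi_N
```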

2. NONPARAMETRIC ESTIMATE FOR DISTRIBUTION FUNCTION

The CDF of the $k$-th component may be estimated through the weighted empirical distribution function

$$\hat F_k(x) := \frac{1}{N} \sum_{j=1}^{N} a_k^j \, \mathbb{I}\{\xi_j \le x\}.$$

The weights are taken as the solution of the problem of minimizing the maximal variance of unbiased estimates of $F_k(x)$ over all possible CDFs $F_1, \dots, F_M$, i.e.

$$a_k^j = \big(\Gamma^{-1} e_k\big)^T p^j, \qquad \Gamma := \frac{1}{N} \sum_{j=1}^{N} p^j (p^j)^T, \qquad p^j := (p_1^j, \dots, p_M^j)^T, \qquad e_k := \big(\mathbb{I}\{i = k\}\big)_{i=1,\dots,M}.$$

See Maiboroda et al. (2008) for details. Note that the weights $a_k^j$ can be negative. Thus, we can improve $\hat F_k(x)$ by introducing the improved empirical distribution function (see Maiboroda et al. (2005)):

$$\hat F_k^{+}(x) := \min\Big(1, \max_{y \le x} \hat F_k(y)\Big).$$

3. GEE ESTIMATE

Consider some set of measurable functions $g^1(\cdot\,; t), \dots, g^m(\cdot\,; t)$ taking values in $\mathbb{R}^d$. The theoretical moment $\bar g^k(t) := \int g^k(x; t)\, F_k(dx)$ may be estimated by the weighted empirical moment

$$\hat g^k(t) := \frac{1}{N} \sum_{j=1}^{N} a_k^j \, g^k(\xi_j; t).$$

Define the joint weighted empirical moment as

$$\hat g(t) := \sum_{k=1}^{m} \hat g^k(t).$$
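The sketch below implements the estimators of Section 2; the explicit weight formula it uses is the minimax solution stated above (an assumption drawn from Maiboroda et al. (2008)), and the helper names are ours. The same weights $a_k^j$ are reused by the weighted empirical moments of Section 3.

```python
import numpy as np

def minimax_weights(p):
    """a_k^j = (Gamma^{-1} e_k)^T p^j with Gamma = (1/N) sum_j p^j (p^j)^T.
    p has shape (N, M); the returned array has a[j, k] = a_k^j."""
    N, _ = p.shape
    Gamma = p.T @ p / N
    return p @ np.linalg.inv(Gamma)

def weighted_ecdf(x_grid, xi, a_k):
    """F_hat_k on a grid: (1/N) sum_j a_k^j 1{xi_j <= x}."""
    x = np.atleast_1d(np.asarray(x_grid, dtype=float))
    ind = (xi[None, :] <= x[:, None]).astype(float)
    return ind @ a_k / xi.size

def improved_ecdf(x_grid, xi, a_k):
    """Improved estimator min(1, max_{y <= x} F_hat_k(y)) on a sorted grid."""
    f = weighted_ecdf(x_grid, xi, a_k)
    return np.minimum(1.0, np.maximum.accumulate(f))

# Example (with p, xi from the previous sketch):
# a = minimax_weights(p)
# grid = np.sort(xi)
# F1_plus = improved_ecdf(grid, xi, a[:, 0])
```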

Definition. The GEE estimator $\hat\theta$ of $\theta$ is a measurable function of the sample $\xi_1, \dots, \xi_N$ such that $\hat g(\hat\theta) = 0$. In what follows we assume that $P[\exists\, t:\ \hat g(t) = 0] \to 1$ as $N \to \infty$.

Example. Moment estimators can be represented as GEE estimators. Let $h^1, \dots, h^m$ be a set of estimating functions. Denote the theoretical moment of $h^k(x)$ by $H^k(t) := \int h^k(x)\, F_k(dx; t)$, $k = 1, \dots, m$. Define the estimating functions as $g^k(x; t) := h^k(x) - H^k(t)$, $k = 1, \dots, m$. The GEE estimator can then be represented as $\hat\theta = H^{-1}(\hat h)$, where $\hat h := \sum_{k=1}^{m} \hat h^k$, $\hat h^k := \int h^k(x)\, \hat F_k(dx)$, $H(t) := \sum_{k=1}^{m} H^k(t)$, and $H^{-1}$ is the inverse function of $H$. An analogous improved moment estimate based on $\hat F_k^{+}$ can be introduced. Consistency of the moment estimators is shown in Theorem 3.1 of Doronin (2014a).
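As an illustration of the moment-type GEE of the Example, the sketch below solves $\hat g(t) = \sum_k [\hat h^k - H^k(t)] = 0$ numerically for the Gaussian setup of Section 7 with $\theta = (\mu_1, \mu_2, \sigma)$. The particular choice of $h^1, h^2$ (first and second moments routed to the relevant coordinates) is our own illustration rather than anything prescribed by the paper, and a generic root finder stands in for the closed-form $H^{-1}$.

```python
import numpy as np
from scipy.optimize import root

# Illustrative moment functions for theta = (mu1, mu2, sigma):
#   h^1(x) = (x, 0, x^2),  h^2(x) = (0, x, x^2),
# so H^1(t) = (mu1, 0, mu1^2 + sigma^2), H^2(t) = (0, mu2, mu2^2 + sigma^2).
def h_moments(xi, a):
    """Weighted empirical moments h_hat^k = (1/N) sum_j a_k^j h^k(xi_j)."""
    h1 = np.stack([xi, np.zeros_like(xi), xi**2], axis=1)
    h2 = np.stack([np.zeros_like(xi), xi, xi**2], axis=1)
    return a[:, 0] @ h1 / xi.size, a[:, 1] @ h2 / xi.size

def H(t):
    mu1, mu2, sigma = t
    return (np.array([mu1, 0.0, mu1**2 + sigma**2]),
            np.array([0.0, mu2, mu2**2 + sigma**2]))

def gee_estimate(xi, a, t0=(0.0, 0.0, 1.0)):
    """Solve g_hat(t) = sum_k (h_hat^k - H^k(t)) = 0 by a numerical root finder."""
    h_hat1, h_hat2 = h_moments(xi, a)
    g_hat = lambda t: (h_hat1 - H(t)[0]) + (h_hat2 - H(t)[1])
    sol = root(g_hat, x0=np.asarray(t0, dtype=float))
    return sol.x, g_hat

# Example (with xi, a from the earlier sketches):
# theta_hat, g_hat = gee_estimate(xi, a)
```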

4. ASYMPTOTICS OF GEE ESTIMATOR

Assume that the CDFs $F_1, \dots, F_M$ are absolutely continuous with respect to a sigma-finite measure $\mu$ on the space of observations. Denote the densities of the components' distributions by $f_k(x) := \frac{dF_k(x;\theta)}{d\mu(x)}$ for $k = 1, \dots, m$ and $f_k(x) := \frac{dF_k(x)}{d\mu(x)}$ for $k = m+1, \dots, M$. Introduce the matrix of estimating functions

$$G(x) := \begin{pmatrix} g^1(x; \theta)^T \\ \vdots \\ g^m(x; \theta)^T \end{pmatrix} \in \mathbb{R}^{m \times d}.$$

The expectation of $G(x)$ over the $k$-th component is designated by $\bar G_k := \int G(x)\, F_k(dx)$, $k = 1, \dots, M$. Introduce the following notation (assuming that the limits exist):

$$\gamma^{l}_{r,s} := \lim_{N\to\infty} \frac{1}{N} \sum_{j=1}^{N} a_r^j a_s^j p_l^j, \qquad
\delta^{l,i}_{r,s} := \lim_{N\to\infty} \frac{1}{N} \sum_{j=1}^{N} a_r^j a_s^j p_l^j p_i^j, \qquad r, s = 1, \dots, m,\ \ l, i = 1, \dots, M,$$

$$\gamma^{l} := \big(\gamma^{l}_{r,s}\big)_{r,s=1}^{m}, \qquad \Delta^{l,i} := \big(\delta^{l,i}_{r,s}\big)_{r,s=1}^{m}, \qquad
R(x) := \sum_{l=1}^{M} \gamma^{l} f_l(x),$$

$$Z := \int G(x)^T R(x)\, G(x)\, \mu(dx) \;-\; \sum_{l,i=1}^{M} \bar G_l^T\, \Delta^{l,i}\, \bar G_i \ \in \mathbb{R}^{d \times d},
\qquad
V := \sum_{k=1}^{m} \int \frac{\partial g^k(x; t)}{\partial t^T}\bigg|_{t=\theta} F_k(dx) \ \in \mathbb{R}^{d \times d}.$$

Theorem 4.1 (Theorem 3.4 from Doronin (2014a)). Let $\hat\theta$ be the GEE estimator in the introduced setting, and let $U$ be some open neighbourhood of the true parameter value $\theta$. Assume the following:

(i) $\hat\theta$ converges in probability to $\theta$ as $N \to \infty$.

(ii) For each $k = 1, \dots, m$ the derivatives $g'^k := \partial g^k / \partial t^T$ exist and are integrable, i.e. $\mathrm{E}_t\big[\lVert g'^k(\xi^k; t)\rVert\big] < \infty$ for $t \in U$, where $\mathrm{E}_t$ denotes expectation under the condition that the true parameter value is $t$, and $\xi^k$ are formal random variables with distributions $F_k$.

(iii) The functions $\bar g^k(t) := \mathrm{E}_\theta\big[g^k(\xi^k; t)\big]$ are continuous on $U$.

(iv) $\mathrm{E}_\theta\big[\sup_{t \in U} \lVert g'^k(\xi^k; t)\rVert\big] < \infty$.

(v) The limit matrix $\Gamma := \lim_{N\to\infty} \frac{1}{N} \sum_{j=1}^{N} p^j (p^j)^T$ exists and is nonsingular.

(vi) The matrices $\gamma^{l}$ and $\Delta^{l,i}$ exist.

(vii) The matrix $V$ is nonsingular.

(viii) The GEE is unbiased, i.e. $\sum_{k=1}^{m} \mathrm{E}_t\big[g^k(\xi^k; t)\big] = 0$ for $t \in U$.

Then $\sqrt{N}\,(\hat\theta - \theta)$ converges in distribution to a Gaussian law with zero mean and covariance matrix $V^{-1} Z\, V^{-T}$.
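The covariance in Theorem 4.1 has the sandwich form $V^{-1} Z V^{-T}$. A plug-in estimate of the "bread" matrix $V$ can be obtained by numerically differentiating the joint weighted empirical moment $\hat g$ at $\hat\theta$, as in the small generic sketch below (a full plug-in estimate of $Z$ would additionally require the coefficients $\gamma$, $\delta$ and estimates of the component densities, which we do not attempt here). The helper name and the step size are our own.

```python
import numpy as np

def jacobian_fd(fun, t, eps=1e-5):
    """Central finite-difference Jacobian of fun: R^d -> R^d at the point t.
    Applied to g_hat from the GEE sketch at theta_hat, this yields a
    plug-in estimate V_hat of the matrix V from Section 4."""
    t = np.asarray(t, dtype=float)
    d = t.size
    J = np.zeros((d, d))
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        J[:, i] = (np.asarray(fun(t + e)) - np.asarray(fun(t - e))) / (2.0 * eps)
    return J

# Example (with g_hat, theta_hat from the GEE sketch):
# V_hat = jacobian_fd(g_hat, theta_hat)
```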

5. LOWER BOUND OF DISPERSION MATRIX FOR GEE ESTIMATOR

Assume that the matrix $Z$ and the nonsingular matrix $V$ exist. Without loss of generality we can assume that two conditions on the GEE estimator $\hat\theta$ are fulfilled:

(i1) $\int g^k(x; \theta)\, F_k(dx) = 0$, $k = 1, \dots, m$ (unbiasedness);

(i2) the matrix $V$ is the unit matrix.

Consider the problem of minimizing the dispersion matrix $Z$ in the Loewner ordering (i.e. $A \le B$ if $B - A$ is non-negative definite) over all estimating functions $g^1, \dots, g^m$ satisfying conditions (i1) and (i2). Thus, we have to minimize $c^T Z c$ for all $c \in \mathbb{R}^d$. The solution of this problem is a set of estimating functions $g_*^k(x; \theta)$ which gives us the lower bound $Z_*$ of the dispersion matrix (see Theorem 4.1 from Doronin (2014a)).

6. ADAPTIVE ESTIMATE

Unfortunately, it is impossible to use the optimal estimating functions $g_*^k$, which attain the lower bound of the dispersion matrix, in practice. The first reason is that they depend on the unknown densities $f_k(x)$. The second one is the difficulty of solving the GEE in the general case. Therefore, we consider an adaptive approach.

Each function $g^k(\cdot\,; t)$ can be approximated by $B_k u_k(\cdot\,; t)$, where $B_k \in \mathbb{R}^{d \times L}$ is some matrix of coefficients to be found and $u_k(\cdot\,; t)$ is a vector of $L$ predefined basis functions (e.g. B-splines). Under conditions (i1) and (i2) the equation $\hat g(t) = 0$ can be approximated (linearizing at the true value, with $V$ equal to the unit matrix) by

$$\hat g(t) \approx \sum_{k=1}^{m} B_k \hat u_k(\theta) + (t - \theta), \qquad \hat u_k(t) := \frac{1}{N} \sum_{j=1}^{N} a_k^j\, u_k(\xi_j; t),$$

whose solution is $t = \theta - \sum_{k=1}^{m} B_k \hat u_k(\theta)$. Thus, one can start from some consistent estimate $\tilde\theta$ and define the adaptive estimate as

$$\hat\theta := \tilde\theta - \sum_{k=1}^{m} B_k \hat u_k(\tilde\theta).$$

Consistency and asymptotic normality of the introduced adaptive estimate are shown in Lemma 3.3 of Doronin (2014b).
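A minimal sketch of the adaptive one-step update, assuming a pilot estimate theta_tilde, the weights a from Section 2, and coefficient matrices B_k that are already chosen (in the paper they are picked to minimize the estimated dispersion matrix; here they are simply inputs). For brevity the basis u_k below is a small polynomial basis centred and scaled by the current parameter value, standing in for the cubic B-splines used in Section 7.

```python
import numpy as np

def basis_u(x, mu, sigma, L=4):
    """Illustrative basis vector u_k(x; t): powers of the standardized x.
    The paper uses uniform cubic B-splines with knots at mu_k + i*sigma_k instead."""
    z = (x - mu) / sigma
    return np.stack([z**q for q in range(1, L + 1)], axis=1)   # shape (N, L)

def adaptive_step(theta_tilde, xi, a, B_list, mus, sigmas):
    """One-step adaptive estimate: theta_hat = theta_tilde - sum_k B_k u_hat_k,
    with u_hat_k = (1/N) sum_j a_k^j u_k(xi_j; theta_tilde)."""
    theta_hat = np.asarray(theta_tilde, dtype=float).copy()
    for k, B_k in enumerate(B_list):                  # k = 0, ..., m-1
        u = basis_u(xi, mus[k], sigmas[k], L=B_k.shape[1])
        u_hat = a[:, k] @ u / xi.size                 # weighted empirical basis moment
        theta_hat -= B_k @ u_hat
    return theta_hat

# Example for theta = (mu1, mu2, sigma), with hypothetical B1, B2 of shape (3, 4):
# theta_hat = adaptive_step(theta_tilde, xi, a, [B1, B2],
#                           mus=[theta_tilde[0], theta_tilde[1]],
#                           sigmas=[theta_tilde[2], theta_tilde[2]])
```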

7. NUMERICAL RESULTS

We chose a three-component mixture model for the simulation. All components are taken Gaussian, with parameter values $(\mu_k, \sigma_k)$ equal to $(-3, 2)$, $(3, 2)$ and $(0, 2)$ for the three components, respectively. The first two components are assumed to be parameterized with $\theta = (\mu_1, \mu_2, \sigma)^T$ (different means, common standard deviation). The distribution of the third component is assumed to be fully unknown. The concentrations were also generated as pseudo-random values, derived by the formula $p_k^j = s_k^j / (s_1^j + s_2^j + s_3^j)$, where the $s_k^j$ are taken from the uniform distribution on $[0, 1]$. Series of samples with sizes $N$ = 50, 100, 250, 500, 750, 1000, 2000 and 5000 were simulated, with 2000 samples in each series. The vectors of basis functions $u_k$ for the adaptive estimate were chosen as sets of uniform cubic B-splines with knots at the points $\mu_k + i\sigma_k$, $i = -5, \dots, 5$, where $\mu_k$ and $\sigma_k$ are the mean and the standard deviation of the $k$-th component, respectively. The matrices $B_k$ were chosen to minimize the dispersion matrix. The results are shown in Figure 1.

CONCLUSIONS

The mixture model with varying concentrations is considered. Several estimators for this model are introduced (moment, GEE, adaptive). The proposed estimators are consistent and asymptotically normal under some conditions. The performance of the moment and adaptive estimators is compared on simulated samples. The dispersion of the introduced estimators converges to its theoretical asymptotic value for samples with 1000 and more observations.

REFERENCES

Doronin O. (2014a), Lower bound of dispersion matrix for semiparametric estimation in mixture model, "Theory of Probability and Mathematical Statistics", no. 90.
Doronin O. (2014b), Adaptive estimation in semiparametric model of mixture with varying concentrations, "Theory of Probability and Mathematical Statistics", no. 91.
Doronin O. (2012), Robust Estimates for Mixtures with Gaussian Component, "Bulletin of Taras Shevchenko National University of Kyiv. Series: Physics & Mathematics" (in Ukrainian), vol. 1.
Maiboroda R., Sugakova O. (2008), Estimation and classification by observations from mixtures, Kyiv University Publishers, Kyiv (in Ukrainian).
Maiboroda R., Kubaichuk O. (2005), Improved estimators for moments constructed from observations of a mixture, "Theory of Probability and Mathematical Statistics", no. 70.
Maiboroda R., Sugakova O., Doronin A. (2013), Generalized estimating equations for mixtures with varying concentrations, "The Canadian Journal of Statistics", vol. 41, no. 2.

[Figure 1: six panels showing MSE($\hat\theta_k$) and RobVar($\hat\theta_k$), $k = 1, 2, 3$, as functions of the sample size.]

Here MSE is the mean squared error of the parameter estimate multiplied by the number of observations, and RobVar is a robust estimate of the MSE based on the interquartile range of the parameter estimate. One plotting symbol marks the moment estimates (lower line for the improved and upper line for the unimproved version) and the adaptive estimates; white symbols mark the theoretical dispersion; a separate symbol marks the lower bound.

Figure 1. Dispersion of estimates
Source: plots are generated by Wolfram Mathematica using our own script.
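The sketch below computes the two performance measures described in the caption from a matrix of replicated estimates; converting the interquartile range to a variance scale uses the normal-distribution factor 1.349, which is our assumption about how RobVar is standardized, not a detail stated in the paper.

```python
import numpy as np

def mse_times_n(estimates, theta_true, N):
    """MSE of each parameter component multiplied by the number of observations.
    estimates has shape (n_replications, d)."""
    return N * np.mean((estimates - np.asarray(theta_true))**2, axis=0)

def robvar_times_n(estimates, N):
    """Robust analogue based on the interquartile range: for a normal
    distribution IQR = 1.349 * sd, so (IQR / 1.349)^2 estimates the variance
    (this standardization is an assumption)."""
    q75, q25 = np.percentile(estimates, [75, 25], axis=0)
    return N * ((q75 - q25) / 1.349)**2
```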

Oleksii Doronin, Rostislav Maiboroda

GEE ESTIMATORS IN MIXTURE MODEL WITH VARYING CONCENTRATIONS
(Polish title: ESTYMATORY GEE W MODELU MIESZANYM ZE ZMIENNYMI WSPÓŁCZYNNIKAMI KONCENTRACJI)

Summary. The paper discusses a semiparametric mixture model in which some components are parameterized by a common Euclidean parameter, while the others are completely unknown. A parameter estimation method based on the GEE (generalized estimating equations) approach, together with an adaptive GEE-based approach, is introduced. The proposed estimators are examined in a simulation study.

Keywords: mixture model, semiparametric estimation, GEE.
