Block designs and statistics
Notes for Math 447, May 3, 2011.

The main parameters of a block design are the number of varieties $v$, the block size $k$, and the number of blocks $b$. A design is built on a set of $v$ elements. Each element is called a variety or treatment. Fix $k \geq 2$. Consider subsets of the set of the varieties with $k$ elements. The design is specified by specifying certain of these subsets. A block is one of the specified subsets. Thus the design is a multiset of $k$-element subsets of the set of varieties. There are $b$ blocks in this multiset. Each variety $i$ occurs in $r_i$ blocks. This number $r_i$ is called the replication of $i$. Each pair of varieties $i, j$ with $i \neq j$ occurs in $\lambda_{ij}$ blocks. The idea of a balanced design is to attempt to take $\lambda_{ij}$ to be constant. The present discussion explores optimality for parameter values when there is no balanced design.

In summary, the other important parameters for a block design are the number of blocks to which variety $i$ belongs, $r_i$, and the number of blocks to which varieties $i \neq j$ belong together, $\lambda_{ij}$. A variety $i$ together with a block $k$ to which it belongs is called an experimental unit. One wants to compare varieties within a block, that is, to consider each block in turn and examine what happens with the corresponding units.

The block design is given by a $v$ by $b$ incidence matrix $A$. Here $A_{ik}$ is $1$ if variety $i$ belongs to block $k$; otherwise $A_{ik} = 0$. So the $1$ entries correspond to the experimental units. The number of varieties in each block

$$\sum_i A_{ik} = k \qquad (1)$$

is constant. The number of replications of a variety is

$$\sum_{k=1}^b A_{ik} = r_i. \qquad (2)$$
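As a concrete check, here is a minimal sketch in Python (assuming NumPy is available; the variable names are illustrative, not from the notes). It builds the incidence matrix of the $v = 7$, $k = 3$ cyclic design that appears later in these notes and verifies (1) and (2):

```python
import numpy as np

# Cyclic design on Z/7: starter block {0, 1, 3}, shifted to give b = 7 blocks.
v, b, k = 7, 7, 3
starter = (0, 1, 3)
A = np.zeros((v, b))
for shift in range(b):          # the block index runs over the cyclic shifts
    for e in starter:
        A[(e + shift) % v, shift] = 1

col_sums = A.sum(axis=0)        # eq. (1): each block contains k varieties
r = A.sum(axis=1)               # eq. (2): the replications r_i
print(col_sums)                 # every entry is k = 3
print(r)                        # here every r_i = 3
```

For this design every variety appears in exactly $3$ of the $7$ blocks, so all replications coincide.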
The first important identity is obtained by counting the units. It says that

$$\sum_i r_i = bk. \qquad (3)$$

Form the symmetric $v$ by $v$ concurrence matrix $\Lambda = AA^T$. Thus

$$\Lambda_{ij} = \sum_{k=1}^b A_{ik} A_{jk}. \qquad (4)$$

For $i \neq j$ this is the number of times $\lambda_{ij}$ that the pair $i, j$ occurs in a block. We have

$$\Lambda_{ii} = \sum_{k=1}^b A_{ik}^2 = \sum_{k=1}^b A_{ik} = r_i. \qquad (5)$$

Also

$$\sum_{i=1}^v \Lambda_{ij} = k r_j \qquad (6)$$

and

$$\sum_{j=1}^v \Lambda_{ij} = k r_i. \qquad (7)$$

This leads to the second important identity

$$\sum_{j \neq i} \lambda_{ij} = (k-1) r_i. \qquad (8)$$

Define the concurrence multigraph to be the multigraph with $v$ vertices and with $\lambda_{ij}$ edges between each $i \neq j$. The degree of vertex $i$ is the number of edges attached to vertex $i$. Thus it is given by

$$d(i) = \sum_{j \neq i} \lambda_{ij} = (k-1) r_i. \qquad (9)$$

For each multigraph there is a corresponding symmetric Laplacian matrix $L$. Its diagonal elements are $L_{ii} = d(i)$. Its off-diagonal elements are $L_{ij} = -\lambda_{ij}$. Thus we have

$$\sum_{i=1}^v L_{ij} = 0 \qquad (10)$$

and

$$\sum_{j=1}^v L_{ij} = 0. \qquad (11)$$

Furthermore, the trace of $L$ is

$$\operatorname{tr} L = (k-1) \sum_i r_i = (k-1)kb. \qquad (12)$$
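Continuing the sketch (Python, assuming NumPy; same illustrative cyclic design), the concurrence matrix and the Laplacian can be formed directly from $A$, and identities (3), (5), and (9) through (12) checked numerically:

```python
import numpy as np

v, b, k = 7, 7, 3
A = np.zeros((v, b))
for shift in range(b):
    for e in (0, 1, 3):
        A[(e + shift) % v, shift] = 1

r = A.sum(axis=1)
assert r.sum() == b * k                    # eq. (3): counting the units

Lam = A @ A.T                              # concurrence matrix, eq. (4)
assert np.allclose(np.diag(Lam), r)        # eq. (5)

lam_off = Lam - np.diag(np.diag(Lam))      # lambda_ij for i != j
d = lam_off.sum(axis=1)                    # degrees in the concurrence multigraph
assert np.allclose(d, (k - 1) * r)         # eq. (9)

L = np.diag(d) - lam_off                   # Laplacian of the concurrence multigraph
assert np.allclose(L.sum(axis=0), 0)       # eq. (10)
assert np.allclose(L.sum(axis=1), 0)       # eq. (11)
assert np.isclose(np.trace(L), (k - 1) * k * b)   # eq. (12)
print(np.trace(L))                         # (k-1)kb = 42 here
```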
Sometimes it is convenient to write $L = kR - \Lambda$, where $R$ is the diagonal matrix with diagonal entries $r_i$ and where $\Lambda$ is the concurrence matrix.

It is convenient to introduce another matrix that is a multiple of the Laplacian matrix. This is the information matrix $C$ defined by $C = \frac{1}{k} L$. We have that

$$\sum_{i=1}^v C_{ij} = 0 \qquad (13)$$

and

$$\sum_{j=1}^v C_{ij} = 0. \qquad (14)$$

Furthermore,

$$\operatorname{tr} C = (k-1)b. \qquad (15)$$

In the following we will use the normalized trace

$$\bar{C} = \frac{1}{v-1} \operatorname{tr} C = \frac{(k-1)b}{v-1}. \qquad (16)$$

The information matrix $C$ may be written directly in terms of the concurrence matrix $\Lambda$ by the formula

$$C = R - \frac{1}{k} \Lambda. \qquad (17)$$

Here the entry $\Lambda_{ii} = r_i$ is the number of times that the variety $i$ occurs in a block, while for $i \neq j$ the entry $\Lambda_{ij} = \lambda_{ij}$ is the number of times that varieties $i, j$ occur in the same block. It follows that the entry $C_{ii} = (k-1)r_i/k$, and for $i \neq j$ the entry $C_{ij} = -\lambda_{ij}/k$.

The matrices $L$ and $C = \frac{1}{k} L = R - \frac{1}{k}\Lambda$ are both symmetric matrices whose row and column sums are zero. This implies there is a constant eigenvector with eigenvalue $0$. In the following we suppose that the concurrence multigraph is connected, so that this eigenvector has multiplicity one. The matrix $P$ with constant entries $1/v$ is the orthogonal projection onto this space of constant vectors. It follows that $I - P$ is the orthogonal projection onto the space of vectors whose entries sum to zero.

Let $C^-$ be the symmetric matrix with row and column sums zero and such that

$$C^- C = C C^- = I - P. \qquad (18)$$

Thus $C^-$ is the inverse of $C$ on the $v-1$ dimensional space of vectors whose entries sum to zero, which is the range of $I - P$. We shall see that it is reasonable to call $C^-$ the covariance matrix.

The model is that for each unit with variety $i$ in block $k$ there is an associated number

$$\mu_{ik} = \tau_i + \beta_k. \qquad (19)$$

These parameters are the treatment effect $\tau_i$ and the block effect $\beta_k$. They are unknown. The goal of the experiment is to use experimental observations to
estimate quantities associated with the treatment effect. Let $x$ be a contrast vector with $x_1 + x_2 + \cdots + x_v = 0$. The quantity to estimate is the contrast $x^T \tau = x_1 \tau_1 + x_2 \tau_2 + \cdots + x_v \tau_v$. The block effect is of no interest; the only reason for introducing the blocks is to eliminate systematic error in distinguishing the effect of different treatments.

Assume that for each unit with variety $i$ in block $k$ there is a random variable $Y_{ik}$ with mean $\mu_{ik} = \tau_i + \beta_k$ and variance $\sigma^2$. These random variables are independent. These are the experimental observations. Each time an experiment is done to measure values of the random variables the results obtained are different. The goal is to estimate the contrast $x^T \tau$ by some function of the random variables $Y_{ik}$ associated with the units.

A naive approach would be to estimate the effect of the $i$th treatment by the sum $\sum_{k : i \in k} Y_{ik}$ divided by $r_i$. This is the sum over blocks containing $i$ of the observations, divided by the number of such blocks. The mean of this is $\tau_i + (1/r_i) \sum_{k : i \in k} \beta_k$. If the block effects are big, then this gives a terrible idea of the parameter $\tau_i$. The other possibility is that the block effects are small, but then one would not even bother with a block design.

Here is a better way to approach the problem. For each variety $i$ define the deviation sum

$$D_i = \sum_{k : i \in k} \left( Y_{ik} - \frac{1}{k} \sum_{j : j \in k} Y_{jk} \right) = \sum_k A_{ik} Y_{ik} - \frac{1}{k} \sum_k \sum_j A_{ik} A_{jk} Y_{jk}. \qquad (20)$$

This is the sum over blocks containing $i$ of the deviation of the observation in the block for variety $i$ from the block sample mean. The goal is to subtract out the block effects, but this attempt involves other varieties. The result is that the mean of $D_i$ is

$$\sum_k A_{ik} (\tau_i + \beta_k) - \frac{1}{k} \sum_k \sum_j A_{ik} A_{jk} (\tau_j + \beta_k) = \sum_k A_{ik} \tau_i - \frac{1}{k} \sum_k \sum_j A_{ik} A_{jk} \tau_j. \qquad (21)$$

It is independent of the block parameters! The price is that it involves all the treatment parameters, but this difficulty will be overcome. The mean of $D_i$ simplifies to

$$r_i \tau_i - \sum_j \frac{1}{k} \Lambda_{ij} \tau_j = \sum_j C_{ij} \tau_j. \qquad (22)$$

In other words, the random vector $D$ has mean $C\tau$.
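The cancellation of the block effects in (21) and (22) can be seen numerically. In this sketch (Python, assuming NumPy; the treatment and block effects are arbitrary made-up values), substituting the means $\mu_{ik} = \tau_i + \beta_k$ into the deviation sums (20) gives exactly $C\tau$, whatever $\beta$ is; the block also checks the defining property (18) of $C^-$ via the pseudoinverse:

```python
import numpy as np

v, b, k = 7, 7, 3
A = np.zeros((v, b))
for shift in range(b):
    for e in (0, 1, 3):
        A[(e + shift) % v, shift] = 1

C = np.diag(A.sum(axis=1)) - (A @ A.T) / k   # information matrix, eq. (17)
P = np.full((v, v), 1.0 / v)
Cminus = np.linalg.pinv(C)                   # for a connected design this satisfies (18)
assert np.allclose(Cminus @ C, np.eye(v) - P)

tau = np.array([1.0, -2.0, 0.5, 3.0, 0.0, -1.5, 2.0])     # arbitrary treatment effects
beta = np.array([10.0, -7.0, 4.0, 0.0, 25.0, -3.0, 8.0])  # arbitrary (large) block effects
mu = np.outer(tau, np.ones(b)) + np.outer(np.ones(v), beta)  # mu_ik = tau_i + beta_k

block_mean = (A * mu).sum(axis=0) / k        # sample mean of each block
D_mean = (A * mu).sum(axis=1) - A @ block_mean   # mean of the deviation sums, eq. (20)
assert np.allclose(D_mean, C @ tau)          # eq. (22): the beta's have dropped out
```

Changing `beta` to any other vector leaves `D_mean` unchanged, which is the point of the deviation sums.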
It follows that the mean of the random variable $x^T C^- D$ is the contrast

$$x^T C^- C \tau = x^T \tau. \qquad (23)$$

It is also not difficult to compute the covariance of $D_i$ with $D_j$. A routine calculation gives the result to be

$$\left( \sum_k A_{ik} \delta_{ij} - \frac{1}{k} \sum_k A_{ik} A_{jk} \right) \sigma^2 = \left( r_i \delta_{ij} - \frac{1}{k} \Lambda_{ij} \right) \sigma^2 = C_{ij} \sigma^2. \qquad (24)$$
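Formula (24) says that the covariance matrix of $D$ is $\sigma^2 C$. Writing $D = MY$, where $Y$ is the vector of independent unit observations, gives $\operatorname{Cov}(D) = \sigma^2 M M^T$, so (24) amounts to $M M^T = C$. A sketch check (Python, assuming NumPy; `units` and `M` are illustrative names):

```python
import numpy as np

v, b, k = 7, 7, 3
A = np.zeros((v, b))
for shift in range(b):
    for e in (0, 1, 3):
        A[(e + shift) % v, shift] = 1
C = np.diag(A.sum(axis=1)) - (A @ A.T) / k   # information matrix, eq. (17)

# One column of M per experimental unit (j, kk); by eq. (20) the coefficient
# of Y_{jk} in D_i is A_{ik} (delta_{ij} - 1/k).
units = [(j, kk) for kk in range(b) for j in range(v) if A[j, kk] == 1]
M = np.zeros((v, len(units)))
for u, (j, kk) in enumerate(units):
    for i in range(v):
        M[i, u] = A[i, kk] * ((i == j) - 1 / k)

# Independent observations of variance sigma^2 give Cov(D) = sigma^2 M M^T.
assert np.allclose(M @ M.T, C)               # eq. (24)
```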
Here $\delta_{ij} = 1$ for $i = j$ and $\delta_{ij} = 0$ for $i \neq j$. This is the Kronecker delta symbol for the identity matrix. It follows that for a contrast vector $z$ the variance of $z^T D = \sum_i z_i D_i$ is

$$(z^T C z) \sigma^2 = \sum_i \sum_j z_i C_{ij} z_j \, \sigma^2. \qquad (25)$$

Take the contrast vector $z$ with $Cz = x$, that is, $z = C^- x$. The final result is that the estimator $x^T C^- D$ has mean $x^T \tau$ and variance

$$(x^T C^- x) \sigma^2 = \sum_i \sum_j x_i C^-_{ij} x_j \, \sigma^2. \qquad (26)$$

The striking thing is that the result for the variance is so simple.

In conclusion, the model for the experiment to compare treatments has three parameters: the treatment effect vector $\tau$, the block effect vector $\beta$, and the variance $\sigma^2$ of an individual observation. The deviation vector $D$ is built from the individual observations. The covariance matrix $C^-$ is computed from $C = R - \frac{1}{k}\Lambda$. If $x$ is a contrast vector, then $x^T C^- D$ estimates the contrast $x^T \tau$, and $x^T C^- D$ has variance $(x^T C^- x)\sigma^2$. Small variance is good; it means that the quantity $x^T C^- D$ computed from the observations is likely to be close to the contrast $x^T \tau$ that one is trying to estimate.

Fix two different varieties $i, j$ to be compared. The corresponding pairwise contrast vector is the vector $x$ with $x_i = 1$ and $x_j = -1$ and all other components zero. Then $x^T \tau = \tau_i - \tau_j$ is the contrast between the mean responses from the two corresponding treatments. The variance for the corresponding estimator $x^T C^- D = \sum_p C^-_{ip} D_p - \sum_q C^-_{jq} D_q$ is

$$V_{ij} = \left( C^-_{ii} + C^-_{jj} - 2 C^-_{ij} \right) \sigma^2. \qquad (27)$$

We can compare each variety $i$ to each of the other $j$. The total variance is

$$\sum_{j \neq i} V_{ij} = \left( (v-1) C^-_{ii} + (\operatorname{tr} C^- - C^-_{ii}) + 2 C^-_{ii} \right) \sigma^2 = (\operatorname{tr} C^- + v C^-_{ii}) \sigma^2. \qquad (28)$$

The average value of these pairwise variances is

$$\bar{V} = \frac{1}{v(v-1)} \sum_i \sum_{j \neq i} V_{ij} = \frac{2 \sigma^2 \operatorname{tr} C^-}{v-1}. \qquad (29)$$
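The averaging step from (27) to (29) is easy to confirm numerically. This sketch (Python, assuming NumPy) compares the direct average of the pairwise variances $V_{ij}$ against $2\sigma^2 \operatorname{tr} C^- / (v-1)$:

```python
import numpy as np

v, b, k, sigma2 = 7, 7, 3, 1.0
A = np.zeros((v, b))
for shift in range(b):
    for e in (0, 1, 3):
        A[(e + shift) % v, shift] = 1
C = np.diag(A.sum(axis=1)) - (A @ A.T) / k
Cm = np.linalg.pinv(C)                       # covariance matrix C^-

# Eq. (27): variance of the pairwise estimator, for every ordered pair i != j.
V = [(Cm[i, i] + Cm[j, j] - 2 * Cm[i, j]) * sigma2
     for i in range(v) for j in range(v) if i != j]
avg = sum(V) / (v * (v - 1))

assert np.isclose(avg, 2 * sigma2 * np.trace(Cm) / (v - 1))   # eq. (29)
print(avg)                                   # 6/7 for this balanced design
```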
A design is good if this average pairwise variance is small.

In summary, to get a good design, compute the Laplacian matrix $L$ and the corresponding information matrix $C = \frac{1}{k} L = R - \frac{1}{k}\Lambda$. Then compute the covariance matrix $C^-$. One way to do this is to use the formula

$$C^- = (C + P)^{-1} - P. \qquad (30)$$

Finally, compute the normalized trace

$$\bar{C}^- = \frac{1}{v-1} \operatorname{tr} C^-. \qquad (31)$$

Try to choose the design so that this is small.

What is small? Here it is useful to use the inequality of the harmonic and arithmetic mean, applied to the covariance matrix $C^-$. It says that

$$\frac{1}{\bar{C}} \leq \bar{C}^-. \qquad (32)$$

In other words,

$$\frac{v-1}{(k-1)b} \leq \bar{C}^-. \qquad (33)$$

Furthermore, the strong form of the inequality says that there is equality only when $C$ is a constant multiple of $I - P$. In other words,

$$C = \frac{(k-1)b}{v-1} (I - P). \qquad (34)$$

This is the balanced case. The matrix $I - P$ has diagonal entries $1 - 1/v = (v-1)/v$ and off-diagonal entries $-1/v$. So $C$ has diagonal entries $(k-1)b/v$ and off-diagonal entries $-(b/v)(k-1)/(v-1)$. Thus $L = kC$ has diagonal entries $(kb/v)(k-1)$ and off-diagonal entries $-(kb/v)(k-1)/(v-1)$. Each replication is $r = kb/v$. In the concurrence multigraph each vertex is of degree $(k-1)r$ and the number of edges between two vertices is $\lambda = r(k-1)/(v-1)$.

From this we can compute the covariance matrix in the balanced case:

$$C^- = \frac{v-1}{(k-1)b} (I - P). \qquad (35)$$

The optimum value of the averaged pairwise variance is

$$\bar{V} = 2 \sigma^2 \frac{v-1}{(k-1)b}. \qquad (36)$$

It is difficult to estimate when you have lots of varieties. However, it is good to have big blocks, and lots of them.

In general one wants to take a design that is optimal in the following sense. Suppose the design has the property that a pair $i, j$ with $i \neq j$ occurs in $\lambda_{ij}$ blocks. Define the Laplacian matrix to be the symmetric matrix with off-diagonal entries $-\lambda_{ij}$ for $i \neq j$ and with row and column sums equal to zero. Let
$C = \frac{1}{k} L$ be the corresponding information matrix. Let $C^-$ be the covariance matrix that inverts the information matrix $C$ on the subspace of vectors that sum to zero. For this design the average pairwise variance is

$$\bar{V} = \frac{2 \sigma^2 \operatorname{tr} C^-}{v-1}. \qquad (37)$$

Choose the design that minimizes this quantity.

Here is an example to illustrate these ideas. Take the cyclic block design with $v = 7$ varieties. Use addition modulo $7$. First take a starter block with elements $0, 1, 3$. Keep adding $1, 1, 1$ to form $b = 7$ blocks each of size $k = 3$. This gives a balanced design. The matrix $L$ has diagonal entries $6$ and off-diagonal entries $-1$. It is easy to compute the information matrix $C = \frac{1}{3} L = \frac{7}{3}(I - P)$ and then the covariance matrix $C^- = \frac{3}{7}(I - P)$. In this case $I - P$ has diagonal elements $6/7$ and off-diagonal elements $-1/7$; in particular its normalized trace is $1$. The normalized trace of the covariance matrix $C^-$ is thus $3/7 \approx 0.4286$. This is the optimum design.

Instead take a starter block with elements $0, 1, 4$. Keep adding $1, 1, 1$ to form $b = 7$ blocks each of size $k = 3$. This is not a balanced design. In fact, its Laplacian matrix is the circulant matrix

$$L = \begin{pmatrix} 6 & -1 & 0 & -2 & -2 & 0 & -1 \\ -1 & 6 & -1 & 0 & -2 & -2 & 0 \\ 0 & -1 & 6 & -1 & 0 & -2 & -2 \\ -2 & 0 & -1 & 6 & -1 & 0 & -2 \\ -2 & -2 & 0 & -1 & 6 & -1 & 0 \\ 0 & -2 & -2 & 0 & -1 & 6 & -1 \\ -1 & 0 & -2 & -2 & 0 & -1 & 6 \end{pmatrix}. \qquad (38)$$

Again define the information matrix $C = \frac{1}{3} L$. With a computer it is not hard to find that the covariance matrix $C^-$ is the circulant matrix

$$C^- = \frac{1}{287} \begin{pmatrix} 120 & -21 & -45 & 6 & 6 & -45 & -21 \\ -21 & 120 & -21 & -45 & 6 & 6 & -45 \\ -45 & -21 & 120 & -21 & -45 & 6 & 6 \\ 6 & -45 & -21 & 120 & -21 & -45 & 6 \\ 6 & 6 & -45 & -21 & 120 & -21 & -45 \\ -45 & 6 & 6 & -45 & -21 & 120 & -21 \\ -21 & -45 & 6 & 6 & -45 & -21 & 120 \end{pmatrix}. \qquad (39)$$

The normalized trace of the covariance matrix is then the sum of the diagonal entries, which is $7$ times $120/287$, divided by $6$. This gives $20/41 \approx 0.4878$. This is indeed larger than the number $3/7 \approx 0.4286$ for the normalized trace from the optimal design.

In the above example the correct procedure is to use the balanced block design, since it has the smallest average pairwise variance of any design with the same values of $v, b, k$. However, for most values of $v, b, k$ there will not be a balanced design. This is because if there were a balanced design with $\lambda_{ij} = \lambda$,
then we would have $(v-1)\lambda = (k-1) r_i$. In particular, it would follow that $r_i = r$ is independent of $i$. Also, we would have $vr = bk$. However, there is no particular reason to hope that $r$ and $\lambda$ will be integers. So in general one must somehow try out various block designs that are not balanced, until one finds one with the smallest average pairwise variance.

The treatment above is a simplification (perhaps an oversimplification) of the discussion in a long paper by R. A. Bailey and Peter J. Cameron called "Combinatorics of optimal designs." Bailey and Cameron use a more general framework where the entries in the incidence matrix are natural numbers. This means that the blocks are multisets rather than sets. For simplicity this is not done in the present account. Also, these authors explain that there may be other quantities that one may try to minimize. Here the emphasis has been to choose the design to minimize the average variance of the pairwise contrasts, that is, to minimize the average pairwise variance $2\sigma^2 \bar{C}^-$. One alternative, for instance, would be to choose the design to minimize the largest possible value of $\sigma^2 x^T C^- x$ over all contrast vectors $x$ with normalization $x^T x = 1$. This would be the approach of a pessimist who worries about the worst-case contrast vector that one could encounter in practice. In the situation when there is a balanced design it is still the best design, but in other cases this could give a different optimal design.
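Both criteria can be compared on the two cyclic designs of the example. In this sketch (Python, assuming NumPy; `design_summary` is an illustrative helper, not from the notes), the covariance matrix is obtained from formula (30), and the pessimist's criterion is the largest eigenvalue of $C^-$, since the maximum of $x^T C^- x$ over unit contrast vectors is the top eigenvalue (the constant eigenvector has eigenvalue $0$):

```python
import numpy as np

def design_summary(starter, v=7, k=3):
    """Normalized trace of C^- and the worst-case contrast variance factor
    (top eigenvalue of C^-) for the cyclic design generated by the starter block."""
    b = v
    A = np.zeros((v, b))
    for shift in range(b):                 # one block per cyclic shift of the starter
        for e in starter:
            A[(e + shift) % v, shift] = 1
    C = np.diag(A.sum(axis=1)) - (A @ A.T) / k   # information matrix, eq. (17)
    P = np.full((v, v), 1.0 / v)
    Cm = np.linalg.inv(C + P) - P          # covariance matrix via eq. (30)
    assert np.allclose(Cm, np.linalg.pinv(C))    # sanity check against the pseudoinverse
    return np.trace(Cm) / (v - 1), np.linalg.eigvalsh(Cm).max()

nt_a, worst_a = design_summary((0, 1, 3))  # balanced design
nt_b, worst_b = design_summary((0, 1, 4))  # unbalanced design
print(nt_a, worst_a)   # both 3/7: here C^- is a multiple of I - P
print(nt_b, worst_b)   # 20/41 and a larger worst case
```

On this example the balanced design wins under both criteria, consistent with the remark above that a balanced design, when it exists, is still the best.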
Proc. of the IEEE/OES Seventh Working Conference on Current Measureent Technology UNCERTAINTIES IN SEASONDE CURRENT VELOCITIES Belinda Lipa Codar Ocean Sensors 15 La Sandra Way, Portola Valley, CA 98 blipa@pogo.co
More informationChaotic Coupled Map Lattices
Chaotic Coupled Map Lattices Author: Dustin Keys Advisors: Dr. Robert Indik, Dr. Kevin Lin 1 Introduction When a syste of chaotic aps is coupled in a way that allows the to share inforation about each
More informationSimple procedures for finding mean first passage times in Markov chains
Res. Lett. Inf. Math. Sci., 2005, Vol. 8, pp 209-226 209 Availale online at http://iis.assey.ac.nz/research/letters/ Siple procedures for finding ean first passage ties in Markov chains JEFFREY J. HUNER
More informationNecessity of low effective dimension
Necessity of low effective diension Art B. Owen Stanford University October 2002, Orig: July 2002 Abstract Practitioners have long noticed that quasi-monte Carlo ethods work very well on functions that
More informationDistributed Subgradient Methods for Multi-agent Optimization
1 Distributed Subgradient Methods for Multi-agent Optiization Angelia Nedić and Asuan Ozdaglar October 29, 2007 Abstract We study a distributed coputation odel for optiizing a su of convex objective functions
More informationA Note on the Applied Use of MDL Approximations
A Note on the Applied Use of MDL Approxiations Daniel J. Navarro Departent of Psychology Ohio State University Abstract An applied proble is discussed in which two nested psychological odels of retention
More informationPattern Recognition and Machine Learning. Artificial Neural networks
Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2016/2017 Lessons 9 11 Jan 2017 Outline Artificial Neural networks Notation...2 Convolutional Neural Networks...3
More informationThis model assumes that the probability of a gap has size i is proportional to 1/i. i.e., i log m e. j=1. E[gap size] = i P r(i) = N f t.
CS 493: Algoriths for Massive Data Sets Feb 2, 2002 Local Models, Bloo Filter Scribe: Qin Lv Local Models In global odels, every inverted file entry is copressed with the sae odel. This work wells when
More informationThe Simplex Method is Strongly Polynomial for the Markov Decision Problem with a Fixed Discount Rate
The Siplex Method is Strongly Polynoial for the Markov Decision Proble with a Fixed Discount Rate Yinyu Ye April 20, 2010 Abstract In this note we prove that the classic siplex ethod with the ost-negativereduced-cost
More informationLower Bounds for Quantized Matrix Completion
Lower Bounds for Quantized Matrix Copletion Mary Wootters and Yaniv Plan Departent of Matheatics University of Michigan Ann Arbor, MI Eail: wootters, yplan}@uich.edu Mark A. Davenport School of Elec. &
More informationarxiv: v1 [cs.ds] 17 Mar 2016
Tight Bounds for Single-Pass Streaing Coplexity of the Set Cover Proble Sepehr Assadi Sanjeev Khanna Yang Li Abstract arxiv:1603.05715v1 [cs.ds] 17 Mar 2016 We resolve the space coplexity of single-pass
More informationOptimal Jamming Over Additive Noise: Vector Source-Channel Case
Fifty-first Annual Allerton Conference Allerton House, UIUC, Illinois, USA October 2-3, 2013 Optial Jaing Over Additive Noise: Vector Source-Channel Case Erah Akyol and Kenneth Rose Abstract This paper
More informationThe Fundamental Basis Theorem of Geometry from an algebraic point of view
Journal of Physics: Conference Series PAPER OPEN ACCESS The Fundaental Basis Theore of Geoetry fro an algebraic point of view To cite this article: U Bekbaev 2017 J Phys: Conf Ser 819 012013 View the article
More informationMachine Learning Basics: Estimators, Bias and Variance
Machine Learning Basics: Estiators, Bias and Variance Sargur N. srihari@cedar.buffalo.edu This is part of lecture slides on Deep Learning: http://www.cedar.buffalo.edu/~srihari/cse676 1 Topics in Basics
More informationKernel Methods and Support Vector Machines
Intelligent Systes: Reasoning and Recognition Jaes L. Crowley ENSIAG 2 / osig 1 Second Seester 2012/2013 Lesson 20 2 ay 2013 Kernel ethods and Support Vector achines Contents Kernel Functions...2 Quadratic
More informationOn the Use of A Priori Information for Sparse Signal Approximations
ITS TECHNICAL REPORT NO. 3/4 On the Use of A Priori Inforation for Sparse Signal Approxiations Oscar Divorra Escoda, Lorenzo Granai and Pierre Vandergheynst Signal Processing Institute ITS) Ecole Polytechnique
More informationBayes Decision Rule and Naïve Bayes Classifier
Bayes Decision Rule and Naïve Bayes Classifier Le Song Machine Learning I CSE 6740, Fall 2013 Gaussian Mixture odel A density odel p(x) ay be ulti-odal: odel it as a ixture of uni-odal distributions (e.g.
More informationBipartite subgraphs and the smallest eigenvalue
Bipartite subgraphs and the sallest eigenvalue Noga Alon Benny Sudaov Abstract Two results dealing with the relation between the sallest eigenvalue of a graph and its bipartite subgraphs are obtained.
More informationCS Lecture 13. More Maximum Likelihood
CS 6347 Lecture 13 More Maxiu Likelihood Recap Last tie: Introduction to axiu likelihood estiation MLE for Bayesian networks Optial CPTs correspond to epirical counts Today: MLE for CRFs 2 Maxiu Likelihood
More informationThe proofs of Theorem 1-3 are along the lines of Wied and Galeano (2013).
A Appendix: Proofs The proofs of Theore 1-3 are along the lines of Wied and Galeano (2013) Proof of Theore 1 Let D[d 1, d 2 ] be the space of càdlàg functions on the interval [d 1, d 2 ] equipped with
More informationIntroduction to Robotics (CS223A) (Winter 2006/2007) Homework #5 solutions
Introduction to Robotics (CS3A) Handout (Winter 6/7) Hoework #5 solutions. (a) Derive a forula that transfors an inertia tensor given in soe frae {C} into a new frae {A}. The frae {A} can differ fro frae
More informationThis article appeared in a journal published by Elsevier. The attached copy is furnished to the author for internal non-commercial research and
This article appeared in a ournal published by Elsevier. The attached copy is furnished to the author for internal non-coercial research and education use, including for instruction at the authors institution
More informationOcean 420 Physical Processes in the Ocean Project 1: Hydrostatic Balance, Advection and Diffusion Answers
Ocean 40 Physical Processes in the Ocean Project 1: Hydrostatic Balance, Advection and Diffusion Answers 1. Hydrostatic Balance a) Set all of the levels on one of the coluns to the lowest possible density.
More informationNBN Algorithm Introduction Computational Fundamentals. Bogdan M. Wilamoswki Auburn University. Hao Yu Auburn University
NBN Algorith Bogdan M. Wilaoswki Auburn University Hao Yu Auburn University Nicholas Cotton Auburn University. Introduction. -. Coputational Fundaentals - Definition of Basic Concepts in Neural Network
More informationEstimation of the Mean of the Exponential Distribution Using Maximum Ranked Set Sampling with Unequal Samples
Open Journal of Statistics, 4, 4, 64-649 Published Online Septeber 4 in SciRes http//wwwscirporg/ournal/os http//ddoiorg/436/os4486 Estiation of the Mean of the Eponential Distribution Using Maiu Ranked
More information1 Bounding the Margin
COS 511: Theoretical Machine Learning Lecturer: Rob Schapire Lecture #12 Scribe: Jian Min Si March 14, 2013 1 Bounding the Margin We are continuing the proof of a bound on the generalization error of AdaBoost
More informationASSUME a source over an alphabet size m, from which a sequence of n independent samples are drawn. The classical
IEEE TRANSACTIONS ON INFORMATION THEORY Large Alphabet Source Coding using Independent Coponent Analysis Aichai Painsky, Meber, IEEE, Saharon Rosset and Meir Feder, Fellow, IEEE arxiv:67.7v [cs.it] Jul
More information