Discussion on Regularized Regression for Categorical Data (Tutz and Gertheiss)


Peter Bühlmann, Ruben Dezeure
Seminar for Statistics, Department of Mathematics, ETH Zürich, Switzerland

Address for correspondence: Peter Bühlmann, Seminar for Statistics, Department of Mathematics, HG G 17, Rämistrasse 101, CH-8092 Zürich, Switzerland. buhlmann@stat.math.ethz.ch. Phone: (+41) Fax: (+41)

Key words: Categorical data; Combining different penalization types; Confidence intervals; High-dimensional generalized regression; Ordinal categorical predictors; P-values; Uncertainty quantification

We congratulate Gerhard Tutz and Jan Gertheiss, referred to in the sequel as TG, for an interesting and inspiring paper on the important topic of regression for categorical data. Much has happened over the last 15 years, including earlier contributions from the authors: the current overview and expository paper is a most welcome addition to the literature.

1 Some further thoughts on the paper

TG have written a masterpiece on modern regression with categorical covariables. Special attention is given to the case with ordinal categorical predictors: TG explain a variety of possibilities to build in additional information in terms of sparsity, smoothness and clusteredness. From a fundamental perspective, the case with ordinal categories is complex and difficult. Using a vague analogy to nonparametric curve estimation, the issue here is a kind of varying regularization problem, namely adapting well to the rough and flat parts of an underlying true function (besides the role of selection). Clever regularization or penalization is certainly a powerful vehicle for obtaining accurate point estimates of the underlying generalized regression coefficients, or for model parameters beyond regression, as discussed also in TG in Section 5 and the sections thereafter.

Different penalization types for different predictors

A natural question in practice will be the choice of the penalization type, which might differ between covariables, and the question which of the categorical variables should be subject to a penalty term at all. Regarding the second question, a sparsity-type penalization should automatically select the variables to be strongly clustered, strongly smoothed or thresholded to zero, and thus such sparsity-inducing schemes often work well for the purpose of selection (on the level of individual or groups of parameters). Cross-validation is a natural candidate for addressing the first question (aside from its use to choose the amount of regularization), but it might become computationally very cumbersome to run it a large multitude of times for evaluating many combinations. Here, a combination means that some subset of the ordinal covariables is subject to a clustering penalty, another subset of ordinal covariables to a smoothing penalty, and another subset to a sparsity-only penalty (see the corresponding sections in TG). In Section 2 we will illustrate an exploratory way, based on inspecting confidence intervals, from which one might extract some reasonable information about a combination of the qualitatively different penalty terms acting on different covariables.
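To make the combinatorial burden of such a search concrete, the following back-of-the-envelope sketch in R counts the model fits needed for an exhaustive cross-validation over penalty-type assignments; all numbers (covariables, folds, grid size) are purely illustrative assumptions, not values from TG.

## Rough count of model fits for an exhaustive CV search over penalty-type
## assignments: each ordinal covariable can receive one of three penalty types
## (clustering, smoothing, sparsity only).
q        <- c(5, 10, 15, 20)   # number of ordinal covariables (illustrative)
n.types  <- 3                  # candidate penalty types per covariable
n.folds  <- 10                 # cross-validation folds
n.lambda <- 50                 # regularization grid per combination

n.comb <- n.types^q                    # penalty-type assignments to evaluate
n.fits <- n.comb * n.folds * n.lambda  # model fits for exhaustive CV
data.frame(q = q, combinations = n.comb, cv.fits = n.fits)

Already for 20 ordinal covariables the number of assignments exceeds 3^20, roughly 3.5 billion, which is why a cheaper exploratory route is attractive.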

Tree methods and Random Forests

Section 7.3 in TG mentions tree-based approaches. They have the advantage that they work in a natural way for mixed variables, where some of the covariables are continuous and some are categorical with nominal or ordinal categories. Approaches based on penalization become intrinsically more complicated with mixed data, requiring a careful choice of various penalty terms and their corresponding regularization parameters (see also the remarks above regarding the difficulty of choosing different penalization types). If the task were prediction only, Random Forests (Breiman, 2001) is often a very powerful method which (essentially) does not require choosing any tuning parameters: reasons for its success in general include that the algorithm takes interactions between variables into account and that it can deal in a scale-free manner with mixed variables. We wonder how well it would perform for prediction on the dataset on Spending for Food analyzed in TG. Based on adaptations of Random Forests, we have obtained in other work some interesting results for predictive survival modeling (Hothorn et al., 2006), for predictive imputation of missing data (Stekhoven and Bühlmann, 2012), but also for variable importance in mixed undirected graphical models (Fellinghauer et al., 2013). The disadvantage of the Random Forests technology is that it does not yield statistical point estimation or inferential statements in a well-defined statistical model; and notions of measuring variable importance, as advocated in Breiman (2001) or in its modified version in Strobl et al. (2008), are still very algorithmic, with no statistical guarantees in terms of p-values or confidence intervals.
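For the prediction-only use case just described, a minimal sketch with the randomForest package on a small simulated data frame with mixed variable types might look as follows; the data and all settings are illustrative assumptions, not the Spending for Food data analyzed in TG.

## Minimal sketch: Random Forests for prediction with mixed-type covariables.
## The simulated data frame 'dat' (continuous, nominal and ordinal columns)
## and the response 'y' are illustrative assumptions.
library(randomForest)

set.seed(1)
n <- 200
dat <- data.frame(
  x1 = rnorm(n),                                         # continuous
  x2 = factor(sample(letters[1:4], n, replace = TRUE)),  # nominal
  x3 = ordered(sample(1:5, n, replace = TRUE))           # ordinal
)
y <- 2 * dat$x1 + as.numeric(dat$x3) + rnorm(n)

## Default tuning parameters are often adequate for pure prediction.
rf <- randomForest(x = dat, y = y, importance = TRUE)
print(rf)               # out-of-bag error estimate
head(predict(rf, dat))  # predictions (here: fitted values)
importance(rf)          # algorithmic variable importance, no p-values attached

The out-of-bag error provides an honest prediction error estimate without a separate tuning or validation step, which is part of the appeal of the method for mixed data.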

2 Some outlook: confidence intervals and p-values

We would like to outline here the additional perspective of uncertainty quantification. Recently, some progress has been made in constructing confidence intervals and statistical hypothesis tests in potentially high-dimensional generalized regression models (Bühlmann, 2013; Zhang and Zhang, 2014; van de Geer et al., 2014; Javanmard and Montanari, 2014; Dezeure et al., 2015; van de Geer and Stucky, 2015). The key idea is to de-bias or de-sparsify the sparse Lasso estimator (Tibshirani, 1996). Starting from the Lasso, the de-sparsifying operation leads to a non-sparse but regular estimator whose low-dimensional components have an asymptotic Gaussian distribution. For example, consider a linear model Y = Xβ^0 + ε, with fixed n × p design matrix X and a Gaussian error term ε with i.i.d. components having mean zero and variance σ_ε^2 (where p denotes here the number of parameters). The situation might be very high-dimensional with p ≫ n, but the parameter β^0 is assumed to be sparse. Assuming such sparsity conditions and a restricted eigenvalue assumption on X (cf. Bühlmann and van de Geer, 2011), we have:

√n (β̂ − β^0) = W + Δ,   W ∼ N_p(0, σ_ε^2 Ω),   max_{j=1,...,p} |Δ_j| = o_P(1)   (p ≥ n → ∞),

where Ω is a known covariance matrix depending on the design matrix X (in analogy to ordinary least squares, where Ω would be equal to (X^T X/n)^{-1}). The remarkable features of this result are: (i) the de-sparsified estimator has a 1/√n convergence rate, despite the very high-dimensional setting and unlike for estimating the regression function, where the convergence rate is only of order √(log(p)/n) (cf. Bühlmann and van de Geer, 2011); and (ii) the limiting distribution is known up to the unknown noise variance, which can be estimated also in the high-dimensional setting. Thus, we can construct confidence intervals for single parameters β^0_j, and we can test statistical hypotheses for groups of parameters. Regarding the latter, a group hypothesis is of the form H_{0,G}: β^0_j = 0 for all j ∈ G, where G ⊆ {1,...,p}. If G is very large, we have to rely on the test statistic max_{j∈G} |β̂_j|/s.e.(β̂_j), and if G is small (say |G| ≤ 10), we can also use the sum statistic Σ_{j∈G} |β̂_j|: the corresponding asymptotic limiting distributions can be calculated or, in the case of the max-type statistics, efficiently simulated. In addition, multiple testing correction can be done efficiently since the covariance structure, proportional to Ω, is known. All of this (but not the sum-type statistics) is implemented in the R-package hdi for generalized linear models (Meier et al., 2014; Dezeure et al., 2015).

The methodology outlined above can be used for categorical (nominal) predictor variables, at least in the simple form. In the notation of TG, consider the model formulation (1) for a linear model as in (3): the regression parameters for the jth covariable are β_{jr}, r = 1,...,k_j (using the constraint from TG with β_{j0} = 0), for j = 1,...,p. The R-package hdi then leads to confidence intervals for the underlying true parameters β^0_{jr}, r = 1,...,k_j, j = 1,...,p. We can also test whether the jth categorical covariable has a significant effect on the response Y by considering the null hypothesis H_{0,G_j}: β_{jr} = 0 for all r = 1,...,k_j. Testing this hypothesis is related to the group-wise selection discussed in TG, but now with a statistical uncertainty measure in terms of a p-value instead of a point estimator only.

We illustrate the confidence intervals of the single regression parameters for a simulated dataset. The suggested approach also allows one to informally infer whether the coefficients from ordered categorical covariables are similar for neighboring levels within the same categorical variable, by inspecting the confidence intervals. The results are displayed in Figure 1.
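Before turning to the figure, the following minimal, self-contained sketch indicates this workflow with the R-package hdi; the toy data are an illustrative assumption (a generic sparse linear model, not the model (M) used for Figure 1), and only the lasso.proj function and its confint method of hdi are relied upon.

## Minimal sketch: confidence intervals and p-values from the de-sparsified
## Lasso via the R-package hdi. The toy data are illustrative assumptions.
library(hdi)

set.seed(1)
n <- 100; p <- 200
x <- matrix(rnorm(n * p), n, p)          # generic high-dimensional design
beta0 <- c(2, -2, 1.5, rep(0, p - 3))    # sparse true coefficient vector
y <- as.numeric(x %*% beta0 + rnorm(n))

fit <- lasso.proj(x, y)                  # de-sparsified (de-biased) Lasso

ci <- confint(fit, level = 0.95)         # confidence intervals per parameter
head(ci)
head(fit$pval.corr)                      # p-values adjusted for multiplicity

## For a categorical covariable, x would be a dummy-coded design matrix and
## all of its columns would be assessed jointly (group hypothesis H_{0,G_j}).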

If the confidence intervals heavily overlap for the coefficients from neighboring levels within an ordinal categorical variable, clustering of the corresponding categories would be natural; if the overlap of neighboring confidence intervals is less pronounced, smoothing of the levels seems reasonable; and if the confidence intervals do not overlap at all, there should be no penalty or regularization for the levels within the categorical variable. In particular, with such a visual inspection of confidence intervals, one could determine for which of the ordinal categorical variables some smoothing or clustering of categories would be expected to be beneficial. (Trying all combinations and optimizing a cross-validation score could quickly become computationally infeasible.)

Figure 1: Confidence intervals for the non-zero regression coefficients based on the de-sparsified Lasso for high-dimensional inference with the R-package hdi. The underlying data is a single realization from model (M) described below, with sample size n = 800 and 100 categorical variables with 581 parameters in total. Shown are the confidence intervals for the parameters from the active set S_0 = {1, 3, 15, 31} of the categorical variables. The complement (S_0)^c = {1,...,100} \ S_0 encodes the categorical variables which have no effect on the response. The true parameters were covered by the nominal 95% confidence intervals in 48.1% of the 27 intervals corresponding to the active variables in S_0 (often only slightly missing to cover the true parameter), and in 98.4% of the 554 intervals corresponding to the non-active variables in (S_0)^c. When using a Bonferroni-Holm correction for controlling the familywise error rate in multiple testing, we detect all four categorical variables from S_0 and no false positive finding is made (i.e., no significant variable from (S_0)^c).

We infer from Figure 1 that there are

four significant categorical variables, namely variables 1, 3, 15 and 31; these are indeed the only true variables having an effect. The plot also suggests the following scenarios: variable 1 has two clusters of categorical values (which is true), variable 3 has unstructured categorical values (which is true), variable 15 has two clusters of categorical values (which is true), and there is an indication that variable 31 has a smooth structure for its categorical values (which is true). Thus, the plot in Figure 1 is indeed rather informative about the true underlying structure. More statements can be made: the significance of the categorical variables, when adjusted for controlling the familywise error rate in multiple testing using the Bonferroni-Holm procedure, leads to the four significant variables (as mentioned already) and to no false positive findings; without the multiplicity adjustment, 9 out of the 554 confidence intervals for coefficients whose true values are zero do not cover the true value zero and hence would lead to false positives.

The model underlying the one simulated dataset of sample size n = 800 with 100 categorical variables is constructed as follows (a small simulation sketch in R is given at the end of this section).

(M) Consider X ∼ N_100(0, Σ) with Toeplitz matrix Σ_{ij} = 0.9^{|i−j|}. Generate n i.i.d. samples from N_100(0, Σ). The active set of categorical variables with non-zero effect on the response is S_0 = {1, 3, 15, 31}. We categorize each variable X^{(j)} (j = 1,...,100) according to the empirical quantiles (over the n = 800 realizations) of X^{(j)} with U_j categories: U_1 = 5, U_3 = 6, U_j = 10 (j = 15, 31), and all other U_j drawn uniformly from {2,...,12}. This leads to a dummy-encoded design matrix X with Σ_{j=1}^{100} (U_j − 1) columns. Consider the regression coefficients:
clustered categorical values: β_{1,1} = β_{1,2} = β_{1,3} = 1, β_{1,4} = 2; β_{15,1} = β_{15,2} = ... = β_{15,7} = 1, β_{15,8} = β_{15,9} = 2;
smoothed categorical values: β_{31,1} = 1, β_{31,2} = 1.25, β_{31,3} = 1.5, ..., β_{31,9} = 3 (equidistant steps of 0.25);
unstructured coefficients: β_{3,1} = 1, β_{3,2} = 1, β_{3,3} = 0, β_{3,4} = 2, β_{3,5} = 2.
Finally, take the error term as n = 800 realizations from ε ∼ N(0, 1).

We have not explored how to obtain confidence statements based on regularization for smoothing or clustering the category levels within an ordinal categorical variable (whose importance has been presented in TG for point estimation). Following the same idea as for the de-sparsified estimator outlined above (see also van de Geer et al. (2014)), one might be able to gain statistical power by constructing confidence statements based on a version of a regularized estimator which encourages not only sparsity (as the Lasso) but also smoothing or clustering of ordinal categories. To our knowledge, this has not been pursued so far.
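The following R sketch reproduces the construction of model (M) as described above; implementation details not specified in the text (the random seed, the variable names, and the use of MASS::mvrnorm and model.matrix for the dummy coding) are our own illustrative choices.

## Sketch of the data-generating mechanism (M): Toeplitz-correlated Gaussian
## covariables, categorized at empirical quantiles, then dummy-coded.
library(MASS)

set.seed(1)
n <- 800; p <- 100
Sigma <- toeplitz(0.9^(0:(p - 1)))               # Sigma_ij = 0.9^|i-j|
Z <- mvrnorm(n, mu = rep(0, p), Sigma = Sigma)

U <- sample(2:12, p, replace = TRUE)             # numbers of categories
U[c(1, 3, 15, 31)] <- c(5, 6, 10, 10)            # active variables

## categorize each column at its empirical quantiles
cats <- lapply(seq_len(p), function(j) {
  br <- quantile(Z[, j], probs = seq(0, 1, length.out = U[j] + 1))
  cut(Z[, j], breaks = br, include.lowest = TRUE, labels = FALSE)
})
dat <- as.data.frame(lapply(cats, factor))
names(dat) <- paste0("V", seq_len(p))

mm  <- model.matrix(~ ., data = dat)             # dummy coding, reference level dropped
X   <- mm[, -1]                                  # sum(U - 1) columns
asg <- attr(mm, "assign")[-1]                    # variable index for each column

beta <- rep(0, ncol(X))
beta[asg == 1]  <- c(1, 1, 1, 2)                 # clustered
beta[asg == 3]  <- c(1, 1, 0, 2, 2)              # unstructured
beta[asg == 15] <- c(rep(1, 7), 2, 2)            # clustered
beta[asg == 31] <- seq(1, 3, by = 0.25)          # smooth, steps of 0.25
y <- as.numeric(X %*% beta + rnorm(n))           # error term N(0, 1)

dim(X)                                           # n x sum(U - 1) design matrix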

3 Conclusions

Tutz and Gertheiss have provided a very insightful and careful overview of the methodology developed in recent years for regularized regression for categorical data. Their focus is on point estimation, which is the first step in statistical inference. We have illustrated that recent techniques for constructing confidence statements in high-dimensional generalized regression have the potential to be useful also for regression with categorical data. For a simulated dataset with sample size n = 800, 100 categorical variables and 581 parameters in total, we obtain very reasonable results even without smoothing or clustering of categorical values: this can serve as a step for inferring which categorical variables (or parts of their levels) should be subject to smoothing or clustering as described in TG. Obtaining more powerful confidence statements by also using smoothing or clustering of ordinal categories might be possible: achieving this, particularly in the high-dimensional setting, is an open problem.

References

Breiman, L. (2001). Random Forests. Machine Learning, 45.
Bühlmann, P. (2013). Statistical significance in high-dimensional linear models. Bernoulli, 19.
Bühlmann, P. and van de Geer, S. (2011). Statistics for High-Dimensional Data: Methods, Theory and Applications. Springer.
Dezeure, R., Bühlmann, P., Meier, L., and Meinshausen, N. (2015). High-dimensional inference: confidence intervals, p-values and R-software hdi. Statistical Science, 30.
Fellinghauer, B., Bühlmann, P., Ryffel, M., von Rhein, M., and Reinhardt, J. (2013). Stable graphical model estimation with Random Forests for discrete, continuous, and mixed variables. Computational Statistics & Data Analysis, 64.
Hothorn, T., Bühlmann, P., Dudoit, S., Molinaro, A., and van der Laan, M. (2006). Survival ensembles. Biostatistics, 7.
Javanmard, A. and Montanari, A. (2014). Confidence intervals and hypothesis testing for high-dimensional regression. Journal of Machine Learning Research, 15.
Meier, L., Meinshausen, N., and Dezeure, R. (2014). hdi: High-Dimensional Inference. R package.
Stekhoven, D. and Bühlmann, P. (2012). MissForest - nonparametric missing value imputation for mixed-type data. Bioinformatics, 28.

Strobl, C., Boulesteix, A.-L., Kneib, T., Augustin, T., and Zeileis, A. (2008). Conditional variable importance for Random Forests. BMC Bioinformatics, 9:307.
Tibshirani, R. (1996). Regression shrinkage and selection via the Lasso. Journal of the Royal Statistical Society, Series B, 58.
van de Geer, S. and Stucky, B. (2015). χ²-confidence sets in high-dimensional regression. To appear in the Abel Symposium 2014 book; preprint on arXiv.
van de Geer, S., Bühlmann, P., Ritov, Y., and Dezeure, R. (2014). On asymptotically optimal confidence regions and tests for high-dimensional models. Annals of Statistics, 42.
Zhang, C.-H. and Zhang, S. (2014). Confidence intervals for low dimensional parameters in high dimensional linear models. Journal of the Royal Statistical Society, Series B, 76.
