
Bootstrap Estimation

Suppose a simple random sample (sampling with replacement) X_1, X_2, ..., X_n is available from some population with distribution function (c.d.f.) F(x).

Objective: make inferences about some feature of the population, e.g.,
- median
- variance
- correlation

A statistic t_n is computed from the observed data. Examples:

    Sample mean:          t_n = (1/n) Σ_{j=1}^n X_j

    Standard deviation:   t_n = sqrt{ (1/(n-1)) Σ_{j=1}^n (X_j - X̄)^2 }

    Correlation (for pairs X_j = (X_{1j}, X_{2j}), j = 1, ..., n):

    t_n = Σ_{j=1}^n (X_{1j} - X̄_1)(X_{2j} - X̄_2)
          / [ sqrt{ Σ_{j=1}^n (X_{1j} - X̄_1)^2 } sqrt{ Σ_{j=1}^n (X_{2j} - X̄_2)^2 } ]

t_n estimates some feature of the population. What can you say about the distribution of t_n, with respect to all possible samples of size n from the population?
- Expectation of t_n
- Standard deviation of t_n
- Distribution function of t_n
How can a confidence interval be constructed?

Simulation (when the population c.d.f. is known):
- For a univariate normal distribution with mean μ and variance σ^2, the c.d.f. is

    F(x) = Pr{X ≤ x} = ∫_{-∞}^{x} (1/sqrt(2πσ^2)) exp{ -(w-μ)^2 / (2σ^2) } dw

- Simulate B samples of size n and compute the value of t_n for each sample: t_{n,1}, t_{n,2}, ..., t_{n,B}

- Approximate E_F(t_n) with the simulated mean

    t̄ = (1/B) Σ_{k=1}^B t_{n,k}

- Approximate Var_F(t_n) with

    (1/(B-1)) Σ_{k=1}^B (t_{n,k} - t̄)^2

- Approximate the standard deviation of t_n with

    sqrt{ (1/(B-1)) Σ_{k=1}^B (t_{n,k} - t̄)^2 }

- Approximate the c.d.f. of t_n with

    F̂(t) = (number of samples with t_{n,k} ≤ t) / B

- Order the B values of t_n from smallest to largest,

    t_{n(1)} ≤ t_{n(2)} ≤ ... ≤ t_{n(B)},

  and approximate percentiles of the distribution of t_n.

What if F(x), the population c.d.f., is unknown?
- You cannot use a random number generator to simulate samples of size n, and values of t_n, from the actual population.
- Use a bootstrap (or resampling) method.

Basic idea:
(1) Approximate the population c.d.f. F(x) with the empirical c.d.f. F̂_n(x) obtained from the observed sample X_1, X_2, ..., X_n: assign probability 1/n to each observation in the sample. Then

    F̂_n(x) = b/n,

where b is the number of observations in the sample with X_i ≤ x, i = 1, 2, ..., n.

(2) Approximate the act of simulating B samples of size n from a population with c.d.f. F(x) by simulating B samples of size n from a population with c.d.f. F̂_n(x). The "approximating" population is the original sample.
- Sample n observations from the original sample using simple random sampling with replacement. This will be called a bootstrap sample. Repeat this B times to obtain B bootstrap samples of size n.
- Evaluate the summary statistic for each bootstrap sample:

    Sample 1:  t*_{n,1}
    Sample 2:  t*_{n,2}
    ...
    Sample B:  t*_{n,B}

- Evaluate bootstrap estimates of features of the sampling distribution of t_n when the c.d.f. is F̂_n(x):

    Ê_{F̂_n}(t_n) = (1/B) Σ_{b=1}^B t*_{n,b}

    V̂ar_{F̂_n}(t_n) = (1/(B-1)) Σ_{b=1}^B [ t*_{n,b} - Ê_{F̂_n}(t_n) ]^2

- This resampling procedure is called a nonparametric bootstrap.
- If used properly it provides consistent large sample results: as n → ∞ and B → ∞,

    (1/B) Σ_{b=1}^B t*_{n,b}  →  E_F(t_n)

    (1/(B-1)) Σ_{b=1}^B [ t*_{n,b} - (1/B) Σ_{b=1}^B t*_{n,b} ]^2  →  Var_F(t_n)
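The resampling loop just described can be sketched in a few lines of pure Python. This is an illustrative sketch only (the data values and function names are made up, not from the notes): draw B bootstrap samples with replacement, evaluate the statistic on each, and summarize the replicates.

```python
import random
import statistics

def bootstrap_estimates(x, stat, B=2000, seed=1):
    """Nonparametric bootstrap: draw B resamples of x (with replacement),
    apply the statistic to each, and return the bootstrap mean, the
    bootstrap standard error (divisor B-1), and the replicates."""
    rng = random.Random(seed)
    n = len(x)
    reps = [stat([rng.choice(x) for _ in range(n)]) for _ in range(B)]
    mean = sum(reps) / B
    se = (sum((t - mean) ** 2 for t in reps) / (B - 1)) ** 0.5
    return mean, se, reps

# illustrative data, not from the source
x = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9]
boot_mean, boot_se, reps = bootstrap_estimates(x, statistics.mean)
```

The same loop works for any statistic `stat` (median, standard deviation, correlation) because the bootstrap treats it as a black box.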

The bootstrap is a large sample method.
- Large number of bootstrap samples: as B → ∞,

    (1/B) Σ_{b=1}^B t*_{n,b}  →  E_{F̂_n}(t_n)

    (1/(B-1)) Σ_{b=1}^B [ t*_{n,b} - Ê_{F̂_n}(t_n) ]^2  →  Var_{F̂_n}(t_n)

- Consistency: the original sample size must also become large. As n → ∞, F̂_n(x) → F(x) for any x. Then

    E_{F̂_n}(t_n) → E_F(t_n)   and   Var_{F̂_n}(t_n) → Var_F(t_n).

- For small values of n, F̂_n(x) could deviate substantially from F(x).

What is a good value for B?
- standard deviation: B ≈ 200
- confidence interval: B ≈ 1000
- more demanding applications: B ≈ 5000

Example 1: Average values for GPA and LSAT scores for students admitted to n = 15 law schools in 1973.

[Table: School, LSAT, GPA for the 15 sampled schools; scatterplot of GPA versus LSAT score, "Law School Data"]

- These schools were randomly selected from a larger population of law schools.
- We want to make inferences about the correlation between GPA and LSAT scores for the population of law schools.
- The sample correlation (n = 15) is r = 0.7764 = t_n.

Bootstrap samples: take samples of n = 15 schools, using simple random sampling with replacement (each bootstrap sample is a list of 15 resampled schools).

    Sample 1:  t*_{n,1} = 0.8586 = r*_{15,1}
    Sample 2:  t*_{n,2} = 0.6673 = r*_{15,2}

Repeat this B = 5000 times to obtain t*_{n,1}, t*_{n,2}, ..., t*_{n,5000}.

Estimated correlation from the original sample of n = 15 law schools: r = 0.7764.

[Histogram: 5000 bootstrap correlations]

The bootstrap standard error (from B = 5000 bootstrap samples) is

    S_r = sqrt{ (1/(B-1)) Σ_{b=1}^B [ t*_{n,b} - (1/B) Σ_{j=1}^B t*_{n,j} ]^2 }
        = sqrt{ (1/4999) Σ_{b=1}^{5000} [ r*_{15,b} - (1/5000) Σ_{j=1}^{5000} r*_{15,j} ]^2 }

[Table: bootstrap estimate of the standard error for B = 25, 50, 100, 250, 500, 1000, 2500, and 5000 bootstrap samples]

"Very seldom are more than B = 200 replications needed for estimating a standard error."
    -- Efron & Tibshirani (1993), page 52
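The bootstrap standard error of a correlation, computed by resampling the (LSAT, GPA) pairs, can be sketched in pure Python. The pairs below are illustrative values on the LSAT/GPA scale (the actual data table did not survive the transcription), and the helper names are made up:

```python
import random

def pearson_r(pairs):
    """Pearson correlation of a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(p[0] for p in pairs) / n
    my = sum(p[1] for p in pairs) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in pairs)
    sxx = sum((p[0] - mx) ** 2 for p in pairs)
    syy = sum((p[1] - my) ** 2 for p in pairs)
    return sxy / (sxx * syy) ** 0.5

def bootstrap_se_corr(pairs, B=2000, seed=7):
    """Resample whole pairs with replacement; SE of the replicates."""
    rng = random.Random(seed)
    n = len(pairs)
    reps = [pearson_r([rng.choice(pairs) for _ in range(n)])
            for _ in range(B)]
    mean = sum(reps) / B
    return (sum((r - mean) ** 2 for r in reps) / (B - 1)) ** 0.5

data = [(576, 3.39), (635, 3.30), (558, 2.81), (578, 3.03), (666, 3.44),
        (580, 3.07), (555, 3.00), (661, 3.43), (651, 3.36), (605, 3.13),
        (653, 3.12), (575, 2.74), (545, 2.76), (572, 2.88), (594, 2.96)]
boot_se_r = bootstrap_se_corr(data)
```

Note that whole (LSAT, GPA) pairs are resampled together; resampling the two columns independently would destroy the correlation being estimated.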

[Table: School, LSAT, GPA, and Type for the population of 82 law schools]

The correlation coefficient for the population of 82 law schools in 1973 is ρ = 0.7600.

The exact distribution of estimated correlation coefficients for random samples of size n = 15 from this population involves an enormous number of possible samples of size n = 15.

[Summary of results from 100,000 samples of size n = 15: percentiles, min, max, mean, standard error]

Bias: from a sample of size n, t_n is used to estimate a population parameter θ:

    Bias_F(t_n) = E_F(t_n) - θ

where E_F(t_n) is the average across all possible samples of size n from the population, and θ is the true parameter value.

Law school example:

    Bias_F(r_15) = E_F(r_15) - 0.7600 = 0.7464 - 0.7600 = -0.0136

Bootstrap estimate of bias: t_n plays the role of the "true value" if you take the original sample as the population:

    Bias_{F̂_n}(t_n) = E_{F̂_n}(t_n) - t_n,

where E_{F̂_n}(t_n) is approximated by the average of results from B bootstrap samples, (1/B) Σ_{b=1}^B t*_{n,b}.
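The bootstrap bias estimate just defined (average of the replicates minus the original statistic) can be sketched in pure Python. The data and the choice of statistic here are illustrative, not from the notes; the divisor-n standard deviation is used only because it is a familiar biased estimator:

```python
import random

def bootstrap_bias(x, stat, B=2000, seed=3):
    """Bootstrap bias estimate: mean of the bootstrap replicates minus the
    statistic on the original sample.  Also returns the bias-corrected
    value t_n - bias_hat."""
    rng = random.Random(seed)
    n = len(x)
    t = stat(x)
    reps = [stat([rng.choice(x) for _ in range(n)]) for _ in range(B)]
    bias = sum(reps) / B - t
    return t, bias, t - bias

def sd_mle(v):
    """Standard deviation with divisor n (biased downward for small n)."""
    n = len(v)
    m = sum(v) / n
    return (sum((u - m) ** 2 for u in v) / n) ** 0.5

x = [1.2, 0.7, 2.5, 1.9, 0.3, 1.5, 2.2, 0.9, 1.1, 1.8]
t, bias, corrected = bootstrap_bias(x, sd_mle)
```

As the notes warn below, subtracting the estimated bias can inflate the variance of the estimator, so the corrected value is not automatically better.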

Improved bootstrap bias estimation: Efron & Tibshirani (1993), Section 10.4.

Law school example (B = 5000 bootstrap samples):

    B̂ias(r_15) = (1/5000) Σ_{b=1}^{5000} r*_{15,b} - r_15 = 0.7692 - 0.7764 = -0.0072

Bias corrected estimates:

    t̃_n = t_n - B̂ias(t_n) = t_n - [ (1/B) Σ_{b=1}^B t*_{n,b} - t_n ]

Law school example:

    r̃_15 = r_15 - B̂ias(r_15) = 0.7764 - (-0.0072) = 0.7836

Here we moved farther away from ρ = 0.7600.

Bias correction can be dangerous in practice:
- t̃_n may have a substantially larger variance than t_n
- MSE_F(t̃_n) = [Bias_F(t̃_n)]^2 + Var_F(t̃_n) is often larger than
  MSE_F(t_n) = [Bias_F(t_n)]^2 + Var_F(t_n)

Empirical Percentile Bootstrap Confidence Intervals

Construct an approximate (1-α)100% confidence interval for θ:
- t_n is an estimator for θ
- Compute B bootstrap samples to obtain t*_{n,1}, ..., t*_{n,B}
- Order the bootstrapped values from smallest to largest:

    t*_{n,(1)} ≤ t*_{n,(2)} ≤ ... ≤ t*_{n,(B)}

- Compute the lower and upper percentile indices:

    k_L = [ (B+1) α/2 ] = largest integer ≤ (B+1) α/2,    k_U = B + 1 - k_L

Then an approximate (1-α)100% confidence interval for θ is

    [ t*_{n,(k_L)}, t*_{n,(k_U)} ]

Law school example (B = 5000 bootstrap samples): construct an approximate 90% confidence interval for ρ = population correlation. With α = 0.10,

    k_L = [ (5001)(0.05) ] = [250.05] = 250,    k_U = 5001 - 250 = 4751

An approximate large sample 90% confidence interval is

    [ r*_{15,(250)}, r*_{15,(4751)} ] = [0.507, 0.9487]

[Histogram: 5000 bootstrap correlations]

Law school example, Fisher Z-transformation:

    Z_n = (1/2) log( (1 + r_n)/(1 - r_n) )  ~  approx. N( (1/2) log( (1 + ρ)/(1 - ρ) ), 1/(n-3) )

- An approximate (1-α)100% confidence interval for (1/2) log( (1+ρ)/(1-ρ) ) is

    lower limit:  Z_L = Z_n - Z_{α/2}/sqrt(n-3)
    upper limit:  Z_U = Z_n + Z_{α/2}/sqrt(n-3)

- Transform back to the original scale:

    [ (e^{2 Z_L} - 1)/(e^{2 Z_L} + 1),  (e^{2 Z_U} - 1)/(e^{2 Z_U} + 1) ]

For n = 15 and r_15 = 0.7764 we have

    Z_15 = (1/2) log( 1.7764 / 0.2236 ) = 1.0364

and

    Z_L = Z_15 - Z_{.05}/sqrt(15-3) = 0.5615
    Z_U = Z_15 + Z_{.05}/sqrt(15-3) = 1.5113

so an approximate 90% confidence interval for the correlation is (0.509, 0.907).

The bootstrap percentile interval would approximate this interval if the original sample had been taken from a bivariate normal distribution.

Percentile Bootstrap Confidence Intervals

Suppose there is a transformation φ = m(θ) such that

    φ̂ = m(t_n) ~ N(φ, ω^2)

for some standard deviation ω. Then an approximate confidence interval for θ is

    [ m^{-1}( φ̂ - Z_{α/2} ω ),  m^{-1}( φ̂ + Z_{α/2} ω ) ]

- The bootstrap percentile interval is a consistent approximation. (You do not have to identify the m(·) transformation.)
- The bootstrap approximation becomes more accurate for larger samples.
- For smaller samples, the coverage probability of the bootstrap percentile interval tends to be smaller than the nominal level (1-α)100%.
- A percentile interval is entirely inside the parameter space. A percentile confidence interval for a correlation lies inside the interval [-1, 1].
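The empirical percentile interval described above, with k_L = [(B+1)α/2] and k_U = B+1-k_L, can be sketched in pure Python (illustrative data and function names, not from the notes):

```python
import random

def percentile_ci(x, stat, alpha=0.10, B=999, seed=11):
    """Empirical percentile bootstrap interval: order the B bootstrap
    replicates and take the k_L-th and k_U-th order statistics, where
    k_L = [(B+1)*alpha/2] and k_U = B + 1 - k_L."""
    rng = random.Random(seed)
    n = len(x)
    reps = sorted(stat([rng.choice(x) for _ in range(n)])
                  for _ in range(B))
    k_lo = int((B + 1) * alpha / 2)        # largest integer <= (B+1)*alpha/2
    k_hi = B + 1 - k_lo
    return reps[k_lo - 1], reps[k_hi - 1]  # 1-based order statistics

x = [4.1, 5.3, 3.8, 6.0, 4.9, 5.5, 4.4, 5.1, 4.7, 5.8]
lo, hi = percentile_ci(x, lambda s: sum(s) / len(s))
```

With B = 999 and α = 0.10 this picks the 50th and 950th ordered replicates, mirroring the k_L = 250, k_U = 4751 computation for B = 5000 in the example above.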

Bias-corrected and accelerated (BCa) bootstrap percentile confidence intervals:
- Simulate B bootstrap samples and order the resulting estimates

    t*_{n,(1)} ≤ t*_{n,(2)} ≤ ... ≤ t*_{n,(B)}

- The BCa interval of intended coverage 1 - α is given by

    [ t*_{n,([α_1 (B+1)])},  t*_{n,([α_2 (B+1)])} ]

where

    α_1 = Φ( Z_0 + (Z_0 - Z_{α/2}) / (1 - a (Z_0 - Z_{α/2})) )
    α_2 = Φ( Z_0 + (Z_0 + Z_{α/2}) / (1 - a (Z_0 + Z_{α/2})) )

and
- Φ(·) is the c.d.f. for the standard normal distribution;
- Z_{α/2} is an "upper" percentile of the standard normal distribution, e.g., Z_{.05} = 1.645, and Φ(Z_{α/2}) = 1 - α/2;
- Z_0 = Φ^{-1}( proportion of t*_{n,b} values smaller than t_n ) is roughly a "measure of median bias of t_n in normal units." When exactly half of the bootstrap samples have t*_{n,b} values less than t_n, then Z_0 = 0;
- the estimated acceleration is

    a = Σ_{j=1}^n ( t_{n,(·)} - t_{n,-j} )^3 / { 6 [ Σ_{j=1}^n ( t_{n,(·)} - t_{n,-j} )^2 ]^{3/2} }

  where t_{n,-j} is the value of t_n when the j-th case is removed from the sample, and t_{n,(·)} = (1/n) Σ_{j=1}^n t_{n,-j}.
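The BCa construction above can be sketched in pure Python: Z_0 from the proportion of replicates below t_n, the acceleration a from jackknife leave-one-out values, then the adjusted percentiles α_1 and α_2. This is an illustrative sketch with made-up data, using the statistics module for the normal c.d.f. and quantile:

```python
import random
from statistics import NormalDist

def bca_ci(x, stat, alpha=0.10, B=999, seed=5):
    """BCa bootstrap interval (sketch following the formulas above)."""
    rng = random.Random(seed)
    nd = NormalDist()
    n = len(x)
    t = stat(x)
    reps = sorted(stat([rng.choice(x) for _ in range(n)])
                  for _ in range(B))
    # median-bias correction Z_0 (clamped away from 0 and 1)
    prop = sum(r < t for r in reps) / B
    z0 = nd.inv_cdf(min(max(prop, 1.0 / B), 1 - 1.0 / B))
    # acceleration a from jackknife leave-one-out values
    jack = [stat(x[:j] + x[j + 1:]) for j in range(n)]
    jbar = sum(jack) / n
    num = sum((jbar - tj) ** 3 for tj in jack)
    den = 6 * (sum((jbar - tj) ** 2 for tj in jack)) ** 1.5
    a = num / den if den else 0.0
    z = nd.inv_cdf(1 - alpha / 2)
    def adj(zq):
        return nd.cdf(z0 + (z0 + zq) / (1 - a * (z0 + zq)))
    a1, a2 = adj(-z), adj(z)
    k1 = max(int(a1 * (B + 1)), 1)
    k2 = min(int(a2 * (B + 1)), B)
    return reps[k1 - 1], reps[k2 - 1]

x = [2.3, 4.1, 3.3, 5.2, 2.9, 3.8, 4.6, 3.1, 2.5, 4.9]
lo, hi = bca_ci(x, lambda s: sum(s) / len(s))
```

When Z_0 = 0 and a = 0 this reduces to the ordinary percentile interval, which makes the roles of the two corrections easy to see.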

- BCa intervals are second order accurate:

    Pr{ θ < lower end of BCa interval } = α/2 + C_lower/n
    Pr{ θ > upper end of BCa interval } = α/2 + C_upper/n

- Bootstrap percentile intervals are first order accurate:

    Pr{ θ < lower end } = α/2 + C*_lower/sqrt(n)
    Pr{ θ > upper end } = α/2 + C*_upper/sqrt(n)

- ABC intervals are approximations to BCa intervals: second order accurate, and they only use about 3% of the computation time. (See Efron & Tibshirani (1993), Chapter 14.)

# This is S-PLUS code for creating
# bootstrapped confidence intervals
# for a correlation coefficient. It is
# stored in the file
#
#     lawschl.ssc
#
# Any line preceded with a pound sign
# is a comment that is ignored by the
# program. The law school data are
# read from the file
#
#     lawschl.dat

# Enter the law school data into a data frame
laws <- read.table("lawschl.dat",
    col.names=c("School","LSAT","GPA"))
laws

[Listing: School, LSAT, GPA for the 15 schools]

# Plot the data
par(fin=c(7.0,7.0), pch=16, mkh=.5, mex=.5)
plot(laws$GPA, laws$LSAT, type="p",
    xlab="GPA", ylab="LSAT score",
    main="Law School Data")

# Compute the sample correlation
rr <- cor(laws$LSAT, laws$GPA)
cat("Estimated correlation: ", round(rr,5), fill=T)

Estimated correlation:  0.7764

# First test for zero correlation
n <- length(laws$LSAT)
tt <- sqrt(n-2)*rr/sqrt(1 - rr*rr)
pval <- 1 - pt(tt, n-2)
pval <- round(pval, digits=5)
cat("t-test for zero correlation: ", round(tt,4), fill=T)
cat("p-value for the t-test for zero correlation: ", pval, fill=T)

[Output: the t statistic and its p-value]

# Use Fisher's z-transformation to construct
# approximate confidence intervals.
# First set the level of confidence at 1-alpha.
alpha <- .10
z <- 0.5*log((1+rr)/(1-rr))
zl <- z - qnorm(1-alpha/2)/sqrt(n-3)
zu <- z + qnorm(1-alpha/2)/sqrt(n-3)
rl <- round((exp(2*zl)-1)/(exp(2*zl)+1), digits=4)
ru <- round((exp(2*zu)-1)/(exp(2*zu)+1), digits=4)
per <- (1-alpha)*100
cat(per, "% confidence interval: (", rl, ", ", ru, ")", fill=T)

90 % confidence interval: ( 0.509 ,  0.9071 )

# Compute bootstrap confidence intervals.
# Use B=5000 bootstrap samples.
nboot <- 5000
rboot <- bootstrap(data=laws,
    statistic=cor(GPA,LSAT), B=nboot)

Forming replications 1 to 100
Forming replications 101 to 200
  ...
Forming replications 4901 to 5000
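The Fisher z interval computed by the S-PLUS code above can be cross-checked with a short Python sketch (function name is made up; `NormalDist` supplies the normal quantile):

```python
import math
from statistics import NormalDist

def fisher_ci(r, n, alpha=0.10):
    """Fisher z confidence interval for a correlation coefficient."""
    z = 0.5 * math.log((1 + r) / (1 - r))       # z-transform of r
    half = NormalDist().inv_cdf(1 - alpha / 2) / math.sqrt(n - 3)
    back = lambda v: (math.exp(2 * v) - 1) / (math.exp(2 * v) + 1)
    return back(z - half), back(z + half)       # back-transform the limits

lo, hi = fisher_ci(0.7764, 15)   # roughly (0.509, 0.907)
```

This reproduces the 90% interval reported for the law school data.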

# limits.emp(): calculates empirical percentiles
#     for the bootstrapped parameter
#     estimates in a resamp object.
#     The quantile function is used to
#     calculate the empirical percentiles.
# usage:
#     limits.emp(x, probs=c(0.025, 0.05, 0.95, 0.975))

limits.emp(rboot, probs=c(0.05,0.95))

        5%    95%
Param   ...   ...

# limits.bca(): calculates BCa (bootstrap
#     bias-corrected, adjusted)
#     confidence limits.
# usage:
#     limits.bca(boot.obj,
#         probs=c(0.025, 0.05, 0.95, 0.975),
#         details=F, z0=NULL,
#         acceleration=NULL,
#         group.size=NULL,
#         frame.eval.jack=sys.parent())

limits.bca(rboot, probs=c(0.05,0.95), details=T)

[Output: $limits (5% and 95% BCa limits), $emp.probs, $z0, $acceleration, $group.size]

# Do another set of 5000 bootstrapped values
rboot <- bootstrap(data=laws,
    statistic=cor(GPA,LSAT), B=nboot)
limits.emp(rboot, probs=c(0.05,0.95))

        5%    95%
Param   ...   ...

# Both sets of confidence intervals could
# have been obtained from the summary()
# function
summary(rboot)

Call: bootstrap(data = laws, statistic = cor(GPA, LSAT), B = nboot)
Number of Replications: 5000

Summary Statistics:
        Observed   Bias   Mean   SE
Param   ...

Empirical Percentiles:
        2.5%   5%   95%   97.5%
Param   ...

BCa Percentiles:
        2.5%   5%   95%   97.5%
Param   ...

# Make a histogram of the bootstrapped
# correlations.
hist(rboot$rep, nclass=50, xlab=" ",
    main="5000 Bootstrap Correlations")

[Histogram: 5000 bootstrap correlations]

The bootstrap can fail.

Example: X_1, X_2, ..., X_n are sampled from a uniform (0, θ) distribution:

    true density:  f(x) = 1/θ,  0 < x < θ

    true c.d.f.:   F(x) = 0 for x ≤ 0;  x/θ for 0 < x ≤ θ;  1 for x > θ

Bootstrap percentile confidence intervals for θ tend to be too short.

Application of the bootstrap must adequately replicate the random process that produced the original sample:
- Simple random samples
- "Nested" experiments:
    sample plants from a field,
    sample leaves from plants
- Curve fitting (existence of covariates):
    fixed levels,
    random samples
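The uniform(0, θ) failure mode can be made concrete with a small simulation (illustrative, not from the notes). The natural estimator of θ is the sample maximum, but a bootstrap resample reproduces that same maximum with probability 1 - (1 - 1/n)^n ≈ 0.632, so the bootstrap distribution of the maximum piles up on a single point and percentile intervals come out too short:

```python
import random

rng = random.Random(42)
n, B = 50, 2000
x = [rng.uniform(0, 1.0) for _ in range(n)]   # theta = 1
t = max(x)                                     # estimator of theta

# bootstrap replicates of the maximum
reps = [max(rng.choice(x) for _ in range(n)) for _ in range(B)]

# fraction of replicates that equal the original sample maximum
frac_at_max = sum(r == t for r in reps) / B    # expect roughly 0.63
```

The replicates can never exceed t, so the bootstrap distribution is one-sided and degenerate at the boundary, which is exactly the kind of non-smooth statistic where the nonparametric bootstrap breaks down.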

Parametric Bootstrap

- Suppose you "knew" that (X_{1j}, X_{2j}), j = 1, ..., n, were obtained from a simple random sample from a bivariate normal distribution, i.e.,

    X_j = (X_{1j}, X_{2j})'  ~  NID( μ, Σ ),   μ = (μ_1, μ_2)'

- Estimate the unknown parameters:

    μ̂ = X̄ = (1/n) Σ_{j=1}^n X_j

    Σ̂ = (1/(n-1)) Σ_{j=1}^n (X_j - X̄)(X_j - X̄)'

- Obtain a bootstrap sample of size n by sampling from a N(μ̂, Σ̂) distribution, i.e., X*_{1,b}, ..., X*_{n,b}, and compute t*_{n,b} = r*_{n,b}.
- Repeat this to obtain r*_{n,1}, r*_{n,2}, ..., r*_{n,B}.
- Compute bootstrap standard errors, bias estimators, and confidence intervals.

References

Davison, A.C. and Hinkley, D.V. (1997) Bootstrap Methods and Their Applications, Cambridge Series in Statistical and Probabilistic Mathematics, Cambridge University Press, New York.

Efron, B. (1982) The Jackknife, the Bootstrap and Other Resampling Plans, CBMS 38, SIAM-NSF, Philadelphia.

Efron, B. and Gong, G. (1983) A leisurely look at the bootstrap, the jackknife, and cross-validation, The American Statistician, 37, 36-48.

Efron, B. (1987) Better bootstrap confidence intervals (with discussion), Journal of the American Statistical Association, 82, 171-200.

Efron, B. and Tibshirani, R. (1993) An Introduction to the Bootstrap, Chapman and Hall, New York.

Shao, J. and Tu, D. (1995) The Jackknife and Bootstrap, Springer, New York.

Example 2: Stormer viscometer data (Venables & Ripley, Chapter 8)
- measures the viscosity of a fluid
- measures the time taken for an inner cylinder in the mechanism to complete a specific number of revolutions in response to an actuating weight
- calibrate the viscometer using runs with
    varying weights (W) (g) and
    fluids with known viscosity (V),
  recording the time (T) (sec)

- theoretical model:

    T = β_1 V / (W - β_2) + ε

# This code is used to explore the
# Stormer viscometer data. It is stored
# in the file
#
#     stormer.ssc
#
# Enter the data into a data frame.
# The data are stored in the file
#
#     stormer.dat

library(MASS)
stormer <- read.table("stormer.dat")
stormer

[Table: Viscosity, Wt, Time for the 23 calibration runs]

Starting values for β_1 and β_2: fit an approximate linear model. Note that

    T_i = β_1 V_i / (W_i - β_2) + ε_i
    ⇒ (W_i - β_2) T_i = β_1 V_i + ε_i (W_i - β_2)
    ⇒ W_i T_i = β_1 V_i + β_2 T_i + ε_i (W_i - β_2)

where W_i T_i is the new response variable and ε_i (W_i - β_2) is the new error.

# Use a linear approximation to obtain
# starting values for least squares
# estimation in the non-linear model
fm0 <- lm(Wt*Time ~ Viscosity + Time - 1, data=stormer)
b0 <- coef(fm0)
names(b0) <- c("b1","b2")

# Fit the non-linear model
storm.fm <- nls(formula = Time ~ b1*Viscosity/(Wt-b2),
    data = stormer, start = b0, trace = T)

Use OLS estimation to obtain the starting values β_1^(0) = 28.876 and β_2^(0) = ...

# Create a bivariate confidence region
# for the (b1,b2) parameters.
# First set up a grid of (b1,b2) values
bc <- coef(storm.fm)
se <- sqrt(diag(vcov(storm.fm)))
dv <- deviance(storm.fm)
summary(storm.fm)$parameters

        Value   Std. Error   t value
b1      ...
b2      ...

gsize <- 51
b1 <- bc[1] + seq(-3*se[1], 3*se[1], length = gsize)
b2 <- bc[2] + seq(-3*se[2], 3*se[2], length = gsize)
bv <- expand.grid(b1, b2)

# Create a function to evaluate sums of squares
ssq <- function(b)
    sum((stormer$Time - b[1]*stormer$Viscosity/
        (stormer$Wt - b[2]))^2)

# Evaluate the sum of squared residuals and
# approximate F-ratios for all of the
# (b1,b2) values on the grid
dbeta <- apply(bv, 1, ssq)
n <- length(stormer$Viscosity)
df1 <- length(bc)
df2 <- n - df1
fstat <- matrix(((dbeta - dv)/df1)/(dv/df2), gsize, gsize)

# Create the plot
par(fin=c(7.0,7.0), mex=.5, lwd=3)
plot(b1, b2, type="n", main="95% Confidence Region")
contour(b1, b2, fstat, levels=c(1,2,5,7,10,15,20),
    labex=0.75, lty=2, add=T)
contour(b1, b2, fstat, levels=qf(0.95,2,21),
    labex=0, lty=1, add=T)
text(31.6, 0.3, "95% CR", adj=0, cex=0.75)
points(bc[1], bc[2], pch=3, mkh=0.5)
# remove b1, b2, and bv
rm(b1, b2, bv, fstat)

Construct a joint confidence region for (β_1, β_2):
- Deviance (residual sum of squares):

    D(β_1, β_2) = Σ_{j=1}^n [ T_j - β_1 V_j/(W_j - β_2) ]^2

- Approximate F-statistic:

    F(β_1, β_2) = [ ( D(β_1, β_2) - D(β̂_1, β̂_2) ) / 2 ] / [ D(β̂_1, β̂_2) / (n - 2) ]

- An approximate (1-α)100% confidence region consists of all (β_1, β_2) such that

    F(β_1, β_2) < F_{2, n-2; α}

[Figure: 95% confidence region for (β_1, β_2), with F-ratio contours and the region labeled "95% CR"]

Bootstrap Estimation

Bootstrap I: sample n = 23 cases (T_i, W_i, V_i) from the original data set (using simple random sampling with replacement).

#=========================================
# Bootstrap I:
# Treat the regressors as random and
# resample the cases (y, x_1, x_2)
#=========================================
storm.boot <- bootstrap(stormer,
    coef(nls(Time ~ b1*Viscosity/(Wt-b2),
        data=stormer, start=bc)), B=1000)
summary(storm.boot)

Call: bootstrap(data = stormer,
    statistic = coef(nls(Time ~ (b1 * Viscosity)/(Wt - b2),
        data = stormer, start = bc)), B = 1000)
Number of Replications: 1000

Summary Statistics:
      Observed   Bias   Mean   SE
b1    ...
b2    ...

Empirical Percentiles:
      2.5%   5%   95%   97.5%
b1    ...
b2    ...

BCa Percentiles:
      2.5%   5%   95%   97.5%
b1    ...
b2    ...

Correlation of Replicates:
      b1    b2
b1    ...
b2    ...

# Produce histograms of the bootstrapped
# values of the regression coefficients.
# The following code is used to draw
# non-parametric estimated densities,
# normal densities, and a histogram
# on the same graph.

# truehist(): plot a histogram (prob=T
#     by default; for the function
#     hist(), probability=F by
#     default).

# width.SJ(): bandwidth selection by pilot
#     estimation of derivatives.
#     Uses the method of Sheather
#     & Jones (1991) to select the
#     bandwidth of a Gaussian kernel
#     density estimator.

b1.boot <- storm.boot$rep[,1]
b2.boot <- storm.boot$rep[,2]

library(MASS)
par(fin=c(7.0,7.0), mex=.5)

# density(): estimate probability density
#     function. Returns x and y
#     coordinates of a non-parametric
#     estimate of the probability
#     density of the data. Options
#     include the type of window to
#     use and the number of points
#     at which to estimate the density.
#     n = the number of equally spaced
#     points at which the density
#     is evaluated.

mm <- range(b1.boot)
min.int <- floor(mm[1])
max.int <- ceiling(mm[2])
truehist(b1.boot, xlim=c(min.int, max.int))
width.SJ(b1.boot)

b1.boot.ns <- density(b1.boot, n=200, width=.6)
b1.boot.nd <- list(x = b1.boot.ns$x,
    y = dnorm(b1.boot.ns$x, mean(b1.boot),
        sqrt(var(b1.boot))))

# Draw the non-parametric density
lines(b1.boot.ns, lty=3, lwd=2)
# Draw the normal density
lines(b1.boot.nd, lty=1, lwd=2)
legend(27, -0.30, c("Nonparametric", "Normal approx."),
    lty=c(3,1), bty="n", lwd=2)

# Display the distribution of the estimate
# of the second parameter
par(fin=c(7.0,7.0), mex=.5)
mm <- range(b2.boot)
min.int <- floor(mm[1])
max.int <- ceiling(mm[2])
truehist(b2.boot, xlim=c(min.int, max.int))
b2.boot.ns <- density(b2.boot, n=200, width=.8)
b2.boot.nd <- list(x = b2.boot.ns$x,
    y = dnorm(b2.boot.ns$x, mean(b2.boot),
        sqrt(var(b2.boot))))
lines(b2.boot.ns, lty=3, lwd=2)
lines(b2.boot.nd, lty=1, lwd=2)
legend(1, -0.30, c("Nonparametric", "Normal approx."),
    lty=c(3,1), bty="n", lwd=2)

[Histograms: b1.boot and b2.boot with nonparametric and normal density overlays]

Bootstrap II: fix the values of the explanatory variables {(W_j, V_j): j = 1, ..., n}.
- Compute residuals from fitting the model to the original sample:

    e_j = T_j - β̂_1 V_j / (W_j - β̂_2),   j = 1, ..., n

- Approximate sampling from the population of random errors by taking a sample (with replacement) from {e_1, ..., e_n}, say

    e*_{1,b}, e*_{2,b}, ..., e*_{n,b}

- Create new observations (T*_{j,b}, W_j, V_j), j = 1, ..., n, where

    T*_{j,b} = β̂_1 V_j / (W_j - β̂_2) + e*_{j,b}

- Fit the model to the b-th bootstrap sample to obtain β*_{1,b} and β*_{2,b}.
- Repeat this for b = 1, ..., B bootstrap samples.

#=======================================
# Bootstrap II:
# Treat the regressors as fixed and
# resample from the residuals
#=======================================
# Center the residuals at zero and
# divide residuals by linear approximations
# to a multiple of the standard errors. These
# centered and scaled residuals approximately
# have the same first two moments as the
# random errors, but they are not quite
# uncorrelated.

grad.f <- deriv3(
    expr = Y ~ (b1*X1)/(X2-b2),
    namevec = c("b1", "b2"),
    function.arg = function(b1, b2, X1, X2, Y) NULL)

rs <- scale(resid(storm.fm), center=T, scale=F)
g <- grad.f(bc[1], bc[2], stormer$Viscosity,
    stormer$Wt, stormer$Time)
D <- attr(g, "gradient")
h <- 1 - diag(D %*% solve(t(D) %*% D) %*% t(D))
rs <- rs/sqrt(h)
dfe <- length(rs) - length(coef(storm.fm))
vr <- var(rs)
rs <- rs * sqrt(deviance(storm.fm)/dfe/vr)
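The fixed-x (residual) resampling scheme described above can be sketched in pure Python. For clarity this sketch uses a simple straight-line model rather than the nonlinear Stormer model, and its data and names are made up; the structure — fit once, resample residuals, rebuild responses as fitted + e*, refit — is the same:

```python
import random

def fit_line(xs, ys):
    """OLS intercept and slope for y = b0 + b1*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    return my - b1 * mx, b1

def residual_bootstrap(xs, ys, B=1000, seed=4):
    """Fixed-x bootstrap: refit the model to y* = fitted + resampled residual."""
    rng = random.Random(seed)
    b0, b1 = fit_line(xs, ys)
    fitted = [b0 + b1 * x for x in xs]
    resid = [y - f for y, f in zip(ys, fitted)]
    reps = []
    for _ in range(B):
        ystar = [f + rng.choice(resid) for f in fitted]  # new responses
        reps.append(fit_line(xs, ystar))                 # refit
    return reps

xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1, 12.2]
reps = residual_bootstrap(xs, ys)
slopes = [b1 for _, b1 in reps]
```

Unlike case resampling (Bootstrap I), every bootstrap data set here keeps the original design points, which matches the fixed-regressor view of the experiment.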

# Create a function to use in fitting
# the model to bootstrap samples
storm.bf <- function(rs) {
    assign("Tim", fitted(storm.fm) + rs, frame = 1)
    nls(formula = Tim ~ (b1*Viscosity)/(Wt-b2),
        data = stormer,
        start = coef(storm.fm))$parameters
}

# Compute 1000 bootstrapped values of the
# regression parameters
storm.boot <- bootstrap(data=rs,
    statistic=storm.bf(rs), B=1000)

Forming replications 1 to 100
  ...
Forming replications 901 to 1000

summary(storm.fm)$parameters

        Value   Std. Error   t value
b1      ...
b2      ...

b1.boot <- storm.boot$rep[,1]
b2.boot <- storm.boot$rep[,2]

# The BCa intervals may not be correctly
# computed by the following function
summary(storm.boot)

Call: bootstrap(data = rs, statistic = storm.bf(rs), B = 1000)
Number of Replications: 1000

Summary Statistics:
      Observed   Bias   Mean   SE
b1    ...
b2    ...

Empirical Percentiles:
      2.5%   5%   95%   97.5%
b1    ...
b2    ...

BCa Confidence Limits:
      2.5%   5%   95%   97.5%
b1    ...
b2    ...

Correlation of Replicates:
      b1    b2
b1    ...
b2    ...

# The following code is used to draw
# non-parametric estimated densities,
# normal densities, and histograms
# on the same graphics window.
par(fin=c(7.0,7.0), mex=.3)

mm <- range(b1.boot)
min.int <- floor(mm[1])
max.int <- ceiling(mm[2])
truehist(b1.boot, xlim=c(min.int, max.int))

# Draw a non-parametric density
b1.boot.ns <- density(b1.boot, n=200,
    width=width.SJ(b1.boot))
lines(b1.boot.ns, lty=3, lwd=2)

# Draw a normal density
b1.boot.nd <- list(x = b1.boot.ns$x,
    y = dnorm(b1.boot.ns$x, mean(b1.boot),
        sqrt(var(b1.boot))))
lines(b1.boot.nd, lty=1, lwd=2)
legend(27, -0.1, c("Nonparametric", "Normal approx."),
    lty=c(3,1), bty="n", lwd=2)

# Display the distribution for the other
# parameter estimate
par(fin=c(7.0,7.0), mex=.3)
mm <- range(b2.boot)
min.int <- floor(mm[1])
max.int <- ceiling(mm[2])
truehist(b2.boot, xlim=c(min.int, max.int))
b2.boot.ns <- density(b2.boot, n=200,
    width=width.SJ(b2.boot))
b2.boot.nd <- list(x = b2.boot.ns$x,
    y = dnorm(b2.boot.ns$x, mean(b2.boot),
        sqrt(var(b2.boot))))
lines(b2.boot.ns, lty=3, lwd=2)
lines(b2.boot.nd, lty=1, lwd=2)
legend(0.5, -0.5, c("Nonparametric", "Normal approx."),
    lty=c(3,1), bty="n", lwd=2)

[Histogram: b1.boot with nonparametric and normal density overlays]

[Histogram: b2.boot with nonparametric and normal density overlays]

Comparison of standard errors:

    Parameter   Asymptotic normal approx.   Random Bootstrap   Fixed Bootstrap
    β_1         ...                         ...                ...
    β_2         ...                         ...                ...

Comparison of approximate 95% confidence intervals:

    Parameter   Asymptotic normal approx.*  Random Bootstrap   Fixed Bootstrap
    β_1         (27.50, 31.30)              (28.3, 30.40)      (27.4, 31.03)
    β_2         (0.83, 3.60)                (1.06, 3.5)        (1.03, 3.46)

    * computed as β̂_i ± t_{21, .025} S_{β̂_i}
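Finally, the parametric bootstrap described earlier (draw samples from the fitted bivariate normal rather than from the data) can be sketched in pure Python. The data pairs and function names are illustrative, not from the notes; the 2×2 Cholesky factor is written out by hand so no external library is needed:

```python
import random

def pearson_r(pairs):
    """Pearson correlation of a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(p[0] for p in pairs) / n
    my = sum(p[1] for p in pairs) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in pairs)
    sxx = sum((p[0] - mx) ** 2 for p in pairs)
    syy = sum((p[1] - my) ** 2 for p in pairs)
    return sxy / (sxx * syy) ** 0.5

def parametric_bootstrap_r(pairs, B=1000, seed=9):
    """Fit a bivariate normal by sample moments, then repeatedly draw n
    pairs from N(mu_hat, Sigma_hat) and recompute the correlation."""
    rng = random.Random(seed)
    n = len(pairs)
    mx = sum(p[0] for p in pairs) / n
    my = sum(p[1] for p in pairs) / n
    sxx = sum((p[0] - mx) ** 2 for p in pairs) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in pairs) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in pairs) / (n - 1)
    # Cholesky factor of Sigma_hat: X = mx + a*z1, Y = my + b*z1 + c*z2
    a = sxx ** 0.5
    b = sxy / a
    c = (syy - b * b) ** 0.5
    reps = []
    for _ in range(B):
        sample = []
        for _ in range(n):
            z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
            sample.append((mx + a * z1, my + b * z1 + c * z2))
        reps.append(pearson_r(sample))
    return reps

# illustrative pairs with a strong positive association
pairs = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8), (5.0, 10.1),
         (6.0, 12.2), (2.5, 5.3), (3.5, 7.1), (4.5, 8.8), (5.5, 11.0)]
reps = parametric_bootstrap_r(pairs)
```

The replicates `reps` feed into the same machinery as before: their standard deviation is a parametric bootstrap standard error, their mean minus the sample correlation is a bias estimate, and their ordered values give percentile intervals.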


More information

12.11 Laplace s Equation in Cylindrical and

12.11 Laplace s Equation in Cylindrical and SEC. 2. Laplace s Equation in Cylinrical an Spherical Coorinates. Potential 593 2. Laplace s Equation in Cylinrical an Spherical Coorinates. Potential One of the most important PDEs in physics an engineering

More information

ON THE NUMBER OF BOOTSTRAP REPETITIONS FOR BC a CONFIDENCE INTERVALS. DONALD W. K. ANDREWS and MOSHE BUCHINSKY COWLES FOUNDATION PAPER NO.

ON THE NUMBER OF BOOTSTRAP REPETITIONS FOR BC a CONFIDENCE INTERVALS. DONALD W. K. ANDREWS and MOSHE BUCHINSKY COWLES FOUNDATION PAPER NO. ON THE NUMBER OF BOOTSTRAP REPETITIONS FOR BC a CONFIDENCE INTERVALS BY DONALD W. K. ANDREWS and MOSHE BUCHINSKY COWLES FOUNDATION PAPER NO. 1069 COWLES FOUNDATION FOR RESEARCH IN ECONOMICS YALE UNIVERSITY

More information

The Role of Models in Model-Assisted and Model- Dependent Estimation for Domains and Small Areas

The Role of Models in Model-Assisted and Model- Dependent Estimation for Domains and Small Areas The Role of Moels in Moel-Assiste an Moel- Depenent Estimation for Domains an Small Areas Risto Lehtonen University of Helsini Mio Myrsylä University of Pennsylvania Carl-Eri Särnal University of Montreal

More information

Econometrics. Andrés M. Alonso. Unit 1: Introduction: The regression model. Unit 2: Estimation principles. Unit 3: Hypothesis testing principles.

Econometrics. Andrés M. Alonso. Unit 1: Introduction: The regression model. Unit 2: Estimation principles. Unit 3: Hypothesis testing principles. Andrés M. Alonso andres.alonso@uc3m.es Unit 1: Introduction: The regression model. Unit 2: Estimation principles. Unit 3: Hypothesis testing principles. Unit 4: Heteroscedasticity in the regression model.

More information

Parameter estimation: A new approach to weighting a priori information

Parameter estimation: A new approach to weighting a priori information Parameter estimation: A new approach to weighting a priori information J.L. Mea Department of Mathematics, Boise State University, Boise, ID 83725-555 E-mail: jmea@boisestate.eu Abstract. We propose a

More information

Lecture 6 : Dimensionality Reduction

Lecture 6 : Dimensionality Reduction CPS290: Algorithmic Founations of Data Science February 3, 207 Lecture 6 : Dimensionality Reuction Lecturer: Kamesh Munagala Scribe: Kamesh Munagala In this lecture, we will consier the roblem of maing

More information

Bootstrapping the Confidence Intervals of R 2 MAD for Samples from Contaminated Standard Logistic Distribution

Bootstrapping the Confidence Intervals of R 2 MAD for Samples from Contaminated Standard Logistic Distribution Pertanika J. Sci. & Technol. 18 (1): 209 221 (2010) ISSN: 0128-7680 Universiti Putra Malaysia Press Bootstrapping the Confidence Intervals of R 2 MAD for Samples from Contaminated Standard Logistic Distribution

More information

Time-of-Arrival Estimation in Non-Line-Of-Sight Environments

Time-of-Arrival Estimation in Non-Line-Of-Sight Environments 2 Conference on Information Sciences an Systems, The Johns Hopkins University, March 2, 2 Time-of-Arrival Estimation in Non-Line-Of-Sight Environments Sinan Gezici, Hisashi Kobayashi an H. Vincent Poor

More information

Improving linear quantile regression for

Improving linear quantile regression for Improving linear quantile regression for replicated data arxiv:1901.0369v1 [stat.ap] 16 Jan 2019 Kaushik Jana 1 and Debasis Sengupta 2 1 Imperial College London, UK 2 Indian Statistical Institute, Kolkata,

More information

TMA 4195 Matematisk modellering Exam Tuesday December 16, :00 13:00 Problems and solution with additional comments

TMA 4195 Matematisk modellering Exam Tuesday December 16, :00 13:00 Problems and solution with additional comments Problem F U L W D g m 3 2 s 2 0 0 0 0 2 kg 0 0 0 0 0 0 Table : Dimension matrix TMA 495 Matematisk moellering Exam Tuesay December 6, 2008 09:00 3:00 Problems an solution with aitional comments The necessary

More information

Improving Estimation Accuracy in Nonrandomized Response Questioning Methods by Multiple Answers

Improving Estimation Accuracy in Nonrandomized Response Questioning Methods by Multiple Answers International Journal of Statistics an Probability; Vol 6, No 5; September 207 ISSN 927-7032 E-ISSN 927-7040 Publishe by Canaian Center of Science an Eucation Improving Estimation Accuracy in Nonranomize

More information

Fall 2017 STAT 532 Homework Peter Hoff. 1. Let P be a probability measure on a collection of sets A.

Fall 2017 STAT 532 Homework Peter Hoff. 1. Let P be a probability measure on a collection of sets A. 1. Let P be a probability measure on a collection of sets A. (a) For each n N, let H n be a set in A such that H n H n+1. Show that P (H n ) monotonically converges to P ( k=1 H k) as n. (b) For each n

More information

Preliminaries The bootstrap Bias reduction Hypothesis tests Regression Confidence intervals Time series Final remark. Bootstrap inference

Preliminaries The bootstrap Bias reduction Hypothesis tests Regression Confidence intervals Time series Final remark. Bootstrap inference 1 / 172 Bootstrap inference Francisco Cribari-Neto Departamento de Estatística Universidade Federal de Pernambuco Recife / PE, Brazil email: cribari@gmail.com October 2014 2 / 172 Unpaid advertisement

More information

Modeling of Dependence Structures in Risk Management and Solvency

Modeling of Dependence Structures in Risk Management and Solvency Moeling of Depenence Structures in Risk Management an Solvency University of California, Santa Barbara 0. August 007 Doreen Straßburger Structure. Risk Measurement uner Solvency II. Copulas 3. Depenent

More information

Bootstrap, Jackknife and other resampling methods

Bootstrap, Jackknife and other resampling methods Bootstrap, Jackknife and other resampling methods Part III: Parametric Bootstrap Rozenn Dahyot Room 128, Department of Statistics Trinity College Dublin, Ireland dahyot@mee.tcd.ie 2005 R. Dahyot (TCD)

More information

Better Bootstrap Confidence Intervals

Better Bootstrap Confidence Intervals by Bradley Efron University of Washington, Department of Statistics April 12, 2012 An example Suppose we wish to make inference on some parameter θ T (F ) (e.g. θ = E F X ), based on data We might suppose

More information

Bootstrapping, Randomization, 2B-PLS

Bootstrapping, Randomization, 2B-PLS Bootstrapping, Randomization, 2B-PLS Statistics, Tests, and Bootstrapping Statistic a measure that summarizes some feature of a set of data (e.g., mean, standard deviation, skew, coefficient of variation,

More information

3.7 Implicit Differentiation -- A Brief Introduction -- Student Notes

3.7 Implicit Differentiation -- A Brief Introduction -- Student Notes Fin these erivatives of these functions: y.7 Implicit Differentiation -- A Brief Introuction -- Stuent Notes tan y sin tan = sin y e = e = Write the inverses of these functions: y tan y sin How woul we

More information

Computing Exact Confidence Coefficients of Simultaneous Confidence Intervals for Multinomial Proportions and their Functions

Computing Exact Confidence Coefficients of Simultaneous Confidence Intervals for Multinomial Proportions and their Functions Working Paper 2013:5 Department of Statistics Computing Exact Confience Coefficients of Simultaneous Confience Intervals for Multinomial Proportions an their Functions Shaobo Jin Working Paper 2013:5

More information

A Course in Machine Learning

A Course in Machine Learning A Course in Machine Learning Hal Daumé III 12 EFFICIENT LEARNING So far, our focus has been on moels of learning an basic algorithms for those moels. We have not place much emphasis on how to learn quickly.

More information

VI. Linking and Equating: Getting from A to B Unleashing the full power of Rasch models means identifying, perhaps conceiving an important aspect,

VI. Linking and Equating: Getting from A to B Unleashing the full power of Rasch models means identifying, perhaps conceiving an important aspect, VI. Linking an Equating: Getting from A to B Unleashing the full power of Rasch moels means ientifying, perhaps conceiving an important aspect, efining a useful construct, an calibrating a pool of relevant

More information

Reliable Inference in Conditions of Extreme Events. Adriana Cornea

Reliable Inference in Conditions of Extreme Events. Adriana Cornea Reliable Inference in Conditions of Extreme Events by Adriana Cornea University of Exeter Business School Department of Economics ExISta Early Career Event October 17, 2012 Outline of the talk Extreme

More information

Conservation Laws. Chapter Conservation of Energy

Conservation Laws. Chapter Conservation of Energy 20 Chapter 3 Conservation Laws In orer to check the physical consistency of the above set of equations governing Maxwell-Lorentz electroynamics [(2.10) an (2.12) or (1.65) an (1.68)], we examine the action

More information

Exam #2, Electrostatics

Exam #2, Electrostatics Exam #2, Electrostatics Prof. Maurik Holtrop Department of Physics PHYS 408 University of New Hampshire March 27 th, 2003 Name: Stuent # NOTE: There are 5 questions. You have until 9 pm to finish. You

More information

Interval Estimation III: Fisher's Information & Bootstrapping

Interval Estimation III: Fisher's Information & Bootstrapping Interval Estimation III: Fisher's Information & Bootstrapping Frequentist Confidence Interval Will consider four approaches to estimating confidence interval Standard Error (+/- 1.96 se) Likelihood Profile

More information

Outline. Confidence intervals More parametric tests More bootstrap and randomization tests. Cohen Empirical Methods CS650

Outline. Confidence intervals More parametric tests More bootstrap and randomization tests. Cohen Empirical Methods CS650 Outline Confidence intervals More parametric tests More bootstrap and randomization tests Parameter Estimation Collect a sample to estimate the value of a population parameter. Example: estimate mean age

More information

arxiv: v1 [hep-ex] 4 Sep 2018 Simone Ragoni, for the ALICE Collaboration

arxiv: v1 [hep-ex] 4 Sep 2018 Simone Ragoni, for the ALICE Collaboration Prouction of pions, kaons an protons in Xe Xe collisions at s =. ev arxiv:09.0v [hep-ex] Sep 0, for the ALICE Collaboration Università i Bologna an INFN (Bologna) E-mail: simone.ragoni@cern.ch In late

More information

where x i and u i are iid N (0; 1) random variates and are mutually independent, ff =0; and fi =1. ff(x i )=fl0 + fl1x i with fl0 =1. We examine the e

where x i and u i are iid N (0; 1) random variates and are mutually independent, ff =0; and fi =1. ff(x i )=fl0 + fl1x i with fl0 =1. We examine the e Inference on the Quantile Regression Process Electronic Appendix Roger Koenker and Zhijie Xiao 1 Asymptotic Critical Values Like many other Kolmogorov-Smirnov type tests (see, e.g. Andrews (1993)), the

More information

3.2 Shot peening - modeling 3 PROCEEDINGS

3.2 Shot peening - modeling 3 PROCEEDINGS 3.2 Shot peening - moeling 3 PROCEEDINGS Computer assiste coverage simulation François-Xavier Abaie a, b a FROHN, Germany, fx.abaie@frohn.com. b PEENING ACCESSORIES, Switzerlan, info@peening.ch Keywors:

More information

Topic 7: Convergence of Random Variables

Topic 7: Convergence of Random Variables Topic 7: Convergence of Ranom Variables Course 003, 2016 Page 0 The Inference Problem So far, our starting point has been a given probability space (S, F, P). We now look at how to generate information

More information

arxiv: v2 [cond-mat.stat-mech] 11 Nov 2016

arxiv: v2 [cond-mat.stat-mech] 11 Nov 2016 Noname manuscript No. (will be inserte by the eitor) Scaling properties of the number of ranom sequential asorption iterations neee to generate saturate ranom packing arxiv:607.06668v2 [con-mat.stat-mech]

More information

INDEPENDENT COMPONENT ANALYSIS VIA

INDEPENDENT COMPONENT ANALYSIS VIA INDEPENDENT COMPONENT ANALYSIS VIA NONPARAMETRIC MAXIMUM LIKELIHOOD ESTIMATION Truth Rotate S 2 2 1 0 1 2 3 4 X 2 2 1 0 1 2 3 4 4 2 0 2 4 6 4 2 0 2 4 6 S 1 X 1 Reconstructe S^ 2 2 1 0 1 2 3 4 Marginal

More information

Inference in Nonparametric Series Estimation with Specification Searches for the Number of Series Terms

Inference in Nonparametric Series Estimation with Specification Searches for the Number of Series Terms Economics Working Paper Series 208/00 Inference in Nonparametric Series Estimation with Specification Searches for the Number of Series Terms Byunghoon Kang The Department of Economics Lancaster University

More information

A Practitioner s Guide to Cluster-Robust Inference

A Practitioner s Guide to Cluster-Robust Inference A Practitioner s Guide to Cluster-Robust Inference A. C. Cameron and D. L. Miller presented by Federico Curci March 4, 2015 Cameron Miller Cluster Clinic II March 4, 2015 1 / 20 In the previous episode

More information

THE VAN KAMPEN EXPANSION FOR LINKED DUFFING LINEAR OSCILLATORS EXCITED BY COLORED NOISE

THE VAN KAMPEN EXPANSION FOR LINKED DUFFING LINEAR OSCILLATORS EXCITED BY COLORED NOISE Journal of Soun an Vibration (1996) 191(3), 397 414 THE VAN KAMPEN EXPANSION FOR LINKED DUFFING LINEAR OSCILLATORS EXCITED BY COLORED NOISE E. M. WEINSTEIN Galaxy Scientific Corporation, 2500 English Creek

More information

Inverse Theory Course: LTU Kiruna. Day 1

Inverse Theory Course: LTU Kiruna. Day 1 Inverse Theory Course: LTU Kiruna. Day Hugh Pumphrey March 6, 0 Preamble These are the notes for the course Inverse Theory to be taught at LuleåTekniska Universitet, Kiruna in February 00. They are not

More information

A Non-parametric bootstrap for multilevel models

A Non-parametric bootstrap for multilevel models A Non-parametric bootstrap for multilevel models By James Carpenter London School of Hygiene and ropical Medicine Harvey Goldstein and Jon asbash Institute of Education 1. Introduction Bootstrapping is

More information

Bootstrapping Long Memory Tests: Some Monte Carlo Results

Bootstrapping Long Memory Tests: Some Monte Carlo Results Bootstrapping Long Memory Tests: Some Monte Carlo Results Anthony Murphy and Marwan Izzeldin University College Dublin and Cass Business School. July 2004 - Preliminary Abstract We investigate the bootstrapped

More information

Optimization of Geometries by Energy Minimization

Optimization of Geometries by Energy Minimization Optimization of Geometries by Energy Minimization by Tracy P. Hamilton Department of Chemistry University of Alabama at Birmingham Birmingham, AL 3594-140 hamilton@uab.eu Copyright Tracy P. Hamilton, 1997.

More information

Center of Gravity and Center of Mass

Center of Gravity and Center of Mass Center of Gravity an Center of Mass 1 Introuction. Center of mass an center of gravity closely parallel each other: they both work the same way. Center of mass is the more important, but center of gravity

More information

Nonparametric Methods II

Nonparametric Methods II Nonparametric Methods II Henry Horng-Shing Lu Institute of Statistics National Chiao Tung University hslu@stat.nctu.edu.tw http://tigpbp.iis.sinica.edu.tw/courses.htm 1 PART 3: Statistical Inference by

More information

Accuracy of Estimators, Confidence Intervals and Tests

Accuracy of Estimators, Confidence Intervals and Tests 2 Accuracy of Estimators, Confidence Intervals and Tests To determine the accuracy of the estimates that we made in Chapter 1, we demonstrate how to calculate confidence intervals and perform tests The

More information

MA 575 Linear Models: Cedric E. Ginestet, Boston University Bootstrap for Regression Week 9, Lecture 1

MA 575 Linear Models: Cedric E. Ginestet, Boston University Bootstrap for Regression Week 9, Lecture 1 MA 575 Linear Models: Cedric E. Ginestet, Boston University Bootstrap for Regression Week 9, Lecture 1 1 The General Bootstrap This is a computer-intensive resampling algorithm for estimating the empirical

More information

Time series power spectral density. frequency-side,, vs. time-side, t

Time series power spectral density. frequency-side,, vs. time-side, t ime series power spectral ensity. requency-sie,, vs. time-sie, t t, t= 0, ±1, ±, Suppose stationary c u = cov{t+u,t} u = 0, ±1, ±, lag = 1/π Σ exp {-iu} c u perio π non-negative / : requency cycles/unit

More information

Conditional Effects Contrasts of interest main effects" for factor 1: μ i: μ:: i = 1; ; : : : ; a μ i: μ k: i = k main effects" for factor : μ :j μ::

Conditional Effects Contrasts of interest main effects for factor 1: μ i: μ:: i = 1; ; : : : ; a μ i: μ k: i = k main effects for factor : μ :j μ:: 8. Two-way crossed classifications Example 8.1 Days to germination of three varieties of carrot seed grown in two types of potting soil. Soil Variety Type 1 1 Y 111 = Y 11 = 1 Y 11 = 1 Y 11 = 10 Y 1 =

More information

Estimating International Migration on the Base of Small Area Techniques

Estimating International Migration on the Base of Small Area Techniques MPRA Munich Personal RePEc Archive Estimating International Migration on the Base of Small Area echniques Vergil Voineagu an Nicoleta Caragea an Silvia Pisica 2013 Online at http://mpra.ub.uni-muenchen.e/48775/

More information

Contents. Acknowledgments. xix

Contents. Acknowledgments. xix Table of Preface Acknowledgments page xv xix 1 Introduction 1 The Role of the Computer in Data Analysis 1 Statistics: Descriptive and Inferential 2 Variables and Constants 3 The Measurement of Variables

More information

Schrödinger s equation.

Schrödinger s equation. Physics 342 Lecture 5 Schröinger s Equation Lecture 5 Physics 342 Quantum Mechanics I Wenesay, February 3r, 2010 Toay we iscuss Schröinger s equation an show that it supports the basic interpretation of

More information

Bivariate distributions characterized by one family of conditionals and conditional percentile or mode functions

Bivariate distributions characterized by one family of conditionals and conditional percentile or mode functions Journal of Multivariate Analysis 99 2008) 1383 1392 www.elsevier.com/locate/jmva Bivariate istributions characterize by one family of conitionals an conitional ercentile or moe functions Barry C. Arnol

More information

Improved Geoid Model for Syria Using the Local Gravimetric and GPS Measurements 1

Improved Geoid Model for Syria Using the Local Gravimetric and GPS Measurements 1 Improve Geoi Moel for Syria Using the Local Gravimetric an GPS Measurements 1 Ria Al-Masri 2 Abstract "The objective of this paper is to iscuss recent research one by the author to evelop a new improve

More information

Both the ASME B and the draft VDI/VDE 2617 have strengths and

Both the ASME B and the draft VDI/VDE 2617 have strengths and Choosing Test Positions for Laser Tracker Evaluation an Future Stanars Development ala Muralikrishnan 1, Daniel Sawyer 1, Christopher lackburn 1, Steven Phillips 1, Craig Shakarji 1, E Morse 2, an Robert

More information

Bootstrap Confidence Intervals

Bootstrap Confidence Intervals Bootstrap Confidence Intervals Patrick Breheny September 18 Patrick Breheny STA 621: Nonparametric Statistics 1/22 Introduction Bootstrap confidence intervals So far, we have discussed the idea behind

More information

The exact bootstrap method shown on the example of the mean and variance estimation

The exact bootstrap method shown on the example of the mean and variance estimation Comput Stat (2013) 28:1061 1077 DOI 10.1007/s00180-012-0350-0 ORIGINAL PAPER The exact bootstrap method shown on the example of the mean and variance estimation Joanna Kisielinska Received: 21 May 2011

More information

A note on asymptotic formulae for one-dimensional network flow problems Carlos F. Daganzo and Karen R. Smilowitz

A note on asymptotic formulae for one-dimensional network flow problems Carlos F. Daganzo and Karen R. Smilowitz A note on asymptotic formulae for one-imensional network flow problems Carlos F. Daganzo an Karen R. Smilowitz (to appear in Annals of Operations Research) Abstract This note evelops asymptotic formulae

More information

Inference via Kernel Smoothing of Bootstrap P Values

Inference via Kernel Smoothing of Bootstrap P Values Queen s Economics Department Working Paper No. 1054 Inference via Kernel Smoothing of Bootstrap P Values Jeff Racine McMaster University James G. MacKinnon Queen s University Department of Economics Queen

More information

under the null hypothesis, the sign test (with continuity correction) rejects H 0 when α n + n 2 2.

under the null hypothesis, the sign test (with continuity correction) rejects H 0 when α n + n 2 2. Assignment 13 Exercise 8.4 For the hypotheses consiere in Examples 8.12 an 8.13, the sign test is base on the statistic N + = #{i : Z i > 0}. Since 2 n(n + /n 1) N(0, 1) 2 uner the null hypothesis, the

More information

ST745: Survival Analysis: Nonparametric methods

ST745: Survival Analysis: Nonparametric methods ST745: Survival Analysis: Nonparametric methods Eric B. Laber Department of Statistics, North Carolina State University February 5, 2015 The KM estimator is used ubiquitously in medical studies to estimate

More information

SPECIALIST MATHEMATICS

SPECIALIST MATHEMATICS Victorian Certificate of Eucation 07 SUPERVISOR TO ATTACH PROCESSING LABEL HERE Letter STUDENT NUMBER SPECIALIST MATHEMATICS Written examination Friay 0 November 07 Reaing time: 9.00 am to 9.5 am (5 minutes)

More information

Planar sheath and presheath

Planar sheath and presheath 5/11/1 Flui-Poisson System Planar sheath an presheath 1 Planar sheath an presheath A plasma between plane parallel walls evelops a positive potential which equalizes the rate of loss of electrons an ions.

More information

Subject CS1 Actuarial Statistics 1 Core Principles

Subject CS1 Actuarial Statistics 1 Core Principles Institute of Actuaries of India Subject CS1 Actuarial Statistics 1 Core Principles For 2019 Examinations Aim The aim of the Actuarial Statistics 1 subject is to provide a grounding in mathematical and

More information

Experiment 2, Physics 2BL

Experiment 2, Physics 2BL Experiment 2, Physics 2BL Deuction of Mass Distributions. Last Upate: 2009-05-03 Preparation Before this experiment, we recommen you review or familiarize yourself with the following: Chapters 4-6 in Taylor

More information

Construction of the Electronic Radial Wave Functions and Probability Distributions of Hydrogen-like Systems

Construction of the Electronic Radial Wave Functions and Probability Distributions of Hydrogen-like Systems Construction of the Electronic Raial Wave Functions an Probability Distributions of Hyrogen-like Systems Thomas S. Kuntzleman, Department of Chemistry Spring Arbor University, Spring Arbor MI 498 tkuntzle@arbor.eu

More information

Tutorial on Maximum Likelyhood Estimation: Parametric Density Estimation

Tutorial on Maximum Likelyhood Estimation: Parametric Density Estimation Tutorial on Maximum Likelyhoo Estimation: Parametric Density Estimation Suhir B Kylasa 03/13/2014 1 Motivation Suppose one wishes to etermine just how biase an unfair coin is. Call the probability of tossing

More information

Asian International School Curriculum Mapping Level : Starter Subject : Mathematics_ School Year:

Asian International School Curriculum Mapping Level : Starter Subject : Mathematics_ School Year: Asian International School Star Stran Content Skills Activities Assessments August Septembe r 5.2 number 5.2 number NUMBER: Integersan ecimals Integers Place Value an ecimals Multiplyan ivieby 10,100or1000

More information

USE OF NESTED DESIGNS IN DIALLEL CROSS EXPERIMENTS

USE OF NESTED DESIGNS IN DIALLEL CROSS EXPERIMENTS USE OF NESTED DESIGNS IN DIALLEL CROSS EXPERIMENTS. Introuction Rajener Parsa I.A.S.R.I., Library Avenue, New Delhi - 0 0 The term iallel is a Greek wor an implies all possible crosses among a collection

More information

ay (t) + by (t) + cy(t) = 0 (2)

ay (t) + by (t) + cy(t) = 0 (2) Solving ay + by + cy = 0 Without Characteristic Equations, Complex Numbers, or Hats John Tolle Department of Mathematical Sciences Carnegie Mellon University Pittsburgh, PA 15213-3890 Some calculus courses

More information

CONFIRMATORY FACTOR ANALYSIS

CONFIRMATORY FACTOR ANALYSIS 1 CONFIRMATORY FACTOR ANALYSIS The purpose of confirmatory factor analysis (CFA) is to explain the pattern of associations among a set of observe variables in terms of a smaller number of unerlying latent

More information

3 Joint Distributions 71

3 Joint Distributions 71 2.2.3 The Normal Distribution 54 2.2.4 The Beta Density 58 2.3 Functions of a Random Variable 58 2.4 Concluding Remarks 64 2.5 Problems 64 3 Joint Distributions 71 3.1 Introduction 71 3.2 Discrete Random

More information

by using the derivative rules. o Building blocks: d

by using the derivative rules. o Building blocks: d Calculus for Business an Social Sciences - Prof D Yuen Eam Review version /9/01 Check website for any poste typos an upates Eam is on Sections, 5, 6,, 1,, Derivatives Rules Know how to fin the formula

More information

Logarithmic spurious regressions

Logarithmic spurious regressions Logarithmic spurious regressions Robert M. e Jong Michigan State University February 5, 22 Abstract Spurious regressions, i.e. regressions in which an integrate process is regresse on another integrate

More information

Math 342 Partial Differential Equations «Viktor Grigoryan

Math 342 Partial Differential Equations «Viktor Grigoryan Math 342 Partial Differential Equations «Viktor Grigoryan 6 Wave equation: solution In this lecture we will solve the wave equation on the entire real line x R. This correspons to a string of infinite

More information