A KERNEL APPROACH TO ESTIMATING THE DENSITY OF A CONDITIONAL EXPECTATION. Samuel G. Steckley Shane G. Henderson
Proceedings of the 2003 Winter Simulation Conference
S. Chick, P. J. Sánchez, D. Ferrin, and D. J. Morrice, eds.

Samuel G. Steckley
Shane G. Henderson

School of Operations Research and Industrial Engineering
Cornell University
Ithaca, NY 14853, U.S.A.

ABSTRACT

Given uncertainty in the input model and parameters of a simulation study, the goal of the simulation study often becomes the estimation of a conditional expectation: expected performance conditional on the selected model and parameters. The distribution of this conditional expectation describes precisely and concisely the impact of input uncertainty on performance prediction. In this paper we estimate the density of a conditional expectation using ideas from the field of kernel density estimation. We present a result on asymptotically optimal rates of convergence and examine a number of numerical examples.

1 INTRODUCTION

Let X be a real-valued random variable with E|X| < ∞, and let Z be some other random variable. The conditional expectation E(X|Z) is a random variable that represents one's best guess (in a certain sense) as to the value of X given only the value of the random variable Z. In this paper we assume that the random variable E(X|Z) has a density with respect to Lebesgue measure and develop a method for estimating it. Our main assumptions are that we can generate i.i.d. replicates of the random variable Z, and that we can generate i.i.d. observations from the conditional distribution P(X ∈ · | Z = z) for any z in the range of Z.

In this paper we confine our attention to the case where Z is a real-valued random variable, but we are working on establishing analogous results when Z is a more complicated random object. Such a generalization is important because our primary motivation for studying this problem stems from the issue of input model uncertainty. This form of uncertainty arises when one is not completely certain what input distributions and associated parameters should be used in a simulation model. There are many methods for dealing with such
uncertainty; see Henderson (2003) for a review. Many of these methods impose a probability distribution on the input distributions and parameters. For example, this is the case in Cheng (1994), Cheng and Holland (1997, 1998, 2003), Chick (2001), and Zouaoui and Wilson (2001a, 2001b). See Henderson (2003) for further discussion.

The input model uncertainty problem maps to the setting in this paper as follows. The random object Z corresponds to a selection of input distributions and associated parameters for a simulation experiment. The random variable X represents an estimate of a performance measure from the simulation model; its distribution depends on the choice Z of input distributions and parameters. The conditional expectation E(X|Z) represents the expected value of the performance measure as a function of the input distributions and parameters. It is essentially what one would compute from the simulation if the simulation were allowed to run for an infinite amount of time. Notice that it is still a random variable, owing to the uncertainty in the input distributions and parameters. A density of E(X|Z) gives a sense of the uncertainty in the estimate of the performance measure due to the uncertainty in the values of the input distributions and parameters. Example 3 of Henderson (2003) discusses this density in the setting of a queueing simulation and describes how it may be interpreted.

Very little work has been done on the estimation of the distribution of a conditional expectation. The most closely related work to ours involves the estimation of the distribution function of the conditional expectation E(X|Z). Lee and Glynn (1999) considered the case where Z is a discrete random variable. This work was
an outgrowth of a chapter of Lee (1998), where the case in which Z is continuous is also considered. We prefer to directly estimate the density because we believe that a density is more easily interpreted (visually) than a distribution function.

We use kernel density estimation methods to estimate the required density. The analysis of our estimator draws from methods used in variable-bandwidth kernel density estimation (Hall 1990). Andradóttir and Glynn (2003) discuss a certain estimation problem that, in our setting, is essentially the estimation of EX. Their problem is complicated by the fact that they explicitly allow for bias in the estimator; such bias can arise in steady-state simulation experiments, for example.

This paper is organized as follows. In Section 2 we describe our estimation methodology and show that, under fairly general conditions, the error in our density estimator converges at rate c^(-4/7), where c is the overall computational budget. We then present some numerical examples in Section 3. Some brief conclusions and directions for future research appear in Section 4.

2 ESTIMATION METHODOLOGY

Our problem is very similar in structure to that of Lee and Glynn (1999). Accordingly, we adopt much of their problem structure and assumptions in what follows. We assume the ability to draw samples from the distribution P(Z ∈ ·) and, for any z in the range of Z, to draw samples from the conditional distribution P(X ∈ · | Z = z). Let f denote the (target) density of E(X|Z), which we assume exists. Let (Z_i : 1 ≤ i ≤ n) be a sequence of independent and identically distributed (i.i.d.) copies of the random variable Z. Conditional on (Z_i : 1 ≤ i ≤ n), the sample (X_j(Z_i) : 1 ≤ i ≤ n, 1 ≤ j ≤ m) consists of independent random variables in which X_j(Z_i) follows the distribution P(X ∈ · | Z = Z_i). For ease of notation, define µ(·) = E(X | Z = ·) and σ²(·) = Var(X | Z = ·).

Suppose Y is a random variable with an unknown density g and (Y_i : 1 ≤ i ≤ n) is a sequence of i.i.d. copies of Y. The standard kernel density estimator at the value x is of the form

    ĝ(x; h) = (1/(nh)) Σ_{i=1}^{n} K((x − Y_i)/h),

where K is typically chosen to be a unimodal probability density function
(pdf) that is symmetric about zero, and the smoothing parameter h, often referred to as the bandwidth, is a positive number (Wand and Jones 1995). The estimator can be crudely described as the sum of equally weighted kernels centered at each realization Y_i. If the kernel is a pdf, the kernel spreads out the mass of 1/n symmetrically about the neighborhood of Y_i. In the case that K is the pdf of a standard normal random variable, h is the standard deviation and thus gives the spread of the kernels.

This estimator immediately suggests that we can estimate f(x), the density of E(X|Z) evaluated at x, by

    f̂(x; m, n, h) = (1/(nh)) Σ_{i=1}^{n} K((x − X̄_m(Z_i))/h),

where

    X̄_m(Z_i) = (1/m) Σ_{j=1}^{m} X_j(Z_i)  for i = 1, …, n.

Note that the values X̄_m(Z_i) at which the kernels are centered are not realizations of the random variable E(X|Z), as in the standard kernel density estimation setting described above, but rather estimates.

We wish to analyze the asymptotics of the estimator. For a given computer budget c, let m = m(c) and n = n(c) be chosen so that the total computational effort required to generate f̂(x; m, n, h) is approximately c. Following Lee and Glynn (1999), the computational effort required to compute f̂(x; m, n, h) is α₁ n(c) + α₂ n(c) m(c), where α₁ and α₂ are the average computational effort used to generate Z_i and X_j(Z_i) respectively. We need m(c) → ∞ as c → ∞ to ensure that X̄_{m(c)}(Z_i) → E(X | Z = Z_i), and we can assume α₂ = 1 without loss of generality. It follows that m(c) and n(c) must be chosen to satisfy the asymptotic relationship m(c)n(c)/c → 1 as c → ∞. The bandwidth h = h(c) is also a function of c. To keep the notation less cumbersome, the dependence of n, m, and h on c will be suppressed in the calculations.

The error criterion that we choose to use in analyzing the convergence is mean integrated squared error (mise), defined as

    mise(f̂(·; m, n, h)) = E ∫ (f̂(x; m, n, h) − f(x))² dx.

This error criterion is not without its drawbacks (see Devroye and Györfi 1985), but its mathematical simplicity is appealing. Switching the order of integration yields

    mise(f̂(·; m, n, h)) = ∫ E[(f̂(x; m, n, h) − f(x))²] dx.
Note that the integrand is mean squared error (MSE), which decomposes into squared bias and variance. We thus have

    mise(f̂(·; m, n, h)) = ∫ bias²(f̂(x; m, n, h)) dx + ∫ Var(f̂(x; m, n, h)) dx.

In order to simplify the bias and variance calculations we make the following additional assumptions. Let N(a₁, a₂) denote a normally distributed random variable with mean a₁ and variance a₂.

A1. Conditional on (Z_i : 1 ≤ i ≤ n), X̄_m(Z_i) ~ N(µ(Z_i), σ²(Z_i)/m) for i = 1, …, n, and the X̄_m(Z_i) are (conditionally) independent;
A2. The kernel K is the density of a N(0, 1) random variable;
A3. The function µ(·) is strictly monotonic on its domain;
A4. The bandwidth is defined by h² = a m^(−δ), where δ > 0 and a > 0 are constants independent of c.

If the central limit theorem holds, then for large m assumption A1 is approximately true. This assumption will be further examined in Section 3. In A2 we specify the kernel K to be normal. Together, the normality of A1 and A2 allows us to derive compact expressions for the bias and variance of the estimator f̂(x; m, n, h); this will be illustrated in the proof presented in Section 2.1. Assumption A3 is a simplifying assumption that ensures that a change of variable that we employ later is valid. Since we have m → ∞ as c → ∞, A4 ensures that h → 0 as c → ∞, which is necessary for convergence in the standard kernel density estimation setting. Also note that, given A4, m is completely determined by h and δ. Now the density estimator is a function of x, n, h, and δ, so f̂(x; m, n, h) and mise(f̂(·; m, n, h)) can be written, with an abuse of notation, as f̂(x; n, h, δ) and mise(f̂(·; n, h, δ)) respectively. The case in which the variance function σ²(·) is constant is examined in Section 2.1. The general case is considered in Section 2.2.

2.1 Constant Variance Function σ²(·)

The analysis for the case in which the variance function σ²(·) is constant is similar to that in the standard kernel density estimation setting. Hence the following assumptions, together with A1 and A2, are similar to those found in Prakasa Rao (1983):

A5. f is a bounded, continuous, twice continuously differentiable function, and f and f″ are square integrable;
A6. n → ∞ and h → 0 as c → ∞.

Finally, the constant variance assumption is noted:

A7. σ²(·) ≡ σ² > 0.

In Propositions 1 and 2 we make use of o(·) ("small o") notation. For sequences of real
numbers a_n and b_n, we say that a_n = o(b_n) as n → ∞ if and only if lim_{n→∞} a_n/b_n = 0.

Taylor's Theorem with integral remainder is useful in the proof of Proposition 1. We state it here as a lemma; a proof can be found in Apostol (1967).

Lemma 1. Assume f is twice continuously differentiable. Then

    f(x + h) = f(x) + h f′(x) + h² ∫₀¹ (1 − t) f″(x + th) dt.

Proposition 1. Assume A1–A7. Then

    mise(f̂(·; n, h, δ)) = (h² + σ²/m)² b₁ + b₂/(nh) + o((h² + 1/m)² + 1/(nh)),    (1)

where

    b₁ = (1/4) ∫ (f″(x))² dx  and  b₂ = 1/(2√π).

Since m = (a/h²)^(1/δ), (1) is equivalent to the following: for δ ≤ 1,

    mise(f̂(·; n, h, δ)) = b_c h⁴ + b₂/(nh) + o(h⁴ + 1/(nh)),

and for δ > 1,

    mise(f̂(·; n, h, δ)) = b₁ σ⁴ (h²/a)^(2/δ) + b₂/(nh) + o(h^(4/δ) + 1/(nh)),

where

    b_c = (1 + I(δ = 1) σ²/a)² b₁.

Proof:

    E(f̂(x; n, h, δ)) = E[(1/h) K((x − X̄_m(Z₁))/h)] = E[ E[(1/h) K((x − X̄_m(Z₁))/h) | Z₁] ].    (2)
Conditional on Z₁, X̄_m(Z₁) ~ N(µ(Z₁), σ²/m). Since K is the density of a N(0, 1) random variable, (1/h)K(·/h) is the density of a N(0, h²) random variable. The conditional expectation above is then a convolution of the densities of two normal random variables, and so

    E(f̂(x; n, h, δ)) = E[(h² + σ²/m)^(−1/2) K((x − µ(Z₁))/(h² + σ²/m)^(1/2))].

Define η = (h² + σ²/m)^(1/2), so that E f̂(x; n, h, δ) can be expressed as

    E[(1/η) K((x − µ(Z₁))/η)].    (3)

A change of variable using A3 shows that (3) is given by

    ∫ (1/η) K((x − y)/η) f(y) dy,    (4)

and another change of variable gives

    ∫ f(x − uη) K(u) du.

Since f is twice continuously differentiable, we have by Lemma 1 that

    E f̂(x; n, h, δ) = f(x) ∫ K(u) du − η f′(x) ∫ u K(u) du + η² ∫∫ (1 − t) f″(x − tuη) u² K(u) dt du
                    = f(x) + η² ∫∫ (1 − t) f″(x − tuη) u² K(u) dt du.

Then we have

    ∫ bias²(f̂(x; n, h, δ)) dx = η⁴ ∫ [ ∫∫ (1 − t) f″(x − tuη) u² K(u) dt du ]² dx.

A rather involved argument that uses Lebesgue's dominated convergence theorem (see, e.g., Prakasa Rao 1983) establishes that

    ∫ [ ∫∫ (1 − t) f″(x − tuη) u² K(u) dt du ]² dx → b₁

as c → ∞. It follows that

    ∫ bias²(f̂(x; n, h, δ)) dx = (h² + σ²/m)² (b₁ + o(1)).    (5)

Similarly,

    Var(f̂(x; n, h, δ)) = (1/n) Var((1/h) K((x − X̄_m(Z₁))/h))
        = (1/n) [ E(((1/h) K((x − X̄_m(Z₁))/h))²) − (E((1/h) K((x − X̄_m(Z₁))/h)))² ]    (6)
        = (1/(2nh√π)) E[ E[(h/√2)^(−1) K((x − X̄_m(Z₁))/(h/√2)) | Z₁] ] + o(1/(nh)).    (7)

To obtain (7) we have used the fact that the second term in (6) is f²(x) plus error terms, as shown in the bias calculations above; therefore the second term in (6) is of order 1/n, and hence o(1/(nh)). We have also used the fact that when K(·; h) is the density of a normal random variable with mean 0 and variance h²,

    K²(·; h) = K(·; h/√2)/(2h√π).

Applying the same steps to

    E[ E[(h/√2)^(−1) K((x − X̄_m(Z₁))/(h/√2)) | Z₁] ]

as were applied to

    E[ E[(1/h) K((x − X̄_m(Z₁))/h) | Z₁] ]

gives

    Var(f̂(x; n, h, δ)) = (1/(2nh√π)) f(x) + o(1/(nh)).    (8)

Then

    ∫ Var(f̂(x; n, h, δ)) dx = 1/(2nh√π) + o(1/(nh)),    (9)

and (1) follows from (5) and (9). ∎

The normality given by A1 and A2 ensures that the convolutions in (2) and (8) have well-defined forms.
Specifically, for each convolution we get a normal density evaluated at x, with expectation and variance resulting from the sums of the convolved distributions' expectations and variances respectively.

Consider (4), which is immediate from the convolution performed on the expectation. This shows that we are effectively doing standard kernel density estimation using kernels that have squared bandwidth η² = h² + σ²/m rather than just h². Note that the extra component σ²/m is the variance of the error of the estimate X̄_m(Z₁) about E(X|Z₁), conditional on Z₁. So, under the stated assumptions, the effect of using estimates X̄_m(Z) rather than actual realizations of E(X|Z) is an effective bandwidth whose square is wider than h² by the variance of the error of the estimate.

Compare (1) to the mise in standard kernel density estimation (Wand and Jones 1995):

    mise(ĝ(·; h)) = b₁ h⁴ (∫ u² K(u) du)² + (1/(nh)) ∫ K²(u) du + o(h⁴ + 1/(nh)).    (10)

Note that when K is the density of a N(0, 1) random variable, ∫ u² K(u) du = 1 and ∫ K²(u) du = 1/(2√π), in which case (10) is precisely the same as (1) except for the order of the leading term on the ∫ bias² dx term: for (10) it is h⁴, whereas for (1) it is (h² + σ²/m)². Recalling that m = (a/h²)^(1/δ), we see that the order of the ∫ bias² dx term in (1) is actually larger for δ > 1 as compared to (10); for δ ≤ 1 the order is the same. So the wider bandwidth arising from using estimates rather than realizations of E(X|Z) translates into diminished convergence of the ∫ bias² dx term in mise for δ > 1.

We wish to choose n, h, and δ to optimize the rate of convergence of mise(f̂(·; n, h, δ)) in terms of the computer budget c, and to compare it to the optimal rate of convergence for standard kernel density estimation. To do this we drop the low-order terms and consider the large-sample approximation of mise(f̂(·; n, h, δ)):

    amise(f̂(·; n, h, δ)) = { b_c h⁴ + b₂/(nh)                 if δ ≤ 1,
                             b₁ σ⁴ (h²/a)^(2/δ) + b₂/(nh)     if δ > 1.

Here "amise" stands for asymptotic mean integrated squared error. We now solve for the n, h, and δ that give the optimal rate of convergence of amise(f̂(·; n, h, δ)), which in turn give the optimal rate of convergence of mise(f̂(·; n, h, δ)). As mentioned above, we must satisfy the asymptotic relationship
m(c)n(c)/c → 1 as c → ∞. For the large-sample approximation, take n = c/m. It turns out that for any given δ we can solve for the optimal h, and thus for m and n = c/m:

    h(c) = { d₁(δ) c^(−δ/(5δ+2))   if δ ≤ 1,
             d₂(δ) c^(−δ/(δ+6))    if δ > 1,

where d₁(δ) and d₂(δ) are finite positive constants depending only on δ, a, σ², b₁, and b₂. We note that h and n are such that assumption A6 holds for any δ. Substituting into our expression for amise(f̂(·; n, h, δ)) gives

    amise(f̂(·; n, h, δ)) = { d̃₁(δ) c^(−4δ/(5δ+2))   if δ ≤ 1,
                             d̃₂(δ) c^(−4/(δ+6))     if δ > 1,

for constants d̃₁(δ) and d̃₂(δ). The exponent 4δ/(5δ+2) is increasing in δ and the exponent 4/(δ+6) is decreasing in δ, so the best rate of convergence is attained at δ = 1. Then the optimal choice of h is

    h* = d c^(−1/7),  where  d = (3 a b₂ / (4 b_c))^(1/7),

and the optimal amise is

    amise(f̂(·; n, h*, δ*)) = d̃ c^(−4/7),  where  d̃ = (7/4) (a b₂)^(4/7) (4 b_c / 3)^(3/7).

The optimal rate of convergence of mise is thus c^(−4/7). Although h* given above is the optimal choice of h for this rate of convergence, it is possible to achieve the rate c^(−4/7) so long as, asymptotically, h(c) = d₃ c^(−1/7)
for any positive constant d₃ and, of course, n = c/m. In standard kernel density estimation the optimal rate of convergence is c^(−4/5) (Wand and Jones 1995). The decrease in the rate of convergence is expected, in that for each of the n observations X̄_m(Z_i), m units of computer time are required, and m → ∞ as c → ∞, whereas in the standard kernel density estimation setting each observation requires only one unit of computer time. In addition, as we noted above, for δ > 1 the convergence of ∫ bias² dx in the expression for mise is slower in this setting as compared to standard kernel density estimation. The slower convergence for δ > 1 induces choosing δ = 1, so that it does play a role in determining the optimal rate of convergence.

2.2 General Variance Function σ²(·)

In this section assumption A7 is relaxed. Define the function ρ(·) = σ²(µ^(−1)(·)). We make the following additional assumptions:

A8. The function ρ(·) is bounded above and also away from zero, is twice differentiable, and its derivatives are continuous and bounded;
A9. f, f′, and f″ are square integrable.

Proposition 2. Assume A1–A6, A8, and A9. Then for δ ≤ 1,

    mise(f̂(·; n, h, δ)) = b_v h⁴ + b₂/(nh) + o(h⁴ + 1/(nh)),

and for δ > 1,

    mise(f̂(·; n, h, δ)) = b_w (h²/a)^(2/δ) + b₂/(nh) + o(h^(4/δ) + 1/(nh)),

where

    b_v = (1/(4a²)) ∫ [ f″(x)(a + I(δ = 1) ρ(x)) + I(δ = 1)(2 f′(x) ρ′(x) + f(x) ρ″(x)) ]² dx,

    b_w = (1/4) ∫ [ f″(x) ρ(x) + 2 f′(x) ρ′(x) + f(x) ρ″(x) ]² dx,

and b₂ is the same as above.

The proof of Proposition 2, which uses techniques from the variable-bandwidth literature (Hall 1990) but is otherwise similar to the proof of Proposition 1, will be given elsewhere. We remark that the condition that ρ(·) be bounded away from zero is used only for the case δ > 1.

The only difference in mise between the constant variance function case and the general variance function case is in the coefficient of the ∫ bias² dx term when δ ≤ 1: in the general case we have b_v, whereas in the constant variance function case we have b_c. We first note that when we have a constant variance function (σ²(·) ≡ σ²), ρ(·) ≡ σ² and ρ′ and ρ″ are zero, so that b_v simplifies to b_c. Secondly, we note that in the general case σ²(·) plays a significant role in the constant b_v through the functions ρ, ρ′, and ρ″. And finally, we note that if
we define the function β(·) = f(·)ρ(·), then b_v is very similar to (1/4) ∫ (β″(x))² dx. So we expect b_v to be large when the function β″(·) tends to be large in magnitude.

Following the same line of reasoning as in the constant variance case, the best rate of convergence is attained at δ = 1, and

    h* = d_v c^(−1/7),  where  d_v = (3 a b₂ / (4 b_v))^(1/7).

The optimal amise is

    amise(f̂(·; n, h*, δ*)) = d̃_v c^(−4/7),  where  d̃_v = (7/4) (a b₂)^(4/7) (4 b_v / 3)^(3/7).

So the same rate is achieved as in the constant variance case, but the coefficient is different. We again note that the optimal rate c^(−4/7) is attainable provided that, asymptotically, h(c) = d₄ c^(−1/7) for any positive constant d₄, and n = c/m.

3 EXAMPLES

In this section we examine the convergence of a few basic examples and compare with the theoretical results presented in Section 2. Specifically, for each example we look at mise as a function of the computer budget c. For clarity, we no longer suppress the dependence of our functions on c; for example, we now write mise(c), h(c), and n(c). To estimate mise(c) we first replicate the density estimator 25 independent times: {f̂_k(·; h(c), n(c), δ) : k = 1, …, 25}. We define integrated squared error (ise) as follows:

    ise(c) = ∫ [f̂(x; h(c), n(c), δ) − f(x)]² dx.
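The ise just defined can be approximated on a grid by numerical quadrature. The following is a minimal sketch of ours (the trapezoidal rule, the grid, and the test values are our own illustrative choices, not taken from the paper):

```python
import numpy as np

def integrated_squared_error(fhat_vals, f_vals, x_grid):
    """Approximate ise = integral of (fhat(x) - f(x))^2 dx by the trapezoidal rule."""
    return float(np.trapz((fhat_vals - f_vals) ** 2, x_grid))

def mise_estimate(replicate_vals, f_vals, x_grid):
    """Average the ise over independent replications of the density estimator."""
    return float(np.mean([integrated_squared_error(fk, f_vals, x_grid)
                          for fk in replicate_vals]))

# Sanity check: if fhat differs from f by exactly 1 everywhere on [0, 1], ise = 1.
x = np.linspace(0.0, 1.0, 101)
ise_one = integrated_squared_error(np.ones_like(x), np.zeros_like(x), x)
```

Averaging such per-replication ise values over independent replications then gives a Monte Carlo estimate of mise(c).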
For each k = 1, …, 25 we use numerical integration to approximately compute

    ise_k(c) = ∫ [f̂_k(x; h(c), n(c), δ) − f(x)]² dx.

Our estimate of mise(c) is then (1/25) Σ_{k=1}^{25} ise_k(c). In calculating f̂_k(·; h(c), n(c), δ) we take δ = 1 and h(c) = r c^(−1/7), as suggested in Section 2, where the constant r was chosen in brief preliminary experiments to be 3 for Example 1 and 1 for the other examples. We took m = h^(−2/δ) (so that a = 1). We estimate mise(c) over a geometrically increasing sequence of budget values c.

Example 1. In the first example we let Z have a Beta distribution on (0, 1) (a Beta(α₁, α₂) random variable has density on (0, 1) proportional to x^(α₁−1)(1 − x)^(α₂−1)), and, conditional on Z = z, X ~ N(z, σ₀²) for a fixed conditional variance σ₀², so that A7 holds. The true density f of the conditional expectation is then just the Beta density of Z. In Figure 1 we plot log(mise(c)) vs. log(c). The linearity of the plot suggests that asymptotically mise(c) = V c^(−γ) for some constants V and γ. Theoretically we expect γ = 4/7 ≈ 0.57. Note that −γ is the slope of the (log(c), log(mise(c))) plot, and the estimated slope of the plot in Figure 1 is −0.54. This is very close to the expected rate of convergence.

Figure 1: Mean integrated squared error as a function of computational budget for Example 1.

Example 2. In this example we consider a nonconstant variance function σ²(·). Once again let Z have the same Beta distribution, and conditional on Z = z take X ~ N(z, z²). The target density f is again the Beta density of Z. We present the log(mise(c)) vs. log(c) plot in Figure 2. The plot is linear and the slope is estimated to be −0.45, indicating poorer convergence as compared to Example 1. This is likely the result of the variance function σ²(z) = z² on the interval (0, 1) and zero elsewhere. We will further discuss the impact of this variance function in Example 3, but we note here that this variance function does not satisfy the smoothness assumption A8 at z = 1.

Figure 2: Mean integrated squared error as a function of computational budget for Example 2.

Example 3. In this example we study the impact of violating assumption A1, in which we assume that, conditional on (Z_i : 1 ≤ i ≤ n), X̄_m(Z_i) is normally
distributed for i = 1, …, n. We take Z to have a Beta distribution shifted to the right by one unit, so the support of Z is the interval (1, 2). Conditional on Z = z, we now suppose X ~ exp(1/z), i.e., conditional on Z = z, X is exponentially distributed with rate 1/z and hence mean z. Note that, conditional on (Z_i : 1 ≤ i ≤ n), X̄_m(Z_i) ~ Gamma(m, Z_i/m) for i = 1, …, n (a Gamma(α₁, α₂) random variable has density on (0, ∞) proportional to x^(α₁−1) e^(−x/α₂)), so that assumption A1 is violated. The target density f is the Beta density shifted to the right by one unit. In Figure 3 we give the log(mise(c)) vs. log(c) plot. The slope is estimated to be approximately −0.4. The rates of convergence in Examples 2 and 3 are quite similar, suggesting that the normality of the X̄_m(Z_i), i = 1, …, n, is not crucial to the rate of convergence. This is to be expected, since the central limit theorem (CLT) tells us that, conditional on (Z_i : 1 ≤ i ≤ n), for large m, X̄_m(Z_i) behaves approximately like a random variable with a N(Z_i, Z_i²/m) distribution, as in Example 2.
Figure 3: Mean integrated squared error as a function of computational budget for Example 3.

Figure 4: The true density and two estimators for Example 3.

The rate is not as good as that seen in Example 1. We suspect that this reduction in rate, and that seen in Example 2, is due to the function ρ(·) being discontinuous at z = 1. This boundary effect is perhaps evident in Figure 4, where the performance of the density estimates deteriorates near the boundaries of the plot. The density estimate for the smaller of the two budgets is quite poor, but it improves at the larger budget, where the density estimate is slightly skewed to the left. This same skewness was evident in other independent replications of the experiment. We believe that the skewness is a natural result of the form of the variance function σ²(z) = z² on the interval (1, 2) and zero elsewhere. Recall that, to generate the estimated density, our observations X̄_m(Z_i), i = 1, …, n, are smoothed by normal kernels with bandwidth (h² + σ²(Z_i)/m)^(1/2). For large m, X̄_m(Z_i) ≈ Z_i. Then the observations X̄_m(Z_i), i = 1, …, n, that are larger in value are smoothed more than the observations with smaller value, resulting in the skewness seen in the plot. This nonuniform smoothing will probably be typical in examples with nonconstant variance, but the theory presented in this paper shows that the convergence rate will not be affected. Of course, while the rate may not be affected, the magnitude of the error may be significantly affected through multiplicative constants.

4 CONCLUSIONS AND FUTURE RESEARCH

We have shown how to share a computational budget between external sampling of Z and internal sampling conditional on values of Z so as to minimize the amise of the density estimator. The amise can converge to 0 at rate c^(−4/7), where c is the computational budget. This is slower than the c^(−4/5) rate exhibited in the standard density estimation context, and both of these rates are slower than the standard Monte Carlo rate c^(−1) when one is estimating an expectation. Nevertheless, we believe that the insight one obtains from the estimated density justifies the additional
computational effort involved. Furthermore, one does not need an especially accurate estimate of the density in order to get some idea of the extent of the effect of input uncertainty.

Clearly, much remains to be done. We need to generalize our results beyond the case where Z is real-valued, so as to capture multiple input parameters and/or distributions. The rate of convergence of the estimator seems to depend strongly on the smoothness of ρ and, as observed in experiments not reported here, on the smoothness of the target density; we need to understand this better. In view of the relatively slow convergence of our estimators, confidence intervals for estimates of f(x), or more generally confidence bands for the entire density f, would be of great value. A key assumption is that X̄_m(z) is exactly normally distributed. This assumption often holds approximately due to the central limit theorem, since we require m to grow with the computational budget. The results for Example 3 suggest that nonnormality may not severely impact the rate of convergence, but we need to better understand this impact. It is also of interest to consider problems that do not fit the framework here, such as steady-state simulation and quantile estimation.
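The budget-sharing rule analyzed in Section 2 (take δ = 1, let h(c) be proportional to c^(−1/7), tie the inner sample size to the bandwidth through h² = a m^(−δ), and spend the remainder on n = c/m outer replicates) can be sketched as follows; the rounding and the default constants r and a are our own illustrative choices:

```python
def allocate_budget(c, r=1.0, a=1.0, delta=1.0):
    """Split a computational budget c between outer samples (n) and inner samples (m).

    Uses the bandwidth h = r * c**(-1/7), which attains the optimal mise rate
    c**(-4/7) when delta = 1; ties the inner sample size to the bandwidth via
    h**2 = a * m**(-delta); and takes n = c / m outer replicates, so that the
    total effort n * m is approximately c.
    """
    h = r * c ** (-1.0 / 7.0)
    m = max(1, round((a / h ** 2) ** (1.0 / delta)))
    n = max(1, round(c / m))
    return n, m, h

n, m, h = allocate_budget(2 ** 18)
```

As the budget c grows, both the inner sample size m and the outer sample size n grow, while the bandwidth h shrinks at the c^(−1/7) rate.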
We have shown how to choose the bandwidth only up to a multiplicative constant. This constant can have a strong impact on the performance of the estimator even though it doesn't change the asymptotic rate of convergence. So, just as in the standard kernel density estimation case, bandwidth selection remains an issue.

In view of the popularity of histogram estimators, it would be interesting to explore their asymptotic performance in our setting. They are known to converge at a slower rate than kernel-based estimators in the i.i.d. setting (Freedman and Diaconis 1981).

We are pursuing, or plan to pursue, all of these topics.

ACKNOWLEDGMENTS

The first author was supported by a National Defense Science and Engineering Graduate Fellowship. The work of the second author was partially supported by two National Science Foundation DMI grants.

REFERENCES

Andradóttir, S., and P. W. Glynn. 2003. Computing Bayesian means using simulation. Submitted for publication.
Apostol, T. M. 1967. Calculus, Volume I. 2nd ed. New York: Wiley.
Cheng, R. C. H. 1994. Selecting input models. In Proceedings of the 1994 Winter Simulation Conference, ed. J. D. Tew, S. Manivannan, D. A. Sadowski, and A. F. Seila, 184–191. Piscataway, NJ: IEEE.
Cheng, R. C. H., and W. Holland. 1997. Sensitivity of computer simulation experiments to errors in input data. Journal of Statistical Computation and Simulation 57:219–241.
Cheng, R. C. H., and W. Holland. 1998. Two-point methods for assessing variability in simulation output. Journal of Statistical Computation and Simulation 60:83–105.
Cheng, R. C. H., and W. Holland. 2003. Calculation of confidence intervals for simulation output. Submitted for publication.
Chick, S. E. 2001. Input distribution selection for simulation experiments: accounting for input uncertainty. Operations Research 49:744–758.
Devroye, L., and L. Györfi. 1985. Nonparametric Density Estimation: The L₁ View. New York: Wiley.
Freedman, D., and P. Diaconis. 1981. On the histogram as a density estimator: L₂ theory. Z. Wahrscheinlichkeitstheorie verw. Gebiete 57:453–476.
Hall, P. 1990. On the bias of variable bandwidth curve estimators. Biometrika 77 (3).
Henderson, S. G. 2003. Input model uncertainty: why do we care and what
should we do about it? In Proceedings of the 2003 Winter Simulation Conference, ed. S. E. Chick, P. J. Sánchez, D. J. Morrice, and D. Ferrin. To appear. Piscataway, NJ: IEEE.
Lee, S. H. 1998. Monte Carlo Computation of Conditional Expectation Quantiles. PhD thesis, Stanford University, Stanford, CA.
Lee, S. H., and P. W. Glynn. 1999. Computing the distribution function of a conditional expectation via Monte Carlo: discrete conditioning spaces. In Proceedings of the 1999 Winter Simulation Conference, ed. P. A. Farrington, H. B. Nembhard, D. T. Sturrock, and G. W. Evans. Piscataway, NJ: IEEE.
Prakasa Rao, B. L. S. 1983. Nonparametric Functional Estimation. New York: Academic Press.
Wand, M., and M. Jones. 1995. Kernel Smoothing. London: Chapman & Hall.
Zouaoui, F., and J. R. Wilson. 2001a. Accounting for input model and parameter uncertainty in simulation. In Proceedings of the 2001 Winter Simulation Conference, ed. B. A. Peters, J. S. Smith, D. J. Medeiros, and M. W. Rohrer, 290–299. Piscataway, NJ: IEEE.
Zouaoui, F., and J. R. Wilson. 2001b. Accounting for parameter uncertainty in simulation input modeling. In Proceedings of the 2001 Winter Simulation Conference, ed. B. A. Peters, J. S. Smith, D. J. Medeiros, and M. W. Rohrer. Piscataway, NJ: IEEE.

AUTHOR BIOGRAPHIES

SAMUEL G. STECKLEY is a PhD candidate in the School of Operations Research and Industrial Engineering at Cornell University. His primary field of interest is input model uncertainty in discrete-event simulation. He is the recipient of an NDSEG fellowship. His e-mail address is <steckley@orie.cornell.edu>.

SHANE G. HENDERSON is an assistant professor in the School of Operations Research and Industrial Engineering at Cornell University. He has previously held positions at the University of Michigan and the University of Auckland. He is an associate editor for the ACM Transactions on Modeling and Computer Simulation, Operations Research Letters, and Mathematics of Operations Research, and the newsletter editor for the INFORMS College on Simulation. His research interests include discrete-event simulation, queueing theory, and scheduling problems. His e-mail address is <sgh9@cornell.edu>, and his web page URL is
<www.orie.cornell.edu/~shane>.
More informationIEOR 165 Lecture 10 Distribution Estimation
IEOR 165 Lecture 10 Distribution Estimation 1 Motivating Problem Consider a situation were we ave iid data x i from some unknown distribution. One problem of interest is estimating te distribution tat
More informationDEPARTMENT OF ECONOMETRICS AND BUSINESS STATISTICS
ISSN 1440-771X AUSTRALIA DEPARTMENT OF ECONOMETRICS AND BUSINESS STATISTICS An Iproved Method for Bandwidth Selection When Estiating ROC Curves Peter G Hall and Rob J Hyndan Working Paper 11/00 An iproved
More informationCOS 424: Interacting with Data. Written Exercises
COS 424: Interacting with Data Hoework #4 Spring 2007 Regression Due: Wednesday, April 18 Written Exercises See the course website for iportant inforation about collaboration and late policies, as well
More informationBootstrapping Dependent Data
Bootstrapping Dependent Data One of the key issues confronting bootstrap resapling approxiations is how to deal with dependent data. Consider a sequence fx t g n t= of dependent rando variables. Clearly
More informationMVT and Rolle s Theorem
AP Calculus CHAPTER 4 WORKSHEET APPLICATIONS OF DIFFERENTIATION MVT and Rolle s Teorem Name Seat # Date UNLESS INDICATED, DO NOT USE YOUR CALCULATOR FOR ANY OF THESE QUESTIONS In problems 1 and, state
More informationNumerical Differentiation
Numerical Differentiation Finite Difference Formulas for te first derivative (Using Taylor Expansion tecnique) (section 8.3.) Suppose tat f() = g() is a function of te variable, and tat as 0 te function
More informationThe research of the rst author was supported in part by an Information Technology
Tecnical Report 95-376 Absorbing Boundary Conditions for te Scrodinger Equation Toas Fevens Hong Jiang February 16, 1995 Te researc of te rst autor was supported in part by an Inforation Tecnology Researc
More information3.4 Worksheet: Proof of the Chain Rule NAME
Mat 1170 3.4 Workseet: Proof of te Cain Rule NAME Te Cain Rule So far we are able to differentiate all types of functions. For example: polynomials, rational, root, and trigonometric functions. We are
More informationMathematics 5 Worksheet 11 Geometry, Tangency, and the Derivative
Matematics 5 Workseet 11 Geometry, Tangency, and te Derivative Problem 1. Find te equation of a line wit slope m tat intersects te point (3, 9). Solution. Te equation for a line passing troug a point (x
More informationMachine Learning Basics: Estimators, Bias and Variance
Machine Learning Basics: Estiators, Bias and Variance Sargur N. srihari@cedar.buffalo.edu This is part of lecture slides on Deep Learning: http://www.cedar.buffalo.edu/~srihari/cse676 1 Topics in Basics
More informationVolume 29, Issue 3. Existence of competitive equilibrium in economies with multi-member households
Volume 29, Issue 3 Existence of competitive equilibrium in economies wit multi-member ouseolds Noriisa Sato Graduate Scool of Economics, Waseda University Abstract Tis paper focuses on te existence of
More informationProbability Distributions
Probability Distributions In Chapter, we ephasized the central role played by probability theory in the solution of pattern recognition probles. We turn now to an exploration of soe particular exaples
More informationc hc h c h. Chapter Since E n L 2 in Eq. 39-4, we see that if L is doubled, then E 1 becomes (2.6 ev)(2) 2 = 0.65 ev.
Capter 39 Since n L in q 39-4, we see tat if L is doubled, ten becoes (6 ev)() = 065 ev We first note tat since = 666 0 34 J s and c = 998 0 8 /s, 34 8 c6 66 0 J sc 998 0 / s c 40eV n 9 9 60 0 J / ev 0
More informationBob Brown Math 251 Calculus 1 Chapter 3, Section 1 Completed 1 CCBC Dundalk
Bob Brown Mat 251 Calculus 1 Capter 3, Section 1 Completed 1 Te Tangent Line Problem Te idea of a tangent line first arises in geometry in te context of a circle. But before we jump into a discussion of
More informationINFORMATION THEORETIC STRUCTURE LEARNING WITH CONFIDENCE
INFORMATION THEORETIC STRUCTURE LEARNING WITH CONFIDENCE Kevin R. Moon Yale University Departent of Genetics New Haven, Connecticut, U.S.A Morteza Nosad, Salie Yasaei Seke, Alfred O. Hero III University
More informationPreface. Here are a couple of warnings to my students who may be here to get a copy of what happened on a day that you missed.
Preface Here are my online notes for my course tat I teac ere at Lamar University. Despite te fact tat tese are my class notes, tey sould be accessible to anyone wanting to learn or needing a refreser
More informationExam 1 Review Solutions
Exam Review Solutions Please also review te old quizzes, and be sure tat you understand te omework problems. General notes: () Always give an algebraic reason for your answer (graps are not sufficient),
More informationKeywords: Estimator, Bias, Mean-squared error, normality, generalized Pareto distribution
Testing approxiate norality of an estiator using the estiated MSE and bias with an application to the shape paraeter of the generalized Pareto distribution J. Martin van Zyl Abstract In this work the norality
More informationRegularized Regression
Regularized Regression David M. Blei Columbia University December 5, 205 Modern regression problems are ig dimensional, wic means tat te number of covariates p is large. In practice statisticians regularize
More informationBlock designs and statistics
Bloc designs and statistics Notes for Math 447 May 3, 2011 The ain paraeters of a bloc design are nuber of varieties v, bloc size, nuber of blocs b. A design is built on a set of v eleents. Each eleent
More informationBiostatistics Department Technical Report
Biostatistics Departent Technical Report BST006-00 Estiation of Prevalence by Pool Screening With Equal Sized Pools and a egative Binoial Sapling Model Charles R. Katholi, Ph.D. Eeritus Professor Departent
More informationKernel Density Based Linear Regression Estimate
Kernel Density Based Linear Regression Estimate Weixin Yao and Zibiao Zao Abstract For linear regression models wit non-normally distributed errors, te least squares estimate (LSE will lose some efficiency
More informationAn unbalanced Optimal Transport splitting scheme for general advection-reaction-diffusion problems
An unbalanced Optial Transport splitting scee for general advection-reaction-diffusion probles T.O. Gallouët, M. Laborde, L. Monsaingeon May 2, 27 Abstract In tis paper, we sow tat unbalanced optial transport
More informationPassivity based control of magnetic levitation systems: theory and experiments Λ
Passivity based control of agnetic levitation systes: teory and experients Λ Hugo Rodriguez a, Roeo Ortega ay and Houria Siguerdidjane b alaboratoire des Signaux et Systées bservice d Autoatique Supelec
More informationNumerical Solution for Non-Stationary Heat Equation in Cooling of Computer Radiator System
(JZS) Journal of Zankoy Sulaiani, 9, 1(1) Part A (97-1) A119 Nuerical Solution for Non-Stationary Heat Equation in Cooling of Coputer Radiator Syste Aree A. Maad*, Faraidun K. Haa Sal**, and Najadin W.
More informationJackknife Emperical Likelihood Method and its Applications
Georgia State University ScolarWorks @ Georgia State University Mateatics Dissertations Departent of Mateatics and Statistics 8--202 Jackknife Eperical Likeliood Metod and its Applications Hanfang Yang
More informationContinuity and Differentiability Worksheet
Continuity and Differentiability Workseet (Be sure tat you can also do te grapical eercises from te tet- Tese were not included below! Typical problems are like problems -3, p. 6; -3, p. 7; 33-34, p. 7;
More informationDifferential Calculus (The basics) Prepared by Mr. C. Hull
Differential Calculus Te basics) A : Limits In tis work on limits, we will deal only wit functions i.e. tose relationsips in wic an input variable ) defines a unique output variable y). Wen we work wit
More informationCombining functions: algebraic methods
Combining functions: algebraic metods Functions can be added, subtracted, multiplied, divided, and raised to a power, just like numbers or algebra expressions. If f(x) = x 2 and g(x) = x + 2, clearly f(x)
More informationSection 3.1: Derivatives of Polynomials and Exponential Functions
Section 3.1: Derivatives of Polynomials and Exponential Functions In previous sections we developed te concept of te derivative and derivative function. Te only issue wit our definition owever is tat it
More informationHomotopy analysis of 1D unsteady, nonlinear groundwater flow through porous media
Hootopy analysis of D unsteady, nonlinear groundwater flow troug porous edia Autor Song, Hao, Tao, Longbin Publised 7 Journal Title Journal of Coastal Researc Copyrigt Stateent 7 CERF. Te attaced file
More informationAnalytic Functions. Differentiable Functions of a Complex Variable
Analytic Functions Differentiable Functions of a Complex Variable In tis capter, we sall generalize te ideas for polynomials power series of a complex variable we developed in te previous capter to general
More informationThe derivative function
Roberto s Notes on Differential Calculus Capter : Definition of derivative Section Te derivative function Wat you need to know already: f is at a point on its grap and ow to compute it. Wat te derivative
More informationModel Fitting. CURM Background Material, Fall 2014 Dr. Doreen De Leon
Model Fitting CURM Background Material, Fall 014 Dr. Doreen De Leon 1 Introduction Given a set of data points, we often want to fit a selected odel or type to the data (e.g., we suspect an exponential
More informationLIMITATIONS OF EULER S METHOD FOR NUMERICAL INTEGRATION
LIMITATIONS OF EULER S METHOD FOR NUMERICAL INTEGRATION LAURA EVANS.. Introduction Not all differential equations can be explicitly solved for y. Tis can be problematic if we need to know te value of y
More informationNUCLEAR THERMAL-HYDRAULIC FUNDAMENTALS
NUCLEAR THERMAL-HYDRAULIC FUNDAMENTALS Dr. J. Micael Doster Departent of Nuclear Engineering Nort Carolina State University Raleig, NC Copyrigted POER CYCLES Te analysis of Terodynaic Cycles is based alost
More informationCalculus I Practice Exam 1A
Calculus I Practice Exam A Calculus I Practice Exam A Tis practice exam empasizes conceptual connections and understanding to a greater degree tan te exams tat are usually administered in introductory
More informationlecture 26: Richardson extrapolation
43 lecture 26: Ricardson extrapolation 35 Ricardson extrapolation, Romberg integration Trougout numerical analysis, one encounters procedures tat apply some simple approximation (eg, linear interpolation)
More informationInternational Journal of Advance Engineering and Research Development OSCILLATION AND STABILITY IN A MASS SPRING SYSTEM
Scientific Journal of Ipact Factor (SJIF): 5.71 International Journal of Advance Engineering and Researc Developent Volue 5, Issue 06, June -018 e-issn (O): 348-4470 p-issn (P): 348-6406 OSCILLATION AND
More informationInspection; structural health monitoring; reliability; Bayesian analysis; updating; decision analysis; value of information
Cite as: Straub D. (2014). Value of inforation analysis with structural reliability ethods. Structural Safety, 49: 75-86. Value of Inforation Analysis with Structural Reliability Methods Daniel Straub
More informationThe Weierstrass Approximation Theorem
36 The Weierstrass Approxiation Theore Recall that the fundaental idea underlying the construction of the real nubers is approxiation by the sipler rational nubers. Firstly, nubers are often deterined
More informationMAT 145. Type of Calculator Used TI-89 Titanium 100 points Score 100 possible points
MAT 15 Test #2 Name Solution Guide Type of Calculator Used TI-89 Titanium 100 points Score 100 possible points Use te grap of a function sown ere as you respond to questions 1 to 8. 1. lim f (x) 0 2. lim
More information2.11 That s So Derivative
2.11 Tat s So Derivative Introduction to Differential Calculus Just as one defines instantaneous velocity in terms of average velocity, we now define te instantaneous rate of cange of a function at a point
More informationDetermining Limits of Thermal NDT of Thick Graphite/Epoxy Composites
ECNDT 006 - We.3.8.1 Deterining Liits of Teral NDT of Tick Grapite/Epoy Coposites Vladiir VAVILOV Institute of Introscopy Tosk Russia Abstract. Te known approac to inspecting tin coposites by using infrared
More informationESTIMATING AND FORMING CONFIDENCE INTERVALS FOR EXTREMA OF RANDOM POLYNOMIALS. A Thesis. Presented to. The Faculty of the Department of Mathematics
ESTIMATING AND FORMING CONFIDENCE INTERVALS FOR EXTREMA OF RANDOM POLYNOMIALS A Thesis Presented to The Faculty of the Departent of Matheatics San Jose State University In Partial Fulfillent of the Requireents
More informationA Simplified Analytical Approach for Efficiency Evaluation of the Weaving Machines with Automatic Filling Repair
Proceedings of the 6th SEAS International Conference on Siulation, Modelling and Optiization, Lisbon, Portugal, Septeber -4, 006 0 A Siplified Analytical Approach for Efficiency Evaluation of the eaving
More informationLecture 15. Interpolation II. 2 Piecewise polynomial interpolation Hermite splines
Lecture 5 Interpolation II Introduction In te previous lecture we focused primarily on polynomial interpolation of a set of n points. A difficulty we observed is tat wen n is large, our polynomial as to
More informationOn the Impact of Stochastic Loads and Wind Generation on Under Load Tap Changers
On te Ipact of Stocastic Loads and Wind Generation on Under Load Tap Cangers M. A. A. Murad, Student Meber, IEEE, F. M. Mele, Student Meber, IEEE, F. Milano, Fellow, IEEE Scool of Electrical & Electronic
More informationChapter 1. Density Estimation
Capter 1 Density Estimation Let X 1, X,..., X n be observations from a density f X x. Te aim is to use only tis data to obtain an estimate ˆf X x of f X x. Properties of f f X x x, Parametric metods f
More informationEN40: Dynamics and Vibrations. Midterm Examination Tuesday March
EN4: Dynaics and ibrations Midter Exaination Tuesday Marc 4 14 Scool of Engineering Brown University NAME: General Instructions No collaboration of any kind is peritted on tis exaination. You ay bring
More informationExperimental Design For Model Discrimination And Precise Parameter Estimation In WDS Analysis
City University of New York (CUNY) CUNY Acadeic Works International Conference on Hydroinforatics 8-1-2014 Experiental Design For Model Discriination And Precise Paraeter Estiation In WDS Analysis Giovanna
More informationCONVERGENCE OF A MIXED METHOD FOR A SEMI-STATIONARY COMPRESSIBLE STOKES SYSTEM
CONVERGENCE OF A MIXED METHOD FOR A SEMI-STATIONARY COMPRESSIBLE STOKES SYSTEM KENNETH H. KARLSEN AND TRYGVE K. KARPER Abstract. We propose and analyze a finite eleent etod for a sei stationary Stoes syste
More informationTesting equality of variances for multiple univariate normal populations
University of Wollongong Research Online Centre for Statistical & Survey Methodology Working Paper Series Faculty of Engineering and Inforation Sciences 0 esting equality of variances for ultiple univariate
More informationBoosting Kernel Density Estimates: a Bias Reduction. Technique?
Boosting Kernel Density Estimates: a Bias Reduction Tecnique? Marco Di Marzio Dipartimento di Metodi Quantitativi e Teoria Economica, Università di Cieti-Pescara, Viale Pindaro 42, 65127 Pescara, Italy
More informationBloom Features. Kwabena Boahen Bioengineering Department Stanford University Stanford CA, USA
2015 International Conference on Coputational Science and Coputational Intelligence Bloo Features Asok Cutkosky Coputer Science Departent Stanford University Stanford CA, USA asokc@stanford.edu Kwabena
More informationSECTION 3.2: DERIVATIVE FUNCTIONS and DIFFERENTIABILITY
(Section 3.2: Derivative Functions and Differentiability) 3.2.1 SECTION 3.2: DERIVATIVE FUNCTIONS and DIFFERENTIABILITY LEARNING OBJECTIVES Know, understand, and apply te Limit Definition of te Derivative
More informationLecture XVII. Abstract We introduce the concept of directional derivative of a scalar function and discuss its relation with the gradient operator.
Lecture XVII Abstract We introduce te concept of directional derivative of a scalar function and discuss its relation wit te gradient operator. Directional derivative and gradient Te directional derivative
More informationA Possible Solution to the Cosmological Constant Problem By Discrete Space-time Hypothesis
A Possible Solution to te Cosological Constant Proble By Discrete Space-tie Hypotesis H.M.Mok Radiation Healt Unit, 3/F., Saiwano Healt Centre, Hong Kong SAR Got, 28 Tai Hong St., Saiwano, Hong Kong, Cina.
More informationNew Distribution Theory for the Estimation of Structural Break Point in Mean
New Distribution Teory for te Estimation of Structural Break Point in Mean Liang Jiang Singapore Management University Xiaou Wang Te Cinese University of Hong Kong Jun Yu Singapore Management University
More informationA MONTE CARLO ANALYSIS OF THE EFFECTS OF COVARIANCE ON PROPAGATED UNCERTAINTIES
A MONTE CARLO ANALYSIS OF THE EFFECTS OF COVARIANCE ON PROPAGATED UNCERTAINTIES Ronald Ainswort Hart Scientific, American Fork UT, USA ABSTRACT Reports of calibration typically provide total combined uncertainties
More informationChapter 5 FINITE DIFFERENCE METHOD (FDM)
MEE7 Computer Modeling Tecniques in Engineering Capter 5 FINITE DIFFERENCE METHOD (FDM) 5. Introduction to FDM Te finite difference tecniques are based upon approximations wic permit replacing differential
More informationREVIEW LAB ANSWER KEY
REVIEW LAB ANSWER KEY. Witout using SN, find te derivative of eac of te following (you do not need to simplify your answers): a. f x 3x 3 5x x 6 f x 3 3x 5 x 0 b. g x 4 x x x notice te trick ere! x x g
More informationPolynomial Interpolation
Capter 4 Polynomial Interpolation In tis capter, we consider te important problem of approximatinga function fx, wose values at a set of distinct points x, x, x,, x n are known, by a polynomial P x suc
More informationIN modern society that various systems have become more
Developent of Reliability Function in -Coponent Standby Redundant Syste with Priority Based on Maxiu Entropy Principle Ryosuke Hirata, Ikuo Arizono, Ryosuke Toohiro, Satoshi Oigawa, and Yasuhiko Takeoto
More informationINFINITE ORDER CROSS-VALIDATED LOCAL POLYNOMIAL REGRESSION. 1. Introduction
INFINITE ORDER CROSS-VALIDATED LOCAL POLYNOMIAL REGRESSION PETER G. HALL AND JEFFREY S. RACINE Abstract. Many practical problems require nonparametric estimates of regression functions, and local polynomial
More informationEFFICIENCY OF MODEL-ASSISTED REGRESSION ESTIMATORS IN SAMPLE SURVEYS
Statistica Sinica 24 2014, 395-414 doi:ttp://dx.doi.org/10.5705/ss.2012.064 EFFICIENCY OF MODEL-ASSISTED REGRESSION ESTIMATORS IN SAMPLE SURVEYS Jun Sao 1,2 and Seng Wang 3 1 East Cina Normal University,
More informationHow to Combine Tree-Search Methods in Reinforcement Learning
How to Cobine Tree-Searc Metods in Reinforceent Learning Yonatan Efroni Tecnion, Israel Gal Dalal Tecnion, Israel Bruno Scerrer INRIA, Villers les Nancy, France Sie Mannor Tecnion, Israel Abstract Finite-orizon
More informationA Note on the Applied Use of MDL Approximations
A Note on the Applied Use of MDL Approxiations Daniel J. Navarro Departent of Psychology Ohio State University Abstract An applied proble is discussed in which two nested psychological odels of retention
More informationApplications of the van Trees inequality to non-parametric estimation.
Brno-06, Lecture 2, 16.05.06 D/Stat/Brno-06/2.tex www.mast.queensu.ca/ blevit/ Applications of te van Trees inequality to non-parametric estimation. Regular non-parametric problems. As an example of suc
More informationA.P. CALCULUS (AB) Outline Chapter 3 (Derivatives)
A.P. CALCULUS (AB) Outline Capter 3 (Derivatives) NAME Date Previously in Capter 2 we determined te slope of a tangent line to a curve at a point as te limit of te slopes of secant lines using tat point
More informationAN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS
Statistica Sinica 6 016, 1709-178 doi:http://dx.doi.org/10.5705/ss.0014.0034 AN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS Nilabja Guha 1, Anindya Roy, Yaakov Malinovsky and Gauri
More information7 Semiparametric Methods and Partially Linear Regression
7 Semiparametric Metods and Partially Linear Regression 7. Overview A model is called semiparametric if it is described by and were is nite-dimensional (e.g. parametric) and is in nite-dimensional (nonparametric).
More information2.8 The Derivative as a Function
.8 Te Derivative as a Function Typically, we can find te derivative of a function f at many points of its domain: Definition. Suppose tat f is a function wic is differentiable at every point of an open
More informationDifferentiation in higher dimensions
Capter 2 Differentiation in iger dimensions 2.1 Te Total Derivative Recall tat if f : R R is a 1-variable function, and a R, we say tat f is differentiable at x = a if and only if te ratio f(a+) f(a) tends
More informationC na (1) a=l. c = CO + Clm + CZ TWO-STAGE SAMPLE DESIGN WITH SMALL CLUSTERS. 1. Introduction
TWO-STGE SMPLE DESIGN WITH SMLL CLUSTERS Robert G. Clark and David G. Steel School of Matheatics and pplied Statistics, University of Wollongong, NSW 5 ustralia. (robert.clark@abs.gov.au) Key Words: saple
More information1 The concept of limits (p.217 p.229, p.242 p.249, p.255 p.256) 1.1 Limits Consider the function determined by the formula 3. x since at this point
MA00 Capter 6 Calculus and Basic Linear Algebra I Limits, Continuity and Differentiability Te concept of its (p.7 p.9, p.4 p.49, p.55 p.56). Limits Consider te function determined by te formula f Note
More informationName: Answer Key No calculators. Show your work! 1. (21 points) All answers should either be,, a (finite) real number, or DNE ( does not exist ).
Mat - Final Exam August 3 rd, Name: Answer Key No calculators. Sow your work!. points) All answers sould eiter be,, a finite) real number, or DNE does not exist ). a) Use te grap of te function to evaluate
More informationTaylor Series and the Mean Value Theorem of Derivatives
1 - Taylor Series and te Mean Value Teorem o Derivatives Te numerical solution o engineering and scientiic problems described by matematical models oten requires solving dierential equations. Dierential
More informationTHE IDEA OF DIFFERENTIABILITY FOR FUNCTIONS OF SEVERAL VARIABLES Math 225
THE IDEA OF DIFFERENTIABILITY FOR FUNCTIONS OF SEVERAL VARIABLES Mat 225 As we ave seen, te definition of derivative for a Mat 111 function g : R R and for acurveγ : R E n are te same, except for interpretation:
More information