Pythagoras at the Bat: An Introduction to Stats and Modeling


1 Pythagoras at the Bat: An Introduction to Stats and Modeling Cameron, Kayla and Steven Miller (sjm1@williams.edu, Williams College) http://web.williams.edu/mathematics/sjmiller/public_html/

2 Acknowledgments Sal Baxamusa, Phil Birnbaum, Chris Chiang, Ray Ciccolella, Steve Johnston, Michelle Manes, Russ Mann, students of Math 162 and Math 197 at Brown, Math 150 and 399 at Williams. Dedicated to my great uncle Newt Bromberg (a lifetime Red Sox fan who promised me that I would live to see a World Series Championship in Boston). Chris Long and the San Diego Padres.

3 Acknowledgments

4 Introduction to the Pythagorean Won Loss Theorem

5 Outline: Intro, Prob & Modeling, Pythag Thm, Analysis of '04, Adv Theory, Summary, Refs, Appendices. Goals of the Talk: Give a derivation of the Pythagorean Won-Loss formula. Observe ideas/techniques of modeling. See how advanced theory enters in simple problems. Opportunities from inefficiencies. Extra: further avenues for research for students.

6 Goals of the Talk: Give a derivation of the Pythagorean Won-Loss formula. Observe ideas/techniques of modeling. See how advanced theory enters in simple problems. Opportunities from inefficiencies. Extra: further avenues for research for students. GO SOX!

7 Statistics Goal is to find good statistics to describe the real world.

8 Statistics Goal is to find good statistics to describe the real world. Figure: Harvard Bridge, about 620 meters.

9 Statistics Goal is to find good statistics to describe the real world. Figure: Harvard Bridge, 364.4 Smoots (± one ear).

10-15 Baseball Review Goal is to go from [figure] to [figure] to [figure].

16 Numerical Observation: Pythagorean Won-Loss Formula Parameters: RS_obs: average number of runs scored per game; RA_obs: average number of runs allowed per game; γ: some parameter, constant for a sport.

17 Numerical Observation: Pythagorean Won-Loss Formula Parameters: RS_obs: average number of runs scored per game; RA_obs: average number of runs allowed per game; γ: some parameter, constant for a sport. James Won-Loss Formula (NUMERICAL observation): Won-Loss Percentage = #Wins/#Games = RS_obs^γ / (RS_obs^γ + RA_obs^γ). γ was originally taken as 2; numerical studies show the best γ for baseball is about 1.82.

18 Pythagorean Won-Loss Formula: Example James Won-Loss Formula: Won-Loss Percentage = #Wins/#Games = RS_obs^γ / (RS_obs^γ + RA_obs^γ). Example (γ = 1.82): In 2009 the Red Sox were 95-67. They scored 872 runs and allowed 736, for a Pythagorean prediction record of 93.4 wins and 68.6 losses; the Yankees were 103-59 but predicted to be 95.2-66.8 (they scored 915 runs and allowed 753).

19 Pythagorean Won-Loss Formula: Example James Won-Loss Formula: Won-Loss Percentage = #Wins/#Games = RS_obs^γ / (RS_obs^γ + RA_obs^γ). Example (γ = 1.82): In 2009 the Red Sox were 95-67. They scored 872 runs and allowed 736, for a Pythagorean prediction record of 93.4 wins and 68.6 losses; the Yankees were 103-59 but predicted to be 95.2-66.8 (they scored 915 runs and allowed 753). 2011: Red Sox should be 95-67, Tampa should be
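The example above can be reproduced in a few lines; a minimal sketch, taking the team totals from the slide and γ = 1.82:

```python
def pythag_wins(rs, ra, games=162, gamma=1.82):
    """Predicted wins from the Pythagorean formula: games * RS^γ / (RS^γ + RA^γ)."""
    return games * rs**gamma / (rs**gamma + ra**gamma)

# 2009 Red Sox: 872 scored, 736 allowed -> about 93.4 predicted wins
print(round(pythag_wins(872, 736), 1))
# 2009 Yankees: 915 scored, 753 allowed -> about 95.2 predicted wins
print(round(pythag_wins(915, 753), 1))
```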

20 Applications of the Pythagorean Won-Loss Formula Extrapolation: use the record half-way through a season to predict a team's performance for the rest of the season. Evaluation: see if teams consistently over-perform or under-perform. Advantage over other statistics/formulas (e.g., run differential per game): easy to use, depends on only two simple numbers for a team.

21 Applications of the Pythagorean Won-Loss Formula Extrapolation: use the record half-way through a season to predict a team's performance for the rest of the season. Evaluation: see if teams consistently over-perform or under-perform. Advantage over other statistics/formulas (e.g., run differential per game): easy to use, depends on only two simple numbers for a team. Red Sox 2004 Predictions: May 1: 99 wins; June 1: 93 wins; July 1: 90 wins; August 1: 92 wins. Finished the season with 98 wins.

22 Probability and Modeling

23 Observed scoring distributions Goal is to model observed scoring distributions; for example, consider: [figure]

24 Probability Review Let X be a random variable with density p(x): p(x) ≥ 0; ∫_{-∞}^{∞} p(x) dx = 1; Prob(a ≤ X ≤ b) = ∫_a^b p(x) dx.

25 Probability Review Let X be a random variable with density p(x): p(x) ≥ 0; ∫_{-∞}^{∞} p(x) dx = 1; Prob(a ≤ X ≤ b) = ∫_a^b p(x) dx. Mean μ = ∫ x p(x) dx.

26 Probability Review Let X be a random variable with density p(x): p(x) ≥ 0; ∫_{-∞}^{∞} p(x) dx = 1; Prob(a ≤ X ≤ b) = ∫_a^b p(x) dx. Mean μ = ∫ x p(x) dx. Variance σ² = ∫ (x − μ)² p(x) dx.

27 Probability Review Let X be a random variable with density p(x): p(x) ≥ 0; ∫_{-∞}^{∞} p(x) dx = 1; Prob(a ≤ X ≤ b) = ∫_a^b p(x) dx. Mean μ = ∫ x p(x) dx. Variance σ² = ∫ (x − μ)² p(x) dx. Independence: knowledge of one random variable gives no knowledge of the other.
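These definitions are easy to check numerically for a concrete density; a small sketch for the exponential density e^{-x} (which has mean 1 and variance 1), using a crude Riemann sum in place of the integrals:

```python
from math import exp

p = lambda x: exp(-x)      # exponential density on [0, infinity)

dx = 0.001
xs = [i * dx for i in range(50_000)]               # integrate over [0, 50]
total = sum(p(x) * dx for x in xs)                 # should be close to 1
mu    = sum(x * p(x) * dx for x in xs)             # mean, close to 1
var   = sum((x - mu) ** 2 * p(x) * dx for x in xs) # variance, close to 1
print(total, mu, var)
```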

28 Modeling the Real World Guidelines for Modeling: Model should capture key features of the system; Model should be mathematically tractable (solvable).

29 Modeling the Real World (cont) Possible Model: Runs Scored and Runs Allowed independent random variables; f_RS(x), g_RA(y): probability density functions for runs scored (allowed).

30 Modeling the Real World (cont) Possible Model: Runs Scored and Runs Allowed independent random variables; f_RS(x), g_RA(y): probability density functions for runs scored (allowed). Won-Loss formula follows from computing ∫_{x=0}^{∞} [∫_{y=0}^{x} f_RS(x) g_RA(y) dy] dx or Σ_{i=0}^{∞} Σ_{j<i} f_RS(i) g_RA(j).

31 Problems with the Model Reduced to calculating ∫_{x=0}^{∞} [∫_{y=0}^{x} f_RS(x) g_RA(y) dy] dx or Σ_{i=0}^{∞} Σ_{j<i} f_RS(i) g_RA(j).

32 Problems with the Model Reduced to calculating ∫_{x=0}^{∞} [∫_{y=0}^{x} f_RS(x) g_RA(y) dy] dx or Σ_{i=0}^{∞} Σ_{j<i} f_RS(i) g_RA(j). Problems with the model: What are explicit formulas for f_RS and g_RA? Are the runs scored and allowed independent random variables? Can the integral (or sum) be computed in closed form?
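In the discrete case the double sum can be computed directly; a sketch with a toy (hypothetical) scoring distribution:

```python
def win_probability(f_rs, g_ra):
    """Prob(runs scored > runs allowed) for discrete score distributions.
    f_rs[i], g_ra[j]: probability of scoring / allowing exactly i, j runs."""
    return sum(f_rs[i] * g_ra[j]
               for i in range(len(f_rs))
               for j in range(i))        # j < i means a win

# toy example: both teams score 0, 1 or 2 runs with equal probability 1/3;
# by symmetry wins and losses are equally likely and ties occur 1/3 of the time
f = g = [1/3, 1/3, 1/3]
print(win_probability(f, g))   # 1/3
```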

33 Choices for f_RS and g_RA Uniform Distribution on [0, 10].

34 Choices for f_RS and g_RA Normal Distribution: mean 4, standard deviation 2.

35 Choices for f_RS and g_RA Exponential Distribution: e^{-x}.

36 Three Parameter Weibull Weibull distribution: f(x; α, β, γ) = (γ/α)((x−β)/α)^{γ−1} e^{−((x−β)/α)^γ} if x ≥ β, and 0 otherwise. α: scale (variance: meters versus centimeters); β: origin (mean: translation, zero point); γ: shape (behavior near β and at infinity). Various values give different shapes, but can we find α, β, γ such that it fits observed data? Is the Weibull justifiable by some reasonable hypotheses?

37 Weibull Plots: Parameters (α, β, γ): f(x; α, β, γ) = (γ/α)((x−β)/α)^{γ−1} e^{−((x−β)/α)^γ} if x ≥ β, and 0 otherwise. Red: (1, 0, 1) (exponential); Green: (1, 0, 2); Cyan: (1, 2, 2); Blue: (1, 2, 4).

38 Three Parameter Weibull: Applications f(x; α, β, γ) = (γ/α)((x−β)/α)^{γ−1} e^{−((x−β)/α)^γ} if x ≥ β, and 0 otherwise. Arises in many places, such as survival analysis. γ < 1: high infant mortality; γ = 1: constant failure rate; γ > 1: aging process.

39 The Gamma Distribution and Weibulls For s > 0, define the Γ-function by Γ(s) = ∫_0^∞ e^{−u} u^{s−1} du = ∫_0^∞ e^{−u} u^s du/u. Generalizes the factorial function: Γ(n) = (n−1)! for n ≥ 1 an integer. A Weibull distribution with parameters α, β, γ has: Mean: αΓ(1 + 1/γ) + β. Variance: α²Γ(1 + 2/γ) − α²Γ(1 + 1/γ)².

40 Weibull Integrations μ_{α,β,γ} = ∫_β^∞ x · (γ/α)((x−β)/α)^{γ−1} e^{−((x−β)/α)^γ} dx = ∫_β^∞ (x−β) · (γ/α)((x−β)/α)^{γ−1} e^{−((x−β)/α)^γ} dx + β. Change variables: u = ((x−β)/α)^γ, so du = (γ/α)((x−β)/α)^{γ−1} dx, and μ_{α,β,γ} = ∫_0^∞ α u^{1/γ} e^{−u} du + β = α ∫_0^∞ e^{−u} u^{1+1/γ} du/u + β = αΓ(1 + 1/γ) + β. A similar calculation determines the variance.
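The closed form αΓ(1 + 1/γ) + β can be checked against a direct numerical integration; a sketch with hypothetical parameter values (Python's math.gamma is the Γ-function):

```python
from math import gamma as Gamma, exp

def weibull_pdf(x, alpha, beta, g):
    """Three-parameter Weibull density (γ/α)((x-β)/α)^(γ-1) e^{-((x-β)/α)^γ}."""
    if x < beta:
        return 0.0
    u = (x - beta) / alpha
    return (g / alpha) * u ** (g - 1) * exp(-u ** g)

alpha, beta, g = 2.0, 0.5, 1.82      # hypothetical parameter choices
dx = 0.001
mean_numeric = sum(x * weibull_pdf(x, alpha, beta, g) * dx
                   for x in [beta + i * dx for i in range(40_000)])
mean_closed = alpha * Gamma(1 + 1 / g) + beta
print(mean_numeric, mean_closed)     # the two should agree closely
```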

41 The Pythagorean Theorem

42 Building Intuition: The log 5 Method Assume team A wins p percent of their games, and team B wins q percent of their games. Which formula do you think does a good job of predicting the probability that team A beats team B? Why? (p + pq)/(p + q + 2pq), (p − pq)/(p + q + 2pq), (p + pq)/(p + q − 2pq), (p − pq)/(p + q − 2pq).

43 Estimating Winning Percentages (p + pq)/(p + q + 2pq), (p + pq)/(p + q − 2pq), (p − pq)/(p + q + 2pq), (p − pq)/(p + q − 2pq). How can we test these candidates? Can you think of answers for special choices of p and q?

44 Estimating Winning Percentages (p + pq)/(p + q + 2pq), (p + pq)/(p + q − 2pq), (p − pq)/(p + q + 2pq), (p − pq)/(p + q − 2pq). Homework: explore the following: p = 1, q < 1 (do not want the battle of the undefeated). p = 0, q > 0 (do not want the Toilet Bowl). p = q. p > q (can do q < 1/2 and q > 1/2). Anything else where you know the answer?

45 Estimating Winning Percentages (p + pq)/(p + q + 2pq), (p + pq)/(p + q − 2pq), (p − pq)/(p + q + 2pq), (p − pq)/(p + q − 2pq).

46 Estimating Winning Percentages (p − pq)/(p + q − 2pq) = p(1 − q) / (p(1 − q) + (1 − p)q). Homework: explore the following: p = 1, q < 1 (do not want the battle of the undefeated). p = 0, q > 0 (do not want the Toilet Bowl). p = q. p > q (can do q < 1/2 and q > 1/2). Anything else where you know the answer?
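The winning candidate is easy to test against the homework's special cases; a short sketch:

```python
def log5(p, q):
    """Predicted probability that team A (winning fraction p) beats team B
    (winning fraction q): p(1-q) / (p(1-q) + (1-p)q)."""
    return p * (1 - q) / (p * (1 - q) + (1 - p) * q)

print(log5(1.0, 0.5))   # 1.0 -- an undefeated team always wins
print(log5(0.0, 0.5))   # 0.0 -- a winless team always loses
print(log5(0.7, 0.7))   # 0.5 -- equal teams are a coin flip
```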

47 Estimating Winning Percentages: Proof

48 Estimating Winning Percentages: Proof Figure: Two possibilities: A has a good day, or A doesn't.

49 Estimating Winning Percentages: Proof Figure: B has a good day, or doesn't.

50 Estimating Winning Percentages: Proof Figure: Two paths terminate, two start again.

51 Estimating Winning Percentages: Proof

52 Pythagorean Won-Loss Formula: RS_obs^γ / (RS_obs^γ + RA_obs^γ) Theorem: Pythagorean Won-Loss Formula (Miller '06) Let the runs scored and allowed per game be two independent random variables drawn from Weibull distributions with parameters (α_RS, β, γ) and (α_RA, β, γ), where α_RS and α_RA are chosen so that the Weibull means are the observed sample values RS and RA. If γ > 0 then the Won-Loss Percentage is (RS − β)^γ / ((RS − β)^γ + (RA − β)^γ).

53 Pythagorean Won-Loss Formula: RS_obs^γ / (RS_obs^γ + RA_obs^γ) Theorem: Pythagorean Won-Loss Formula (Miller '06) Let the runs scored and allowed per game be two independent random variables drawn from Weibull distributions with parameters (α_RS, β, γ) and (α_RA, β, γ), where α_RS and α_RA are chosen so that the Weibull means are the observed sample values RS and RA. If γ > 0 then the Won-Loss Percentage is (RS − β)^γ / ((RS − β)^γ + (RA − β)^γ). Take β = −1/2 (since runs must be integers). RS − β estimates the average number of runs scored, RA − β the average number of runs allowed. A Weibull with parameters (α, β, γ) has mean αΓ(1 + 1/γ) + β.

54 Proof of the Pythagorean Won-Loss Formula Let X and Y be independent random variables with Weibull distributions (α_RS, β, γ) and (α_RA, β, γ) respectively. To have means of RS − β and RA − β, our calculations for the means imply α_RS = (RS − β)/Γ(1 + 1/γ), α_RA = (RA − β)/Γ(1 + 1/γ). We need only calculate the probability that X exceeds Y, using the fact that the integral of a probability density is 1.

55 Proof of the Pythagorean Won-Loss Formula (cont) Prob(X > Y) = ∫_{x=β}^∞ ∫_{y=β}^{x} f(x; α_RS, β, γ) f(y; α_RA, β, γ) dy dx = ∫_{x=β}^∞ (γ/α_RS)((x−β)/α_RS)^{γ−1} e^{−((x−β)/α_RS)^γ} [∫_{y=β}^{x} (γ/α_RA)((y−β)/α_RA)^{γ−1} e^{−((y−β)/α_RA)^γ} dy] dx = ∫_{x=0}^∞ (γ/α_RS)(x/α_RS)^{γ−1} e^{−(x/α_RS)^γ} [1 − e^{−(x/α_RA)^γ}] dx = 1 − ∫_{x=0}^∞ (γ/α_RS)(x/α_RS)^{γ−1} e^{−(x/α)^γ} dx, where we have set 1/α^γ = 1/α_RS^γ + 1/α_RA^γ = (α_RS^γ + α_RA^γ)/(α_RS^γ α_RA^γ).

56 Proof of the Pythagorean Won-Loss Formula (cont) Prob(X > Y) = 1 − (α^γ/α_RS^γ) ∫_0^∞ (γ/α)(x/α)^{γ−1} e^{−(x/α)^γ} dx = 1 − α^γ/α_RS^γ = 1 − (1/α_RS^γ) · α_RS^γ α_RA^γ/(α_RS^γ + α_RA^γ) = α_RS^γ / (α_RS^γ + α_RA^γ).

57 Proof of the Pythagorean Won-Loss Formula (cont) Prob(X > Y) = 1 − (α^γ/α_RS^γ) ∫_0^∞ (γ/α)(x/α)^{γ−1} e^{−(x/α)^γ} dx = 1 − α^γ/α_RS^γ = 1 − (1/α_RS^γ) · α_RS^γ α_RA^γ/(α_RS^γ + α_RA^γ) = α_RS^γ / (α_RS^γ + α_RA^γ). We substitute the relations for α_RS and α_RA and find that Prob(X > Y) = (RS − β)^γ / ((RS − β)^γ + (RA − β)^γ). Note RS − β estimates RS_obs, RA − β estimates RA_obs.
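The theorem can also be checked by simulation: draw many (scored, allowed) pairs from shifted Weibulls and compare the empirical win frequency with (RS − β)^γ / ((RS − β)^γ + (RA − β)^γ). The run averages below are hypothetical; random.weibullvariate(α, γ) draws from a Weibull with scale α and shape γ:

```python
import random
from math import gamma as Gamma

beta, g = -0.5, 1.82                 # origin and shape from the theorem
RS, RA = 4.8, 4.2                    # hypothetical mean runs scored / allowed
a_rs = (RS - beta) / Gamma(1 + 1 / g)   # scales chosen so the means are RS, RA
a_ra = (RA - beta) / Gamma(1 + 1 / g)

random.seed(42)
N = 200_000
wins = sum(beta + random.weibullvariate(a_rs, g) >
           beta + random.weibullvariate(a_ra, g)
           for _ in range(N))
predicted = (RS - beta) ** g / ((RS - beta) ** g + (RA - beta) ** g)
print(wins / N, predicted)           # should agree to a couple of decimal places
```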

58 Analysis of 2004

59 Best Fit Weibulls to Data (Method of Maximum Likelihood) Plots of RS predicted vs observed and RA predicted vs observed for the Boston Red Sox. Using as bins [−.5, .5] ∪ [.5, 1.5] ∪ ⋯ ∪ [7.5, 8.5] ∪ [8.5, 9.5] ∪ [9.5, 11.5] ∪ [11.5, ∞).

60 Best Fit Weibulls to Data: Method of Least Squares Bin(k) is the k-th bin; RS_obs(k) (resp. RA_obs(k)) the observed number of games with the number of runs scored (allowed) in Bin(k); A(α, γ, k) the area under the Weibull with parameters (α, −1/2, γ) in Bin(k). Find the values of (α_RS, α_RA, γ) that minimize Σ_{k=1}^{#Bins} (RS_obs(k) − #Games · A(α_RS, γ, k))² + Σ_{k=1}^{#Bins} (RA_obs(k) − #Games · A(α_RA, γ, k))².
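A sketch of that objective (not the talk's actual fitting code): the Weibull area in a bin is a difference of CDF values, and the function below could be handed to any minimizer. The bin edges and counts are hypothetical inputs.

```python
from math import exp

def weibull_cdf(x, alpha, beta, g):
    """CDF of the three-parameter Weibull: 1 - exp(-((x-β)/α)^γ) for x >= β."""
    return 0.0 if x <= beta else 1.0 - exp(-(((x - beta) / alpha) ** g))

def bin_area(alpha, g, lo, hi, beta=-0.5):
    """Area A(α, γ, k) of the Weibull (with β = -1/2) over the bin [lo, hi)."""
    return weibull_cdf(hi, alpha, beta, g) - weibull_cdf(lo, alpha, beta, g)

def objective(alpha_rs, alpha_ra, g, rs_obs, ra_obs, edges, n_games):
    """Sum of squared differences between observed and expected bin counts."""
    total = 0.0
    for k in range(len(edges) - 1):
        total += (rs_obs[k] - n_games * bin_area(alpha_rs, g, edges[k], edges[k + 1])) ** 2
        total += (ra_obs[k] - n_games * bin_area(alpha_ra, g, edges[k], edges[k + 1])) ** 2
    return total
```

With data generated exactly from a Weibull, the objective vanishes at the true parameters, which is a quick way to sanity-check the implementation.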

61 Best Fit Weibulls to Data (Method of Maximum Likelihood) Plots of RS predicted vs observed and RA predicted vs observed for the Boston Red Sox. Using as bins [−.5, .5] ∪ [.5, 1.5] ∪ ⋯ ∪ [7.5, 8.5] ∪ [8.5, 9.5] ∪ [9.5, 11.5] ∪ [11.5, ∞).

62 Best Fit Weibulls to Data (Method of Maximum Likelihood) Plots of RS predicted vs observed and RA predicted vs observed for the New York Yankees. Using as bins [−.5, .5] ∪ [.5, 1.5] ∪ ⋯ ∪ [7.5, 8.5] ∪ [8.5, 9.5] ∪ [9.5, 11.5] ∪ [11.5, ∞).

63 Best Fit Weibulls to Data (Method of Maximum Likelihood) Plots of RS predicted vs observed and RA predicted vs observed for the Baltimore Orioles. Using as bins [−.5, .5] ∪ [.5, 1.5] ∪ ⋯ ∪ [7.5, 8.5] ∪ [8.5, 9.5] ∪ [9.5, 11.5] ∪ [11.5, ∞).

64 Best Fit Weibulls to Data (Method of Maximum Likelihood) Plots of RS predicted vs observed and RA predicted vs observed for the Tampa Bay Devil Rays. Using as bins [−.5, .5] ∪ [.5, 1.5] ∪ ⋯ ∪ [7.5, 8.5] ∪ [8.5, 9.5] ∪ [9.5, 11.5] ∪ [11.5, ∞).

65 Best Fit Weibulls to Data (Method of Maximum Likelihood) Plots of RS predicted vs observed and RA predicted vs observed for the Toronto Blue Jays. Using as bins [−.5, .5] ∪ [.5, 1.5] ∪ ⋯ ∪ [7.5, 8.5] ∪ [8.5, 9.5] ∪ [9.5, 11.5] ∪ [11.5, ∞).

66 Advanced Theory

67 Bonferroni Adjustments Fair coin: 1,000,000 flips, expect 500,000 heads.

68 Bonferroni Adjustments Fair coin: 1,000,000 flips, expect 500,000 heads. About 95% have 499,000 ≤ #Heads ≤ 501,000.

69 Bonferroni Adjustments Fair coin: 1,000,000 flips, expect 500,000 heads. About 95% have 499,000 ≤ #Heads ≤ 501,000. Consider N independent experiments, each flipping a fair coin 1,000,000 times. What is the probability that at least one of the set doesn't have 499,000 ≤ #Heads ≤ 501,000? Table: this probability for increasing N. See unlikely events happen as N increases!
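The table's entries are simple to regenerate: a single experiment lands in the window with probability about 0.9545 (a ±2σ event under the normal approximation), so with N independent experiments:

```python
p_inside = 0.9545   # Prob(499,000 <= #Heads <= 501,000) for one experiment
for n in (1, 5, 14, 50):
    # chance that at least one of the n experiments falls outside the window
    print(n, 1 - p_inside ** n)
```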

70 Data Analysis: χ² Tests (20 and 109 degrees of freedom) Table: for each of the 14 AL teams (Boston Red Sox through Seattle Mariners), the χ² statistic for the RS and RA fits (20 degrees of freedom) and for the independence test (109 degrees of freedom). 20 d.f.: critical values 31.41 (at the 95% level) and 37.57 (at the 99% level). 109 d.f.: critical values approximately 134.4 (at the 95% level) and 146.3 (at the 99% level). Bonferroni Adjustment (dividing the significance level by the 14 teams tested): 20 d.f.: roughly 41.1 (95%) and 46.5 (99%); 109 d.f.: roughly 152.9 (95%) and 162.3 (99%).

71 Data Analysis: Structural Zeros For independence of runs scored and allowed, use bins [0, 1) ∪ [1, 2) ∪ [2, 3) ∪ ⋯ ∪ [8, 9) ∪ [9, 10) ∪ [10, 11) ∪ [11, ∞). Have an r × c contingency table with structural zeros (runs scored and allowed per game are never equal). (Essentially) O_{r,r} = 0 for all r; use an iterative fitting procedure to obtain maximum likelihood estimators for E_{r,c} (the expected frequency of cell (r, c), assuming that, given runs scored and allowed are distinct, the runs scored and allowed are independent).

72 Summary

73 Testing the Model: Data from Method of Maximum Likelihood Table: observed wins, predicted wins, observed and predicted winning percentage, games difference and best-fit γ for each of the 14 AL teams (Boston Red Sox, New York Yankees, Baltimore Orioles, Tampa Bay Devil Rays, Toronto Blue Jays, Minnesota Twins, Chicago White Sox, Cleveland Indians, Detroit Tigers, Kansas City Royals, Los Angeles Angels, Oakland Athletics, Texas Rangers, Seattle Mariners). γ: mean = 1.74, standard deviation = .06, median = 1.76; close to the numerically observed value of 1.82.

74 Conclusions Find parameters such that Weibulls are good fits; Runs scored and allowed per game are statistically independent; Pythagorean Won Loss Formula is a consequence of our model; Best γ (both close to observed best 1.82): Method of Least Squares: 1.79; Method of Maximum Likelihood: 1.74.

75 Future Work Micro-analysis: runs scored and allowed aren't independent (big lead, close game), run production smaller for inter-league games in NL parks, ... Other sports: Does the same model work? Basketball has γ between 14 and 16.5. Closed forms: Are there other probability distributions that give integrals which can be determined in closed form? Valuing Runs: Pythagorean formula used to value players (10 runs equals 1 win); a better model leads to a better team.

76 Smoots Seize opportunities: Never know where they will lead.


78 Smoots Seize opportunities: Never know where they will lead. Oliver Smoot: Chairman of the American National Standards Institute (ANSI) from 2001 to 2002, President of the International Organization for Standardization (ISO) from 2003 to 2004.

79 References

80 References
Baxamusa, Sal: Weibull worksheet and run distribution plots for various teams.
Miller, Steven J.: A Derivation of James' Pythagorean Projection, By The Numbers: The Newsletter of the SABR Statistical Analysis Committee, vol. 16 (February 2006), no. 1.
Miller, Steven J.: A Derivation of the Pythagorean Won-Loss Formula in Baseball, Chance Magazine 20 (2007), no. 1.
Miller, Steven J. (with Taylor Corcoran, Jennifer Gossels, Victor Luo, Jaclyn Porfilio): Pythagoras at the Bat, book chapter in Social Networks and the Economics of Sports (organized by Victor Zamaraev), to be published by Springer-Verlag.
Luo, Victor: Relieving and Readjusting Pythagoras (senior thesis, 2014).

81 Appendices

82 Appendix I: Proof of the Pythagorean Won-Loss Formula Let X and Y be independent random variables with Weibull distributions (α_RS, β, γ) and (α_RA, β, γ) respectively. To have means of RS − β and RA − β, our calculations for the means imply α_RS = (RS − β)/Γ(1 + 1/γ), α_RA = (RA − β)/Γ(1 + 1/γ). We need only calculate the probability that X exceeds Y, using the fact that the integral of a probability density is 1.

83 Appendix I: Proof of the Pythagorean Won-Loss Formula (cont) Prob(X > Y) = ∫_{x=β}^∞ ∫_{y=β}^{x} f(x; α_RS, β, γ) f(y; α_RA, β, γ) dy dx = ∫_{x=β}^∞ (γ/α_RS)((x−β)/α_RS)^{γ−1} e^{−((x−β)/α_RS)^γ} [∫_{y=β}^{x} (γ/α_RA)((y−β)/α_RA)^{γ−1} e^{−((y−β)/α_RA)^γ} dy] dx = ∫_{x=0}^∞ (γ/α_RS)(x/α_RS)^{γ−1} e^{−(x/α_RS)^γ} [1 − e^{−(x/α_RA)^γ}] dx = 1 − ∫_{x=0}^∞ (γ/α_RS)(x/α_RS)^{γ−1} e^{−(x/α)^γ} dx, where we have set 1/α^γ = 1/α_RS^γ + 1/α_RA^γ = (α_RS^γ + α_RA^γ)/(α_RS^γ α_RA^γ).

84 Appendix I: Proof of the Pythagorean Won-Loss Formula (cont) Prob(X > Y) = 1 − (α^γ/α_RS^γ) ∫_0^∞ (γ/α)(x/α)^{γ−1} e^{−(x/α)^γ} dx = 1 − α^γ/α_RS^γ = 1 − (1/α_RS^γ) · α_RS^γ α_RA^γ/(α_RS^γ + α_RA^γ) = α_RS^γ / (α_RS^γ + α_RA^γ). We substitute the relations for α_RS and α_RA and find that Prob(X > Y) = (RS − β)^γ / ((RS − β)^γ + (RA − β)^γ). Note RS − β estimates RS_obs, RA − β estimates RA_obs.

85 Appendix II: Best Fit Weibulls and Structural Zeros The fits look good, but are they? Do χ²-tests: Let Bin(k) denote the k-th bin. O_{r,c}: the observed number of games where the team's runs scored is in Bin(r) and the runs allowed are in Bin(c). E_{r,c} = (Σ_{c'} O_{r,c'}) · (Σ_{r'} O_{r',c}) / #Games is the expected frequency of cell (r, c). Then Σ_{r=1}^{#Rows} Σ_{c=1}^{#Columns} (O_{r,c} − E_{r,c})² / E_{r,c} is approximately a χ² distribution with (#Rows − 1)(#Columns − 1) degrees of freedom.

86 Appendix II: Best Fit Weibulls and Structural Zeros (cont) For independence of runs scored and allowed, use bins [0, 1) ∪ [1, 2) ∪ [2, 3) ∪ ⋯ ∪ [8, 9) ∪ [9, 10) ∪ [10, 11) ∪ [11, ∞). Have an r × c contingency table (with r = c = 12); however, there are structural zeros (runs scored and allowed per game can never be equal). (Essentially) O_{r,r} = 0 for all r. We use an iterative fitting procedure to obtain maximum likelihood estimators for the E_{r,c}, the expected frequency of cell (r, c) under the assumption that, given that the runs scored and allowed are distinct, they are independent. For 1 ≤ r, c ≤ 12, let E^{(0)}_{r,c} = 1 if r ≠ c and 0 if r = c. Set X_{r,+} = Σ_{c=1}^{12} O_{r,c} and X_{+,c} = Σ_{r=1}^{12} O_{r,c}. Then E^{(ℓ)}_{r,c} = E^{(ℓ−1)}_{r,c} · X_{r,+} / Σ_{c'=1}^{12} E^{(ℓ−1)}_{r,c'} if ℓ is odd, and E^{(ℓ)}_{r,c} = E^{(ℓ−1)}_{r,c} · X_{+,c} / Σ_{r'=1}^{12} E^{(ℓ−1)}_{r',c} if ℓ is even. Then E_{r,c} = lim_{ℓ→∞} E^{(ℓ)}_{r,c}; the iterations converge very quickly. (If we had a complete two-dimensional contingency table, the iteration would reduce to the standard values E_{r,c} = (Σ_{c'} O_{r,c'}) · (Σ_{r'} O_{r',c}) / #Games.) Note the test statistic is Σ_{r=1}^{12} Σ_{c=1, c≠r}^{12} (O_{r,c} − E_{r,c})² / E_{r,c}.
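The iteration above is only a few lines of code; a sketch on a small hypothetical table (the real one is 12 × 12), alternating row and column rescaling until the margins match:

```python
def fit_structural_zeros(O, iters=200):
    """Iterative proportional fitting for a square table whose diagonal is a
    structural zero (runs scored can never equal runs allowed)."""
    n = len(O)
    row_tot = [sum(O[r]) for r in range(n)]
    col_tot = [sum(O[r][c] for r in range(n)) for c in range(n)]
    # start from 1 off the diagonal, 0 on it (the structural zeros)
    E = [[0.0 if r == c else 1.0 for c in range(n)] for r in range(n)]
    for step in range(iters):
        if step % 2 == 0:      # scale each row to match its observed total
            for r in range(n):
                s = sum(E[r])
                if s:
                    E[r] = [e * row_tot[r] / s for e in E[r]]
        else:                  # scale each column to match its observed total
            for c in range(n):
                s = sum(E[r][c] for r in range(n))
                if s:
                    for r in range(n):
                        E[r][c] *= col_tot[c] / s
    return E

O = [[0, 5, 3], [2, 0, 4], [6, 1, 0]]   # hypothetical 3x3 table, zero diagonal
E = fit_structural_zeros(O)
print([round(e, 2) for row in E for e in row])
```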

87 Appendix III: Central Limit Theorem Convolution of f and g: h(y) = (f ∗ g)(y) = ∫_R f(x) g(y − x) dx = ∫_R f(y − x) g(x) dx. X₁ and X₂ independent random variables with probability density p: Prob(X_i ∈ [x, x + Δx]) = ∫_x^{x+Δx} p(t) dt ≈ p(x)Δx. Prob(X₁ + X₂ ∈ [x, x + Δx]) = ∫_{x₁=−∞}^{∞} ∫_{x₂=x−x₁}^{x+Δx−x₁} p(x₁) p(x₂) dx₂ dx₁. As Δx → 0 we obtain the convolution of p with itself: Prob(X₁ + X₂ ∈ [a, b]) = ∫_a^b (p ∗ p)(z) dz. Exercise: show this is non-negative and integrates to 1.
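The discrete analogue of this convolution is immediate; as a sketch, convolving a fair die's distribution with itself gives the familiar two-dice distribution:

```python
def convolve(p, q):
    """Discrete convolution: distribution of X + Y for independent X ~ p, Y ~ q,
    where p[i] = Prob(X = i)."""
    h = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            h[i + j] += pi * qj
    return h

die = [0.0] + [1/6] * 6       # fair die: faces 1..6
two = convolve(die, die)
print(two[7])                 # Prob(sum = 7) = 6/36
```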

88 Appendix III: Statement of the Central Limit Theorem For simplicity, assume p has mean zero, variance one, finite third moment and is of sufficiently rapid decay so that all convolution integrals that arise converge: p an infinitely differentiable function satisfying ∫ x p(x) dx = 0, ∫ x² p(x) dx = 1, ∫ |x|³ p(x) dx < ∞. Assume X₁, X₂, ... are independent identically distributed random variables drawn from p. Define S_N = Σ_{i=1}^{N} X_i. The standard Gaussian (mean zero, variance one) is (1/√(2π)) e^{−x²/2}. Central Limit Theorem: Let X_i, S_N be as above and assume the third moment of each X_i is finite. Then S_N/√N converges in probability to the standard Gaussian: lim_{N→∞} Prob(S_N/√N ∈ [a, b]) = ∫_a^b (1/√(2π)) e^{−x²/2} dx.

89 Appendix III: Proof of the Central Limit Theorem The Fourier transform of p is p̂(y) = ∫ p(x) e^{−2πixy} dx. The derivative of ĝ is the Fourier transform of −2πix g(x); differentiation (hard) is converted to multiplication (easy): ĝ′(y) = ∫ −2πix g(x) e^{−2πixy} dx. If g is a probability density, ĝ′(0) = −2πi E[x] and ĝ″(0) = −4π² E[x²]. Natural to use the Fourier transform to analyze probability distributions. The mean and variance are simple multiples of the derivatives of p̂ at zero: p̂′(0) = 0, p̂″(0) = −4π². We Taylor expand p̂ (need technical conditions on p): p̂(y) = 1 + p̂″(0) y²/2 + ⋯ = 1 − 2π²y² + O(y³). Near the origin, the above shows p̂ looks like a concave down parabola.

90 Appendix III: Proof of the Central Limit Theorem (cont) Prob(X₁ + ⋯ + X_N ∈ [a, b]) = ∫_a^b (p ∗ ⋯ ∗ p)(z) dz. The Fourier transform converts convolution to multiplication: if FT[f](y) denotes the Fourier transform of f evaluated at y, then FT[p ∗ p](y) = p̂(y)². We do not want the distribution of X₁ + ⋯ + X_N, but rather that of S_N/√N = (X₁ + ⋯ + X_N)/√N. If B(x) = A(cx) for some fixed c ≠ 0, then B̂(y) = (1/c) Â(y/c). The density of S_N/√N at x is √N (p ∗ ⋯ ∗ p)(x√N), and FT[√N (p ∗ ⋯ ∗ p)(x√N)](y) = [p̂(y/√N)]^N.

91 Appendix III: Proof of the Central Limit Theorem (cont) We can thus find the Fourier transform of the distribution of S_N/√N: [p̂(y/√N)]^N. Take the limit as N → ∞ for fixed y. We know p̂(y) = 1 − 2π²y² + O(y³), so we study [1 − 2π²y²/N + O(y³/N^{3/2})]^N. For any fixed y, lim_{N→∞} [1 − 2π²y²/N + O(y³/N^{3/2})]^N = e^{−2π²y²}. The Fourier transform of e^{−2π²y²} at x is (1/√(2π)) e^{−x²/2}.
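The limit in the last step is the standard (1 + x/N)^N → e^x; a quick numeric check at a fixed y:

```python
from math import exp, pi

y = 0.3
for N in (10, 100, 10_000, 1_000_000):
    # ignore the O(y^3 / N^(3/2)) term: it vanishes in the limit
    print(N, (1 - 2 * pi**2 * y**2 / N) ** N)
print("limit:", exp(-2 * pi**2 * y**2))
```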

92 Appendix III: Proof of the Central Limit Theorem (cont) We have shown: the Fourier transform of the distribution of S_N/√N converges to e^{−2π²y²}; the Fourier transform of e^{−2π²y²} is (1/√(2π)) e^{−x²/2}. Therefore the distribution of S_N/√N converges to (1/√(2π)) e^{−x²/2}. We need complex analysis to justify this conclusion. Must be careful: consider g(x) = e^{−1/x²} if x ≠ 0, and g(0) = 0. All the Taylor coefficients about x = 0 are zero, but the function is not identically zero in a neighborhood of x = 0.

93 Appendix IV: Best Fit Weibulls from Method of Maximum Likelihood The likelihood function depends on α_RS, α_RA, β = −.5, γ. Let A(α, −.5, γ, k) denote the area in Bin(k) under the Weibull with parameters (α, −.5, γ). The sample likelihood function L(α_RS, α_RA, −.5, γ) is the multinomial coefficient (#Games choose RS_obs(1), ..., RS_obs(#Bins)) times Π_{k=1}^{#Bins} A(α_RS, −.5, γ, k)^{RS_obs(k)}, times (#Games choose RA_obs(1), ..., RA_obs(#Bins)) Π_{k=1}^{#Bins} A(α_RA, −.5, γ, k)^{RA_obs(k)}. For each team we find the values of the parameters α_RS, α_RA and γ that maximize the likelihood. Computationally, it is equivalent to maximize the logarithm of the likelihood, and we may ignore the multinomial coefficients as they are independent of the parameters.


3 Continuous Random Variables Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random

More information

Math 416 Lecture 3. The average or mean or expected value of x 1, x 2, x 3,..., x n is

Math 416 Lecture 3. The average or mean or expected value of x 1, x 2, x 3,..., x n is Math 416 Lecture 3 Expected values The average or mean or expected value of x 1, x 2, x 3,..., x n is x 1 x 2... x n n x 1 1 n x 2 1 n... x n 1 n 1 n x i p x i where p x i 1 n is the probability of x i

More information

What is a random variable

What is a random variable OKAN UNIVERSITY FACULTY OF ENGINEERING AND ARCHITECTURE MATH 256 Probability and Random Processes 04 Random Variables Fall 20 Yrd. Doç. Dr. Didem Kivanc Tureli didemk@ieee.org didem.kivanc@okan.edu.tr

More information

Discrete distribution. Fitting probability models to frequency data. Hypotheses for! 2 test. ! 2 Goodness-of-fit test

Discrete distribution. Fitting probability models to frequency data. Hypotheses for! 2 test. ! 2 Goodness-of-fit test Discrete distribution Fitting probability models to frequency data A probability distribution describing a discrete numerical random variable For example,! Number of heads from 10 flips of a coin! Number

More information

7 Random samples and sampling distributions

7 Random samples and sampling distributions 7 Random samples and sampling distributions 7.1 Introduction - random samples We will use the term experiment in a very general way to refer to some process, procedure or natural phenomena that produces

More information

MSc Mas6002, Introductory Material Mathematical Methods Exercises

MSc Mas6002, Introductory Material Mathematical Methods Exercises MSc Mas62, Introductory Material Mathematical Methods Exercises These exercises are on a range of mathematical methods which are useful in different parts of the MSc, especially calculus and linear algebra.

More information

Probability Distributions for Continuous Variables. Probability Distributions for Continuous Variables

Probability Distributions for Continuous Variables. Probability Distributions for Continuous Variables Probability Distributions for Continuous Variables Probability Distributions for Continuous Variables Let X = lake depth at a randomly chosen point on lake surface If we draw the histogram so that the

More information

Math/Stats 425, Sec. 1, Fall 04: Introduction to Probability. Final Exam: Solutions

Math/Stats 425, Sec. 1, Fall 04: Introduction to Probability. Final Exam: Solutions Math/Stats 45, Sec., Fall 4: Introduction to Probability Final Exam: Solutions. In a game, a contestant is shown two identical envelopes containing money. The contestant does not know how much money is

More information

Math 341: Probability Seventeenth Lecture (11/10/09)

Math 341: Probability Seventeenth Lecture (11/10/09) Math 341: Probability Seventeenth Lecture (11/10/09) Steven J Miller Williams College Steven.J.Miller@williams.edu http://www.williams.edu/go/math/sjmiller/ public html/341/ Bronfman Science Center Williams

More information

CHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS. 6.2 Normal Distribution. 6.1 Continuous Uniform Distribution

CHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS. 6.2 Normal Distribution. 6.1 Continuous Uniform Distribution CHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS Recall that a continuous random variable X is a random variable that takes all values in an interval or a set of intervals. The distribution of a continuous

More information

Cookie Monster Meets the Fibonacci Numbers. Mmmmmm Theorems!

Cookie Monster Meets the Fibonacci Numbers. Mmmmmm Theorems! Cookie Monster Meets the Fibonacci Numbers. Mmmmmm Theorems! Murat Koloğlu, Gene Kopp, Steven J. Miller and Yinghui Wang http://www.williams.edu/mathematics/sjmiller/public html College of the Holy Cross,

More information

IB Math Standard Level Year 1: Final Exam Review Alei - Desert Academy

IB Math Standard Level Year 1: Final Exam Review Alei - Desert Academy IB Math Standard Level Year : Final Exam Review Alei - Desert Academy 0- Standard Level Year Final Exam Review Name: Date: Class: You may not use a calculator on problems #- of this review.. Consider the

More information

Cookie Monster Meets the Fibonacci Numbers. Mmmmmm Theorems!

Cookie Monster Meets the Fibonacci Numbers. Mmmmmm Theorems! Cookie Monster Meets the Fibonacci Numbers. Mmmmmm Theorems! Murat Koloğlu, Gene Kopp, Steven J. Miller and Yinghui Wang http://www.williams.edu/mathematics/sjmiller/public html Smith College, January

More information

Convergence of Random Processes

Convergence of Random Processes Convergence of Random Processes DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Aim Define convergence for random

More information

Random Variables and Their Distributions

Random Variables and Their Distributions Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital

More information

Continuous random variables

Continuous random variables Continuous random variables Continuous r.v. s take an uncountably infinite number of possible values. Examples: Heights of people Weights of apples Diameters of bolts Life lengths of light-bulbs We cannot

More information

Mathematical Statistics 1 Math A 6330

Mathematical Statistics 1 Math A 6330 Mathematical Statistics 1 Math A 6330 Chapter 2 Transformations and Expectations Mohamed I. Riffi Department of Mathematics Islamic University of Gaza September 14, 2015 Outline 1 Distributions of Functions

More information

Swarthmore Honors Exam 2015: Statistics

Swarthmore Honors Exam 2015: Statistics Swarthmore Honors Exam 2015: Statistics 1 Swarthmore Honors Exam 2015: Statistics John W. Emerson, Yale University NAME: Instructions: This is a closed-book three-hour exam having 7 questions. You may

More information

Basics of Stochastic Modeling: Part II

Basics of Stochastic Modeling: Part II Basics of Stochastic Modeling: Part II Continuous Random Variables 1 Sandip Chakraborty Department of Computer Science and Engineering, INDIAN INSTITUTE OF TECHNOLOGY KHARAGPUR August 10, 2016 1 Reference

More information

Chapter 2 Continuous Distributions

Chapter 2 Continuous Distributions Chapter Continuous Distributions Continuous random variables For a continuous random variable X the probability distribution is described by the probability density function f(x), which has the following

More information

Probability. Machine Learning and Pattern Recognition. Chris Williams. School of Informatics, University of Edinburgh. August 2014

Probability. Machine Learning and Pattern Recognition. Chris Williams. School of Informatics, University of Edinburgh. August 2014 Probability Machine Learning and Pattern Recognition Chris Williams School of Informatics, University of Edinburgh August 2014 (All of the slides in this course have been adapted from previous versions

More information

STAT 801: Mathematical Statistics. Distribution Theory

STAT 801: Mathematical Statistics. Distribution Theory STAT 81: Mathematical Statistics Distribution Theory Basic Problem: Start with assumptions about f or CDF of random vector X (X 1,..., X p ). Define Y g(x 1,..., X p ) to be some function of X (usually

More information

Math 151. Rumbos Fall Solutions to Review Problems for Exam 2. Pr(X = 1) = ) = Pr(X = 2) = Pr(X = 3) = p X. (k) =

Math 151. Rumbos Fall Solutions to Review Problems for Exam 2. Pr(X = 1) = ) = Pr(X = 2) = Pr(X = 3) = p X. (k) = Math 5. Rumbos Fall 07 Solutions to Review Problems for Exam. A bowl contains 5 chips of the same size and shape. Two chips are red and the other three are blue. Draw three chips from the bowl at random,

More information

MATH 3510: PROBABILITY AND STATS July 1, 2011 FINAL EXAM

MATH 3510: PROBABILITY AND STATS July 1, 2011 FINAL EXAM MATH 3510: PROBABILITY AND STATS July 1, 2011 FINAL EXAM YOUR NAME: KEY: Answers in blue Show all your work. Answers out of the blue and without any supporting work may receive no credit even if they are

More information

2 Functions of random variables

2 Functions of random variables 2 Functions of random variables A basic statistical model for sample data is a collection of random variables X 1,..., X n. The data are summarised in terms of certain sample statistics, calculated as

More information

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019 Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial

More information

Includes games of Wednesday, March 4, 2015 Major League Baseball

Includes games of Wednesday, March 4, 2015 Major League Baseball Includes games of Wednesday, March 4, 2015 Major League Baseball American League Eastern Division Toronto Blue Jays 1 1.500 - Baltimore Orioles 0 2.000 1 New York Yankees 0 1.000 0.5 Boston Red Sox 0 0-1.000

More information

Brief Review of Probability

Brief Review of Probability Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions

More information

Chapter 4: Continuous Probability Distributions

Chapter 4: Continuous Probability Distributions Chapter 4: Continuous Probability Distributions Seungchul Baek Department of Statistics, University of South Carolina STAT 509: Statistics for Engineers 1 / 57 Continuous Random Variable A continuous random

More information

Unit 4 Probability. Dr Mahmoud Alhussami

Unit 4 Probability. Dr Mahmoud Alhussami Unit 4 Probability Dr Mahmoud Alhussami Probability Probability theory developed from the study of games of chance like dice and cards. A process like flipping a coin, rolling a die or drawing a card from

More information

Foundations of Probability and Statistics

Foundations of Probability and Statistics Foundations of Probability and Statistics William C. Rinaman Le Moyne College Syracuse, New York Saunders College Publishing Harcourt Brace College Publishers Fort Worth Philadelphia San Diego New York

More information

Practice Problems Section Problems

Practice Problems Section Problems Practice Problems Section 4-4-3 4-4 4-5 4-6 4-7 4-8 4-10 Supplemental Problems 4-1 to 4-9 4-13, 14, 15, 17, 19, 0 4-3, 34, 36, 38 4-47, 49, 5, 54, 55 4-59, 60, 63 4-66, 68, 69, 70, 74 4-79, 81, 84 4-85,

More information

Swarthmore Honors Exam 2012: Statistics

Swarthmore Honors Exam 2012: Statistics Swarthmore Honors Exam 2012: Statistics 1 Swarthmore Honors Exam 2012: Statistics John W. Emerson, Yale University NAME: Instructions: This is a closed-book three-hour exam having six questions. You may

More information

Chapter 24. Comparing Means

Chapter 24. Comparing Means Chapter 4 Comparing Means!1 /34 Homework p579, 5, 7, 8, 10, 11, 17, 31, 3! /34 !3 /34 Objective Students test null and alternate hypothesis about two!4 /34 Plot the Data The intuitive display for comparing

More information

Physics 6720 Introduction to Statistics April 4, 2017

Physics 6720 Introduction to Statistics April 4, 2017 Physics 6720 Introduction to Statistics April 4, 2017 1 Statistics of Counting Often an experiment yields a result that can be classified according to a set of discrete events, giving rise to an integer

More information

Continuous Random Variables. and Probability Distributions. Continuous Random Variables and Probability Distributions ( ) ( )

Continuous Random Variables. and Probability Distributions. Continuous Random Variables and Probability Distributions ( ) ( ) UCLA STAT 35 Applied Computational and Interactive Probability Instructor: Ivo Dinov, Asst. Prof. In Statistics and Neurology Teaching Assistant: Chris Barr Continuous Random Variables and Probability

More information

MAS223 Statistical Inference and Modelling Exercises

MAS223 Statistical Inference and Modelling Exercises MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,

More information

Continuous random variables

Continuous random variables Continuous random variables Can take on an uncountably infinite number of values Any value within an interval over which the variable is definied has some probability of occuring This is different from

More information

Probability theory for Networks (Part 1) CS 249B: Science of Networks Week 02: Monday, 02/04/08 Daniel Bilar Wellesley College Spring 2008

Probability theory for Networks (Part 1) CS 249B: Science of Networks Week 02: Monday, 02/04/08 Daniel Bilar Wellesley College Spring 2008 Probability theory for Networks (Part 1) CS 249B: Science of Networks Week 02: Monday, 02/04/08 Daniel Bilar Wellesley College Spring 2008 1 Review We saw some basic metrics that helped us characterize

More information

Continuous Random Variables and Continuous Distributions

Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Expectation & Variance of Continuous Random Variables ( 5.2) The Uniform Random Variable

More information

MATH 3C: MIDTERM 1 REVIEW. 1. Counting

MATH 3C: MIDTERM 1 REVIEW. 1. Counting MATH 3C: MIDTERM REVIEW JOE HUGHES. Counting. Imagine that a sports betting pool is run in the following way: there are 20 teams, 2 weeks, and each week you pick a team to win. However, you can t pick

More information

Statistics for Data Analysis. Niklaus Berger. PSI Practical Course Physics Institute, University of Heidelberg

Statistics for Data Analysis. Niklaus Berger. PSI Practical Course Physics Institute, University of Heidelberg Statistics for Data Analysis PSI Practical Course 2014 Niklaus Berger Physics Institute, University of Heidelberg Overview You are going to perform a data analysis: Compare measured distributions to theoretical

More information

From M&Ms to Mathematics, or, How I learned to answer questions and help my kids love math.

From M&Ms to Mathematics, or, How I learned to answer questions and help my kids love math. From M&Ms to Mathematics, or, How I learned to answer questions and help my kids love math. Steven J. Miller, Williams College sjm1@williams.edu, Steven.Miller.MC.96@aya.yale.edu (with Cameron and Kayla

More information

Continuous Random Variables. and Probability Distributions. Continuous Random Variables and Probability Distributions ( ) ( ) Chapter 4 4.

Continuous Random Variables. and Probability Distributions. Continuous Random Variables and Probability Distributions ( ) ( ) Chapter 4 4. UCLA STAT 11 A Applied Probability & Statistics for Engineers Instructor: Ivo Dinov, Asst. Prof. In Statistics and Neurology Teaching Assistant: Christopher Barr University of California, Los Angeles,

More information

Math 710 Homework 1. Austin Mohr September 2, 2010

Math 710 Homework 1. Austin Mohr September 2, 2010 Math 710 Homework 1 Austin Mohr September 2, 2010 1 For the following random experiments, describe the sample space Ω For each experiment, describe also two subsets (events) that might be of interest,

More information

Week 9 The Central Limit Theorem and Estimation Concepts

Week 9 The Central Limit Theorem and Estimation Concepts Week 9 and Estimation Concepts Week 9 and Estimation Concepts Week 9 Objectives 1 The Law of Large Numbers and the concept of consistency of averages are introduced. The condition of existence of the population

More information

From M&Ms to Mathematics, or, How I learned to answer questions and help my kids love math.

From M&Ms to Mathematics, or, How I learned to answer questions and help my kids love math. From M&Ms to Mathematics, or, How I learned to answer questions and help my kids love math. Steven J. Miller, Williams College Steven.J.Miller@williams.edu http://web.williams.edu/mathematics/sjmiller/public_html/

More information

Introduction to Probability

Introduction to Probability Introduction to Probability Math 353, Section 1 Fall 212 Homework 9 Solutions 1. During a season, a basketball team plays 7 home games and 6 away games. The coach estimates at the beginning of the season

More information

Discrete Distributions

Discrete Distributions A simplest example of random experiment is a coin-tossing, formally called Bernoulli trial. It happens to be the case that many useful distributions are built upon this simplest form of experiment, whose

More information

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015.

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015. EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015 Midterm Exam Last name First name SID Rules. You have 80 mins (5:10pm - 6:30pm)

More information

From M&Ms to Mathematics, or, How I learned to answer questions and help my kids love math.

From M&Ms to Mathematics, or, How I learned to answer questions and help my kids love math. From M&Ms to Mathematics, or, How I learned to answer questions and help my kids love math. Cameron, Kayla and Steven J. Miller, Williams College sjm1@williams.edu, Steven.Miller.MC.96@aya.yale.edu http://web.williams.edu/mathematics/sjmiller/public_html

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is

More information

Lecture The Sample Mean and the Sample Variance Under Assumption of Normality

Lecture The Sample Mean and the Sample Variance Under Assumption of Normality Math 408 - Mathematical Statistics Lecture 13-14. The Sample Mean and the Sample Variance Under Assumption of Normality February 20, 2013 Konstantin Zuev (USC) Math 408, Lecture 13-14 February 20, 2013

More information

Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14

Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14 Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional

More information

Page Max. Possible Points Total 100

Page Max. Possible Points Total 100 Math 3215 Exam 2 Summer 2014 Instructor: Sal Barone Name: GT username: 1. No books or notes are allowed. 2. You may use ONLY NON-GRAPHING and NON-PROGRAMABLE scientific calculators. All other electronic

More information

Statistics for scientists and engineers

Statistics for scientists and engineers Statistics for scientists and engineers February 0, 006 Contents Introduction. Motivation - why study statistics?................................... Examples..................................................3

More information

Math 122L. Additional Homework Problems. Prepared by Sarah Schott

Math 122L. Additional Homework Problems. Prepared by Sarah Schott Math 22L Additional Homework Problems Prepared by Sarah Schott Contents Review of AP AB Differentiation Topics 4 L Hopital s Rule and Relative Rates of Growth 6 Riemann Sums 7 Definition of the Definite

More information

Tom Salisbury

Tom Salisbury MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom

More information

Week 1 Quantitative Analysis of Financial Markets Distributions A

Week 1 Quantitative Analysis of Financial Markets Distributions A Week 1 Quantitative Analysis of Financial Markets Distributions A Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036 October

More information

Topic 3: The Expectation of a Random Variable

Topic 3: The Expectation of a Random Variable Topic 3: The Expectation of a Random Variable Course 003, 2017 Page 0 Expectation of a discrete random variable Definition (Expectation of a discrete r.v.): The expected value (also called the expectation

More information

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables Chapter 2 Some Basic Probability Concepts 2.1 Experiments, Outcomes and Random Variables A random variable is a variable whose value is unknown until it is observed. The value of a random variable results

More information

ISyE 6739 Test 1 Solutions Summer 2015

ISyE 6739 Test 1 Solutions Summer 2015 1 NAME ISyE 6739 Test 1 Solutions Summer 2015 This test is 100 minutes long. You are allowed one cheat sheet. 1. (50 points) Short-Answer Questions (a) What is any subset of the sample space called? Solution:

More information

1 Solution to Problem 2.1

1 Solution to Problem 2.1 Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto

More information

Department of Statistical Science FIRST YEAR EXAM - SPRING 2017

Department of Statistical Science FIRST YEAR EXAM - SPRING 2017 Department of Statistical Science Duke University FIRST YEAR EXAM - SPRING 017 Monday May 8th 017, 9:00 AM 1:00 PM NOTES: PLEASE READ CAREFULLY BEFORE BEGINNING EXAM! 1. Do not write solutions on the exam;

More information

Example continued. Math 425 Intro to Probability Lecture 37. Example continued. Example

Example continued. Math 425 Intro to Probability Lecture 37. Example continued. Example continued : Coin tossing Math 425 Intro to Probability Lecture 37 Kenneth Harris kaharri@umich.edu Department of Mathematics University of Michigan April 8, 2009 Consider a Bernoulli trials process with

More information

Chapter 5. Continuous Probability Distributions

Chapter 5. Continuous Probability Distributions Continuous Probability Distributions - 06 Chapter 5. Continuous Probability Distributions 5.. Introduction In this chapter, we introduce some of the common probability density functions (PDFs) for continuous

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems

More information

The Chi-Square Distributions

The Chi-Square Distributions MATH 183 The Chi-Square Distributions Dr. Neal, WKU The chi-square distributions can be used in statistics to analyze the standard deviation σ of a normally distributed measurement and to test the goodness

More information

Sample Spaces, Random Variables

Sample Spaces, Random Variables Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted

More information