The Pythagorean Won-Loss Theorem: An introduction to modeling
1 The Pythagorean Won-Loss Theorem: An Introduction to Modeling. Steven J. Miller, Brown University (sjmiller@math.brown.edu). Williams College, January 15, 2008.
2 Acknowledgements: Sal Baxamusa, Phil Birnbaum, Ray Ciccolella, Steve Johnston, Michelle Manes, Russ Mann, students of Math 162 at Brown, and Chris Long and the San Diego Padres. Dedicated to my great uncle Newt Bromberg (a lifetime Red Sox fan who promised me that I would live to see a World Series Championship in Boston).
3-7 Goals of the Talk
- Derive James' Pythagorean Won-Loss formula from a reasonable model.
- Introduce some of the techniques of modeling.
- Discuss the mathematics behind the models and model testing.
- Show how advanced theory enters in simple problems.
- Further avenues for research for students.
8-9 Numerical Observation: Pythagorean Won-Loss Formula
Parameters:
- $RS_{\rm obs}$: average number of runs scored per game;
- $RA_{\rm obs}$: average number of runs allowed per game;
- $\gamma$: some parameter, constant for a sport.
James' Won-Loss Formula (NUMERICAL observation):
$\text{Won-Loss Percentage} = \frac{RS_{\rm obs}^\gamma}{RS_{\rm obs}^\gamma + RA_{\rm obs}^\gamma}.$
$\gamma$ was originally taken to be 2; numerical studies show the best $\gamma$ is about 1.82.
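The formula above takes only two numbers per team. A minimal sketch in Python; the per-game run totals used in the example are illustrative values, not actual season figures.

```python
def pythag_win_pct(rs_per_game, ra_per_game, gamma=1.82):
    """James' Pythagorean won-loss formula with exponent gamma."""
    return rs_per_game**gamma / (rs_per_game**gamma + ra_per_game**gamma)

# A hypothetical team scoring 5.9 and allowing 4.8 runs per game
# projects to roughly a .590 winning percentage:
print(round(pythag_win_pct(5.9, 4.8), 3))
```

Multiplying the percentage by 162 gives a projected win total for a full season.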
10-12 Applications of the Pythagorean Won-Loss Formula
- Extrapolation: use it half-way through a season to predict a team's performance.
- Evaluation: see if teams consistently over-perform or under-perform.
- Advantage: other statistics/formulas exist (e.g., run-differential per game); this one is easy to use and depends on only two simple numbers for a team.
13-16 Probability Review
- Probability density: $p(x) \ge 0$ and $\int_{-\infty}^{\infty} p(x)\,dx = 1$; for $X$ a random variable with density $p(x)$, $\text{Prob}(X \in [a,b]) = \int_a^b p(x)\,dx$.
- Mean: $\mu = \int_{-\infty}^{\infty} x\,p(x)\,dx$.
- Variance: $\sigma^2 = \int_{-\infty}^{\infty} (x-\mu)^2\,p(x)\,dx$.
- Independence: two random variables are independent if knowledge of one does not give knowledge of the other.
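These definitions can be checked numerically. A sketch using the exponential density $p(x) = e^{-x}$ on $[0,\infty)$ (which has mean 1 and variance 1), with a simple midpoint-rule integrator; the truncation at 50 is an assumption that the tail is negligible.

```python
import math

def riemann(f, a, b, n=200000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

p = lambda x: math.exp(-x)                       # exponential density
total = riemann(p, 0, 50)                        # should be ~1
mu = riemann(lambda x: x * p(x), 0, 50)          # mean, should be ~1
var = riemann(lambda x: (x - mu)**2 * p(x), 0, 50)  # variance, ~1
print(total, mu, var)
```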
17-18 Modeling the Real World
Guidelines for Modeling:
- The model should capture the key features of the system;
- The model should be mathematically tractable (solvable).
In general these are conflicting goals. How should we try to model baseball games?
19-20 Modeling the Real World (cont)
Possible Model:
- Runs Scored and Runs Allowed are independent random variables;
- $f_{RS}(x)$, $g_{RA}(y)$: probability density functions for runs scored (allowed).
The problem is reduced to calculating
$\int_{x} \Big[ \int_{y \le x} f_{RS}(x)\, g_{RA}(y)\,dy \Big]\,dx \quad \text{or} \quad \sum_i \sum_{j < i} f_{RS}(i)\, g_{RA}(j).$

21-22 Problems with the Model
- Can the integral (or sum) be computed in closed form?
- Are the runs scored and allowed really independent random variables?
- What are $f_{RS}$ and $g_{RA}$?
23 Three-Parameter Weibull
Weibull distribution:
$f(x; \alpha, \beta, \gamma) = \begin{cases} \frac{\gamma}{\alpha}\left(\frac{x-\beta}{\alpha}\right)^{\gamma-1} e^{-((x-\beta)/\alpha)^\gamma} & \text{if } x \ge \beta \\ 0 & \text{otherwise.} \end{cases}$
- $\alpha$: scale (variance: meters versus centimeters);
- $\beta$: origin (mean: translation, zero point);
- $\gamma$: shape (behavior near $\beta$ and at infinity).
Various values give different shapes, but can we find $\alpha, \beta, \gamma$ such that it fits observed data? Is the Weibull theoretically tractable?
24 Weibull Plots: Parameters $(\alpha, \beta, \gamma)$
Red: $(1, 0, 1)$ (the exponential); Green: $(1, 0, 2)$; Cyan: $(1, 2, 2)$; Blue: $(1, 2, 4)$.
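A quick numerical check of the plotted parameter choices, assuming the density defined on the previous slide: each is a genuine probability density (integrates to 1), and $(1,0,1)$ reduces to the exponential $e^{-x}$.

```python
import math

def weibull_pdf(x, alpha, beta, gamma):
    """Three-parameter Weibull density f(x; alpha, beta, gamma)."""
    if x < beta:
        return 0.0
    t = (x - beta) / alpha
    return (gamma / alpha) * t ** (gamma - 1) * math.exp(-t ** gamma)

def integrate(f, a, b, n=100000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

params = [(1, 0, 1), (1, 0, 2), (1, 2, 2), (1, 2, 4)]
# Each density integrates to ~1 (truncating the tail at beta + 40).
areas = [integrate(lambda x, p=p: weibull_pdf(x, *p), p[1], p[1] + 40)
         for p in params]
print([round(a, 4) for a in areas])
```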
25 Gamma Distribution
For $s \in \mathbb{C}$ with $\text{Re}(s) > 0$, define the $\Gamma$-function:
$\Gamma(s) = \int_0^\infty e^{-u} u^{s-1}\,du = \int_0^\infty e^{-u} u^s \,\frac{du}{u}.$
It generalizes the factorial function: $\Gamma(n) = (n-1)!$ for $n \ge 1$ an integer.
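The factorial identity is easy to confirm with Python's built-in `math.gamma`:

```python
import math

# Gamma generalizes the factorial: Gamma(n) = (n-1)! for integer n >= 1.
for n in range(1, 10):
    g, f = math.gamma(n), math.factorial(n - 1)
    assert abs(g - f) <= 1e-9 * f

# A famous non-integer value: Gamma(1/2)^2 = pi.
print(math.gamma(0.5) ** 2, math.pi)
```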
26-31 Weibull Integrations
$\mu_{\alpha,\beta,\gamma} = \int_\beta^\infty x \cdot \frac{\gamma}{\alpha}\left(\frac{x-\beta}{\alpha}\right)^{\gamma-1} e^{-((x-\beta)/\alpha)^\gamma}\,dx$
$\qquad = \int_\beta^\infty \alpha \cdot \frac{x-\beta}{\alpha} \cdot \frac{\gamma}{\alpha}\left(\frac{x-\beta}{\alpha}\right)^{\gamma-1} e^{-((x-\beta)/\alpha)^\gamma}\,dx + \beta.$
Change variables: $u = \left(\frac{x-\beta}{\alpha}\right)^\gamma$. Then $du = \frac{\gamma}{\alpha}\left(\frac{x-\beta}{\alpha}\right)^{\gamma-1}\,dx$, and
$\mu_{\alpha,\beta,\gamma} = \int_0^\infty \alpha\, u^{1/\gamma} e^{-u}\,du + \beta = \alpha \int_0^\infty e^{-u}\, u^{1 + \gamma^{-1}}\,\frac{du}{u} + \beta = \alpha\,\Gamma(1 + \gamma^{-1}) + \beta.$
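The closed form $\mu = \alpha\,\Gamma(1 + \gamma^{-1}) + \beta$ can be verified numerically. A sketch for one illustrative parameter choice ($\alpha = 2$, $\beta = -0.5$, $\gamma = 1.8$ are assumptions, not values from the talk):

```python
import math

def weibull_pdf(x, alpha, beta, gamma):
    """Three-parameter Weibull density."""
    if x < beta:
        return 0.0
    t = (x - beta) / alpha
    return (gamma / alpha) * t ** (gamma - 1) * math.exp(-t ** gamma)

alpha, beta, gamma = 2.0, -0.5, 1.8
# Midpoint-rule approximation of the mean integral over [beta, beta + 60].
n, lo, hi = 200000, beta, beta + 60
h = (hi - lo) / n
numeric_mean = sum(
    (lo + (i + 0.5) * h) * weibull_pdf(lo + (i + 0.5) * h, alpha, beta, gamma)
    for i in range(n)) * h
closed_form = alpha * math.gamma(1 + 1 / gamma) + beta
print(numeric_mean, closed_form)  # the two should agree closely
```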
32-33 Pythagorean Won-Loss Formula
Theorem (Pythagorean Won-Loss Formula): Let the runs scored and allowed per game be two independent random variables drawn from Weibull distributions with parameters $(\alpha_{RS}, \beta, \gamma)$ and $(\alpha_{RA}, \beta, \gamma)$, where $\alpha_{RS}$ and $\alpha_{RA}$ are chosen so that the means are $RS$ and $RA$. If $\gamma > 0$ then
$\text{Won-Loss Percentage}(RS, RA, \beta, \gamma) = \frac{(RS - \beta)^\gamma}{(RS - \beta)^\gamma + (RA - \beta)^\gamma}.$
In baseball take $\beta = -1/2$ (from the fact that runs must be integers, binned in intervals $[n - 1/2, n + 1/2)$). Then $RS - \beta$ estimates the average runs scored and $RA - \beta$ estimates the average runs allowed.
34-37 Best Fit Weibulls to Data: Method of Least Squares
- Bin(k) is the k-th bin;
- $RS_{\rm obs}(k)$ (resp. $RA_{\rm obs}(k)$) is the observed number of games with the number of runs scored (allowed) in Bin(k);
- $A(\alpha, \beta, \gamma, k)$ is the area under the Weibull with parameters $(\alpha, \beta, \gamma)$ in Bin(k).
Find the values of $(\alpha_{RS}, \alpha_{RA}, \gamma)$ that minimize
$\sum_{k=1}^{\#\text{Bins}} \big(RS_{\rm obs}(k) - \#\text{Games} \cdot A(\alpha_{RS}, -1/2, \gamma, k)\big)^2 + \sum_{k=1}^{\#\text{Bins}} \big(RA_{\rm obs}(k) - \#\text{Games} \cdot A(\alpha_{RA}, -1/2, \gamma, k)\big)^2.$
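A minimal sketch of this least-squares fit, assuming synthetic data: the "observed" histogram is generated from a known Weibull so we can check that the fit recovers it, the bin edges are illustrative, and a coarse grid search stands in for a real optimizer. For brevity only the runs-scored half of the objective is minimized.

```python
import math

def weibull_cdf(x, alpha, beta, gamma):
    """CDF of the three-parameter Weibull: 1 - exp(-((x-beta)/alpha)^gamma)."""
    if x <= beta:
        return 0.0
    return 1.0 - math.exp(-(((x - beta) / alpha) ** gamma))

def bin_area(alpha, beta, gamma, lo, hi):
    """Area A(alpha, beta, gamma, k) of the Weibull in a bin [lo, hi)."""
    return weibull_cdf(hi, alpha, beta, gamma) - weibull_cdf(lo, alpha, beta, gamma)

beta, games = -0.5, 162
edges = [-0.5, 0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5, 7.5, 8.5, 9.5, 11.5, float("inf")]

# Synthetic "observed" counts drawn exactly from a true Weibull.
true_alpha, true_gamma = 5.0, 1.8
obs = [games * bin_area(true_alpha, beta, true_gamma, lo, hi)
       for lo, hi in zip(edges, edges[1:])]

# Coarse grid search over (alpha, gamma) minimizing the sum of squares.
best = None
for a10 in range(30, 71):            # alpha from 3.0 to 7.0
    for g100 in range(140, 221):     # gamma from 1.40 to 2.20
        a, g = a10 / 10, g100 / 100
        sq = sum((o - games * bin_area(a, beta, g, lo, hi)) ** 2
                 for o, lo, hi in zip(obs, edges, edges[1:]))
        if best is None or sq < best[0]:
            best = (sq, a, g)
print(best)  # recovers (alpha, gamma) near (5.0, 1.8)
```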
38-42 Best Fit Weibulls to Data (Method of Maximum Likelihood)
Plots of RS predicted vs. observed and RA predicted vs. observed for the Boston Red Sox, New York Yankees, Baltimore Orioles, Tampa Bay Devil Rays, and Toronto Blue Jays (one slide per team), using as bins $[-.5, .5) \cup [.5, 1.5) \cup \cdots \cup [7.5, 8.5) \cup [8.5, 9.5) \cup [9.5, 11.5) \cup [11.5, \infty)$.
43-45 Bonferroni Adjustments
- Fair coin: 1,000,000 flips, expect 500,000 heads.
- About 95% of the time, $499{,}000 \le \#\text{Heads} \le 501{,}000$.
- Consider $N$ independent experiments, each flipping a fair coin 1,000,000 times. What is the probability that at least one of the set does not have $499{,}000 \le \#\text{Heads} \le 501{,}000$? (The slide's table of this probability versus $N$ is not reproduced in the transcription.) We see unlikely events happen as $N$ increases!
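The missing table is easy to regenerate in spirit, assuming each experiment stays within the band with probability about 0.9545 (it is a $\pm 2\sigma$ band: $\sigma = \sqrt{10^6 \cdot 0.5 \cdot 0.5} = 500$). Independence gives $1 - 0.9545^N$ for the chance of at least one excursion:

```python
# Probability that at least one of N independent experiments falls
# outside [499000, 501000], assuming each stays inside w.p. ~0.9545.
p_in = 0.9545
probs = {n: 1 - p_in ** n for n in (1, 5, 10, 50, 100)}
for n, p in probs.items():
    print(n, round(p, 4))
```

Already at $N = 50$ a "rare" excursion is overwhelmingly likely, which is why multiple-comparison thresholds must be adjusted.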
46 Data Analysis: χ² Tests
Table (numerical values lost in transcription): for each of the 14 AL teams (Boston Red Sox, New York Yankees, Baltimore Orioles, Tampa Bay Devil Rays, Toronto Blue Jays, Minnesota Twins, Chicago White Sox, Cleveland Indians, Detroit Tigers, Kansas City Royals, Los Angeles Angels, Oakland Athletics, Texas Rangers, Seattle Mariners): the fitted RS and RA parameters, the χ² statistic for the Weibull fits (20 degrees of freedom), and the χ² statistic for independence (109 degrees of freedom). The slide also lists the 95% and 99% critical thresholds for 20 and 109 degrees of freedom, both with and without the Bonferroni adjustment.
47 Data Analysis: Structural Zeros
For independence of runs scored and allowed, use bins $[0,1) \cup [1,2) \cup [2,3) \cup \cdots \cup [8,9) \cup [9,10) \cup [10,11) \cup [11,\infty)$. We have an $r \times c$ contingency table with structural zeros (runs scored and allowed per game are never equal): (essentially) $O_{r,r} = 0$ for all $r$. Use an iterative fitting procedure to obtain maximum likelihood estimators for $E_{r,c}$ (the expected frequency of cell $(r,c)$, assuming that, given runs scored and allowed are distinct, the runs scored and allowed are independent).
48 Testing the Model: Data from Method of Maximum Likelihood
Table (numerical values lost in transcription): observed wins, predicted wins, observed and predicted winning percentage, difference in games, and best-fit $\gamma$ for each of the 14 AL teams.
$\gamma$: mean = 1.74, standard deviation = .06, median = 1.76; close to the numerically observed value of 1.82.
49-52 Conclusions
- We find parameters such that the Weibulls are good fits;
- Runs scored and allowed per game are statistically independent;
- The Pythagorean Won-Loss Formula is a consequence of our model;
- Best $\gamma$ (both close to the observed best of 1.82): Method of Least Squares: 1.79; Method of Maximum Likelihood: 1.74.
53 Future Work
- Micro-analysis: runs scored and allowed are not entirely independent (big leads, close games); run production is smaller for inter-league games in NL parks; et cetera.
- Other sports: Does the same model work? How does $\gamma$ depend on the sport?
- Closed forms: Are there other probability distributions that give integrals which can be determined in closed form?
- Valuing runs: the Pythagorean formula is used to value players (10 runs equals 1 win); a better model leads to a better team.
54 References
- Baxamusa, Sal: Weibull worksheet; run distribution plots for various teams.
- Miller, Steven J.: A derivation of James' Pythagorean projection, By The Numbers - The Newsletter of the SABR Statistical Analysis Committee, vol. 16 (February 2006), no. 1.
- Miller, Steven J.: A derivation of the Pythagorean Won-Loss Formula in baseball, Chance Magazine 20 (2007), no. 1. Preprint: sjmiller/math/papers/pythagwonloss Paper.pdf
55 Appendix I: Proof of the Pythagorean Won-Loss Formula
Let $X$ and $Y$ be independent random variables with Weibull distributions $(\alpha_{RS}, \beta, \gamma)$ and $(\alpha_{RA}, \beta, \gamma)$ respectively. To have means of $RS$ and $RA$, our calculations for the mean imply
$\alpha_{RS} = \frac{RS - \beta}{\Gamma(1 + \gamma^{-1})}, \qquad \alpha_{RA} = \frac{RA - \beta}{\Gamma(1 + \gamma^{-1})}.$
We need only calculate the probability that $X$ exceeds $Y$. We use the fact that the integral of a probability density is 1.
56-57 Appendix I: Proof of the Pythagorean Won-Loss Formula (cont)
$\text{Prob}(X > Y) = \int_{x=\beta}^{\infty} \int_{y=\beta}^{x} f(x; \alpha_{RS}, \beta, \gamma)\, f(y; \alpha_{RA}, \beta, \gamma)\,dy\,dx$
$= \int_{x=\beta}^{\infty} \frac{\gamma}{\alpha_{RS}} \left(\frac{x-\beta}{\alpha_{RS}}\right)^{\gamma-1} e^{-((x-\beta)/\alpha_{RS})^\gamma} \left[ \int_{y=\beta}^{x} \frac{\gamma}{\alpha_{RA}} \left(\frac{y-\beta}{\alpha_{RA}}\right)^{\gamma-1} e^{-((y-\beta)/\alpha_{RA})^\gamma}\,dy \right] dx$
(translating $x$ and $y$ by $\beta$)
$= \int_{x=0}^{\infty} \frac{\gamma}{\alpha_{RS}} \left(\frac{x}{\alpha_{RS}}\right)^{\gamma-1} e^{-(x/\alpha_{RS})^\gamma} \left[ 1 - e^{-(x/\alpha_{RA})^\gamma} \right] dx$
$= 1 - \int_{x=0}^{\infty} \frac{\gamma}{\alpha_{RS}} \left(\frac{x}{\alpha_{RS}}\right)^{\gamma-1} e^{-(x/\alpha)^\gamma}\,dx,$
where we have set
$\frac{1}{\alpha^\gamma} = \frac{1}{\alpha_{RS}^\gamma} + \frac{1}{\alpha_{RA}^\gamma} = \frac{\alpha_{RS}^\gamma + \alpha_{RA}^\gamma}{\alpha_{RS}^\gamma\, \alpha_{RA}^\gamma}.$
Continuing,
$\text{Prob}(X > Y) = 1 - \frac{\alpha^\gamma}{\alpha_{RS}^\gamma} \int_0^\infty \frac{\gamma}{\alpha}\left(\frac{x}{\alpha}\right)^{\gamma-1} e^{-(x/\alpha)^\gamma}\,dx = 1 - \frac{\alpha^\gamma}{\alpha_{RS}^\gamma} = 1 - \frac{\alpha_{RA}^\gamma}{\alpha_{RS}^\gamma + \alpha_{RA}^\gamma} = \frac{\alpha_{RS}^\gamma}{\alpha_{RS}^\gamma + \alpha_{RA}^\gamma}.$
We substitute the relations for $\alpha_{RS}$ and $\alpha_{RA}$ and find that
$\text{Prob}(X > Y) = \frac{(RS - \beta)^\gamma}{(RS - \beta)^\gamma + (RA - \beta)^\gamma}.$
Note $RS - \beta$ estimates $RS_{\rm obs}$ and $RA - \beta$ estimates $RA_{\rm obs}$.
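The closed form $\text{Prob}(X > Y) = \alpha_{RS}^\gamma / (\alpha_{RS}^\gamma + \alpha_{RA}^\gamma)$ can be sanity-checked by Monte Carlo, using the $\beta = 0$ case for simplicity; the scale values 5.0 and 4.0 are illustrative. Python's `random.weibullvariate(alpha, beta)` takes the scale first and the shape second.

```python
import random

random.seed(0)
g, a_rs, a_ra = 1.82, 5.0, 4.0
trials = 200000
# X ~ Weibull(scale a_rs, shape g), Y ~ Weibull(scale a_ra, shape g).
wins = sum(random.weibullvariate(a_rs, g) > random.weibullvariate(a_ra, g)
           for _ in range(trials))
mc = wins / trials
closed = a_rs**g / (a_rs**g + a_ra**g)
print(mc, closed)  # the Monte Carlo estimate should match the closed form
```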
58 Appendix II: Best Fit Weibulls and Structural Zeros
The fits look good, but are they? Do χ²-tests. Let Bin(k) denote the k-th bin, and let $O_{r,c}$ be the observed number of games where the team's runs scored is in Bin(r) and the runs allowed are in Bin(c). With
$E_{r,c} = \frac{\left(\sum_c O_{r,c}\right)\left(\sum_r O_{r,c}\right)}{\#\text{Games}}$
the expected frequency of cell $(r,c)$,
$\sum_{r=1}^{\#\text{Rows}} \sum_{c=1}^{\#\text{Columns}} \frac{(O_{r,c} - E_{r,c})^2}{E_{r,c}}$
is a χ² distribution with $(\#\text{Rows} - 1)(\#\text{Columns} - 1)$ degrees of freedom.
59 Appendix II: Best Fit Weibulls and Structural Zeros (cont)
For independence of runs scored and allowed, use bins $[0,1) \cup [1,2) \cup [2,3) \cup \cdots \cup [8,9) \cup [9,10) \cup [10,11) \cup [11,\infty)$. We have an $r \times c$ contingency table (with $r = c = 12$); however, there are structural zeros (runs scored and allowed per game can never be equal): (essentially) $O_{r,r} = 0$ for all $r$. We use an iterative fitting procedure to obtain maximum likelihood estimators for the $E_{r,c}$, the expected frequency of cell $(r,c)$ under the assumption that, given that the runs scored and allowed are distinct, the runs scored and allowed are independent.
For $1 \le r, c \le 12$, let $E^{(0)}_{r,c} = 1$ if $r \ne c$ and $0$ if $r = c$. Set
$X_{r,+} = \sum_{c=1}^{12} O_{r,c}, \qquad X_{+,c} = \sum_{r=1}^{12} O_{r,c}.$
Then
$E^{(\ell)}_{r,c} = \begin{cases} E^{(\ell-1)}_{r,c}\, X_{r,+} \big/ \sum_{c=1}^{12} E^{(\ell-1)}_{r,c} & \text{if } \ell \text{ is odd} \\ E^{(\ell-1)}_{r,c}\, X_{+,c} \big/ \sum_{r=1}^{12} E^{(\ell-1)}_{r,c} & \text{if } \ell \text{ is even,} \end{cases}$
and $E_{r,c} = \lim_{\ell \to \infty} E^{(\ell)}_{r,c}$; the iterations converge very quickly. (If we had a complete two-dimensional contingency table, then the iteration would reduce to the standard values, namely $E_{r,c} = \left(\sum_c O_{r,c}\right)\left(\sum_r O_{r,c}\right)/\#\text{Games}$.) The test statistic is
$\sum_{r=1}^{12} \sum_{\substack{c=1 \\ c \ne r}}^{12} \frac{(O_{r,c} - E_{r,c})^2}{E_{r,c}}.$
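The iteration above can be sketched directly; the 3x3 table of counts below is hypothetical (the talk uses 12x12), chosen only so the structural-zero diagonal and the margin-matching behavior are easy to see.

```python
def ipf(obs, iters=200):
    """Iterative proportional fitting with structural zeros on the diagonal.

    Odd steps rescale rows to match the row totals X_{r,+};
    even steps rescale columns to match the column totals X_{+,c}.
    """
    n = len(obs)
    row_tot = [sum(obs[r][c] for c in range(n)) for r in range(n)]
    col_tot = [sum(obs[r][c] for r in range(n)) for c in range(n)]
    # E^(0): 1 off the diagonal, 0 on it (zeros are preserved forever).
    E = [[0.0 if r == c else 1.0 for c in range(n)] for r in range(n)]
    for step in range(1, iters + 1):
        if step % 2 == 1:
            for r in range(n):
                s = sum(E[r])
                for c in range(n):
                    E[r][c] *= row_tot[r] / s
        else:
            for c in range(n):
                s = sum(E[r][c] for r in range(n))
                for r in range(n):
                    E[r][c] *= col_tot[c] / s
    return E

obs = [[0, 12, 7], [9, 0, 5], [6, 8, 0]]   # hypothetical counts, zero diagonal
E = ipf(obs)
print([[round(e, 3) for e in row] for row in E])
```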
60 Appendix III: Central Limit Theorem
Convolution of $f$ and $g$:
$h(y) = (f * g)(y) = \int_{\mathbb{R}} f(x)\, g(y - x)\,dx = \int_{\mathbb{R}} f(y - x)\, g(x)\,dx.$
Let $X_1$ and $X_2$ be independent random variables with probability density $p$. Then
$\text{Prob}(X_i \in [x, x + \Delta x]) = \int_x^{x + \Delta x} p(t)\,dt \approx p(x)\,\Delta x,$
$\text{Prob}(X_1 + X_2 \in [x, x + \Delta x]) = \int_{x_1 = -\infty}^{\infty} \int_{x_2 = x - x_1}^{x + \Delta x - x_1} p(x_1)\, p(x_2)\,dx_2\,dx_1.$
As $\Delta x \to 0$ we obtain the convolution of $p$ with itself:
$\text{Prob}(X_1 + X_2 \in [a, b]) = \int_a^b (p * p)(z)\,dz.$
Exercise: show $p * p$ is non-negative and integrates to 1.
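A discrete analogue makes the convolution concrete: for two independent fair dice, $(p * p)(s)$ gives the familiar triangular distribution of the sum on 2 through 12.

```python
# Discrete convolution: distribution of the sum of two fair dice.
p = {k: 1 / 6 for k in range(1, 7)}
conv = {}
for i, pi in p.items():
    for j, pj in p.items():
        conv[i + j] = conv.get(i + j, 0.0) + pi * pj
print(conv[7])  # the most likely sum, probability 6/36
```

Note that `conv` is non-negative and its values sum to 1, exactly the properties the exercise asks about in the continuous case.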
61 Appendix III: Statement of the Central Limit Theorem
For simplicity, assume $p$ has mean zero, variance one, finite third moment, and is of sufficiently rapid decay so that all convolution integrals that arise converge: $p$ is an infinitely differentiable function satisfying
$\int_{-\infty}^{\infty} x\, p(x)\,dx = 0, \qquad \int_{-\infty}^{\infty} x^2\, p(x)\,dx = 1, \qquad \int_{-\infty}^{\infty} |x|^3\, p(x)\,dx < \infty.$
Assume $X_1, X_2, \ldots$ are independent identically distributed random variables drawn from $p$, and define $S_N = \sum_{i=1}^N X_i$. The standard Gaussian (mean zero, variance one) is $\frac{1}{\sqrt{2\pi}} e^{-x^2/2}$.
Central Limit Theorem: Let $X_i$, $S_N$ be as above, and assume the third moment of each $X_i$ is finite. Then $S_N / \sqrt{N}$ converges in distribution to the standard Gaussian:
$\lim_{N \to \infty} \text{Prob}\left( \frac{S_N}{\sqrt{N}} \in [a, b] \right) = \frac{1}{\sqrt{2\pi}} \int_a^b e^{-x^2/2}\,dx.$
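The statement is easy to see in simulation. A sketch, assuming mean-zero, variance-one uniforms on $[-\sqrt{3}, \sqrt{3}]$ (my choice, not the talk's): the normalized sum should fall in $[-1, 1]$ about $\Phi(1) - \Phi(-1) \approx 68\%$ of the time.

```python
import math
import random

random.seed(1)
N, trials = 50, 10000
half_width = math.sqrt(3.0)   # uniform on [-sqrt(3), sqrt(3)] has variance 1
inside = 0
for _ in range(trials):
    s = sum(random.uniform(-half_width, half_width) for _ in range(N))
    if abs(s / math.sqrt(N)) <= 1.0:
        inside += 1
frac = inside / trials
print(frac)  # close to 0.68, the Gaussian probability of [-1, 1]
```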
62 Appendix III: Proof of the Central Limit Theorem
The Fourier transform of $p$ is
$\hat{p}(y) = \int_{-\infty}^{\infty} p(x)\, e^{-2\pi i x y}\,dx.$
The derivative of $\hat{g}$ is the Fourier transform of $-2\pi i x\, g(x)$; differentiation (hard) is converted to multiplication (easy):
$\hat{g}'(y) = \int_{-\infty}^{\infty} -2\pi i x\, g(x)\, e^{-2\pi i x y}\,dx.$
If $g$ is a probability density, $\hat{g}'(0) = -2\pi i\, \mathbb{E}[x]$ and $\hat{g}''(0) = -4\pi^2\, \mathbb{E}[x^2]$. It is therefore natural to use the Fourier transform to analyze probability distributions. The mean and variance are simple multiples of the derivatives of $\hat{p}$ at zero: $\hat{p}'(0) = 0$, $\hat{p}''(0) = -4\pi^2$. We Taylor expand $\hat{p}$ (this needs technical conditions on $p$):
$\hat{p}(y) = 1 + \frac{\hat{p}''(0)}{2}\, y^2 + \cdots = 1 - 2\pi^2 y^2 + O(y^3).$
Near the origin, the above shows $\hat{p}$ looks like a concave-down parabola.
63 Appendix III: Proof of the Central Limit Theorem (cont)
$\text{Prob}(X_1 + \cdots + X_N \in [a, b]) = \int_a^b (p * \cdots * p)(z)\,dz.$
The Fourier transform converts convolution to multiplication. If $\text{FT}[f](y)$ denotes the Fourier transform of $f$ evaluated at $y$:
$\text{FT}[p * p](y) = \hat{p}(y) \cdot \hat{p}(y).$
We do not want the distribution of $X_1 + \cdots + X_N = x$, but rather that of $S_N = \frac{X_1 + \cdots + X_N}{\sqrt{N}} = x$. If $B(x) = A(cx)$ for some fixed $c \ne 0$, then $\hat{B}(y) = \frac{1}{c}\, \hat{A}\!\left(\frac{y}{c}\right)$. Thus
$\text{Prob}\left( \frac{X_1 + \cdots + X_N}{\sqrt{N}} = x \right) = (\sqrt{N}\, p * \cdots * \sqrt{N}\, p)(x \sqrt{N}),$
$\text{FT}\left[ (\sqrt{N}\, p * \cdots * \sqrt{N}\, p)(x \sqrt{N}) \right](y) = \left[ \hat{p}\!\left( \frac{y}{\sqrt{N}} \right) \right]^N.$
64 Appendix III: Proof of the Central Limit Theorem (cont)
We can thus find the Fourier transform of the distribution of $S_N$:
$\left[ \hat{p}\!\left( \frac{y}{\sqrt{N}} \right) \right]^N.$
Take the limit as $N \to \infty$ for fixed $y$. We know $\hat{p}(y) = 1 - 2\pi^2 y^2 + O(y^3)$, so we study
$\left[ 1 - \frac{2\pi^2 y^2}{N} + O\!\left( \frac{y^3}{N^{3/2}} \right) \right]^N.$
For any fixed $y$,
$\lim_{N \to \infty} \left[ 1 - \frac{2\pi^2 y^2}{N} + O\!\left( \frac{y^3}{N^{3/2}} \right) \right]^N = e^{-2\pi^2 y^2}.$
The Fourier transform of $e^{-2\pi^2 y^2}$ at $x$ is $\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$.
65 Appendix III: Proof of the Central Limit Theorem (cont)
We have shown that the Fourier transform of the distribution of $S_N$ converges to $e^{-2\pi^2 y^2}$, and that the Fourier transform of $e^{-2\pi^2 y^2}$ is $\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$. Therefore the distribution of $S_N$ converges to $\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$. We need complex analysis to justify this conclusion. One must be careful: consider
$g(x) = \begin{cases} e^{-1/x^2} & \text{if } x \ne 0 \\ 0 & \text{if } x = 0. \end{cases}$
All the Taylor coefficients about $x = 0$ are zero, but the function is not identically zero in a neighborhood of $x = 0$.
66 Appendix IV: Best Fit Weibulls from Method of Maximum Likelihood
The likelihood function depends on $\alpha_{RS}$, $\alpha_{RA}$, $\beta = -.5$, and $\gamma$. Let $A(\alpha, -.5, \gamma, k)$ denote the area in Bin(k) of the Weibull with parameters $(\alpha, -.5, \gamma)$. The sample likelihood function $L(\alpha_{RS}, \alpha_{RA}, -.5, \gamma)$ is
$\binom{\#\text{Games}}{RS_{\rm obs}(1), \ldots, RS_{\rm obs}(\#\text{Bins})} \prod_{k=1}^{\#\text{Bins}} A(\alpha_{RS}, -.5, \gamma, k)^{RS_{\rm obs}(k)}$
$\cdot \binom{\#\text{Games}}{RA_{\rm obs}(1), \ldots, RA_{\rm obs}(\#\text{Bins})} \prod_{k=1}^{\#\text{Bins}} A(\alpha_{RA}, -.5, \gamma, k)^{RA_{\rm obs}(k)}.$
For each team we find the values of the parameters $\alpha_{RS}$, $\alpha_{RA}$, and $\gamma$ that maximize the likelihood. Computationally, it is equivalent to maximize the logarithm of the likelihood, and we may ignore the multinomial coefficients as they are independent of the parameters.
9.2.1.1 Understand the definition of a function. Use functional notation and evaluate a function at a given point in its domain. For example: If f x 1, find f(-4). x2 3 10, Algebra Understand the concept
More informationContinuous Random Variables. and Probability Distributions. Continuous Random Variables and Probability Distributions ( ) ( ) Chapter 4 4.
UCLA STAT 11 A Applied Probability & Statistics for Engineers Instructor: Ivo Dinov, Asst. Prof. In Statistics and Neurology Teaching Assistant: Christopher Barr University of California, Los Angeles,
More informationTom Salisbury
MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom
More informationProbability theory for Networks (Part 1) CS 249B: Science of Networks Week 02: Monday, 02/04/08 Daniel Bilar Wellesley College Spring 2008
Probability theory for Networks (Part 1) CS 249B: Science of Networks Week 02: Monday, 02/04/08 Daniel Bilar Wellesley College Spring 2008 1 Review We saw some basic metrics that helped us characterize
More information3 Continuous Random Variables
Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random
More information1 Solution to Problem 2.1
Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto
More informationEECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015.
EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015 Midterm Exam Last name First name SID Rules. You have 80 mins (5:10pm - 6:30pm)
More informationProbability Distributions for Continuous Variables. Probability Distributions for Continuous Variables
Probability Distributions for Continuous Variables Probability Distributions for Continuous Variables Let X = lake depth at a randomly chosen point on lake surface If we draw the histogram so that the
More informationBemidji Area Schools Outcomes in Mathematics Algebra 2 Applications. Based on Minnesota Academic Standards in Mathematics (2007) Page 1 of 7
9.2.1.1 Understand the definition of a function. Use functional notation and evaluate a function at a given point in its domain. For example: If f x 1, find f(-4). x2 3 Understand the concept of function,
More informationMAS223 Statistical Inference and Modelling Exercises
MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,
More informationMath 151. Rumbos Fall Solutions to Review Problems for Exam 2. Pr(X = 1) = ) = Pr(X = 2) = Pr(X = 3) = p X. (k) =
Math 5. Rumbos Fall 07 Solutions to Review Problems for Exam. A bowl contains 5 chips of the same size and shape. Two chips are red and the other three are blue. Draw three chips from the bowl at random,
More informationLecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019
Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial
More informationStatistics for Data Analysis. Niklaus Berger. PSI Practical Course Physics Institute, University of Heidelberg
Statistics for Data Analysis PSI Practical Course 2014 Niklaus Berger Physics Institute, University of Heidelberg Overview You are going to perform a data analysis: Compare measured distributions to theoretical
More informationContinuous Random Variables and Continuous Distributions
Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Expectation & Variance of Continuous Random Variables ( 5.2) The Uniform Random Variable
More informationBasics of Stochastic Modeling: Part II
Basics of Stochastic Modeling: Part II Continuous Random Variables 1 Sandip Chakraborty Department of Computer Science and Engineering, INDIAN INSTITUTE OF TECHNOLOGY KHARAGPUR August 10, 2016 1 Reference
More informationMath 122L. Additional Homework Problems. Prepared by Sarah Schott
Math 22L Additional Homework Problems Prepared by Sarah Schott Contents Review of AP AB Differentiation Topics 4 L Hopital s Rule and Relative Rates of Growth 6 Riemann Sums 7 Definition of the Definite
More informationMath 341: Probability Eighth Lecture (10/6/09)
Math 341: Probability Eighth Lecture (10/6/09) Steven J Miller Williams College Steven.J.Miller@williams.edu http://www.williams.edu/go/math/sjmiller/ public html/341/ Bronfman Science Center Williams
More informationThe Chi-Square Distributions
MATH 183 The Chi-Square Distributions Dr. Neal, WKU The chi-square distributions can be used in statistics to analyze the standard deviation σ of a normally distributed measurement and to test the goodness
More informationCHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS. 6.2 Normal Distribution. 6.1 Continuous Uniform Distribution
CHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS Recall that a continuous random variable X is a random variable that takes all values in an interval or a set of intervals. The distribution of a continuous
More informationExercises and Answers to Chapter 1
Exercises and Answers to Chapter The continuous type of random variable X has the following density function: a x, if < x < a, f (x), otherwise. Answer the following questions. () Find a. () Obtain mean
More informationContinuous Random Variables
MATH 38 Continuous Random Variables Dr. Neal, WKU Throughout, let Ω be a sample space with a defined probability measure P. Definition. A continuous random variable is a real-valued function X defined
More informationBrief Review of Probability
Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions
More informationIE 230 Probability & Statistics in Engineering I. Closed book and notes. 60 minutes.
Closed book and notes. 60 minutes. A summary table of some univariate continuous distributions is provided. Four Pages. In this version of the Key, I try to be more complete than necessary to receive full
More informationLecture 8: Continuous random variables, expectation and variance
Lecture 8: Continuous random variables, expectation and variance Lejla Batina Institute for Computing and Information Sciences Digital Security Version: autumn 2013 Lejla Batina Version: autumn 2013 Wiskunde
More informationIB Math Standard Level Year 1: Final Exam Review Alei - Desert Academy
IB Math Standard Level Year : Final Exam Review Alei - Desert Academy 0- Standard Level Year Final Exam Review Name: Date: Class: You may not use a calculator on problems #- of this review.. Consider the
More informationProbability. Machine Learning and Pattern Recognition. Chris Williams. School of Informatics, University of Edinburgh. August 2014
Probability Machine Learning and Pattern Recognition Chris Williams School of Informatics, University of Edinburgh August 2014 (All of the slides in this course have been adapted from previous versions
More informationMSc Mas6002, Introductory Material Mathematical Methods Exercises
MSc Mas62, Introductory Material Mathematical Methods Exercises These exercises are on a range of mathematical methods which are useful in different parts of the MSc, especially calculus and linear algebra.
More informationDiscrete Distributions
A simplest example of random experiment is a coin-tossing, formally called Bernoulli trial. It happens to be the case that many useful distributions are built upon this simplest form of experiment, whose
More informationPhysics 6720 Introduction to Statistics April 4, 2017
Physics 6720 Introduction to Statistics April 4, 2017 1 Statistics of Counting Often an experiment yields a result that can be classified according to a set of discrete events, giving rise to an integer
More information18.440: Lecture 19 Normal random variables
18.440 Lecture 19 18.440: Lecture 19 Normal random variables Scott Sheffield MIT Outline Tossing coins Normal random variables Special case of central limit theorem Outline Tossing coins Normal random
More informationStatistics for scientists and engineers
Statistics for scientists and engineers February 0, 006 Contents Introduction. Motivation - why study statistics?................................... Examples..................................................3
More informationChapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University
Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real
More informationChapter 4: Continuous Probability Distributions
Chapter 4: Continuous Probability Distributions Seungchul Baek Department of Statistics, University of South Carolina STAT 509: Statistics for Engineers 1 / 57 Continuous Random Variable A continuous random
More informationOrder Statistics and Distributions
Order Statistics and Distributions 1 Some Preliminary Comments and Ideas In this section we consider a random sample X 1, X 2,..., X n common continuous distribution function F and probability density
More informationConvergence of Random Processes
Convergence of Random Processes DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Aim Define convergence for random
More informationIntroduction to Probability
Introduction to Probability Math 353, Section 1 Fall 212 Homework 9 Solutions 1. During a season, a basketball team plays 7 home games and 6 away games. The coach estimates at the beginning of the season
More information18.175: Lecture 17 Poisson random variables
18.175: Lecture 17 Poisson random variables Scott Sheffield MIT 1 Outline More on random walks and local CLT Poisson random variable convergence Extend CLT idea to stable random variables 2 Outline More
More informationChapter 1 Principles of Probability
Chapter Principles of Probability Combining independent probabilities You have applied to three medical schools: University of California at San Francisco (UCSF), Duluth School of Mines (DSM), and Harvard
More informationMath 576: Quantitative Risk Management
Math 576: Quantitative Risk Management Haijun Li lih@math.wsu.edu Department of Mathematics Washington State University Week 11 Haijun Li Math 576: Quantitative Risk Management Week 11 1 / 21 Outline 1
More informationData Analysis and Monte Carlo Methods
Lecturer: Allen Caldwell, Max Planck Institute for Physics & TUM Recitation Instructor: Oleksander (Alex) Volynets, MPP & TUM General Information: - Lectures will be held in English, Mondays 16-18:00 -
More informationSTAT 450: Statistical Theory. Distribution Theory. Reading in Casella and Berger: Ch 2 Sec 1, Ch 4 Sec 1, Ch 4 Sec 6.
STAT 450: Statistical Theory Distribution Theory Reading in Casella and Berger: Ch 2 Sec 1, Ch 4 Sec 1, Ch 4 Sec 6. Example: Why does t-statistic have t distribution? Ingredients: Sample X 1,...,X n from
More information7 Random samples and sampling distributions
7 Random samples and sampling distributions 7.1 Introduction - random samples We will use the term experiment in a very general way to refer to some process, procedure or natural phenomena that produces
More informationLecture 4. David Aldous. 2 September David Aldous Lecture 4
Lecture 4 David Aldous 2 September 2015 The specific examples I m discussing are not so important; the point of these first lectures is to illustrate a few of the 100 ideas from STAT134. Ideas used in
More informationOther Continuous Probability Distributions
CHAPTER Probability, Statistics, and Reliability for Engineers and Scientists Second Edition PROBABILITY DISTRIBUTION FOR CONTINUOUS RANDOM VARIABLES A. J. Clar School of Engineering Department of Civil
More informationWeek 1 Quantitative Analysis of Financial Markets Distributions A
Week 1 Quantitative Analysis of Financial Markets Distributions A Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036 October
More informationProbability 1 (MATH 11300) lecture slides
Probability 1 (MATH 11300) lecture slides Márton Balázs School of Mathematics University of Bristol Autumn, 2015 December 16, 2015 To know... http://www.maths.bris.ac.uk/ mb13434/prob1/ m.balazs@bristol.ac.uk
More informationComputational Cognitive Science
Computational Cognitive Science Lecture 8: Frank Keller School of Informatics University of Edinburgh keller@inf.ed.ac.uk Based on slides by Sharon Goldwater October 14, 2016 Frank Keller Computational
More informationFoundations of Probability and Statistics
Foundations of Probability and Statistics William C. Rinaman Le Moyne College Syracuse, New York Saunders College Publishing Harcourt Brace College Publishers Fort Worth Philadelphia San Diego New York
More informationSome Statistics. V. Lindberg. May 16, 2007
Some Statistics V. Lindberg May 16, 2007 1 Go here for full details An excellent reference written by physicists with sample programs available is Data Reduction and Error Analysis for the Physical Sciences,
More informationGeneralized Linear Models and Exponential Families
Generalized Linear Models and Exponential Families David M. Blei COS424 Princeton University April 12, 2012 Generalized Linear Models x n y n β Linear regression and logistic regression are both linear
More information