Physics 1140 Lecture 6: Gaussian Distributions


Physics 1140 Lecture 6: Gaussian Distributions
February 21/22, 2008
Homework #3 due Monday, 5 PM
You should have taken data for Lab 3 this week; the lab is due Tues. Mar. 4, 5:30 PM
The final (at the end of lectures) is next week in your correct (Thursday or Friday) lecture section

Exam Next Week (Feb 28 or 29)
Come to your regular Thursday or Friday lecture section.
Bring a scientific calculator and one (1) sheet of paper on which you can hand-write anything about the course (no xeroxes or printouts).
The material covered is Chapters 1-5 of Taylor and what I presented in the lectures. Most of what I covered in lecture is in Chapters 1-5 of Taylor (except for some pretty basic probability and correlated errors). No correlated errors on the exam.
I can't promise there won't be some multiple choice on it, but the vast majority will not be multiple choice (show your work for partial credit!!).
My office hours (Wed 2-4) next week will be in the 1140 lab area (not my office), answering any questions you may have about exam material.

Review of last week: Standard Deviation
A series of measurements of a quantity x is taken with apparatus A; the distribution of the A results is shown. Apparatus A is then replaced with apparatus B, and a new series of measurements of the same quantity is taken; the B results are shown (same x_min and x_max).
CQ1: Which series of measurements has the smaller standard deviation σ?
A. A
B. B
C. Both A and B have the same standard deviation
D. Impossible to tell from the information given
Answer: B. The B distribution has a width about 1/2 that of A.

Review of last week: Error on the Mean
The standard deviation of distribution A is twice that of B; however, there are 4 times as many measurements in A as in B.
CQ2: Which series of measurements has the smaller error (or uncertainty) on the mean, σ_μ?
A. A
B. B
C. Both A and B have the same error on the mean
D. Impossible to tell from the information given
Answer: C. σ_μ(A) = σ_{N-1}(A)/√N_A = 2σ_{N-1}(B)/√(4N_B) = σ_{N-1}(B)/√N_B = σ_μ(B)
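The scaling in CQ2 can be checked numerically. Here is a minimal Python sketch (mine, not from the lecture; the seed, the true mean of 0, and the number of repetitions are arbitrary choices) that repeatedly draws sample means from apparatus A (σ = 2, N = 400) and apparatus B (σ = 1, N = 100) and compares the spreads of those means:

```python
import random
import statistics

random.seed(0)  # arbitrary seed, for reproducibility only

def sample_mean(sigma, n):
    """Mean of n Gaussian measurements with true mean 0 and width sigma."""
    return statistics.fmean(random.gauss(0.0, sigma) for _ in range(n))

# Apparatus A: sigma twice B's, but 4 times as many measurements per mean
means_A = [sample_mean(2.0, 400) for _ in range(2000)]
means_B = [sample_mean(1.0, 100) for _ in range(2000)]

# Both spreads should come out near sigma/sqrt(N) = 0.1
print(statistics.stdev(means_A), statistics.stdev(means_B))
```

Both printed spreads come out near 0.1, i.e., the two apparatuses give the same error on the mean, as the answer to CQ2 states.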

μ_TRUE, σ_TRUE Do Not Change (Though Our Estimates of Them May)
We assume that our measurements are drawn from a distribution with properties μ_TRUE and σ_TRUE (often called the parent distribution).
Errors are random, and if we draw only a few measurements, μ_MEAS and σ_{N-1} will most likely not equal μ_TRUE and σ_TRUE.
But if the parent distribution is a Gaussian, we can make the probabilistic predictions that hold for all Gaussian distributions (e.g., 68% probability that μ_TRUE lies between [μ_MEAS - σ_μ, μ_MEAS + σ_μ], ...).

Functional Form of the Gaussian Distribution
G(x) = [1/(σ√(2π))] e^(-(x-μ)²/(2σ²))
The equation is symmetric around μ (the mean): G(μ - Δ) = G(μ + Δ).
The factor of 1/(σ√(2π)) ensures that ∫_{-∞}^{+∞} G(x) dx = 1.
We often use the unnormalized form f(x) = A e^(-(x-μ)²/(2σ²)), so f(μ) = A and ∫_{-∞}^{+∞} f(x) dx = Aσ√(2π) (the area of the unnormalized Gaussian).
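As a quick numerical check of the symmetry and normalization claims (not part of the lecture; the μ and σ values and step count are arbitrary), the following Python sketch evaluates G(x) and integrates it over ±10σ:

```python
import math

def gaussian(x, mu, sigma):
    """G(x) = 1/(sigma*sqrt(2*pi)) * exp(-(x-mu)^2 / (2*sigma^2))."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

mu, sigma = 5.0, 2.0  # arbitrary example values

# Symmetry about the mean: G(mu - d) == G(mu + d)
assert abs(gaussian(mu - 1.3, mu, sigma) - gaussian(mu + 1.3, mu, sigma)) < 1e-15

# Riemann-sum check of the normalization over +/- 10 sigma
n_steps = 100_000
dx = 20 * sigma / n_steps
area = sum(gaussian(mu - 10 * sigma + i * dx, mu, sigma) for i in range(n_steps)) * dx
print(area)  # ~1.0
```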

Calculating Probabilities with Gaussians
Two weeks ago we introduced the quantity t = |x_predicted - x_measured| / σ.
If x_predicted has an uncertainty associated with it, or it is another measurement we expect x_measured to be identical to, remember to add the errors in quadrature if they are uncorrelated (σ = √(σ²_predicted + σ²_measured)); otherwise σ = σ_measured.
Probability(t > 1.0) = 0.3173, Prob(t > 1.5) = 0.1336, Prob(t > 2.0) = 0.0455, Prob(t > 3.0) = 0.0027, ...
These are difficult to calculate (not on most calculators), but the results hold for all Gaussians. See Appendix A in Taylor for tables.
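The tabulated tail probabilities can be reproduced with the standard-library complementary error function, since for a Gaussian the two-sided tail probability is P(t > t₀) = erfc(t₀/√2). A minimal Python sketch (mine, not from Taylor):

```python
import math

def two_sided_tail(t):
    """P(|x - mu| > t*sigma) for a Gaussian: the two-sided tail probability."""
    return math.erfc(t / math.sqrt(2))

for t in (1.0, 1.5, 2.0, 3.0):
    print(f"P(t > {t}) = {two_sided_tail(t):.4f}")
# P(t > 1.0) = 0.3173
# P(t > 1.5) = 0.1336
# P(t > 2.0) = 0.0455
# P(t > 3.0) = 0.0027
```

These match the slide's values, and the same function works for any t, not just the tabulated ones.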

Which σ to Use?
σ_μ = σ_{N-1}/√N
σ_{N-1} tells us how widely individual measurements (like 1 race) are distributed.
σ_μ tells us how widely measurements of the mean are distributed.
CQ3: A quantity x is measured 100 times and the mean μ and the standard deviation σ are determined. What is the probability that the 101st measurement of x will give a value x > μ + σ?
A. 0
B. 1
C. 0.32
D. 0.16
Answer: D. The fraction of the area of a Gaussian between [μ - σ, μ + σ] is 68%; the fraction above μ + σ is 16%, and the fraction below μ - σ is 16%.

Full Width at Half Maximum (of a Gaussian)
G(x) = [1/(σ√(2π))] e^(-(x-μ)²/(2σ²)) (normalized version)
At the peak, G(μ) = 1/(σ√(2π)).
At what positions x_{±½} is the value of the function half this [G(x_{±½}) = 1/(2σ√(2π))]?
1/(2σ√(2π)) = [1/(σ√(2π))] e^(-(x-μ)²/(2σ²)) → 1/2 = e^(-(x-μ)²/(2σ²))
-(x-μ)²/(2σ²) = ln(1/2) = -ln(2) → (x-μ)² = 2σ² ln(2)
x_{±½} = μ ± √(2σ² ln(2))
FWHM = x_{+½} - x_{-½} = 2σ√(2 ln(2)) ≈ 2.35σ
So you can estimate the σ of a Gaussian curve by looking for the 2 points where the curve is 1/2 its maximum, and dividing the (x) difference by 2.35.
Just like we can estimate μ from the x value of the peak.
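The derivation above is easy to verify numerically. This Python sketch (the μ and σ values are arbitrary) checks that the function really falls to half its peak at μ + σ√(2 ln 2), and that FWHM/σ ≈ 2.35:

```python
import math

def gaussian(x, mu, sigma):
    """Normalized Gaussian, as on the slide."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

mu, sigma = 0.0, 3.0  # arbitrary example values
peak = gaussian(mu, mu, sigma)

# Half-maximum point from the derivation: x = mu + sigma*sqrt(2*ln 2)
x_half = mu + sigma * math.sqrt(2 * math.log(2))
print(gaussian(x_half, mu, sigma) / peak)  # ~0.5, by construction

# FWHM = 2*sigma*sqrt(2*ln 2)
fwhm = 2 * sigma * math.sqrt(2 * math.log(2))
print(fwhm / sigma)  # ~2.3548, the 2.35 factor quoted on the slide
```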

Standard Deviation CQ
We could also estimate σ from the peak value of G(x) [σ = 1/(G(μ)√(2π))], but this requires that the Gaussian is properly normalized (which it often is not) and that you know that (which you often don't).
The FWHM method does not require a normalized Gaussian.
CQ4: What, approximately, is the standard deviation σ of this Gaussian?
A. 1
B. 50
C. 120
D. 250
Answer: B. FWHM ~ 120, so σ = FWHM/2.35 ~ 50.

Where Do Gaussians Come From?
There is probably nothing less Gaussian-looking than the distribution of outcomes of many flips of a fair coin.
Assign 1 to a HEADS and 0 to a TAILS; the plot shows the expected outcome. Your exact results might vary, but we can predict the statistical behavior.
Break the experiment down into 1600 sets of 4 contiguous coin flips and add up the score in each set.
There is only 1 way to get a score of 0 → TTTT. Same for a score of 4 → HHHH.

Some Scores Come From More Than 1 Configuration
There are 4 ways to get a score of 1 → HTTT, THTT, TTHT, TTTH.
There are also 4 ways to get a score of 3 → THHH, HTHH, HHTH, HHHT.
There are 6 ways to get a score of 2 → HHTT, HTHT, HTTH, THHT, THTH, TTHH.
We can calculate the frequencies of the various scores (even for unfair coins) from the binomial distribution.
Plotting the expected outcome of the 1600 sets, they fit a pretty good Gaussian with μ = 2, σ ~ 1.
The only real problem is that there are no tails beyond 2σ.
[Histogram: Entries/Score vs. Score]
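The counting argument above can be sketched in a few lines of Python (not from the lecture; the random seed is an arbitrary choice), comparing a simulated run of 1600 sets of 4 flips against the binomial expectation:

```python
import random
from collections import Counter
from math import comb

random.seed(1140)  # arbitrary seed, for reproducibility only

# 1600 sets of 4 fair coin flips; score a set by its number of HEADS
scores = [sum(random.randint(0, 1) for _ in range(4)) for _ in range(1600)]
observed = Counter(scores)

# Expected counts from the binomial distribution: 1600 * C(4,k) / 2^4
expected = {k: 1600 * comb(4, k) // 16 for k in range(5)}
for k in range(5):
    print(k, observed[k], expected[k])
# expected counts are 100, 400, 600, 400, 100, i.e. mean 2 and sigma = sqrt(4*(1/2)*(1/2)) = 1
```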

Central Limit Theorem
This is an example of the Central Limit Theorem at work.
You get better tails with bigger sets, or with something that partitions finer (dice) → see www.stat.sc.edu/~west/javahtml/clt.html
I will not try to prove the C.L.T. (due to Laplace), but I will try to give you an intuitive feel for it (after the CQs).

Simple Probability CQs
We flip a coin 6 times each, for two sets. The first set comes out HHHHHH. The second set comes out HTHTHT.
CQ5: Which has the more probable outcome?
A. First set
B. Second set
C. Both sets are equally probable
Answer: C (each is 1/64 likely). Each coin flip is 50:50 and unaffected by previous flips.
We flip a coin 6 times each, for two sets. The first set comes out 6 Heads. The second set comes out 3 Heads, 3 Tails.
CQ6: Which has the more probable outcome?
A. First set
B. Second set
C. Both sets are equally probable
Answer: B. The first set has a probability of 1/64 = 0.0156. The second set has a probability of 20/64 = 0.3125.
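The distinction between a specific sequence and a count of heads is exactly a binomial-coefficient calculation, which this short Python sketch (mine, not from the lecture) makes explicit:

```python
from math import comb

total = 2 ** 6  # 64 equally likely 6-flip sequences

# Any one specific sequence (HHHHHH or HTHTHT) has the same probability
p_specific = 1 / total              # 1/64 = 0.015625

# But "6 heads" is only 1 sequence, while "3 heads, 3 tails" covers C(6,3) = 20
p_six_heads = comb(6, 6) / total    # 1/64
p_three_each = comb(6, 3) / total   # 20/64 = 0.3125
print(p_specific, p_six_heads, p_three_each)
```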

How Does This Affect My Measurements?
Imagine 14 different factors, each of which can knock your measurement by +ε/2 or -ε/2 (equally probable).
These 14 factors are uncorrelated with each other.
The chance they all line up at +ε/2 is 1/16384; the chance they all line up at -ε/2 is also 1/16384.
Most likely (3432/16384) they cancel each other out exactly.
But frequently (3003/16384) there will be 2 more +ε/2 than -ε/2 (and vice versa).
For a Gaussian we would expect 52 entries more than 2.95σ from the mean; here we see 30 (out to 3.75σ).
[Histogram: Entries/Score vs. Score]
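The counts quoted on this slide come straight from binomial coefficients for the 14 sign choices, as this Python sketch (mine, not from the lecture) confirms:

```python
from math import comb

n = 14
total = 2 ** n                 # 16384 equally likely sign combinations
all_plus = 1 / total           # all 14 factors push +eps/2: 1/16384
cancel = comb(n, 7)            # 7 up, 7 down -> exact cancellation: 3432 ways
two_more_plus = comb(n, 8)     # 8 up, 6 down -> net +eps: 3003 ways
print(total, cancel, two_more_plus)  # 16384 3432 3003
```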