Lecture 1: Probability and Statistics

Benjamin Disraeli, British statesman and literary figure (1804-1881):
"There are three kinds of lies: lies, damned lies, and statistics."
- popularized in the US by Mark Twain
- the statement shows:
  - the persuasive power of numbers
  - the use of statistics to bolster weak arguments
  - the tendency of people to disparage statistics that do not support their positions

The purpose of P3700:
- how to understand the statistical uncertainty of an observation/measurement
- how to use statistics to argue against a weak argument (or bolster a weak argument?)
- how to argue against people disparaging statistics that do not support their positions
- how to lie with statistics?

K.K. Gan    L1: Probability and Statistics
Introduction:
Understanding of many physical phenomena depends on statistical and probabilistic concepts:
- Statistical Mechanics (physics of systems composed of many parts: gases, liquids, solids)
  - 1 mole of anything contains 6x10^23 particles (Avogadro's number)
  - impossible to keep track of all 6x10^23 particles even with the fastest computer imaginable
  - resort to learning about the group properties of all the particles
  - partition function: calculate energy, entropy, pressure... of a system
- Quantum Mechanics (physics at the atomic or smaller scale)
  - wavefunction = probability amplitude
  - probability of an electron being located at (x, y, z) at a certain time
Understanding/interpretation of experimental data depends on statistical and probabilistic concepts:
- how do we extract the best value of a quantity from a set of measurements?
- how do we decide if our experiment is consistent/inconsistent with a given theory?
- how do we decide if our experiment is internally consistent?
- how do we decide if our experiment is consistent with other experiments?
In this course we will concentrate on the above experimental issues!
Definition of probability:
Suppose we have N trials and a specified event occurs r times.
- example: rolling a die, where the event could be rolling a 6.
- define the probability P of an event E occurring as:
  P(E) = r/N  as  N → ∞
- examples:
  - six-sided die: P(6) = 1/6
  - coin toss: P(heads) = 0.5
    - P(heads) should approach 0.5 the more times you toss the coin;
      for a single coin toss we can never get P(heads) = 0.5!
- by definition, probability is a non-negative real number bounded by 0 ≤ P ≤ 1
  - if P = 0, the event never occurs
  - if P = 1, the event always occurs
  - the sum (or integral) of all probabilities, if they are mutually exclusive, must equal 1.
- events are independent if: P(A ∩ B) = P(A)P(B)    (∩ intersection, ∪ union)
  - coin tosses are independent events: the result of the next toss does not depend on the previous toss.
- events are mutually exclusive (disjoint) if: P(A ∩ B) = 0, so that P(A ∪ B) = P(A) + P(B)
  - in coin tossing, we either get a head or a tail.
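The frequency definition P(E) = r/N as N → ∞ can be illustrated with a small simulation; this is a minimal sketch (the function name and seed are illustrative, not from the slides):

```python
import random

def empirical_probability(n_tosses, seed=0):
    """Estimate P(heads) as r/N: the fraction of N fair coin tosses landing heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# r/N wanders for small N but approaches the true probability 0.5 as N grows.
for n in (10, 1000, 200000):
    print(n, empirical_probability(n))
```

For a single toss the estimate is necessarily 0 or 1, never 0.5, exactly as the slide notes.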
Probability can be a discrete or a continuous variable.
Discrete probability: P can have certain values only.
- examples:
  - tossing a six-sided die: P(x_i) = P_i, where x_i = 1, 2, 3, 4, 5, 6 and P_i = 1/6 for all x_i
  - tossing a coin: only 2 choices, heads or tails
- for both of the above discrete examples (and in general), when we sum over all mutually exclusive possibilities:
  Σ_i P(x_i) = 1
Continuous probability: P can be any number between 0 and 1.
- define a probability density function (pdf), f(x):
  f(x) dx = dP(x ≤ α ≤ x + dx), with α a continuous variable
- probability for x to be in the range a ≤ x ≤ b is:
  P(a ≤ x ≤ b) = ∫_a^b f(x) dx
- just like the discrete case, the sum of all probabilities must equal 1:
  ∫_{−∞}^{+∞} f(x) dx = 1    →  f(x) is normalized to one
- probability for x to be exactly some number is zero since:
  ∫_{x=a}^{x=a} f(x) dx = 0
Notation: x_i is called a random variable
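The normalization condition and the interval probability ∫_a^b f(x) dx can be checked numerically. A hedged sketch, using the exponential pdf f(x) = e^(−x) on x ≥ 0 as an assumed example (it is not one of the slides' worked cases):

```python
import math

def trapezoid(f, a, b, n=100000):
    """Numerically integrate f over [a, b] with the composite trapezoid rule."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

# Assumed example pdf: exponential, f(x) = e^{-x} for x >= 0.
pdf = lambda x: math.exp(-x)

total = trapezoid(pdf, 0.0, 50.0)   # integral over (effectively) the full range -> ~1
p_0_1 = trapezoid(pdf, 0.0, 1.0)    # P(0 <= x <= 1) = 1 - e^{-1}
```

The point probability P(x = a) is the zero-width limit of such an integral, hence zero for any continuous pdf.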
Examples of some common P(x)'s and f(x)'s:

  Discrete = P(x)      Continuous = f(x)
  binomial             uniform, i.e. constant
  Poisson              Gaussian
                       exponential
                       chi square

How do we describe a probability distribution?
- mean, mode, median, and variance
- for a continuous distribution, these quantities are defined by:
  Mean (average):                    µ = ∫_{−∞}^{+∞} x f(x) dx
  Mode (most probable):              ∂f(x)/∂x = 0 at x = mode
  Median (50% point):                0.5 = ∫_{−∞}^{median} f(x) dx
  Variance (width of distribution):  σ² = ∫_{−∞}^{+∞} f(x) (x − µ)² dx
- for a discrete distribution, the mean and variance are defined by:
  µ = (1/N) Σ x_i
  σ² = (1/N) Σ (x_i − µ)²
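The four descriptors can be computed numerically for any continuous pdf. A sketch under assumptions: the pdf is again the exponential f(x) = e^(−x), x ≥ 0, chosen for illustration because its answers are known analytically (mean = 1, median = ln 2, variance = 1, mode = 0):

```python
import math

def integrate(f, a, b, n=200000):
    """Composite trapezoid rule for f over [a, b]."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

pdf = lambda x: math.exp(-x)   # assumed example pdf
UPPER = 50.0                   # effectively infinity for this pdf

# mean: mu = integral of x f(x) dx
mean = integrate(lambda x: x * pdf(x), 0.0, UPPER)
# variance: sigma^2 = integral of f(x) (x - mu)^2 dx
var = integrate(lambda x: (x - mean) ** 2 * pdf(x), 0.0, UPPER)

# median: the m solving  integral_0^m f(x) dx = 0.5, found by bisection
lo, hi = 0.0, UPPER
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if integrate(pdf, 0.0, mid, n=2000) < 0.5:
        lo = mid
    else:
        hi = mid
median = 0.5 * (lo + hi)
```

For this asymmetric pdf the mean (1), median (ln 2 ≈ 0.69), and mode (0) all differ, anticipating the asymmetric-distribution picture on the next slide.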
Some continuous pdfs:
- Probability is the area under the curve!
[Figures: a symmetric distribution (Gaussian), with mode, median, and mean coinciding; an asymmetric distribution showing distinct mean, median, and mode]
- For a Gaussian pdf, the mean, mode, and median are all at the same x.
- For most pdfs, the mean, mode, and median are at different locations.
Calculation of mean and variance:
example: a discrete data set consisting of three numbers: {1, 2, 3}
- the average (µ) is just:
  µ = (1/N) Σ x_i = (1 + 2 + 3)/3 = 2
- complication: suppose some measurements are more precise than others.
  - if each measurement x_i has a weight w_i associated with it:
    µ = Σ x_i w_i / Σ w_i    (weighted average)
- the variance (σ²), or average squared deviation from the mean, is just:
  σ² = (1/N) Σ (x_i − µ)²
  - the variance describes the width of the pdf!
  - σ is called the standard deviation
- rewrite the above expression by expanding the summation:
  σ² = (1/N) [Σ x_i² − 2µ Σ x_i + Σ µ²]
     = (1/N) Σ x_i² − 2µ² + µ²
     = (1/N) Σ x_i² − µ²
     = <x²> − <x>²    (< > denotes the average)
- N in the denominator would be N − 1 if we determined the average (µ) from the data itself.
using the definition of µ from above, we have for our example of {1, 2, 3}:
  σ² = (1/N) Σ x_i² − µ² = 4.67 − 2² = 0.67
the case where the measurements have different weights is more complicated:
  σ² = Σ w_i (x_i − µ)² / Σ w_i = Σ w_i x_i² / Σ w_i − µ²
  - µ is the weighted mean
  - if we calculated µ from the data, σ² gets multiplied by a factor N/(N − 1).
example: a continuous probability distribution, f(x) = sin²x for 0 ≤ x ≤ 2π
  - has two modes!
  - has the same mean and median, but they differ from the mode(s).
  - f(x) is not properly normalized:
    ∫_0^{2π} sin²x dx = π ≠ 1
  - normalized pdf:
    f(x) = sin²x / ∫_0^{2π} sin²x dx = (1/π) sin²x
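The discrete formulas above can be sketched directly in code; this minimal version (function names are illustrative) reproduces the {1, 2, 3} example and the N/(N − 1) correction:

```python
def mean(xs):
    """Unweighted average: mu = (1/N) sum x_i."""
    return sum(xs) / len(xs)

def variance(xs):
    """sigma^2 = <x^2> - <x>^2, with N in the denominator (mu treated as exact)."""
    mu = mean(xs)
    return sum(x * x for x in xs) / len(xs) - mu * mu

def weighted_mean(xs, ws):
    """mu = sum(w_i x_i) / sum(w_i)."""
    return sum(x * w for x, w in zip(xs, ws)) / sum(ws)

data = [1, 2, 3]
mu = mean(data)        # 2.0
sig2 = variance(data)  # 14/3 - 4 = 0.67 (the slide's 4.67 - 2^2)

# When mu itself comes from the data, multiply by N/(N-1) for the unbiased estimate.
n = len(data)
sig2_unbiased = sig2 * n / (n - 1)
```

With equal weights, `weighted_mean` reduces to the plain average, as expected.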
for continuous probability distributions, the mean, mode, and median are calculated using either integrals or derivatives:
  mean:   µ = (1/π) ∫_0^{2π} x sin²x dx = π
  mode:   ∂/∂x (sin²x) = 0  →  x = π/2, 3π/2
  median: (1/π) ∫_0^{median} sin²x dx = 1/2  →  median = π
example: the Gaussian distribution function, a continuous probability distribution:
  p(x) = [1/(σ√(2π))] e^{−(x − µ)²/(2σ²)}
[Figure: the Gaussian pdf]
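The sin²x results above can be verified numerically; a hedged sketch (the `integrate` helper is illustrative, not from the slides):

```python
import math

def integrate(f, a, b, n=100000):
    """Composite trapezoid rule for f over [a, b]."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

raw = lambda x: math.sin(x) ** 2

# Normalization integral over [0, 2*pi] equals pi, so raw is not a pdf ...
norm = integrate(raw, 0.0, 2.0 * math.pi)
# ... but dividing by it gives the normalized pdf (1/pi) sin^2 x.
pdf = lambda x: raw(x) / norm

mean = integrate(lambda x: x * pdf(x), 0.0, 2.0 * math.pi)  # should be pi
half = integrate(pdf, 0.0, math.pi)                         # 0.5 -> median is pi
```

The symmetry of sin²x about x = π is why the mean and median coincide there while the modes sit at π/2 and 3π/2.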
Accuracy and Precision:
- Accuracy: the accuracy of an experiment refers to how close the experimental measurement is to the true value of the quantity being measured.
- Precision: this refers to how well the experimental result has been determined, without regard to the true value of the quantity being measured.
- just because an experiment is precise does not mean it is accurate!!
[Figures: two scatter patterns, one accurate but not precise, one precise but not accurate]
Measurement Errors (Uncertainties)
Use results from probability and statistics as a way of indicating how good a measurement is.
- most common quality indicator:
  relative precision = [uncertainty of measurement]/measurement
- example: we measure a table to be 10 inches with an uncertainty of 1 inch.
  relative precision = 1/10 = 0.1 or 10% (% relative precision)
- the uncertainty in a measurement is usually the square root of the variance:
  σ = standard deviation
  - usually calculated using the technique of propagation of errors.

Statistical and Systematic Errors
Results from experiments are often presented as:
  N ± XX ± YY
- N: value of the quantity measured (or determined) by the experiment.
- XX: statistical error, usually assumed to be from a Gaussian distribution.
  - with the assumption of Gaussian statistics, we can say (calculate) something about how well our experiment agrees with other experiments and/or theories.
  - expect a 68% chance that the true value is between N − XX and N + XX.
- YY: systematic error. Hard to estimate; the distribution of errors is usually not known.
- examples:
  mass of proton = 0.9382769 ± 0.0000027 GeV (only statistical error given)
  mass of W boson = 80.8 ± 1.5 ± 2.4 GeV
What's the difference between statistical and systematic errors?
  N ± XX ± YY
- statistical errors are random in the sense that if we repeat the measurement enough times: XX → 0
- systematic errors do not → 0 with repetition.
- examples of sources of systematic errors:
  - voltmeter not calibrated properly
  - a ruler not the length we think it is (a meter stick might really be < 1 meter!)
- because of systematic errors, an experimental result can be precise, but not accurate!
How do we combine systematic and statistical errors to get one estimate of precision?
- big problem! two choices:
  σ_tot = XX + YY             add them linearly
  σ_tot = (XX² + YY²)^(1/2)   add them in quadrature
Some other ways of quoting experimental results:
- lower limit: the mass of particle X is > 100 GeV
- upper limit: the mass of particle X is < 100 GeV
- asymmetric errors: mass of particle X = 100 +4/−3 GeV
Don't quote any measurement to more than three digits, as it is difficult to achieve 0.1% precision:
  0.231 ± 0.013,  791 ± 57,  (5.98 ± 0.43)x10^-5
- measurement and uncertainty should have the same number of digits
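The two combination choices above are one line each; a minimal sketch applying them to the slide's W boson numbers (the function names are illustrative):

```python
import math

def combine_linear(stat, syst):
    """sigma_tot = XX + YY: the conservative linear sum."""
    return stat + syst

def combine_quadrature(stat, syst):
    """sigma_tot = sqrt(XX^2 + YY^2): sum in quadrature."""
    return math.sqrt(stat ** 2 + syst ** 2)

# W boson example from the previous slide: 80.8 +/- 1.5 (stat) +/- 2.4 (syst) GeV
lin = combine_linear(1.5, 2.4)       # 3.9 GeV
quad = combine_quadrature(1.5, 2.4)  # sqrt(8.01) GeV, a bit under 2.9
```

Quadrature always gives the smaller (or equal) total, which is one reason the choice between the two is the "big problem" the slide mentions.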