BACKGROUND NOTES FYS 4550/FYS EXPERIMENTAL HIGH ENERGY PHYSICS AUTUMN 2016 PROBABILITY A. STRANDLIE NTNU AT GJØVIK AND UNIVERSITY OF OSLO
2 Before embarking on the concept of probability, we will first define a set of other concepts. A stochastic experiment is characterized by: All possible elementary outcomes of the experiment are known. Only one of the outcomes can occur in a single experiment. The outcome of an experiment is not known a priori. Example: throwing a die. The outcomes are S = {1, 2, 3, 4, 5, 6}. You can only observe one of these each time you throw, and you don't know beforehand which one you will observe. The set S is called the sample space of the experiment.
3 An event A is one or more outcomes which satisfy certain specifications. Example: A = odd number when throwing a die. An event is therefore also a subset of S; here A = {1, 3, 5}. If B = even number, what is the subset of S describing B? The probability of occurrence of an event A, P(A), is a number between 0 and 1. Intuitively, a value of P(A) close to 0 means that A occurs very rarely in an experiment, whereas a value close to 1 means that A occurs very often.
4 There are three ways of quantifying probability: 1. Classical approach, valid when all outcomes can be assumed equally likely. Probability is defined as the number of favourable outcomes for a given event divided by the total number of outcomes. Example: throwing a die has N = 6 different outcomes. Assume that the event A = observing six spots. Only n = 1 of the outcomes is favourable for A, so P(A) = n/N = 1/6. 2. Approach based on the convergence value of the relative frequency for a very large number of repeated, identical experiments. Example: throwing a die, recording the relative frequency of occurrence of A for various numbers of trials. 3. Subjective approach, reflecting the degree of belief in the occurrence of a certain event A. Possible guideline: the convergence value of a large number of hypothetical experiments.
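Approach 2 can be illustrated with a short simulation of the die example; the following is an illustrative Python sketch (not part of the original notes; the seed is arbitrary, chosen only for reproducibility):

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

def relative_frequency(n_trials):
    """Throw a die n_trials times and return the relative
    frequency of the event A = 'six spots'."""
    hits = sum(1 for _ in range(n_trials) if random.randint(1, 6) == 6)
    return hits / n_trials

# The relative frequency approaches P(A) = 1/6 as the number of trials grows
for n in (10, 100, 1000, 100000):
    print(n, relative_frequency(n))
```

For small n the relative frequency fluctuates strongly; only for large n does it settle near 1/6, which is exactly the behaviour shown in the convergence figure.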
5 [Figure: relative frequency converging to the true probability, plotted against the base-10 logarithm of the number of trials]
6 Approach 2 forms the basis of frequentist statistics, whereas approach 3 is the baseline of Bayesian statistics. These are two different schools. When estimating parameters from a set of data, the two approaches usually give the same numbers for the estimates if there is a large amount of data. If there is little available data, the estimates might differ. There is no easy way of determining which approach is best, and both approaches are advocated in high-energy physics experiments. We will not enter any further into such questions in this course.
7 We will now look at probabilities of combinations of events. We need some concepts from set theory: The union A∪B is a new event which occurs if A or B or both events occur. Two events are disjoint if they cannot occur simultaneously. The intersection A∩B is a new event which occurs if both A and B occur. The complement Ā is a new event which occurs if A does not occur.
8 [Venn diagram: the sample space S of outcomes, containing events A and B with intersection A∩B, and an event C disjoint with both A and B]
9 The mathematical axioms of probability: 1. Probability is never negative: P(A) ≥ 0. 2. The probability of the event which corresponds to the entire sample space S, i.e. the probability of observing any of the possible outcomes of the experiment, is equal to unity: P(S) = 1. 3. Probability must comply with the addition rule for disjoint events: P(A₁ ∪ A₂ ∪ … ∪ Aₙ) = P(A₁) + P(A₂) + … + P(Aₙ). A couple of useful formulas which can be derived from the axioms: P(Ā) = 1 − P(A) and P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
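The union formula can be checked by direct enumeration for the die example; a minimal Python sketch (the specific events are invented for illustration):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}          # sample space of one die throw

def P(event):
    """Classical probability: favourable outcomes / total outcomes."""
    return Fraction(len(event & S), len(S))

A = {1, 3, 5}                   # odd number
B = {4, 5, 6}                   # four spots or more

# P(A or B) = P(A) + P(B) - P(A and B)
assert P(A | B) == P(A) + P(B) - P(A & B)
print(P(A | B))                 # 5/6
```

Using exact fractions avoids any floating-point noise in the comparison.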
10 [Venn diagram: events A and B with overlap A∩B] Concept of conditional probability: what is the probability of occurrence of A given that we know B will occur, i.e. P(A|B)?
11 Recalling the definition of probability as the number of favourable outcomes divided by the total number of outcomes, we get: P(A|B) = N_{A∩B}/N_B = (N_{A∩B}/N_tot)/(N_B/N_tot) = P(A∩B)/P(B). Example: throwing a die. A = {2, 4, 6}, B = {3, 4, 5, 6}. What is P(A|B)? Here A∩B = {4, 6}, so P(A∩B) = 2/6 = 1/3, P(B) = 4/6 = 2/3, and P(A|B) = (1/3)/(2/3) = 1/2.
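The die example above can be verified by counting outcomes; a short illustrative Python sketch:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}            # even number
B = {3, 4, 5, 6}

def P(event):
    """Classical probability on the die's sample space."""
    return Fraction(len(event), len(S))

# P(A|B) = P(A and B) / P(B) = (1/3) / (2/3)
P_A_given_B = P(A & B) / P(B)
print(P_A_given_B)       # 1/2
```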
12 [Venn diagram: the event A split into A∩B and A∩B̄] Important observation: A∩B and A∩B̄ are disjoint!
13 Therefore: P(A) = P(A∩B) + P(A∩B̄) = P(A|B)P(B) + P(A|B̄)P(B̄). Expressing P(A) in terms of a subdivision of S into a set of other, disjoint events is called the law of total probability. The general formulation of this law is P(A) = Σᵢ P(A|Bᵢ)P(Bᵢ), where all {Bᵢ} are disjoint and span the entire sample space S.
14 From the definition of conditional probability it follows: P(A∩B) = P(A|B)P(B) = P(B|A)P(A). A quick manipulation gives P(B|A) = P(A|B)P(B)/P(A), which is called Bayes' theorem.
15 By using the law of total probability, one ends up with the general formulation of Bayes' theorem: P(Bᵢ|A) = P(A|Bᵢ)P(Bᵢ) / Σⱼ P(A|Bⱼ)P(Bⱼ), which is an extremely important result in statistics. Particularly in Bayesian statistics this theorem is often used to update or refine the knowledge about a set of unknown parameters by the introduction of information from new data.
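As a numerical illustration of the general form (all numbers invented for the example, not taken from the notes): suppose a detector flags real particles (B₁, prior 1 %) with efficiency 95 % and fakes (B₂, prior 99 %) at a rate of 5 %.

```python
# Hypothetical numbers: P(B1) = 0.01 signal, P(B2) = 0.99 background,
# P(A|B1) = 0.95, P(A|B2) = 0.05, where A = 'detector fires'.
priors = [0.01, 0.99]
likelihoods = [0.95, 0.05]

# Law of total probability: P(A) = sum_j P(A|Bj) P(Bj)
p_A = sum(l * p for l, p in zip(likelihoods, priors))

# Bayes' theorem: P(B1|A) = P(A|B1) P(B1) / P(A)
posterior_signal = likelihoods[0] * priors[0] / p_A
print(posterior_signal)   # about 0.16: most flagged events are still fakes
```

The small prior dominates: even a 95 %-efficient flag yields a posterior signal probability of only about 16 %.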
16 This can be explained by a rewrite of Bayes' theorem: P(parameters|data) ∝ P(data|parameters) · P(parameters). P(data|parameters) is often called the likelihood, P(parameters) denotes the prior knowledge of the parameters, whereas P(parameters|data) is the posterior probability of the parameters given the data. If P(parameters) cannot be deduced by any objective means, a subjective belief about its value is used in Bayesian statistics. Since there is no fundamental rule describing how to deduce this prior probability, Bayesian statistics is still debated, also in high-energy physics!
17 Definition of independence of events A and B: P(A|B) = P(A), i.e. any given information about B does not affect the probability of observing A. Physically this means that the events A and B are uncorrelated. For practical applications such independence cannot be derived but rather has to be assumed, given the nature of the physical problem one intends to model. General multiplication rule for independent events A₁, A₂, …, Aₙ: P(A₁ ∩ A₂ ∩ … ∩ Aₙ) = P(A₁)P(A₂)…P(Aₙ).
18 Stochastic or random variable: a number which can be attached to all outcomes of an experiment. Example: throwing two dice, the sum of the number of spots. Mathematical terminology: a real-valued function defined over the elements of the sample space S of an experiment. A capital letter is often used to denote a random variable, for instance X. Simulation experiment: throwing two dice N times, recording the sum of spots each time and calculating the relative frequency of occurrence of each of the outcomes.
19–26 [Figures: observed relative frequencies (blue columns) versus theoretically expected relative frequencies (red columns) for the sum of two dice, shown for increasing numbers of throws N = 10, 20, 100, 1000, 10000, and beyond]
27 The relative frequencies seem to converge towards the theoretically expected probabilities. Such a diagram is an expression of a probability distribution: a list of all different values of a random variable together with the associated probabilities. Mathematically: a function f(x) = P(X = x) defined for all possible values x of X given by the experiment at hand. The values of X can be discrete, as in the previous example, or continuous. For continuous x, f(x) is called a probability density function. Simulation experiment: height of Norwegian men. Collecting data, calculating relative frequencies of occurrence in intervals of various widths.
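The theoretical probabilities behind the two-dice figures can be enumerated directly; an illustrative Python sketch:

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# Enumerate all 36 equally likely outcomes of two dice
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

# f(x) = P(X = x) for the sum X of the spots
f = {x: Fraction(n, 36) for x, n in counts.items()}

print(f[7])                      # 1/6, the most probable sum
assert sum(f.values()) == 1      # the probabilities sum to one
```

These exact values are the red columns that the simulated relative frequencies converge towards.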
28 interval width 10 cm
29 interval width 5 cm
30 interval width 1 cm
31 interval width 0.5 cm
32 interval width → 0: continuous probability distribution
33 Cumulative distribution function: F(a) = P(X ≤ a). For discrete random variables: F(a) = Σ_{xᵢ ≤ a} f(xᵢ). For continuous random variables: F(a) = ∫_{−∞}^{a} f(x) dx.
34 It follows: P(a < X ≤ b) = F(b) − F(a). For continuous variables: P(a < X ≤ b) = ∫_a^b f(x) dx.
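The relation P(a < X ≤ b) = F(b) − F(a) can be checked on a discrete example, the sum of two dice; an illustrative Python sketch (not from the notes):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely sums of two dice
outcomes = [a + b for a, b in product(range(1, 7), repeat=2)]

def F(a):
    """Cumulative distribution function F(a) = P(X <= a)."""
    return Fraction(sum(1 for x in outcomes if x <= a), len(outcomes))

# P(a < X <= b) = F(b) - F(a), here with a = 6, b = 9
p = F(9) - F(6)
assert p == Fraction(sum(1 for x in outcomes if 6 < x <= 9), 36)
print(p)
```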
35 [Figure: the shaded area under f(x) between a and b is P(a < X < b)]
36 [Figure: the shaded area below b is P(X < b)]
37 [Figure: the shaded area above a is P(X > a)]
38 A function u(X) of a random variable X is also a random variable. The expectation value of such a function is E[u(X)] = ∫ u(x) f(x) dx. Two very important special cases are the mean, μ = E[X] = ∫ x f(x) dx, and the variance, σ² = Var(X) = E[(X − μ)²] = ∫ (x − μ)² f(x) dx.
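These integrals can be evaluated numerically for a simple continuous pdf, e.g. f(x) = 2x on [0, 1], for which μ = 2/3 and σ² = 1/18; a small sketch (the pdf and the midpoint-rule integrator are chosen only for illustration):

```python
def integrate(g, a, b, n=100000):
    """Midpoint-rule numerical integration of g on [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 2 * x   # pdf on [0, 1]

mean = integrate(lambda x: x * f(x), 0, 1)
var = integrate(lambda x: (x - mean) ** 2 * f(x), 0, 1)

print(mean, var)      # close to 2/3 and 1/18
```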
39 The mean μ is the most important measure of the centre of the distribution of X. The variance σ², or its square root σ, the standard deviation, is the most important measure of the spread of the distribution of X around the mean. The mean is the first moment of X, whereas the variance is the second central moment of X. In general, the nth moment of X is μₙ′ = E[Xⁿ] = ∫ xⁿ f(x) dx.
40 The nth central moment is mₙ = E[(X − μ)ⁿ] = ∫ (x − μ)ⁿ f(x) dx. Another measure of the centre of the distribution of X is the median, defined by F(x_med) = 1/2, or, in words, the value of X above which half of the probability lies and below which the other half lies.
41 Assume now that X and Y are two random variables with a joint probability density function (pdf) f(x, y). The marginal pdf of X is f₁(x) = ∫ f(x, y) dy, whereas the marginal pdf of Y is f₂(y) = ∫ f(x, y) dx.
42 The mean values of X and Y are μ_X = ∫∫ x f(x, y) dx dy = ∫ x f₁(x) dx and μ_Y = ∫∫ y f(x, y) dx dy = ∫ y f₂(y) dy. The covariance of X and Y is cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E[XY] − μ_X μ_Y.
43 If several random variables are considered simultaneously, one frequently arranges the variables in a stochastic or random vector X = (X₁, X₂, …, Xₙ)ᵀ. The covariances are then naturally displayed in a covariance matrix: cov(X) is the n×n matrix whose element (i, j) is cov(Xᵢ, Xⱼ), so the diagonal elements are the variances cov(Xᵢ, Xᵢ) = Var(Xᵢ).
44 If two variables X and Y are independent, the joint pdf can be written f(x, y) = f₁(x) f₂(y). The covariance of X and Y vanishes in this case (why?), and the variances add: V(X+Y) = V(X) + V(Y). If X and Y are not independent, the general formula is V(X+Y) = V(X) + V(Y) + 2 cov(X, Y). For n mutually independent random variables the covariance matrix becomes diagonal, i.e. all off-diagonal terms are identically zero.
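The rule V(X+Y) = V(X) + V(Y) + 2 cov(X, Y) can be verified on a small, fixed data set; a Python sketch (the numbers are arbitrary):

```python
def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    """Sample covariance (normalized by the sample size)."""
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

x = [1.0, 2.0, 4.0, 7.0]
y = [2.0, 1.0, 5.0, 4.0]
z = [a + b for a, b in zip(x, y)]

# The variance is the covariance of a variable with itself
lhs = cov(z, z)
rhs = cov(x, x) + cov(y, y) + 2 * cov(x, y)
assert abs(lhs - rhs) < 1e-12
print(lhs)
```

The identity holds exactly, not just in expectation, because it is an algebraic property of the covariance.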
45 If a random vector Y = (Y₁, Y₂, …, Yₙ) is related to a vector X with pdf f(x) by a function Y(X), the pdf of Y is g(y) = f(x(y)) |J|, where |J| is the absolute value of the determinant of the matrix J. This matrix is the so-called Jacobian of the transformation from Y to X, with elements Jᵢⱼ = ∂xᵢ/∂yⱼ.
46 The transformation of the covariance matrix is cov(Y) = J⁻¹ cov(X) (J⁻¹)ᵀ, where the inverse of J has elements (J⁻¹)ᵢⱼ = ∂yᵢ/∂xⱼ. The transformation from x to y must be one-to-one, such that the inverse functional relationship exists.
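A minimal numerical sketch of this for a 2×2 case: transforming Cartesian coordinates (x, y) with covariance cov(X) to polar (r, φ). The matrix A below holds the elements ∂yᵢ/∂xⱼ, i.e. J⁻¹ in the slide's notation; the input values and errors are invented for the example.

```python
import math

def matmul(A, B):
    """Multiply two 2x2 matrices stored as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

x, y = 3.0, 4.0
cov_x = [[0.01, 0.0], [0.0, 0.01]]   # uncorrelated Cartesian errors

r = math.hypot(x, y)
# Jacobian of (r, phi) with respect to (x, y)
A = [[x / r,     y / r],
     [-y / r**2, x / r**2]]

# Linear error propagation: cov(Y) = A cov(X) A^T
cov_y = matmul(matmul(A, cov_x), transpose(A))
print(cov_y[0][0], cov_y[1][1])      # variance of r and of phi
```

With equal, uncorrelated Cartesian errors, the radial variance stays 0.01 while the angular variance shrinks by 1/r².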
47 Obtaining cov(Y) from cov(X) as in the previous slide is a widely used technique in high-energy physics data analysis. It is called linear error propagation and is applicable any time one wants to transform from one set of estimated parameters to another: transformation between different sets of parameters describing a reconstructed particle track, or transport of track parameters from one location in a detector to another. We will see examples later in the course.
48 The characteristic function Φ(u) associated with the pdf f(x) is the Fourier transform of f(x): Φ(u) = E[e^{iuX}] = ∫ e^{iux} f(x) dx. Such functions are useful in deriving results about moments of random variables. The relation between Φ(u) and the moments of X is dⁿΦ/duⁿ |_{u=0} = iⁿ E[Xⁿ]. If Φ(u) is known, all moments of f(x) can be calculated without knowledge of f(x) itself.
49 Some common probability distributions: Binomial distribution, Poisson distribution, Gaussian distribution, Chi-square distribution, Student's t distribution, Gamma distribution. We will take a closer look at some of them.
50 Binomial distribution: Assume that we make n identical experiments with only two possible outcomes: success or no success. The probability of success p is the same for all experiments, and the individual experiments are independent of each other. The probability of x successes out of n trials is then P(X = x) = (n choose x) pˣ (1 − p)ⁿ⁻ˣ. Example: throwing a die n times, defining the event of success to be the occurrence of six spots in a throw. The probability of success is p = 1/6.
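The binomial formula for the die example can be evaluated directly; an illustrative Python sketch:

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) = C(n, x) p^x (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 5, 1/6   # five throws, success = six spots

# The pmf sums to one over all possible numbers of successes
assert abs(sum(binomial_pmf(x, n, p) for x in range(n + 1)) - 1) < 1e-12

print(binomial_pmf(0, n, p))   # probability of no sixes in five throws
```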
51 probability distribution for number of successes in 5 throws
52 probability distribution for number of successes in 15 throws
53 probability distribution for number of successes in 50 throws. Does the shape of this distribution look familiar?
54 Mean value and variance: E[X] = np, Var(X) = np(1 − p). Five throws with a die: E[# six spots] = 5/6, Var[# six spots] = 25/36, Std[# six spots] = 5/6.
55 Poisson distribution: the number of occurrences of an event A per given time length, area, volume, or interval is constant and equal to λ. The probability distribution of observing x occurrences in the interval is P(X = x) = λˣ e^{−λ}/x!. Both the mean value and the variance of X equal λ. Example: the number of particles in a beam passing through a given area in a given time must be Poisson distributed. If the average number λ is known, the probabilities for all x can be calculated according to the formula above.
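The Poisson formula can likewise be evaluated and its mean checked; a Python sketch (λ = 2 is chosen arbitrarily for illustration):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) = lam^x e^(-lam) / x!"""
    return lam**x * exp(-lam) / factorial(x)

lam = 2.0
# Truncated sums up to x = 100 (the tail is negligible for small lambda)
total = sum(poisson_pmf(x, lam) for x in range(100))
mean = sum(x * poisson_pmf(x, lam) for x in range(100))

assert abs(total - 1) < 1e-12
assert abs(mean - lam) < 1e-12   # E[X] = lambda
print(poisson_pmf(0, lam))       # probability of zero occurrences
```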
56 Gaussian distribution: the most frequently occurring distribution in nature. Most measurement uncertainties, disturbances of the directions of charged particles penetrating through enough matter, the number of ionizations created by a charged particle in a slab of material, etc., follow a Gaussian distribution. Main reason: the CENTRAL LIMIT THEOREM. It states that the sum of n independent random variables converges to a Gaussian distribution when n is large enough, irrespective of the individual distributions of the variables. The examples above are typically of this type.
57 Gaussian probability density function with mean value μ and standard deviation σ: f(x; μ, σ) = (1/(√(2π) σ)) exp(−(x − μ)²/(2σ²)). For a random vector X of size n with mean value μ and covariance matrix V, the multivariate Gaussian distribution is f(x; μ, V) = (1/((2π)^{n/2} √(det V))) exp(−½ (x − μ)ᵀ V⁻¹ (x − μ)).
58 Usual terminology: X ~ N(μ, σ): X is distributed according to a Gaussian (normal) distribution with mean value μ and standard deviation σ. 68 % of the distribution lies within plus/minus one σ, 95 % within plus/minus two σ, and 99.7 % within plus/minus three σ. Standard normal variable Z ~ N(0, 1): Z = (X − μ)/σ. Quantiles of the standard normal distribution: P(Z ≤ z_α) = 1 − α. The value z_α is denoted the 100·α % quantile of the standard normal distribution. Such quantiles can be found in tables or by computer programs.
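The quoted coverage fractions follow from the standard normal CDF, Φ(z) = ½(1 + erf(z/√2)); a sketch using Python's math.erf:

```python
from math import erf, sqrt

def std_normal_cdf(z):
    """Cumulative distribution function of Z ~ N(0, 1)."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def coverage(k):
    """Probability that Z lies within plus/minus k standard deviations."""
    return std_normal_cdf(k) - std_normal_cdf(-k)

for k in (1, 2, 3):
    print(k, round(coverage(k), 4))   # 0.6827, 0.9545, 0.9973
```

The same function reproduces the tabulated quantiles, e.g. Φ(1.96) ≈ 0.975, matching the 2.5 % quantile on the following slides.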
59 10 % quantile
60 5 % quantile 1.64
61 95 % of the area lies within plus/minus the 2.5 % quantile, 1.96
62 χ² distribution: If X₁, …, Xₙ are independent, Gaussian random variables with means μᵢ and standard deviations σᵢ, then χ² = Σᵢ₌₁ⁿ (Xᵢ − μᵢ)²/σᵢ² follows a χ² distribution with n degrees of freedom. It is often used in evaluating the level of compatibility between observed data and the assumed pdf of the data. Example: is the position of a measurement in a particle detector compatible with the assumed distribution of the measurement? The mean value is n and the variance is 2n.
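The mean n and variance 2n can be checked by simulation, summing the squares of n standard normal variables many times; an illustrative Python sketch (seed and sample size are arbitrary):

```python
import random

random.seed(42)  # fixed seed for reproducibility

def chi2_sample(n):
    """One draw of a chi-square variable with n degrees of freedom."""
    return sum(random.gauss(0, 1) ** 2 for _ in range(n))

n, N = 10, 20000
samples = [chi2_sample(n) for _ in range(N)]

mean = sum(samples) / N
var = sum((s - mean) ** 2 for s in samples) / N

print(mean, var)   # close to n = 10 and 2n = 20
```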
63 [Figure: χ² distribution with 10 degrees of freedom]
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More informationStatistical Data Analysis 2017/18
Statistical Data Analysis 2017/18 London Postgraduate Lectures on Particle Physics; University of London MSci course PH4515 Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk
More informationCME 106: Review Probability theory
: Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:
More informationMaking Hard Decision. Probability Basics. ENCE 627 Decision Analysis for Engineering
CHAPTER Duxbury Thomson Learning Making Hard Decision Probability asics Third Edition A. J. Clark School of Engineering Department of Civil and Environmental Engineering 7b FALL 003 y Dr. Ibrahim. Assakkaf
More informationStat 5101 Lecture Slides: Deck 8 Dirichlet Distribution. Charles J. Geyer School of Statistics University of Minnesota
Stat 5101 Lecture Slides: Deck 8 Dirichlet Distribution Charles J. Geyer School of Statistics University of Minnesota 1 The Dirichlet Distribution The Dirichlet Distribution is to the beta distribution
More informationRandom Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R
In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample
More informationModeling Uncertainty in the Earth Sciences Jef Caers Stanford University
Probability theory and statistical analysis: a review Modeling Uncertainty in the Earth Sciences Jef Caers Stanford University Concepts assumed known Histograms, mean, median, spread, quantiles Probability,
More informationAlgorithms for Uncertainty Quantification
Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example
More informationRandom Variables and Their Distributions
Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital
More informationReview of probability and statistics 1 / 31
Review of probability and statistics 1 / 31 2 / 31 Why? This chapter follows Stock and Watson (all graphs are from Stock and Watson). You may as well refer to the appendix in Wooldridge or any other introduction
More informationIntro to Probability. Andrei Barbu
Intro to Probability Andrei Barbu Some problems Some problems A means to capture uncertainty Some problems A means to capture uncertainty You have data from two sources, are they different? Some problems
More informationYETI IPPP Durham
YETI 07 @ IPPP Durham Young Experimentalists and Theorists Institute Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk www.pp.rhul.ac.uk/~cowan Course web page: www.pp.rhul.ac.uk/~cowan/stat_yeti.html
More informationJoint Probability Distributions and Random Samples (Devore Chapter Five)
Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete
More informationCS37300 Class Notes. Jennifer Neville, Sebastian Moreno, Bruno Ribeiro
CS37300 Class Notes Jennifer Neville, Sebastian Moreno, Bruno Ribeiro 2 Background on Probability and Statistics These are basic definitions, concepts, and equations that should have been covered in your
More informationProbability theory basics
Probability theory basics Michael Franke Basics of probability theory: axiomatic definition, interpretation, joint distributions, marginalization, conditional probability & Bayes rule. Random variables:
More informationFor a stochastic process {Y t : t = 0, ±1, ±2, ±3, }, the mean function is defined by (2.2.1) ± 2..., γ t,
CHAPTER 2 FUNDAMENTAL CONCEPTS This chapter describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of stochastic processes, mean and covariance
More informationStatistical Methods in Particle Physics Lecture 1: Bayesian methods
Statistical Methods in Particle Physics Lecture 1: Bayesian methods SUSSP65 St Andrews 16 29 August 2009 Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk www.pp.rhul.ac.uk/~cowan
More informationENGG2430A-Homework 2
ENGG3A-Homework Due on Feb 9th,. Independence vs correlation a For each of the following cases, compute the marginal pmfs from the joint pmfs. Explain whether the random variables X and Y are independent,
More information1 Review of Probability and Distributions
Random variables. A numerically valued function X of an outcome ω from a sample space Ω X : Ω R : ω X(ω) is called a random variable (r.v.), and usually determined by an experiment. We conventionally denote
More informationStatistical Methods for Astronomy
Statistical Methods for Astronomy Probability (Lecture 1) Statistics (Lecture 2) Why do we need statistics? Useful Statistics Definitions Error Analysis Probability distributions Error Propagation Binomial
More informationSet Theory. Pattern Recognition III. Michal Haindl. Set Operations. Outline
Set Theory A, B sets e.g. A = {ζ 1,...,ζ n } A = { c x y d} S space (universe) A,B S Outline Pattern Recognition III Michal Haindl Faculty of Information Technology, KTI Czech Technical University in Prague
More informationDependence. Practitioner Course: Portfolio Optimization. John Dodson. September 10, Dependence. John Dodson. Outline.
Practitioner Course: Portfolio Optimization September 10, 2008 Before we define dependence, it is useful to define Random variables X and Y are independent iff For all x, y. In particular, F (X,Y ) (x,
More informationLecture 5. G. Cowan Lectures on Statistical Data Analysis Lecture 5 page 1
Lecture 5 1 Probability (90 min.) Definition, Bayes theorem, probability densities and their properties, catalogue of pdfs, Monte Carlo 2 Statistical tests (90 min.) general concepts, test statistics,
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More informationJoint Probability Distributions, Correlations
Joint Probability Distributions, Correlations What we learned so far Events: Working with events as sets: union, intersection, etc. Some events are simple: Head vs Tails, Cancer vs Healthy Some are more
More informationStatistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University
Statistics for Economists Lectures 6 & 7 Asrat Temesgen Stockholm University 1 Chapter 4- Bivariate Distributions 41 Distributions of two random variables Definition 41-1: Let X and Y be two random variables
More information3. Probability and Statistics
FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important
More informationIntroduction to Probability and Statistics (Continued)
Introduction to Probability and Statistics (Continued) Prof. icholas Zabaras Center for Informatics and Computational Science https://cics.nd.edu/ University of otre Dame otre Dame, Indiana, USA Email:
More information2 Functions of random variables
2 Functions of random variables A basic statistical model for sample data is a collection of random variables X 1,..., X n. The data are summarised in terms of certain sample statistics, calculated as
More information