Probability Theory Refresher
- Osborn Cobb
- 6 years ago
Machine Learning WS2014, Module IN2064, Sheet 2, Page 1

Machine Learning Worksheet 2: Probability Theory Refresher

1 Basic Probability

Problem 1: A secret government agency has developed a scanner which determines whether a person is a terrorist. The scanner is fairly reliable; 95% of all scanned terrorists are identified as terrorists, and 95% of all upstanding citizens are identified as such. An informant tells the agency that exactly one passenger of 100 aboard an aeroplane in which you are seated is a terrorist. The agency decides to scan each passenger, and the shifty-looking man sitting next to you is tested as TERRORIST. What are the chances that this man is a terrorist? Show your work!

The random variable T indicates whether a person is a terrorist, i.e. with the given information above: p(T = 1) = 0.01 and p(T = 0) = 0.99. The random variable S indicates whether the scanner tests TERRORIST or not. So p(S = 1 | T = 1) = 0.95 (given in the text) and therefore (laws of probability) p(S = 0 | T = 1) = 0.05. Similarly, p(S = 0 | T = 0) = 0.95 and p(S = 1 | T = 0) = 0.05. We are interested in p(T = 1 | S = 1):

p(T = 1 | S = 1) = p(S = 1 | T = 1) p(T = 1) / [p(S = 1 | T = 1) p(T = 1) + p(S = 1 | T = 0) p(T = 0)] = (0.95 · 0.01) / (0.95 · 0.01 + 0.05 · 0.99) = 19/118 ≈ 0.16

Note that in the denominator we compute p(S = 1) using the law of total probability (also known as the sum rule).

Problem 2: Two balls are placed in a box as follows: A fair coin is tossed and a white ball is placed in the box if a head occurs, otherwise a red ball is placed in the box. The coin is tossed again and a red ball is placed in the box if a tail occurs, otherwise a white ball is placed in the box. Balls are drawn from the box three times in succession (always replacing the drawn ball in the box). It is found that on all three occasions a red ball is drawn. What is the probability that both balls in the box are red? Show your work!

Denote by RRR the event that 3 red balls are drawn.
Similarly, denote by rr the event that 2 red balls are placed in the box, rw the event that first a white and then a red ball are placed in the box, and wr and ww for the remaining two possibilities. We know that

p(rr) = p(rw) = p(wr) = p(ww) = 1/4.

Furthermore,

p(RRR | rr) = 1,  p(RRR | rw) = p(RRR | wr) = 1/8,  p(RRR | ww) = 0.

Therefore

p(rr | RRR) = p(RRR | rr) p(rr) / p(RRR) = (1/4) / (5/16) = 4/5,

where p(RRR) = Σ_{x ∈ {rr, rw, wr, ww}} p(RRR | x) p(x) is computed with the sum rule.

Submit to homework@class.brml.org with subject line homework sheet 2 by 24//2, :59 CET
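Both Bayes-rule computations above are easy to verify numerically. The following sanity-check script is not part of the original worksheet; it redoes the two posteriors with exact rational arithmetic:

```python
from fractions import Fraction as F

# Problem 1: posterior probability that the flagged passenger is a terrorist.
# Prior p(T=1) = 1/100 (one terrorist among 100 passengers), scanner 95% reliable.
prior_t = F(1, 100)
likelihood = {1: F(95, 100), 0: F(5, 100)}  # p(S=1 | T=t)
evidence = likelihood[1] * prior_t + likelihood[0] * (1 - prior_t)  # p(S=1), sum rule
posterior_t = likelihood[1] * prior_t / evidence
print(posterior_t, float(posterior_t))  # 19/118, ~0.161

# Problem 2: posterior probability that both balls are red, given three red draws.
prior = {"rr": F(1, 4), "rw": F(1, 4), "wr": F(1, 4), "ww": F(1, 4)}
lik = {"rr": F(1), "rw": F(1, 8), "wr": F(1, 8), "ww": F(0)}  # p(RRR | contents)
p_rrr = sum(lik[c] * prior[c] for c in prior)  # sum rule: 5/16
posterior_rr = lik["rr"] * prior["rr"] / p_rrr
print(posterior_rr)  # 4/5
```

Exact fractions make the check independent of floating-point rounding, so the script reproduces the worksheet's 19/118 and 4/5 exactly.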
Problem 3: There are eleven urns labeled by u ∈ {0, 1, 2, ..., 10}, each containing ten balls. Urn u contains u black balls and 10 − u white balls. Alice selects an urn u at random and draws N times with replacement from that urn, obtaining n_B black balls and N − n_B white balls. If after N = 10 draws n_B = 3 black balls have been drawn, what is the probability that the urn Alice is using is urn u?

Problem 4: Now, let Alice draw another ball from the same urn. What is the probability that the next drawn ball is black (show your work)?

Our goal is to compute P(u | n_B, N). Given are P(u) = 1/11 and, writing b_u = u/10 for the fraction of black balls in urn u,

P(n_B | u, N) = (N choose n_B) · b_u^{n_B} · (1 − b_u)^{N − n_B}.

Use the product rule:

P(u, n_B, N) = P(u | n_B, N) P(n_B, N)
P(u, n_B, N) = P(n_B | u, N) P(u, N) = P(n_B | u, N) P(N) P(u)

Assuming that P(N | u) = P(N):

P(u | n_B, N) = P(n_B | u, N) P(u) P(N) / P(n_B, N)

Applying again the sum rule for the denominator,

P(n_B, N) = Σ_u P(n_B | u, N) P(u) P(N),

finally resulting in:

P(u | n_B, N) = P(n_B | u, N) P(u) / Σ_u P(n_B | u, N) P(u)

From this we derive the conditional distribution P(u | n_B = 3, N = 10). Now, let Alice draw another ball from the same urn. What is the probability that the next drawn ball is black?
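The posterior asked for in Problem 3, and the predictive probability asked for in Problem 4, can both be checked numerically. A minimal sketch (not part of the worksheet), again using exact fractions:

```python
from fractions import Fraction as F
from math import comb

N, n_B = 10, 3
urns = range(11)  # u = 0..10; urn u has u black balls out of 10
prior = F(1, 11)

def lik(u):
    """Binomial likelihood P(n_B | u, N) with b_u = u/10."""
    b = F(u, 10)
    return comb(N, n_B) * b**n_B * (1 - b) ** (N - n_B)

evidence = sum(lik(u) * prior for u in urns)  # sum rule
posterior = {u: lik(u) * prior / evidence for u in urns}

# Problem 4: predictive probability that the next draw is black.
p_next_black = sum(F(u, 10) * posterior[u] for u in urns)
print(max(posterior, key=posterior.get))  # most probable urn: 3
print(float(p_next_black))                # ~0.333
```

The posterior peaks at u = 3 (the urn whose black-ball fraction matches the observed 3/10), and the predictive probability comes out at about 0.333, matching the value derived on the next page.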
Let B_{N+1} denote the event that the next ball is black. Thus, using the sum rule:

P(B_{N+1} | n_B, N) = Σ_u P(B_{N+1} | u, n_B, N) P(u | n_B, N)

For a fixed u, balls are drawn with replacement, therefore P(B_{N+1} | u, n_B, N) = b_u, thus

P(B_{N+1} | n_B, N) = Σ_u b_u P(u | n_B, N) ≈ 0.333

Problem 5: Calculate the mean and the variance of the uniform random variable X with PDF p(x) = 1 for x ∈ [0, 1], and 0 elsewhere.

Let X ~ Uniform(0, 1). Then

E[X] = ∫_0^1 x dx = [x²/2]_0^1 = 1/2 = 0.5

E[X²] = ∫_0^1 x² dx = [x³/3]_0^1 = 1/3

and thus Var[X] = E[X²] − E[X]² = 1/3 − 1/4 = 1/12.

Problem 6: Consider two random variables X and Y with joint density p(x, y). Prove the following two results:

E[X] = E_Y[E_{X|Y}[X]]    (1)
Var[X] = E_Y[Var_{X|Y}[X]] + Var_Y[E_{X|Y}[X]]    (2)

Here E_{X|Y}[X] denotes the expectation of X under the conditional density p(x | y), with a similar notation for the conditional variance.

Consider the discrete case only, using the sum rule and term reordering as the only necessary strategies:

E_Y[E_{X|Y}[X]] = Σ_y (Σ_x x p(x | y)) p(y) = Σ_x x Σ_y p(x | y) p(y) = Σ_x x Σ_y p(x, y) = Σ_x x p(x) = E[X]

Using this result, and carefully dealing with the meanings of the various symbols, we can also derive the second equation:

E_Y[Var_{X|Y}[X]]
= Σ_y (Σ_x (x − E_{X|Y}[X])² p(x | y)) p(y)
= Σ_x Σ_y x² p(x, y) − 2 Σ_y E_{X|Y}[X] (Σ_x x p(x | y)) p(y) + Σ_y (E_{X|Y}[X])² p(y)
= E_X[X²] − 2 Σ_y (E_{X|Y}[X])² p(y) + Σ_y (E_{X|Y}[X])² p(y)
= E_X[X²] − E_Y[(E_{X|Y}[X])²]

Var_Y[E_{X|Y}[X]]
= Σ_y (E_{X|Y}[X] − E_Y[E_{X|Y}[X]])² p(y)
= Σ_y (E_{X|Y}[X])² p(y) − 2 E_X[X] Σ_y E_{X|Y}[X] p(y) + (E_X[X])²
= E_Y[(E_{X|Y}[X])²] − 2 (E_X[X])² + (E_X[X])²
= E_Y[(E_{X|Y}[X])²] − (E_X[X])²

where we used (1) in the form E_Y[E_{X|Y}[X]] = E_X[X]. Thus we get:

E_Y[Var_{X|Y}[X]] + Var_Y[E_{X|Y}[X]] = E_X[X²] − (E_X[X])² = Var[X]

There is an alternative, shorter way:

E_Y[Var_{X|Y}[X]] + Var_Y[E_{X|Y}[X]]
= E_Y[E_{X|Y}[X²]] − E_Y[(E_{X|Y}[X])²] + E_Y[(E_{X|Y}[X])²] − (E_Y[E_{X|Y}[X]])²
= E_Y[E_{X|Y}[X²]] − (E_X[X])²
= E_X[X²] − (E_X[X])² = Var[X]

Consider these two results with X taken as a parameter θ that we want to learn, and Y as a random variable representing a possible observed data set D. Then E_θ[θ] = E_D[E_θ[θ | D]] says that the posterior mean of θ (the right-hand side of the equation), averaged over the distribution generating the data (out of which one particular realization of D is chosen), is equal to the prior mean of θ. This is weird! Here one mixes Bayesian analysis (posteriors) with frequentist viewpoints (expectations over all possible data sets) and thus gets a rather strange result. With Var[θ] = E_D[Var_θ[θ | D]] + Var_D[E_θ[θ | D]] one gets an interesting statement regarding the expected posterior variance of θ (the first term on the right-hand side): it is never larger than the prior variance (the left-hand side), but only on average, as already said, because the variance of the posterior mean (the second term on the right-hand side) is a non-negative quantity.

2 Probability Inequalities

Inequalities are useful for bounding quantities that might otherwise be hard to compute. We'll begin with a simple inequality, called the Markov inequality after Andrei A. Markov, a student of Pafnuty Chebyshev.

2.1 Markov Inequality

Let X be a non-negative, discrete random variable, and let c > 0 be a positive constant.

Problem 7: Show that P(X > c) ≤ E[X]/c. Now, consider flipping a fair coin n times. Using the Markov Inequality, what is the probability of getting more than (3/4)n heads?

E[X] = Σ_x x p(x) = Σ_{x ≤ c} x p(x) + Σ_{x > c} x p(x) ≥ Σ_{x > c} x p(x) ≥ Σ_{x > c} c p(x) = c P(X > c)

With E[X] = n/2 and c = (3/4)n we get an upper bound of 2/3.

2.2 Chebyshev Inequality

Apply the Markov Inequality to the deviation of a random variable from its mean, i.e. for a general random variable X we wish to bound the probability of the event {|X − E[X]| > a}, which is the same as the event {(X − E[X])² > a²}.

Problem 8: Prove that P(|X − E[X]| > a) ≤ Var(X)/a² holds. Again, consider flipping a fair coin n times. Now use the Chebyshev Inequality to bound the probability of getting more than (3/4)n heads.

Applying the Markov inequality to the non-negative random variable (X − E[X])², with constant a² and E[(X − E[X])²] = Var(X):

P(|X − E[X]| > a) = P((X − E[X])² > a²) ≤ Var(X)/a²

Then start from the statement and turn it into something similar to the above:

P(X > (3/4)n) = P(X − (1/2)n > (1/4)n) ≤ P(|X − E[X]| > (1/4)n) ≤ (n/4) / (n²/16) = 4/n

Note that we are considering X here as the sum of n independent coin flips, and thus using the rule that the variance of a sum of independent variables is the sum of the variances. Interpretation of the result: with more tosses, the observed fraction of heads concentrates around the mean, so the bound converges to zero in this case.
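To see how loose these two bounds are, they can be compared against the exact binomial tail probability of the head count. A small sketch (not from the worksheet):

```python
from math import comb

def tail_prob(n, k):
    """Exact P(X > k) for X = number of heads in n fair coin flips."""
    return sum(comb(n, j) for j in range(k + 1, n + 1)) / 2**n

for n in (16, 64, 256):
    exact = tail_prob(n, (3 * n) // 4)
    markov = (n / 2) / (3 * n / 4)      # E[X]/c = 2/3, independent of n
    chebyshev = (n / 4) / (n / 4) ** 2  # Var(X)/a^2 = 4/n
    print(n, exact, markov, chebyshev)
```

Already for n = 16 the exact tail is about 0.011, far below the Chebyshev bound 4/16 = 0.25 and the constant Markov bound 2/3; the Chebyshev bound at least decays with n, as noted above.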
2.3 Jensen's Inequality

Let f be a convex function. If λ_1, ..., λ_n are positive numbers with λ_1 + ... + λ_n = 1, then for any x_1, ..., x_n ∈ I:

f(λ_1 x_1 + ... + λ_n x_n) ≤ λ_1 f(x_1) + ... + λ_n f(x_n)

Problem 9: Prove this inequality by using induction on n.

For n = 2 there is nothing to prove: this is the definition of convexity. Assume the statement is true for n − 1 terms. Then

f(Σ_{i=1}^n λ_i x_i)
= f(λ_1 x_1 + (1 − λ_1) Σ_{i=2}^n (λ_i / (1 − λ_1)) x_i)
≤ λ_1 f(x_1) + (1 − λ_1) f(Σ_{i=2}^n (λ_i / (1 − λ_1)) x_i)
≤ λ_1 f(x_1) + (1 − λ_1) Σ_{i=2}^n (λ_i / (1 − λ_1)) f(x_i)
= Σ_{i=1}^n λ_i f(x_i),

where the first inequality is the definition of convexity for the two points x_1 and Σ_{i=2}^n (λ_i / (1 − λ_1)) x_i, and where we used the fact that Σ_{i=2}^n λ_i / (1 − λ_1) = 1 and thus could apply the induction assumption.
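The inequality can also be spot-checked numerically. A quick sketch (not part of the worksheet) with the convex function f(x) = x² and randomly generated convex combinations:

```python
import random

def f(x):
    """A convex function to test Jensen's inequality with."""
    return x * x

random.seed(0)
for _ in range(1000):
    n = random.randint(2, 6)
    raw = [random.random() + 1e-9 for _ in range(n)]
    lam = [r / sum(raw) for r in raw]  # positive weights summing to 1
    xs = [random.uniform(-10.0, 10.0) for _ in range(n)]
    lhs = f(sum(l * x for l, x in zip(lam, xs)))
    rhs = sum(l * f(x) for l, x in zip(lam, xs))
    assert lhs <= rhs + 1e-9  # f(convex combination) <= convex combination of f
print("Jensen's inequality held in all trials")
```

Random testing of course proves nothing (that is what the induction is for), but it is a cheap way to catch a sign or direction error in the statement.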
550.40 CH5 CH6(Sections 1 through 5) Homework Problems 1. Part of HW #6: CH 5 P1. Let X be a random variable with probability density function f(x) = c(1 x ) 1 < x < 1 (a) What is the value of c? (b) What
More information0.24 adults 2. (c) Prove that, regardless of the possible values of and, the covariance between X and Y is equal to zero. Show all work.
1 A socioeconomic stud analzes two discrete random variables in a certain population of households = number of adult residents and = number of child residents It is found that their joint probabilit mass
More information8.1 Exponents and Roots
Section 8. Eponents and Roots 75 8. Eponents and Roots Before defining the net famil of functions, the eponential functions, we will need to discuss eponent notation in detail. As we shall see, eponents
More informationIntroduction to Machine Learning
What does this mean? Outline Contents Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola December 26, 2017 1 Introduction to Probability 1 2 Random Variables 3 3 Bayes
More informationRefresher on Discrete Probability
Refresher on Discrete Probability STAT 27725/CMSC 25400: Machine Learning Shubhendu Trivedi University of Chicago October 2015 Background Things you should have seen before Events, Event Spaces Probability
More informationConditional Probability & Independence. Conditional Probabilities
Conditional Probability & Independence Conditional Probabilities Question: How should we modify P(E) if we learn that event F has occurred? Definition: the conditional probability of E given F is P(E F
More informationINF Anne Solberg One of the most challenging topics in image analysis is recognizing a specific object in an image.
INF 4300 700 Introduction to classifiction Anne Solberg anne@ifiuiono Based on Chapter -6 6inDuda and Hart: attern Classification 303 INF 4300 Introduction to classification One of the most challenging
More informationLecture 4: Probability and Discrete Random Variables
Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 4: Probability and Discrete Random Variables Wednesday, January 21, 2009 Lecturer: Atri Rudra Scribe: Anonymous 1
More informationNotes for Math 324, Part 17
126 Notes for Math 324, Part 17 Chapter 17 Common discrete distributions 17.1 Binomial Consider an experiment consisting by a series of trials. The only possible outcomes of the trials are success and
More information11. Probability Sample Spaces and Probability
11. Probability 11.1 Sample Spaces and Probability 1 Objectives A. Find the probability of an event. B. Find the empirical probability of an event. 2 Theoretical Probabilities 3 Example A fair coin is
More informationMINIMUM PROGRAMME FOR AISSCE
KENDRIYA VIDYALAYA DANAPUR CANTT MINIMUM PROGRAMME FOR AISSCE 8 SUBJECT : MATHEMATICS (CLASS XII) Prepared By : K. N. P. Singh Vice-Principal K.V. Danapur Cantt () MATRIX. Find X and Y if 7 y & y 7 8 X,,
More information