EE 302 Division 1. Homework 6 Solutions.
Problem 1. A random variable X has probability density

  f_X(x) = { C x² e^{−λx},  x ≥ 0,
           { 0,             otherwise,

where λ is a positive real number. Find

(a) The constant C.

Solution. Because of the normalization axiom, the PDF should integrate to 1. On the other hand, we showed in class that the second moment of an exponential random variable with parameter λ is equal to 2/λ²; in other words, ∫₀^∞ x² λe^{−λx} dx is equal to 2/λ². We therefore have:

  1 = ∫ f_X(x) dx = (C/λ) ∫₀^∞ x² λe^{−λx} dx = (C/λ) · (2/λ²) = 2C/λ³,

so C = λ³/2.

(b) The cumulative distribution function of X.

Solution. By definition, F_X(x) = ∫_{−∞}^x f_X(u) du. From the expression of the PDF we can see that if x < 0, the integral is equal to 0. When x ≥ 0, we use integration by parts twice:

  F_X(x) = (λ³/2) ∫₀^x u² e^{−λu} du
         = (λ³/2) [−(u²/λ) e^{−λu}]₀^x + λ² ∫₀^x u e^{−λu} du
         = −(λ²x²/2) e^{−λx} + λ² [−(u/λ) e^{−λu}]₀^x + λ ∫₀^x e^{−λu} du
         = −(λ²x²/2) e^{−λx} − λx e^{−λx} + (1 − e^{−λx})
         = 1 − (1 + λx + λ²x²/2) e^{−λx}.
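As a sanity check (a sketch, not part of the original solution), the closed-form CDF above can be compared against a direct numerical integration of the PDF; λ = 1.5 below is an arbitrary test value.

```python
import math

# Numerical check for Problem 1 (not part of the original solution):
# the PDF f(x) = (lam^3 / 2) x^2 e^(-lam x) should integrate to 1, and
# its running integral should match the closed-form CDF
# F(x) = 1 - (1 + lam*x + (lam*x)^2 / 2) e^(-lam*x).

def pdf(x, lam):
    return (lam ** 3 / 2) * x * x * math.exp(-lam * x)

def cdf_closed_form(x, lam):
    return 1.0 - (1 + lam * x + (lam * x) ** 2 / 2) * math.exp(-lam * x)

def cdf_numeric(x, lam, n=200_000):
    # composite trapezoidal rule on [0, x]
    h = x / n
    total = 0.5 * (pdf(0.0, lam) + pdf(x, lam))
    for k in range(1, n):
        total += pdf(k * h, lam)
    return total * h

lam = 1.5
print(cdf_closed_form(30 / lam, lam))   # normalization: essentially 1
print(cdf_closed_form(2 / lam, lam))    # 1 - 5 e^(-2), about 0.3233
print(abs(cdf_numeric(2 / lam, lam) - cdf_closed_form(2 / lam, lam)) < 1e-6)  # True
```

Any positive λ gives the same agreement, since the closed form was derived for general λ.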
Figure 1: PDF and CDF of X in Problem 1: (a) the PDF f_X(x); (b) the CDF F_X(x).

The PDF and CDF are depicted in Figure 1.

(c) The probability P(0 ≤ X ≤ 2/λ).

Solution. By definition of the CDF, we have:

  P(0 ≤ X ≤ 2/λ) = F_X(2/λ) − F_X(0).

By plugging x = 2/λ and x = 0 into the expression for the CDF obtained in Part (b), we get:

  F_X(2/λ) = 1 − (1 + 2 + 2)e^{−2} = 1 − 5e^{−2},
  F_X(0) = 0,
  P(0 ≤ X ≤ 2/λ) = F_X(2/λ) = 1 − 5e^{−2} ≈ 0.32.

Problem 2. A random variable X has probability density

  f_X(x) = C e^{−|x−µ|/σ},

where µ is a real number and σ is a positive real number. Find

(a) The constant C.

Solution. Denoting by X₁ the (Laplace) random variable from the end-of-chapter problem in Chapter 3, and substituting λ = 1/σ, we see that our random variable is X = X₁ + µ. (To see this, we use the same argument as the one we used for the shifted exponential random variable in Homework 5.) According to that end-of-chapter problem, the constant should be C = λ/2 = 1/(2σ) in order to make the PDF correctly normalized. Similarly to Problem 1(a), we can also get the value of C directly, by applying the normalization axiom:
Figure 2: PDF of X in Problem 2.

  1 = ∫ f_X(x) dx = C ∫_{−∞}^∞ e^{−|x−µ|/σ} dx
    = 2C ∫_µ^∞ e^{−(x−µ)/σ} dx   (due to symmetry about µ)
    = 2Cσ ∫₀^∞ e^{−u} du   (u = (x − µ)/σ)
    = 2Cσ,

so C = 1/(2σ).

(b) The mean of X.

Solution. Since X = X₁ + µ, E[X] = E[X₁] + µ = µ. This makes sense, since, as shown in Figure 2, the PDF of X is symmetric about µ. Alternatively, we can derive this by applying the definition of expectation: E[X] = ∫ x f_X(x) dx = ∫ x C e^{−|x−µ|/σ} dx. Making a change of variable u = x − µ, we have:

  E[X] = ∫ (u + µ) C e^{−|u|/σ} du = C ∫ u e^{−|u|/σ} du + µ ∫ C e^{−|u|/σ} du = 0 + µ · 1 = µ,

where the first integral vanishes because its integrand is odd, and the second is µ times the total probability, which is 1.

(c) The variance of X.

Solution. According to the same end-of-chapter problem, we have: var(X) = 2/λ² = 2σ².
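As a numerical sanity check (a sketch, not part of the original solution), the normalization, mean, and variance of this density can be verified by simple quadrature on a wide interval:

```python
import math

# Numerical check for Problem 2 (not part of the original solution):
# for f(x) = e^(-|x - mu| / sigma) / (2 sigma), the total mass should be
# 1, the mean mu, and the variance 2 sigma^2.

def laplace_pdf(x, mu, sigma):
    return math.exp(-abs(x - mu) / sigma) / (2 * sigma)

def moments(mu, sigma, n=400_000, width=40.0):
    # trapezoidal rule on [mu - width*sigma, mu + width*sigma];
    # the tails beyond that interval are negligible (of order e^-40)
    a = mu - width * sigma
    h = 2 * width * sigma / n
    mass = mean = second = 0.0
    for k in range(n + 1):
        x = a + k * h
        w = 0.5 if k in (0, n) else 1.0
        p = w * laplace_pdf(x, mu, sigma)
        mass += p
        mean += x * p
        second += x * x * p
    mass *= h
    mean *= h
    second *= h
    return mass, mean, second - mean * mean

mass, mean, var = moments(mu=1.0, sigma=2.0)
print(mass, mean, var)  # approximately 1.0, 1.0, 8.0 (= 2 * sigma^2)
```

The test values µ = 1, σ = 2 are arbitrary; any choice should give mass 1, mean µ, and variance 2σ².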
Again, we can derive this directly by applying the definition of variance:

  var(X) = ∫ (x − E[X])² f_X(x) dx
         = ∫ (x − µ)² (1/(2σ)) e^{−|x−µ|/σ} dx
         = σ² ∫₀^∞ u² e^{−u} du   (u = (x − µ)/σ, using symmetry about µ)
         = 2σ²,

since ∫₀^∞ u² e^{−u} du is the second moment of an exponential r.v. with λ = 1, which equals 2.

Problem 3. A speedskating race has exactly three participants: 1, 2, and 3. They finish the race with respective times of X₁, X₂, and X₃, where the X_i are uniformly distributed from 5 to 6 minutes. Find the probability that skater 1 came in second.

Solution. Since there is no difference among the three skaters, each one is equally likely to be the second one. The answer is therefore 1/3. We can also arrive at this by the following calculations, which are not necessary for this problem, but constitute an important technique for many other problems. Given that the X_i (i = 1, 2, 3) are uniformly distributed on [5, 6], their PDFs are:

  f_{X_i}(x_i) = { 1,  5 ≤ x_i ≤ 6,
                 { 0,  otherwise,     (i = 1, 2, 3).

We assume the three participants are independent. The joint PDF of X₁, X₂, X₃ is the product of the three PDFs, i.e.:

  f_{X₁X₂X₃}(x₁, x₂, x₃) = f_{X₁}(x₁) f_{X₂}(x₂) f_{X₃}(x₃) = { 1,  5 ≤ x₁, x₂, x₃ ≤ 6,
                                                                { 0,  otherwise.

Let C = {(x₁, x₂, x₃) : 5 ≤ x₁ ≤ 6, 5 ≤ x₂ ≤ 6, 5 ≤ x₃ ≤ 6} be the sample space of the result of the race. We can see that the outcome of the race is uniformly distributed in the unit cube C. Let

  M₁ = {(x₁, x₂, x₃) ∈ C : x₂ < x₁ < x₃},
  M₂ = {(x₁, x₂, x₃) ∈ C : x₂ > x₁ > x₃}.

The probability that skater 1 came in second is equal to P(M₁) + P(M₂). Because of symmetry, P(M₁) = P(M₂), and we only need to calculate one of them. Let's do it for M₁. As depicted in Figure 3, the region above the plane EDAC corresponds to x₂ < x₁ and the region below BCFD corresponds to x₃ > x₁. The intersection of the two, the tetrahedron ABCD, is equal to M₁. Because of uniformity, the probability of M₁ is equal to its volume. Remember that the volume of a tetrahedron is equal to 1/3 times the area of the base times the height.
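The symmetry answer of 1/3 can also be sanity-checked by simulation; the sketch below (not part of the original solution) draws many races and counts how often skater 1 finishes second.

```python
import random

# Monte Carlo check for Problem 3 (not part of the original solution):
# draw independent finishing times uniform on [5, 6] and count how often
# skater 1's time falls strictly between the other two.

random.seed(0)
trials = 200_000
second = 0
for _ in range(trials):
    x1 = random.uniform(5, 6)
    x2 = random.uniform(5, 6)
    x3 = random.uniform(5, 6)
    if min(x2, x3) < x1 < max(x2, x3):
        second += 1
print(second / trials)  # close to 1/3
```

Ties have probability zero for continuous times, so the strict inequalities lose nothing.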
Figure 3: The sample space of Problem 3.

Therefore P(M₁) = 1/6. It can also be obtained by integration as follows:

  P(M₁) = ∫∫∫_{(x₁,x₂,x₃) ∈ M₁} f_{X₁X₂X₃}(x₁, x₂, x₃) dx₁ dx₂ dx₃
        = ∫∫∫_{(x₁,x₂,x₃) ∈ M₁} dx₁ dx₂ dx₃
        = ∫₅⁶ ∫₅^{x₁} ∫_{x₁}⁶ dx₃ dx₂ dx₁
        = ∫₅⁶ (x₁ − 5)(6 − x₁) dx₁
        = ∫₀¹ t(1 − t) dt   (t = x₁ − 5)
        = 1/2 − 1/3 = 1/6.

Therefore, the probability that skater 1 came in second is equal to 2P(M₁) = 1/3. In general, as long as their distributions of finishing time are identical, the answer should be 1/3.

Problem 4. Let X be a normal random variable with mean µ and variance σ².

(a) Compute the probabilities

  P(X ≤ µ + σ),  P(X ≤ µ − σ),  P(X ≤ µ + 2σ).

Solution. If we define the random variable Y = (X − µ)/σ, then Y is a standard normal random variable
with zero mean and unit variance.

  P(X ≤ µ + σ) = P((X − µ)/σ ≤ 1) = P(Y ≤ 1) = Φ(1) = 0.8413,
  P(X ≤ µ − σ) = P((X − µ)/σ ≤ −1) = P(Y ≤ −1) = Φ(−1) = 1 − Φ(1) = 0.1587,
  P(X ≤ µ + 2σ) = P((X − µ)/σ ≤ 2) = P(Y ≤ 2) = Φ(2) = 0.9772.

(b) Compute the probabilities

  P(µ − σ ≤ X ≤ µ + σ),  P(µ − 2σ ≤ X ≤ µ + 2σ).

Solution.

  P(µ − σ ≤ X ≤ µ + σ) = P(X ≤ µ + σ) − P(X ≤ µ − σ) = 0.8413 − 0.1587 = 0.6826,
  P(µ − 2σ ≤ X ≤ µ + 2σ) = P(X ≤ µ + 2σ) − P(X ≤ µ − 2σ) = 0.9772 − (1 − 0.9772) = 0.9544.

A table of approximate values of Φ(x) − 1/2, as a function of x, is posted on the course web site. E.g., to look up the value of Φ(1.3), add 0.5 to the number which is in row 1.0 and column 0.3. Do not forget that each number in the table is preceded by a decimal point, so, for example, Φ(1.3) = 0.5 + 0.4032 = 0.9032.

Problem 5. A signal s = 3 is transmitted from a satellite but is corrupted by noise, and the received signal is X = s + W. When the weather is good, which happens with probability 2/3, W is normal with zero mean and variance 4. When the weather is bad, W is normal with zero mean and variance 9. In the absence of any weather information:

(a) What is the PDF of X?

Solution. What we can do is to first find the CDF of X and then take the derivative. By definition, we have:

  F_X(x) = P(X ≤ x).

Since we don't have any weather information, if we can calculate the above probability conditioned on all the weather situations, we can apply the total probability theorem to get the final answer. Now let's suppose the weather is good; then the random variable X = 3 + W is a Gaussian random variable with mean µ = 3 and variance 4. In this case, the probability that X is less than or equal to some x is:

  P(X ≤ x | Good) = Φ((x − µ)/2) = Φ((x − 3)/2).
Similarly, if we suppose the weather is bad, the conditional probability is:

  P(X ≤ x | Bad) = Φ((x − µ)/3) = Φ((x − 3)/3).

Applying the total probability theorem, we get:

  F_X(x) = P(X ≤ x) = P(X ≤ x | Good)P(Good) + P(X ≤ x | Bad)P(Bad)
         = (2/3) Φ((x − 3)/2) + (1/3) Φ((x − 3)/3).

Finally, we take the derivative to get:

  f_X(x) = dF_X(x)/dx
         = (2/3) · (1/(2√(2π))) e^{−(x−3)²/8} + (1/3) · (1/(3√(2π))) e^{−(x−3)²/18}
         = (1/(3√(2π))) e^{−(x−3)²/8} + (1/(9√(2π))) e^{−(x−3)²/18}.

In general, we can get from the total probability theorem that if B₁, B₂, …, B_n is a partition of the sample space, the following is true for any continuous random variable X:

  f_X(x) = f_{X|B₁}(x)P(B₁) + f_{X|B₂}(x)P(B₂) + ··· + f_{X|B_n}(x)P(B_n),

as stated in Chapter 3.

(b) Calculate the probability that X is between 2 and 4.

Solution. By definition of the CDF, we have:

  P(2 < X ≤ 4) = P(X ≤ 4) − P(X ≤ 2) = F_X(4) − F_X(2)
              = [(2/3)Φ(1/2) + (1/3)Φ(1/3)] − [(2/3)Φ(−1/2) + (1/3)Φ(−1/3)]
              = (2/3)[2Φ(1/2) − 1] + (1/3)[2Φ(1/3) − 1] ≈ 0.34.

Problem 6. Stations A and B are connected by two parallel message channels. A message from A to B is sent over both channels at the same time. Continuous random variables X and Y represent the message delays (in hours) over parallel channels I and II, respectively. These two random variables are independent, and both are uniformly distributed from 0 to 1 hour. A message is considered received as soon as it arrives on any one channel, and it is considered verified as soon as it has arrived over both channels.

(a) Determine the probability that a message is received within 15 minutes after it is sent.

Solution. (This problem is similar to the Romeo and Juliet example in the textbook.) Because the marginal PDFs are uniform and X, Y are independent, the pair (X, Y)
Figure 4: The events for Parts (a), (b), and (d) of Problem 6. The joint PDF of X and Y is uniform over the unit square.

is uniformly distributed in the unit square as depicted in Figure 4. The probability of any event in this sample space is equal to its area. The event that a message is received within 15 minutes is equivalent to X ≤ 1/4 or Y ≤ 1/4, which corresponds to the shaded region in Figure 4(a). The probability of this event is equal to its area, which is 1 − (3/4)² = 7/16.

(b) Determine the probability that the message is received but not verified within 15 minutes after it is sent.

Solution. This event is equivalent to (X ≤ 1/4 and Y > 1/4) or (Y ≤ 1/4 and X > 1/4), which is depicted as the shaded region in Figure 4(b). The probability of this event is equal to 3/16 + 3/16 = 3/8.

(c) Let T represent the time (in hours) between transmission at A and verification at B. Determine the cumulative distribution function F_T(t), and then differentiate it to obtain the PDF f_T(t).

Solution. By definition of the CDF,

  F_T(t) = P(T ≤ t) = P(X ≤ t and Y ≤ t) = P(X ≤ t)P(Y ≤ t)   (due to independence)

         = { 0,                   t < 0,
           { F_X(t)F_Y(t) = t²,   0 ≤ t ≤ 1,
           { 1,                   t > 1.

Differentiating, we get:

  f_T(t) = dF_T(t)/dt = { 2t,  0 ≤ t ≤ 1,
                        { 0,   otherwise.

(d) If the attendant at B goes home 15 minutes after the message is received, what is the probability that he is present when the message should be verified?

Solution. If he is present for verification, it means that the message arrived over one channel within 15 minutes of arriving over the other one: |X − Y| ≤ 1/4. This is illustrated in Figure 4(d). The probability of this event is equal to 1 minus the area of the two triangles, and we get:

  P(|X − Y| ≤ 1/4) = 1 − (3/4)² = 7/16.
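The four answers so far can be sanity-checked together by simulation; the sketch below (not part of the original solution) samples the unit square directly.

```python
import random

# Monte Carlo check for Problem 6, parts (a)-(d) (not part of the
# original solution): X, Y are independent Uniform(0, 1) delays in hours,
# and 15 minutes is 1/4 hour.

random.seed(1)
trials = 400_000
received = not_verified = present = max_le_half = 0
for _ in range(trials):
    x, y = random.random(), random.random()
    if x <= 0.25 or y <= 0.25:        # (a) received within 15 minutes
        received += 1
    if (x <= 0.25) != (y <= 0.25):    # (b) received but not verified
        not_verified += 1
    if abs(x - y) <= 0.25:            # (d) attendant still present
        present += 1
    if max(x, y) <= 0.5:              # (c) T = max(X, Y); F_T(1/2) = (1/2)^2
        max_le_half += 1

print(received / trials)      # close to 7/16 = 0.4375
print(not_verified / trials)  # close to 3/8  = 0.375
print(present / trials)       # close to 7/16 = 0.4375
print(max_le_half / trials)   # close to 1/4  = 0.25
```

Each estimate has standard deviation below 0.001 at this sample size, so agreement to two decimal places is expected.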
(e) If the attendant at B leaves for a 15-minute coffee break right after the message is received, what is the probability that he is present at the proper time for verification?

Solution. This event is equivalent to |X − Y| > 1/4. Its probability is:

  P(|X − Y| > 1/4) = (3/4)² = 9/16.

(f) The management wishes to have the maximum probability of having the attendant present at both reception and verification. Would they do better to let him take his coffee break as described above or simply allow him to go home 45 minutes after transmission?

Solution. If the attendant is allowed to go home 45 minutes after the transmission, the probability that he will be present at both reception and verification is equal to the probability that both messages arrive within 45 minutes. That is:

  P(X ≤ 3/4 and Y ≤ 3/4) = (3/4)² = 9/16.

This probability is equal to what we obtained in Part (e). So it does not matter which decision the manager makes in terms of maximizing the probability of the attendant's presence at both reception and verification. (But note that the coffee-break strategy will keep the attendant at work for a longer time on average. Indeed, when the message is received more than 45 minutes after the transmission, he will need to stay there for more than 45 minutes. In the second scheme, his working time is fixed to be 45 minutes.)
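The equality of the two probabilities in parts (e) and (f) can also be confirmed by simulation (a sketch, not part of the original solution):

```python
import random

# Monte Carlo check for Problem 6, parts (e) and (f) (not part of the
# original solution): both the coffee-break event |X - Y| > 1/4 and the
# go-home-after-45-minutes event {X <= 3/4 and Y <= 3/4} should have
# probability 9/16 = 0.5625.

random.seed(2)
trials = 400_000
coffee = home = 0
for _ in range(trials):
    x, y = random.random(), random.random()
    if abs(x - y) > 0.25:
        coffee += 1
    if x <= 0.75 and y <= 0.75:
        home += 1
print(coffee / trials, home / trials)  # both close to 0.5625
```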
More informationEE 302: Probabilistic Methods in Electrical Engineering
EE : Probabilistic Methods in Electrical Engineering Print Name: Solution (//6 --sk) Test II : Chapters.5 4 //98, : PM Write down your name on each paper. Read every question carefully and solve each problem
More informationContinuous random variables
Continuous random variables Can take on an uncountably infinite number of values Any value within an interval over which the variable is definied has some probability of occuring This is different from
More information[Chapter 6. Functions of Random Variables]
[Chapter 6. Functions of Random Variables] 6.1 Introduction 6.2 Finding the probability distribution of a function of random variables 6.3 The method of distribution functions 6.5 The method of Moment-generating
More information3 Continuous Random Variables
Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random
More informationBasic concepts of probability theory
Basic concepts of probability theory Random variable discrete/continuous random variable Transform Z transform, Laplace transform Distribution Geometric, mixed-geometric, Binomial, Poisson, exponential,
More informationContinuous Random Variables
1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables
More informationChapter 2 Random Variables
Stochastic Processes Chapter 2 Random Variables Prof. Jernan Juang Dept. of Engineering Science National Cheng Kung University Prof. Chun-Hung Liu Dept. of Electrical and Computer Eng. National Chiao Tung
More informationWill Landau. Feb 21, 2013
Iowa State University Feb 21, 2013 Iowa State University Feb 21, 2013 1 / 31 Outline Iowa State University Feb 21, 2013 2 / 31 random variables Two types of random variables: Discrete random variable:
More information6.041/6.431 Fall 2010 Final Exam Solutions Wednesday, December 15, 9:00AM - 12:00noon.
604/643 Fall 200 Final Exam Solutions Wednesday, December 5, 9:00AM - 2:00noon Problem (32 points) Consider a Markov chain {X n ; n 0,, }, specified by the following transition diagram 06 05 09 04 03 2
More informationBrief Review of Probability
Brief Review of Probability Nuno Vasconcelos (Ken Kreutz-Delgado) ECE Department, UCSD Probability Probability theory is a mathematical language to deal with processes or experiments that are non-deterministic
More informationSignals and Spectra - Review
Signals and Spectra - Review SIGNALS DETERMINISTIC No uncertainty w.r.t. the value of a signal at any time Modeled by mathematical epressions RANDOM some degree of uncertainty before the signal occurs
More informationThe story of the film so far... Mathematics for Informatics 4a. Jointly distributed continuous random variables. José Figueroa-O Farrill
The story of the film so far... Mathematics for Informatics 4a José Figueroa-O Farrill Lecture 2 2 March 22 X a c.r.v. with p.d.f. f and g : R R: then Y g(x is a random variable and E(Y g(f(d variance:
More informationLecture Note 2: Estimation and Information Theory
Univ. of Michigan - NAME 568/EECS 568/ROB 530 Winter 2018 Lecture Note 2: Estimation and Information Theory Lecturer: Maani Ghaffari Jadidi Date: April 6, 2018 2.1 Estimation A static estimation problem
More informationChapter 3: Random Variables 1
Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.
More informationMAT 271E Probability and Statistics
MAT 271E Probability and Statistics Spring 2011 Instructor : Class Meets : Office Hours : Textbook : Supp. Text : İlker Bayram EEB 1103 ibayram@itu.edu.tr 13.30 16.30, Wednesday EEB? 10.00 12.00, Wednesday
More informationCS145: Probability & Computing Lecture 11: Derived Distributions, Functions of Random Variables
CS145: Probability & Computing Lecture 11: Derived Distributions, Functions of Random Variables Instructor: Erik Sudderth Brown University Computer Science March 5, 2015 Homework Submissions Electronic
More informationEECS 126 Probability and Random Processes University of California, Berkeley: Spring 2018 Kannan Ramchandran February 14, 2018.
EECS 6 Probability and Random Processes University of California, Berkeley: Spring 08 Kannan Ramchandran February 4, 08 Midterm Last Name First Name SID You have 0 minutes to read the exam and 90 minutes
More information