Probability review (week 3) Solutions
- Samuel Henry
1 Decision making.

A Simulate individual experiment.

Solution: Here you can find three versions of the function, each one revealing a programming technique. (The ideas took shape through discussions with you during office hours.)

Version 1:

% ESE 303 HW3 part A - version 1, using a while loop
function [accepted_rank,time_of_acceptance] = car_sale_v1(J,K,L)
offers = randperm(J);
rejected_offers = offers(1:K);
sorted_rejected_offers = sort(rejected_offers);
Lth_best_amongst_rejected = sorted_rejected_offers(1,L);
i = K+1;
while (offers(i) > Lth_best_amongst_rejected && i < J)
    i = i+1; % weird while!
end
accepted_rank = offers(i);
time_of_acceptance = i;
end

Version 2:

% ESE 303 HW3 part A - version 2, using a for loop
function [accepted_rank,time_of_acceptance] = car_sale_v2(J,K,L)
offers = randperm(J);
rejected_offers = offers(1:K);
sorted_rejected_offers = sort(rejected_offers);
Lth_best_amongst_rejected = sorted_rejected_offers(1,L);
for i = K+1:J
    if (offers(i) < Lth_best_amongst_rejected)
        accepted_rank = offers(i);
        time_of_acceptance = i;
        return;
    end
end
accepted_rank = offers(J);
time_of_acceptance = J;
end

Version 3:

% ESE 303 HW3 part A - version 3, using "find" and no loop
function [accepted_rank,time_of_acceptance] = car_sale_v3(J,K,L)
offers = randperm(J);
rejected_offers = offers(1:K);
sorted_rejected_offers = sort(rejected_offers);
Lth_best_amongst_rejected = sorted_rejected_offers(1,L);
i = find(offers(K+1:J) < Lth_best_amongst_rejected, 1);
if ~isempty(i)
    time_of_acceptance = K + i;
    accepted_rank = offers(time_of_acceptance);
else
    time_of_acceptance = J;
    accepted_rank = offers(J);
end
end

B Probability distribution of the rank of the selected offer.

Solution: Here we look at the pmf of the rank of the selected offer, fixing J = 50, K = 30 and L = 1, 2, 5. Two different versions are provided. Note specifically how the function developed for part A is called.

Version 1:

% plots the pmf of the accepted ranks
% J,K,L: refer to HW3 for explanation
% N: number of repetitions of the experiment
% Version 1: counting the frequency of each rank
function [] = pmf_of_ranks_v1(J,K,L,N)
frequencies = zeros(1,J); % initialization of the vector of frequencies of
                          % each rank
for i = 1:N
    [accepted_rank,ignore_the_time] = car_sale_v1(J,K,L);
    frequencies(1,accepted_rank) = frequencies(1,accepted_rank) + 1;
end
pmf_vector = frequencies/N;
bar(1:J, pmf_vector, 'r')
xlabel('X','FontSize',14)
ylabel('pmf','FontSize',14)
title(['N=',num2str(N)],'FontSize',14,'FontWeight','b')
axis([0,51,0,0.4])
end

Version 2:

% plots the pmf of the accepted ranks
% J,K,L: refer to HW3 for explanation
% N: number of repetitions of the experiment
% Version 2: using "hist" to do the task of counting
function [] = pmf_of_ranks_v2(J,K,L,N)
accepted_ranks = zeros(1,N); % initialization of the vector of the ranks of
                             % the selected offers of the experiments!
for i = 1:N
    [accepted_ranks(i),ignore_the_time] = car_sale_v1(J,K,L);
end
[frequencies, bins_locations] = hist(accepted_ranks, J);
pmf_vector = frequencies/N;
bar(1:J, pmf_vector, 'r')
xlabel('X','FontSize',14)
ylabel('pmf','FontSize',14)
title(['N=',num2str(N)],'FontSize',14,'FontWeight','b')
axis([0,51,0,0.4])
end

The function is called in a separate m-file:

close all
clear all
clc
J = 50; K = 30; L = 1; % refer to HW3 for explanation
i = 1;
for N = 10.^[2 3 4 5]  % number of repetitions of the experiment
    figure(1)
    subplot(2,2,i)
    pmf_of_ranks_v1(J,K,L,N)
    i = i+1;
end

The plots for L = 1 are provided in fig. 1. As you will see when plotting L = 2, 5, the first L ranks are approximately equiprobable and the behavior for ranks L+1 to 50 is similar. When N = 10^2 the pmf does not have a smooth shape, so the number of experiments is increased from 10^2 to 10^5. As can be seen, the variation from N = 10^4 to N = 10^5 is insignificant, and we can choose N = 10^4 for the subsequent parts (part C). (This observation is quite intriguing, as there are J! different permutations, all equiprobable...)

Remarks: As observed, the proposed policy for selling the car performs quite favorably: the likelihood of selling the car to the best offer is quite high (more than 0.3), the likelihood of accepting one of the top three offers is more than 0.4, and P(X = i) for i > 4 is less than 0.15.

C Probability of selecting best offer.

Solution: From part B, we observed that N = 10^4 yields acceptable results for calculating the probability. In this section we create a function to determine the probability of selecting the best offer while varying the number of rejected offers, K, between 1 and J-1. As in previous sections, several versions of the code are provided.

Version 1:

% plots the probability of the best rank versus K for a given J, L
% J,K,L: refer to HW3 for explanation
% Version 1: adaptation of version 1 of part B (counting frequency)
function [] = prob_1_versus_K_v1(J,L)
N = 10^4;
K_vector = L:J-1;
prob_of_rank_1 = zeros(1,J-L);
K_index = 1;
for K = K_vector
    frequency_of_rank_1 = 0; % initialization of the number of times the accepted
                             % rank has been 1, i.e., the best offer was accepted
    for i = 1:N
        [accepted_rank,ignore_the_time] = car_sale_v1(J,K,L);
        frequency_of_rank_1 = frequency_of_rank_1 + (accepted_rank==1); % counter
    end
    prob_of_rank_1(1,K_index) = frequency_of_rank_1/N;
    K_index = K_index + 1;
end
bar(K_vector, prob_of_rank_1) % create figure
xlabel('K','FontSize',14)
ylabel('P(X=1)','FontSize',14)
title(['P(X=1) for different K, J=',num2str(J),...
    ', L=',num2str(L),', N=',num2str(N)],'FontSize',...
    14,'FontWeight','b')
axis([0,J,0,0.4])
end

Version 2:

% plots the probability of the best rank versus K for a given J, L
% J,K,L: refer to HW3 for explanation
% Version 2: adaptation of version 2 of part B (using hist() function)
function [] = prob_1_versus_K_v2(J,L)
N = 10^4;
K_vector = L:J-1;
prob_of_rank_1 = zeros(1,J-L);
K_index = 1;
for K = K_vector
    accepted_ranks = zeros(1,N); % initialization of the vector of the ranks of
                                 % the selected offers of the experiments!
    for i = 1:N
        [accepted_ranks(i),ignore_the_time] = car_sale_v1(J,K,L);
    end
    [frequencies, bins_locations] = hist(accepted_ranks, J);
    pmf_vector = frequencies/N;
    prob_of_rank_1(1,K_index) = pmf_vector(1,1);
    K_index = K_index + 1;
end
bar(K_vector, prob_of_rank_1) % create figure
xlabel('K','FontSize',14)
ylabel('P(X=1)','FontSize',14)
title(['P(X=1) for different K, J=',num2str(J),...
    ', L=',num2str(L),', N=',num2str(N)],'FontSize',...
    14,'FontWeight','b')
axis([0,J,0,0.4])
end

The execution code:

close all
clear all
clc
J = 50; % refer to HW3 for explanation
for L = [1 2 5]
    figure
    prob_1_versus_K_v2(J,L)
end

The results are depicted in fig. 2.

Remarks: First, we observe that by selecting L = 1 and K around 18 we can achieve a likelihood of accepting the best offer as high as 0.37, and for a wide range of K, that is for 10 < K < 30, the chance of accepting the best offer is at least one-third. Increasing L from 1 hurts the probability of accepting the best offer. Notice that this observation would have been difficult to make had we not set the scales of the axes equal. For a larger L, K had better be appropriately larger too.
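For readers who want to check the simulation outside MATLAB, here is a minimal Python sketch of the same experiment and pmf estimate. It is not part of the original solutions, and the function and variable names (car_sale, pmf_of_ranks) are my own. Rank 1 is the best offer, the first K offers are rejected, and the first later offer beating the L-th best rejected offer is accepted.

```python
import random

def car_sale(J, K, L, rng=random):
    """One experiment: ranks 1 (best) .. J arrive in a uniformly random order.
    Reject the first K offers, then accept the first offer that beats the
    L-th best rejected offer; if none does, accept the last offer."""
    offers = rng.sample(range(1, J + 1), J)   # like MATLAB's randperm(J)
    threshold = sorted(offers[:K])[L - 1]     # rank of the L-th best rejected offer
    for i in range(K, J):                     # offers K+1 .. J (0-based index here)
        if offers[i] < threshold:             # smaller rank = better offer
            return offers[i], i + 1           # (accepted rank, 1-based time)
    return offers[-1], J                      # forced to take the last offer

def pmf_of_ranks(J, K, L, N, rng=random):
    """Monte Carlo estimate of the pmf of the accepted rank, as in part B."""
    freq = [0] * (J + 1)                      # indexed 1..J by rank
    for _ in range(N):
        rank, _time = car_sale(J, K, L, rng)
        freq[rank] += 1
    return [f / N for f in freq]
```

With J = 50, K = 30, L = 1, the estimated probability of the best offer, pmf_of_ranks(...)[1], should land a little above 0.3, consistent with the remarks in part B.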
D Probability of selecting last offer (L = 1).

Solution: One accepts the last offer if the best offer, X = 1, is amongst the first K offers, OR the second best offer, X = 2, is amongst the first K offers and the best offer is the last offer. Mathematically, this is:

P{X = X_J} = P{[∃ i ∈ {1,...,K} : X_i = 1] OR [∃ i ∈ {1,...,K} : X_i = 2 and X_J = 1]}.

The probability of this happening is given by

P{X = X_J} = K/J + (1/J) · K/(J−1).

The logic behind this probability is:

P{∃ i ∈ {1,...,K} : X_i = 1} = P{the best offer is amongst the first K positions of the J equiprobable positions} = K · (1/J) = K/J.

Similarly,

P{∃ i ∈ {1,...,K} : X_i = 2 and X_J = 1} = P{the best offer is at the last position AND (i.e., fixing that) the second best offer is amongst the first K of the remaining J−1 positions} = (1/J) · K/(J−1).

E Probability of selecting best offer (L = 1).

Solution: From Eq. 3.4 in Ross's Introduction to Probability Models (conditioning on the position of the best offer), it follows that

P{X = 1} = Σ_{i=1}^{J} P{X = 1 | X_i = 1} P{X_i = 1} = Σ_{i=1}^{J} P{X = 1 | X_i = 1} · (1/J) = (1/J) Σ_{i=1}^{J} P{X = 1 | X_i = 1}.

However, if i is at most K, then the probability of X = 1 given X_i = 1 is zero, because if the best offer is amongst the first K offers we automatically reject it. Thus, we can restrict the summation to i = K+1 through J:

P{X = 1} = (1/J) Σ_{i=K+1}^{J} P{X = 1 | X_i = 1}.

The conditional probability above says that the best offer is chosen at time i. For this to happen, none of the offers in positions K+1, ..., i−1 may beat the threshold set by the first K offers; in other words, the best of offers 1 through i−1 must have occurred amongst the first K positions. So

P{X = 1 | X_i = 1} = K/(i−1).

Therefore Eq. (1) follows:

P{X = 1} = (1/J) Σ_{i=K+1}^{J} K/(i−1) = (K/J) Σ_{i=K+1}^{J} 1/(i−1).
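As a sanity check (not part of the original solutions), the closed-form expression from part D can be compared against a brute-force simulation of the L = 1 policy; the helper names below are illustrative.

```python
import random
from fractions import Fraction

def p_last_exact(J, K):
    # Part D (L = 1): P{accept the last offer} = K/J + (1/J) * K/(J-1)
    return Fraction(K, J) + Fraction(1, J) * Fraction(K, J - 1)

def p_last_sim(J, K, N, rng):
    """Simulate the L = 1 policy N times; count how often the last offer is taken."""
    hits = 0
    for _ in range(N):
        offers = rng.sample(range(1, J + 1), J)   # ranks in arrival order
        threshold = min(offers[:K])               # best (smallest) rejected rank
        # first post-rejection index beating the threshold, else the last index
        t = next((i for i in range(K, J) if offers[i] < threshold), J - 1)
        hits += (t == J - 1)
    return hits / N
```

For J = 10 and K = 3 the exact value is 3/10 + (1/10)(3/9) = 1/3, and the simulated frequency lands close to it.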
In mathematical form, this is:

P{X = 1} = Σ_{n=1}^{J} P{X = 1 | X_n = 1} P{X_n = 1}
         = Σ_{n=1}^{J} P{X = 1 | X_n = 1} · (1/J)
         = (1/J) [ Σ_{n=1}^{K} P{X = 1 | X_n = 1} + Σ_{n=K+1}^{J} P{X = 1 | X_n = 1} ]
         = (1/J) [ 0 + Σ_{n=K+1}^{J} P{X = 1 | X_n = 1} ]
         = (1/J) Σ_{n=K+1}^{J} P{the best offer amongst the first (n−1) offers appears amongst the first K offers}
         = (1/J) Σ_{n=K+1}^{J} K/(n−1)
         = (K/J) Σ_{n=K+1}^{J} 1/(n−1).    (1)

F Optimal number of rejected offers (L = 1).

Solution: The optimal K* is determined below using the approximation

Σ_{n=K+1}^{J} 1/(n−1) = Σ_{i=K}^{J−1} 1/i ≈ ∫_K^{J−1} (1/x) dx,

which is justified by the sandwich bounds (1/x is decreasing):

∫_K^J (1/x) dx < Σ_{i=K}^{J−1} 1/i < ∫_{K−1}^{J−1} (1/x) dx.

Using this approximate relation in (1),

P{X = 1} = (K/J) Σ_{n=K+1}^{J} 1/(n−1) ≈ (K/J) ∫_K^{J−1} (1/x) dx = (K/J) ln((J−1)/K) ≈ (K/J) ln(J/K).    (2)
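The quality of the integral approximation in (2) is easy to check numerically. This short Python sketch (my own, with illustrative names) evaluates the exact sum in (1) and its logarithmic approximation side by side.

```python
import math

def p_best_exact(J, K):
    # Eq. (1): P{X = 1} = (K/J) * sum_{n=K+1}^{J} 1/(n-1)
    return (K / J) * sum(1.0 / (n - 1) for n in range(K + 1, J + 1))

def p_best_approx(J, K):
    # Eq. (2): P{X = 1} ~= (K/J) * ln(J/K)
    return (K / J) * math.log(J / K)
```

For J = 50 and K = 18 the exact sum is about 0.374 while the approximation gives about 0.368, so the continuous relaxation used in part F changes the answer by less than 0.01.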
In order to find the optimum K which maximizes P{X = 1}, we need to take the derivative with respect to K, set it equal to zero, and solve the resulting equation; right? With a caveat: indeed, K is an integer number, and taking the derivative presumes K ∈ R. This is not uncommon in solving such optimization problems, where the meaningful domain of the variables is the integer set (they are referred to as Integer Programming problems). We have relaxed the constraint of K being an integer:

d/dK P{X = 1} ≈ d/dK [ (K/J) ln(J/K) ] = (1/J) ln(J/K) + (K/J) · (−1/K) = (1/J) ln(J/K) − 1/J.

Hence, setting d/dK P{X = 1} = 0 gives ln(J/K*) = 1, i.e., J/K* = e, so K* = J/e, where K* is the optimum. For J = 50, this number is about 18, remarkably compatible with fig. 2(a). Now, using the approximation in (2), it follows that

P{X = 1} ≈ (K*/J) ln(J/K*) = (1/e) · ln(e) = 1/e ≈ 0.37,

quite high a probability, which matches well with fig. 2(a).
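As a cross-check (again a sketch, not part of the original solutions), one can maximize the exact expression from part E over integer K directly and compare with the continuous optimum J/e.

```python
import math

def p_best(J, K):
    # P{X = 1} = (K/J) * sum_{n=K+1}^{J} 1/(n-1), from part E (L = 1)
    return (K / J) * sum(1.0 / (n - 1) for n in range(K + 1, J + 1))

J = 50
# Brute-force the integer maximizer over the feasible range K = 1 .. J-1
K_star = max(range(1, J), key=lambda K: p_best(J, K))
# The relaxed optimum predicts K* ~ J/e ~ 18.4 with P{X = 1} ~ 1/e
```

For J = 50 the integer maximizer comes out as K* = 18 with P{X = 1} ≈ 0.374, matching the J/e ≈ 18.4 and 1/e ≈ 0.37 obtained above.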
[Figure omitted.] Fig. 1. pmf of accepted ranks for J = 50, K = 30, L = 1 and varying N (part B). Four panels, one per value of N; the y-axis (pmf) runs from 0 to 0.4 and the x-axis is the rank X.
[Figure omitted.] Fig. 2. Probability of accepting the best offer for J = 50, varying K, with panels (a) L = 1, (b) L = 2, (c) L = 5. Here N = 10,000, the number of repetitions of the experiment used to calculate the probability. Notice that the scale of the y-axis is from 0 to 0.4.
Introduction to MatLab 1 An Introduction to MatLab Contents 1. Starting MatLab... 3 2. Workspace and m-files... 4 3. Help... 5 4. Vectors and Matrices... 5 5. Objects... 8 6. Plots... 10 7. Statistics...
More informationORF 245 Fundamentals of Statistics Chapter 9 Hypothesis Testing
ORF 245 Fundamentals of Statistics Chapter 9 Hypothesis Testing Robert Vanderbei Fall 2014 Slides last edited on November 24, 2014 http://www.princeton.edu/ rvdb Coin Tossing Example Consider two coins.
More informationExpectations. Definition Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X ) or
Expectations Expectations Definition Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X ) or µ X, is E(X ) = µ X = x D x p(x) Expectations
More informationProblem Set 5 Question 1
2.32 Problem Set 5 Question As discussed in class, drug discovery often involves screening large libraries of small molecules to identify those that have favorable interactions with a certain druggable
More informationmaximize the average southbound flux of automobiles on Michigan Street, keep the speed of an individual automobile below the posted speed limit,
TO:JosephM.Powers FROM: Kevin R. O Neill DATE: 31 March 1999 RE: ME 334 Project: Part II 1 Problem Statement The city of South B wants to create a timing system for the traffic lights on Michigan Street
More informationClassification 1: Linear regression of indicators, linear discriminant analysis
Classification 1: Linear regression of indicators, linear discriminant analysis Ryan Tibshirani Data Mining: 36-462/36-662 April 2 2013 Optional reading: ISL 4.1, 4.2, 4.4, ESL 4.1 4.3 1 Classification
More informationOn Maxima and Minima *
On Maxima and Minima * Leonhard Euler 50 If a function of x was of such a nature, that while the values of x increase the function itself continuously increases or decreases, then this function will have
More informationHomework 1 Solutions
18-9 Signals and Systems Profs. Byron Yu and Pulkit Grover Fall 18 Homework 1 Solutions Part One 1. (8 points) Consider the DT signal given by the algorithm: x[] = 1 x[1] = x[n] = x[n 1] x[n ] (a) Plot
More informationMATH 196, SECTION 57 (VIPUL NAIK)
TAKE-HOME CLASS QUIZ: DUE MONDAY NOVEMBER 25: SUBSPACE, BASIS, DIMENSION, AND ABSTRACT SPACES: APPLICATIONS TO CALCULUS MATH 196, SECTION 57 (VIPUL NAIK) Your name (print clearly in capital letters): PLEASE
More informationECONOMICS 207 SPRING 2006 LABORATORY EXERCISE 5 KEY. 8 = 10(5x 2) = 9(3x + 8), x 50x 20 = 27x x = 92 x = 4. 8x 2 22x + 15 = 0 (2x 3)(4x 5) = 0
ECONOMICS 07 SPRING 006 LABORATORY EXERCISE 5 KEY Problem. Solve the following equations for x. a 5x 3x + 8 = 9 0 5x 3x + 8 9 8 = 0(5x ) = 9(3x + 8), x 0 3 50x 0 = 7x + 7 3x = 9 x = 4 b 8x x + 5 = 0 8x
More informationExpectations and Entropy
Expectations and Entropy Data Science: Jordan Boyd-Graber University of Maryland SLIDES ADAPTED FROM DAVE BLEI AND LAUREN HANNAH Data Science: Jordan Boyd-Graber UMD Expectations and Entropy 1 / 9 Expectation
More informationCS340 Winter 2010: HW3 Out Wed. 2nd February, due Friday 11th February
CS340 Winter 2010: HW3 Out Wed. 2nd February, due Friday 11th February 1 PageRank You are given in the file adjency.mat a matrix G of size n n where n = 1000 such that { 1 if outbound link from i to j,
More informationPATTERN RECOGNITION AND MACHINE LEARNING
PATTERN RECOGNITION AND MACHINE LEARNING Chapter 1. Introduction Shuai Huang April 21, 2014 Outline 1 What is Machine Learning? 2 Curve Fitting 3 Probability Theory 4 Model Selection 5 The curse of dimensionality
More informationExam C Solutions Spring 2005
Exam C Solutions Spring 005 Question # The CDF is F( x) = 4 ( + x) Observation (x) F(x) compare to: Maximum difference 0. 0.58 0, 0. 0.58 0.7 0.880 0., 0.4 0.680 0.9 0.93 0.4, 0.6 0.53. 0.949 0.6, 0.8
More informationExponential Family and Maximum Likelihood, Gaussian Mixture Models and the EM Algorithm. by Korbinian Schwinger
Exponential Family and Maximum Likelihood, Gaussian Mixture Models and the EM Algorithm by Korbinian Schwinger Overview Exponential Family Maximum Likelihood The EM Algorithm Gaussian Mixture Models Exponential
More information