Stochastic Models of Manufacturing Systems

Stochastic Models of Manufacturing Systems Ivo Adan

Organization 2/47
7 lectures (lecture of May 12 is canceled)
Study guide available (with notes, slides, assignments, references), see http://www.win.tue.nl/~iadan/4t400
Examination consists of:
- Weekly (8) take-home assignments
- Send in take-home assignments individually
- Best 7 out of 8 take-home assignments count (40%)
- Take-home assignments of the last week will be discussed in BZ
- Final assignment, done in groups of two (60%)

Topics 3/47
- Basic probability (refresher)
- Basic statistics for discrete-event simulation
- Modeling and analysis of manufacturing systems:
  - Single-stage systems
  - Multi-stage flow lines
  - Job-shop systems
  - CONWIP systems

Modeling 4/47
Some basic steps:
- Identify the issues to be addressed
- Learn about the system
- Choose a modeling approach
- Develop and test the model
- Verify and validate the model
- Experiment with the model
- Present the results

Modeling 5/47
Various types of models:
- Physical models
- Simulation models
- Analytical models
But why modeling?
- Understanding
- Improvement
- Optimization
- Decision making

Modeling 6/47
Some issues:
- Complexity versus simplicity
- Flexibility
- Data requirements
- Transparency
- Analytical and simulation capability: effective modeling requires both!

Some Improvements? 7/47
Servers are equally fast, 10 circulating jobs.
Question: Replace one server by a server that is twice as fast. How does this affect the average throughput time? The throughput?
Question: Does your answer change in case of more jobs? Fewer jobs?

Multiple Machines or a Single One? 8/47 4 machines, or one machine that is four times faster? Question: What do you prefer, 4 machines or one fast machine? Question: What do you prefer if process time variability is high? Question: What do you prefer if the load is low?

Multiple Machines or a Single One? 9/47
4 machines, or one machine that is four times faster? The required information is captured in the following (approximate) formula for the mean flow time:
E(S) ≈ ρ/(1 − ρ) · E(R)/c + E(B)
where ρ is the load, c the number of machines, E(R) the mean residual process time, and E(B) the mean process time.

Running example 10/47 A robotic dairy barn

Milking robot 11/47 How to design such a barn?

Zone-Picking Systems 12/47

Zone-Picking Systems 13/47 Entrance station

Zone-Picking Systems 14/47 Main conveyor

Zone-Picking Systems 15/47 Zone 8 Zone 5 Zone 4 Zone 1 Zone 7 Zone 6 Zone 3 Zone 2 Segment 2 Segment 1

Zone-Picking Systems 16/47 Pick-by-voice Pick-by-light

Zone-Picking Systems 17/47 Pick station Zone Buffer

Zone-Picking Systems 18/47 Quality check

Zone-Picking Systems 19/47 Sealing

Zone-Picking Systems 20/47 Labeling

Zone-Picking Systems 21/47 Loading

Zone-Picking Systems 22/47 Main conveyor Segment entrance Segment conveyor Segment System entrance/exit

Zone-Picking Systems 23/47
How to design such complex systems?
- What should be the layout of the network?
- Size of zones?
- Where to locate items?
- What number of pickers and zones?
- Required WIP level?

Zone-Picking Systems 24/47
[Network diagram: system entrance/exit e_0; main conveyor pieces c_0,1, ..., c_0,K+1; for each segment k = 1, ..., K a segment entrance e_k, segment conveyor pieces c_k,1, ..., c_k,m_k+1, and zones z_k,1, ..., z_k,m_k]
Network model: implemented in a Java applet

Contents Basic Probability 25/47
- Sheldon M. Ross, Probability Models, Academic Press, 2003:
  Chapter 1: 1.1-1.5; Chapter 2: 2.1-2.5; Chapter 3: 3.1-3.5
- Henk Tijms, Understanding Probability, Cambridge Univ. Press, 2012:
  Chapter 7: 7.1, 7.2 (till 7.2.1), 7.3; Chapter 8: 8.1, 8.2; Chapter 9; Chapter 10: 10.1, 10.2, 10.3, 10.4 (till 10.4.8), 10.5, 10.6; Chapter 11: 11.1, 11.2, 11.3, 11.4.1, 11.5; Chapter 13: 13.1, 13.2, 13.3 (till 13.3.1); Appendix: Permutations, Combinations, Exponential function, Geometric series

Basic Probability 26/47
Ingredients of a probability model:
- Sample space S: flipping a coin, rolling a die, process time, ...
- Events are (essentially) all subsets of S: E = {H}, E = {1, 2}, E = (0, 1), ...
- New events are obtained by union, intersection, and complement.

Probabilities on Events 27/47
For each event E there is a number P(E) such that:
- 0 ≤ P(E) ≤ 1
- P(S) = 1
- If E_1, E_2, ... are mutually exclusive, then P(∪_{i=1}^∞ E_i) = Σ_{i=1}^∞ P(E_i)
Example: Tossing a coin, P({H}) = P({T}) = 1/2
Example: Rolling a die, P({1}) = 1/6, P({1, 2}) = P({1}) + P({2}) = 1/3
Intuition: If an experiment is repeated over and over then, with probability 1, the long-run fraction of time that event E occurs is P(E).
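The long-run frequency intuition can be checked with a short simulation (an illustrative Python sketch, not part of the original slides): rolling a fair die many times, the fraction of rolls landing in E = {1, 2} approaches P(E) = 1/3.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible
n_rolls = 200_000
hits = sum(1 for _ in range(n_rolls) if random.randint(1, 6) in (1, 2))
frequency = hits / n_rolls  # long-run fraction of rolls in E = {1, 2}
# frequency should be close to P(E) = 1/3
```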

Conditional probabilities 28/47
Probability of event E given that event F occurs:
P(E|F) = P(EF)/P(F)
Example: Rolling a die twice, P({i, j}) = 1/36.
Given that i = 4 (event F), what is the probability that j = 2 (event E)?
P(E|F) = (1/36)/(1/6) = 1/6
Example: Darts on the unit disk {(x, y) : x² + y² ≤ 1}
P(distance to 0 > 1/2 | x > 0) = ((1/2)π − (1/8)π)/((1/2)π) = 3/4
Note: P(EF) = P(E|F)P(F), where we usually write P(EF) for P(E ∩ F).
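The dice example can be verified by exact enumeration of the 36 equally likely outcomes (an illustrative Python sketch, not part of the original slides):

```python
from fractions import Fraction

outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]  # 36 equally likely pairs
P = lambda event: Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

F = lambda o: o[0] == 4          # first roll is 4
E = lambda o: o[1] == 2          # second roll is 2
P_E_given_F = P(lambda o: E(o) and F(o)) / P(F)  # P(E|F) = P(EF)/P(F)
# P_E_given_F == Fraction(1, 6)
```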

Independent events 29/47
Events E and F are independent if
P(EF) = P(E)P(F)
and events E_1, ..., E_n are independent if
P(E_1 E_2 ... E_n) = P(E_1)P(E_2) ··· P(E_n)
Example: Rolling a die twice, E = "i + j = 6" and F = "i = 4". Independent?
P(E) = 5/36, P(F) = 1/6, P(EF) = 1/36
Now E = "i + j = 7". Independent?
Independent experiments: S = S_1 × S_2 × ··· × S_n where
P(E_1 × E_2 × ... × E_n) = P(E_1)P(E_2) ··· P(E_n)
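Both questions on this slide can be settled by exact enumeration (an illustrative Python sketch, not part of the original slides): the event "i + j = 6" is dependent on "i = 4", while "i + j = 7" turns out to be independent of it.

```python
from fractions import Fraction

outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
P = lambda ev: Fraction(sum(1 for o in outcomes if ev(o)), len(outcomes))

F = lambda o: o[0] == 4           # i = 4
E6 = lambda o: o[0] + o[1] == 6   # i + j = 6
E7 = lambda o: o[0] + o[1] == 7   # i + j = 7

dep = P(lambda o: E6(o) and F(o)) == P(E6) * P(F)  # False: 1/36 != 5/216
ind = P(lambda o: E7(o) and F(o)) == P(E7) * P(F)  # True:  1/36 == (1/6)(1/6)
```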

Random variable 30/47
A random variable is a function of the outcome: X.
Example: Rolling a die twice, X is the sum of the outcomes, so X = i + j,
P(X = 2) = 1/36
Example: Flipping a coin indefinitely, P(H) = p.
N is the number of flips until the first H (independent flips):
P(N = n) = (1 − p)^(n−1) p
Discrete random variable (rv): the possible values are discrete.
Continuous random variable: the possible values are continuous.
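The coin-flipping example can be simulated directly (an illustrative Python sketch; the value p = 0.3 is an assumption, not from the slides): the empirical frequency of N = 3 should be close to (1 − p)² p.

```python
import random

random.seed(2)
p = 0.3  # illustrative success probability (not from the slides)

def flips_until_first_head(p):
    """Simulate independent coin flips; return the index of the first head."""
    n = 1
    while random.random() >= p:  # each flip is a head with probability p
        n += 1
    return n

samples = [flips_until_first_head(p) for _ in range(100_000)]
frac_n3 = sum(1 for s in samples if s == 3) / len(samples)
# Compare with P(N = 3) = (1 - p)^2 p = 0.147
```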

Distribution function 31/47
F(x) = P(X ≤ x), x ∈ R
Properties:
- F(x) is nondecreasing, lim_{x→∞} F(x) = 1, lim_{x→−∞} F(x) = 0
- P(y < X ≤ x) = F(x) − F(y)
Discrete random variable X ∈ {x_1, x_2, ...}:
P(X = x_i) = p(x_i) > 0, Σ_{i=1}^∞ p(x_i) = 1
Example: Bernoulli rv
P(X = 0) = 1 − P(X = 1) = 1 − p (1 is success)
Example: Binomial rv
X = #successes in n trials, p is the success probability:
p_i = P(X = i) = C(n, i) p^i (1 − p)^(n−i), i = 0, 1, ..., n

Distribution function 32/47
Example: Geometric rv
X = #trials till the first success:
p_n = P(X = n) = (1 − p)^(n−1) p, n = 1, 2, 3, ...
Example: Poisson rv X
p_n = P(X = n) = e^(−λ) λ^n / n!, n = 0, 1, 2, ...
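As a quick sanity check on the Poisson pmf (an illustrative Python sketch; the rate λ = 2.5 is an assumption, not from the slides), the probabilities should sum to 1 and the mean should equal λ:

```python
import math

lam = 2.5  # illustrative rate (not from the slides)
# Poisson pmf p_n = e^(-lam) lam^n / n!; truncate at n = 99 (tail is negligible)
p = [math.exp(-lam) * lam**n / math.factorial(n) for n in range(100)]

total = sum(p)                                   # should be ~1
mean = sum(n * p_n for n, p_n in enumerate(p))   # should be ~lam
```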

Continuous rv 33/47
X has a density f(x):
P(X ∈ B) = ∫_B f(x) dx, P(X ≤ b) = ∫_{−∞}^{b} f(x) dx
so
∫_{−∞}^{∞} f(x) dx = 1, P(a ≤ X ≤ b) = ∫_a^b f(x) dx
Interpretation: f(x) dx ≈ P(x < X ≤ x + dx)
F(x) = ∫_{−∞}^{x} f(y) dy, (d/dx) F(x) = f(x), P(X = b) = 0
Example: Uniform rv X on [0, 1]: f(x) = 1, 0 ≤ x ≤ 1
Example: Uniform rv X on [a, b]: f(x) = 1/(b − a), a ≤ x ≤ b

Continuous rv 34/47
Example: Normal rv X
f(x) = (1/(σ√(2π))) e^(−(x−µ)²/(2σ²)), −∞ < x < ∞
Example: Exponential rv X with rate λ
f(x) = λ e^(−λx), F(x) = 1 − e^(−λx), x ≥ 0
Memoryless property: X = lifetime of a component
P(X > t + x | X > t) = P(X > t + x)/P(X > t) = e^(−λx) = P(X > x)
so used is as good as new!
Failure rate h(x):
h(x) dx = P(x < X < x + dx | X > x) = λ dx
so the failure rate is constant.
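The memoryless property can be seen empirically (an illustrative Python sketch; the values of λ, t, x are assumptions, not from the slides): among samples that survive past t, the fraction surviving another x units matches e^(−λx) = P(X > x).

```python
import random, math

random.seed(3)
lam, t, x = 1.5, 0.8, 0.5  # illustrative values (not from the slides)
samples = [random.expovariate(lam) for _ in range(400_000)]

survivors = [s for s in samples if s > t]  # condition on X > t
frac = sum(1 for s in survivors if s > t + x) / len(survivors)
# Memorylessness: P(X > t + x | X > t) = e^(-lam*x) = P(X > x)
target = math.exp(-lam * x)
```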

Expectation of rv X 35/47
Expected value of a discrete rv X:
E(X) = Σ_i x_i P(X = x_i)
Expected value of a continuous rv X:
E(X) = ∫ x f(x) dx (= ∫ x dF(x))
Examples:
- Bernoulli: E(X) = p
- Binomial: E(X) = np
- Geometric: E(X) = 1/p
- Uniform[0, 1]: E(X) = 1/2
- Exponential: E(X) = 1/λ
- Normal: E(X) = µ

Expectation of g(X) 36/47
E(g(X)) = ∫ g(x) f_X(x) dx
Example: X is exponential, g(X) = X²
E(g(X)) = E(X²) = ∫_0^∞ x² λ e^(−λx) dx = 2/λ²
Property: Independent trials X_1, X_2, ..., then with probability 1
(X_1 + X_2 + ··· + X_n)/n → E(X)
This is the (intuitively appealing) Strong Law of Large Numbers.
Property: E(aX + b) = aE(X) + b
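The Strong Law of Large Numbers and the identity E(X²) = 2/λ² can be combined in one check (an illustrative Python sketch; λ = 2 is an assumption, not from the slides): the sample mean of X² converges to 2/λ² = 0.5.

```python
import random

random.seed(4)
lam = 2.0  # illustrative rate (not from the slides)
n = 500_000
# Sample mean of g(X) = X^2 for X exponential with rate lam
mean_sq = sum(random.expovariate(lam) ** 2 for _ in range(n)) / n
# By the SLLN this converges to E(X^2) = 2 / lam^2 = 0.5
```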

Joint distributions of rv's 37/47
For rv's X, Y the joint distribution is
F(a, b) = P(X ≤ a, Y ≤ b)
and for a continuous distribution with density f(x, y)
F(a, b) = ∫_{−∞}^{a} ∫_{−∞}^{b} f(x, y) dy dx
Marginal distribution of X:
F_X(a) = P(X ≤ a) = ∫_{−∞}^{a} ∫_{−∞}^{∞} f(x, y) dy dx = ∫_{−∞}^{a} f_X(x) dx
where f_X(x) = ∫_{−∞}^{∞} f(x, y) dy

Joint distributions of rv's 38/47
Property: Linearity
E(aX + bY) = aE(X) + bE(Y)
Example: X_i Bernoulli, X = X_1 + ··· + X_n (Binomial)
E(X) = E(X_1 + ··· + X_n) = E(X_1) + ··· + E(X_n) = np

Independent rv's 39/47
X and Y are independent if
P(X ≤ a, Y ≤ b) = P(X ≤ a)P(Y ≤ b)
or F(a, b) = F_X(a) F_Y(b), or f(a, b) = f_X(a) f_Y(b)
Property: If X and Y are independent, then
E(XY) = ∫∫ xy f_X(x) f_Y(y) dx dy = E(X)E(Y)

Variance 40/47
Variance of X (a measure of variability):
var(X) = E[(X − E(X))²] = E(X²) − (E(X))²
Property: If X and Y are independent, then
var(X + Y) = var(X) + var(Y), var(aX) = a² var(X)
Definition: If X_1, X_2, ..., X_n are independent, then the sample mean is
X̄ = (Σ_{i=1}^n X_i)/n
with E(X̄) = E(X), var(X̄) = var(X)/n
Examples:
- Bernoulli: var(X) = p(1 − p)
- Binomial: var(X) = np(1 − p)
- Exponential: var(X) = 1/λ²
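The scaling var(X̄) = var(X)/n can be checked by simulation (an illustrative Python sketch; the Bernoulli parameter p and sample size n are assumptions, not from the slides):

```python
import random
from statistics import pvariance

random.seed(5)
p, n = 0.4, 25  # illustrative Bernoulli parameter and sample size
means = []
for _ in range(40_000):
    sample = [1 if random.random() < p else 0 for _ in range(n)]
    means.append(sum(sample) / n)  # one sample mean X-bar

var_of_mean = pvariance(means)
# Theory: var(X-bar) = var(X)/n = p(1-p)/n = 0.4*0.6/25 = 0.0096
```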

Conditional expectation 41/47
Conditional probability:
P(E|F) = P(EF)/P(F)
X, Y discrete rv's with p(x, y) = P(X = x, Y = y), p_X(x) = P(X = x), p_Y(y) = P(Y = y).
Then the conditional probability distribution of X given Y = y is
p_{X|Y}(x|y) = P(X = x | Y = y) = p(x, y)/p_Y(y)
Conditional expectation:
E(X | Y = y) = Σ_x x p_{X|Y}(x|y)

Conditional expectation 42/47
If X and Y are independent, then
p_{X|Y}(x|y) = P(X = x) = p_X(x), E(X | Y = y) = E(X)
Example: X_1, X_2 binomial with parameters n_1, p and n_2, p, and independent:
P(X_1 = k | X_1 + X_2 = m) = C(n_1, k) C(n_2, m − k) / C(n_1 + n_2, m)
which is the Hypergeometric distribution.
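The hypergeometric identity can be verified exactly with rational arithmetic (an illustrative Python sketch; the parameters n_1, n_2, p, m are assumptions, not from the slides). Note that p cancels, as the formula predicts:

```python
from fractions import Fraction
from math import comb

n1, n2, p, m = 5, 7, Fraction(1, 3), 4  # illustrative parameters

def binom_pmf(n, p, k):
    """Exact binomial pmf using Fractions."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

ks = range(max(0, m - n2), min(n1, m) + 1)
# P(X1 = k | X1 + X2 = m) from the joint binomial probabilities
denom = sum(binom_pmf(n1, p, k) * binom_pmf(n2, p, m - k) for k in ks)
cond = {k: binom_pmf(n1, p, k) * binom_pmf(n2, p, m - k) / denom for k in ks}
# Hypergeometric pmf C(n1,k) C(n2,m-k) / C(n1+n2,m), independent of p
hyper = {k: Fraction(comb(n1, k) * comb(n2, m - k), comb(n1 + n2, m)) for k in ks}
# cond == hyper
```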

Conditional expectation 43/47
X, Y continuous rv's with f(x, y), f_X(x), f_Y(y).
Then the conditional density of X given Y = y is
f_{X|Y}(x|y) = f(x, y)/f_Y(y)
Note:
f_{X|Y}(x|y) dx = P(x < X ≤ x + dx | y < Y ≤ y + dy) = f(x, y) dx dy / (f_Y(y) dy)
Conditional expectation:
E(X | Y = y) = ∫ x f_{X|Y}(x|y) dx

Conditional expectation 44/47
Example: f(x, y) = (1/2) y e^(−xy) for 0 < x < ∞, 0 < y < 2, and f(x, y) = 0 otherwise.
What is E(e^(X/2) | Y = 1)?
f_Y(y) = ∫_0^∞ (1/2) y e^(−xy) dx = 1/2
so
f_{X|Y}(x|y) = (1/2) y e^(−xy) / (1/2) = y e^(−yx)
E(e^(X/2) | Y = 1) = ∫_0^∞ e^(x/2) e^(−x) dx = 2
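The final integral can be confirmed numerically (an illustrative Python sketch, not part of the original slides) by a simple Riemann sum over a truncated range:

```python
import math

# Riemann-sum check of E(e^(X/2) | Y = 1) = integral_0^inf e^(x/2) e^(-x) dx
dx = 1e-4
upper = 60.0  # the integrand e^(-x/2) is negligible beyond this point
integral = sum(math.exp(x / 2) * math.exp(-x) * dx
               for x in (i * dx for i in range(int(upper / dx))))
# integral should be close to 2
```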

Conditional expectation 45/47
Note: If X, Y are independent, then E(X | Y = y) = E(X).
Note: If E(X | Y = y) is known, then
E(X) = ∫ E(X | Y = y) f_Y(y) dy
or, if Y is discrete,
E(X) = Σ_y E(X | Y = y) P(Y = y)

Conditional expectation 46/47
Example: Total injuries per year
Y = Σ_{i=1}^{N} X_i
where N = #accidents per year, E(N) = 4, X_i = #workers injured in accident i, E(X_i) = 2, and X_i, N are all independent. Then
E(Y) = E(Σ_{i=1}^{N} X_i)
     = Σ_{n=0}^{∞} E(Σ_{i=1}^{N} X_i | N = n) P(N = n)
     = Σ_{n=0}^{∞} E(Σ_{i=1}^{n} X_i) P(N = n)
     = Σ_{n=0}^{∞} n E(X) P(N = n)
     = E(X)E(N) = 2 · 4 = 8
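The identity E(Y) = E(X)E(N) only needs the means, so any distributions with the right means will do. The sketch below (illustrative Python, not part of the original slides) assumes N is Poisson with mean 4 and X_i is uniform on {1, 2, 3} with mean 2; both choices are mine, the slides fix only the expectations.

```python
import random

random.seed(6)

def poisson(lam):
    """Sample a Poisson rv by counting exponential interarrivals in [0, 1]."""
    t, k = random.expovariate(lam), 0
    while t <= 1.0:
        k += 1
        t += random.expovariate(lam)
    return k

runs, total = 200_000, 0
for _ in range(runs):
    n_accidents = poisson(4.0)  # assumed distribution with E(N) = 4
    total += sum(random.choice((1, 2, 3)) for _ in range(n_accidents))  # E(X_i) = 2

avg_injuries = total / runs
# Should be close to E(X) E(N) = 2 * 4 = 8
```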

Conditional expectation 47/47
Example: Getting a seat in a train
Y is the distance to the closest door, Y is Uniform[0, 2].
If the distance is Y = y, the probability of getting a seat is 1 − √(y/2).
What is the probability of getting a seat?
(It is NOT equal to 1 − √(E(Y)/2) = 1 − √(1/2) ≈ 0.293.)
Let X = 1 if success (getting a seat), and X = 0 otherwise:
P(X = 1) = ∫_0^2 P(X = 1 | Y = y) (1/2) dy = ∫_0^2 (1 − √(y/2)) (1/2) dy = 1/3
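Reading the slide's seat probability as P(seat | Y = y) = 1 − √(y/2) (consistent with the quoted wrong answer 0.293 ≈ 1 − 1/√2), the answer 1/3 can be reproduced by simulating the two-stage experiment (an illustrative Python sketch, not part of the original slides):

```python
import random, math

random.seed(7)
runs, seats = 400_000, 0
for _ in range(runs):
    y = random.uniform(0.0, 2.0)  # distance to the closest door, Uniform[0, 2]
    # assumed conditional seat probability: 1 - sqrt(y/2)
    if random.random() < 1.0 - math.sqrt(y / 2.0):
        seats += 1

p_seat = seats / runs
# Conditioning on Y and integrating gives P(X = 1) = 1/3, not 0.293
```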