Recursive Estimation


Raffaello D'Andrea, Spring 2018

Problem Set: Bayes' Theorem and Bayesian Tracking

Last updated: March 2018

Notes:

Notation: Unless otherwise noted, x, y, and z denote random variables, p_x denotes the probability density function of x, and p_{x|y} denotes the conditional probability density function of x conditioned on y. Note that both shorthand (as introduced in Lectures 1 and 2) and longhand notation are used. The expected value of x and its variance are denoted by E[x] and Var[x], and Pr(Z) denotes the probability that the event Z occurs. A normally distributed random variable x with mean µ and variance σ² is denoted by x ∼ N(µ, σ²).

Please report any errors found in this problem set to the teaching assistants (hofermat@ethz.ch or csferrazza@ethz.ch).

Problem Set

Problem 1
Mr. Jones has devised a gambling system for winning at roulette. When he bets, he bets on red, and places a bet only when the ten previous spins of the roulette have landed on a black number. He reasons that his chance of winning is quite large, since the probability of eleven consecutive spins resulting in black is quite small. What do you think of this system?

Problem 2
Consider two boxes, one containing one black and one white marble, the other two black and one white marble. A box is selected at random with equal probability and a marble is drawn at random with equal probability from the selected box. What is the probability that the marble is black?

Problem 3
In Problem 2, what is the probability that the first box was the one selected, given that the marble is white?

Problem 4
Urn 1 contains two white balls and one black ball, while urn 2 contains one white ball and five black balls. One ball is drawn at random with equal probability from urn 1 and placed in urn 2. A ball is then drawn from urn 2 at random with equal probability. It happens to be white. What is the probability that the transferred ball was white?

Problem 5
Stores A, B and C have 50, 75, and 100 employees, and respectively 50, 60 and 70 percent of these are women. Resignations are equally likely among all employees, regardless of sex. One employee resigns, and this is a woman. What is the probability that she works in store C?

Problem 6
a) A gambler has in his pocket a fair coin and a two-headed coin. He selects one of the coins at random with equal probability, and when he flips it, it shows heads. What is the probability that it is the fair coin?
b) Suppose that he flips the same coin a second time and again it shows heads. What is now the probability that it is the fair coin?
c) Suppose that he flips the same coin a third time and it shows tails. What is now the probability that it is the fair coin?

Problem 7
Urn 1 has five white and seven black balls. Urn 2 has three white and twelve black balls. An urn is selected at random with equal probability, and a ball is drawn at random with equal probability from that urn. Suppose that a white ball is drawn. What is the probability that the second urn was selected?

Problem 8
An urn contains b black balls and r red balls. One of the balls is drawn at random with equal probability, but when it is put back into the urn, c additional balls of the same color are put in with it. Now suppose that we draw another ball at random with equal probability. Show that the probability that the first ball drawn was black, given that the second ball drawn was red, is b/(b + r + c).

Problem 9
Three prisoners are informed by their jailer that one of them has been chosen at random with equal probability to be executed, and the other two are to be freed. Prisoner A asks the jailer to tell him privately which of his fellow prisoners will be set free, claiming that there would be no harm in divulging this information, since he already knows that at least one will go free. The jailer refuses to answer this question, pointing out that if A knew which of his fellows were to be set free, then his own probability of being executed would rise from 1/3 to 1/2, since he would then be one of two prisoners. What do you think of the jailer's reasoning?

Problem 10
Let x and y be independent random variables. Let g(·) and h(·) be arbitrary functions of x and y, respectively. Define the random variables v = g(x) and w = h(y). Prove that v and w are independent. That is, functions of independent random variables are independent.

Problem 11
Let x be a continuous, uniformly distributed random variable with x ∈ X = [−5, 5]. Let

z₁ = x + n₁,  z₂ = x + n₂,

where n₁ and n₂ are continuous random variables with probability density functions

p_{n1}(n̄₁) = α₁ (1 + n̄₁) for −1 ≤ n̄₁ ≤ 0,  α₁ (1 − n̄₁) for 0 ≤ n̄₁ ≤ 1,  0 otherwise,

p_{n2}(n̄₂) = α₂ (1 + n̄₂/2) for −2 ≤ n̄₂ ≤ 0,  α₂ (1 − n̄₂/2) for 0 ≤ n̄₂ ≤ 2,  0 otherwise,

where α₁ and α₂ are normalization constants. Assume that the random variables x, n₁, n₂ are independent, i.e. p_{x,n1,n2}(x̄, n̄₁, n̄₂) = p_x(x̄) p_{n1}(n̄₁) p_{n2}(n̄₂).

a) Calculate α₁ and α₂.
b) Use the change of variables formula from the lecture to show that p_{zi|x}(z̄_i|x̄) = p_{ni}(z̄_i − x̄).
c) Calculate p_{x|z1,z2}(x̄ | 0, 0).
d) Calculate p_{x|z1,z2}(x̄ | 0, 1).
e) Calculate p_{x|z1,z2}(x̄ | 0, 3).

Problem 12
Consider the following estimation problem: an object B moves randomly on a circle with radius 1. The distance to the object can be measured from a given observation point P. The goal is to estimate the location of the object, see Figure 1.

[Figure 1: Illustration of the estimation problem. The object B lies on the unit circle at angle θ; the observation point P lies on the x-axis at distance L from the origin, and the sensor measures the distance from P to B.]

The object B can only move in discrete steps. The object's location at time k is given by x(k) ∈ {0, 1, ..., N − 1}, where θ(k) = 2π x(k)/N. The dynamics are

x(k) = mod(x(k − 1) + v(k), N),  k = 1, 2, ...,

where v(k) = 1 with probability p and v(k) = −1 otherwise. Note that mod(−1, N) = N − 1 and mod(N, N) = 0. The distance sensor measures

z(k) = √((L − cos θ(k))² + (sin θ(k))²) + w(k),

where w(k) represents the sensor error, which is uniformly distributed on [−e, e]. We assume that x(0) is uniformly distributed and that x(0), v(k) and w(k) are independent.

Simulate the object movement and implement a Bayesian tracking algorithm that calculates, for each time step k, the probability density function p_{x(k)|z(1:k)}.

a) Test the following settings and discuss the results: N = 100, x(0) = N/4, e = 0.5,
(i) L, p = 0.5,
(ii) L, p = 0.5,
(iii) L, p = 0.55,
(iv) L = 0. , p = 0.55,
(v) L = 0, p = 0.55.

b) How robust is the algorithm? Set N = 100, x(0) = N/4, e = 0.5, L, p = 0.55 in the simulation, but use slightly different values for p and e in your estimation algorithm, ˆp and ê, respectively. Test the algorithm and explain the result for:
(i) ˆp = 0.45, ê = e,
(ii) ˆp = 0.5, ê = e,
(iii) ˆp = 0.9, ê = e,
(iv) ˆp = p, ê = 0.9,
(v) ˆp = p, ê = 0.45.

Sample solutions

Problem 1
We introduce a discrete random variable x_i representing the outcome of the ith spin, with x_i ∈ {red, black}. We assume that both outcomes are equally likely and ignore the fact that there is a 0 or a 00 on a roulette board. Assuming independence between spins, we have

Pr(x_k, x_{k−1}, x_{k−2}, ..., x_1) = Pr(x_k) Pr(x_{k−1}) ... Pr(x_1).

The probability of eleven consecutive spins resulting in black is

Pr(x_11 = black, x_10 = black, x_9 = black, ..., x_1 = black) = (1/2)^11 ≈ 0.0005.

This value is indeed quite small. However, given that the previous ten spins were black, we calculate

Pr(x_11 = black | x_10 = black, x_9 = black, ..., x_1 = black) = Pr(x_11 = black) (by the independence assumption) = 1/2 = Pr(x_11 = red | x_10 = black, x_9 = black, ..., x_1 = black),

i.e. it is equally likely that ten black spins in a row are followed by a red spin as that they are followed by another black spin (by the independence assumption). Mr. Jones' system is therefore no better than randomly betting on black or red.

Problem 2
We introduce two discrete random variables. Let x ∈ {1, 2} represent which box is chosen (box 1 or 2), with probability p_x(1) = p_x(2) = 1/2. Furthermore, let y ∈ {b, w} represent the color of the drawn marble, where b is a black and w a white marble, with probabilities

p_{y|x}(b|1) = p_{y|x}(w|1) = 1/2,  p_{y|x}(b|2) = 2/3,  p_{y|x}(w|2) = 1/3.

Then, by the total probability theorem, we find

p_y(b) = p_{y|x}(b|1) p_x(1) + p_{y|x}(b|2) p_x(2) = 1/2 · 1/2 + 2/3 · 1/2 = 7/12.

Problem 3
p_{x|y}(1|w) = p_{y|x}(w|1) p_x(1) / p_y(w), where p_y(w) = 1 − p_y(b) = 5/12. Therefore

p_{x|y}(1|w) = (1/2 · 1/2)/(5/12) = 3/5.
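As a quick sanity check on Problems 2 and 3, the analytical answers 7/12 and 3/5 can be compared against a Monte Carlo simulation. The following short Matlab sketch (not part of the original solutions) is one way to do so:

% Hypothetical Monte-Carlo check of Problems 2 and 3:
% Pr(black) = 7/12 and Pr(box 1 | white) = 3/5.
rng(0);                          % fix the seed for reproducibility
M = 1e6;                         % number of simulated draws
box = randi(2, M, 1);            % box 1 or box 2, equally likely
u = rand(M, 1);
% box 1: 1 black, 1 white  ->  Pr(black | box 1) = 1/2
% box 2: 2 black, 1 white  ->  Pr(black | box 2) = 2/3
black = (box == 1 & u < 1/2) | (box == 2 & u < 2/3);
fprintf('Pr(black)         ~ %.4f (exact 7/12 = %.4f)\n', mean(black), 7/12);
fprintf('Pr(box 1 | white) ~ %.4f (exact 3/5 = 0.6)\n', mean(box(~black) == 1));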

Problem 4
Let x ∈ {b, w} represent the color of the ball drawn from urn 1, where p_x(b) = 1/3, p_x(w) = 2/3, and let y ∈ {b, w} be the color of the ball subsequently drawn from urn 2. Considering the different possibilities, we have

p_{y|x}(b|b) = 6/7,  p_{y|x}(w|b) = 1/7,  p_{y|x}(b|w) = 5/7,  p_{y|x}(w|w) = 2/7.

We seek the probability that the transferred ball was white, given that the second ball drawn is white, and calculate

p_{x|y}(w|w) = p_{y|x}(w|w) p_x(w) / p_y(w) = p_{y|x}(w|w) p_x(w) / (p_{y|x}(w|b) p_x(b) + p_{y|x}(w|w) p_x(w)) = (2/7 · 2/3)/(1/7 · 1/3 + 2/7 · 2/3) = 4/5.

Problem 5
We introduce two discrete random variables, x ∈ {A, B, C} and y ∈ {M, F}, where x represents which store an employee works in, and y the sex of the employee. From the problem description, we have

p_x(A) = 50/225 = 2/9,  p_x(B) = 75/225 = 1/3,  p_x(C) = 100/225 = 4/9,

and the probability that an employee is a woman is

p_{y|x}(F|A) = 1/2,  p_{y|x}(F|B) = 3/5,  p_{y|x}(F|C) = 7/10.

We seek the probability that the resigning employee works in store C, given that it is a woman, and calculate

p_{x|y}(C|F) = p_{y|x}(F|C) p_x(C) / p_y(F) = (7/10 · 4/9) / Σ_{i ∈ {A,B,C}} p_{y|x}(F|i) p_x(i) = (7/10 · 4/9)/(1/2 · 2/9 + 3/5 · 1/3 + 7/10 · 4/9) = 1/2.

Problem 6
a) Let x ∈ {F, U} represent whether it is a fair (F) or an unfair (U) coin, with p_x(F) = p_x(U) = 1/2. We introduce y ∈ {h, t} to represent how the toss comes up (heads or tails), with

p_{y|x}(h|F) = p_{y|x}(t|F) = 1/2,  p_{y|x}(h|U) = 1,  p_{y|x}(t|U) = 0.

We seek the probability that the drawn coin is fair, given that the toss result is heads, and calculate

p_{x|y}(F|h) = p_{y|x}(h|F) p_x(F) / p_y(h) = (1/2 · 1/2) / (1/2 · 1/2 [fair coin] + 1 · 1/2 [unfair coin]) = (1/4)/(3/4) = 1/3.

b) Let y₁ represent the result of the first flip and y₂ that of the second flip. We assume conditional independence between flips (conditioned on x), yielding p(ȳ₁, ȳ₂|x̄) = p(ȳ₁|x̄) p(ȳ₂|x̄). We seek the probability that the drawn coin is fair, given that both tosses resulted in heads, and calculate

p_{x|y1,y2}(F|h, h) = p_{y1,y2|x}(h, h|F) p_x(F) / p_{y1,y2}(h, h) = ((1/2)² · 1/2) / ((1/2)² · 1/2 [fair coin] + 1 · 1/2 [unfair coin]) = 1/5.

c) Obviously, the probability then is 1, because the unfair coin cannot show tails. Formally, we show this by introducing y₃ to represent the result of the third flip and computing

p_{x|y1,y2,y3}(F|h, h, t) = p_{y1,y2,y3|x}(h, h, t|F) p_x(F) / p_{y1,y2,y3}(h, h, t) = ((1/2)³ · 1/2) / ((1/2)³ · 1/2 [fair coin] + 0 [unfair coin]) = 1.

Problem 7
We introduce the following two random variables: x ∈ {1, 2} represents which urn is chosen (urn 1 or 2), with probability p_x(1) = p_x(2) = 1/2; y ∈ {b, w} represents the color of the drawn ball, where b is a black and w a white ball, with probabilities

p_{y|x}(b|1) = 7/12,  p_{y|x}(b|2) = 12/15,  p_{y|x}(w|1) = 5/12,  p_{y|x}(w|2) = 3/15.

We seek the probability that the second urn was selected, given that the ball drawn is white, and calculate

p_{x|y}(2|w) = p_{y|x}(w|2) p_x(2) / p_y(w) = (3/15 · 1/2)/(5/12 · 1/2 + 3/15 · 1/2) = 12/37.
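The three parts of Problem 6 can also be computed as a recursive measurement update, applying Bayes' rule once per flip with the previous posterior as the new prior. A minimal Matlab sketch of this recursion (not part of the original solutions) is:

% Hypothetical numerical check of Problem 6: recursively update Pr(fair).
pF = 1/2;                        % prior Pr(fair)
flips = {'h', 'h', 't'};         % observed flips from parts a), b), c)
for i = 1:numel(flips)
    if flips{i} == 'h'
        lF = 1/2; lU = 1;        % likelihood of heads for fair / two-headed coin
    else
        lF = 1/2; lU = 0;        % likelihood of tails
    end
    pF = lF*pF / (lF*pF + lU*(1 - pF));   % Bayes' rule (measurement update)
    fprintf('after flip %d (%s): Pr(fair) = %.4f\n', i, flips{i}, pF);
end
% prints 1/3, 1/5 and 1, matching parts a), b) and c).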

Problem 8
Let x₁ ∈ {B, R} be the color (black or red) of the first ball drawn, with p_{x1}(B) = b/(b+r) and p_{x1}(R) = r/(b+r); and let x₂ ∈ {B, R} be the color of the second ball, with

p_{x2|x1}(B|R) = b/(b+r+c),  p_{x2|x1}(B|B) = (b+c)/(b+r+c),  p_{x2|x1}(R|R) = (c+r)/(b+r+c),  p_{x2|x1}(R|B) = r/(b+r+c).

We seek the probability that the first ball drawn was black, given that the second ball drawn is red, and calculate

p_{x1|x2}(B|R) = p_{x2|x1}(R|B) p_{x1}(B) / p_{x2}(R)
= (r/(b+r+c) · b/(b+r)) / ((c+r)/(b+r+c) · r/(b+r) [first red] + r/(b+r+c) · b/(b+r) [first black])
= b/(b+r+c).

Problem 9
We can approach the solution in two ways.

Descriptive solution: The probability that A is to be executed is 1/3, and there is a chance of 2/3 that one of the others was chosen. If the jailer gives away the name of one of the fellow prisoners who will be set free, prisoner A does not get new information about his own fate, but the probability of the remaining prisoner (B or C) being executed is now 2/3. The probability of A being executed is still 1/3.

Bayesian analysis: Let x represent which prisoner is to be executed, where x ∈ {A, B, C}. We assume that it is a random choice, i.e. p_x(A) = p_x(B) = p_x(C) = 1/3. Now let y ∈ {B, C} be the prisoner name given away by the jailer. We can now write the conditional probabilities:

p(ȳ|x̄) = 0 if x̄ = ȳ (the jailer does not lie),
p(ȳ|x̄) = 1/2 if x̄ = A (A is to be executed; the jailer mentions B and C with equal probability),
p(ȳ|x̄) = 1 if x̄ ≠ A and x̄ ≠ ȳ (the jailer is forced to give the name of the other prisoner to be set free).

You could also do this with a table:

ȳ  x̄  p(ȳ|x̄)
B  A  1/2
B  B  0
B  C  1
C  A  1/2
C  B  1
C  C  0

To answer the question, we have to compare p_{x|y}(A|ȳ), ȳ ∈ {B, C}, with p_x(A):

p_{x|y}(A|ȳ) = p_{y|x}(ȳ|A) p_x(A) / p_y(ȳ) = (p_{y|x}(ȳ|A) · 1/3) / Σ_{k ∈ {A,B,C}} p_{y|x}(ȳ|k) p_x(k)
= (1/2 · 1/3) / (1/3 (p_{y|x}(ȳ|A) + p_{y|x}(ȳ|ȳ) + p_{y|x}(ȳ|not ȳ)))
= (1/6) / (1/3 (1/2 + 0 + 1)) = 1/3,

where (not ȳ) = C if ȳ = B and (not ȳ) = B if ȳ = C. The value of the posterior probability is the same as the prior one, p_x(A) = 1/3. The jailer is wrong: prisoner A gets no additional information from the jailer about his own fate! See also Wikipedia: Three prisoners problem, Monty Hall problem.

Problem 10
Consider the joint cumulative distribution

F_{v,w}(v̄, w̄) = Pr((v ≤ v̄) and (w ≤ w̄)) = Pr((g(x) ≤ v̄) and (h(y) ≤ w̄)).

We define the sets A_v̄ and A_w̄ as below:

A_v̄ = {x̄ ∈ X : g(x̄) ≤ v̄},  A_w̄ = {ȳ ∈ Ȳ : h(ȳ) ≤ w̄}.

Now we can deduce

F_{v,w}(v̄, w̄) = Pr((x ∈ A_v̄) and (y ∈ A_w̄)) for all v̄, w̄
= Pr(x ∈ A_v̄) Pr(y ∈ A_w̄) (by the independence assumption)
= Pr(g(x) ≤ v̄) Pr(h(y) ≤ w̄)
= Pr(v ≤ v̄) Pr(w ≤ w̄)
= F_v(v̄) F_w(w̄).

Therefore v and w are independent.

Problem 11
a) By definition of a PDF, the integrals of the PDFs p_{ni}, i = 1, 2, must evaluate to one, which we use to find the α_i. We integrate the probability density functions; see the following figure:

[Figure: the two triangular PDFs, p_{n1}(n̄₁) with support [−1, 1] and peak α₁, and p_{n2}(n̄₂) with support [−2, 2] and peak α₂.]

For n, we obtain p n ( n ) d n α 0 ( + n ) d n + ( α + ) Therefore α. For n, we get p n ( n ) d n α Therefore α. 0 0 α. ( + ) n d n + α ( + ) α. ( n ) d n 0 ( ) n d n b) The goal is to calculate p(z i x) from z i g(n i, x) : x + n i and the given PDFs p ni, i,. We apply the change of variables formula for CRVs with conditional PDFs: p(z i x) p(n i x) g n i (n i, x) (just think of x as a parameter that parametrizes the PDFs). The proof for this formula is analogous to the proof in the lecture notes for a change of variables for CRVs with unconditional PDFs. We find that g n i (n i, x), for all n i, x and, therefore, the fraction in the change of variables formula is well-defined for all values of n i and x. Due to the independence of n i and x, p(n i x) p(n i ): p(z i x) p(n i x) g n i (n i, x) p(n i). Substituting n i z i x, we finally obtain p zi x( z i x) p ni ( n i ) p ni ( z i x). c) We use Bayes theorem to calculate p(x z, z ) p(z, z x) p(x). () p(z, z ) First, we calculate the prior PDF of the CRV x, which is uniformly distributed: { p(x) 0 for 5 x 5. 0 otherwise In b), we showed by change of variables that p zi x( z i x) p ni ( z i x). Since n, n are independent, it follows that z, z are conditionally independent given x. Therefore, we can rewrite the measurement likelihood as p(z, z x) p(z x) p(z x). 0

We calculate the individual conditional PDFs p_{zi|x}:

p_{z1|x}(z̄₁|x̄) = p_{n1}(z̄₁ − x̄) = α₁ (1 − z̄₁ + x̄) for 0 ≤ z̄₁ − x̄ ≤ 1,  α₁ (1 + z̄₁ − x̄) for −1 ≤ z̄₁ − x̄ ≤ 0,  0 otherwise.

The conditional PDF of z₁ given x is illustrated in the following figure:

[Figure: triangular PDF p_{z1|x}(z̄₁|x̄), centered at x̄ with support [x̄ − 1, x̄ + 1] and peak α₁.]

Analogously,

p_{z2|x}(z̄₂|x̄) = p_{n2}(z̄₂ − x̄) = α₂ (1 − (z̄₂ − x̄)/2) for 0 ≤ z̄₂ − x̄ ≤ 2,  α₂ (1 + (z̄₂ − x̄)/2) for −2 ≤ z̄₂ − x̄ ≤ 0,  0 otherwise.

Let num(x̄) be the numerator of the Bayes' rule fraction (1). Given the measurements z̄₁ = 0, z̄₂ = 0, we find

num(x̄) = (1/10) p_{z1|x}(0|x̄) p_{z2|x}(0|x̄).

We consider four different intervals of x̄: [−5, −1], [−1, 0], [0, 1] and [1, 5]. Evaluating num(x̄) for these intervals results in:

for x̄ ∈ [−5, −1] or x̄ ∈ [1, 5]:  num(x̄) = 0,
for x̄ ∈ [−1, 0]:  num(x̄) = (1/10) α₁ (1 + x̄) α₂ (1 + x̄/2) = (1/20) (1 + x̄) (1 + x̄/2),
for x̄ ∈ [0, 1]:  num(x̄) = (1/10) α₁ (1 − x̄) α₂ (1 − x̄/2) = (1/20) (1 − x̄) (1 − x̄/2).

Finally, we need to calculate the denominator of the Bayes' rule fraction (1), the normalization constant, which can be calculated using the total probability theorem:

p_{z1,z2}(z̄₁, z̄₂) = ∫ p_{z1,z2|x}(z̄₁, z̄₂|x̄) p_x(x̄) dx̄ = ∫ num(x̄) dx̄.

Evaluated at z̄₁ = z̄₂ = 0, we find

p_{z1,z2}(0, 0) = ∫_{−1}^{0} (1/20) (1 + x̄) (1 + x̄/2) dx̄ + ∫_{0}^{1} (1/20) (1 − x̄) (1 − x̄/2) dx̄ = (1/20) (5/12 + 5/12) = 1/24.

Finally, we obtain the posterior PDF

p_{x|z1,z2}(x̄|0, 0) = num(x̄) / p_{z1,z2}(0, 0) = 24 num(x̄)
= 0 for −5 ≤ x̄ ≤ −1 and 1 ≤ x̄ ≤ 5,
= (6/5) (1 + x̄) (1 + x̄/2) for −1 ≤ x̄ ≤ 0,
= (6/5) (1 − x̄) (1 − x̄/2) for 0 ≤ x̄ ≤ 1,

which is illustrated in the following figure:

[Figure: posterior PDF p_{x|z1,z2}(x̄|0, 0), supported on [−1, 1], with maximum 6/5 at x̄ = 0.]

The posterior PDF is symmetric about x̄ = 0 with a maximum at x̄ = 0: both sensors agree.

d) Similar to the above. Given z̄₁ = 0, z̄₂ = 1, we write the numerator as

num(x̄) = (1/10) p_{z1|x}(0|x̄) p_{z2|x}(1|x̄).   (2)

Again, we consider the same four intervals:

for x̄ ∈ [−5, −1] or x̄ ∈ [1, 5]:  num(x̄) = 0,
for x̄ ∈ [−1, 0]:  num(x̄) = (1/10) α₁ (1 + x̄) α₂ (1 + x̄)/2 = (1 + x̄)²/40,
for x̄ ∈ [0, 1]:  num(x̄) = (1/10) α₁ (1 − x̄) α₂ (1 + x̄)/2 = (1 − x̄²)/40.

Normalizing yields

p_{z1,z2}(0, 1) = ∫ num(x̄) dx̄ = 1/120 + 2/120 = 1/40.

The solution is therefore

p_{x|z1,z2}(x̄|0, 1) = 0 for −5 ≤ x̄ ≤ −1 and 1 ≤ x̄ ≤ 5,
= (1 + x̄)² for −1 ≤ x̄ ≤ 0,
= 1 − x̄² for 0 ≤ x̄ ≤ 1.

The posterior PDF is depicted in the following figure:

[Figure: posterior PDF p_{x|z1,z2}(x̄|0, 1), supported on [−1, 1], with maximum 1 at x̄ = 0.]

The probability values are higher for positive x̄ values because of the measurement z̄₂ = 1.

e) We start in the same fashion: given z̄₁ = 0 and z̄₂ = 3,

num(x̄) = (1/10) p_{z1|x}(0|x̄) p_{z2|x}(3|x̄).

However, the intervals of positive probability of p_{z1|x}(0|x̄) and p_{z2|x}(3|x̄) do not overlap, i.e. num(x̄) = 0 for all x̄ ∈ [−5, 5]. In other words, given our noise model for n₁ and n₂, there is no chance to measure z̄₁ = 0 and z̄₂ = 3. Therefore, p_{x|z1,z2}(x̄|0, 3) is not defined.

Problem 12
The Matlab code is available on the class webpage. We notice the following:

a) (i) The PDF remains bimodal for all times. The symmetry in measurements and motion makes it impossible to differentiate between the upper and lower half circle.
(ii) The PDF is initially bimodal, but the bias in the particle motion causes one of the two modes to have higher values after a few time steps.
(iii) We note that this sensor placement also works. Note that the resulting PDF is not as sharp because more positions explain the measurements.
(iv) The resulting PDF is uniform for all k = 1, 2, .... With the sensor in the center, the distance measurement provides no information on the particle position (all particle positions have the same distance to the sensor).

b) (i) The PDF has higher values at the position mirrored from the actual position, because the estimator model uses a value ˆp that biases the particle motion in the opposite direction.
(ii) The PDF remains bimodal, even though the particle motion is biased in one direction. This is caused by the estimator assuming that the particle motion is unbiased, which makes both halves of the circle equally likely.
(iii) The incorrect assumption on the motion probability ˆp causes the estimated PDF to drift away from the real particle position until it is forced back by measurements that cannot be explained otherwise.
(iv) The resulting PDF is not as sharp because the larger value of ê implies that more states are possible for the given measurements.
(v) The Bayesian tracking algorithm fails after a few steps (division by zero in the measurement update step). This is caused by a measurement that is impossible to explain with the current particle position PDF and the measurement error distribution defined by ê. Note that ê underestimates the possible measurement error, which leads to measurements whose true error (uniformly distributed according to e) is larger than the algorithm assumes possible.
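The official Matlab code is on the class webpage; the following is only a minimal sketch of one possible implementation of the simulation together with the prediction and measurement-update recursion. The values of L, p and the number of time steps K below are illustrative assumptions, not the settings from the problem statement; ˆp and ê can be set different from p and e to reproduce the effects discussed in part b).

% Minimal sketch (not the official class code) of Bayesian tracking for Problem 12.
N = 100; e = 0.5; L = 2; p = 0.55;         % assumed example parameters
phat = p; ehat = e;                        % model used by the estimator
K = 200;                                   % assumed number of time steps
theta = @(x) 2*pi*x/N;
dist  = @(x) sqrt((L - cos(theta(x))).^2 + sin(theta(x)).^2);

x = round(N/4);                            % true initial state x(0) = N/4
prior = ones(1, N)/N;                      % p_{x(0)} is uniform on {0,...,N-1}
states = 0:N-1;
for k = 1:K
    % --- simulate the object and the measurement ---
    v = 2*(rand < p) - 1;                  % v(k) = +1 w.p. p, -1 otherwise
    x = mod(x + v, N);
    z = dist(x) + (2*rand - 1)*e;          % uniform sensor noise on [-e, e]

    % --- prediction step: propagate the PDF through the dynamics ---
    pred = phat*circshift(prior, [0, 1]) + (1 - phat)*circshift(prior, [0, -1]);

    % --- measurement update: multiply by the likelihood and normalize ---
    lik = (abs(z - dist(states)) <= ehat) / (2*ehat);
    post = pred .* lik;
    if sum(post) == 0
        error('Measurement cannot be explained by the model (cf. part b(v)).');
    end
    prior = post / sum(post);
end
[~, imax] = max(prior);
fprintf('true x(K) = %d, MAP estimate = %d\n', x, states(imax));

Running this sketch with ê < e eventually triggers the error branch, which is exactly the division-by-zero failure mode described in b) (v).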