M5A42 APPLIED STOCHASTIC PROCESSES
1 M5A42 APPLIED STOCHASTIC PROCESSES Professor G.A. Pavliotis Department of Mathematics Imperial College London, UK LECTURE 1 06/10/2016
2 Lectures: Thursdays 14:00-15:00, Huxley 140, Fridays 10:00-12:00, Huxley 130. Office Hours: Thursdays 15:00-16:00, Fridays 13:00-14:00 or by appointment. Course webpage: Text: Lecture notes, available from the course webpage. Also, recommended reading from various textbooks.
3 This is an introductory course on stochastic processes and their applications, aimed towards students in applied mathematics. The emphasis of the course will be on the presentation of analytical tools that are useful in the study of stochastic models that appear in various problems in applied mathematics, physics, chemistry and biology. Numerical methods for stochastic processes are presented in the course M5A44 Computational Stochastic Processes that is offered in Term 2. This is a year-long introductory graduate level course on stochastic processes: the analytical techniques that will be presented in Term I (M5A42) will provide the necessary theoretical background for the development of the computational techniques for studying stochastic processes that will be developed in Term II (M5A44).
4 Prerequisites Elementary probability theory. Ordinary and partial differential equations. Linear algebra. Some familiarity with analysis (measure theory, linear functional analysis) is desirable but not necessary. Course Objectives By the end of the course you are expected to be familiar with the basic concepts of the theory of stochastic processes in continuous time and to be able to use various analytical techniques to study stochastic models that appear in applications.
5 Course assessment Final exam (May/June 2017). There will be no assessed coursework. Problem sheets/Feedback: problem sheets and solutions are already available from the course webpage. Problem classes/office hours: please do contact me or come to my office during office hours. Student Evaluation Forms.
6 Probability theory and random variables (2 lectures). Basic definitions, probability spaces, probability measures etc. Random variables, conditional expectation, characteristic functions, limit theorems. Stochastic processes (6 lectures). Basic definitions. Brownian motion. Stationary processes. Other examples of stationary processes. The Karhunen-Loève expansion. Markov processes (4 lectures). Introduction and examples. Basic definitions. The Chapman-Kolmogorov equation. The generator of a Markov process and its adjoint. Ergodic and stationary Markov processes. Diffusion processes (4 lectures). Basic definitions and examples. The backward and forward (Fokker-Planck) Kolmogorov equations. Connection between diffusion processes and stochastic differential equations.
7 Stochastic Differential Equations (6 lectures). Basic properties of SDEs. Itô's formula. Linear SDEs. SDEs with multiplicative noise. The Fokker-Planck equation (6 lectures). Basic properties of the FP equation. Examples of diffusion processes and of the FP equation. The Ornstein-Uhlenbeck process. Gradient flows and eigenfunction expansions. Exit problems for diffusion processes. The mean first passage time. One dimensional examples. Escape from a potential well. Stochastic resonance.
8 Lecture notes will be provided for all the material that we will cover in this course. The notes will be available from the course webpage. The notes are based on my book Stochastic processes and applications: diffusion processes, the Fokker-Planck and Langevin equations. It is available from the Central Library, PAV. The material relevant for this course will be available from the course webpage. There are many excellent textbooks/review articles on applied stochastic processes, at a level and style similar to that of this course. Standard textbooks that cover the material on probability theory, Markov chains and stochastic processes are: Grimmett and Stirzaker: Probability and Random Processes. Karlin and Taylor: A First Course in Stochastic Processes. Lawler: Introduction to Stochastic Processes. Resnick: Adventures in Stochastic Processes.
9 Books on stochastic processes with a view towards applications, mostly to physics, are: Horsthemke and Lefever: Noise induced transitions. Risken: The Fokker-Planck equation. Gardiner: Handbook of stochastic methods. van Kampen: Stochastic processes in physics and chemistry. Mazo: Brownian motion: fluctuations, dynamics and applications. Chorin and Hald: Stochastic tools for mathematics and science. Gillespie: Markov Processes.
10 The rigorous mathematical theory of probability and stochastic processes is presented in Koralov and Sinai: Theory of probability and random processes. Karatzas and Shreve: Brownian motion and stochastic calculus. Revuz and Yor: Continuous martingales and Brownian motion. Stroock: Probability theory, an analytic view. Books on stochastic differential equations and their numerical solution are Øksendal: Stochastic differential equations. Kloeden and Platen: Numerical Solution of Stochastic Differential Equations. An excellent book on the theory and the applications of stochastic processes is Bhattacharya and Waymire: Stochastic processes and applications.
11 A stochastic process is used to model systems that evolve in time and whose laws of evolution are probabilistic in nature. The state of the system evolves in time and can be described through a state variable x(t). The evolution of the state of the system depends on the outcome of an experiment. We can write x = x(t,ω), where ω denotes the outcome of the experiment. Examples: The random walk in one dimension. Brownian motion. The exchange rate between the British sterling and the US dollar. Photon emission. The spread of the SARS epidemic.
12 The One-Dimensional Random Walk We let time be discrete, i.e. t = 0, 1, .... Consider the following stochastic process S_n: S_0 = 0; at each time step it moves to ±1 with equal probability 1/2. In other words, at each time step we flip a fair coin. If the outcome is heads, we move one unit to the right. If the outcome is tails, we move one unit to the left. Alternatively, we can think of the random walk as a sum of independent random variables:
$$S_n = \sum_{j=1}^{n} X_j,$$
where X_j ∈ {−1, 1} with P(X_j = ±1) = 1/2.
13 We can simulate the random walk on a computer: We need a (pseudo)random number generator to generate n independent random variables which are uniformly distributed in the interval [0,1]. If the value of the random variable is less than 1/2 then the particle moves to the left, otherwise it moves to the right. We then take the sum of all these random moves. The sequence {S_n}_{n=1}^{N} indexed by the discrete time T = {1, 2, ..., N} is the path of the random walk. We use a linear interpolation (i.e. connect the points {n, S_n} by straight lines) to generate a continuous path.
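The recipe above can be sketched in a few lines. Python is used here purely for illustration (numerical methods are the subject of M5A44; the function name is my own choice, not from the notes):

```python
import random

def random_walk_path(n_steps, rng):
    """One path of the simple symmetric random walk, with S_0 = 0."""
    path = [0]
    for _ in range(n_steps):
        # Uniform draw in [0, 1): below 1/2 means a step left, otherwise right.
        step = -1 if rng.random() < 0.5 else 1
        path.append(path[-1] + step)
    return path

rng = random.Random(0)
path = random_walk_path(50, rng)  # the points (n, S_n) can then be joined by straight lines
```

Each call with a fresh random stream produces a different path, which is exactly the point made on the next slides.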
14 Figure: Three paths of a 50-step random walk (N = 50).
15 Figure: Three paths of a 1000-step random walk (N = 1000).
16 Every path of the random walk is different: it depends on the outcome of a sequence of independent random experiments. We can compute statistics by generating a large number of paths and computing averages. For example, E(S_n) = 0, E(S_n^2) = n. The paths of the random walk (without the linear interpolation) are not continuous: the random walk has a jump of size 1 at each time step. This is an example of a discrete time, discrete space stochastic process. The random walk is a time-homogeneous (the probabilistic law of evolution is independent of time) Markov (the future depends only on the present and not on the past) process. If we take a large number of steps, the random walk starts looking like a continuous time process with continuous paths.
17 Consider the sequence of continuous time stochastic processes
$$Z_t^n := \frac{1}{\sqrt{n}} S_{[nt]}.$$
In the limit as n → ∞, the sequence {Z_t^n} converges (in some appropriate sense) to a Brownian motion with diffusion coefficient
$$D = \frac{(\Delta x)^2}{2\,\Delta t} = \frac{1}{2}.$$
18 Figure: Sample Brownian paths (five individual paths and the mean of 1000 paths).
19 Brownian motion W(t) is a continuous time stochastic process with continuous paths that starts at 0 (W(0) = 0) and has independent, normally distributed (Gaussian) increments. We can simulate Brownian motion on a computer using a random number generator that generates normally distributed, independent random variables.
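The simulation just described amounts to summing independent N(0, Δt) increments. A minimal sketch (not part of the notes; step count and function name are illustrative choices):

```python
import random

def brownian_path(T, n, rng):
    """Approximate a Brownian path on [0, T]: W(0) = 0 and the increments
    W(t + dt) - W(t) are independent N(0, dt) random variables."""
    dt = T / n
    w = [0.0]
    for _ in range(n):
        w.append(w[-1] + rng.gauss(0.0, dt ** 0.5))
    return w

rng = random.Random(0)
w = brownian_path(1.0, 1000, rng)
```

Averaging many such paths reproduces the behaviour shown in the figure: the mean path stays near zero while individual paths wander.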
20 We can write an equation for the evolution of the paths of a Brownian motion X_t with diffusion coefficient D starting at x:
$$dX_t = \sqrt{2D}\, dW_t, \qquad X_0 = x.$$
This is an example of a stochastic differential equation. The probability of finding X_t at y at time t, given that it was at x at time t = 0, the transition probability density ρ(y, t), satisfies the PDE
$$\frac{\partial \rho}{\partial t} = D \frac{\partial^2 \rho}{\partial y^2}, \qquad \rho(y, 0) = \delta(y - x).$$
This is an example of the Fokker-Planck equation. The connection between Brownian motion and the diffusion equation was made by Einstein in 1905.
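As an illustrative sketch (not from the notes), one can integrate this SDE with the Euler-Maruyama scheme and check that the samples have the mean and variance of the Gaussian solution of the PDE above, namely mean x and variance 2Dt; for constant coefficients the scheme is in fact exact:

```python
import random

def euler_maruyama(x0, D, T, n_steps, rng):
    """Integrate dX_t = sqrt(2D) dW_t from X_0 = x0 up to time T."""
    dt = T / n_steps
    x = x0
    for _ in range(n_steps):
        x += (2.0 * D) ** 0.5 * rng.gauss(0.0, dt ** 0.5)
    return x

rng = random.Random(0)
# With D = 1/2 and T = 1 the transition density is N(x0, 2*D*T) = N(0, 1).
samples = [euler_maruyama(0.0, 0.5, 1.0, 100, rng) for _ in range(5000)]
```

The empirical mean and variance of the samples approximate the first two moments of the transition density.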
21 Why introduce randomness in the description of physical systems? To describe outcomes of a repeated set of experiments. Think of tossing a coin repeatedly or of throwing a die. To describe a deterministic system for which we have incomplete information: we have imprecise knowledge of initial and boundary conditions or of model parameters. ODEs with random initial conditions are equivalent to stochastic processes that can be described using stochastic differential equations. To describe systems for which we are not confident about the validity of our mathematical model (uncertainty quantification).
22 To describe a dynamical system exhibiting very complicated behavior (chaotic dynamical systems). Determinism versus predictability. To describe a high dimensional deterministic system using a simpler, low dimensional stochastic system. Think of the physical model for Brownian motion (a heavy particle colliding with many small particles). To describe a system that is inherently random. Think of quantum mechanics.
23 ELEMENTS OF PROBABILITY THEORY
24 Definition The set of all possible outcomes of an experiment is called the sample space and is denoted by Ω. Example The possible outcomes of the experiment of tossing a coin are H and T. The sample space is Ω = { H, T }. The possible outcomes of the experiment of throwing a die are 1, 2, 3, 4, 5 and 6. The sample space is Ω = { 1, 2, 3, 4, 5, 6 }.
25 Definition A collection F of subsets of Ω is called a field on Ω if 1. ∅ ∈ F; 2. if A ∈ F then A^c ∈ F; 3. if A, B ∈ F then A ∪ B ∈ F. From the definition of a field we immediately deduce that F is closed under finite unions and finite intersections:
$$A_1, \dots, A_n \in F \;\Rightarrow\; \cup_{i=1}^{n} A_i \in F, \quad \cap_{i=1}^{n} A_i \in F.$$
26 When Ω is infinite the above definition is not appropriate, since we need to consider countable unions of events. Definition A collection F of subsets of Ω is called a σ-field or σ-algebra on Ω if 1. ∅ ∈ F; 2. if A ∈ F then A^c ∈ F; 3. if A_1, A_2, ... ∈ F then ∪_{i=1}^{∞} A_i ∈ F. A σ-algebra is closed under the operation of taking countable intersections. Example F = {∅, Ω}. F = {∅, A, A^c, Ω} where A is a subset of Ω. The power set of Ω, denoted by {0, 1}^Ω, which contains all subsets of Ω.
27 Let F be a collection of subsets of Ω. It can be extended to a σ-algebra (take for example the power set of Ω). Consider all the σ-algebras that contain F and take their intersection, denoted by σ(F): A ⊆ Ω belongs to σ(F) if and only if it is in every σ-algebra containing F. σ(F) is a σ-algebra. It is the smallest σ-algebra containing F and it is called the σ-algebra generated by F. Example Let Ω = R^n. The σ-algebra generated by the open subsets of R^n (or, equivalently, by the open balls of R^n) is called the Borel σ-algebra of R^n and is denoted by B(R^n).
28 Let X be a closed subset of R^n. Similarly, we can define the Borel σ-algebra of X, denoted by B(X). A sub-σ-algebra is a collection of subsets of a σ-algebra which satisfies the axioms of a σ-algebra. The σ-field F of a sample space Ω contains all possible outcomes of the experiment that we want to study. Intuitively, the σ-field contains all the information about the random experiment that is available to us.
29 Definition A probability measure P on the measurable space (Ω, F) is a function P : F → [0, 1] satisfying 1. P(∅) = 0, P(Ω) = 1; 2. for A_1, A_2, ... with A_i ∩ A_j = ∅, i ≠ j,
$$P(\cup_{i=1}^{\infty} A_i) = \sum_{i=1}^{\infty} P(A_i).$$
Definition The triple (Ω, F, P), comprising a set Ω, a σ-algebra F of subsets of Ω and a probability measure P on (Ω, F), is called a probability space.
30 Example A biased coin is tossed once: Ω = {H, T}, F = {∅, {H}, {T}, Ω} = {0, 1}^Ω, P : F → [0, 1] such that P(∅) = 0, P(H) = p ∈ [0, 1], P(T) = 1 − p, P(Ω) = 1. Example Take Ω = [0, 1], F = B([0, 1]), P = Leb([0, 1]). Then (Ω, F, P) is a probability space.
31 Definition A family {A_i : i ∈ I} of events is called independent if
$$P(\cap_{j \in J} A_j) = \prod_{j \in J} P(A_j)$$
for all finite subsets J of I. When two events A, B are dependent it is important to know the probability that the event A will occur, given that B has already happened. We define this to be the conditional probability, denoted by P(A|B). We know from elementary probability that
$$P(A|B) = \frac{P(A \cap B)}{P(B)}.$$
32 Definition A family of events {B_i : i ∈ I} is called a partition of Ω if B_i ∩ B_j = ∅ for i ≠ j, and ∪_{i ∈ I} B_i = Ω. Theorem (Law of total probability.) For any event A and any partition {B_i : i ∈ I} we have
$$P(A) = \sum_{i \in I} P(A|B_i)\, P(B_i).$$
33 Let (Ω, F, P) be a probability space and fix B ∈ F. Then P(·|B) defines a probability measure on F: P(∅|B) = 0, P(Ω|B) = 1 and (since A_i ∩ A_j = ∅ implies that (A_i ∩ B) ∩ (A_j ∩ B) = ∅)
$$P(\cup_{j=1}^{\infty} A_j \,|\, B) = \sum_{j=1}^{\infty} P(A_j|B)$$
for a countable family of pairwise disjoint sets {A_j}_{j=1}^{+∞}. Consequently, (Ω, F, P(·|B)) is a probability space for every B ∈ F.
34 A function of the outcome of an experiment is a random variable, that is, a map from Ω to R. Definition A sample space Ω equipped with a σ-field of subsets F is called a measurable space. Definition Let (Ω, F) and (E, G) be two measurable spaces. A function X : Ω → E such that the event
$$\{\omega \in \Omega : X(\omega) \in A\} =: \{X \in A\} \qquad (1)$$
belongs to F for arbitrary A ∈ G is called a measurable function or random variable.
35 When E is R equipped with its Borel σ-algebra, then (1) can be replaced with {X ≤ x} ∈ F for all x ∈ R. Let X be a random variable (measurable function) from (Ω, F, µ) to (E, G). If E is a metric space then we may define the expectation with respect to the measure µ by
$$E[X] = \int_{\Omega} X(\omega)\, d\mu(\omega).$$
More generally, let f : E → R be G-measurable. Then
$$E[f(X)] = \int_{\Omega} f(X(\omega))\, d\mu(\omega).$$
36 Let U be a topological space. We will use the notation B(U) to denote the Borel σ-algebra of U: the smallest σ-algebra containing all open sets of U. Every random variable from a probability space (Ω, F, µ) to a measurable space (E, B(E)) induces a probability measure on E:
$$\mu_X(B) = \mu(X^{-1}(B)) = \mu(\{\omega \in \Omega : X(\omega) \in B\}), \quad B \in B(E). \qquad (2)$$
The measure µ_X is called the distribution (or sometimes the law) of X.
37 Example Let I denote a subset of the positive integers. A vector ρ_0 = {ρ_{0,i}, i ∈ I} is a distribution on I if it has nonnegative entries and its total mass equals 1: Σ_{i ∈ I} ρ_{0,i} = 1.
38 Consider the case where E = R equipped with the Borel σ-algebra. In this case a random variable is defined to be a function X : Ω → R such that {ω ∈ Ω : X(ω) ≤ x} ∈ F for all x ∈ R. We can now define the probability distribution function of X, F_X : R → [0, 1], as
$$F_X(x) = P(\{\omega \in \Omega \,|\, X(\omega) \le x\}) =: P(X \le x). \qquad (3)$$
In this case, (R, B(R), F_X) becomes a probability space.
39 The distribution function F_X(x) of a random variable has the properties that lim_{x→−∞} F_X(x) = 0, lim_{x→+∞} F_X(x) = 1, and it is right continuous. Definition A random variable X with values on R is called discrete if it takes values in some countable subset {x_0, x_1, x_2, ...} of R, i.e. P(X = x) ≠ 0 only for x = x_0, x_1, ....
40 With a random variable we can associate the probability mass function p_k = P(X = x_k). We will consider nonnegative integer valued discrete random variables. In this case p_k = P(X = k), k = 0, 1, 2, ... Example The Poisson random variable is the nonnegative integer valued random variable with probability mass function
$$p_k = P(X = k) = \frac{\lambda^k}{k!} e^{-\lambda}, \quad k = 0, 1, 2, \dots,$$
where λ > 0.
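A quick numerical sanity check of this probability mass function (an illustrative sketch, not part of the notes): the p_k should sum to 1, and the mean Σ k p_k should equal λ.

```python
import math

def poisson_pmf(k, lam):
    """p_k = lam^k / k! * exp(-lam) for k = 0, 1, 2, ..., with lam > 0."""
    return lam ** k / math.factorial(k) * math.exp(-lam)

lam = 3.0
# Truncating the sums at k = 60 captures essentially all of the mass for lam = 3.
total = sum(poisson_pmf(k, lam) for k in range(60))
mean = sum(k * poisson_pmf(k, lam) for k in range(60))
```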
41 Example The binomial random variable is the nonnegative integer valued random variable with probability mass function
$$p_k = P(X = k) = \frac{N!}{k!\,(N-k)!}\, p^k q^{N-k}, \quad k = 0, 1, 2, \dots, N,$$
where p ∈ (0, 1), q = 1 − p.
42 Definition A random variable X with values on R is called continuous if P(X = x) = 0 for all x ∈ R. Let (Ω, F, P) be a probability space and let X : Ω → R be a random variable with distribution F_X. This is a probability measure on B(R). We will assume that it is absolutely continuous with respect to the Lebesgue measure with density ρ_X: F_X(dx) = ρ_X(x) dx. We will call the density ρ_X(x) the probability density function (PDF) of the random variable X.
43 Example 1. The exponential random variable has PDF
$$f(x) = \begin{cases} \lambda e^{-\lambda x}, & x > 0, \\ 0, & x < 0, \end{cases}$$
with λ > 0. 2. The uniform random variable has PDF
$$f(x) = \begin{cases} \frac{1}{b-a}, & a < x < b, \\ 0, & x \notin (a, b), \end{cases}$$
with a < b.
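As an aside not stated in the notes, the exponential distribution function is F(x) = 1 − e^{−λx}, so applying F^{−1} to a Uniform(0, 1) variable produces exponential samples (inverse-transform sampling); this ties the two examples above together:

```python
import math
import random

def sample_exponential(lam, rng):
    """Inverse transform: if U ~ Uniform(0, 1), then -log(1 - U) / lam
    has PDF lam * exp(-lam * x) for x > 0."""
    return -math.log(1.0 - rng.random()) / lam

rng = random.Random(0)
samples = [sample_exponential(2.0, rng) for _ in range(20000)]
```

The empirical mean of the samples approximates E[X] = 1/λ.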
44 Definition Two random variables X and Y are independent if the events {ω ∈ Ω | X(ω) ≤ x} and {ω ∈ Ω | Y(ω) ≤ y} are independent for all x, y ∈ R. Let X, Y be two continuous random variables. We can view them as a random vector, i.e. a random variable from Ω to R². We can then define the joint distribution function F(x, y) = P(X ≤ x, Y ≤ y). The mixed derivative of the distribution function,
$$f_{X,Y}(x, y) := \frac{\partial^2 F}{\partial x \partial y}(x, y),$$
if it exists, is called the joint PDF of the random vector {X, Y}:
$$F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(x', y')\, dx'\, dy'.$$
45 If the random variables X and Y are independent, then F_{X,Y}(x, y) = F_X(x) F_Y(y) and f_{X,Y}(x, y) = f_X(x) f_Y(y). The joint distribution function has the properties F_{X,Y}(x, y) = F_{Y,X}(y, x), F_{X,Y}(+∞, y) = F_Y(y), and
$$f_Y(y) = \int_{-\infty}^{+\infty} f_{X,Y}(x, y)\, dx.$$
46 We can extend the above definitions to random vectors of arbitrary finite dimension. Let X be a random variable from (Ω, F, µ) to (R^d, B(R^d)). The (joint) distribution function F_X : R^d → [0, 1] is defined as F_X(x) = P(X ≤ x). Let X be a random variable in R^d with distribution function f(x_N), where x_N = {x_1, ..., x_N}. We define the marginal or reduced distribution function f^{N−1}(x_{N−1}) by
$$f^{N-1}(x_{N-1}) = \int_{\mathbb{R}} f^{N}(x_N)\, dx_N.$$
We can define other reduced distribution functions:
$$f^{N-2}(x_{N-2}) = \int_{\mathbb{R}} f^{N-1}(x_{N-1})\, dx_{N-1} = \int_{\mathbb{R}} \int_{\mathbb{R}} f(x_N)\, dx_{N-1}\, dx_N.$$
47 We can use the distribution of a random variable to compute expectations and probabilities:
$$E[f(X)] = \int_{\mathbb{R}} f(x)\, dF_X(x) \qquad (4)$$
and
$$P[X \in G] = \int_{G} dF_X(x), \quad G \in B(E). \qquad (5)$$
The above formulas apply to both discrete and continuous random variables, provided that we define the integrals in (4) and (5) appropriately. When E = R^d and a PDF exists, dF_X(x) = f_X(x) dx, we have
$$F_X(x) := P(X \le x) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_d} f_X(x)\, dx.$$
48 Example (Normal Random Variables) Consider the random variable X : Ω → R with pdf
$$\gamma_{\sigma,m}(x) := (2\pi\sigma)^{-\frac{1}{2}} \exp\left(-\frac{(x-m)^2}{2\sigma}\right).$$
Such an X is termed a Gaussian or normal random variable. The mean is
$$EX = \int_{\mathbb{R}} x\, \gamma_{\sigma,m}(x)\, dx = m$$
and the variance is
$$E(X - m)^2 = \int_{\mathbb{R}} (x - m)^2\, \gamma_{\sigma,m}(x)\, dx = \sigma.$$
49 Example (Normal Random Variables contd.) Let m ∈ R^d and Σ ∈ R^{d×d} be symmetric and positive definite. The random variable X : Ω → R^d with pdf
$$\gamma_{\Sigma,m}(x) := \left((2\pi)^d \det\Sigma\right)^{-\frac{1}{2}} \exp\left(-\frac{1}{2} \langle \Sigma^{-1}(x - m), (x - m) \rangle\right)$$
is termed a multivariate Gaussian or normal random variable. The mean is
$$E(X) = m \qquad (6)$$
and the covariance matrix is
$$E\big((X - m) \otimes (X - m)\big) = \Sigma. \qquad (7)$$
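One standard way to sample from this density, sketched here for d = 2 (the Cholesky approach is standard but not part of the slides): factor Σ = LLᵀ with L lower triangular and set X = m + LZ, where Z is standard normal, so that X has mean m and covariance LLᵀ = Σ.

```python
import random

def cholesky_2x2(sigma):
    """Lower-triangular L with L L^T = sigma, for a 2x2 SPD matrix."""
    l11 = sigma[0][0] ** 0.5
    l21 = sigma[1][0] / l11
    l22 = (sigma[1][1] - l21 ** 2) ** 0.5
    return [[l11, 0.0], [l21, l22]]

def sample_mvn(m, L, rng):
    """X = m + L Z with Z ~ N(0, I): mean m, covariance L L^T."""
    z0, z1 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    return (m[0] + L[0][0] * z0, m[1] + L[1][0] * z0 + L[1][1] * z1)

rng = random.Random(0)
sigma = [[2.0, 1.0], [1.0, 2.0]]
L = cholesky_2x2(sigma)
samples = [sample_mvn((0.0, 0.0), L, rng) for _ in range(20000)]
```

The empirical mean and covariance of the samples reproduce (6) and (7).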
50 Let X, Y be random variables. We want to know whether they are correlated and, if they are, to calculate how correlated they are. We define the covariance of the two random variables as
$$\mathrm{cov}(X, Y) = E\big[(X - EX)(Y - EY)\big] = E(XY) - EX\, EY.$$
The correlation coefficient is
$$\rho(X, Y) = \frac{\mathrm{cov}(X, Y)}{\sqrt{\mathrm{var}(X)}\,\sqrt{\mathrm{var}(Y)}}. \qquad (8)$$
The Cauchy-Schwarz inequality yields that ρ(X, Y) ∈ [−1, 1]. We will say that two random variables X and Y are uncorrelated provided that ρ(X, Y) = 0. It is not true in general that two uncorrelated random variables are independent. This is true, however, for Gaussian random variables.
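The classical illustration of the last point (a sketch, not from the notes): for X standard normal, Y = X² is a deterministic function of X, yet cov(X, Y) = E[X³] = 0, so the pair is uncorrelated but certainly not independent.

```python
import random

def correlation(xs, ys):
    """Sample correlation coefficient: cov(X, Y) / sqrt(var(X) var(Y))."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

rng = random.Random(0)
xs = [rng.gauss(0.0, 1.0) for _ in range(50000)]
ys = [x * x for x in xs]  # fully dependent on X, yet uncorrelated with it
rho = correlation(xs, ys)  # close to 0
```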
51 Assume that E|X| < ∞ and let G be a sub-σ-algebra of F. The conditional expectation of X with respect to G is defined to be the function E[X|G] : Ω → E which is G-measurable and satisfies
$$\int_{G} E[X|\mathcal{G}]\, d\mu = \int_{G} X\, d\mu \quad \forall\, G \in \mathcal{G}.$$
We can define E[f(X)|G] and the conditional probability P[X ∈ F|G] = E[I_F(X)|G], where I_F is the indicator function of F, in a similar manner.
52 The characteristic function of a random variable X with distribution function F is
$$\phi(t) = \int_{\mathbb{R}} e^{it\lambda}\, dF(\lambda) = E(e^{itX}). \qquad (9)$$
For a continuous random variable for which the distribution function F has a density, dF(λ) = p(λ) dλ, (9) gives
$$\phi(t) = \int_{\mathbb{R}} e^{it\lambda}\, p(\lambda)\, d\lambda.$$
For a discrete random variable for which P(X = λ_k) = α_k, (9) gives
$$\phi(t) = \sum_{k=0}^{\infty} e^{it\lambda_k}\, \alpha_k.$$
53 The characteristic function determines uniquely the distribution function of the random variable, in the sense that there is a one-to-one correspondence between F(λ) and φ(t). Lemma Let {X_1, X_2, ..., X_n} be independent random variables with characteristic functions φ_j(t), j = 1, ..., n, and let Y = Σ_{j=1}^{n} X_j with characteristic function φ_Y(t). Then
$$\phi_Y(t) = \prod_{j=1}^{n} \phi_j(t).$$
Lemma Let X be a random variable with characteristic function φ(t) and assume that it has finite moments. Then
$$E(X^k) = \frac{1}{i^k}\, \phi^{(k)}(0).$$
54 Theorem Let b ∈ R^n and let Σ ∈ R^{n×n} be a symmetric and positive definite matrix. Let X be the multivariate Gaussian random variable with probability density function
$$\gamma(x) = \frac{1}{Z} \exp\left(-\frac{1}{2} \langle \Sigma^{-1}(x - b), x - b \rangle\right).$$
Then 1. The normalization constant is Z = (2π)^{n/2} √(det Σ). 2. The mean vector and covariance matrix of X are given by EX = b and E((X − EX) ⊗ (X − EX)) = Σ. 3. The characteristic function of X is
$$\phi(t) = e^{i \langle b, t \rangle - \frac{1}{2} \langle t, \Sigma t \rangle}.$$
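A Monte Carlo sanity check of part 3 in one dimension (the values b = 1, σ² = 4, t = 0.7 are my own illustrative choices, not from the notes): the sample average of e^{itX} should approach e^{ibt − σ²t²/2}.

```python
import cmath
import math
import random

b, sigma2, t = 1.0, 4.0, 0.7
rng = random.Random(0)
n = 200000
# Sample average of exp(i t X) for X ~ N(b, sigma2).
estimate = sum(cmath.exp(1j * t * rng.gauss(b, math.sqrt(sigma2)))
               for _ in range(n)) / n
exact = cmath.exp(1j * b * t - 0.5 * sigma2 * t * t)
```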
55 One of the most important aspects of the theory of random variables is the study of limit theorems for sums of random variables. The most well known limit theorems in probability theory are the law of large numbers and the central limit theorem. There are various different types of convergence for sequences of random variables.
56 Definition Let {Z_n}_{n=1}^{∞} be a sequence of random variables. We will say that (a) Z_n converges to Z with probability one if
$$P\big(\lim_{n \to +\infty} Z_n = Z\big) = 1.$$
(b) Z_n converges to Z in probability if for every ε > 0,
$$\lim_{n \to +\infty} P\big(|Z_n - Z| > \varepsilon\big) = 0.$$
(c) Z_n converges to Z in L^p if
$$\lim_{n \to +\infty} E\big[|Z_n - Z|^p\big] = 0.$$
(d) Let F_n(λ), n = 1, ..., +∞, and F(λ) be the distribution functions of Z_n, n = 1, ..., +∞, and Z, respectively. Then Z_n converges to Z in distribution if lim_{n→+∞} F_n(λ) = F(λ) for all λ ∈ R at which F is continuous.
57 The distribution function F_X of a random variable from a probability space (Ω, F, P) to R induces a probability measure on R, and (R, B(R), F_X) is a probability space. We can show that convergence in distribution is equivalent to the weak convergence of the probability measures induced by the distribution functions. Definition Let (E, d) be a metric space, B(E) the σ-algebra of its Borel sets, P_n a sequence of probability measures on (E, B(E)) and let C_b(E) denote the space of bounded continuous functions on E. We will say that the sequence P_n converges weakly to the probability measure P if, for each f ∈ C_b(E),
$$\lim_{n \to +\infty} \int_{E} f(x)\, dP_n(x) = \int_{E} f(x)\, dP(x).$$
58 Theorem Let F_n(λ), n = 1, ..., +∞, and F(λ) be the distribution functions of Z_n, n = 1, ..., +∞, and Z, respectively. Then Z_n converges to Z in distribution if and only if, for all g ∈ C_b(R),
$$\lim_{n \to +\infty} \int_{X} g(x)\, dF_n(x) = \int_{X} g(x)\, dF(x). \qquad (10)$$
Remark (10) is equivalent to
$$\lim_{n \to +\infty} E_n(g) = E(g),$$
where E_n and E denote the expectations with respect to F_n and F, respectively.
59 When the sequence of random variables whose convergence we are interested in takes values in R^n or, more generally, in a metric space (E, d), then we can use weak convergence of the sequence of probability measures induced by the sequence of random variables to define convergence in distribution. Definition A sequence of random variables X_n defined on probability spaces (Ω_n, F_n, P_n) and taking values in a metric space (E, d) is said to converge in distribution if the induced measures F_n(B) = P_n(X_n ∈ B), for B ∈ B(E), converge weakly to a probability measure P.
60 Let {X_n}_{n=1}^{∞} be iid random variables with EX_n = V. Then the strong law of large numbers states that the average of the sum of the iid random variables converges to V with probability one:
$$P\left(\lim_{N \to +\infty} \frac{1}{N} \sum_{n=1}^{N} X_n = V\right) = 1.$$
The strong law of large numbers provides us with information about the behavior of a sum of random variables (or, a large number of repetitions of the same experiment) on average. We can also study fluctuations around the average behavior.
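A quick simulation of the strong law (a sketch, not part of the notes), using the ±1 coin-flip steps of the random walk from earlier, for which V = E X_n = 0:

```python
import random

rng = random.Random(0)
N = 100000
total = 0
for _ in range(N):
    total += 1 if rng.random() < 0.5 else -1  # X_n = +-1 with probability 1/2 each
average = total / N  # close to V = 0 for large N
```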
61 Let E(X_n − V)² = σ². Define the centered iid random variables Y_n = X_n − V. Then the sequence of random variables \frac{1}{\sigma\sqrt{N}} \sum_{n=1}^{N} Y_n converges in distribution to a N(0, 1) random variable:
$$\lim_{N \to +\infty} P\left(\frac{1}{\sigma\sqrt{N}} \sum_{n=1}^{N} Y_n \le a\right) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{a} e^{-\frac{1}{2}x^2}\, dx.$$
This is the central limit theorem.
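A numerical illustration of the central limit theorem (a sketch with my own choice of distribution, not from the notes): take X_n ~ Uniform(0, 1), so V = 1/2 and σ² = 1/12, and look at the distribution of the normalized sums.

```python
import random

def normalized_sum(N, rng):
    """(1 / (sigma * sqrt(N))) * sum_{n=1}^{N} (X_n - V) for X_n ~ Uniform(0, 1),
    where V = 1/2 and sigma = sqrt(1/12)."""
    sigma = (1.0 / 12.0) ** 0.5
    s = sum(rng.random() - 0.5 for _ in range(N))
    return s / (sigma * N ** 0.5)

rng = random.Random(0)
zs = [normalized_sum(100, rng) for _ in range(20000)]
# For a standard Gaussian, P(|Z| <= 1) is about 0.683.
frac_within_one = sum(1 for z in zs if abs(z) <= 1.0) / len(zs)
```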
Probability Review Yutian Li Stanford University January 18, 2018 Yutian Li (Stanford University) Probability Review January 18, 2018 1 / 27 Outline 1 Elements of probability 2 Random variables 3 Multiple
More informationSTAT 7032 Probability Spring Wlodek Bryc
STAT 7032 Probability Spring 2018 Wlodek Bryc Created: Friday, Jan 2, 2014 Revised for Spring 2018 Printed: January 9, 2018 File: Grad-Prob-2018.TEX Department of Mathematical Sciences, University of Cincinnati,
More information3. Review of Probability and Statistics
3. Review of Probability and Statistics ECE 830, Spring 2014 Probabilistic models will be used throughout the course to represent noise, errors, and uncertainty in signal processing problems. This lecture
More informationLecture 1: Review on Probability and Statistics
STAT 516: Stochastic Modeling of Scientific Data Autumn 2018 Instructor: Yen-Chi Chen Lecture 1: Review on Probability and Statistics These notes are partially based on those of Mathias Drton. 1.1 Motivating
More informationIf we want to analyze experimental or simulated data we might encounter the following tasks:
Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction
More informationNotes 1 : Measure-theoretic foundations I
Notes 1 : Measure-theoretic foundations I Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Wil91, Section 1.0-1.8, 2.1-2.3, 3.1-3.11], [Fel68, Sections 7.2, 8.1, 9.6], [Dur10,
More informationQuick Tour of Basic Probability Theory and Linear Algebra
Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra CS224w: Social and Information Network Analysis Fall 2011 Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra Outline Definitions
More informationExample 4.1 Let X be a random variable and f(t) a given function of time. Then. Y (t) = f(t)x. Y (t) = X sin(ωt + δ)
Chapter 4 Stochastic Processes 4. Definition In the previous chapter we studied random variables as functions on a sample space X(ω), ω Ω, without regard to how these might depend on parameters. We now
More informationLecture 2: Review of Basic Probability Theory
ECE 830 Fall 2010 Statistical Signal Processing instructor: R. Nowak, scribe: R. Nowak Lecture 2: Review of Basic Probability Theory Probabilistic models will be used throughout the course to represent
More information2 n k In particular, using Stirling formula, we can calculate the asymptotic of obtaining heads exactly half of the time:
Chapter 1 Random Variables 1.1 Elementary Examples We will start with elementary and intuitive examples of probability. The most well-known example is that of a fair coin: if flipped, the probability of
More information1 Sequences of events and their limits
O.H. Probability II (MATH 2647 M15 1 Sequences of events and their limits 1.1 Monotone sequences of events Sequences of events arise naturally when a probabilistic experiment is repeated many times. For
More informationLectures for APM 541: Stochastic Modeling in Biology. Jay Taylor
Lectures for APM 541: Stochastic Modeling in Biology Jay Taylor November 3, 2011 Contents 1 Distributions, Expectations, and Random Variables 4 1.1 Probability Spaces...................................
More informationµ X (A) = P ( X 1 (A) )
1 STOCHASTIC PROCESSES This appendix provides a very basic introduction to the language of probability theory and stochastic processes. We assume the reader is familiar with the general measure and integration
More informationHandbook of Stochastic Methods
Springer Series in Synergetics 13 Handbook of Stochastic Methods for Physics, Chemistry and the Natural Sciences von Crispin W Gardiner Neuausgabe Handbook of Stochastic Methods Gardiner schnell und portofrei
More informationBrownian Motion. 1 Definition Brownian Motion Wiener measure... 3
Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................
More informationChapter 1. Sets and probability. 1.3 Probability space
Random processes - Chapter 1. Sets and probability 1 Random processes Chapter 1. Sets and probability 1.3 Probability space 1.3 Probability space Random processes - Chapter 1. Sets and probability 2 Probability
More informationAppendix A : Introduction to Probability and stochastic processes
A-1 Mathematical methods in communication July 5th, 2009 Appendix A : Introduction to Probability and stochastic processes Lecturer: Haim Permuter Scribe: Shai Shapira and Uri Livnat The probability of
More informationRandom variables. DS GA 1002 Probability and Statistics for Data Science.
Random variables DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Motivation Random variables model numerical quantities
More informationFormulas for probability theory and linear models SF2941
Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms
More informationRecap of Basic Probability Theory
02407 Stochastic Processes? Recap of Basic Probability Theory Uffe Høgsbro Thygesen Informatics and Mathematical Modelling Technical University of Denmark 2800 Kgs. Lyngby Denmark Email: uht@imm.dtu.dk
More information6.1 Moment Generating and Characteristic Functions
Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,
More informationLecture 1: Probability Fundamentals
Lecture 1: Probability Fundamentals IB Paper 7: Probability and Statistics Carl Edward Rasmussen Department of Engineering, University of Cambridge January 22nd, 2008 Rasmussen (CUED) Lecture 1: Probability
More informationLecture 4: Stochastic Processes (I)
Miranda Holmes-Cerfon Applied Stochastic Analysis, Spring 215 Lecture 4: Stochastic Processes (I) Readings Recommended: Pavliotis [214], sections 1.1, 1.2 Grimmett and Stirzaker [21], section 9.4 (spectral
More informationElementary Probability. Exam Number 38119
Elementary Probability Exam Number 38119 2 1. Introduction Consider any experiment whose result is unknown, for example throwing a coin, the daily number of customers in a supermarket or the duration of
More informationMidterm Exam 1 Solution
EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:
More informationUniversal examples. Chapter The Bernoulli process
Chapter 1 Universal examples 1.1 The Bernoulli process First description: Bernoulli random variables Y i for i = 1, 2, 3,... independent with P [Y i = 1] = p and P [Y i = ] = 1 p. Second description: Binomial
More informationRecap of Basic Probability Theory
02407 Stochastic Processes Recap of Basic Probability Theory Uffe Høgsbro Thygesen Informatics and Mathematical Modelling Technical University of Denmark 2800 Kgs. Lyngby Denmark Email: uht@imm.dtu.dk
More informationMATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation)
MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation) Last modified: March 7, 2009 Reference: PRP, Sections 3.6 and 3.7. 1. Tail-Sum Theorem
More informationWhy study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables
ECE 6010 Lecture 1 Introduction; Review of Random Variables Readings from G&S: Chapter 1. Section 2.1, Section 2.3, Section 2.4, Section 3.1, Section 3.2, Section 3.5, Section 4.1, Section 4.2, Section
More informationTheorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1
Chapter 2 Probability measures 1. Existence Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension to the generated σ-field Proof of Theorem 2.1. Let F 0 be
More informationLecture 1: Brief Review on Stochastic Processes
Lecture 1: Brief Review on Stochastic Processes A stochastic process is a collection of random variables {X t (s) : t T, s S}, where T is some index set and S is the common sample space of the random variables.
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES Contents 1. Continuous random variables 2. Examples 3. Expected values 4. Joint distributions
More informationSummary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016
8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying
More informationBrownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539
Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory
More informationProduct measure and Fubini s theorem
Chapter 7 Product measure and Fubini s theorem This is based on [Billingsley, Section 18]. 1. Product spaces Suppose (Ω 1, F 1 ) and (Ω 2, F 2 ) are two probability spaces. In a product space Ω = Ω 1 Ω
More informationIEOR 3106: Introduction to Operations Research: Stochastic Models. Professor Whitt. SOLUTIONS to Homework Assignment 2
IEOR 316: Introduction to Operations Research: Stochastic Models Professor Whitt SOLUTIONS to Homework Assignment 2 More Probability Review: In the Ross textbook, Introduction to Probability Models, read
More information. Find E(V ) and var(v ).
Math 6382/6383: Probability Models and Mathematical Statistics Sample Preliminary Exam Questions 1. A person tosses a fair coin until she obtains 2 heads in a row. She then tosses a fair die the same number
More informationWe will briefly look at the definition of a probability space, probability measures, conditional probability and independence of probability events.
1 Probability 1.1 Probability spaces We will briefly look at the definition of a probability space, probability measures, conditional probability and independence of probability events. Definition 1.1.
More information1 Random Variable: Topics
Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?
More informationSample Spaces, Random Variables
Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted
More informationRandom Variables and Their Distributions
Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital
More informationCME 106: Review Probability theory
: Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:
More informationSTA 711: Probability & Measure Theory Robert L. Wolpert
STA 711: Probability & Measure Theory Robert L. Wolpert 6 Independence 6.1 Independent Events A collection of events {A i } F in a probability space (Ω,F,P) is called independent if P[ i I A i ] = P[A
More informationContinuous Random Variables
1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables
More informationNotes on Mathematics Groups
EPGY Singapore Quantum Mechanics: 2007 Notes on Mathematics Groups A group, G, is defined is a set of elements G and a binary operation on G; one of the elements of G has particularly special properties
More informationExercises. T 2T. e ita φ(t)dt.
Exercises. Set #. Construct an example of a sequence of probability measures P n on R which converge weakly to a probability measure P but so that the first moments m,n = xdp n do not converge to m = xdp.
More informationLectures on Elementary Probability. William G. Faris
Lectures on Elementary Probability William G. Faris February 22, 2002 2 Contents 1 Combinatorics 5 1.1 Factorials and binomial coefficients................. 5 1.2 Sampling with replacement.....................
More informationLecture 22: Variance and Covariance
EE5110 : Probability Foundations for Electrical Engineers July-November 2015 Lecture 22: Variance and Covariance Lecturer: Dr. Krishna Jagannathan Scribes: R.Ravi Kiran In this lecture we will introduce
More informationDiscrete Probability Refresher
ECE 1502 Information Theory Discrete Probability Refresher F. R. Kschischang Dept. of Electrical and Computer Engineering University of Toronto January 13, 1999 revised January 11, 2006 Probability theory
More informationSTAT2201. Analysis of Engineering & Scientific Data. Unit 3
STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random
More information1 Basic continuous random variable problems
Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and
More information2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y.
CS450 Final Review Problems Fall 08 Solutions or worked answers provided Problems -6 are based on the midterm review Identical problems are marked recap] Please consult previous recitations and textbook
More information1 Probability theory. 2 Random variables and probability theory.
Probability theory Here we summarize some of the probability theory we need. If this is totally unfamiliar to you, you should look at one of the sources given in the readings. In essence, for the major
More information3. Probability and Statistics
FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important
More informationLecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1
Random Walks and Brownian Motion Tel Aviv University Spring 011 Lecture date: May 0, 011 Lecture 9 Instructor: Ron Peled Scribe: Jonathan Hermon In today s lecture we present the Brownian motion (BM).
More informationStochastic process for macro
Stochastic process for macro Tianxiao Zheng SAIF 1. Stochastic process The state of a system {X t } evolves probabilistically in time. The joint probability distribution is given by Pr(X t1, t 1 ; X t2,
More informationEE514A Information Theory I Fall 2013
EE514A Information Theory I Fall 2013 K. Mohan, Prof. J. Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2013 http://j.ee.washington.edu/~bilmes/classes/ee514a_fall_2013/
More informationProbability. Paul Schrimpf. January 23, Definitions 2. 2 Properties 3
Probability Paul Schrimpf January 23, 2018 Contents 1 Definitions 2 2 Properties 3 3 Random variables 4 3.1 Discrete........................................... 4 3.2 Continuous.........................................
More informationProbability Theory. Richard F. Bass
Probability Theory Richard F. Bass ii c Copyright 2014 Richard F. Bass Contents 1 Basic notions 1 1.1 A few definitions from measure theory............. 1 1.2 Definitions............................. 2
More informationBasic Probability. Introduction
Basic Probability Introduction The world is an uncertain place. Making predictions about something as seemingly mundane as tomorrow s weather, for example, is actually quite a difficult task. Even with
More informationSystem Identification
System Identification Arun K. Tangirala Department of Chemical Engineering IIT Madras July 27, 2013 Module 3 Lecture 1 Arun K. Tangirala System Identification July 27, 2013 1 Objectives of this Module
More informationIEOR 6711: Stochastic Models I SOLUTIONS to the First Midterm Exam, October 7, 2008
IEOR 6711: Stochastic Models I SOLUTIONS to the First Midterm Exam, October 7, 2008 Justify your answers; show your work. 1. A sequence of Events. (10 points) Let {B n : n 1} be a sequence of events in
More informationRandom Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay
1 / 13 Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay August 8, 2013 2 / 13 Random Variable Definition A real-valued
More informationELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process
Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Definition of stochastic process (random
More informationLocal vs. Nonlocal Diffusions A Tale of Two Laplacians
Local vs. Nonlocal Diffusions A Tale of Two Laplacians Jinqiao Duan Dept of Applied Mathematics Illinois Institute of Technology Chicago duan@iit.edu Outline 1 Einstein & Wiener: The Local diffusion 2
More informationProbability and Measure
Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability
More informationPreliminary statistics
1 Preliminary statistics The solution of a geophysical inverse problem can be obtained by a combination of information from observed data, the theoretical relation between data and earth parameters (models),
More information