From Multiplicity to Entropy


1 From Multiplicity to Entropy. R. Hanel, S. Thurner, and M. Gell-Mann. Hanel (CeMSIIS 2013) Aging & Entropy, Tsallis Birthday, 1 / 24

2 When I first visited CBPF...

3 Introduction. What is the probability of finding a probability distribution function? In other words: What is the probability to find a histogram?

4 What is the probability to find a histogram? Sequence x = (x_1, x_2, ..., x_N) with x_n ∈ {1, 2} (coin tosses). Grouping the probabilities of the 2^N sequences by histogram gives a Pascal triangle of terms:

N=0:  1 q_1^0 q_2^0
N=1:  1 q_1^1 q_2^0   1 q_1^0 q_2^1
N=2:  1 q_1^2 q_2^0   2 q_1^1 q_2^1   1 q_1^0 q_2^2
N=3:  1 q_1^3 q_2^0   3 q_1^2 q_2^1   3 q_1^1 q_2^2   1 q_1^0 q_2^3
N=4:  1 q_1^4 q_2^0   4 q_1^3 q_2^1   6 q_1^2 q_2^2   4 q_1^1 q_2^3   1 q_1^0 q_2^4

q_1 and q_2 ... probabilities to toss 1 (heads) or 2 (tails) in a trial x_n. k = (k_1, k_2) ... is the histogram of x; i.e. k_1 (k_2) ... number of times 1 (2) appears in x.
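
The triangle above can be reproduced by brute force. A minimal sketch (Python, with a hypothetical biased coin, q_1 = 0.3): enumerating all 2^N sequences and grouping them by histogram recovers both the binomial multiplicities and the term-by-term probabilities.

```python
from itertools import product
from math import comb, isclose

# Hypothetical parameters: a biased coin with q1 = P(heads), q2 = P(tails).
q1, q2 = 0.3, 0.7
N = 4

# Enumerate all 2^N sequences, grouping probability and count by histogram,
# which here is fully determined by k1 (k2 = N - k1).
hist_prob, hist_count = {}, {}
for x in product((1, 2), repeat=N):
    k1 = x.count(1)
    p = q1**k1 * q2**(N - k1)
    hist_prob[k1] = hist_prob.get(k1, 0.0) + p
    hist_count[k1] = hist_count.get(k1, 0) + 1

# The multiplicities form a row of Pascal's triangle: 1, 4, 6, 4, 1 for N = 4,
# and each histogram's probability is (multiplicity) * q1^k1 * q2^k2.
for k1 in range(N + 1):
    assert hist_count[k1] == comb(N, k1)
    assert isclose(hist_prob[k1], comb(N, k1) * q1**k1 * q2**(N - k1))
print([hist_count[k1] for k1 in range(N + 1)])  # [1, 4, 6, 4, 1]
```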

5 What is the probability to find a histogram? Binomial. The probability to find the histogram k is:

P(k; q) = [N! / (k_1! k_2!)] q_1^{k_1} q_2^{k_2}

N ... length of sequence x; 2 states, x_n ∈ {1, 2}.

6 What is the probability to find a histogram? Multinomial. The probability to find the histogram k is:

P(k; q) = [N! / (k_1! k_2! ··· k_W!)] q_1^{k_1} q_2^{k_2} ··· q_W^{k_W}

N ... length of sequence x; W states, x_n ∈ {1, 2, ..., W}.
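
A quick numerical check of the multinomial formula, for a hypothetical small example (W = 3, N = 5): summing the probabilities of all W^N sequences per histogram must reproduce P(k; q).

```python
from itertools import product
from math import factorial, isclose

# Hypothetical small example: W = 3 states, N = 5 trials.
q = [0.2, 0.3, 0.5]
W, N = 3, 5

def multinomial_prob(k, q):
    """P(k; q) = N! / (k_1! ... k_W!) * q_1^k_1 ... q_W^k_W."""
    m = factorial(sum(k))
    for ki in k:
        m //= factorial(ki)          # multiplicity M(k), an integer
    p = float(m)
    for ki, qi in zip(k, q):
        p *= qi**ki                  # probability factor G(k; q)
    return p

# Brute force: sum the probability of every sequence, grouped by histogram.
hist = {}
for x in product(range(W), repeat=N):
    k = tuple(x.count(i) for i in range(W))
    p = 1.0
    for xi in x:
        p *= q[xi]
    hist[k] = hist.get(k, 0.0) + p

for k, p in hist.items():
    assert isclose(p, multinomial_prob(k, q))
```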

7 What is the probability to find a histogram? Multinomial. The probability to find the histogram k is:

P(k; q) = M(k) G(k; q)

with multiplicity M(k) = N! / (k_1! k_2! ··· k_W!) and probability G(k; q) = q_1^{k_1} q_2^{k_2} ··· q_W^{k_W}.

8 What is the probability to find a histogram? Multinomial. We make two observations...

9 Observation 1: The factorization property. Factorization: P(k; q) = M(k) G(k; q). Multiplicity M(k) does not depend on q! All dependence on q is captured by G(k; q).

10 Observation 1: The factorization property. Take the log on both sides and scale with 1/N:

P(k; q) = M(k) G(k; q)
(1/N) log P(k; q) = (1/N) log M(k) + (1/N) log G(k; q)

11 Observation 2: From factorization property to MEP.

(1/N) log M(k) = (1/N) log( N! / (k_1! k_2! ··· k_W!) )
  ≈ (1/N) log( N^N / (k_1^{k_1} k_2^{k_2} ··· k_W^{k_W}) )   (Stirling)
  = log N − (1/N) Σ_{i=1}^W k_i log k_i
  = − Σ_{i=1}^W (k_i/N) log(k_i/N)
  = − Σ_{i=1}^W p_i log p_i       (frequencies p_i = k_i/N)
  = S_Shannon[p]
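
The Stirling step can be checked numerically: holding the frequencies p_i fixed (hypothetical values below) and letting N grow, (1/N) log M(k) approaches the Shannon entropy.

```python
from math import lgamma, log

# Check the Stirling step: (1/N) * log( N! / (k_1! ... k_W!) ) approaches
# the Shannon entropy -sum_i p_i log p_i as N grows, with p_i = k_i/N fixed.
p = [0.5, 0.3, 0.2]                  # hypothetical fixed frequencies
shannon = -sum(pi * log(pi) for pi in p)

def log_multiplicity(k):
    """log M(k) computed via log-Gamma to avoid huge factorials."""
    n = sum(k)
    return lgamma(n + 1) - sum(lgamma(ki + 1) for ki in k)

for N in (10, 100, 10000):
    k = [int(pi * N) for pi in p]
    print(N, log_multiplicity(k) / N)   # tends to shannon (about 1.0297)
```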

12 Observation 2: From factorization property to MEP.

(1/N) log G(k; q) = (1/N) log( q_1^{k_1} q_2^{k_2} ··· q_W^{k_W} )
  = (1/N) Σ_{i=1}^W k_i log q_i
  = Σ_{i=1}^W p_i log q_i
  = − S_cross[p|q]

13 Observation 2: From factorization property to MEP. The identified terms:

(1/N) log P(k; q) = (1/N) log M(k) + (1/N) log G(k; q)
  = − Σ_{i=1}^W p_i log(p_i) + Σ_{i=1}^W p_i log(q_i)
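
The combined identity can be verified numerically. As an aside not on the slide: the right-hand side equals minus the Kullback-Leibler divergence D(p‖q), so histogram probabilities decay as exp(−N D(p‖q)).

```python
from math import lgamma, log

# Check that (1/N) log P(k; q) approaches
#   -sum_i p_i log p_i + sum_i p_i log q_i  =  -D_KL(p || q)
# for fixed frequencies p_i = k_i/N as N grows. Values are hypothetical.
q = [0.25, 0.25, 0.5]
p = [0.5, 0.25, 0.25]
rhs = sum(pi * (log(qi) - log(pi)) for pi, qi in zip(p, q))

def log_P(k, q):
    """log P(k; q) = log M(k) + sum_i k_i log q_i, via log-Gamma."""
    out = lgamma(sum(k) + 1) - sum(lgamma(ki + 1) for ki in k)
    return out + sum(ki * log(qi) for ki, qi in zip(k, q))

for N in (100, 10000):
    k = [int(pi * N) for pi in p]
    print(N, log_P(k, q) / N, rhs)   # left value approaches rhs
```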

14 The Maximum Entropy Principle. Variable transformation q_i = exp(−α − β ε_i):

(1/N) log P(k; q) = − Σ_{i=1}^W p_i log(p_i) − α Σ_{i=1}^W p_i − β Σ_{i=1}^W p_i ε_i

The first term is constraint independent; the α and β terms are the constraints. What is the most likely histogram k, for fixed N?

0 = (∂/∂k_i) log P(k; q)
0 = (∂/∂p_i) ( − Σ_{i=1}^W p_i log(p_i) − α Σ_{i=1}^W p_i − β Σ_{i=1}^W p_i ε_i )
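
Carrying out the derivative gives −log p_i − 1 − α − β ε_i = 0, i.e. the Boltzmann form p_i = exp(−1 − α − β ε_i). A small sketch with hypothetical energies and β verifies the stationarity condition directly:

```python
from math import exp, log, isclose

# Stationarity of the slide's functional:
#   d/dp_i [ -sum p log p - alpha * sum p - beta * sum p*eps ]
#     = -log p_i - 1 - alpha - beta*eps_i  =  0
# is solved by p_i = exp(-beta*eps_i)/Z with alpha = log Z - 1.
# Hypothetical energies and inverse temperature:
eps = [1.0, 2.0, 3.0, 4.0]
beta = 0.7

Z = sum(exp(-beta * e) for e in eps)
p = [exp(-beta * e) / Z for e in eps]
alpha = log(Z) - 1            # chosen so that sum_i p_i = 1 is enforced

assert isclose(sum(p), 1.0)
for pi, e in zip(p, eps):
    grad = -log(pi) - 1 - alpha - beta * e
    assert abs(grad) < 1e-12   # every partial derivative vanishes
```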

15 Principle. If a factorization P(k; q) = M(k) G(k; q) exists, then there exists a MEP with

(1/φ(N)) log P(k; q) = (1/φ(N)) log M(k) + (1/φ(N)) log G(k; q)

where the first term on the right is the generalized entropy and the second carries the constraints.

16 When does factorization exist? There is little one can say in general, but... focus on processes out of equilibrium that are aging, non-Markovian, path-dependent.

17 Aging & memory. When do factorizations of P exist? Suppose the process remembers its entire past in terms of the histogram k. Consider growing sequences x = (x_1, ..., x_N) → x' = (x_1, ..., x_N, x_{N+1}), and let p̂(x_{N+1} | k; q) ... conditional probability to find x_{N+1} given the histogram k of x. IF the process follows p̂(x_{N+1} | k; q) and has stable solutions

d/dN k_i(N) = p̂(i | k; q)

THEN there exists a recursive definition of M(k) (similar to the multinomial coefficients). Asymptotically M(k) is given by a deformed multinomial...

18 Deformed factorials and deformed multinomials.

Deformed factorial:    N!_u = Π_{n=1}^N u(n)
Deformed multinomial:  M_{u,T}(k) = N!_u / Π_i ( N T(k_i/N) )!_u

with T monotonic, T(0) = 0, T(1) = 1. Deformed multinomials lead to trace-form entropies:

S[p] = (1/φ(N)) log M_{u,T}(k) = ... = Σ_{i=1}^W ∫_0^{p_i} dz Λ(z)

Λ(z) = (1/a) [ N T(z) / φ(N) ] ( log u(N T(z)) − log λ )
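
As a sanity check on the definitions, a short sketch: with the trivial deformation u(n) = n and T(z) = z, the deformed multinomial must reduce to the ordinary multinomial coefficient (the rounding of N·T(k_i/N) to an integer is an assumption made for the demo).

```python
from math import factorial, prod

def deformed_factorial(n, u):
    """N!_u = u(1) * u(2) * ... * u(N)."""
    return prod(u(m) for m in range(1, n + 1))

def deformed_multinomial(k, u, T):
    """M_{u,T}(k) = N!_u / prod_i ( N*T(k_i/N) )!_u.

    N*T(k_i/N) is rounded to an integer so the deformed factorial is defined;
    for T(z) = z this is exact, since N*(k_i/N) = k_i.
    """
    N = sum(k)
    denom = prod(deformed_factorial(round(N * T(ki / N)), u) for ki in k)
    return deformed_factorial(N, u) / denom

# Trivial deformation: recovers N! / (k_1! k_2! k_3!).
k = (3, 5, 2)
ordinary = factorial(sum(k)) // (factorial(3) * factorial(5) * factorial(2))
assert deformed_multinomial(k, u=lambda n: n, T=lambda z: z) == ordinary
```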

19 General solution for deformed multinomials. Asymptotic identity, for constants a and λ:

Λ(z) = (1/a) [ N T(z) / φ(N) ] ( log u(N T(z)) − log λ )

Λ(z) does not depend on N ⇒ separation of variables with separation constant ν:

Λ(z) = [ T'(z) T(z)^ν − T'(1) ] / [ T''(1) T(1) + ν T'(1)^2 ]
u(n) = ( λ^(n^ν) − 1 ) / ( λ − 1 )   ... q-shifted factorials (ν = 1)
φ(N) = N^{1+ν}

20 Happy Birthday. The MEP with Tsallis entropy, derived for a class of aging processes. For the simplest (and probably the generic) case T(z) = z:

Λ(z) = [ T'(z) T(z)^ν − T'(1) ] / [ T''(1) T(1) + ν T'(1)^2 ] = (z^ν − 1)/ν

and

S[p] = Σ_{i=1}^W ∫_0^{p_i} dz Λ(z) = K (1 − Σ_{i=1}^W p_i^Q) / (Q − 1)

where Q = 1 + ν and K is a constant.
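
The integral can be done in closed form and compared with the Tsallis expression. A sketch, assuming the constant K may absorb both a scale and an additive shift: with Λ(z) = (z^ν − 1)/ν one finds Σ_i ∫_0^{p_i} Λ dz = −(1 + S_Tsallis)/Q, an affine function of S_Tsallis = (1 − Σ p_i^Q)/(Q − 1).

```python
# Verify that Lambda(z) = (z^nu - 1)/nu integrates to a Tsallis-type entropy.
# Antiderivative: int_0^p (z^nu - 1)/nu dz = (p^Q/Q - p)/nu with Q = 1 + nu.
# Hand-derived affine relation (assumed, not stated on the slide):
#   sum_i int_0^{p_i} Lambda dz  =  -(1 + tsallis)/Q
nu = 0.25
Q = 1 + nu
p = [0.1, 0.2, 0.3, 0.4]     # hypothetical normalized distribution

S_from_Lambda = sum((pi**Q / Q - pi) / nu for pi in p)
tsallis = (1 - sum(pi**Q for pi in p)) / (Q - 1)

assert abs(S_from_Lambda + (1 + tsallis) / Q) < 1e-12
```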

21 Remark on Extensive vs MEP. Are the extensive (c, d)-entropies of a process equivalent to the (c, d)-entropy of the MEP? No, they are NOT, but they are related:

d_extensive = 1 / c_MEP

R. Hanel & S. Thurner, Generalized (c,d)-entropy and aging random walks, arXiv:

22 Example. Consider for example the conditional probability

p̂(x_{N+1} = i | k; q) = (1/Z(k)) q_i Π_{j=i+1}^W λ^(k_j^ν)

Plug p̂(i | k; q) into the recursive definition of M(k) [do some not-so-easy math]... Use S[p] = (1/φ(N)) log M(k) and solve the MEP to get theoretical predictions for the p_i, for fixed N. Generate sequences x of length N on the computer (with q_i = 1/W) and produce the histograms k_i of the sequences (full lines).
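The generating step can be sketched as a short simulation. Assumptions: the garbled deformation factor is read as λ^(k_j^ν), and the parameters follow the simulation slide (W = 50, q_i = 1/W, ν = 0.25, λ = 1.1).

```python
import random

# Sketch of the example process: draw symbol i with probability
#   p_hat(i | k) = (1/Z(k)) * q_i * prod_{j>i} lam**(k_j**nu)
# so the process remembers its whole past only through the histogram k.
random.seed(1)
W, N = 50, 1000
nu, lam = 0.25, 1.1
q = [1.0 / W] * W

k = [0] * W
for _ in range(N):
    kv = [kj ** nu for kj in k]
    # suffix[i] = sum_{j >= i} k_j^nu, so the product over j > i is lam**suffix[i+1]
    suffix = [0.0] * (W + 1)
    for j in range(W - 1, -1, -1):
        suffix[j] = suffix[j + 1] + kv[j]
    w = [q[i] * lam ** suffix[i + 1] for i in range(W)]
    i = random.choices(range(W), weights=w)[0]
    k[i] += 1

print(sum(k))  # 1000; the histogram k can now be compared with the MEP prediction
```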

23 Example.

p̂(i | k; q) = (1/Z(k)) q_i Π_{j=i+1}^W λ^(k_j^ν)

[Figure: simulated p_i (full lines) vs. theory for N = 1000, 5000, 10000, with ν = 0.25, λ = 1.1, ε_i = i; inset: α and β vs. N.]

(i) Simulation: generate sequences x on the computer with q_i = 1/W and W = 50; (ii) MEP prediction: p_i = exp_q(−α − β ε_i) with q = 1 + ν and ε_i = i.

24 FIN. Happy birthday, Constantino!


More information

SUPPLEMENT TO SLOPE IS ADAPTIVE TO UNKNOWN SPARSITY AND ASYMPTOTICALLY MINIMAX. By Weijie Su and Emmanuel Candès Stanford University

SUPPLEMENT TO SLOPE IS ADAPTIVE TO UNKNOWN SPARSITY AND ASYMPTOTICALLY MINIMAX. By Weijie Su and Emmanuel Candès Stanford University arxiv: arxiv:503.08393 SUPPLEMENT TO SLOPE IS ADAPTIVE TO UNKNOWN SPARSITY AND ASYMPTOTICALLY MINIMAX By Weijie Su and Emmanuel Candès Stanford University APPENDIX A: PROOFS OF TECHNICAL RESULTS As is

More information

CS 446 Machine Learning Fall 2016 Nov 01, Bayesian Learning

CS 446 Machine Learning Fall 2016 Nov 01, Bayesian Learning CS 446 Machine Learning Fall 206 Nov 0, 206 Bayesian Learning Professor: Dan Roth Scribe: Ben Zhou, C. Cervantes Overview Bayesian Learning Naive Bayes Logistic Regression Bayesian Learning So far, we

More information

A Generalized Geometric Distribution from Vertically Dependent Bernoulli Random Variables

A Generalized Geometric Distribution from Vertically Dependent Bernoulli Random Variables Academic Advances of the CTO, Vol. 1, Issue 3, 2017 Original Research Article A Generalized Geometric Distribution from Vertically Dependent Bernoulli Random Variables Rachel Traylor, Ph.D. Abstract This

More information

1 Randomized reduction: a first start

1 Randomized reduction: a first start 6.S078 Fine-Grained Algorithms and Complexity MIT Lectures 3 and 4 February 14 and 21, 2018 Today: In the previous two lectures, we mentioned ETH and SETH, about the time complexity of SAT. Today we want

More information

Pattern Recognition and Machine Learning. Bishop Chapter 2: Probability Distributions

Pattern Recognition and Machine Learning. Bishop Chapter 2: Probability Distributions Pattern Recognition and Machine Learning Chapter 2: Probability Distributions Cécile Amblard Alex Kläser Jakob Verbeek October 11, 27 Probability Distributions: General Density Estimation: given a finite

More information

MAT Mathematics in Today's World

MAT Mathematics in Today's World MAT 1000 Mathematics in Today's World Last Time We discussed the four rules that govern probabilities: 1. Probabilities are numbers between 0 and 1 2. The probability an event does not occur is 1 minus

More information

Entropy for complex systems and its use in path dependent non-ergodic processes

Entropy for complex systems and its use in path dependent non-ergodic processes Entropy for complex systems and its use in path dependent non-ergodic processes Stefan Thurner www.complex-systems.meduniwien.ac.at www.santafe.edu rio oct 21 2015 with Rudolf Hanel and Bernat Corominas-Murtra

More information

Probabilistic modeling. The slides are closely adapted from Subhransu Maji s slides

Probabilistic modeling. The slides are closely adapted from Subhransu Maji s slides Probabilistic modeling The slides are closely adapted from Subhransu Maji s slides Overview So far the models and algorithms you have learned about are relatively disconnected Probabilistic modeling framework

More information

Introduction to Machine Learning

Introduction to Machine Learning What does this mean? Outline Contents Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola December 26, 2017 1 Introduction to Probability 1 2 Random Variables 3 3 Bayes

More information

Machine Learning for Signal Processing Expectation Maximization Mixture Models. Bhiksha Raj 27 Oct /

Machine Learning for Signal Processing Expectation Maximization Mixture Models. Bhiksha Raj 27 Oct / Machine Learning for Signal rocessing Expectation Maximization Mixture Models Bhiksha Raj 27 Oct 2016 11755/18797 1 Learning Distributions for Data roblem: Given a collection of examples from some data,

More information

arxiv: v1 [cond-mat.stat-mech] 6 Mar 2008

arxiv: v1 [cond-mat.stat-mech] 6 Mar 2008 CD2dBS-v2 Convergence dynamics of 2-dimensional isotropic and anisotropic Bak-Sneppen models Burhan Bakar and Ugur Tirnakli Department of Physics, Faculty of Science, Ege University, 35100 Izmir, Turkey

More information

MFE6516 Stochastic Calculus for Finance

MFE6516 Stochastic Calculus for Finance MFE6516 Stochastic Calculus for Finance William C. H. Leon Nanyang Business School December 11, 2017 1 / 51 William C. H. Leon MFE6516 Stochastic Calculus for Finance 1 Symmetric Random Walks Scaled Symmetric

More information

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R. Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions

More information

Lecture 3: Randomness in Computation

Lecture 3: Randomness in Computation Great Ideas in Theoretical Computer Science Summer 2013 Lecture 3: Randomness in Computation Lecturer: Kurt Mehlhorn & He Sun Randomness is one of basic resources and appears everywhere. In computer science,

More information

Mathematical Modelling Lecture 13 Randomness

Mathematical Modelling Lecture 13 Randomness Lecture 13 Randomness phil.hasnip@york.ac.uk Overview of Course Model construction dimensional analysis Experimental input fitting Finding a best answer optimisation Tools for constructing and manipulating

More information

Discrete Mathematics and Probability Theory Summer 2017 Hongling Lu, Vrettos Moulos, and Allen Tang HW 6

Discrete Mathematics and Probability Theory Summer 2017 Hongling Lu, Vrettos Moulos, and Allen Tang HW 6 CS 70 Discrete Mathematics and Probability Theory Summer 017 Hongling Lu, Vrettos Moulos, and Allen Tang HW 6 Sundry Before you start your homework, write down your team. Who else did you work with on

More information

Nonextensive Aspects of Self- Organized Scale-Free Gas-Like Networks

Nonextensive Aspects of Self- Organized Scale-Free Gas-Like Networks Nonextensive Aspects of Self- Organized Scale-Free Gas-Like Networks Stefan Thurner Constantino Tsallis SFI WORKING PAPER: 5-6-6 SFI Working Papers contain accounts of scientific work of the author(s)

More information

CS1512 Foundations of Computing Science 2. Lecture 4

CS1512 Foundations of Computing Science 2. Lecture 4 CS1512 Foundations of Computing Science 2 Lecture 4 Bayes Law; Gaussian Distributions 1 J R W Hunter, 2006; C J van Deemter 2007 (Revd. Thomas) Bayes Theorem P( E 1 and E 2 ) = P( E 1 )* P( E 2 E 1 ) Order

More information

On the Entropy of Sums of Bernoulli Random Variables via the Chen-Stein Method

On the Entropy of Sums of Bernoulli Random Variables via the Chen-Stein Method On the Entropy of Sums of Bernoulli Random Variables via the Chen-Stein Method Igal Sason Department of Electrical Engineering Technion - Israel Institute of Technology Haifa 32000, Israel ETH, Zurich,

More information

Probability Distributions.

Probability Distributions. Probability Distributions http://www.pelagicos.net/classes_biometry_fa18.htm Probability Measuring Discrete Outcomes Plotting probabilities for discrete outcomes: 0.6 0.5 0.4 0.3 0.2 0.1 NOTE: Area within

More information

Some Statistics. V. Lindberg. May 16, 2007

Some Statistics. V. Lindberg. May 16, 2007 Some Statistics V. Lindberg May 16, 2007 1 Go here for full details An excellent reference written by physicists with sample programs available is Data Reduction and Error Analysis for the Physical Sciences,

More information

Continuous-Valued Probability Review

Continuous-Valued Probability Review CS 6323 Continuous-Valued Probability Review Prof. Gregory Provan Department of Computer Science University College Cork 2 Overview Review of discrete distributions Continuous distributions 3 Discrete

More information

Weighted differential entropy based approaches to dose-escalation in clinical trials

Weighted differential entropy based approaches to dose-escalation in clinical trials Weighted differential entropy based approaches to dose-escalation in clinical trials Pavel Mozgunov, Thomas Jaki Medical and Pharmaceutical Statistics Research Unit, Department of Mathematics and Statistics,

More information