S. R. S. Varadhan, by Professor Tom Louis Lindstrøm


Srinivasa S. R. Varadhan was born in Madras (Chennai), India, in 1940. He got his B.Sc. from Presidency College in 1959 and his Ph.D. from the Indian Statistical Institute in 1963. Since 1963 he has been working at the Courant Institute of Mathematical Sciences at New York University. The Courant Institute, one of the world's leading centers of applied mathematics, is also the home institution of the 2005 Abel Laureate, Peter Lax. Although his work is often motivated by problems in neighboring fields such as mathematical physics and partial differential equations, Varadhan is primarily a probabilist.

Historically, probability theory started as an attempt to understand simple games of chance, often with betting involved. In such games there is a finite number of possible outcomes, and the task is to find the probability of each one. Although this may sound simple, these problems often require a lot of ingenuity. The subject soon moved on, however, to more important and difficult issues. These problems often have to do with what happens if one repeats the same experiment over and over again, and the mathematical laws governing these repeated experiments are often called limit laws, as they describe what happens in the limit as one performs the experiments more and more times. Two of these limit laws we all have some experience with.

To describe the first of these laws, let us assume that we are tossing a coin many times. If the coin is fair, we would expect the proportion of heads to stabilize around 1/2 as we

toss the coin more and more times. This is an instance of the Law of Large Numbers, which says that if we repeat the experiment more and more times, the proportion of heads will (with probability one) go to exactly 1/2. The figures above illustrate what happens. In each case we have performed four sequences of coin tosses and computed the proportion of heads. The figure in the upper left-hand corner shows the results when we toss the coin 50 times. The other figures show the results when we toss the coin 100 times, 200 times and 1000 times, respectively. We see that the convergence is rather slow; even when we toss the coin 1000 times, the four sequences are still clearly separated.

To describe the other limit law that we all have some experience with, let us assume that we are measuring the heights of Norwegian (male) soldiers. If we make a diagram of how many soldiers there are of each height (i.e. how many are 175 cm tall, how many are 176 cm, etc.), we soon see that this diagram takes on a bell-shaped form, and as we add the heights of more and more soldiers, the curve becomes more and more regular. The curve is symmetric around its center, and the center point is the average height (around 180 cm for Norwegian male soldiers). If we did the same experiment with female soldiers, we would get the same kind of bell-shaped curve, but with a different center (as women are on average not as tall as men) and a slightly different width.

What we are seeing here are consequences of the Central Limit Theorem, which basically says that statistical properties which depend on a lot of small, independent factors have a bell-shaped distribution. These bell-shaped distributions are called normal distributions, and different normal distributions are described by two numbers (the mean and the variance): one telling us where the center is and the other telling us how wide and flat the curve is. The figure below shows two normal distributions with different centers and different widths, i.e.
different means and different variances.

Why are these limit laws important? They are important because in most practical situations one is interested in large collections of statistical data. If you are running a car insurance company, you are not interested in each individual car, but in how many accidents (and what kinds of accidents) all the cars you insure will be involved in. If you are building an oil drilling platform in the North Sea, and you worry about the impact of the ocean waves on the construction, you don't worry about the impact of each individual wave, but about the collective impact of all of them. If you are building a telephone network and worry about the capacity, you are not interested in each

individual customer, but you worry about the probability that too many of them will pick up the phone at the same time during peak hours.

If you look at the mathematical problems suggested by the examples above, you will find that many of them can be solved using the Law of Large Numbers and the Central Limit Theorem. But not all! An interesting question they cannot tackle is the question of large deviations. To explain this problem, let us go back to coin tossing. If we flip a coin many times, we expect the proportion of heads to be around 1/2. But this need not happen: even if you flip the coin a thousand times, there is a small (extremely small!) probability that the coin will show heads every time. There is a larger, but still extremely small, probability that the proportion of heads will be (say) 3/4 instead of 1/2. The art of large deviations is to calculate the probability of such rare events.

Large deviations were first studied by the great Swedish statistician and insurance mathematician Harald Cramér (1893-1985) in the late 1930s. It is easy to see why the problem would attract the attention of somebody interested in insurance mathematics. The premium you have to pay for your car insurance is based on the statistics from previous years: the company needs to collect enough money to cover the claims of unfortunate drivers. But what if this year happens to be an extremely bad year, and for some unforeseen reason (perhaps just bad luck) many more cars crash than in previous years? If the company has to pay out more money than it has collected, it is obviously in trouble! There is no way one can totally avoid this problem: if the company made the premium so high that it covered the unlikely (but still possible) case that all cars crashed, the insurance would be so expensive that nobody would buy it!
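For the coin example, the rare event can actually be put into numbers. The probability that at least 3/4 of 1000 fair tosses come up heads can be computed exactly and compared with Cramér's exponential estimate exp(-nI(a)), where I(a) = a ln(2a) + (1-a) ln(2(1-a)) is the rate function for a fair coin. A minimal Python sketch (the function names here are our own, not from any of the works discussed):

```python
from math import comb, exp, log

def exact_tail(n, k):
    """Exact probability of getting at least k heads in n fair coin tosses."""
    return sum(comb(n, j) for j in range(k, n + 1)) / 2 ** n

def rate(a):
    """Cramer's rate function I(a) for a fair coin, 0 < a < 1."""
    return a * log(2 * a) + (1 - a) * log(2 * (1 - a))

n, a = 1000, 3 / 4
# Exact tail probability: astronomically small, but strictly positive.
print(exact_tail(n, int(a * n)))
# Cramer's estimate exp(-n I(a)) is an upper bound of the same
# exponential order: roughly 10**-57 for n = 1000 and a = 3/4.
print(exp(-n * rate(a)))
```

The point of the exercise is the exponential decay rate: both numbers shrink like exp(-nI(a)) as n grows, which is exactly the kind of statement large deviation theory makes precise.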
Now, a bad year is a large deviation, and what the company needs to do is to compute the probability of these large deviations of various sizes, and try to find a reasonable level of risk. There are similar problems in our other examples above: if you build an oil rig in the North Sea, you have to worry about the probability of extremely large and extremely rare waves (the "hundred year wave"), and if you set up a telephone network, you need to know how likely it is that it will occasionally break down due to overload. It may be worthwhile to invest a little more in increased capacity rather than have to face angry customers every now and then!

One of Varadhan's great contributions is to have turned the technique of large deviations into a very strong and versatile tool with applications in many areas of mathematics and related sciences (some of this work was done in collaboration with his colleague at the Courant Institute, Monroe Donsker). Varadhan's Large Deviation Principle succinctly sums up what is needed to apply the technique successfully, and it covers a surprising number of seemingly different situations. The theory is a tour de force of many areas of mathematics; it combines probability theory with convex analysis, nonlinear programming, functional analysis and partial differential equations.

It turns out that the theory of large deviations is much more subtle than the theory of classical limit laws such as the Law of Large Numbers and the Central Limit Theorem. In these limit laws, the important thing is that the same kind of event is counted again and again; the nature of

each individual event is of little importance (or, more correctly, what is important is easily summarized in two numbers, the mean and the variance). In large deviations, the nature of the individual events is of the utmost importance: different kinds of events give rise to quite different probabilities for large deviations. An insurance company which wants to compute precise estimates for large deviations hence needs to know more about car accidents than just how often they happen and how much they cost on average.

In addition to the examples I have already referred to, I would like to add the many applications that large deviation theory has found in mathematical physics. Many physical theories are statistical in nature. If, for instance, you want to describe the air in this room or the water in a flowing river, you cannot possibly describe the motion of each individual molecule or particle involved. Instead you describe the statistical behavior of the totality of particles in terms of macroscopic quantities such as pressure and flux. The laws and equations you then get are not probabilistic; they just describe the average or expected behavior of the gas or the fluid. You may think of this as a much more complicated version of our coin-tossing experiment: in that experiment, the average behavior of our probabilistic experiment was summed up in the simple number 50%; in the present case, the total probabilistic behavior of the particles is summed up in the laws of thermodynamics and hydrodynamics! But as in the coin-tossing game, there are fluctuations in this situation as well: perhaps there is a very small probability that all the air in this room will suddenly concentrate at this end and leave you suffocated at the other end!
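How small is "very small"? A back-of-envelope model (our own illustration, not part of the original text) makes the point: if each of N molecules is, crudely, independently equally likely to sit in either half of the room, the chance that all N end up in one given half is 2^(-N).

```python
from math import log10

def log10_prob_all_in_one_half(n_molecules):
    """Base-10 logarithm of 2**-n: the probability that all n independent
    molecules occupy one chosen half of the room in this toy model."""
    return -n_molecules * log10(2)

# Already for a mere 1000 molecules the probability is about 10**-301;
# a real room contains on the order of 10**27 molecules.
print(log10_prob_all_in_one_half(1000))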
In fact, these problems are even more complicated than what I have described so far, since the behavior of the particles is not really random at all: they move according to the fundamental laws of physics, and their behavior only looks random because they collide and interact all the time. The derivation of the equations of hydrodynamics and thermodynamics from first principles is thus a two-stage process: first to derive the statistical behavior of the particles from the laws of physics, and then to deduce the macroscopic laws from this statistical description. With collaborators, Varadhan has done impressive work in this area, often using large deviations as a tool.

Let us go back to the coin tossing experiment to take a look at another aspect of Varadhan's work. We can think of coin tossing as a very simple gambling game: each time we toss the coin and heads comes up, I win and you have to pay me a dollar, but each time tails comes up, you win and I have to pay you a dollar. This game is fair in an obvious sense: on average, I cannot expect to win anything, and neither can you! Let us change the game slightly and throw a die instead of a coin. This time I get 5 dollars from you each time we throw a six, but you get one dollar from me each time one of the other sides comes up. The game is still fair in the sense I described above: I get five times as much as you each time I win, but you can expect to win five times as often, and neither of us can expect to win anything in the long run. Games which are fair in this sense are called martingales, and this notion can be generalized to much more general contexts. Over the last fifty years, it has become clear that martingales are extremely useful tools in the study of random phenomena. In the 1970s, Varadhan and D. W. Stroock wrote an impressive series of papers on so-called martingale problems,

culminating in their book Multidimensional Diffusion Processes of 1979. Their approach unified, simplified and extended the previous results in the area substantially. The basic idea is that instead of looking for solutions to quite complicated problems of mathematical analysis, all one has to look for is a probability distribution which turns certain processes into martingales.

I have mentioned some of the most important of Varadhan's contributions to mathematics, but there are many more. He is a prolific scientist with deep insights and an impressive array of technical tools, and he is very highly regarded and esteemed in the probability community. This does not only have to do with his results, but also with his style: listening to a lecture by Varadhan, one is not only exposed to the best and most recent results in the subject, but one is also introduced to a way of thinking. His talks always emphasize the basic ideas, the challenges, the obstacles, and the delicate balance between the desirable and the possible which one has to strike to produce top class mathematical results. S. R. S. Varadhan is certainly a worthy winner of the Abel Prize!
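As a small footnote, the fairness of the die game described above can be checked directly: the expected gain per throw is (1/6)·5 + (5/6)·(-1) = 0, and a short simulation (our own sketch, with hypothetical helper names) shows the long-run average hovering near zero, just as the Law of Large Numbers predicts.

```python
import random
from fractions import Fraction

def expected_gain():
    """Exact expected winnings per throw: win 5 on a six, lose 1 otherwise."""
    return Fraction(1, 6) * 5 + Fraction(5, 6) * (-1)

def average_gain(n_rounds, seed=0):
    """Average winnings per throw over n_rounds simulated throws."""
    rng = random.Random(seed)
    total = sum(5 if rng.randint(1, 6) == 6 else -1 for _ in range(n_rounds))
    return total / n_rounds

print(expected_gain())        # 0: the game is a martingale-style fair game
print(average_gain(100_000))  # close to 0, by the Law of Large Numbers
```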