Zurich Open Repository and Archive
University of Zurich, Main Library, Strickhofstrasse 39, CH-8057 Zurich
www.zora.uzh.ch

Year: 1996

A proof of a conjecture of Bobkov and Houdré

Kwapien, S; Pycia, Marek; Schachermayer, W

DOI: https://doi.org/10.1214/ecp.v1-97

Posted at the Zurich Open Repository and Archive, University of Zurich
ZORA URL: https://doi.org/10.5167/uzh-151941
Published Version

Originally published at:
Kwapien, S; Pycia, Marek; Schachermayer, W (1996). A proof of a conjecture of Bobkov and Houdré. Electronic Communications in Probability, 1(0):7-10. DOI: https://doi.org/10.1214/ecp.v1-97
Elect. Comm. in Probab. 1 (1996) 7–10

ELECTRONIC COMMUNICATIONS in PROBABILITY

A PROOF OF A CONJECTURE OF BOBKOV AND HOUDRÉ

S. KWAPIEŃ and M. PYCIA
Department of Mathematics, Warsaw University, ul. Banacha 2, 02-097 Warsaw, Poland.
e-mail: kwapstan@mimuw.edu.pl, mpycia@mimuw.edu.pl

W. SCHACHERMAYER
Department of Statistics, University of Vienna, Brünnerstrasse 72, A-1210 Wien, Austria.
e-mail: wschach@stat1.bwl.univie.ac.at

AMS 1991 Subject classification: 60E05
Keywords and phrases: Gaussian Distribution, Characterization.

Abstract

S.G. Bobkov and C. Houdré recently posed the following question on the Internet ([1]): Let $X$, $Y$ be symmetric i.i.d. random variables such that
\[
\mathbb{P}\{|X+Y|/\sqrt{2} \ge t\} \le \mathbb{P}\{|X| \ge t\}, \quad \text{for each } t > 0.
\]
Does it follow that $X$ has a finite second moment (which then easily implies that $X$ is Gaussian)? In this note we give an affirmative answer to this problem and present a proof. Using a different method, K. Oleszkiewicz has found another proof of this conjecture, as well as further related results.

We prove the following:

Theorem. Let $X$, $Y$ be symmetric i.i.d. random variables. If, for each $t > 0$,
\[
\mathbb{P}\{|X+Y|/\sqrt{2} \ge t\} \le \mathbb{P}\{|X| \ge t\}, \tag{1}
\]
then $X$ is Gaussian.

Proof.

Step 1. $\mathbb{E}\{|X|^p\} < \infty$ for $0 \le p < 2$.

For this purpose it will suffice to show that, for $p < 2$, $X$ has a finite weak $p$-th moment, i.e., that there are constants $C_p$ such that $\mathbb{P}\{|X| \ge t\} \le C_p t^{-p}$. To do so, it is enough to show that, for $\varepsilon > 0$, $\delta > 0$, we can find $t_0$ such that, for $t \ge t_0$, we have
\[
\mathbb{P}\{|X| \ge \sqrt{2}(1+\varepsilon)t\} \le \frac{1}{2-\delta}\,\mathbb{P}\{|X| \ge t\}. \tag{2}
\]
(Indeed, iterating (2) along the sequence $t_0, \sqrt{2}(1+\varepsilon)t_0, (\sqrt{2}(1+\varepsilon))^2 t_0, \dots$ yields $\mathbb{P}\{|X| \ge t\} \le C t^{-p}$ with $p = \log(2-\delta)/\log(\sqrt{2}(1+\varepsilon))$, and $p \to 2$ as $\varepsilon, \delta \to 0$.)

Fix $\varepsilon > 0$. Then:
\begin{align*}
\mathbb{P}\{|X+Y| \ge \sqrt{2}t\}
&= 2\,\mathbb{P}\{X+Y \ge \sqrt{2}t\} \\
&\ge 2\,\mathbb{P}\{X \ge \sqrt{2}(1+\varepsilon)t,\ Y \ge -\sqrt{2}\varepsilon t,\ \text{or}\ Y \ge \sqrt{2}(1+\varepsilon)t,\ X \ge -\sqrt{2}\varepsilon t\} \\
&\ge 2\big(2\,\mathbb{P}\{X \ge \sqrt{2}(1+\varepsilon)t\}\,\mathbb{P}\{Y \ge -\sqrt{2}\varepsilon t\} - \mathbb{P}\{X \ge \sqrt{2}(1+\varepsilon)t\}\,\mathbb{P}\{Y \ge \sqrt{2}(1+\varepsilon)t\}\big) \\
&= \mathbb{P}\{|X| \ge \sqrt{2}(1+\varepsilon)t\}\big(2\,\mathbb{P}\{Y \ge -\sqrt{2}\varepsilon t\} - \tfrac{1}{2}\,\mathbb{P}\{|X| \ge \sqrt{2}(1+\varepsilon)t\}\big) \\
&\ge (2-\delta)\,\mathbb{P}\{|X| \ge \sqrt{2}(1+\varepsilon)t\},
\end{align*}
where $\delta > 0$ may be taken arbitrarily small for $t$ large enough. Using (1) we obtain inequality (2).

Step 2. Let $\alpha_1, \dots, \alpha_n$ be real numbers such that $\alpha_1^2 + \dots + \alpha_n^2 \le 1$ and let $(X_i)$ be i.i.d. copies of $X$; then
\[
\mathbb{E}\{|\alpha_1 X_1 + \dots + \alpha_n X_n|\} \le 2\,\mathbb{E}\{|X|\}.
\]

We shall repeatedly use the following result:

Fact: Let $S$ and $T$ be symmetric random variables such that $\mathbb{P}\{|S| \ge t\} \le \mathbb{P}\{|T| \ge t\}$ for all $t > 0$, and let the random variable $X$ be independent of $S$ and $T$. Then
\[
\mathbb{E}\{|S+X|\} \le \mathbb{E}\{|T+X|\}.
\]
Indeed, for fixed $x \in \mathbb{R}$, the function $h(s) = |s+x| + |s-x|$ is symmetric and non-decreasing in $s \in \mathbb{R}_+$, and therefore
\[
\mathbb{E}\{|S+x|\} = \mathbb{E}\Big\{\frac{|S+x| + |S-x|}{2}\Big\} \le \mathbb{E}\Big\{\frac{|T+x| + |T-x|}{2}\Big\} = \mathbb{E}\{|T+x|\};
\]
integrating in $x$ with respect to the law of $X$ yields the claim.

Now take a sequence $\beta_1, \dots, \beta_n \in \{2^{-k/2} : k \in \mathbb{N}_0\}$ such that $\alpha_i/\sqrt{2} \le \beta_i < \alpha_i$. Then $\beta_1^2 + \dots + \beta_n^2 \le 1$ and
\[
\mathbb{E}\{|\alpha_1 X_1 + \dots + \alpha_n X_n|\} \le \sqrt{2}\,\mathbb{E}\{|\beta_1 X_1 + \dots + \beta_n X_n|\}.
\]
If there are $i \ne j$ with $\beta_i = \beta_j$ we may replace $\beta_1, \dots, \beta_n$ by $\gamma_1, \dots, \gamma_{n-1}$ with $\sum_{i=1}^{n} \beta_i^2 = \sum_{j=1}^{n-1} \gamma_j^2$ and
\[
\mathbb{E}\Big\{\Big|\sum_{i=1}^{n} \beta_i X_i\Big|\Big\} \le \mathbb{E}\Big\{\Big|\sum_{j=1}^{n-1} \gamma_j X_j\Big|\Big\}. \tag{3}
\]
Indeed, supposing without loss of generality that $i = n-1$ and $j = n$, we let $\gamma_i = \beta_i$, for $i = 1, \dots, n-2$, and $\gamma_{n-1} = \sqrt{2}\beta_{n-1} = \sqrt{2}\beta_n$. With this definition we obtain (3) from (1) and the above-mentioned Fact.

Applying the above argument a finite number of times we end up with $1 \le m \le n$ and numbers $(\gamma_j)_{j=1}^{m}$ in $\{2^{-k/2} : k \in \mathbb{N}_0\}$, $\gamma_i \ne \gamma_j$ for $i \ne j$, satisfying $\sum_{j=1}^{m} \gamma_j^2 \le 1$ and
\[
\mathbb{E}\Big\{\Big|\sum_{i=1}^{n} \alpha_i X_i\Big|\Big\} \le \sqrt{2}\,\mathbb{E}\Big\{\Big|\sum_{j=1}^{m} \gamma_j X_j\Big|\Big\}.
\]
To estimate this last expression it suffices to consider the extreme case $\gamma_j = 2^{-(j-1)/2}$, for $j = 1, \dots, m$. In this case, applying again repeatedly the argument used to obtain (3):
\begin{align*}
\mathbb{E}\Big\{\Big|\sum_{j=1}^{m} 2^{-(j-1)/2} X_j\Big|\Big\}
&\le \mathbb{E}\Big\{\Big|\sum_{j=1}^{m-1} 2^{-(j-1)/2} X_j + 2^{-(m-2)/2} X_m\Big|\Big\} \\
&\le \mathbb{E}\Big\{\Big|\sum_{j=1}^{m-2} 2^{-(j-1)/2} X_j + 2^{-(m-3)/2} X_{m-1}\Big|\Big\} \\
&\le \dots \le \mathbb{E}\{|X_1 + X_2|\} \le \mathbb{E}\{|\sqrt{2} X_1|\} = \sqrt{2}\,\mathbb{E}\{|X_1|\}.
\end{align*}

Step 3. $\mathbb{E}\{X^2\} < \infty$.

We deduce from Step 2 that for a sequence $(\alpha_i)$ with $\sum \alpha_i^2 < \infty$ the series $\sum \alpha_i X_i$ converges in mean and therefore almost surely. Using the notation
\[
[S] = \begin{cases} S & \text{if } |S| \le 1, \\ \operatorname{sign}(S) & \text{if } |S| > 1, \end{cases}
\]
for a random variable $S$, we deduce from Kolmogorov's three-series theorem that
\[
\sum \mathbb{E}\{[\alpha_i X_i]^2\} < \infty.
\]
Suppose now that $\mathbb{E}\{X^2\} = \infty$; this implies that for every $C > 0$ we can find $\alpha > 0$ such that $\mathbb{E}\{[\alpha X]^2\} \ge C\alpha^2$. From this inequality it is straightforward to construct a sequence $(\alpha_i)$ such that $\sum \mathbb{E}\{[\alpha_i X_i]^2\} = \infty$ while $\sum \alpha_i^2 < \infty$, a contradiction proving Step 3.

Step 4. Finally, we show how $\mathbb{E}\{X^2\} < \infty$ implies that $X$ is normal.

We follow the argument of Bobkov and Houdré [2]. The finiteness of the second moment implies that we must have equality in the assumption of the theorem, i.e.,
\[
\mathbb{P}\{|X+Y|/\sqrt{2} \ge t\} = \mathbb{P}\{|X| \ge t\}.
\]
Indeed, assuming that there is strict inequality in (1) for some $t > 0$, we would obtain that the second moment of $(X+Y)/\sqrt{2}$ is strictly smaller than the second moment of $X$, which leads to a contradiction:
\[
\mathbb{E}\{X^2\} > \mathbb{E}\Big\{\Big(\frac{X+Y}{\sqrt{2}}\Big)^2\Big\} = \tfrac{1}{2}\mathbb{E}\{X^2\} + \tfrac{1}{2}\mathbb{E}\{Y^2\} = \mathbb{E}\{X^2\}.
\]
Hence $2^{-n/2}(X_1 + \dots + X_{2^n})$ has the same distribution as $X$ and we deduce from the Central Limit Theorem that $X$ is Gaussian.
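As a numerical illustration of the equality established in Step 4 (this sanity check is ours, not part of the original paper): when $X, Y$ are i.i.d. standard Gaussians, $(X+Y)/\sqrt{2}$ has exactly the law of $X$, so the hypothesis (1) holds with equality at every $t$. A short Monte Carlo sketch, using only the Python standard library:

```python
# Monte Carlo check (illustrative, not from the paper): for i.i.d. Gaussian
# X, Y the variable (X + Y)/sqrt(2) is again standard Gaussian, so the
# tail probabilities in hypothesis (1) coincide.
import math
import random

random.seed(0)
n = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [random.gauss(0.0, 1.0) for _ in range(n)]

def tail(samples, t):
    """Empirical P{|S| >= t} over a list of samples."""
    return sum(1 for s in samples if abs(s) >= t) / len(samples)

for t in (0.5, 1.0, 2.0):
    lhs = tail([(x + y) / math.sqrt(2) for x, y in zip(xs, ys)], t)
    rhs = tail(xs, t)
    # Exact tail of the standard normal: P{|Z| >= t} = erfc(t / sqrt(2)).
    exact = math.erfc(t / math.sqrt(2))
    print(f"t={t}: P{{|X+Y|/sqrt2 >= t}}={lhs:.4f}  P{{|X| >= t}}={rhs:.4f}  exact={exact:.4f}")
```

For a non-Gaussian symmetric law with finite variance, the same comparison would show a strict gap at some $t$, in line with the theorem.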
References

[1] S.G. Bobkov, C. Houdré (1995): Open Problem, Stochastic Analysis Digest 15.

[2] S.G. Bobkov, C. Houdré (1995): A characterization of Gaussian measures via the isoperimetric property of half-spaces, preprint.