About Closedness by Convolution of the Tsallis Maximizers
C. Vignat*, A. O. Hero III and J. A. Costa
E.E.C.S., University of Michigan, Ann Arbor, USA
(*) L.T.C.I., Université de Marne la Vallée, France

Preprint submitted to Elsevier Preprint, 10 November 2003

Abstract

In this paper, we study the stability under convolution of the maximizing distributions of the Tsallis entropy under energy constraint (called hereafter Tsallis distributions). These distributions are shown to obey three important properties: a stochastic representation property, an orthogonal invariance property and a duality property. As a consequence of these properties, the behaviour of Tsallis distributions under convolution is characterized. Finally, a special random convolution, called the Kingman convolution, is shown to ensure the stability of Tsallis distributions.

Key words: Tsallis entropy, convolution

Introduction

It is well-known that the Boltzmann distributions are the maximizers of the Shannon-Boltzmann entropy under energy (covariance) constraint. These Boltzmann distributions enjoy the stability property for addition: if X and Y are independent vectors, each distributed according to a Boltzmann law, then their sum Z = X + Y is again of the Boltzmann type. This property is important in physics since it is involved in some very general results like the central limit theorem [6]. The Tsallis entropy was introduced by Tsallis [10], and has proved a very efficient tool to describe the behavior of complex systems like the interior solar plasma [1] or self-gravitating systems [2]. A particular case of the Tsallis entropy family is the Shannon-Boltzmann entropy. The maximizers of the Tsallis entropy under covariance constraint - called Tsallis distributions in this paper - were studied in detail by several authors,
in the scalar case in [3], in the multivariate case in [4] and more recently in [5]. A natural question is thus to explore the stability under convolution of these Tsallis distributions. This is the problem addressed in this paper.

1 Tsallis distributions

The order-q Tsallis entropy H_q(f) of a continuous probability density f writes

H_q(f) = (1/(q−1)) (1 − ∫_Ω f^q).

It can be easily checked that lim_{q→1} H_q(f) exists and coincides with the celebrated Boltzmann-Shannon entropy H_1(f) = −∫ f log f. In this paper, x denotes an n-dimensional real-valued random vector with covariance matrix K = E (x − µ_X)(x − µ_X)^T. Without loss of generality, we consider only the centered case µ_X = 0. We define next the following n-variate probability density f_q: if n/(n+2) < q,

f_q(x) = A_q (1 − (q−1) β x^T K^{−1} x)_+^{1/(q−1)}, ∀x ∈ R^n,  (1)

with (x)_+ = max(0, x) and β = 1/(n(q−1) + 2q). The problem of maximization of the Tsallis entropy under energy constraint can be easily solved using a Bregman information divergence, as described in the following theorem.

Theorem 1 f_q defined by (1) is the only probability density that verifies

f_q = arg max {H_q(f) : E xx^T = K}.

PROOF. Consider the following non-symmetric Bregman divergence

D_q(f‖g) = sign(q−1) ∫ (f^q + (q−1) g^q − q f g^{q−1}).  (2)

The positivity of D_q(f‖g) (with nullity if and only if f = g pointwise) is a consequence of the convexity of the function x ↦ sign(q−1) x^q. Suppose for example q > 1: the fact that a distribution f has the same covariance K as f_q defined by (1) can be expressed by

∫ f f_q^{q−1} = ∫ f_q^q,
so that

0 ≤ D_q(f‖f_q) = ∫ f^q + (q−1) ∫ f_q^q − q ∫ f f_q^{q−1} = ∫ f^q − ∫ f_q^q = (q−1) (H_q(f_q) − H_q(f)).

The proof in the case q < 1 follows accordingly.

The Tsallis distributions verify three important properties:

• the stochastic representation property. If X is Tsallis distributed with parameter q < 1 and covariance matrix K, then

X =_d C^{1/2} N / A,  (3)

where A is a chi random variable with m = 2/(1−q) − n degrees of freedom, independent of the Gaussian vector N (E NN^T = I), and with C = (m−2) K. If Y is Tsallis distributed with parameter q > 1, then

Y =_d C^{1/2} N / √(A² + ‖N‖²),

where A is a chi random variable with p = 2/(q−1) + 2 degrees of freedom, independent of N, and with C = (p+n) K. Note that the denominator is again a chi random variable, but that, contrary to the case q < 1, it is now dependent on the numerator.

• the orthogonal invariance property. The Tsallis distributions write as f_q(x) = φ_q(x^T K^{−1} x). This property is a direct consequence of the invariance under orthogonal transformation of the covariance constraint.

• the duality property. There is a natural bijection between the cases q < 1 and q > 1: if X is Tsallis with parameter q < 1, m = 2/(1−q) − n and C = (m−2) K, then

Y = X / √(1 + X^T C^{−1} X)

is Tsallis with covariance matrix ((m−2)/(m+n)) K and parameter q′ > 1 such that

2/(q′−1) + 2 = 2/(1−q) − n.  (4)
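A minimal numerical sketch (variable names and tolerances are ours) of the covariance constraint of Theorem 1 and of the stochastic representation (3), taking the scalar case n = 1 with q = 1/2, for which β = 2, m = 3, and (1) with K = 1 reduces to f_q(x) = (2/π)(1 + x²)^{−2}:

```python
import numpy as np

# Scalar case n = 1, q = 1/2: beta = 1/(n(q-1) + 2q) = 2, m = 2/(1-q) - n = 3,
# and the maximizer (1) with K = 1 reduces to f_q(x) = (2/pi) (1 + x^2)^(-2).
q, m = 0.5, 3
x = np.linspace(-400.0, 400.0, 1_000_001)
dx = x[1] - x[0]
f_q = (2.0 / np.pi) * (1.0 + x**2) ** -2

def integral(g):
    return g.sum() * dx  # simple quadrature; the grid is fine enough here

def tsallis_entropy(f):
    # H_q(f) = (1 - int f^q) / (q - 1)
    return (1.0 - integral(f**q)) / (q - 1.0)

mass = integral(f_q)        # should be ~1: f_q is a probability density
var = integral(x**2 * f_q)  # should be ~1: the covariance constraint K = 1

# Theorem 1: among unit-variance densities, f_q has the largest Tsallis entropy;
# compare against a Gaussian and a Laplace density with the same variance.
gauss = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)
lap = np.exp(-np.sqrt(2.0) * np.abs(x)) / np.sqrt(2.0)
H_fq, H_gauss, H_lap = map(tsallis_entropy, (f_q, gauss, lap))

# Stochastic representation (3): X = sqrt(m-2) N / A, with A a chi variable
# with m degrees of freedom independent of N, should reproduce the law f_q.
rng = np.random.default_rng(0)
N = rng.standard_normal(1_000_000)
A = np.sqrt(rng.chisquare(m, 1_000_000))
X = np.sqrt(m - 2.0) * N / A
p75 = np.quantile(X, 0.75)  # the 0.75-quantile of f_q is ~0.4416

print(mass, var, H_fq, H_gauss, H_lap, p75)
```

On a fine grid, mass and var come out close to 1, H_fq exceeds the two competing entropies, and the simulated quantile matches that of f_q.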
2 The stability issue

The stability problem can be solved using the properties described above, and the main theorem is the following.

Theorem 2
(1) if X is a Tsallis n-vector with parameter q and if H is a full-rank ñ × n matrix with ñ ≤ n, then X̃ = HX is a Tsallis ñ-vector with parameter q̃ such that 2/(1−q̃) − ñ = 2/(1−q) − n;
(2) as a particular case, if X_1 and X_2 are mutually Tsallis distributed (as components of a Tsallis vector X^T = [X_1^T, X_2^T]; note that this assumption implies that X_1 and X_2 are dependent), then any linear combination Y = H_1 X_1 + H_2 X_2, where H_1 and H_2 are full-rank matrices, is a Tsallis vector;
(3) if X_1 and X_2 are both Tsallis but independent (and then X^T = [X_1^T, X_2^T] is not Tsallis), then a linear combination Y = H_1 X_1 + H_2 X_2 is not Tsallis;
(4) if X_1 and X_2 are scalar, each with parameter q < 1, then there exists a convolution of the random type (called Kingman convolution) Y = X_1 ⊛ X_2 such that Y is Tsallis with the same parameter q as X_1 and X_2.

The three first results, which hold in both cases q < 1 and q > 1, are a direct consequence of the orthogonal invariance property, and their detailed proofs can be found in [5]. The last statement is developed and proved in the next section. We stress the point that, contrary to the Boltzmann (q = 1) case, stability under addition holds only if X_1 and X_2 are dependent, in the sense that they share the same mixing random variable A as introduced in (3) and (4). This result sheds a new light on the relationship between independence and stability in the set of Tsallis distributions.

3 Kingman convolution (case q < 1)

The results about the Kingman convolution are described in the following theorem.

Theorem 3 If X and Y are independent scalar Tsallis random variables with parameter q < 1 and respective variances σ_X² and σ_Y², and if λ is a Tsallis random variable independent of X and Y with parameter q′ = q/(2q−1), then the random variable

Z = XY / √(X² + Y² + 2λXY)  (5)

is Tsallis with parameter q and variance σ_Z² such that 1/σ_Z = 1/σ_X + 1/σ_Y.
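Before the proof, the statement of Theorem 3 can be probed by Monte Carlo simulation (a sketch of ours, with all names assumed). For q = 1/2 one has m = 3, so the mixing variable λ can be drawn uniformly on (−1, 1); the quantiles of the combination Z are then compared with a directly sampled Tsallis variable of the predicted scale:

```python
import numpy as np

# Monte Carlo check of Theorem 3 for q = 1/2, n = 1 (so m = 2/(1-q) - 1 = 3).
# For m = 3 the mixing variable lambda has density proportional to
# (1 - lam^2)^((m-3)/2), i.e. it is uniform on (-1, 1).
rng = np.random.default_rng(1)
size, m = 1_000_000, 3

def tsallis_sample(sigma, size):
    # stochastic representation: X = sqrt(m-2) sigma N / A, A ~ chi_m indep. of N
    N = rng.standard_normal(size)
    A = np.sqrt(rng.chisquare(m, size))
    return np.sqrt(m - 2.0) * sigma * N / A

sigma_X, sigma_Y = 1.0, 1.0
X = tsallis_sample(sigma_X, size)
Y = tsallis_sample(sigma_Y, size)
lam = rng.uniform(-1.0, 1.0, size)

# Kingman-type combination (5)
Z = X * Y / np.sqrt(X**2 + Y**2 + 2.0 * lam * X * Y)

# Theorem 3 predicts that Z is Tsallis with the same q and with
# 1/sigma_Z = 1/sigma_X + 1/sigma_Y, i.e. sigma_Z = 1/2 here.
R = tsallis_sample(1.0 / (1.0 / sigma_X + 1.0 / sigma_Y), size)
q75_Z, q75_R = np.quantile(Z, 0.75), np.quantile(R, 0.75)
q90_Z, q90_R = np.quantile(Z, 0.90), np.quantile(R, 0.90)
print(q75_Z, q75_R, q90_Z, q90_R)
```

The quantiles of Z and of the reference sample R agree closely, supporting stability with harmonic addition of the scales.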
PROOF. A classical result about Hankel transforms (see [9]) writes

∫_{−∞}^{+∞} Ω_m(u/|x|) f(x) dx = e^{−u/(σ_X √(m−2))}, ∀u > 0,  (6)

with m = 2/(1−q) − 1, where the Bessel convolution kernel Ω_m writes

Ω_m(u) = j_{m/2−1}(u), u > 0,  (7)

and where j_ν denotes the normalized Bessel function of the first kind

j_ν(x) = 2^ν Γ(ν+1) x^{−ν} J_ν(x).

Considering a scalar Tsallis random variable X with parameter q < 1, equality (6) writes equivalently

E_X Ω_m(u/|X|) = e^{−u/(σ_X √(m−2))}, ∀u > 0.

Now Sonine-Gegenbauer's integral writes ([7] cited by [8]):

Ω_m(x) Ω_m(y) = (Γ(m/2) / (√π Γ((m−1)/2))) ∫_{−1}^{+1} Ω_m(√(x² + y² + 2λxy)) (1 − λ²)^{(m−3)/2} dλ,

and we remark that the function (Γ(m/2) / (√π Γ((m−1)/2))) (1 − λ²)^{(m−3)/2} is the distribution of a Tsallis random variable λ with parameter q_λ = (m−1)/(m−3) > 1. Now choose X and Y independent and Tsallis with parameter q < 1 and respective variances σ_X² and σ_Y², so that

Ω_m(u/|X|) Ω_m(u/|Y|) = E_λ Ω_m(u √(X² + Y² + 2λXY) / |XY|), ∀u > 0.

But as

E_X Ω_m(u/|X|) E_Y Ω_m(u/|Y|) = e^{−(u/√(m−2)) (1/σ_X + 1/σ_Y)},

we deduce, by defining

Z = XY / √(X² + Y² + 2λXY),
that

E_{Z,λ} Ω_m(u/|Z|) = e^{−(u/√(m−2)) (1/σ_X + 1/σ_Y)}.  (8)

From equality (8), we conclude, using [8, lemma 4], that Z is Tsallis with parameter q and variance σ_Z² such that 1/σ_Z = 1/σ_X + 1/σ_Y.

We note that the Kingman convolution expressed by (5) is associative and commutative (see [8]); moreover, it is a convolution of random type, since it combines both random variables X and Y with a third one, λ, which should be chosen independent of X and Y and acts as a mixing variable. The physical interpretation of these results is currently under study.

Acknowledgements

The first author would like to thank Pr. P. H. Chavanis and Pr. J. Naudts for fruitful discussions during the Next 2003 Conference.

References

[1] G. Kaniadakis, A. Lavagno and P. Quarati, Generalized statistics and solar neutrinos, Physics Letters B 369 (1996).
[2] A. R. Plastino, A. Plastino, Stellar polytropes and Tsallis entropy, Phys. Lett. A 174 (1993).
[3] S. Moriguti, A lower bound for a probability moment of any absolutely continuous distribution with finite variance, Ann. Math. Stat. 23 (1952).
[4] J. N. Kapur, Generalized Cauchy and Student's distributions as maximum-entropy distributions, Proc. Nat. Acad. Sci. India Sect. A 58 (1988).
[5] C. Vignat, J. Costa and A. Hero, On solutions to multivariate maximum alpha-entropy problems, Lecture Notes in Computer Science, Vol. 2683, 2003.
[6] A. R. Barron, Entropy and the central limit theorem, The Annals of Probability 14 (1986) 336-342.
[7] G. N. Watson, A Treatise on the Theory of Bessel Functions, Cambridge, 1944.
[8] J. F. C. Kingman, Random walks with spherical symmetry, Acta Math. 109 (1963) 11-53.
[9] I. S. Gradshteyn, I. M. Ryzhik, Table of Integrals, Series and Products, Sixth edition, Alan Jeffrey and Daniel Zwillinger (eds.).
[10] C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479-487.
More informationSELF-ADJOINTNESS OF DIRAC OPERATORS VIA HARDY-DIRAC INEQUALITIES
SELF-ADJOINTNESS OF DIRAC OPERATORS VIA HARDY-DIRAC INEQUALITIES MARIA J. ESTEBAN 1 AND MICHAEL LOSS Abstract. Distinguished selfadjoint extension of Dirac operators are constructed for a class of potentials
More informationLecture 4.6: Some special orthogonal functions
Lecture 4.6: Some special orthogonal functions Matthew Macauley Department of Mathematical Sciences Clemson University http://www.math.clemson.edu/~macaule/ Math 4340, Advanced Engineering Mathematics
More informationNotes on singular value decomposition for Math 54. Recall that if A is a symmetric n n matrix, then A has real eigenvalues A = P DP 1 A = P DP T.
Notes on singular value decomposition for Math 54 Recall that if A is a symmetric n n matrix, then A has real eigenvalues λ 1,, λ n (possibly repeated), and R n has an orthonormal basis v 1,, v n, where
More informationProperty of Tsallis entropy and principle of entropy increase
Bull. Astr. Soc. India (2007) 35, 691 696 Property of Tsallis entropy and principle of entropy increase Du Jiulin Department of Physics, School of Science, Tianjin University, Tianjin 300072, China Abstract.
More informationAngular momentum and Killing potentials
Angular momentum and Killing potentials E. N. Glass a) Physics Department, University of Michigan, Ann Arbor, Michigan 4809 Received 6 April 995; accepted for publication September 995 When the Penrose
More informationDunkl operators and Clifford algebras II
Translation operator for the Clifford Research Group Department of Mathematical Analysis Ghent University Hong Kong, March, 2011 Translation operator for the Hermite polynomials Translation operator for
More informationA Criterion for the Compound Poisson Distribution to be Maximum Entropy
A Criterion for the Compound Poisson Distribution to be Maximum Entropy Oliver Johnson Department of Mathematics University of Bristol University Walk Bristol, BS8 1TW, UK. Email: O.Johnson@bristol.ac.uk
More informationFree Probability, Sample Covariance Matrices and Stochastic Eigen-Inference
Free Probability, Sample Covariance Matrices and Stochastic Eigen-Inference Alan Edelman Department of Mathematics, Computer Science and AI Laboratories. E-mail: edelman@math.mit.edu N. Raj Rao Deparment
More informationMedical Imaging. Norbert Schuff, Ph.D. Center for Imaging of Neurodegenerative Diseases
Uses of Information Theory in Medical Imaging Norbert Schuff, Ph.D. Center for Imaging of Neurodegenerative Diseases Norbert.schuff@ucsf.edu With contributions from Dr. Wang Zhang Medical Imaging Informatics,
More informationStochastic Processes
Introduction and Techniques Lecture 4 in Financial Mathematics UiO-STK4510 Autumn 2015 Teacher: S. Ortiz-Latorre Stochastic Processes 1 Stochastic Processes De nition 1 Let (E; E) be a measurable space
More informationRECURSIVE ESTIMATION AND KALMAN FILTERING
Chapter 3 RECURSIVE ESTIMATION AND KALMAN FILTERING 3. The Discrete Time Kalman Filter Consider the following estimation problem. Given the stochastic system with x k+ = Ax k + Gw k (3.) y k = Cx k + Hv
More informationFundamental equations of relativistic fluid dynamics
CHAPTER VI Fundamental equations of relativistic fluid dynamics When the energy density becomes large as may happen for instance in compact astrophysical objects, in the early Universe, or in high-energy
More information