On the average uncertainty for systems with nonlinear coupling


Kenric P. Nelson, Sabir Umarov, and Mark A. Kon

Abstract

The increased uncertainty and complexity of nonlinear systems have motivated investigators to consider generalized approaches to defining an entropy function. In this paper, the source of nonlinearity is used as the parameter defining the generalization; examples include the inverse of the degree of freedom and the fluctuation of the variance. The average uncertainty is defined in the probability domain. The Shannon entropy, when transformed to the probability domain, is the weighted geometric mean of the probabilities. For the exponential and Gaussian distributions, the weighted geometric mean of the distribution is equal to the density of the distribution at the location plus the scale (i.e., at the width of the distribution). The average uncertainty is generalized via the weighted generalized mean, in which the moment is a function of the nonlinear source. Both the Rényi and Tsallis entropies transform to this definition of the generalized average uncertainty. For the generalized Pareto and Student-t distributions, which are the maximum entropy distributions of these generalized entropies, the appropriate weighted generalized mean also equals the density of the distribution at the location plus scale. A coupled entropy function is proposed, which is closely related to the normalized Tsallis entropy but incorporates a distinction between the additive coupling and the multiplicative coupling. The additive coupling modifies the addition function and is proportional to the nonlinear source. The multiplicative coupling modifies the product function and incorporates the dimension of the random variable, such that the multiplicative coupling is proportional to the inverse of the fractal dimension.

Keywords: complex systems, nonlinear coupling, non-additive entropy, nonextensive statistical mechanics

Introduction

Entropy is the standard-bearer for defining the average uncertainty of a probability distribution or density function [1-3]. Boltzmann, Gibbs, and Shannon demonstrated that the kernel ln(1/p_i) is necessary to form an arithmetic average of the uncertainty of a probability distribution. Khinchin showed that the entropy function H = −Σ_{i=1}^N p_i ln p_i is unique given axioms stating that it is a) continuous with p_i, b) maximal for the uniform distribution, c) not changed by a state with zero probability, and d) additive for conditionally independent probabilities. Rényi, Tsallis, and Amari [4-6] sought to broaden the definition of average uncertainty to account for the influence of nonlinear dynamics in complex systems.

Generalized entropy measures have been applied to a variety of nonlinear systems, such as decision making under risk [7,8], communication channel equalization [9], compressive sensing [10], the edge of chaos [11], and quantum entanglement [12]. Hanel and Thurner [13,14] have shown that requiring just the first three of Khinchin's axioms leads to a two-parameter generalization of entropy, one of which is the Tsallis generalization and the focus of this communication.

The paper has three objectives. In Section 2, we show that statistical functions can be expressed clearly in terms of the source of nonlinearity. In Section 3, we show that this approach to nonlinear stochastic systems leads to the weighted generalized mean of the probabilities of a distribution as the definition of average uncertainty, and that care in the distinction between the additive and multiplicative coupling leads to a definition for the coupled entropy. In Section 4, the properties of the coupled entropy are compared with the Rényi and Tsallis entropies. While based upon the normalized Tsallis entropy, the coupled entropy is shown to have close-to-linear growth as a function of the source of nonlinearity for the coupled Gaussian, which is equivalent to the Student-t distribution.

A review of the average uncertainty for important members of the exponential family provides a helpful framework prior to introducing the effect of nonlinearity. The Boltzmann-Gibbs-Shannon entropy transformed to the probability domain is the weighted geometric mean of the distribution:

P_avg ≡ exp(Σ_{i=1}^N p_i ln p_i) = Π_{i=1}^N p_i^{p_i},   (1)

where the weights, now a power, are also the probabilities. The average uncertainty P_avg is the average probability of being able to determine the state of the system. It ranges from certainty (P_avg = 1) when one of the states is certain, to P_avg = 1/N when the states are equiprobable. The intuition is that the total probability of the distribution is the product of the probabilities, and the average is determined by weighting each term by the probability. Assuming a continuous distribution, the average density for the exponential distribution is

f_avg ≡ exp( ∫_μ^∞ (1/σ) e^{−(x−μ)/σ} ln((1/σ) e^{−(x−μ)/σ}) dx ) = (σe)^{−1},   (2)

and for the Gaussian distribution the average density is

f_avg ≡ exp( ∫_{−∞}^∞ (1/(√(2π)σ)) e^{−(x−μ)²/(2σ²)} ln((1/(√(2π)σ)) e^{−(x−μ)²/(2σ²)}) dx ) = (√(2πe) σ)^{−1}.   (3)

In both cases the average density is equal to f(μ + σ). We will show that this relationship can be generalized for the coupled exponential distributions. First, we review the concept of nonlinear statistical coupling.
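As a concrete check, the following minimal Python sketch (our own construction, not code from the paper) evaluates Eq. (1) for a discrete distribution and verifies numerically that the average densities of Eqs. (2) and (3) equal the density at x = μ + σ.

```python
import numpy as np

# Discrete case, Eq. (1): the weighted geometric mean of the probabilities.
p = np.array([0.5, 0.25, 0.125, 0.125])
P_avg = np.exp(np.sum(p * np.log(p)))
assert np.isclose(P_avg, np.prod(p ** p))

mu, sigma = 1.0, 2.0

# Exponential case, Eq. (2): average density equals f(mu + sigma) = 1/(sigma*e).
x = np.linspace(mu, mu + 60 * sigma, 1_000_001)
f = (1 / sigma) * np.exp(-(x - mu) / sigma)
f_avg = np.exp(np.sum(f * np.log(f)) * (x[1] - x[0]))
print(f_avg, 1 / (sigma * np.e))

# Gaussian case, Eq. (3): average density equals 1/(sqrt(2*pi*e)*sigma).
x = np.linspace(mu - 20 * sigma, mu + 20 * sigma, 1_000_001)
g = np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)
g_avg = np.exp(np.sum(g * np.log(g)) * (x[1] - x[0]))
print(g_avg, 1 / (np.sqrt(2 * np.pi * np.e) * sigma))
```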

2 Nonlinear Statistical Coupling

Fundamental to complex systems is the influence of nonlinearity on dynamics. As such, in modeling the statistical mechanics of complex systems, we choose the source of nonlinearity to define the nonlinear statistical coupling κ [15]. Positive coupling, 0 < κ < ∞, is the domain of heavy-tail statistics, in which the nonlinearity causes increased variation. Examples include multiplicative noise [16], superstatistics in which the variance fluctuates [17,18], and the inverse of the degree of freedom. Negative coupling, −1 < κ < 0, is the domain of compact-support statistics, in which the nonlinearity causes a reduction in variation. The two domains are related by a conjugate dual, κ̂ ≡ −κ/(1+κ). Linear systems (κ = 0) are the domain of exponential statistics and logarithmic measures of information.

The fluctuation of the variance, known as superstatistics [17,18], is a helpful example. Consider a system governed by an exponential distribution,

(1/σ') e^{−x/σ'},  x ≥ 0,   (4)

but with the inverse scale β' = 1/σ' fluctuating according to a gamma distribution. The modified statistics are governed by a deformed exponential distribution, which we will call a coupled exponential. The scale and shape of the coupled exponential are defined by the mean and relative variance of the inverse scale:

1/σ ≡ ⟨β'⟩,  κ ≡ (⟨β'²⟩ − ⟨β'⟩²) / ⟨β'⟩².   (5)

For this mean and relative variance, the gamma distribution is

f(β') = (1/Γ(1/κ)) (σ/κ)^{1/κ} β'^{(1/κ)−1} e^{−σβ'/κ},   (6)

and the coupled exponential distribution is

f_κ(x) = ∫_0^∞ f(β') (β' e^{−β'x}) dβ' = (1/σ)(1 + κx/σ)^{−(1+κ)/κ}.   (7)

The coupled exponential function is notated as an analog of the exponential function:

exp_κ(x) ≡ (1 + κx)_+^{(1+κ)/κ};  (a)_+ ≡ max(0, a).   (8)
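The superstatistical derivation of Eq. (7) can be checked by simulation. Below is a hedged sketch (ours, not the authors' code): an exponential whose inverse scale fluctuates per the gamma density of Eq. (6) yields samples matching the coupled exponential, i.e., a generalized Pareto distribution with shape κ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
kappa, sigma = 0.3, 1.0
# Gamma with shape 1/kappa and scale kappa/sigma: mean 1/sigma, relative variance kappa,
# per Eqs. (5)-(6).
beta = rng.gamma(shape=1 / kappa, scale=kappa / sigma, size=200_000)
x = rng.exponential(1 / beta)          # conditionally exponential draws, Eq. (4)
# Eq. (7) is the generalized Pareto density (1/sigma)*(1 + kappa*x/sigma)**(-(1+kappa)/kappa)
d, p_value = stats.kstest(x, stats.genpareto(c=kappa, scale=sigma).cdf)
print(d, p_value)                      # KS distance near zero: samples match Eq. (7)
```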

The inverse of the exponent, κ/(1+κ), is a modified coupling, which will be referred to as the multiplicative coupling. For multivariates [19,20] the exponent generalizes to (1+dκ)/κ, where d is the dimension, but only the one-dimensional function will be used in this paper. The term κ_d ≡ κ/(1+dκ) will be used to modify the product function and is referred to as the multiplicative coupling. The inverse of the multiplicative coupling is equal to the asymptotic fractal dimension, lim_{x→∞} exp_κ^{−1}(x) = O(x^{−1/κ_d}); thus the inverses of the coupling terms are summarized by

1/κ = ν, the degree of freedom of the coupled exponential distribution; 1/κ_d = D, the asymptotic fractal dimension.   (9)

Raising to a power is not identical to multiplying the argument, so an additional parameter definition is required to show the relationships:

exp_κ^a(x) ≡ (1 + κx)_+^{a(1+κ)/κ},  exp_{κ,a}(x) ≡ (1 + (κ/a)x)_+^{a(1+κ)/κ}.   (10)

These notational definitions are needed to clarify that raising to a power is equivalent to both multiplying the argument and dividing the nonlinear coupling term:

(exp_κ(x))^a = exp_κ^a(x) = exp_{κ,a}(ax).   (11)

The nonlinear statistical coupling has the effect of changing the relationship between variables in the state space. Multiplication of independent coupled exponential functions results in a combined variable of

exp_κ(x) · exp_κ(y) = exp_κ(x + y + κxy).   (12)

The nonlinear term is viewed as forming the coupled addition [5,21],

x ⊕_κ y ≡ x + y + κxy,   (13)

which via the coupled subtraction,

x ⊖_κ y ≡ (x − y)/(1 + κy),   (14)

is seen as a dilation of the state space. The inverse of the coupled exponential is the coupled logarithm,

ln_κ(x) ≡ (1/κ)(x^{κ/(1+κ)} − 1),  x > 0.   (15)

Importantly, the integral over the unit interval is invariant to the deformation: ∫_0^1 ln_κ(1/x) dx = 1. This insures that the coupled logarithm deforms the measure of information without modifying the total information over the unit interval. The role of the coupled addition in modifying the state space can now be expressed in terms of the coupled exponential and logarithm functions:

Π_{i=1}^N exp_κ(x_i) = exp_κ(⊕_{κ,i=1}^N x_i),  where ⊕_{κ,i=1}^N x_i ≡ x_1 ⊕_κ ⋯ ⊕_κ x_N;  ln_κ(Π_{i=1}^N x_i) = ⊕_{κ,i=1}^N ln_κ(x_i).   (16)

The coupled product has a complementary role, but operates on the exponent of the variable. Thus not only do the coupled sum and product not form an algebra with a distribution property [21], but they are more accurately two complementary algebras [22,23]. The coupled product of functions f and g is f ⊗_κ g ≡ (f^{κ'} + g^{κ'} − 1)_+^{1/κ'}, where the coupling term κ' ≡ κ/(1+κ) is the inverse of the exponent of the coupled exponential function. This definition extends to multiple functions and has the following properties:

⊗_{κ,i=1}^N exp_κ(x_i) = exp_κ(Σ_{i=1}^N x_i);  ln_κ(⊗_{κ,i=1}^N x_i) = Σ_{i=1}^N ln_κ(x_i).   (17)

In [20] the formation of multivariate coupled exponentials required the coupled product to be defined such that the output dimension was the sum of the input dimensions. This paper is focused on the coupling between states of a single random variable, and as such the dimension does not change. The coupled power will also be needed:

x^{∧_κ a} ≡ (a x^{κ'} − (a − 1))_+^{1/κ'}.   (18)

Generalization of the stretched exponential distributions [24], exp(−x^α/(ασ^α)), utilizes the two-parameter definitions from Equation (10). Thus the coupled stretched exponential is expressed as

f_κ(x) ∝ exp_κ^{−1/α}(x^α/σ^α) = exp_{κ,−1/α}(−x^α/(ασ^α)).   (19)

The inverse of the stretched exponential leads to two candidates for defining the two-parameter coupled logarithm (three-parameter if the dimension is also included) for the deformed measure of information. If x^α is treated as the variable and the integral over the unit interval is required to equal one, then the coupled surprisal (or generalized measure of information) is

s_{α,κ}(x) ≡ ln_{α,κ}(1/x) ≡ (1/(ακ))(x^{−ακ/(1+κ)} − 1),  for x > 0.   (20)
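To make the coupled algebra concrete, here is a minimal numerical sketch (our own construction under the definitions above; the function names are ours) checking the identities of Eqs. (12), (15), (17), and (18) for a sample coupling value.

```python
import numpy as np

kappa = 0.5
kp = kappa / (1 + kappa)                 # multiplicative coupling k' = k/(1+k)

def coupled_exp(x):                      # Eq. (8): (1 + k*x)_+ ** ((1+k)/k)
    return np.maximum(1 + kappa * x, 0.0) ** ((1 + kappa) / kappa)

def coupled_log(x):                      # Eq. (15): inverse of coupled_exp, x > 0
    return (x ** kp - 1) / kappa

def coupled_add(x, y):                   # Eq. (13): x + y + k*x*y
    return x + y + kappa * x * y

def coupled_product(vals):               # Eq. (17): acts on the k' exponent
    s = np.sum(np.asarray(vals) ** kp) - (len(vals) - 1)
    return np.maximum(s, 0.0) ** (1 / kp)

def coupled_power(x, a):                 # Eq. (18)
    return np.maximum(a * x ** kp - (a - 1), 0.0) ** (1 / kp)

x, y = 1.3, 0.7
# Eq. (12): ordinary product of coupled exponentials couples the arguments.
assert np.isclose(coupled_exp(x) * coupled_exp(y), coupled_exp(coupled_add(x, y)))
# Eq. (17): coupled product of coupled exponentials adds the arguments.
assert np.isclose(coupled_product([coupled_exp(x), coupled_exp(y)]), coupled_exp(x + y))
# Eq. (18): the coupled power moves a multiplier inside the coupled logarithm.
assert np.isclose(3 * coupled_log(2.0), coupled_log(coupled_power(2.0, 3)))
# Invariance of the unit-interval integral of ln_k(1/x), stated below Eq. (15).
u = np.linspace(1e-9, 1, 2_000_001)
print(np.sum(coupled_log(1 / u)) * (u[1] - u[0]))   # ~ 1 for any kappa > 0
```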

This is the definition that will be investigated in this paper; however, a second definition treats x/σ as the argument of the coupled stretched exponential and leads to a definition of the coupled logarithm as

ln_{α,κ}^{(2)}(x) ≡ (ln_κ(x^α))^{1/α} = ((1/κ)(x^{ακ/(1+κ)} − 1))^{1/α},  for x > 0.   (21)

While not investigated here, the second definition is of potential significance as an approach to the two-parameter generalized entropy function proposed by Hanel et al. [13,14] and related generalizations investigated in [22].

The coupled exponential (α = 1) and coupled Gaussian (α = 2) distributions are of particular interest given their role as maximum generalized entropy distributions with constraints on the scale of the distribution. These distributions [25] are defined as

f(x) ≡ Z^{−1}(κ,σ,α) exp_κ^{−1/α}((x − μ)^α/σ^α).   (22)

For κ ≥ 0 and α = 1, the domain of the random variable is x ≥ μ and Z(κ,σ,1) = σ. For κ ≥ 0 and α = 2, the domain is −∞ < x < ∞ and Z(κ,σ,2) = √(π/κ) σ Γ(1/(2κ))/Γ((1+κ)/(2κ)), where Γ is the generalization of the factorial function to the real numbers. The parameters of the distributions are the location μ, the scale σ, and the normalization Z. One of the benefits of this approach to the definition is that the source of coupling corresponds exactly to the shape parameter of the generalized Pareto distribution and the inverse of the degree of freedom of the Student-t distribution. The scale parameter for these distributions is determined by moments based on an escort distribution corresponding to q = 1 + ακ/(1+κ), in which the distribution rather than the random variable is raised to a power; here q is the Tsallis index associated with nonextensive statistical mechanics [26] or q-statistics. In this paper, this escort distribution is called the coupled probability or density because it can also be shown to represent the probability of a coupled state of the system.

Definition 1 (Coupled Probability/Density). Given a discrete probability distribution p = {p_i : Σ_{i=1}^N p_i = 1} or a continuous density f(x) with ∫ f(x) dx = 1, the distribution is reshaped into the coupled probability/density by raising to the power 1 + ακ/(1+κ) and renormalizing:

P_i^{(α,κ)} ≡ p_i^{1+ακ/(1+κ)} / Σ_{j=1}^N p_j^{1+ακ/(1+κ)} = ( p_i (Π_{j≠i} p_j)^{−ακ/(1+κ)} ) / ( Σ_{j=1}^N p_j (Π_{l≠j} p_l)^{−ακ/(1+κ)} ),   (23)

f^{(α,κ)}(x) ≡ f^{1+ακ/(1+κ)}(x) / ∫ f^{1+ακ/(1+κ)}(x) dx.   (24)

The expression on the right of Equation (23) is shown to make evident the modeling of a coupled state of the system. The Cauchy distribution (α = 2, κ = 1) is an illustrative example. In this case 1 + ακ/(1+κ) = 2, and the coupled probability is formed by dividing the probability of state i by the probabilities of all the other states and then renormalizing. This procedure has a direct connection with the origin of the Cauchy distribution as the division of two random variables.

An important result of nonextensive statistical mechanics is the relationship between coupled moments and the parameters of the coupled exponential and coupled Gaussian distributions [5,27]. The generalization of the n-th moment is

⟨x^n⟩_κ ≡ Σ_{i=1}^N x_i^n P_i^{(n,κ)} = Σ_{i=1}^N x_i^n p_i^{1+nκ/(1+κ)} / Σ_{j=1}^N p_j^{1+nκ/(1+κ)}.   (25)

A separate α parameter is not required because the moment order n takes its place in the coupled probability.

Lemma 1. Given either the coupled exponential or the coupled Gaussian distribution, the generalized scale parameter σ is determined by the generalizations of the 1st and 2nd moments, respectively. See (Tsallis et al., 2009) [27] for the proof and (Nelson et al., 2010) [5] for the notation:

⟨x⟩_κ = ∫_0^∞ x σ^{−1} exp_κ^{−(1+2κ)/(1+κ)}(x/σ) dx / ∫_0^∞ σ^{−1} exp_κ^{−(1+2κ)/(1+κ)}(x/σ) dx = σ,   (26)

⟨x²⟩_κ = ∫_{−∞}^∞ x² Z^{−1} exp_κ^{−(1+3κ)/(2(1+κ))}(x²/σ²) dx / ∫_{−∞}^∞ Z^{−1} exp_κ^{−(1+3κ)/(2(1+κ))}(x²/σ²) dx = σ².   (27)
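A short numerical sketch (ours, under the parametrization above) of Definition 1 and Lemma 1: the generalized first moment of a coupled exponential, taken under the coupled density of Eq. (24) with n = 1, recovers the scale σ even when the variance of the distribution diverges.

```python
import numpy as np

kappa, sigma = 0.6, 2.0            # ordinary mean sigma/(1-kappa); variance infinite
x = np.linspace(0, 5e4, 2_000_001)
f = (1 / sigma) * (1 + kappa * x / sigma) ** (-(1 + kappa) / kappa)   # Eq. (7)
dx = x[1] - x[0]
q = 1 + kappa / (1 + kappa)        # escort power for the n = 1 moment, Eq. (25)
P = f ** q / (np.sum(f ** q) * dx) # coupled density, Eq. (24)
print(np.sum(x * P) * dx)          # ~ sigma, per Eq. (26)
```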

3 The Coupled Average Uncertainty

The coupled algebra provides the foundation for defining a generalization of the average uncertainty given coupling of the statistical states. Equation (1) is generalized using the coupled exponential and coupled logarithm functions, where the two-parameter forms are ln_{α,κ}(x) ≡ (1/(ακ))(x^{ακ/(1+κ)} − 1) and its inverse exp_{α,κ}(x) ≡ (1 + ακx)_+^{(1+κ)/(ακ)}.

Definition 2 (Coupled-Log Average). Given a set of weights {w_i, i = 1,…,N} such that Σ_{i=1}^N w_i = 1 and a set of variables {x_i, i = 1,…,N}, the coupled-log average of the variables is

exp_{α,κ}( Σ_{i=1}^N w_i ln_{α,κ} x_i ).   (28)

Lemma 2. The coupled-log average a) can be expressed equivalently using the coupled product and b) is equal to the weighted generalized mean [23,28].

Proof: a)

exp_{α,κ}( Σ_{i=1}^N w_i ln_{α,κ} x_i ) = ⊗_{κ,i=1}^N x_i^{∧ w_i},   (29)

which utilizes the properties in Equations (17) and (18), with the coupling term generalized to ακ/(1+κ). b)

exp_{α,κ}( Σ_{i=1}^N w_i ln_{α,κ} x_i ) = (1 + ακ Σ_{i=1}^N w_i (x_i^{ακ/(1+κ)} − 1)/(ακ))^{(1+κ)/(ακ)} = ( Σ_{i=1}^N w_i x_i^{ακ/(1+κ)} )^{(1+κ)/(ακ)}.   (30)

Given the central role of the moment of the weighted generalized mean, the coupled-log average is summarized as

exp_{α,κ}( Σ_{i=1}^N w_i ln_{α,κ} x_i ) = ( Σ_{i=1}^N w_i x_i^m )^{1/m},  m = ακ/(1+κ).   (31)

The coupled-log average extends to continuous variables by the relationship

exp_{α,κ}( ∫ w(x) ln_{α,κ} f(x) dx ) = ( ∫ w(x) f^m(x) dx )^{1/m},  ∫ w(x) dx = 1.   (32)
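A small sketch (ours) verifying Lemma 2b numerically: the coupled-log average equals the weighted generalized mean with moment m = ακ/(1+κ), per Eq. (31).

```python
import numpy as np

alpha, kappa = 2, 0.35
m = alpha * kappa / (1 + kappa)

def coupled_log(x):                 # two-parameter coupled logarithm
    return (x ** m - 1) / (alpha * kappa)

def coupled_exp(x):                 # its inverse
    return np.maximum(1 + alpha * kappa * x, 0.0) ** (1 / m)

w = np.array([0.2, 0.3, 0.5])       # weights summing to one
xv = np.array([0.7, 1.9, 3.1])
lhs = coupled_exp(np.sum(w * coupled_log(xv)))      # Eq. (28)
rhs = np.sum(w * xv ** m) ** (1 / m)                # Eq. (31)
assert np.isclose(lhs, rhs)
print(lhs, rhs)
```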

Using the coupled probability as the weight of the generalized mean, we can now define the average uncertainty of a distribution which originates from a system with nonlinear coupling.

Definition 3 (Coupled Average Uncertainty). Given a distribution p = {p_i : Σ_{i=1}^N p_i = 1}, its coupled average uncertainty is determined by the coupled-log average with x_i = p_i and w_i = P_i^{(α,κ)}, applied through the surprisal ln_{α,κ}(1/p_i):

P_avg^{(α,κ)} ≡ ( exp_{α,κ}( Σ_{i=1}^N P_i^{(α,κ)} ln_{α,κ}(1/p_i) ) )^{−1}.   (33)

Furthermore, the coupled average uncertainty simplifies to the weighted generalized mean with the negative coupling moment and the distribution as the weight; thus,

P_avg^{(α,κ)} = ( Σ_{i=1}^N P_i^{(α,κ)} p_i^{−ακ/(1+κ)} )^{−(1+κ)/(ακ)} = ( Σ_{i=1}^N p_i^{1+ακ/(1+κ)} )^{(1+κ)/(ακ)},   (34)

and for a continuous distribution

f_avg = ( exp_{α,κ}( ∫ f^{(α,κ)}(x) ln_{α,κ}(1/f(x)) dx ) )^{−1} = ( ∫ f^{1+ακ/(1+κ)}(x) dx )^{(1+κ)/(ακ)}.   (35)

Applying the coupled average uncertainty to the coupled exponential and coupled Gaussian distributions produces the first main result regarding its relationship with the location and scale of the distribution.

Theorem 1. Given a) the coupled exponential distribution or b) the coupled Gaussian distribution (22) and a weight defined by the coupled probability, the coupled-log average or weighted generalized mean of the distribution is equal to the density of the distribution at x = μ + σ.

Proof: a) For the coupled exponential distribution, α = 1 and, without loss of generality, setting μ = 0, the coupled average uncertainty is

f_avg = ( ∫_0^∞ f^{(1+2κ)/(1+κ)}(x) dx )^{(1+κ)/κ},  with f(x) = σ^{−1} exp_κ^{−1}(x/σ):

∫_0^∞ σ^{−(1+2κ)/(1+κ)} (1 + κx/σ)^{−(1+2κ)/κ} dx = σ^{−κ/(1+κ)}/(1+κ),

f_avg = ( σ^{−κ/(1+κ)}/(1+κ) )^{(1+κ)/κ} = σ^{−1}(1+κ)^{−(1+κ)/κ} = σ^{−1} exp_κ^{−1}(1) = f(μ + σ).   (36)

b) For the coupled Gaussian distribution, α = 2 and, without loss of generality, setting μ = 0, the coupled average uncertainty is f_avg = ( ∫_{−∞}^∞ f^{(1+3κ)/(1+κ)}(x) dx )^{(1+κ)/(2κ)}, with f(x) = Z^{−1}(κ,σ,2) exp_κ^{−1/2}(x²/σ²). The solution in terms of the source of nonlinear coupling is

∫_{−∞}^∞ f^{(1+3κ)/(1+κ)}(x) dx = Z^{−(1+3κ)/(1+κ)} √(π/κ) σ Γ((1+2κ)/(2κ))/Γ((1+3κ)/(2κ)) = Z^{−2κ/(1+κ)}/(1+κ),

using Γ(z+1) = zΓ(z) and the definition of Z(κ,σ,2); therefore

f_avg = ( Z^{−2κ/(1+κ)}/(1+κ) )^{(1+κ)/(2κ)} = Z^{−1}(1+κ)^{−(1+κ)/(2κ)} = Z^{−1} exp_κ^{−1/2}(1) = f(μ + σ).   (37)

Theorem 1 has important implications for defining the generalized average uncertainty. For κ = 0 the weighted generalized mean is the geometric mean. Just as the standard deviation is an expression of the variation of the random variable, the geometric mean of a distribution weighted by the distribution is an expression of average uncertainty. For the Gaussian distribution, the geometric mean of the distribution is equal to the density at x = μ + σ.
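A numerical sketch (our own, under the stated parametrization) of Theorem 1b: the coupled Gaussian equals a Student-t density with ν = 1/κ degrees of freedom and scale σ, and its coupled average density equals the density at x = μ + σ.

```python
import numpy as np
from scipy import stats
from scipy.special import gamma

kappa, sigma = 0.5, 1.0
Z = np.sqrt(np.pi / kappa) * sigma * gamma(1 / (2 * kappa)) / gamma((1 + kappa) / (2 * kappa))
x = np.linspace(-500, 500, 1_000_001)
f = (1 + kappa * (x / sigma) ** 2) ** (-(1 + kappa) / (2 * kappa)) / Z   # Eq. (22)
assert np.allclose(f, stats.t.pdf(x, df=1 / kappa, scale=sigma))         # Student-t check

m = 2 * kappa / (1 + kappa)                      # moment for alpha = 2
dx = x[1] - x[0]
f_avg = (np.sum(f ** (1 + m)) * dx) ** (1 / m)   # Eq. (35)
print(f_avg, (1 + kappa) ** (-(1 + kappa) / (2 * kappa)) / Z)   # both ~ f(mu + sigma)
```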

The average uncertainty is generalized by using the coupled probability distribution for the weight and taking the ακ/(1+κ) moment. When f(x) is a member of the coupled distributions,

f(μ + σ) = f_avg.   (38)

This relationship is shown in Figure 1 for the heavy-tail domain of the coupled Gaussian distribution. The coupled average uncertainty versus the source-of-coupling parameter of the distribution is plotted over the range (0, 1), which includes the Gaussian (κ = 0), the boundary with infinite variance (κ = 0.5), and the Cauchy distribution (κ = 1). Four values of the metric coupling, κ_K = {1/5, 2/5, 3/5, 4/5}, are shown. In each case the average uncertainty at matched coupling equals exp_κ^{−1/2}(1)/Z, the coupled generalization of e^{−1/2}/Z.

[Figure 1: plot of the coupled average uncertainty (0.1 to 0.5) versus the coupling of the distribution (0.0 to 1.0), with one line for each metric coupling of one fifth, two fifths, three fifths, and four fifths.]

Figure 1. The coupled average uncertainty as a function of the source of coupling. The x-axis varies the coupling of a coupled Gaussian distribution with σ = 1. The four lines are for metric couplings of κ_K = {1/5, 2/5, 3/5, 4/5}, which translate into weighted generalized means with moment 2κ_K/(1+κ_K). When the metric coupling equals the coupling of the distribution, the average uncertainty is minimized and equals f(μ + σ) = exp_κ^{−1/2}(1)/Z(κ, 1, 2).

Applying the coupled surprisal ln_{α,κ}(1/x) to the coupled average uncertainty leads to a definition for the coupled entropy.

Definition 4 (Coupled Entropy). Given a discrete probability distribution p = {p_i : Σ_{i=1}^N p_i = 1} or a continuous density f(x) with ∫ f(x) dx = 1, and setting the Boltzmann constant to one, the coupled entropy is defined as

S_{α,κ}(p) ≡ ln_{α,κ}(1/P_avg^{(α,κ)}) = Σ_{i=1}^N P_i^{(α,κ)} ln_{α,κ}(1/p_i),   (39)

S_{α,κ}(f) ≡ ln_{α,κ}(1/f_avg) = ∫ f^{(α,κ)}(x) ln_{α,κ}(1/f(x)) dx.   (40)

The relationship between the middle and right-hand terms is shown in Section 4 as part of the comparison with the Tsallis entropy, and derived in Appendix A. The coupled entropy expands to the following expression:

S_{α,κ}(p) = (1/(ακ))( ( Σ_{i=1}^N p_i^{1+ακ/(1+κ)} )^{−1} − 1 ).   (41)

For the two principal cases of interest the coupled entropy becomes

S_{1,κ}(p) = (1/κ)( ( Σ_{i=1}^N p_i^{(1+2κ)/(1+κ)} )^{−1} − 1 ),  S_{2,κ}(p) = (1/(2κ))( ( Σ_{i=1}^N p_i^{(1+3κ)/(1+κ)} )^{−1} − 1 ).   (42)
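A minimal sketch (ours) of Eq. (41) for a discrete distribution, showing that the coupled entropy reduces to the Shannon entropy as κ approaches zero.

```python
import numpy as np

def coupled_entropy(p, kappa, alpha=1):
    p = np.asarray(p, dtype=float)
    if kappa == 0:
        return -np.sum(p * np.log(p))        # Shannon limit
    m = alpha * kappa / (1 + kappa)
    return (np.sum(p ** (1 + m)) ** (-1) - 1) / (alpha * kappa)   # Eq. (41)

p = [0.5, 0.25, 0.125, 0.125]
print(coupled_entropy(p, 0.0))     # Shannon: 1.75*ln(2) ~ 1.213 nats
for k in (1e-6, 0.5, 1.0):
    print(k, coupled_entropy(p, k))   # approaches the Shannon value as k -> 0
```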

4 Comparison with Shannon, Tsallis and Rényi Entropies

Both the Rényi and Tsallis entropies can be expressed in terms of a transformation of the coupled average uncertainty (33) to an entropy scale. To facilitate the comparison, a substitution is made for the Rényi order and the Tsallis index: q = 1 + ακ/(1+κ). Table 1 summarizes the properties of the transformation from an average uncertainty to an entropy by comparing the moment, power, and normalization terms of the generalized logarithm. For the Rényi entropy (R), the relationship with the coupled average uncertainty is evident given its use of the natural logarithm:

S^R_{α,κ}(p) = ln(1/P_avg^{(α,κ)}) = ((1+κ)/(ακ)) ln( 1/Σ_{i=1}^N p_i^{1+ακ/(1+κ)} ).   (43)

Table 1. Comparison of generalized entropies. The generalized entropy functions are expressed in terms of three components. All of the generalized entropies modify the geometric mean (Shannon) to the generalized mean with the moment equal to ακ/(1+κ). The generalized mean is transformed to the entropy scale via a generalized logarithm applied to the inverse of the average uncertainty, S = ((1/P_avg)^{Power} − 1)/(Power × Norm), with the Power → 0 limit giving the natural logarithm.

Entropy            | Moment     | Power       | Norm.
Coupled            | ακ/(1+κ)   | ακ/(1+κ)    | 1+κ
Tsallis            | ακ/(1+κ)   | −ακ/(1+κ)   | 1
Normalized Tsallis | ακ/(1+κ)   | ακ/(1+κ)    | 1
Rényi              | ακ/(1+κ)   | 0           | 1
Shannon            | 0          | 0           | 1

The coupled entropy is most closely related to the normalized Tsallis entropy (NT). Expressed in terms of the coupled surprisal (20), the normalized Tsallis entropy is seen to be the coupled entropy (39) multiplied by 1+κ:

S^{NT}_{α,κ}(p) ≡ (1+κ) ln_{α,κ}(1/P_avg^{(α,κ)}) = (1+κ) S_{α,κ}(p).   (44)

Using the unnormalized form of Equation (44) for the Tsallis entropy definition, the lack of normalization is shown to be equivalent to using the opposite sign of the multiplier for the transformation:

S^T_{α,κ}(p) = −(1+κ) ln_{α,κ}(P_avg^{(α,κ)}) = ((1+κ)/(ακ))(1 − Σ_{i=1}^N p_i^{1+ακ/(1+κ)}).   (45)

Derivation of the right-hand expression is shown in Appendix A. The coupled entropy (C), normalized Tsallis, and Tsallis entropies then have the following relationship:

S^C_{α,κ} = S^{NT}_{α,κ}/(1+κ) = S^T_{α,κ}/( (1+κ) Σ_{j=1}^N p_j^{1+ακ/(1+κ)} ).   (46)

Figure 2 shows several entropy functions for the coupled Gaussian distribution (22) with scale σ = 1 over the nonlinear source range 0 < κ < 2. The coupled, Tsallis, and normalized Tsallis entropies use a matching value of coupling between the distribution and the entropy function. The importance of both normalizing the coupled probability and normalizing the generalized logarithm with the additive coupling is evident in the close-to-linear growth of the coupled entropy. Furthermore, the Shannon entropy is also close to linear, although a closed-form solution was not used and the numerical solution is susceptible to underestimation beyond κ > 1. The similarity in Figure 2 between the Shannon entropy and the coupled entropy holds only for σ = 1. The Rényi entropy, S^R_{α,κ} = ((1+κ)/(ακ)) ln( 1/Σ_{i=1}^N p_i^{1+ακ/(1+κ)} ), is also close to linear, though with a smaller slope. Details of the closed-form solutions and numerical integration for the plot are provided in Appendix B. The closed-form expression for the coupled entropy of the coupled exponential (or generalized Pareto) distribution is provided in Appendix C.

The decay in the Tsallis entropy for a distribution which clearly increases in uncertainty as the coupling increases is, upon first inspection, an odd outcome. However, it may reflect the restricted phase space of the coupled system for individual instances of a related process, while the original distribution represents the ensemble statistics. If so, a conjugate coupled entropy may be useful, in which either the opposite sign of the multiplier is used, as derived for the Tsallis entropy, or possibly the conjugate dual κ̂ is used for the transformation. A definition for the coupled conjugate entropy will require an investigation of the divergence in uncertainty between time-averaged and ensemble-averaged statistics for non-ergodic processes.
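The algebraic relations among these entropies can be confirmed numerically. Below is a short sketch (ours) checking Eqs. (44)-(46) for a discrete distribution, with q = 1 + ακ/(1+κ).

```python
import numpy as np

p = np.array([0.4, 0.3, 0.2, 0.1])
alpha, kappa = 2, 0.8
q = 1 + alpha * kappa / (1 + kappa)

Zq = np.sum(p ** q)
S_tsallis = (1 - Zq) / (q - 1)
S_norm_tsallis = S_tsallis / Zq
S_coupled = (1 / Zq - 1) / (alpha * kappa)       # Eq. (41)
S_renyi = np.log(Zq) / (1 - q)

assert np.isclose(S_coupled, S_norm_tsallis / (1 + kappa))   # Eq. (44)/(46)
assert np.isclose(S_coupled, S_tsallis / ((1 + kappa) * Zq)) # Eq. (46)
print(S_coupled, S_norm_tsallis, S_tsallis, S_renyi)
```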

[Figure 2: plot comparing the Normalized Tsallis, Coupled, Shannon, Conjugate Coupled, Rényi, and Tsallis entropies as a function of the coupling.]

Figure 2. Comparison of entropy functions. The coupled entropy is compared with the Tsallis, normalized Tsallis, Shannon, and Rényi entropies for a coupled normal distribution. The coupling of the entropy (except Shannon) and of the distribution match, and each entropy is plotted over the range of source coupling. Closed-form solutions exist for the coupled, Tsallis, and normalized Tsallis entropies. The Shannon and Rényi entropies are numerical approximations. The Shannon entropy, although integrated over ±5,000, has appreciable numerical error at the higher coupling values. The Rényi entropy is integrated over ±1,000.

5 Conclusion

The nonlinear statistical coupling of states is defined in terms of a source of nonlinear coupling in the system. Positive coupling, κ > 0, models non-stationary systems in which the variance fluctuates. The negative domain of κ, which has decreased variability, is limited to (−1, 0); and κ = 0 is the linear domain with exponential statistics. The maximum-entropy distributions for these systems, given a constrained location and scale, are the coupled exponential distributions defined in Equation (22). In addition to the nonlinear coupling, these distributions depend on the stability index α of the stretched exponential e^{−x^α}. The stability index modifies the additive coupling, which defines the coupled sum used to combine states, and the multiplicative coupling, which defines the coupled product used to combine probabilities. Thus three quantities define the nonlinear statistical coupling, {nonlinear source, additive coupling, multiplicative coupling}, symbolized by

{κ, ακ, ακ_d}, with κ_d = κ/(1+dκ). In this paper the one-dimensional case d = 1 was examined. When κ ≥ 0.5 the variance is no longer finite; thus traditional statistical tools like the sample variance and the Boltzmann-Gibbs-Shannon entropy are either undefined or inaccurate, and a measure of uncertainty which accounts for the nonlinearity is required. Rényi, Tsallis, and other investigators have defined generalized entropy functions for this purpose; however, the precise requirements and definitions are still active areas of research. In this paper we have defined the coupled entropy, which is a modification of the normalized Tsallis entropy. We also establish a definition for the coupled average uncertainty in the probability/density space. The coupled average probability has been successfully utilized in the analysis of information-fusion algorithms [8] and is recommended as a simpler approach to understanding average uncertainty.

The coupled average uncertainty is the weighted generalized mean with a moment equal to ακ/(1+κ). The expression is derived from, and expressible in terms of, the deformation of the product function (17) and the coupled power (18):

P_avg^{(α,κ)} = ( ⊗_{κ,i=1}^N (1/p_i)^{∧ P_i^{(α,κ)}} )^{−1}.   (47)

For κ = 0 this is the weighted geometric mean, in which each state is treated as independent and thus multiplied, and the dependency from the probabilities summing to one is contained in the weight of each probability. When κ ≠ 0 the coupled product models the nonlinear dependency between the probabilities of the distribution, and the weights are the deformed coupled probabilities. Importantly, for the coupled exponential distributions the coupled average probability is equal to the density at the location plus the scale, P_avg^{(α,κ)} = f(μ + σ). Upon transformation to the probability domain, the Tsallis, normalized Tsallis, and Rényi entropies each fulfill this relationship between the coupled average uncertainty and the coupled distributions. The potentially improved properties of the coupled entropy are achieved by requiring the coupled logarithm to have a unit integral over the unit interval. The coupled entropy is defined as the transformation of P_avg to the entropy scale by the coupled logarithm,

S_{α,κ}(p) ≡ ln_{α,κ}(1/P_avg^{(α,κ)}) = (1/(ακ))( ( Σ_{i=1}^N p_i^{1+ακ/(1+κ)} )^{−1} − 1 ).   (48)

Further analysis and experimentation is recommended to determine how the coupled entropy can be used to model the effects of non-stationary uncertainty in complex thermodynamic and information systems.

Appendices

A. Derivation of the coupled entropy from the average uncertainty

The coupled algebra can be used to derive the equivalency between the two expressions for the coupled entropy,

S_{α,κ}(p) ≡ ln_{α,κ}(1/P_avg^{(α,κ)}) = Σ_{i=1}^N P_i^{(α,κ)} ln_{α,κ}(1/p_i).

Starting with the expression on the right, the coupled power is used to move the coupled probability from a multiplier of the generalized logarithm to a power of the argument, given a ln_{α,κ}(x) = ln_{α,κ}(x^{∧a}):

Σ_{i=1}^N P_i^{(α,κ)} ln_{α,κ}(1/p_i) = Σ_{i=1}^N ln_{α,κ}( (1/p_i)^{∧ P_i^{(α,κ)}} ).

Expanding the coupled power and using the property ln_{α,κ}(⊗_κ x_i) = Σ_i ln_{α,κ}(x_i) of the coupled product,

Σ_{i=1}^N ln_{α,κ}( (1/p_i)^{∧ P_i^{(α,κ)}} ) = ln_{α,κ}( ⊗_{κ,i=1}^N (1/p_i)^{∧ P_i^{(α,κ)}} ) = ln_{α,κ}( ( Σ_{i=1}^N P_i^{(α,κ)} p_i^{−ακ/(1+κ)} )^{(1+κ)/(ακ)} ).

Given Σ_{i=1}^N P_i^{(α,κ)} p_i^{−ακ/(1+κ)} = ( Σ_{j=1}^N p_j^{1+ακ/(1+κ)} )^{−1} = ( P_avg^{(α,κ)} )^{−ακ/(1+κ)}, the argument of the logarithm is 1/P_avg^{(α,κ)}, which completes the derivation.

Since the normalized Tsallis entropy is just a multiple of the coupled entropy, the derivation is not repeated. For the Tsallis entropy, the normalization of the coupled probability must be incorporated into the argument of the generalized logarithm, so that the sign of the multiplier changes:

S^T_{α,κ}(p) ≡ −(1+κ) ln_{α,κ}(P_avg^{(α,κ)}) = ((1+κ)/(ακ))(1 − Σ_{i=1}^N p_i^{1+ακ/(1+κ)}),

which, with q = 1 + ακ/(1+κ), is the standard Tsallis form (1 − Σ_{i=1}^N p_i^q)/(q − 1).

B. Comparison of the generalized entropies for the coupled Gaussian

The comparison of entropy functions for the coupled Gaussian distribution shown in Figure 2 includes the following computations. The distribution analyzed is

G_κ(0,1)(x) ≡ ( Γ((1+κ)/(2κ)) / ( √(π/κ) Γ(1/(2κ)) ) ) (1 + κx²)^{−(1+κ)/(2κ)}.   (49)

The computations were completed using Mathematica™. The Padé approximation functions were used to determine 2nd-order approximations. The closed-form solutions and quadratic approximations for the coupled, Tsallis, and normalized Tsallis entropies are:

a) Coupled entropy of the coupled Gaussian:

S^C_{2,κ}(G_κ(0,σ)) = (1/(2κ))( (1+κ) Z^{2κ/(1+κ)}(κ,σ,2) − 1 ),  Z(κ,σ,2) = √(π/κ) σ Γ(1/(2κ))/Γ((1+κ)/(2κ)),   (50)

and for σ = 1, S^C_{2,κ}(G_κ(0,1)) ≈ 1.4 + 1.1384κ + 0.1092κ².

b) Tsallis entropy of the coupled Gaussian with σ = 1:

S^T_{2,κ}(G_κ(0,1)) = ((1+κ)/(2κ))( 1 − (1+κ)^{−1} Z^{−2κ/(1+κ)}(κ,1,2) ).   (51)

c) Normalized Tsallis entropy of the coupled normal with σ = 1:

S^{NT}_{2,κ}(G_κ(0,1)) = (1+κ) S^C_{2,κ}(G_κ(0,1)) = ((1+κ)/(2κ))( (1+κ) Z^{2κ/(1+κ)}(κ,1,2) − 1 ) ≈ 1.425 + 2.469κ + 1.332κ².   (52)

Closed-form solutions were not determined for the Shannon and Rényi entropies. The numerical integration of the Shannon entropy has potential errors if a) for small values of κ the integration extends to regions where f(x) ln f(x) is undefined, or b) for large values of κ the integration is truncated before f(x) ln f(x) has decayed sufficiently to be excluded. The FindFit function in Mathematica was used to determine the quadratic approximations.

a) Shannon entropy for the coupled Gaussian with σ = 1 (sampled from κ = 0.01 to 2 by 0.01):

≈ 1.392 + 1.175κ − 0.038κ².   (53)

b) Rényi entropy for the coupled Gaussian with σ = 1 (sampled from κ = 0.1 to 2 by 0.01; values κ = 0.01 to 0.1 were eliminated due to numerical errors):

≈ 1.425 + 0.472κ − 0.060κ².   (54)

C. Coupled entropy of the coupled exponential distribution

The coupled exponential, or equivalently the generalized Pareto distribution (α = 1), f(x) = σ^{−1} exp_κ^{−1}(x/σ), has a matching coupled entropy of

S^C_{1,κ}(f) = (1/κ)( (1+κ) σ^{κ/(1+κ)} − 1 ),   (55)

which is equal to 1 for σ = 1. This emphasizes that the coupled entropy accounts for uncertainty through the combination of the degree of nonlinear coupling, which quantifies the non-stationary fluctuations (see Equation (5)), and a modified entropy measure which facilitates measurement of the central tendency of the uncertainty. This viewpoint should contribute to research aimed at improving the designs of systems influenced by non-stationary processes.
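Equation (55) can be verified numerically. Below is a sketch (ours) computing the coupled entropy of the generalized Pareto density by direct integration of Eqs. (35) and (40) and comparing it to the closed form.

```python
import numpy as np

kappa, sigma = 0.7, 1.0
m = kappa / (1 + kappa)                          # alpha = 1 moment
x = np.linspace(0, 2e4, 2_000_001)
f = (1 / sigma) * (1 + kappa * x / sigma) ** (-(1 + kappa) / kappa)
g = f ** (1 + m)
integral = np.sum((g[1:] + g[:-1]) / 2) * (x[1] - x[0])   # trapezoid rule
f_avg = integral ** (1 / m)                      # Eq. (35)
S_numeric = (f_avg ** (-m) - 1) / kappa          # ln_{1,k}(1/f_avg), Eq. (40)
S_closed = ((1 + kappa) * sigma ** m - 1) / kappa          # Eq. (55)
print(S_numeric, S_closed)                       # both ~ 1 for sigma = 1
```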

References

[1] T.M. Cover, J.A. Thomas, Elements of Information Theory, John Wiley & Sons, 2012.
[2] A.I. Khinchin, Mathematical Foundations of Statistical Mechanics, Courier Corporation, 1949.
[3] A.I. Khinchin, Mathematical Foundations of Information Theory, Courier Corporation, 1957.
[4] A. Rényi, On measures of entropy and information, Berkeley Symp. Math. Stat. (1961).
[5] C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479-487.
[6] S.I. Amari, H. Nagaoka, Methods of Information Geometry, Oxford University Press, 2000.
[7] C. Anteneodo, C. Tsallis, A.S. Martinez, Risk aversion in trade transactions, Europhys. Lett. 59 (2002) 635-641.
[8] K.P. Nelson, B.J. Scannell, H. Landau, A risk profile for information fusion algorithms, Entropy 13 (2011) 1518-1532.
[9] I. Santamaria, D. Erdogmus, J.C. Principe, Entropy minimization for supervised digital communications channel equalization, IEEE Trans. Signal Process. 50 (2002) 1184-1192. doi:10.1109/78.995074.
[10] W.R. Carson, M. Chen, M.R.D. Rodrigues, R. Calderbank, L. Carin, Communications-inspired projection design with application to compressive sensing, SIAM J. Imaging Sci. 5 (2012) 1185-1212. doi:10.1137/120878380.
[11] E.P. Borges, C. Tsallis, G.F.J. Ananos, P.M.C. de Oliveira, Nonequilibrium probabilistic dynamics of the logistic map at the edge of chaos, Phys. Rev. Lett. 89 (2002) 254103.
[12] N. Canosa, R. Rossignoli, Generalized nonadditive entropies and quantum entanglement, Phys. Rev. Lett. 88 (2002) 170401.
[13] R. Hanel, S. Thurner, M. Gell-Mann, Generalized entropies and the transformation group of superstatistics, Proc. Natl. Acad. Sci. 108 (2011) 6390-6394.
[14] R. Hanel, S. Thurner, A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions, EPL (Europhysics Letters) 93 (2011) 20006.
[15] K.P. Nelson, S. Umarov, Nonlinear statistical coupling, Phys. A Stat. Mech. Its Appl. 389 (2010) 2157-2163.
[16] C. Anteneodo, C. Tsallis, Multiplicative noise: A mechanism leading to nonextensive statistical mechanics, J. Math. Phys. 44 (2003) 5194.
[17] C. Beck, E.G.D. Cohen, Superstatistics, Phys. A Stat. Mech. Its Appl. 322 (2003) 267-275.
[18] G. Wilk, Z. Włodarczyk, Interpretation of the nonextensivity parameter q in some applications of Tsallis statistics and Lévy distributions, Phys. Rev. Lett. 84 (2000) 2770-2773. doi:10.1103/PhysRevLett.84.2770.
[19] S. Umarov, C. Tsallis, On multivariate generalizations of the q-central limit theorem consistent with nonextensive statistical mechanics, AIP Conf. Proc. 965 (2007) 34.
[20] K.P. Nelson, A definition of the coupled-product for multivariate coupled-exponentials, Phys. A Stat. Mech. Its Appl. (2014).
[21] E.P. Borges, A possible deformed algebra and calculus inspired in nonextensive thermostatistics, Phys. A Stat. Mech. Its Appl. 340 (2004) 95-101.
[22] G. Kaniadakis, M. Lissia, A.M. Scarfone, Two-parameter deformations of logarithm, exponential, and entropy: A consistent framework for generalized statistical mechanics, Phys. Rev. E 71 (2005) 046128.
[23] T.J. Arruda, R.S. González, C.A.S. Terçariol, A.S. Martinez, Arithmetical and geometrical means of generalized logarithmic and exponential functions: generalized sum and product operators, Phys. Lett. A 372 (2008) 2578-2582.
[24] S. Umarov, C. Tsallis, M. Gell-Mann, S. Steinberg, Generalization of symmetric α-stable Lévy distributions for q > 1, J. Math. Phys. 51 (2010) 033502.
[25] S. Umarov, C. Tsallis, M. Gell-Mann, S. Steinberg, Generalization of symmetric alpha-stable Lévy distributions for q > 1, J. Math. Phys. 51 (2010) 033502. doi:10.1063/1.3305292.
[26] C. Tsallis, Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World, Springer Verlag, 2009.
[27] C. Tsallis, A.R. Plastino, R.F. Alvarez-Estrada, Escort mean values and the characterization of power-law-decaying probability densities, J. Math. Phys. 50 (2009) 043303.
[28] T. Oikonomou, Tsallis, Renyi and nonextensive Gaussian entropy derived from the respective multinomial coefficients, Phys. A Stat. Mech. Its Appl. 386 (2007) 119-134.