
Revista mexicana de física

Print version ISSN 0035-001X

Rev. mex. fis. vol. 64 no. 6, México, Nov./Dec. 2018. Epub 05-Nov-2019


Peculiarities of some classical variational treatments using the maximum entropy principle

A. Plastino^{a,c,d}

M.C. Rocca^{a,b,c}

a Departamento de Física, Universidad Nacional de La Plata,

b Departamento de Matemática, Universidad Nacional de La Plata,

c Consejo Nacional de Investigaciones Científicas y Tecnológicas, (IFLP-CCT-CONICET)-C. C. 727, 1900 La Plata - Argentina.

d SThAR - EPFL, Lausanne, Switzerland.


We study some peculiarities of the classical variational treatment that applies Jaynes’ maximum entropy principle; the associated variational procedure is usually called MaxEnt. We examine it in connection with thermodynamics’ reciprocity relations. Two points of view are adopted: (A) a purely abstract one, concerned solely with ascertaining compliance of the variational solutions with the reciprocity relations, for which one does not need explicit values of the Lagrange multipliers; and (B) a straightforward variational process in which one explicitly obtains the specific values of these multipliers. We focus on the so-called q-entropy because it illustrates a situation in which the two approaches yield different results. We detect an information loss in extracting the explicit form of the normalization-associated Lagrange multiplier.

Keywords: Tsallis-entropy; MaxEnt; Variational treatments; Reciprocity relations

PACS: 05.20.-y; 05.70.Ce; 05.90.+m

1. Introduction

The most popular route to the main formalisms of Statistical Mechanics proceeds today via application of Jaynes’ maximum entropy principle, usually abbreviated as MaxEnt [1]. We apply it here in connection with generalized entropies of the Tsallis type, which have become a very important sub-field of statistical mechanics, with several thousand papers and applications in many scientific disciplines [2,3]. Since the pertinent literature is far too large to survey here, we mainly direct the reader to [2,3] and references therein. In this effort we focus attention on reciprocity relations and reconsider some issues concerning generalized entropies that, we believe, still lack adequate understanding, even if they have been on the discussion table for many years in a variety of publications. In particular, we want to shed light on some issues regarding the canonical ensemble. More specifically:

  1. The way to explicitly obtain the normalization Lagrange multiplier λ_N in Tsallis’ variational problem with linear constraints.

  2. A two-way approach to reciprocity relations.

  3. Differences between them that entail information loss.

2. Preliminary matters

2.1. Notation

  • λ_U is the energy (U) multiplier, which is related to the system’s temperature T;

  • λ_N is the normalization multiplier.

In statistical mechanics, these multipliers are always endowed with meaningful physical information [4].

We use the q-functions[2]

e_q(x) = [1 + (1-q)x]^{1/(1-q)};  e_q(x) = exp(x) for q = 1; (1)

ln_q(x) = (x^{1-q} - 1)/(1-q);  ln_q(x) = ln(x) for q = 1. (2)
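Definitions (1)-(2) are easy to exercise numerically. The following is a minimal Python sketch (the function names e_q and ln_q are ours); it shows the q → 1 limits and the fact that ln_q inverts e_q on its domain:

```python
import math

def e_q(x, q):
    """q-exponential of Eq. (1): [1 + (1-q)x]^(1/(1-q)); plain exp(x) for q = 1.
    Valid where 1 + (1-q)x > 0."""
    if q == 1.0:
        return math.exp(x)
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

def ln_q(x, q):
    """q-logarithm of Eq. (2): (x^(1-q) - 1)/(1-q); plain ln(x) for q = 1."""
    if q == 1.0:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

# ln_q undoes e_q, mirroring the ordinary log/exp pair:
print(ln_q(e_q(0.5, 1.3), 1.3))   # 0.5 (up to rounding)
```

The pair reduces smoothly to log/exp as q → 1, which is what makes the q-formalism a one-parameter deformation of Boltzmann-Gibbs.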

2.2. Reciprocity relations and thermodynamics

It is well known that the Legendre transform (LT) is an operation that converts a real function f of a real variable x into another function f^T, of another variable y, while keeping constant the information content of f. The derivative of f becomes the argument of f^T:

f^T(y) = xy - f(x);  y = f'(x)  (reciprocity). (3)

The LT is its own inverse. Famously, one appeals to it to pass from Lagrangians to Hamiltonians in classical mechanics.
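As a concrete check of (3), one can transform the convex function f(x) = x²/2 (a free-particle Lagrangian in suitable units), whose LT is f^T(y) = y²/2. The bisection-based inversion of y = f'(x) below is our own illustrative sketch, valid under the assumption that f' is strictly increasing:

```python
def legendre_transform(f, df, y):
    """Legendre transform via Eq. (3): solve y = f'(x) for x by bisection
    (df is assumed strictly increasing), then return x*y - f(x)."""
    lo, hi = -1e6, 1e6
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if df(mid) < y:
            lo = mid
        else:
            hi = mid
    x = 0.5 * (lo + hi)
    return x * y - f(x)

f = lambda x: 0.5 * x * x    # convex "Lagrangian"
df = lambda x: x             # its derivative, y = f'(x)
print(legendre_transform(f, df, 3.0))   # 4.5 = 3.0**2/2, i.e. f^T(y) = y^2/2
```

Applying the routine twice to f^T recovers f, reflecting the LT being its own inverse.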

The LT’s reciprocity relations constitute thermodynamics’ essential formal ingredient [5]. For two functions I (for instance, an information measure) and Ĩ, one has

I(A_1, …, A_M) = Ĩ + Σ_{k=1}^{M} λ_k A_k, (4)

with the A_i extensive variables and the λ_i independent intensive ones. Obviously, the Legendre transform’s main goal is that of changing the identity of our relevant independent variables. For Ĩ we have

Ĩ(λ_1, …, λ_M) = I - Σ_{k=1}^{M} λ_k A_k. (5)

One further has [5]

∂Ĩ/∂λ_k = -A_k;  ∂I/∂A_k = λ_k;  ∂I/∂λ_i = Σ_{k=1}^{M} λ_k ∂A_k/∂λ_i, (6)

the reciprocity relations, the last one being the so-called Euler theorem. In this paper we pay special attention to the specific reciprocity relation

∂I/∂A_k = λ_k. (7)

Why? Because in Jaynes’ philosophy [1], I is the information amount, to be maximized subject to known constraints A_k. The associated Lagrange multipliers are to be obtained by solving the partial differential equations (7), which hereafter will be called the determining relations (DR).

2.3. Boltzmann-Gibbs (BG) MaxEnt variational problem

It is useful to recapitulate it. We work in R^N. The volume element is called dV. One has

S_BG = ∫ dV f(p) = -∫ dV p ln p, (8)

with the variational problem leading to

f'(p) - λ_N - λ_U U = 0, (9)


f' = -ln p - 1. (10)

We now define g as the inverse of f', such that g[f'(ξ)] = ξ, and here

g(ν) = exp[-(ν + 1)], (11)

and thus

ξ_ME = g(λ_N + λ_U U), (12)


ξ_ME = exp[-(λ_N + 1 + λ_U U)] = exp[-(λ_N + 1)] exp(-λ_U U), (13)

so that one can easily extract, via normalization, a partition function Z

∫ dV exp[-(λ_N + 1)] exp(-λ_U U) = 1;
Z = ∫ dV exp(-λ_U U) = exp(λ_N + 1);
ξ_ME = exp(-λ_U U) / ∫ dV exp(-λ_U U);
ln Z = λ_N + 1, (14)

and one obtains explicitly the relation between Z and λ_N. It goes without saying that the reciprocity relations are satisfied [1], in particular the determining relation of the preceding subsection. Moreover, it is found that

λ_U = 1/kT, (15)

with k Boltzmann’s constant.
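A discrete toy analogue (three energy levels of our own choosing, purely for illustration) makes (13)-(15) concrete: once Z is known, λ_N = ln Z - 1, and the variational condition (9) is satisfied level by level:

```python
import math

E = [0.0, 1.0, 2.5]     # assumed toy energy levels
beta = 0.7              # lambda_U = 1/kT, Eq. (15)

Z = sum(math.exp(-beta * e) for e in E)     # partition function, Eq. (14)
p = [math.exp(-beta * e) / Z for e in E]    # Gibbs weights, Eq. (13)
lam_N = math.log(Z) - 1.0                   # from ln Z = lambda_N + 1

# Check the variational Eq. (9) with f'(p) = -ln p - 1 of Eq. (10):
residuals = [-math.log(pi) - 1.0 - lam_N - beta * e for pi, e in zip(p, E)]
print(max(abs(r) for r in residuals))       # ~0: the Gibbs weights are the extremum
```

The point to retain for what follows: here λ_N drops out of the normalization as an overall factor of exp[-(λ_N + 1)], which is exactly what fails in the q ≠ 1 case.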

3. Normalization of the Lagrange multiplier in Tsallis’ MaxEnt with linear constraints

3.1. Variational problem

This subject was fully treated for i) trace-form entropies and ii) M observables as constraints in [6]. We regard as instructive the explicit application of such a discussion to Tsallis’ entropy, which, as far as we know, has not been given before in the detailed fashion presented here.

Our probability density functions (PDFs) are denoted by Greek letters like ξ. ξ_ME stands for the MaxEnt PDF.

We have two identical ways of defining the q-entropy,

S_1 = ∫ dV f(ξ), (16)


f(ξ) = (ξ - ξ^q)/(q - 1), (17)


S_2 = [1 - ∫ dV ξ^q]/(q - 1), (18)

which, however, lead to different variational problems, as we shall immediately see. Our a priori knowledge is the mean energy U (canonical ensemble). The MaxEnt variational problem for S_2 becomes

f'(ξ) - λ_N - λ_U U = -qξ^{q-1}/(q-1) - λ_N - λ_U U = 0, (19)


f'(ξ) = -qξ^{q-1}/(q - 1). (20)

Instead, for S1 one has

(1 - qξ^{q-1})/(q - 1) - λ_U U - λ_N = 0, (21)

f'(ξ) = (1 - qξ^{q-1})/(q - 1). (22)

We now define g as the inverse of f', such that g[f'(ξ)] = ξ. One has, for the S_1 instance,

g(ν) = q^{1/(1-q)} [1 - (q-1)ν]^{1/(q-1)} = q^{1/(1-q)} e_{2-q}(-ν). (23)

From (21) it is obvious that

ξ_ME = g(λ_N + λ_U U), (24)


ξ_ME = g(λ_N + λ_U U) = q^{1/(1-q)} [1 - (q-1)(λ_N + λ_U U)]^{1/(q-1)}, (25)

so that one cannot extract λ_N from that expression. One does not obtain explicitly the relation between Z and λ_N as in (14) for BG.

A similar result is obtained if we consider S2 instead of S1, i.e.,

ξ_ME = g(λ_N + λ_U U) = [1 - (q-1)(λ_N + λ_U U)]^{1/(q-1)}. (26)

Note that the g for S_1 and the g for S_2 are slightly different.

This extraction problem affects only S_q’s MaxEnt treatment with linear constraints and can be avoided by recourse to nonlinear constraints, as in Refs. [7,8].
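The root of the extraction problem is that, unlike the ordinary exponential in (13), the q-exponential does not factorize over sums, so the λ_N-dependent factor cannot be pulled out of (25) as a prefactor. A short numerical check (our illustration, with arbitrarily chosen arguments):

```python
import math

def e_q(x, q):
    """q-exponential of Eq. (1), valid where 1 + (1-q)x > 0."""
    if q == 1.0:
        return math.exp(x)
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

a, b = 0.4, 0.7
# For q = 1 the exponential factorizes, which is what lets Z be pulled out in Eq. (14):
print(e_q(a + b, 1.0) - e_q(a, 1.0) * e_q(b, 1.0))  # ~0
# For q != 1 it does not, so lambda_N stays trapped inside the power in Eq. (25):
print(e_q(a + b, 1.5) - e_q(a, 1.5) * e_q(b, 1.5))  # clearly nonzero
```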

Our interest resides precisely in discussing the peculiarities of the MaxEnt variational treatment that emerge with regard to the present problem (this is the reason for dealing with linear constraints here). The information contained in our variational problem can be managed in two different manners. The following subsection deals with the first of these ways, which we may call the abstract one. Sections 4 and 5 consider an alternative route, which we may call the explicit one. The two ways yield different results, as we shall see, originating what we regard as an information-management problem.

3.2. Reciprocity relations

These pitfalls notwithstanding, the reciprocity relations hold. This is so because they depend only on the formal existence of the function g(ν). We now specify for this case the general treatment of [6]. In the subsequent manipulations we do not need to distinguish between S_1 and S_2.

Because of ξ-normalization it is clear that, in general, both for S1 and S2, one has

∂/∂λ_U ∫ dV ξ = ∫ dV g'(λ_N + λ_U U) [∂λ_N/∂λ_U + U] = 0, (27)

a relation that we will presently employ. Also, for ∂U/∂λ_U we have

∂U/∂λ_U = ∫ dV U g'(λ_N + λ_U U) [∂λ_N/∂λ_U + U]. (28)

Next we consider ∂S/∂λ_U and write

∂S/∂λ_U = ∂/∂λ_U ∫ dV f[g(λ_N + λ_U U)] = ∫ dV f'[g(λ_N + λ_U U)] (29)

× g'(λ_N + λ_U U) [∂λ_N/∂λ_U + U]. (30)

Remembering that f'[g(ν)] = ν and (27), we simplify this to

∂S/∂λ_U = λ_U ∫ dV U g'(λ_N + λ_U U) [∂λ_N/∂λ_U + U], (31)

leading to

∂S/∂λ_U = λ_U ∂U/∂λ_U, (32)

the Euler relation. Now we have

∂S/∂U = (∂S/∂λ_U)(∂λ_U/∂U) = λ_U (∂U/∂λ_U)(∂λ_U/∂U) = λ_U, (33)

the first reciprocity relation.

Introduce now Jaynes’ parameter S̃ (the Legendre transform of S):

S̃(λ_U) = S(U) - λ_U U(λ_U). (34)

It is clear that

∂S̃/∂λ_U = (∂S/∂U)(∂U/∂λ_U) - λ_U ∂U/∂λ_U - U = -U, (35)

the second reciprocity relation. Note that we do not need to assign explicit values to λ_U and λ_N in order to establish the reciprocity relations. In particular, we do not need to solve the determining equation of Subsec. 2.2.

3.3. Solving for the Lagrange multipliers

Usually, this is a very difficult numerical problem. A practical alternative is to numerically solve, once we have ξ_ME, the set of M+1 equations that read, using the notation of Subsec. 2.2,

∫ dV ξ_ME(λ_0, λ_1, …, λ_M) A_k = ⟨A_k⟩, (36)

with k = 0, 1, …, M (A_0 = 1). This gives the M+1 Lagrange multipliers in terms of the assumedly known M+1 quantities ⟨A_k⟩ [in particular, if there is an energy multiplier (called λ_U above), it is set equal to 1/kT]. See Ref. [9] for details. Can one bypass this difficult process? This is what we will try to do next.
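On a discrete toy system this recipe can be carried out with nothing more than nested bisections (the energies, the value of q, and the target mean energy below are our assumed illustration values; any standard root finder would serve equally well):

```python
# Sketch of the Sec. 3.3 recipe: fix (lambda_N, lambda_U) from the M+1 = 2
# constraint equations (36) alone: normalization and a prescribed mean energy.
E = [0.0, 1.0, 2.5]    # assumed toy energy levels
q = 1.5
U_target = 0.8         # assumed known mean energy <U>

def xi(lam_N, lam_U):
    """Discrete version of the MaxEnt density (26), with the Tsallis cutoff."""
    out = []
    for e in E:
        base = 1.0 - (q - 1.0) * (lam_N + lam_U * e)
        out.append(base ** (1.0 / (q - 1.0)) if base > 0.0 else 0.0)
    return out

def lam_N_of(lam_U):
    """Inner bisection: sum(xi) = 1 fixes lambda_N (no closed form exists)."""
    lo, hi = -1000.0, 1000.0        # sum(xi) decreases monotonically in lambda_N
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if sum(xi(mid, lam_U)) > 1.0 else (lo, mid)
    return 0.5 * (lo + hi)

def mean_E(lam_U):
    p = xi(lam_N_of(lam_U), lam_U)
    return sum(pi * e for pi, e in zip(p, E))

# Outer bisection on lambda_U to match the energy constraint <U> = U_target:
lo, hi = -5.0, 5.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if mean_E(mid) > U_target else (lo, mid)
lam_U = 0.5 * (lo + hi)
print(lam_U, mean_E(lam_U))   # the multiplier reproducing the assumed <U>
```

Nothing here requires an explicit formula for λ_N: the multiplier is simply the root of the normalization condition, found numerically for each trial λ_U.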

4. Explicit Lagrange multipliers in Tsallis’ original variational problem for S_1

Return now to (21). We saw that we cannot immediately derive from it a value for λ_N [2]. A heuristic solution is to set

λ_N = -[q/(q-1)] Z_T^{1-q} + 1/(q-1) = (1/(q-1)) [1 - qZ_T^{1-q}], (37)

with Z_T unknown at this stage, and λ_U as follows:

λ_U = qZ_T^{1-q} β, (38)

where β = 1/(k_B T), with T the temperature. The variational problem is now

(1 - qξ^{q-1})/(q-1) + [q/(q-1)] Z_T^{1-q} - 1/(q-1) - qZ_T^{1-q} βU = 0, (39)

ξ^{q-1} = Z_T^{1-q} [1 - (q-1)βU], (40)

so that

ξ_ME = Z_T^{-1} [1 - (q-1)βU]^{1/(q-1)}, (41)

where β is NOT the variational multiplier λ_U, as stated above. Further,

Z_T = ∫ dV [1 - (q-1)βU]^{1/(q-1)}. (42)

We have now

ξ^q ξ^{1-q} = ξ;  ξ^q ln_q(ξ) = (ξ - ξ^q)/(1 - q), (43)

and then

S_q = -∫ dV ξ^q ln_q(ξ) = ∫ dV ξ [1 - ξ^{q-1}]/(q-1) (44)

= ∫ dV ξ {1 - Z_T^{1-q} [1 - (q-1)βU]}/(q-1) (45)

= ∫ dV ξ {[1 - Z_T^{1-q}]/(q-1) + Z_T^{1-q} βU} (46)

= ∫ dV ξ [ln_q(Z_T) + Z_T^{1-q} βU], (47)

S_q = ln_q(Z_T) + Z_T^{1-q} βU, (48)

so that

∂S_q/∂U = βZ_T^{1-q} = λ_U/q, (49)

the quasi-reciprocity relation, with an extra denominator q. Giving λ_N an explicit form has resulted in a wrong reciprocity relation. This fact can be interpreted as an information loss, resulting from not solving the determining equation of Subsec. 2.2.
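Identity (48) can be verified numerically on a discrete toy spectrum (the values below are our assumptions, chosen so that 1 - (q-1)βE stays positive). The sketch builds ξ_ME from (41)-(42) and compares both sides of (48):

```python
E = [0.0, 0.7, 1.5]     # assumed toy spectrum
q, beta = 1.5, 0.5

w = [(1.0 - (q - 1.0) * beta * e) ** (1.0 / (q - 1.0)) for e in E]
ZT = sum(w)                                        # Eq. (42)
xi = [wi / ZT for wi in w]                         # Eq. (41)

U = sum(x * e for x, e in zip(xi, E))              # mean energy <U>
Sq = (1.0 - sum(x ** q for x in xi)) / (q - 1.0)   # the q-entropy, Eq. (18)
lnq_ZT = (ZT ** (1.0 - q) - 1.0) / (1.0 - q)       # q-logarithm, Eq. (2)

print(Sq - (lnq_ZT + ZT ** (1.0 - q) * beta * U))  # ~0: Eq. (48) holds
```

Since λ_U = qZ_T^{1-q}β by (38), the slope of S_q with respect to U in this parametrization is βZ_T^{1-q} = λ_U/q rather than λ_U, which is precisely the quasi-reciprocity just discussed.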

5. Explicit Lagrange multipliers for S_2

Return now to (19). Let us insist on solving the variational problem, though without solving the determining equation of Subsec. 2.2. A heuristic solution is to set [see [10]]

λ_N = -[q/(q-1)] Z_T^{1-q} + 1/(q-1) = (1/(q-1)) [1 - qZ_T^{1-q}], (50)

Z_T as yet unknown, (51)

and rename λU as follows

λ_U = qZ_T^{1-q} γ,  i.e.  γ = λ_U/(qZ_T^{1-q}). (52)

Now γ can be set equal to 1/kT. The variational problem here reads

-qξ^{q-1}/(q-1) + [q/(q-1)] Z_T^{1-q} - qZ_T^{1-q} γU = 0, (53)

ξ^{q-1} = Z_T^{1-q} [1 - (q-1)γU], (54)

so that

ξ_ME = Z_T^{-1} e_{2-q}(-γU) = Z_T^{-1} [1 - (q-1)γU]^{1/(q-1)};  ξ_ME^{q-1} = Z_T^{1-q} [1 - (q-1)γU], (55)

where γ is not the variational multiplier λ_U. Further,

Z_T = ∫ dV [1 - (q-1)γU]^{1/(q-1)}. (56)

Thus, we have

ξ^q ξ^{1-q} = ξ;  ξ^q ln_q(ξ) = (ξ - ξ^q)/(1 - q), (57)

and then

S_q = -∫ dV ξ^q ln_q(ξ) = ∫ dV ξ [1 - ξ^{q-1}]/(q-1) (58)

= ∫ dV ξ {1 - Z_T^{1-q} [1 - (q-1)γU]}/(q-1) (59)

= ∫ dV ξ {[1 - Z_T^{1-q}]/(q-1) + Z_T^{1-q} γU} (60)

= ∫ dV ξ [ln_q(Z_T) + Z_T^{1-q} γU], (61)

S_q = ln_q(Z_T) + Z_T^{1-q} γU = ln_q(Z_T) + λ_U U/q, (62)

so that one is led to a slightly modified reciprocity relation

∂S_q/∂U = γZ_T^{1-q} = λ_U/q, (63)

identical in form to that for S_1, but wrong as well. Once more, giving λ_N an explicit form has resulted in an incorrect reciprocity relation. We detect, then, what can be read as an information loss, resulting from not solving the determining equation of Subsec. 2.2.

6. Conclusions

We have revisited Tsallis’ original treatment of the q-entropy[2] with focus on the reciprocity relations. They are valid, of course, as has been known for many years already. However, some peculiarities of the treatment have been detected here.

  • There are two forms of casting the q-entropy, which we called S_1 and S_2. They are identical as functions, but the associated MaxEnt variational treatments differ if one wants explicit values for the Lagrange multipliers.

  • The Lagrange normalization multiplier λ_N cannot be obtained in explicit fashion, as is well known [2]. We have found here ways to overcome this obstacle and obtained two different versions of λ_N, associated with S_1 and S_2, respectively.

  • The ways found here to overcome the obstacle cannot be used in physical applications, though, since they entail information loss. The main physical consequence of this fact is that appeal to the methodology of Sec. 3.3 is unavoidable.

  • There is a price to pay for our λ_N-extraction procedure. The ensuing entropy/energy reciprocity relation, ∂S_q/∂⟨U⟩, equals not λ_U (as it should) but λ_U/q, for both S_1 and S_2, as a result of not solving the determining equation of Subsec. 2.2. It is nonetheless gratifying that our two wrong equations coincide, since the entropy is just one and the physics, i.e., the reciprocity relation, should not depend on whether we use S_1 or S_2.

  • Note that S_1 = S_2 as functions, but their associated variational problems are not identical.

  • In abstract form one can show that the reciprocity relations are indeed valid, as we did in Sec. 3.2, without appealing to explicit knowledge of the Lagrange multipliers. One only requires that a special function that we called g exists.

  • This g differs between the S_1 and S_2 cases.

  • Obtaining the Lagrange normalization multiplier λ_N in explicit fashion, although accomplished via a seemingly legitimate symbolic manipulation, entails some information loss, as the entropy-energy reciprocity relation is not exactly re-obtained; this results from not solving the determining equation of Subsec. 2.2. Usually, that is a very difficult numerical problem. A practical alternative, as we saw above, is to numerically solve, once we have ξ_ME, the set of M+1 equations (36).


1. E.T. Jaynes, in Statistical Physics, ed. by K.W. Ford (Benjamin, New York, 1963);

A. Katz, Statistical Mechanics (Freeman, San Francisco, 1967).

2. M. Gell-Mann and C. Tsallis (Eds.), Nonextensive Entropy: Interdisciplinary Applications (Oxford University Press, Oxford, 2004);

C. Tsallis, Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World (Springer, New York, 2009).

3. See for a regularly updated bibliography on the subject.

4. R.B. Lindsay and H. Margenau, Foundations of Physics (Dover, New York, 1957).

5. E.A. Desloge, Thermal Physics (Holt, Rinehart and Winston, New York, 1968).

6. A. Plastino and A.R. Plastino, Phys. Lett. A 226 (1997) 257.

7. E.M.F. Curado and C. Tsallis, J. Phys. A 24 (1991) L69.

8. C. Tsallis, R.S. Mendes, and A.R. Plastino, Physica A 261 (1998) 534.

9. J. Skilling, Massive Inference and Maximum Entropy, in Maximum Entropy and Bayesian Methods: Proceedings of the 17th International Workshop on Maximum Entropy and Bayesian Methods of Statistical Analysis, eds. J. Erickson, J.T. Rychert, and C.R. Smith (Springer, New York, 1997), pp. 1-14.

10. A. Plastino, M.C. Rocca, and F. Pennini, Phys. Rev. E 94 (2016) 012145.

Received: June 05, 2018; Accepted: August 02, 2018

This is an open-access article distributed under the terms of the Creative Commons Attribution License.