Inequalities for Information Potentials and Entropies

Abstract

We consider a probability distribution \((p_0(x), p_1(x), \ldots)\) depending on a real parameter \(x\). The associated information potential is \(S(x):=\sum_{k}p_{k}^2(x)\). The Rényi entropy and the Tsallis entropy of order \(2\) can be expressed as \(R(x) = -\log S(x)\) and \(T(x) = 1 - S(x)\). We establish recurrence relations, inequalities and bounds for \(S(x)\), which lead immediately to similar relations, inequalities and bounds for the two entropies. We show that some sequences \((R_n(x))_{n\geq 0}\) and \((T_n(x))_{n\geq 0}\), associated with sequences of classical positive linear operators, are concave and increasing. Two conjectures are formulated, involving the information potentials associated with the Durrmeyer probability density and the Bleimann–Butzer–Hahn probability distribution, respectively.

Authors

Ana Maria Acu
Lucian Blaga University of Sibiu, Sibiu, Romania

Alexandra Măduța
Technical University of Cluj-Napoca, Cluj-Napoca, Romania

Diana Otrocol
Technical University of Cluj-Napoca, Cluj-Napoca, Romania
Tiberiu Popoviciu Institute of Numerical Analysis, Romanian Academy

Ioan Rașa
Technical University of Cluj-Napoca, Cluj-Napoca, Romania

Keywords

probability distribution; Rényi entropy; Tsallis entropy; information potential; functional equations; inequalities



Cite this paper as:

A.M. Acu, A. Măduța, D. Otrocol, I. Rașa, Inequalities for information potentials and entropies, Mathematics 8 (2020), no. 11, art. 2056, doi: 10.3390/math8112056

About this paper

Journal

Mathematics

Publisher Name

MDPI

Print ISSN

Not available yet.

Online ISSN

2227-7390

References

1. Harremoës, P., Topsoe, F., Inequalities between entropy and index of coincidence derived from information diagrams. IEEE Trans. Inf. Theory 2001, 47, 2944–2960.
2. Harremoës, P., Binomial and Poisson distribution as maximum entropy distributions. IEEE Trans. Inf. Theory 2001, 47, 2039–2041.
3. Hillion, E., Concavity of entropy along binomial convolution. Electron. Commun. Probab. 2012, 17, 1–9.
4. Hillion, E., Johnson, O., A proof of the Shepp-Olkin entropy concavity conjecture. Bernoulli 2017, 23, 3638–3649.
5. Knessl, C., Integral representation and asymptotic expansions for Shannon and Renyi entropies. Appl. Math. Lett. 1998, 11, 69–74.
6. Adell, J.A., Lekuona, A., Yu, Y., Sharp bounds on the entropy of the Poisson law and related quantities. IEEE Trans. Inf. Theory 2010, 56, 2299–2306.
7. Melbourne, J., Tkocz, T., Reversals of Rényi entropy inequalities under log-concavity. IEEE Trans. Inf. Theory 2020.
8. Shepp, L.A., Olkin, I.,  Entropy of the Sum of Independent Bernoulli Random Variables and of the Multinomial Distribution, Contributions to Probability A Collection of Papers Dedicated to Eugene Lukacs; Academic Press: London, UK, 1981; pp. 201–206.
9. Hillion, E., Johnson, O., Discrete versions of the transport equation and the Shepp-Olkin conjecture. Ann. Probab. 2016, 44, 276–306.
10. Alzer, H., A refinement of the entropy inequality. Ann. Univ. Sci. Bp. 1995, 38, 13–18.
11. Chang, S.-C., Weldon, E.,  Coding for T-user multiple-access channels. IEEE Trans. Inf. Theory 1979, 25, 684–691.
12. Xu, D. Energy, Entropy and Information Potential for Neural Computation. Ph.D. Thesis, University of Florida, Gainesville, FL, USA , 1999.
13. Barar, A., Mocanu, G. R., Rasa, I., Bounds for some entropies and special functions. Carpathian J. Math. 2018, 34, 9–15.
14. Principe, J.C., Information Theoretic Learning: Rényi’s Entropy and Kernel Perspectives; Springer: New York, NY, USA, 2010.
15. Acu, A.M., Bascanbaz-Tunca, G.,  Rasa, I., Information potential for some probability density functions. Appl. Math. Comput. 2021, 389, 125578.
16. Acu, A.M., Bascanbaz-Tunca, G., Rasa, I., Bounds for indices of coincidence and entropies. submitted.
17. Rasa, I., Entropies and Heun functions associated with positive linear operators. Appl. Math. Comput. 2015, 268, 422–431.
18. Baskakov, V.A., An instance of a sequence of positive linear operators in the space of continuous functions. Doklady Akademii Nauk SSSR 1957, 113, 249–251.
19. Berdysheva, E., Studying Baskakov–Durrmeyer operators and quasi-interpolants via special functions. J. Approx. Theory 2007, 149, 131–150.
20. Heilmann, M., Erhöhung der Konvergenzgeschwindigkeit bei der Approximation von Funktionen mit Hilfe von Linearkombinationen spezieller positiver linearer Operatoren; Habilitationsschrift, Universität Dortmund: Dortmund, Germany, 1992.
21. Wagner, M., Quasi-Interpolanten zu genuinen Baskakov-Durrmeyer-Typ Operatoren; Shaker: Aachen, Germany, 2013.
22. Acu, A.M., Heilmann, M., Rasa, I., Linking Baskakov Type Operators, Constructive theory of functions, Sozopol 2019; Draganov, B., Ivanov, K., Nikolov, G., Uluchev, R., Eds.; Prof. Marin Drinov Publishing House of BAS: Sofia, Bulgaria, 2020; pp. 23–38.
23. Heilmann, M., Rasa, I., A nice representation for a link between Baskakov and Szász-Mirakjan-Durrmeyer operators and their Kantorovich variants. Results Math. 2019, 74, 9.
24. Rasa, I., Rényi entropy and Tsallis entropy associated with positive linear operators. arXiv 2014, arXiv:1412.4971v1.
25. Nikolov, G., Inequalities for ultraspherical polynomials. Proof of a conjecture of I. Rașa. J. Math. Anal. Appl. 2014, 418, 852–860.
26. Gavrea, I., Ivan, M., On a conjecture concerning the sum of the squared Bernstein polynomials. Appl. Math. Comput. 2014, 241, 70–74.
27. Alzer, H., Remarks on a convexity theorem of Rasa. Results Math. 2020, 75, 29.
28. Cover, T.M., Thomas, J.A., Elements of Information Theory; John Wiley & Sons: Hoboken, NJ, USA, 2006.
29. Abramowitz, M., Stegun, I.A., Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables; Dover Publications, Inc.: New York, NY, USA, 1970.
30. Barar, A., Mocanu, G., Rasa, I., Heun functions related to entropies. RACSAM 2019, 113, 819–830.
31. Rasa, I., Convexity properties of some entropies (II). Results Math. 2019, 74, 154.
32. Rasa, I., Convexity properties of some entropies. Results Math. 2018, 73, 105.
33. Cloud, M.J., Drachman, B.C., Inequalities: With Applications to Engineering; Springer: Berlin/Heidelberg, Germany, 2006.
34. Abel, U., Gawronski, W., Neuschel, T., Complete monotonicity and zeros of sums of squared Baskakov functions. Appl. Math. Comput. 2015, 258, 130–137.
35. Gould, H.W., Combinatorial Identities—A Standardized Set of Tables Listing 500 Binomial Coefficient Summations; West Virginia University Press: Morgantown, WV, USA, 1972.
36. Altomare, F., Campiti, M., Korovkin-Type Approximation Theory and Its Applications; Walter de Gruyter: Berlin, Germany; New York, NY, USA, 1994.

Inequalities for Information Potentials and Entropies

Ana Maria Acu, Lucian Blaga University of Sibiu, Department of Mathematics and Informatics, Str. Dr. I. Ratiu, No. 5-7, Sibiu RO-550012, Romania, anamaria.acu@ulbsibiu.ro
Alexandra Măduţa, Technical University of Cluj-Napoca, Department of Mathematics, 28 Memorandumului Street, 400114 Cluj-Napoca, Romania, boloca.alexandra91@yahoo.com
Diana Otrocol, Technical University of Cluj-Napoca, Department of Mathematics, 28 Memorandumului Street, 400114 Cluj-Napoca, Romania; Tiberiu Popoviciu Institute of Numerical Analysis, Romanian Academy, P.O. Box 68-1, 400110 Cluj-Napoca, Romania, Diana.Otrocol@math.utcluj.ro
Ioan Raşa, Technical University of Cluj-Napoca, Department of Mathematics, 28 Memorandumului Street, 400114 Cluj-Napoca, Romania, Ioan.Rasa@math.utcluj.ro
Abstract.

We consider a probability distribution \((p_{0}(x),p_{1}(x),\ldots)\) depending on a real parameter \(x\). The associated information potential is \(S(x):=\sum_{k}p_{k}^{2}(x)\). The Rényi entropy and the Tsallis entropy of order 2 can be expressed as \(R(x)=-\log S(x)\) and \(T(x)=1-S(x)\). We establish recurrence relations, inequalities and bounds for \(S(x)\), which lead immediately to similar relations, inequalities and bounds for the two entropies. We show that some sequences \((R_{n}(x))_{n\geq 0}\) and \((T_{n}(x))_{n\geq 0}\), associated with sequences of classical positive linear operators, are concave and increasing. Two conjectures are formulated, involving the information potentials associated with the Durrmeyer probability density and the Bleimann–Butzer–Hahn probability distribution, respectively.
MSC: 39B22, 39B62, 94A17, 26D07.
Keywords: probability distribution; Rényi entropy; Tsallis entropy; information potential; functional equations; inequalities.

1. Introduction

Entropies associated with discrete or continuous probability distributions are usually described by complicated explicit expressions, depending on one or several parameters. Therefore, it is useful to establish lower and upper bounds for them. Convexity-type properties are also useful: they embody valuable information on the behavior of the functions representing the entropies.

This is why bounds and convexity-type properties of entropies, expressed by inequalities, are under active study: see [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13] and the references therein. Our paper is concerned with such inequalities: we give new results, as well as new proofs or improvements of some existing results, in the framework presented below.

Let \((p_{0}(x),p_{1}(x),\ldots)\) be a probability distribution depending on a parameter \(x\in I\), where \(I\) is a real interval. The associated information potential (also called index of coincidence, for obvious probabilistic reasons) is defined by (see [14])

(1.1) \(S(x):=\sum_{k}p_{k}^{2}(x),\ x\in I.\)

If \(p(t,x)\), \(t\in\mathbb{R}\), \(x\in I\), is a probability density function depending on the parameter \(x\), the associated information potential is defined as (see [14])

(1.2) \(S(x):=\int_{\mathbb{R}}p^{2}(t,x)\,dt,\ x\in I.\)

The information potential is the core concept of the book [14], where the reader can find properties, extensions and generalizations of \(S(x)\), as well as applications to information theoretic learning. Other properties and applications can be found in the recent papers [15] and [16].

It is important to remark that the Rényi entropy and the Tsallis entropy of order 2 can be expressed in terms of \(S(x)\) as

(1.3) \(R(x)=-\log S(x),\quad T(x)=1-S(x),\ x\in I.\)

So the properties of \(S(x)\) lead immediately to properties of \(R(x)\) and \(T(x)\), respectively.
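As a concrete illustration of (1.1) and (1.3), the three quantities can be evaluated directly for a discrete distribution. The following sketch is plain Python for the binomial case; the helper names are our own choosing, not notation from the paper:

```python
import math

def binomial_probs(n, x):
    # Binomial distribution (p_0(x), ..., p_n(x)) with parameters n and x
    return [math.comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1)]

def information_potential(probs):
    # Eq. (1.1): S(x) = sum_k p_k(x)^2
    return sum(p * p for p in probs)

n, x = 10, 0.3
S = information_potential(binomial_probs(n, x))
R = -math.log(S)  # Renyi entropy of order 2, Eq. (1.3)
T = 1 - S         # Tsallis entropy of order 2, Eq. (1.3)
```

Since \(0<S<1\) here, the resulting \(R\) is positive and \(T\in(0,1)\).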

On the other hand, we can consider the discrete positive linear operators

(1.4) \(Lf(x):=\sum_{k}p_{k}(x)f(x_{k}),\ x\in I,\)

where \(x_{k}\) are given points in \(\mathbb{R}\), and the integral operators

(1.5) \(Mf(x)=\int_{\mathbb{R}}p(t,x)f(t)\,dt,\ x\in I.\)

In both cases, \(f\) is a function from a suitable set of functions defined on \(\mathbb{R}\). In this paper, we consider classical operators of this kind, which are used in approximation theory.

Let us mention that the “degree of nonmultiplicativity” of the operator \(L\) can be estimated in terms of the information potential \(S(x)\): see [17] and the references therein.

In this paper, we will be concerned with a special family of discrete probability distributions, described as follows.

Let \(c\in\mathbb{R}\). Set \(I_{c}=\left[0,-\frac{1}{c}\right]\) if \(c<0\), and \(I_{c}=[0,+\infty)\) if \(c\geq 0\). For \(\alpha\in\mathbb{R}\) and \(k\in\mathbb{N}_{0}\), the binomial coefficients are defined as usual by

\(\binom{\alpha}{k}:=\frac{\alpha(\alpha-1)\cdots(\alpha-k+1)}{k!}\ \text{if }k\in\mathbb{N},\ \text{and}\ \binom{\alpha}{0}:=1.\)

Let \(n>0\) be a real number, \(k\in\mathbb{N}_{0}\) and \(x\in I_{c}\). Define

(1.6) \(p_{n,k}^{[c]}(x):=(-1)^{k}\binom{-\frac{n}{c}}{k}(cx)^{k}(1+cx)^{-\frac{n}{c}-k},\ \text{if }c\neq 0,\)
(1.7) \(p_{n,k}^{[0]}(x):=\lim_{c\rightarrow 0}p_{n,k}^{[c]}(x)=\frac{(nx)^{k}}{k!}e^{-nx},\ \text{if }c=0.\)

Then \(\sum_{k=0}^{\infty}p_{n,k}^{[c]}(x)=1\). Suppose that \(n>c\) if \(c\geq 0\), or \(n=-cl\) with some \(l\in\mathbb{N}\) if \(c<0\).

With this notation, we consider the discrete probability distribution \((p_{n,k}^{[c]}(x))_{k=0,1,\ldots}\) depending on the parameter \(x\in I_{c}\).
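The family (1.6)–(1.7) is easy to evaluate numerically. The sketch below (plain Python, helper names are ours) computes \(p_{n,k}^{[c]}(x)\) for the three classical cases and checks that the probabilities sum to 1:

```python
from math import exp, factorial

def gen_binom(alpha, k):
    # Generalized binomial coefficient binom(alpha, k), as defined before Eq. (1.6)
    r = 1.0
    for i in range(k):
        r *= (alpha - i) / (i + 1)
    return r

def p(n, c, k, x):
    # Baskakov basis function p_{n,k}^{[c]}(x), Eqs. (1.6)-(1.7)
    if c == 0:
        return (n * x)**k / factorial(k) * exp(-n * x)
    return (-1)**k * gen_binom(-n / c, k) * (c * x)**k * (1 + c * x)**(-n / c - k)

# c = -1: binomial; c = 0: Poisson; c = 1: negative binomial (here n = 5, x = 0.4)
totals = {c: sum(p(5, c, k, 0.4) for k in range(150)) for c in (-1, 0, 1)}
```

For \(c=-1\) the sum is finite (terms with \(k>n\) vanish); for \(c=0\) and \(c=1\) the truncation at 150 terms is our choice and is far beyond numerical convergence here.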

According to (1.1), the associated information potential, or index of coincidence, is

(1.8) \(S_{n,c}(x):=\sum_{k=0}^{\infty}\left(p_{n,k}^{[c]}(x)\right)^{2},\ x\in I_{c}.\)

The Rényi entropy and the Tsallis entropy corresponding to the same probability distribution are defined, respectively, by (see (1.3))

(1.9) \(R_{n,c}(x)=-\log S_{n,c}(x)\)

and

(1.10) \(T_{n,c}(x)=1-S_{n,c}(x).\)

For \(c=-1\), (1.6) reduces to the binomial distribution and (1.8) becomes

(1.11) \(S_{n,-1}(x):=\sum_{k=0}^{n}\left(\binom{n}{k}x^{k}(1-x)^{n-k}\right)^{2},\ x\in[0,1].\)

The case \(c=0\) corresponds to the Poisson distribution (see (1.7)), for which

(1.12) \(S_{n,0}(x):=e^{-2nx}\sum_{k=0}^{\infty}\frac{(nx)^{2k}}{(k!)^{2}},\ x\geq 0.\)

For \(c=1\), we have the negative binomial distribution, with

(1.13) \(S_{n,1}(x):=\sum_{k=0}^{\infty}\left(\binom{n+k-1}{k}x^{k}(1+x)^{-n-k}\right)^{2},\ x\geq 0.\)

The binomial, Poisson and negative binomial distributions correspond to the classical Bernstein, Szász–Mirakyan and Baskakov operators from approximation theory, respectively; all of them are of the form (1.4). In fact, the distribution \((p_{n,k}^{[c]}(x))_{k=0,1,\ldots}\) is instrumental in the construction of the family of positive linear operators introduced by Baskakov in [18]; see also [19, 20, 21, 22, 23]. As a probability distribution, the family of functions \((p_{n,k}^{[c]})_{k=0,1,\ldots}\) was considered in [24, 17].

The distribution

(1.14) \(\left(\binom{n}{k}x^{k}(1+x)^{-n}\right)_{k=0,1,\ldots,n},\ x\in[0,+\infty),\)

corresponds to the Bleimann–Butzer–Hahn operators, while

(1.15) \(\left(\binom{n+k}{k}x^{k}(1-x)^{n+1}\right)_{k=0,1,\ldots},\ x\in[0,1),\)

is connected with the Meyer-König and Zeller operators.

The information potentials and the entropies associated with all these distributions were studied in [17]; see also [25, 26, 27]. It should be mentioned that they satisfy Heun-type differential equations: see [17]. We continue this study here. To keep the same notation as in [17], let us return to (1.11)–(1.13) and denote

\(F_{n}(x):=S_{n,-1}(x),\quad G_{n}(x):=S_{n,1}(x),\quad K_{n}(x):=S_{n,0}(x).\)

Moreover, the information potentials corresponding to (1.14) and (1.15) will be denoted by

(1.16) \(U_{n}(x):=\sum_{k=0}^{n}\left(\binom{n}{k}x^{k}(1+x)^{-n}\right)^{2},\ x\in[0,+\infty),\)
(1.17) \(J_{n}(x):=\sum_{k=0}^{\infty}\left(\binom{n+k}{k}x^{k}(1-x)^{n+1}\right)^{2},\ x\in[0,1).\)

In Section 2, we present several relations between the functions \(F_{n}(x)\), \(G_{n}(x)\), \(U_{n}(x)\), \(J_{n}(x)\), as well as between these functions and the Legendre polynomials. Using the three-term recurrence relation satisfied by the Legendre polynomials, we establish recurrence relations involving three consecutive terms of the sequences \((F_{n}(x))\), \((G_{n}(x))\), \((U_{n}(x))\) and \((J_{n}(x))\), respectively. We also recall some explicit expressions of these functions.

Section 3 is devoted to inequalities between consecutive terms of the above sequences; in particular, we emphasize that, for fixed \(x\), the four sequences are logarithmically convex and hence convex.

Other inequalities are presented in Section 4. All the inequalities can be used to get information about the Rényi entropies and Tsallis entropies connected with the corresponding probability distributions.

Section 5 contains new properties of the function \(U_{n}(x)\) and a problem concerning its shape.

Section 6 is devoted to some inequalities involving integrals of the form \(\int_{a}^{b}f'^{2}(x)\,dx\), in relation with certain combinatorial identities.

The information potential associated with the Durrmeyer probability density is computed in Section 7, where we also recall a conjecture formulated in [24].

As already mentioned, all the results involving the information potential can be used to derive results about the Rényi and Tsallis entropies. For the sake of brevity, we will usually study only the information potential.

Concerning applications of the Rényi and Tsallis entropies, see, e.g., [14, 28].

2. Recurrence Relations

\(F_{n}(x)\) is a polynomial, while \(G_{n}(x)\), \(U_{n}(x)\) and \(J_{n}(x)\) are rational functions. On their maximal domains, these functions are connected by several relations (see [17], Cor. 13, (46), (53), (54)):

(2.1) \(F_{n}(x)=(1-2x)^{2n+1}G_{n+1}(-x),\)
(2.2) \(F_{n}(x)=U_{n}\left(\frac{x}{1-x}\right),\)
(2.3) \(F_{n}(x)=-(1-2x)^{2n+1}J_{n}\left(\frac{x-1}{x}\right).\)

Consider the Legendre polynomial (see [29], 22.3.1)

(2.4) \(P_{n}(t)=2^{-n}\sum_{k=0}^{n}\binom{n}{k}^{2}(t+1)^{k}(t-1)^{n-k}.\)

Then (see [17], (39))

(2.5) \(P_{n}(t)=(1-2x)^{-n}F_{n}(x),\)

where

\(t=\frac{1-2x+2x^{2}}{1-2x},\quad x\in\left[0,\tfrac{1}{2}\right).\)

Combining (2.5) with (2.1), (2.2) and (2.3), we get

(2.6) \(P_{n}(t)=(1-2x)^{n+1}G_{n+1}(-x),\)
(2.7) \(P_{n}(t)=(1-2x)^{-n}U_{n}\left(\frac{x}{1-x}\right),\)
(2.8) \(P_{n}(t)=-(1-2x)^{n+1}J_{n}\left(\frac{x-1}{x}\right).\)

In the theory of special functions, recurrence relations play a crucial role. In particular, the Legendre polynomials (2.4) satisfy the important recurrence relation ([29], 22.7.1)

(2.9) \((n+1)P_{n+1}(t)-(2n+1)tP_{n}(t)+nP_{n-1}(t)=0.\)

This leads us to

Theorem 1.

The functions \(F_{n}(x)\), \(G_{n}(x)\), \(U_{n}(x)\) and \(J_{n}(x)\) satisfy the following three-term recurrence relations:

(2.10) \(2(n+1)F_{n+1}(x)=(2n+1)(1+(1-2x)^{2})F_{n}(x)-2n(1-2x)^{2}F_{n-1}(x),\)
(2.11) \(n(1+2x)^{2}G_{n+1}(x)=(2n-1)(1+2x+2x^{2})G_{n}(x)-(n-1)G_{n-1}(x),\)
(2.12) \((n+1)(1+t)^{2}U_{n+1}(t)=(2n+1)(t^{2}+1)U_{n}(t)-n(1-t)^{2}U_{n-1}(t),\)
(2.13) \((n+1)(1+t)^{2}J_{n+1}(t)=(2n+1)(t^{2}+1)J_{n}(t)-n(1-t)^{2}J_{n-1}(t).\)
Proof.

It suffices to use relations (2.5)–(2.9). ∎
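As a numerical sanity check, the recurrence (2.10) can be tested against the defining sum (1.11). The following plain-Python sketch (helper names are ours) verifies it at a few sample points:

```python
import math

def F(n, x):
    # Direct evaluation of F_n(x) = S_{n,-1}(x), Eq. (1.11)
    return sum((math.comb(n, k) * x**k * (1 - x)**(n - k))**2 for k in range(n + 1))

# Three-term recurrence (2.10):
# 2(n+1) F_{n+1} = (2n+1)(1 + (1-2x)^2) F_n - 2n (1-2x)^2 F_{n-1}
ok = all(
    math.isclose(2 * (n + 1) * F(n + 1, x),
                 (2 * n + 1) * (1 + (1 - 2 * x)**2) * F(n, x)
                 - 2 * n * (1 - 2 * x)**2 * F(n - 1, x),
                 rel_tol=1e-10)
    for n in range(1, 8) for x in (0.1, 0.3, 0.5, 0.7)
)
```

The same pattern applies to (2.11)–(2.13), with the direct sums (1.13), (1.16) and (1.17) in place of (1.11).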

Remark 2.

According to (2.12) and (2.13), \(U_{n}(x)\) and \(J_{n}(x)\) satisfy the same recurrence relation. In fact, from ([17], (49), (55)), we have

(2.14) \(U_{n}(x)=\sum_{k=0}^{n}c_{n,k}\left(\frac{1-x}{1+x}\right)^{2k},\)
(2.15) \(J_{n}(x)=\sum_{k=0}^{n}c_{n,k}\left(\frac{1-x}{1+x}\right)^{2k+1},\)

where

\(c_{n,k}:=4^{-n}\binom{2(n-k)}{n-k}\binom{2k}{k},\quad k=0,1,\ldots,n.\)

From (2.14) and (2.15), we see that

(2.16) \(J_{n}(x)=\frac{1-x}{1+x}U_{n}(x).\)
Remark 3.

From ([17], (56)) and ([30], (21)), we know that

(2.17) \(G_{n+1}(x)=\sum_{k=0}^{n}c_{n,k}(1+2x)^{-2k-1},\)
(2.18) \(F_{n}(x)=\sum_{k=0}^{n}c_{n,k}(1-2x)^{2k}.\)

So, the recurrence relations (2.10)–(2.13) are accompanied by

\(F_{0}(x)=1,\quad F_{1}(x)=1-2x+2x^{2};\)
\(G_{1}(x)=\frac{1}{2x+1},\quad G_{2}(x)=\frac{1+2x+2x^{2}}{(2x+1)^{3}};\)
\(U_{0}(x)=1,\quad U_{1}(x)=\frac{1+x^{2}}{(1+x)^{2}};\)
\(J_{0}(x)=\frac{1-x}{1+x},\quad J_{1}(x)=\frac{(1-x)(1+x^{2})}{(1+x)^{3}}.\)
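The expansion (2.14) and the identity (2.16) can be cross-checked numerically against the defining sums (1.16)–(1.17); a small sketch (the truncation level K is our choice, far beyond convergence for the sample point used):

```python
import math

def c_coef(n, k):
    # c_{n,k} = 4^{-n} C(2(n-k), n-k) C(2k, k)
    return math.comb(2 * (n - k), n - k) * math.comb(2 * k, k) / 4**n

def U(n, x):
    # U_n(x) by its definition (1.16)
    return sum((math.comb(n, k) * x**k * (1 + x)**(-n))**2 for k in range(n + 1))

def J(n, x, K=300):
    # J_n(x) by its definition (1.17), truncated at K terms
    return sum((math.comb(n + k, k) * x**k * (1 - x)**(n + 1))**2 for k in range(K))

def U_expansion(n, x):
    # U_n(x) via the expansion (2.14)
    q = (1 - x) / (1 + x)
    return sum(c_coef(n, k) * q**(2 * k) for k in range(n + 1))
```

For instance, at \(n=6\), \(x=0.4\), the two evaluations of \(U_6\) agree, and \(J_6(x)=\frac{1-x}{1+x}U_6(x)\) as in (2.16).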

3. Inequalities for Information Potentials

In studying a sequence of special functions, not only are recurrence relations important, but also inequalities connecting successive terms; in particular, inequalities showing that the sequence is (logarithmically) convex or concave. This section is devoted to such inequalities involving the sequences \((F_{n}(x))\), \((G_{n}(x))\), \((U_{n}(x))\) and \((J_{n}(x))\).

Theorem 4.

The function Fn(x)F_{n}(x) satisfies the inequalities

(3.1) \(F_{n+1}(x)\leq\frac{1+(4n-2)x(1-x)}{1+(4n+2)x(1-x)}F_{n-1}(x),\)
(3.2) \(F_{n}(x)\leq\frac{1+4nx(1-x)}{1+(4n+2)x(1-x)}F_{n-1}(x),\)
(3.3) \(F_{n}^{2}(x)\leq F_{n-1}(x)F_{n+1}(x);\quad 2F_{n}(x)\leq F_{n-1}(x)+F_{n+1}(x),\)

for all \(n\geq 1\), \(x\in[0,1]\).

Proof.

We start with the following integral representation (see [17], (29)):

(3.4) \(F_{n}(x)=\frac{1}{\pi}\int_{0}^{1}f^{n}(x,t)\frac{dt}{\sqrt{t(1-t)}},\)

where \(f(x,t):=t+(1-t)(1-2x)^{2}\in[0,1]\).

It follows that

\(F_{n+1}(x)\leq F_{n}(x).\)

On the other hand,

\(F_{n-1}(x)+F_{n+1}(x)-2F_{n}(x)=\frac{1}{\pi}\int_{0}^{1}f^{n-1}(x,t)\left[1+f^{2}(x,t)-2f(x,t)\right]\frac{dt}{\sqrt{t(1-t)}},\)

which entails

(3.5) \(2F_{n}(x)\leq F_{n-1}(x)+F_{n+1}(x).\)

According to (2.10), we have

(3.6) \(F_{n}(x)=a_{n}(x)F_{n-1}(x)+b_{n}(x)F_{n+1}(x),\)

where

\(a_{n}(x)=\frac{n(1-2x)^{2}}{(2n+1)(1-2x+2x^{2})},\quad b_{n}(x)=\frac{n+1}{(2n+1)(1-2x+2x^{2})}.\)

Using (3.5) and (3.6) we get

\(2a_{n}(x)F_{n-1}(x)+2b_{n}(x)F_{n+1}(x)\leq F_{n-1}(x)+F_{n+1}(x),\)

which yields

\((2b_{n}(x)-1)F_{n+1}(x)\leq(1-2a_{n}(x))F_{n-1}(x),\)

and this immediately leads to (3.1). To prove (3.2), it suffices to combine (3.5) and (3.1). The inequalities (3.3) were proven in ([31], (3.2) and (3.3)). ∎
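A quick numerical check of (3.1)–(3.3) against the defining sum (1.11); the tolerance eps guards against rounding and is our choice:

```python
import math

def F(n, x):
    # F_n(x), Eq. (1.11)
    return sum((math.comb(n, k) * x**k * (1 - x)**(n - k))**2 for k in range(n + 1))

eps = 1e-12
checks = []
for n in range(1, 8):
    for x in (0.05, 0.25, 0.5, 0.75, 0.95):
        u = x * (1 - x)
        # (3.1), (3.2): ratio bounds between consecutive terms
        checks.append(F(n + 1, x) <= (1 + (4*n - 2)*u) / (1 + (4*n + 2)*u) * F(n - 1, x) + eps)
        checks.append(F(n, x) <= (1 + 4*n*u) / (1 + (4*n + 2)*u) * F(n - 1, x) + eps)
        # (3.3): logarithmic convexity and convexity in n
        checks.append(F(n, x)**2 <= F(n - 1, x) * F(n + 1, x) + eps)
        checks.append(2 * F(n, x) <= F(n - 1, x) + F(n + 1, x) + eps)
ok = all(checks)
```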

Combining (3.2) with (1.9) and (1.10), we obtain

Corollary 5.

The Rényi entropy \(R_{n}(x)\) and the Tsallis entropy \(T_{n}(x)\) corresponding to the binomial distribution with parameters \(n\) and \(x\) satisfy the inequalities:

(3.7) \(R_{n}(x)-R_{n-1}(x)\geq\log\frac{1+(4n+2)x(1-x)}{1+4nx(1-x)}\geq 0,\)
(3.8) \(T_{n}(x)-T_{n-1}(x)\geq\frac{2x(1-x)}{1+4nx(1-x)}\left(1-T_{n}(x)\right)\geq 0.\)
Theorem 6.

The following inequalities hold:

(3.9) \(U_{n}^{2}\leq U_{n-1}U_{n+1},\quad 2U_{n}\leq U_{n-1}+U_{n+1},\)
(3.10) \(U_{n+1}(x)\leq\frac{1+4nx+x^{2}}{1+(4n+4)x+x^{2}}U_{n-1}(x),\)
(3.11) \(U_{n}(x)\leq\frac{1+(4n+2)x+x^{2}}{1+(4n+4)x+x^{2}}U_{n-1}(x),\)
(3.12) \(G_{n}^{2}\leq G_{n-1}G_{n+1},\quad 2G_{n}\leq G_{n-1}+G_{n+1},\)
(3.13) \(G_{n+1}(x)\leq\frac{1+(4n-2)x(1+x)}{1+(4n+2)x(1+x)}G_{n-1}(x),\)
(3.14) \(G_{n}(x)\leq\frac{1+4nx(x+1)}{1+(4n+2)x(x+1)}G_{n-1}(x),\)
(3.15) \(J_{n}^{2}\leq J_{n-1}J_{n+1},\quad 2J_{n}\leq J_{n-1}+J_{n+1},\)
(3.16) \(J_{n+1}(x)\leq\frac{1+4nx+x^{2}}{1+(4n+4)x+x^{2}}J_{n-1}(x),\)
(3.17) \(J_{n}(x)\leq\frac{1+(4n+2)x+x^{2}}{1+(4n+4)x+x^{2}}J_{n-1}(x).\)
Proof.

The proof is similar to that of Theorem 4, starting from the integral representations ([17], (48), (58), (63)):

\(U_{n}(x)=\frac{1}{\pi}\int_{0}^{1}\left(t+(1-t)\left(\frac{1-x}{1+x}\right)^{2}\right)^{n}\frac{dt}{\sqrt{t(1-t)}},\)
\(G_{n}(x)=\frac{1}{\pi}\int_{0}^{1}\left(t+(1-t)(1+2x)^{2}\right)^{-n}\frac{dt}{\sqrt{t(1-t)}},\)
\(J_{n}(x)=\frac{1}{\pi}\int_{0}^{1}\left(t+(1-t)\left(\frac{1+x}{1-x}\right)^{2}\right)^{-n-1}\frac{dt}{\sqrt{t(1-t)}}.\)

These integral representations, together with the representation of \(F_{n}(x)\) given by (3.4), are consequences of the important results of Elena Berdysheva ([19], Theorem 1). ∎

From (3.9)–(3.17), we can derive inequalities similar to (3.7) and (3.8) for the entropies associated with the probability distributions corresponding to \(U_{n}(x)\), \(G_{n}(x)\) and \(J_{n}(x)\).

Remark 7.

Let us remark that the inequalities (3.3), (3.9), (3.12) and (3.15) show that, for each \(x\), the sequences \((F_{n}(x))_{n\geq 0}\), \((U_{n}(x))_{n\geq 0}\), \((G_{n}(x))_{n\geq 0}\) and \((J_{n}(x))_{n\geq 0}\) are logarithmically convex, hence convex; the other inequalities from Theorems 4 and 6 show that the same sequences are decreasing. It immediately follows that the associated sequences of entropies \((R_{n}(x))_{n\geq 0}\) and \((T_{n}(x))_{n\geq 0}\) are concave and increasing; see also (3.7) and (3.8).
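The monotonicity and concavity of \((R_n(x))_{n\geq 0}\) can be observed numerically in the binomial case; a sketch at a fixed sample point (plain Python, helper names are ours):

```python
import math

def F(n, x):
    # F_n(x), Eq. (1.11)
    return sum((math.comb(n, k) * x**k * (1 - x)**(n - k))**2 for k in range(n + 1))

x = 0.3
R = [-math.log(F(n, x)) for n in range(12)]  # Renyi entropies R_n(x), Eq. (1.9)

increasing = all(R[n] <= R[n + 1] + 1e-12 for n in range(11))
# Discrete concavity: 2 R_n >= R_{n-1} + R_{n+1}
concave = all(2 * R[n] >= R[n - 1] + R[n + 1] - 1e-12 for n in range(1, 11))
```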

4. Other Inequalities

Besides their own interest, the next Theorems 8 and 12 will be instrumental in establishing new lower and upper bounds for the information potentials \((F_{n}(x))_{n\geq 0}\), \((U_{n}(x))_{n\geq 0}\), \((G_{n}(x))_{n\geq 0}\), \((J_{n}(x))_{n\geq 0}\), and consequently for the associated Rényi and Tsallis entropies.

Let us return to the information potential (1.8). According to ([17], (10)), we have, for \(c\neq 0\),

(4.1) \(S_{n,c}(x)=\frac{1}{\pi}\int_{0}^{1}\left[t+(1-t)(1+2cx)^{2}\right]^{-\frac{n}{c}}\frac{dt}{\sqrt{t(1-t)}}.\)

Let \(c<0\). Using (4.1) and Chebyshev’s inequality for synchronous functions, we can write

\(S_{n-c,c}(x)=\frac{1}{\pi}\int_{0}^{1}\left[t+(1-t)(1+2cx)^{2}\right]^{-\frac{n}{c}}\left[t+(1-t)(1+2cx)^{2}\right]\frac{dt}{\sqrt{t(1-t)}}\)
\(\geq\frac{1}{\pi}\int_{0}^{1}\left[t+(1-t)(1+2cx)^{2}\right]^{-\frac{n}{c}}\frac{dt}{\sqrt{t(1-t)}}\cdot\frac{1}{\pi}\int_{0}^{1}\left[t+(1-t)(1+2cx)^{2}\right]\frac{dt}{\sqrt{t(1-t)}}\)
\(=S_{n,c}(x)(1+2cx+2c^{2}x^{2}).\)

For \(c>0\), we use Chebyshev’s inequality for asynchronous functions and obtain the reverse inequality. So we have:

Theorem 8.

If \(c<0\), then

(4.2) \(S_{n-c,c}(x)\geq(1+2cx(1+cx))S_{n,c}(x).\)

If \(c>0\), the inequality is reversed.

Corollary 9.

For \(c=-1\), (4.2) and (3.2) yield

(4.3) \((1-2x(1-x))F_{n}(x)\leq F_{n+1}(x)\leq\frac{1+(4n+4)x(1-x)}{1+(4n+6)x(1-x)}F_{n}(x).\)

For \(c=1\), we obtain

(4.4) \(\frac{1}{1+2x(1+x)}G_{n}(x)\leq G_{n+1}(x)\leq\frac{1+(4n+4)x(1+x)}{1+(4n+6)x(1+x)}G_{n}(x).\)

Now, using [17, (48)], we have

\(U_{n+1}(x)=\frac{1}{\pi}\int_{0}^{1}\left(\left(\frac{1-x}{1+x}\right)^{2}+\frac{4x}{(1+x)^{2}}t\right)^{n}\left(\left(\frac{1-x}{1+x}\right)^{2}+\frac{4x}{(1+x)^{2}}t\right)\frac{dt}{\sqrt{t(1-t)}}\)
\(\geq U_{n}(x)\cdot\frac{1}{\pi}\int_{0}^{1}\left(\left(\frac{1-x}{1+x}\right)^{2}+\frac{4x}{(1+x)^{2}}t\right)\frac{dt}{\sqrt{t(1-t)}}=\frac{1+x^{2}}{(1+x)^{2}}U_{n}(x).\)

Therefore, using also (3.11), we get

(4.5) \(\frac{1+x^{2}}{(1+x)^{2}}U_{n}(x)\leq U_{n+1}(x)\leq\frac{1+(4n+6)x+x^{2}}{1+(4n+8)x+x^{2}}U_{n}(x).\)

Now (4.5) and (2.16) yield

(4.6) \(\frac{1+x^{2}}{(1+x)^{2}}J_{n}(x)\leq J_{n+1}(x)\leq\frac{1+(4n+6)x+x^{2}}{1+(4n+8)x+x^{2}}J_{n}(x).\)
Theorem 10.

The following inequalities are satisfied:

(4.7) \((1-2x(1-x))^{n}\leq F_{n}(x)\leq\sqrt{\frac{1+4x(1-x)}{1+(4n+4)x(1-x)}},\quad n\geq 0,\ x\in[0,1],\)
(4.8) \(\left(\frac{1}{1+2x(1+x)}\right)^{n-1}\leq(1+2x)G_{n}(x)\leq\sqrt{\frac{1+8x(1+x)}{1+(4n+4)x(1+x)}},\quad n\geq 1,\ x\geq 0,\)
(4.9) \(\left(\frac{1+x^{2}}{(1+x)^{2}}\right)^{n}\leq U_{n}(x)\leq\sqrt{\frac{1+6x+x^{2}}{1+(4n+6)x+x^{2}}},\quad n\geq 0,\ x\geq 0,\)
(4.10) \(\frac{1-x}{1+x}\left(\frac{1+x^{2}}{(1+x)^{2}}\right)^{n}\leq J_{n}(x)\leq\frac{1-x}{1+x}\sqrt{\frac{1+6x+x^{2}}{1+(4n+6)x+x^{2}}},\quad n\geq 0,\ x\in[0,1].\)
Proof.

Writing (4.3) for \(n=0,1,\ldots,m-1\) and multiplying term by term, we get

\((1-2x(1-x))^{m}\leq F_{m}(x)\leq\frac{1+4x(1-x)}{1+6x(1-x)}\cdot\frac{1+8x(1-x)}{1+10x(1-x)}\cdots\frac{1+4mx(1-x)}{1+(4m+2)x(1-x)}.\)

Using

\(\frac{1+tx(1-x)}{1+(t+2)x(1-x)}\leq\frac{1+(t+2)x(1-x)}{1+(t+4)x(1-x)},\quad t\geq 0,\)

it follows that

\(F_{m}^{2}(x)\leq\frac{1+4x(1-x)}{1+(4m+4)x(1-x)},\)

and so (4.7) is proven. The other three relations can be proved similarly, using (4.4), (4.5) and (4.6). ∎
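The two-sided bound (4.7) can be verified numerically against the defining sum (1.11); a plain-Python sketch (helper names and tolerances are ours):

```python
import math

def F(n, x):
    # F_n(x), Eq. (1.11)
    return sum((math.comb(n, k) * x**k * (1 - x)**(n - k))**2 for k in range(n + 1))

ok = True
for n in range(0, 12):
    for x in (0.0, 0.1, 0.3, 0.5, 0.8, 1.0):
        u = x * (1 - x)
        lower = (1 - 2 * u)**n                                # left side of (4.7)
        upper = math.sqrt((1 + 4 * u) / (1 + (4 * n + 4) * u))  # right side of (4.7)
        ok = ok and (lower - 1e-12 <= F(n, x) <= upper + 1e-12)
```

At the endpoints \(x=0\) and \(x=1\) both bounds collapse to \(F_n(x)=1\).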

Remark 11.

The inequalities (4.7)–(4.10) in Theorem 10 provide lower and upper bounds for the information potentials \(F_{n}\), \(G_{n}\), \(U_{n}\), \(J_{n}\), and consequently for the associated entropies. They can be compared with other bounds existing in the literature, obtained by other methods. For the moment, let us prove the inequality

(4.11) \((1-4x(1-x))^{n/2}\leq F_{n}(x),\quad n\geq 0,\ x\in[0,1],\)

and compare it with the first inequality in (4.7).

According to ([17], (4.6), (4.2)),

\(F_{n}(x)=\sum_{j=0}^{n}c_{n,j}(1-2x)^{2j},\)

where (see also (6.11) and (6.8))

\(c_{n,j}:=4^{-n}\binom{2j}{j}\binom{2n-2j}{n-j},\qquad\sum_{j=0}^{n}c_{n,j}=1,\qquad\sum_{j=0}^{n}jc_{n,j}=\frac{n}{2}.\)

Using the weighted arithmetic mean-geometric mean inequality, we have

\(F_{n}(x)\geq\prod_{j=0}^{n}(1-2x)^{2jc_{n,j}}=(1-2x)^{2\sum_{j=0}^{n}jc_{n,j}}=(1-4x(1-x))^{n/2},\)

and this is (4.11). Clearly, the first inequality in (4.7) provides a lower bound for \(F_{n}(x)\) which is better than the lower bound provided by (4.11).
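The comparison of the two lower bounds amounts to \((1-2u)^{2}=1-4u+4u^{2}\geq 1-4u\) with \(u=x(1-x)\in[0,\tfrac{1}{4}]\); a numerical confirmation of \((1-4u)^{n/2}\leq(1-2u)^{n}\):

```python
# Lower bound from (4.11) versus lower bound from (4.7):
# (1 - 4u)^(n/2) <= (1 - 2u)^n for u = x(1-x) in [0, 1/4]
ok = all(
    (1 - 4 * u)**(n / 2) <= (1 - 2 * u)**n + 1e-12
    for n in range(0, 10)
    for u in (x * (1 - x) for x in (0.0, 0.1, 0.2, 0.3, 0.4, 0.5))
)
```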

Theorem 12.

The information potential satisfies the following inequality for all c\in\mathbb{R}:

(4.12) S_{m+n,c}(x)\geq S_{m,c}(x)S_{n,c}(x).
Proof.

If c\neq 0, we can use (4.1) to get

S_{m+n,c}(x)=\frac{1}{\pi}\int_{0}^{1}\left[t+(1-t)(1+2cx)^{2}\right]^{-\frac{m}{c}}\left[t+(1-t)(1+2cx)^{2}\right]^{-\frac{n}{c}}\frac{dt}{\sqrt{t(1-t)}}.

Applying Chebyshev’s inequality for synchronous functions, we obtain

S_{m+n,c}(x)\geq\frac{1}{\pi}\int_{0}^{1}\left[t+(1-t)(1+2cx)^{2}\right]^{-\frac{m}{c}}\frac{dt}{\sqrt{t(1-t)}}\cdot\frac{1}{\pi}\int_{0}^{1}\left[t+(1-t)(1+2cx)^{2}\right]^{-\frac{n}{c}}\frac{dt}{\sqrt{t(1-t)}}=S_{m,c}(x)S_{n,c}(x).

For c=0, we have (see [17], (13))

S_{n,0}(x)=\frac{1}{\pi}\int_{-1}^{1}e^{-2nx(1+t)}\frac{dt}{\sqrt{1-t^{2}}}.

With the same Chebyshev inequality, one obtains (4.12). ∎

From Theorem 12, we derive

Corollary 13.

For the Rényi entropy R_{n,c}(x) and the Tsallis entropy T_{n,c}(x), we have

(4.13) R_{m+n,c}(x)\leq R_{m,c}(x)+R_{n,c}(x),
(4.14) T_{m+n,c}(x)\leq T_{m,c}(x)+T_{n,c}(x)-T_{m,c}(x)T_{n,c}(x).
Remark 14.

The inequalities (4.13) and (4.14) express the subadditivity of the sequences (R_{n}(x))_{n\geq 0} and (T_{n}(x))_{n\geq 0}.

Remark 15.

From (4.12) with c=-1, we obtain

(4.15) F_{m+n}(x)\geq F_{m}(x)F_{n}(x),\ x\in[0,1].

Here is a probabilistic proof of this inequality.

Let X_{m},X_{n},Y_{m},Y_{n} be independent binomial random variables with the same parameter x\in[0,1]. Then

F_{n}(x)=\sum_{k=0}^{n}P\left(X_{n}=Y_{n}=k\right)=P(X_{n}=Y_{n}),

and consequently

F_{m+n}(x)=P\left(X_{m+n}=Y_{m+n}\right)=P\left(X_{m}+X_{n}=Y_{m}+Y_{n}\right)\geq P\left(X_{m}=Y_{m}\text{ and }X_{n}=Y_{n}\right)=P\left(X_{m}=Y_{m}\right)P\left(X_{n}=Y_{n}\right)=F_{m}(x)F_{n}(x),

which proves (4.15). It would be useful to have purely probabilistic proofs of other inequalities in this framework; they would facilitate a deeper understanding of the interplay between analytic and probabilistic proofs and results.
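The convolution step used in this probabilistic argument can also be verified directly. The sketch below (illustrative only) checks that the law of X_{m+n} is the convolution of the laws of X_{m} and X_{n}, and the resulting inequality (4.15):

```python
from math import comb

def pmf(n, x, k):
    # binomial probability P(X_n = k)
    return comb(n, k) * x**k * (1 - x)**(n - k)

def F(n, x):
    # F_n(x) = P(X_n = Y_n) for independent X_n, Y_n ~ B(n, x)
    return sum(pmf(n, x, k) ** 2 for k in range(n + 1))

m, n = 3, 4
for i in range(11):
    x = i / 10
    # P(X_{m+n} = j) is the convolution of the laws of X_m and X_n
    conv = [sum(pmf(m, x, a) * pmf(n, x, j - a)
                for a in range(max(0, j - n), min(m, j) + 1))
            for j in range(m + n + 1)]
    assert all(abs(c - pmf(m + n, x, j)) < 1e-12 for j, c in enumerate(conv))
    assert F(m + n, x) >= F(m, x) * F(n, x) - 1e-12   # inequality (4.15)
```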

Inequalities similar to (4.15) hold for G_{n}(x) (apply (4.12) with c=1) and for U_{n}(x) and J_{n}(x). Indeed, according to (2.2),

U_{m+n}\left(\frac{x}{1-x}\right)=F_{m+n}(x)\geq F_{m}(x)F_{n}(x)=U_{m}\left(\frac{x}{1-x}\right)U_{n}\left(\frac{x}{1-x}\right)

for all x\in[0,1), which implies

(4.16) U_{m+n}(t)\geq U_{m}(t)U_{n}(t),\ t\in[0,+\infty).

Multiplying (4.16) by \left(\dfrac{1-t}{1+t}\right)^{2} and using (2.16), we get

(4.17) J_{m+n}(t)\geq\dfrac{1-t}{1+t}J_{m+n}(t)\geq J_{m}(t)J_{n}(t),\ t\in[0,1).

Let us remark that the inequality (4.17) is stronger than the similar inequalities for F_{n}, G_{n}, and U_{n}.

Corollary 16.

For m\geq 0, n\geq 0, k\geq 0, we have

F_{m+kn}(x)\geq F_{m}(x)F_{k}^{n}(x),\ x\in[0,1].

In particular,

(4.18) F_{kn}(x)\geq F_{k}^{n}(x);\quad F_{n}(x)\geq F_{1}^{n}(x).
Proof.

Starting from (4.15), it suffices to use induction on n. ∎

Similar results hold for G_{n}(x), U_{n}(x), and J_{n}(x), but we omit the details. Let us remark, however, that F_{1}(x)=1-2x(1-x), so the second inequality in (4.18) coincides with the first inequality in (4.7).

Remark 17.

Convexity properties of the information potentials and the associated entropies were presented in [32, 31], but the hypothesis a_{n-k}=a_{k},\ k=0,1,\ldots,n, was inadvertently omitted in ([31], Conjecture 6.1).

5. More about U_{n}(t)

This section contains some additional properties of the function U_{n}, defined initially by (1.16). Using the simple relation (2.16) connecting J_{n} and U_{n}, one can easily derive new properties of the function J_{n} given by (1.17).

Theorem 18.
  • (i)

    U_{n} is decreasing on [0,1] and increasing on [1,\infty).

  • (ii)

    U_{n} is logarithmically convex on [0,1].

Proof.

It was proved (see [27, 31, 32]) that F_{n} is a logarithmically convex function on [0,1], i.e.,

(5.1) F_{n}^{\prime\prime}(x)F_{n}(x)-F_{n}^{\prime 2}(x)\geq 0,\ x\in[0,1].

Let x=\dfrac{t}{t+1},\ t\in[0,\infty), so that x\in[0,1). Then t=\dfrac{x}{1-x} and (2.2) shows that U_{n}(t)=F_{n}(x). Consequently,

(5.2) U_{n}^{\prime}(t)=F_{n}^{\prime}(x)\frac{dx}{dt}=F_{n}^{\prime}(x)(t+1)^{-2}.

It is known (see [17]) that F_{n}^{\prime}(x)\leq 0 for x\in[0,\frac{1}{2}] and F_{n}^{\prime}(x)\geq 0 for x\in[\frac{1}{2},1]. It follows that U_{n}^{\prime}(t)\leq 0 for t\in[0,1] and U_{n}^{\prime}(t)\geq 0 for t\in[1,\infty). This proves (i).

To prove (ii), let us remark that

U_{n}^{\prime\prime}(t)=F_{n}^{\prime\prime}(x)(t+1)^{-4}-2F_{n}^{\prime}(x)(t+1)^{-3}.

Combined with (5.2), this yields

U_{n}^{\prime\prime}(t)U_{n}(t)-U_{n}^{\prime 2}(t)=\left(F_{n}^{\prime\prime}(x)F_{n}(x)-F_{n}^{\prime 2}(x)\right)(t+1)^{-4}-2F_{n}^{\prime}(x)F_{n}(x)(t+1)^{-3}.

Using (5.1) and F_{n}^{\prime}(x)\leq 0 for x\in[0,\frac{1}{2}], we obtain

U_{n}^{\prime\prime}(t)U_{n}(t)-U_{n}^{\prime 2}(t)\geq 0,\ t\in[0,1],

which proves (ii). ∎
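Both statements of Theorem 18 are easy to test numerically via U_{n}(t)=F_{n}(t/(t+1)) (relation (2.2)), assuming the binomial form of F_{n}. A small illustrative check on grids, with log-convexity tested through the midpoint criterion U_{n}(s)U_{n}(t)\geq U_{n}^{2}((s+t)/2):

```python
from math import comb

def F(n, x):
    # F_n(x) = sum_k (C(n,k) x^k (1-x)^(n-k))^2
    return sum((comb(n, k) * x**k * (1 - x)**(n - k)) ** 2 for k in range(n + 1))

def U(n, t):
    # U_n(t) = F_n(t/(t+1)), cf. (2.2)
    return F(n, t / (t + 1))

n = 8
# (i): decreasing on [0,1], increasing on [1, infinity) (sampled up to t = 5)
dec = [U(n, i / 50) for i in range(51)]
inc = [U(n, 1 + i / 10) for i in range(41)]
assert all(a >= b - 1e-12 for a, b in zip(dec, dec[1:]))
assert all(a <= b + 1e-12 for a, b in zip(inc, inc[1:]))
# (ii): log-convexity on [0,1] via the midpoint criterion
grid = [i / 20 for i in range(21)]
for s in grid:
    for t in grid:
        assert U(n, s) * U(n, t) >= U(n, (s + t) / 2) ** 2 - 1e-12
```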

Remark 19.

U_{n}(0)=\lim_{t\rightarrow+\infty}U_{n}(t)=1 (see (2.14)). These equalities, Theorem 18, and graphical experiments (see Figure 1) suggest that U_{n} is convex on [0,t_{n}] and concave on [t_{n},+\infty) for a suitable t_{n}>1. It would be interesting to have a proof of this shape of U_{n}(t) and to find the value of t_{n}.

Figure 1. Graphics of U_{n} for n=10,20,30,40.

In order to compute U_{n}(t), we have the explicit expressions (1.16) and (2.14), and the three-term recurrence relation (2.12). In what follows, we provide a two-term recurrence relation.

According to ([31], (2.3)),

x(1-x)F_{n}^{\prime}(x)=n(1-2x)(F_{n}(x)-F_{n-1}(x)),\ n\geq 1,\ x\in[0,1].

Setting again x=\dfrac{t}{t+1},\ t\in[0,\infty), and U_{n}(t)=F_{n}(x), we obtain, after some computation,

(5.3) t(t+1)U_{n}^{\prime}(t)=n(1-t)\left(U_{n}(t)-U_{n-1}(t)\right),\ t\in[0,\infty).

Multiplying (5.3) by (t+1)^{2n-1}/t^{n+1}, we obtain

(5.4) \left(\frac{(t+1)^{2n}}{t^{n}}U_{n}(t)\right)^{\prime}=\left(\frac{(t+1)^{2n}}{t^{n}}\right)^{\prime}U_{n-1}(t),\ t\geq 1.

Let s(t):=\dfrac{(t+1)^{2}}{t},\ t\geq 1, and

V_{n}(t):=s^{n}(t)U_{n}(t),\ t\geq 1.
Theorem 20.

The sequence (V_{n}(t))_{n\geq 0} satisfies the recurrence relation

(5.5) V_{n}(t)=\binom{2n}{n}+n\int_{1}^{t}\frac{x^{2}-1}{x^{2}}V_{n-1}(x)\,dx,\ n\geq 1,

with V_{0}(t)=1,\ t\geq 1.

Proof.

From (5.4), we obtain

\left(s^{n}(x)U_{n}(x)\right)^{\prime}=\left(s^{n}(x)\right)^{\prime}U_{n-1}(x),\ x\geq 1,

i.e.,

V_{n}^{\prime}(x)=ns^{n-1}(x)s^{\prime}(x)\frac{V_{n-1}(x)}{s^{n-1}(x)}.

This reduces to

V_{n}^{\prime}(x)=ns^{\prime}(x)V_{n-1}(x),

and therefore

(5.6) V_{n}(t)-V_{n}(1)=n\int_{1}^{t}\frac{x^{2}-1}{x^{2}}V_{n-1}(x)\,dx.

Now V_{n}(1)=s^{n}(1)U_{n}(1), and (2.14) shows that V_{n}(1)=4^{n}\cdot\frac{1}{4^{n}}\binom{2n}{n}=\binom{2n}{n}. Together with (5.6), this proves (5.5). We also have V_{0}(t)=U_{0}(t)=1 (see Remark 3). ∎

Example 21.

From (5.5), we deduce

V_{1}(t)=2+\int_{1}^{t}\left(1-\frac{1}{x^{2}}\right)dx=t+\frac{1}{t},

and consequently

U_{1}(t)=\frac{V_{1}(t)}{s(t)}=\frac{t^{2}+1}{(t+1)^{2}}.

Moreover,

V_{2}(t)=6+2\int_{1}^{t}\left(1-\frac{1}{x^{2}}\right)\left(x+\frac{1}{x}\right)dx=4+t^{2}+\frac{1}{t^{2}},

i.e., U_{2}(t)=\dfrac{t^{4}+4t^{2}+1}{(t+1)^{4}}, and so on.
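The closed forms obtained in Example 21 can be compared against the direct definition U_{n}(t)=F_{n}(t/(t+1)); a short illustrative check, assuming the binomial form of F_{n}:

```python
from math import comb

def F(n, x):
    # F_n(x) = sum_k (C(n,k) x^k (1-x)^(n-k))^2
    return sum((comb(n, k) * x**k * (1 - x)**(n - k)) ** 2 for k in range(n + 1))

def U(n, t):
    # U_n(t) = F_n(t/(t+1)), cf. (2.2)
    return F(n, t / (t + 1))

# closed forms from Example 21 versus the direct definition
for t in (1.0, 1.5, 2.0, 5.0):
    assert abs(U(1, t) - (t**2 + 1) / (t + 1) ** 2) < 1e-12
    assert abs(U(2, t) - (t**4 + 4 * t**2 + 1) / (t + 1) ** 4) < 1e-12
```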

Remark 22.

A recurrence relation similar to (5.5), defining a sequence of Appell polynomials, was instrumental in ([31], Section 5) for studying the function F_{n}.

Remark 23.

According to (4.9), \lim_{n\rightarrow+\infty}U_{n}(x)=0 for x>0, i.e., the sequence of functions (U_{n})_{n\geq 0} converges pointwise to zero on (0,\infty). The convergence is not uniform, because U_{n}(0)=\lim_{x\rightarrow\infty}U_{n}(x)=1 for all n\geq 0.

6. Inequalities for the Integral of the Squared Derivative

Integrals of the form \int_{a}^{b}f^{\prime 2}(x)\,dx are important for several applications; see, e.g., ([33], Section 3.10). In this section, we present bounds for such integrals using the logarithmic convexity of the functions F_{n}, G_{n}, K_{n}. The results involve some combinatorial identities.

Theorem 24.

The following inequalities are valid for n=0,1,\ldots:

(6.1) \int_{0}^{1}F_{n}^{\prime 2}(x)\,dx\leq 2n,
(6.2) \int_{0}^{\infty}G_{n+1}^{\prime 2}(x)\,dx\leq n+1,
(6.3) \int_{0}^{\infty}K_{n}^{\prime 2}(x)\,dx\leq n.
Proof.

Let us return to (5.1). It gives \int_{0}^{1}F_{n}^{\prime 2}(x)\,dx\leq\int_{0}^{1}F_{n}^{\prime\prime}(x)F_{n}(x)\,dx; integrating by parts on the right-hand side, we obtain

(6.4) \int_{0}^{1}F_{n}^{\prime 2}(x)\,dx\leq\frac{1}{2}\left(F_{n}^{\prime}(1)F_{n}(1)-F_{n}^{\prime}(0)F_{n}(0)\right).

Recalling that F_{n}(x)=S_{n,-1}(x) and using (1.11), we obtain

(6.5) F_{n}^{\prime}(0)=-2n,\quad F_{n}^{\prime}(1)=2n,\quad F_{n}(0)=F_{n}(1)=1.

Now (6.1) is a consequence of (6.4) and (6.5).

The logarithmic convexity of the functions G_{n+1} and K_{n} on [0,\infty) was proved in [34]. Using G_{n}(x)=S_{n,1}(x) and (1.13), it is easy to derive

(6.6) G_{n+1}(0)=1,\quad G_{n+1}^{\prime}(0)=-2(n+1),
(6.7) \lim_{x\rightarrow\infty}G_{n+1}(x)=\lim_{x\rightarrow\infty}G_{n+1}^{\prime}(x)=0,

and from

\int_{0}^{\infty}G_{n+1}^{\prime 2}(x)\,dx\leq\frac{1}{2}\left(G_{n+1}(\infty)G_{n+1}^{\prime}(\infty)-G_{n+1}(0)G_{n+1}^{\prime}(0)\right),

combined with (6.6) and (6.7), we obtain (6.2).

The proof of (6.3) is similar and we omit it. ∎
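Inequality (6.1) can be illustrated numerically, assuming the binomial form of F_{n}: the derivative F_{n}^{\prime}=2\sum_{k}p_{k}p_{k}^{\prime} is computed analytically and the integral is approximated by the trapezoidal rule.

```python
from math import comb

def dp(n, k, x):
    # derivative of p_k(x) = C(n,k) x^k (1-x)^(n-k)
    t1 = k * x ** (k - 1) * (1 - x) ** (n - k) if k > 0 else 0.0
    t2 = (n - k) * x ** k * (1 - x) ** (n - k - 1) if n - k > 0 else 0.0
    return comb(n, k) * (t1 - t2)

def dF(n, x):
    # F_n'(x) = 2 sum_k p_k(x) p_k'(x)
    return 2 * sum(comb(n, k) * x**k * (1 - x)**(n - k) * dp(n, k, x)
                   for k in range(n + 1))

N = 2000  # trapezoidal rule with step 1/N
for n in (2, 3, 5):
    ys = [dF(n, i / N) ** 2 for i in range(N + 1)]
    integral = (sum(ys) - ys[0] / 2 - ys[-1] / 2) / N
    assert integral <= 2 * n  # inequality (6.1)
```

For these values of n the computed integrals stay visibly below the bound 2n, in line with the strict logarithmic convexity of F_{n}.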

Remark 25.

If we compute F_{n}^{\prime}(1) starting from (2.18), we obtain

F_{n}^{\prime}(1)=4\sum_{k=1}^{n}kc_{n,k}.

Combined with (6.5), this yields

(6.8) \sum_{k=1}^{n}k\binom{2(n-k)}{n-k}\binom{2k}{k}=\frac{n}{2}\,4^{n}.

On the other hand, (2.17) leads to

(6.9) G_{n+1}^{\prime}(0)=-2\sum_{k=0}^{n}(2k+1)c_{n,k}.

Now (6.6) and (6.9) produce

(6.10) \sum_{k=0}^{n}(2k+1)\binom{2(n-k)}{n-k}\binom{2k}{k}=(n+1)4^{n}.

From (6.8) and (6.10), we obtain

(6.11) \sum_{k=0}^{n}\binom{2(n-k)}{n-k}\binom{2k}{k}=4^{n},

which is (3.90) in Gould [35].
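The identities (6.8), (6.10), and (6.11) are finite sums of binomial coefficients, so they can be verified exactly for small n:

```python
from math import comb

# exact verification of the combinatorial identities (6.8), (6.10), (6.11)
for n in range(12):
    a = [comb(2 * (n - k), n - k) * comb(2 * k, k) for k in range(n + 1)]
    assert sum(a) == 4 ** n                                        # (6.11)
    assert 2 * sum(k * a[k] for k in range(n + 1)) == n * 4 ** n   # (6.8)
    assert sum((2 * k + 1) * a[k]
               for k in range(n + 1)) == (n + 1) * 4 ** n          # (6.10)
```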

7. Information Potential for the Durrmeyer Density of Probability

Consider the Durrmeyer operators

L_{n}f(x):=\int_{0}^{1}(n+1)\left(\sum_{i=0}^{n}b_{ni}(x)b_{ni}(t)\right)f(t)\,dt,

f\in C[0,1],\ x\in[0,1]; see, e.g., [36]. They are of the form (1.5).

Here, b_{ni}(x)=\binom{n}{i}x^{i}(1-x)^{n-i}. The kernel is

K_{n}(x,t):=(n+1)\sum_{i=0}^{n}b_{ni}(x)b_{ni}(t),

and according to (1.2), the associated information potential is

S_{n}(x):=(n+1)^{2}\int_{0}^{1}\left(\sum_{i,j=0}^{n}b_{ni}(x)b_{nj}(x)b_{ni}(t)b_{nj}(t)\right)dt
=(n+1)^{2}\sum_{i,j=0}^{n}b_{ni}(x)b_{nj}(x)\binom{n}{i}\binom{n}{j}\binom{2n}{i+j}^{-1}\int_{0}^{1}b_{2n,i+j}(t)\,dt
=\frac{(n+1)^{2}}{2n+1}\sum_{i,j=0}^{n}\binom{n}{i}^{2}\binom{n}{j}^{2}\binom{2n}{i+j}^{-1}x^{i+j}(1-x)^{2n-i-j}
=\frac{(n+1)^{2}}{2n+1}\sum_{i,j=0}^{n}\left(\binom{n}{i}\binom{n}{j}\binom{2n}{i+j}^{-1}\right)^{2}b_{2n,i+j}(x).

Setting i+j=k, we obtain

S_{n}(x)=\sum_{k=0}^{2n}q_{n,k}b_{2n,k}(x),\ x\in[0,1],

where

q_{n,k}:=\frac{(n+1)^{2}}{2n+1}\binom{2n}{k}^{-2}\sum_{l=0}^{k}\binom{n}{l}^{2}\binom{n}{k-l}^{2}=(n+1)\binom{2n}{n}^{-1}\binom{2n+1}{n}^{-1}\sum_{l=0}^{k}\binom{k}{l}^{2}\binom{2n-k}{n-l}^{2},\quad k=0,1,\ldots,2n,

with the convention \binom{n}{m}=0 if m>n.

It is easy to see that q_{n,2n-k}=q_{n,k},\ k=0,1,\ldots,2n.

We recall here Conjecture 4.6 from [24].

Conjecture 26.

([24]) The sequence (q_{n,k})_{k=0,1,\ldots,2n} is convex and, consequently, the function S_{n} is convex on [0,1].

The following numerical and graphical experiments support this conjecture (see Table 1 and Figure 2).

Table 1. Values of the coefficients q_{n,k}.

𝒏𝒌\boldsymbol{nk} 0 1 2 3 4 5 6 7 8 9 10 11 12
11 43\dfrac{4}{3} 23\dfrac{2}{3} 43\dfrac{4}{3}
22 95\dfrac{9}{5} 910\dfrac{9}{10} 910\dfrac{9}{10} 910\dfrac{9}{10} 95\dfrac{9}{5}
33 167\dfrac{16}{7} 87\dfrac{8}{7} 176175\dfrac{176}{175} 164175\dfrac{164}{175} 176175\dfrac{176}{175} 87\dfrac{8}{7} 167\dfrac{16}{7}
44 259\dfrac{25}{9} 2518\dfrac{25}{18} 1025882\dfrac{1025}{882} 925882\dfrac{925}{882} 905882\dfrac{905}{882} 925882\dfrac{925}{882} 1025882\dfrac{1025}{882} 2518\dfrac{25}{18} 259\dfrac{25}{9}
55 3611\dfrac{36}{11} 1811\dfrac{18}{11} 43\dfrac{4}{3} 1311\dfrac{13}{11} 8677\dfrac{86}{77} 2321\dfrac{23}{21} 8677\dfrac{86}{77} 1311\dfrac{13}{11} 43\dfrac{4}{3} 1811\dfrac{18}{11} 3611\dfrac{36}{11}
66 4913\dfrac{49}{13} 4926\dfrac{49}{26} 47533146\dfrac{4753}{3146} 41653146\dfrac{4165}{3146} 1739514157\dfrac{17395}{14157} 6678756628\dfrac{66787}{56628} 73296292\dfrac{7329}{6292} 6678756628\dfrac{66787}{56628} 1739514157\dfrac{17395}{14157} 41653146\dfrac{4165}{3146} 47533146\dfrac{4753}{3146} 4926\dfrac{49}{26} 4913\dfrac{49}{13}
Figure 2. Graphics of S_{n} for n=1,2,3,4,5,6.
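The entries of Table 1, the symmetry q_{n,2n-k}=q_{n,k}, and the convexity asserted by Conjecture 26 can be verified exactly for small n with rational arithmetic. This confirms the conjecture only for n\leq 6; it is not a proof.

```python
from math import comb
from fractions import Fraction

def q(n, k):
    # first formula for q_{n,k}; Fraction keeps the values exact
    c = Fraction((n + 1) ** 2, 2 * n + 1) / comb(2 * n, k) ** 2
    return c * sum(comb(n, l) ** 2 * comb(n, k - l) ** 2
                   for l in range(max(0, k - n), min(k, n) + 1))

# n = 1 row of Table 1
assert [q(1, k) for k in range(3)] == [Fraction(4, 3), Fraction(2, 3), Fraction(4, 3)]

for n in range(1, 7):
    row = [q(n, k) for k in range(2 * n + 1)]
    assert row == row[::-1]                   # symmetry q_{n,2n-k} = q_{n,k}
    # convexity of the sequence, as asserted by Conjecture 26
    assert all(row[k - 1] + row[k + 1] >= 2 * row[k] for k in range(1, 2 * n))
```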

8. Concluding Remarks and Future Work

Bounds and convexity-type properties of entropies are important and useful, especially when the entropies are expressed as complicated functions of one or several variables. Such bounds and convexity properties are stated in terms of inequalities; their study is therefore a branch of the theory of inequalities, an area of active research. Our paper contains some contributions in this framework. We have obtained analytic inequalities, by analytic methods, involving certain information potentials and their associated Rényi and Tsallis entropies. The probabilistic flavor is underlined by the purely probabilistic proof of the inequality (4.15). Finding such probabilistic proofs for other inequalities in this context will be a topic for future research. For example, is there a purely probabilistic proof of the subadditivity property (4.13) of the Rényi entropy R_{n,c}(x)?

The area in which our results can be placed is delineated by the papers [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13] and the references therein: the titles are expressive by themselves.

Basically, we are concerned with the family of probability distributions (p_{n,k}^{[c]}(x))_{k=0,1,\ldots}, strongly related to the family of generalized Baskakov positive linear operators. The interplay between the theory of positive linear operators and probability theory is still a rich source of important results. Besides the binomial, Poisson, and negative binomial distributions (corresponding, respectively, to c=-1, c=0, c=1, and associated with the Bernstein, Szász-Mirakyan, and classical Baskakov operators), we consider in our paper, from an analytic point of view, the distributions associated with the Bleimann–Butzer–Hahn, Meyer-König and Zeller, and Durrmeyer operators. Their study from a probabilistic perspective is deferred to a future paper. Another possible direction of further research is to investigate with our methods the distributions associated with other classical or more recent sequences of positive linear operators.

The information potentials F_{n}, G_{n}, U_{n}, and J_{n} have strong relations with the Legendre polynomials. Quite naturally, the recurrence relations satisfied by these polynomials yield similar relations for the information potentials. It should be mentioned that the differential equation characterizing the Legendre polynomials was used in [17] to show that F_{n}, G_{n}, U_{n}, and J_{n} satisfy Heun-type differential equations, and consequently to obtain bounds for them. Other bounds are obtained in this paper, starting from the important integral representations given in [19]. They can be compared with other bounds from the literature, and this is another possible topic for further research.

For a fixed n, the convexity and even the logarithmic convexity of the function F_{n}(x) were established in [17, 31, 32, 27, 34]. In this paper, we prove that for a fixed x, the sequence (F_{n}(x))_{n\geq 0} is logarithmically convex. Similar results hold for the other information potentials, and they have consequences concerning the associated entropies. We think that this direction of research can be continued and developed.

Two conjectures, accompanied by graphical experiments supporting them, are mentioned in our paper.

References

  • [1] Harremoës, P.; Topsoe, F. Inequalities between entropy and index of coincidence derived from information diagrams. IEEE Trans. Inf. Theory 2001, 47, 2944–2960.
  • [2] Harremoës, P. Binomial and Poisson distribution as maximum entropy distributions. IEEE Trans. Inf. Theory 2001, 47, 2039–2041.
  • [3] Hillion, E. Concavity of entropy along binomial convolution. Electron. Commun. Probab. 2012, 17, 1–9.
  • [4] Hillion, E.; Johnson, O. A proof of the Shepp-Olkin entropy concavity conjecture. Bernoulli 2017, 23, 3638–3649.
  • [5] Knessl, C. Integral representation and asymptotic expansions for Shannon and Renyi entropies. Appl. Math. Lett. 1998, 11, 69–74.
  • [6] Adell, J.A.; Lekuona, A.; Yu, Y. Sharp bounds on the entropy of the Poisson law and related quantities. IEEE Trans. Inf. Theory 2010, 56, 2299–2306.
  • [7] Melbourne, J.; Tkocz, T. Reversals of Rényi entropy inequalities under log-concavity. IEEE Trans. Inf. Theory 2020, doi: 10.1109/TIT.2020.3024025.
  • [8] Shepp, L.A.; Olkin, I. Entropy of the Sum of Independent Bernoulli Random Variables and of the Multinomial Distribution, Contributions to Probability A Collection of Papers Dedicated to Eugene Lukacs; 1981; pp. 201–206.
  • [9] Hillion, E.; Johnson, O. Discrete versions of the transport equation and the Shepp-Olkin conjecture. Ann. Probab. 2016, 44, 276–306.
  • [10] Alzer, H. A refinement of the entropy inequality. Ann. Univ. Sci. Bp. 1995, 38, 13–18.
  • [11] Chang, S.-C.; Weldon, E. Coding for T-user multiple-access channels. IEEE Trans. Inf. Theory 1979, 25, 684–691.
  • [12] Xu, D. Energy, Entropy and Information Potential for Neural Computation. Ph.D. Thesis, University of Florida, Gainesville, FL, USA, 1999.
  • [13] Bărar, A.; Mocanu, G. R.; Raşa, I. Bounds for some entropies and special functions. Carpathian J. Math. 2018, 34, 9–15.
  • [14] Principe, J.C. Information Theoretic Learning: Rényi’s Entropy and Kernel Perspectives; Springer: Berlin/Heidelberg, Germany, 2010.
  • [15] Acu, A.M.; Başcanbaz-Tunca, G.; Raşa, I. Information potential for some probability density functions. Appl. Math. Comput. 2021, 389, 125578.
  • [16] Acu, A.M.; Başcanbaz-Tunca, G.; Raşa, I. Bounds for indices of coincidence and entropies. submitted.
  • [17] Raşa, I. Entropies and Heun functions associated with positive linear operators. Appl. Math. Comput. 2015, 268, 422–431.
  • [18] Baskakov, V.A. An instance of a sequence of positive linear operators in the space of continuous functions. Doklady Akademii Nauk SSSR 1957, 113, 249–251.
  • [19] Berdysheva, E. Studying Baskakov–Durrmeyer operators and quasi-interpolants via special functions. J. Approx. Theory 2007, 149, 131–150.
  • [20] Heilmann, M. Erhöhung der Konvergenzgeschwindigkeit bei der Approximation von Funktionen mit Hilfe von Linearkombinationen Spezieller Positiver Linearer Operatoren. Habilitationsschrift, Universität Dortmund, Dortmund, Germany, 1992.
  • [21] Wagner, M. Quasi-Interpolaten zu genuinen Baskakov-Durrmeyer-Typ Operatoren; Shaker: Aachen, Germany, 2013.
  • [22] Acu, A.M.; Heilmann, M.; Rasa, I. Linking Baskakov Type Operators, Constructive theory of functions, Sozopol 2019; Prof. Marin Drinov Academic Publishing House: Sofia, 2020, in press.
  • [23] Heilmann, M.; Raşa, I. A nice representation for a link between Baskakov and Szász-Mirakjan-Durrmeyer operators and their Kantorovich variants. Results Math. 2019, 74, 9.
  • [24] Raşa, I. Rényi entropy and Tsallis entropy associated with positive linear operators. arXiv 2014, arXiv:1412.4971v1.
  • [25] Nikolov, G. Inequalities for ultraspherical polynomials. Proof of a conjecture of I. Raşa. J. Math. Anal. Appl. 2014, 418, 852–860.
  • [26] Gavrea, I.; Ivan, M. On a conjecture concerning the sum of the squared Bernstein polynomials. Appl. Math. Comput. 2014, 241, 70–74.
  • [27] Alzer, H. Remarks on a convexity theorem of Raşa. Results Math. 2020, 75, 29.
  • [28] Cover, T.M.; Thomas, J.A. Elements of Information Theory; John Wiley & Sons: Hoboken, NJ, USA, 2006.
  • [29] Abramowitz, M.; Stegun, I.A. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables; Dover Publications, Inc.: New York, NY, USA, 1970.
  • [30] Bărar, A.; Mocanu, G.; Raşa, I. Heun functions related to entropies. RACSAM 2019, 113, 819–830.
  • [31] Raşa, I. Convexity properties of some entropies (II). Results Math. 2019, 74, 154.
  • [32] Raşa, I. Convexity properties of some entropies. Results Math. 2018, 73, 105.
  • [33] Cloud, M.J.; Drachman, B.C. Inequalities: With Applications to Engineering; Springer: Berlin/Heidelberg, Germany, 2006.
  • [34] Abel, U.; Gawronski, W.; Neuschel, T. Complete monotonicity and zeros of sums of squared Baskakov functions. Appl. Math. Comput. 2015, 258, 130–137.
  • [35] Gould, H.W. Combinatorial Identities—A Standardized Set of Tables Listing 500 Binomial Coefficient Summations; Morgantown, WV, USA, 1972.
  • [36] Altomare, F.; Campiti, M. Korovkin-Type Approximation Theory and Its Applications; Walter de Gruyter: Berlin, Germany; New York, NY, USA, 1994.