Inequalities for Information Potentials and Entropies

Abstract

We consider a probability distribution \((p_0(x), p_1(x), \ldots)\) depending on a real parameter \(x\). The associated information potential is \(S(x):=∑_{k}p_{k}^2(x)\). The Rényi entropy and the Tsallis entropy of order \(2\) can be expressed as \(R(x) = − \log S(x)\) and \(T(x) = 1 − S(x)\). We establish recurrence relations, inequalities and bounds for \(S(x)\), which lead immediately to similar relations, inequalities and bounds for the two entropies. We show that some sequences \((R_n(x))_{n≥0}\) and \((T_n(x))_{n≥0}\), associated with sequences of classical positive linear operators, are concave and increasing. Two conjectures are formulated, involving the information potentials associated with the Durrmeyer probability density and with the Bleimann–Butzer–Hahn probability distribution, respectively.
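To make the definitions concrete, here is a minimal numerical sketch (not from the paper) that evaluates the information potential \(S(x)\) and the order-2 Rényi and Tsallis entropies for the binomial (Bernstein) distribution \(p_k(x) = \binom{n}{k} x^k (1-x)^{n-k}\), one of the classical distributions to which such operators are attached; the function names are illustrative.

```python
import math

def binomial_probs(n, x):
    # p_k(x) = C(n, k) x^k (1 - x)^(n - k), k = 0, ..., n
    return [math.comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1)]

def information_potential(probs):
    # S(x) = sum_k p_k(x)^2
    return sum(p * p for p in probs)

def renyi_entropy(S):
    # Rényi entropy of order 2: R(x) = -log S(x)
    return -math.log(S)

def tsallis_entropy(S):
    # Tsallis entropy of order 2: T(x) = 1 - S(x)
    return 1 - S

n, x = 10, 0.3
S = information_potential(binomial_probs(n, x))
print("S =", S, " R =", renyi_entropy(S), " T =", tsallis_entropy(S))
```

Since \(0 < S(x) \le 1\), both entropies are nonnegative, and any bound obtained for \(S(x)\) translates directly into a bound for \(R(x)\) and \(T(x)\).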

Authors

Ana Maria Acu
Lucian Blaga University of Sibiu, Sibiu, Romania

Alexandra Măduța
Technical University of Cluj-Napoca, Cluj-Napoca, Romania

Diana Otrocol
Technical University of Cluj-Napoca, Cluj-Napoca, Romania
Tiberiu Popoviciu Institute of Numerical Analysis, Romanian Academy

Ioan Rașa
Technical University of Cluj-Napoca, Cluj-Napoca, Romania

Keywords

probability distribution; Rényi entropy; Tsallis entropy; information potential; functional equations; inequalities


Cite this paper as:

A.M. Acu, A. Măduța, D. Otrocol, I. Rașa, Inequalities for information potentials and entropies, Mathematics, 8 (2020), no. 11, art. 2056, doi: 10.3390/math8112056

About this paper

Journal

Mathematics

Publisher Name

MDPI

Print ISSN

Not available yet.

Online ISSN

2227-7390

