Abstract
We consider a probability distribution \((p_0(x), p_1(x), \ldots)\) depending on a real parameter \(x\). The associated information potential is \(S(x):=\sum_{k}p_{k}^2(x)\). The Rényi entropy and the Tsallis entropy of order \(2\) can be expressed as \(R(x) = -\log S(x)\) and \(T(x) = 1 - S(x)\). We establish recurrence relations, inequalities and bounds for \(S(x)\), which lead immediately to similar relations, inequalities and bounds for the two entropies. We show that some sequences \((R_n(x))_{n\ge 0}\) and \((T_n(x))_{n\ge 0}\), associated with sequences of classical positive linear operators, are concave and increasing. Two conjectures are formulated, involving the information potentials associated with the Durrmeyer density of probability and with the Bleimann–Butzer–Hahn probability distribution, respectively.
Authors
Ana Maria Acu
Lucian Blaga University of Sibiu, Sibiu, Romania
Alexandra Măduța
Technical University of Cluj-Napoca, Cluj-Napoca, Romania
Diana Otrocol
Technical University of Cluj-Napoca, Cluj-Napoca, Romania
Tiberiu Popoviciu Institute of Numerical Analysis, Romanian Academy
Ioan Rașa
Technical University of Cluj-Napoca, Cluj-Napoca, Romania
Keywords
probability distribution; Rényi entropy; Tsallis entropy; information potential; functional equations; inequalities
Cite this paper as:
A.M. Acu, A. Măduța, D. Otrocol, I. Rașa, Inequalities for information potentials and entropies, Mathematics, 8 (2020) no. 11, art. 2056, doi: 10.3390/math8112056
About this paper
Journal
Mathematics (MDPI)
Online ISSN
2227-7390
Inequalities for Information Potentials and Entropies
Abstract.
We consider a probability distribution \((p_0(x), p_1(x), \ldots)\) depending on a real parameter \(x\). The associated information potential is \(S(x):=\sum_{k}p_{k}^2(x)\). The Rényi entropy and the Tsallis entropy of order \(2\) can be expressed as \(R(x) = -\log S(x)\) and \(T(x) = 1 - S(x)\). We establish recurrence relations, inequalities and bounds for \(S(x)\), which lead immediately to similar relations, inequalities and bounds for the two entropies. We show that some sequences \((R_n(x))_{n\ge 0}\) and \((T_n(x))_{n\ge 0}\), associated with sequences of classical positive linear operators, are concave and increasing. Two conjectures are formulated, involving the information potentials associated with the Durrmeyer density of probability and with the Bleimann–Butzer–Hahn probability distribution, respectively.
MSC: 39B22, 39B62, 94A17, 26D07.
Keywords: probability distribution; Rényi entropy; Tsallis entropy; information potential; functional equations; inequalities.
1. Introduction
Entropies associated with discrete or continuous probability distributions are usually described by complicated explicit expressions, depending on one or several parameters. Therefore, it is useful to establish lower and upper bounds for them. Convexity-type properties are also useful: they embody valuable information on the behavior of the functions representing the entropies.
This is why bounds and convexity-type properties of entropies, expressed by inequalities, are under active study: see [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13] and the references therein. Our paper is concerned with this kind of inequality: we give new results and new proofs or improvements of some existing results, in the framework presented below.
Let \((p_k(x))_{k\ge 0}\) be a probability distribution depending on a parameter \(x\in I\), where \(I\) is a real interval. The associated information potential (also called index of coincidence, for obvious probabilistic reasons) is defined (see [14]) by
\[S(x):=\sum_{k=0}^{\infty}p_k^2(x),\qquad x\in I. \tag{1.1}\]
If \(\rho(x,t)\), \(x\in I\), \(t\in J\), is a probability density function depending on the parameter \(x\), the associated information potential is defined as (see [14])
\[S(x):=\int_J\rho^2(x,t)\,dt,\qquad x\in I. \tag{1.2}\]
The information potential is the core concept of the book [14]. There, the reader can find properties, extensions and generalizations of \(S\), as well as applications to information theoretic learning. Other properties and applications can be found in the recent papers [15] and [16].
It is important to remark that the Rényi entropy \(R(x)\) and the Tsallis entropy \(T(x)\) of order \(2\) can be expressed in terms of \(S(x)\) as
\[R(x)=-\log S(x),\qquad T(x)=1-S(x). \tag{1.3}\]
So the properties of \(S\) lead immediately to properties of \(R\), respectively \(T\).
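To make (1.1) and (1.3) concrete, here is a minimal numerical sketch (ours, not from the paper), evaluating \(S\), \(R\) and \(T\) for a binomial distribution; the function names are illustrative.

```python
# Minimal sketch: the information potential S(x) = sum_k p_k(x)^2 and the
# order-2 Renyi/Tsallis entropies R = -log S, T = 1 - S, Eq. (1.3),
# illustrated on the binomial distribution with n trials.
import math

def binomial_pmf(n, x):
    """Probabilities p_k(x) = C(n,k) x^k (1-x)^(n-k), k = 0..n."""
    return [math.comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1)]

def information_potential(p):
    """S = sum of squared probabilities (index of coincidence)."""
    return sum(q * q for q in p)

n, x = 10, 0.3
S = information_potential(binomial_pmf(n, x))
R = -math.log(S)      # Renyi entropy of order 2
T = 1 - S             # Tsallis entropy of order 2
print(f"S={S:.6f}  R={R:.6f}  T={T:.6f}")
```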
On the other hand, we can consider the discrete positive linear operators
\[L(f;x):=\sum_{k}p_k(x)\,f(x_k), \tag{1.4}\]
where \(x_k\) are given points in \(I\), and the integral operators
\[A(f;x):=\int_J\rho(x,t)\,f(t)\,dt. \tag{1.5}\]
In both cases, \(f\) is a function from a suitable set of functions defined on \(I\), respectively \(J\). In this paper, we consider classical operators of this kind, which are used in approximation theory.
Let us mention that the “degree of non-multiplicativity” of the operator \(L\) can be estimated in terms of the information potential \(S\): see [17] and the references therein.
In this paper, we will be concerned with a special family of discrete probability distributions, described as follows.
Let \(c\in\mathbb{R}\). Set \(I_c:=[0,+\infty)\) if \(c\ge 0\), and \(I_c:=\left[0,-\frac1c\right]\) if \(c<0\). For \(a\in\mathbb{R}\) and \(j\in\mathbb{N}_0\), the binomial coefficients are defined as usual by
\[\binom{a}{j}:=\frac{a(a-1)\cdots(a-j+1)}{j!},\qquad \binom{a}{0}:=1.\]
Let \(n>0\) be a real number, and \(k\in\mathbb{N}_0\). Define
\[p_{n,k}^{[c]}(x):=(-1)^k\binom{-\frac nc}{k}(cx)^k(1+cx)^{-\frac nc-k},\qquad c\neq 0, \tag{1.6}\]
\[p_{n,k}^{[0]}(x):=\lim_{c\to 0}p_{n,k}^{[c]}(x)=e^{-nx}\frac{(nx)^k}{k!},\qquad c=0. \tag{1.7}\]
Then \(p_{n,k}^{[c]}(x)\ge 0\) and \(\sum_{k=0}^{\infty}p_{n,k}^{[c]}(x)=1\) for \(x\in I_c\). Suppose that \(n>0\) if \(c\ge 0\), or \(n=-mc\) with some \(m\in\mathbb{N}\) if \(c<0\).
With this notation, we consider the discrete distribution of probability \(\left(p_{n,k}^{[c]}(x)\right)_{k\ge 0}\) depending on the parameter \(x\in I_c\).
According to (1.1), the associated information potential, or index of coincidence, is
\[S_n^{[c]}(x):=\sum_{k=0}^{\infty}\left(p_{n,k}^{[c]}(x)\right)^2,\qquad x\in I_c. \tag{1.8}\]
The Rényi entropy and the Tsallis entropy corresponding to the same distribution of probability are defined, respectively, by (see (1.3))
\[R_n^{[c]}(x):=-\log S_n^{[c]}(x) \tag{1.9}\]
and
\[T_n^{[c]}(x):=1-S_n^{[c]}(x). \tag{1.10}\]
The case \(c=-1\), \(n\in\mathbb{N}\), corresponds to the binomial distribution, for which
\[S_n^{[-1]}(x)=\sum_{k=0}^{n}\binom{n}{k}^2x^{2k}(1-x)^{2(n-k)},\qquad x\in[0,1]. \tag{1.11}\]
The case \(c=0\) corresponds to the Poisson distribution (see (1.7)), for which
\[S_n^{[0]}(x)=e^{-2nx}\sum_{k=0}^{\infty}\frac{(nx)^{2k}}{(k!)^2},\qquad x\ge 0. \tag{1.12}\]
For \(c=1\), we have the negative binomial distribution, with
\[S_n^{[1]}(x)=\sum_{k=0}^{\infty}\binom{n+k-1}{k}^2x^{2k}(1+x)^{-2(n+k)},\qquad x\ge 0. \tag{1.13}\]
The binomial, Poisson, respectively negative binomial distributions correspond to the classical Bernstein, Szász-Mirakyan, respectively Baskakov operators from approximation theory; all of them are of the form (1.4). In fact, the distribution \(\left(p_{n,k}^{[c]}(x)\right)_{k\ge 0}\) is instrumental in the construction of the family of positive linear operators introduced by Baskakov in [18]; see also [19, 20, 21, 22, 23]. As a probability distribution, the family of functions \(p_{n,k}^{[c]}\) was considered in [24, 17].
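As an illustration of (1.6) and (1.7), the following sketch (our own, with illustrative names) evaluates \(p_{n,k}^{[c]}\) for \(c=-1,0,1\) and checks numerically that the probabilities sum to \(1\); the infinite sums for \(c\ge 0\) are truncated.

```python
# Basis functions (1.6)-(1.7) for c = -1, 0, 1 (binomial, Poisson, negative
# binomial). The generalized binomial coefficient C(a,k) = a(a-1)...(a-k+1)/k!
# is implemented directly.
import math

def gen_binom(a, k):
    num = 1.0
    for j in range(k):
        num *= (a - j)
    return num / math.factorial(k)

def p_c(n, k, c, x):
    """p_{n,k}^{[c]}(x) as in (1.6)-(1.7)."""
    if c == 0:
        return math.exp(-n * x) * (n * x)**k / math.factorial(k)
    return (-1)**k * gen_binom(-n / c, k) * (c * x)**k * (1 + c * x)**(-n / c - k)

# The probabilities sum to 1; S_n^{[c]}(x) is the sum of their squares, (1.8).
n, x = 8, 0.25
for c in (-1, 0, 1):
    K = n + 1 if c == -1 else 200      # finite support only for c = -1
    p = [p_c(n, k, c, x) for k in range(K)]
    print(c, round(sum(p), 10), sum(q * q for q in p))
```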
The distribution
\[w_{n,k}(x):=\binom{n}{k}\frac{x^k}{(1+x)^n},\qquad k=0,1,\ldots,n,\; x\ge 0, \tag{1.14}\]
corresponds to the Bleimann–Butzer–Hahn operators, while
\[m_{n,k}(x):=\binom{n+k}{k}x^k(1-x)^{n+1},\qquad k\in\mathbb{N}_0,\; x\in[0,1], \tag{1.15}\]
is connected with the Meyer-König and Zeller operators.
The information potentials and the entropies associated with all these distributions were studied in [17]; see also [25, 26, 27]. It should be mentioned that they satisfy Heun-type differential equations: see [17]. We continue this study. To keep the same notation as in [17], let us return to (1.11)–(1.13) and denote
\[S_n(x):=S_n^{[-1]}(x),\qquad \bar S_n(x):=S_n^{[0]}(x),\qquad \tilde S_n(x):=S_n^{[1]}(x).\]
Moreover, the information potentials corresponding to (1.14) and (1.15) will be denoted by
\[K_n(x):=\sum_{k=0}^{n}w_{n,k}^2(x),\qquad x\ge 0, \tag{1.16}\]
\[M_n(x):=\sum_{k=0}^{\infty}m_{n,k}^2(x),\qquad x\in[0,1]. \tag{1.17}\]
In Section 2, we present several relations between the functions \(S_n\), \(\tilde S_n\), \(K_n\), \(M_n\), as well as between these functions and the Legendre polynomials. By using the three-term recurrence relation satisfied by the Legendre polynomials, we establish recurrence relations involving three consecutive terms of the sequences \((S_n)\), \((\tilde S_n)\), respectively \((K_n)\), \((M_n)\). We recall also some explicit expressions of these functions.
Section 3 is devoted to inequalities between consecutive terms of the above sequences; in particular, we emphasize that for fixed \(x\), the four sequences are logarithmically convex and hence convex.
Other inequalities are presented in Section 4. All the inequalities can be used to get information about the Rényi entropies and Tsallis entropies connected with the corresponding probability distributions.
Section 5 contains new properties of the function \(K_n\) and an open problem concerning its shape.
Section 6 is devoted to some inequalities involving integrals of the form \(\int_0^1\left(S_n'(x)\right)^2dx\), in relation with certain combinatorial identities.
The information potential associated with the Durrmeyer density of probability is computed in Section 7. We recall a conjecture formulated in [24].
As already mentioned, all the results involving the information potential can be used to derive results about the Rényi and Tsallis entropies. For the sake of brevity, we will usually study only the information potential.
2. Recurrence Relations
\(S_n\) is a polynomial, while \(\tilde S_n\), \(K_n\) and \(M_n\) are rational functions. On their maximal domains, these functions are connected by several relations (see [17], Cor. 13, (46), (53), (54)):
\[S_n(-x)=(1+2x)^{2n+1}\,\tilde S_{n+1}(x), \tag{2.1}\]
\[K_n(x)=S_n\!\left(\frac{x}{1+x}\right), \tag{2.2}\]
\[M_n(x)=\tilde S_{n+1}\!\left(\frac{x}{1-x}\right). \tag{2.3}\]
Consider the Legendre polynomial (see [29], 22.3.1)
\[P_n(y)=\frac{1}{2^n}\sum_{k=0}^{n}\binom{n}{k}^2(y-1)^{n-k}(y+1)^k. \tag{2.4}\]
In terms of it, the information potentials can be represented as
\[S_n(x)=(1-2x)^n\,P_n\!\left(\frac{2x^2-2x+1}{1-2x}\right), \tag{2.5}\]
\[\tilde S_n(x)=(1+2x)^{-n}\,P_{n-1}\!\left(\frac{2x^2+2x+1}{1+2x}\right), \tag{2.6}\]
\[K_n(x)=\left(\frac{1-x}{1+x}\right)^n P_n\!\left(\frac{1+x^2}{1-x^2}\right), \tag{2.7}\]
\[M_n(x)=\left(\frac{1-x}{1+x}\right)^{n+1}P_n\!\left(\frac{1+x^2}{1-x^2}\right). \tag{2.8}\]
In the theory of special functions, recurrence relations play a crucial role. In particular, the Legendre polynomials (2.4) satisfy the important recurrence relation ([29], 22.7.1)
\[(n+1)P_{n+1}(y)=(2n+1)y\,P_n(y)-n\,P_{n-1}(y). \tag{2.9}\]
This leads us to
Theorem 1.
The functions \(S_n\), \(\tilde S_n\) and \(K_n\), \(M_n\) satisfy the following three-term recurrence relations:
\[(n+1)S_{n+1}(x)=(2n+1)(2x^2-2x+1)S_n(x)-n(1-2x)^2S_{n-1}(x), \tag{2.10}\]
\[n(1+2x)^2\tilde S_{n+1}(x)=(2n-1)(2x^2+2x+1)\tilde S_n(x)-(n-1)\tilde S_{n-1}(x), \tag{2.11}\]
\[(n+1)(1+x)^2K_{n+1}(x)=(2n+1)(1+x^2)K_n(x)-n(1-x)^2K_{n-1}(x), \tag{2.12}\]
\[(n+1)(1+x)^2M_{n+1}(x)=(2n+1)(1+x^2)M_n(x)-n(1-x)^2M_{n-1}(x). \tag{2.13}\]
Indeed, it suffices to combine (2.5)–(2.8) with the recurrence (2.9). Let us recall also the explicit expressions
\[K_n(x)=(1+x)^{-2n}\sum_{k=0}^{n}\binom{n}{k}^2x^{2k}, \tag{2.14}\]
\[M_n(x)=(1-x)(1+x)^{-2n-1}\sum_{k=0}^{n}\binom{n}{k}^2x^{2k}, \tag{2.15}\]
and, as a consequence, the simple relation
\[M_n(x)=\frac{1-x}{1+x}\,K_n(x). \tag{2.16}\]
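As a sanity check of the reconstructed recurrence (2.10), the following sketch (ours) compares its two sides, with \(S_n\) computed by direct summation of squared binomial weights; the discrepancies should be at rounding-error level.

```python
# Numerical double-check of the three-term recurrence (2.10): compare
# (n+1) S_{n+1}(x) with (2n+1)(2x^2-2x+1) S_n(x) - n (1-2x)^2 S_{n-1}(x).
import math

def S(n, x):
    return sum((math.comb(n, k) * x**k * (1 - x)**(n - k))**2 for k in range(n + 1))

x = 0.37
for n in range(1, 6):
    lhs = (n + 1) * S(n + 1, x)
    rhs = (2 * n + 1) * (2 * x * x - 2 * x + 1) * S(n, x) - n * (1 - 2 * x)**2 * S(n - 1, x)
    print(n, abs(lhs - rhs))   # differences at the level of rounding errors
```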
3. Inequalities for Information Potentials
In studying a sequence of special functions, not only are recurrence relations important, but also inequalities connecting successive terms; in particular, inequalities showing that the sequence is (logarithmically) convex or concave. This section is devoted to such inequalities involving the sequences \((S_n)\), \((\tilde S_n)\), \((K_n)\), and \((M_n)\).
Theorem 4.
The function \(S_n\) satisfies the inequalities
\[S_1(x)\,S_n(x)\le S_{n+1}(x), \tag{3.1}\]
\[S_{n+1}(x)\le S_n(x), \tag{3.2}\]
\[S_{n+1}^2(x)\le S_n(x)\,S_{n+2}(x), \tag{3.3}\]
for all \(x\in[0,1]\) and \(n\ge 1\).
Proof.
According to ([19], Theorem 1), \(S_n\) has the integral representation
\[S_n(x)=\frac{1}{\pi}\int_0^{\pi}g(\theta)^n\,d\theta,\qquad g(\theta):=1-2x(1-x)(1-\cos\theta). \tag{3.4}\]
Since \(0\le g\le 1\), it follows that
\[S_{n+1}(x)=\frac{1}{\pi}\int_0^{\pi}g^{n+1}\,d\theta\le\frac{1}{\pi}\int_0^{\pi}g^{n}\,d\theta=S_n(x),\]
which is (3.2); moreover, Chebyshev's inequality for the synchronous functions \(g\) and \(g^n\) gives (3.1). On the other hand, by the Cauchy–Schwarz inequality,
\[S_{n+1}^2(x)=\left(\frac{1}{\pi}\int_0^{\pi}g^{\frac n2}\,g^{\frac{n+2}{2}}\,d\theta\right)^2\le S_n(x)\,S_{n+2}(x), \tag{3.5}\]
which entails (3.3). ∎
Corollary 5.
The Rényi entropy \(R_n:=R_n^{[-1]}\) and the Tsallis entropy \(T_n:=T_n^{[-1]}\) corresponding to the binomial distribution with parameters \(n\) and \(x\) satisfy the inequalities:
\[R_n(x)\le R_{n+1}(x)\le R_n(x)+R_1(x),\qquad R_n(x)+R_{n+2}(x)\le 2R_{n+1}(x), \tag{3.7}\]
\[T_n(x)\le T_{n+1}(x)\le T_n(x)+T_1(x)-T_n(x)T_1(x),\qquad T_n(x)+T_{n+2}(x)\le 2T_{n+1}(x). \tag{3.8}\]
Theorem 6.
The following inequalities hold:
\[\tilde S_{n+1}^2(x)\le \tilde S_n(x)\,\tilde S_{n+2}(x),\qquad x\ge 0, \tag{3.9}\]
\[\tilde S_1(x)\,\tilde S_n(x)\le \tilde S_{n+1}(x),\qquad x\ge 0, \tag{3.10}\]
\[\tilde S_{n+1}(x)\le \tilde S_n(x),\qquad x\ge 0, \tag{3.11}\]
\[K_{n+1}^2(x)\le K_n(x)\,K_{n+2}(x),\qquad x\ge 0, \tag{3.12}\]
\[K_1(x)\,K_n(x)\le K_{n+1}(x),\qquad x\ge 0, \tag{3.13}\]
\[K_{n+1}(x)\le K_n(x),\qquad x\ge 0, \tag{3.14}\]
\[M_{n+1}^2(x)\le M_n(x)\,M_{n+2}(x),\qquad x\in[0,1), \tag{3.15}\]
\[\frac{1-x}{1+x}\,M_n(x)\le M_{n+1}(x),\qquad x\in[0,1), \tag{3.16}\]
\[M_{n+1}(x)\le M_n(x),\qquad x\in[0,1). \tag{3.17}\]
Proof.
These inequalities follow as in the proof of Theorem 4, starting from the integral representation
\[\tilde S_n(x)=\frac{1}{\pi}\int_0^{\pi}\left(1+2x(1+x)(1-\cos\theta)\right)^{-n}d\theta,\qquad x\ge 0,\]
and using the relations (2.2) and (2.3). ∎
These integral representations, together with the representation of \(S_n\) given by (3.4), are consequences of the important results of Elena Berdysheva ([19], Theorem 1).
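Assuming (3.4) has the Laplace-type form stated in the proof of Theorem 4, the next sketch (ours, not from the paper) compares it numerically with the defining sum.

```python
# Compare the direct sum defining S_n with the Laplace-type integral
# representation (3.4): S_n(x) = (1/pi) int_0^pi (1-2x(1-x)(1-cos t))^n dt.
import math

def S_sum(n, x):
    return sum((math.comb(n, k) * x**k * (1 - x)**(n - k))**2 for k in range(n + 1))

def S_int(n, x, m=20000):
    # composite midpoint rule on [0, pi]
    h = math.pi / m
    total = sum((1 - 2 * x * (1 - x) * (1 - math.cos((j + 0.5) * h)))**n for j in range(m))
    return total * h / math.pi

for n, x in [(5, 0.2), (12, 0.5), (30, 0.8)]:
    print(n, x, S_sum(n, x), S_int(n, x))
```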
From (3.9)–(3.17), we can derive inequalities similar to (3.7) and (3.8) for the entropies associated with the probability distributions corresponding to \(\tilde S_n\), \(K_n\) and \(M_n\).
Remark 7.
Let us remark that the inequalities (3.3), (3.9), (3.12), (3.15) show that for each \(x\), the sequences \((S_n(x))\), \((\tilde S_n(x))\), \((K_n(x))\), \((M_n(x))\) are logarithmically convex, and so convex; the other inequalities from Theorems 4 and 6 show that the same sequences are decreasing. It immediately follows that the associated sequences of entropies \((R_n(x))\) and \((T_n(x))\) are concave and increasing; see also (3.7) and (3.8).
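The following small experiment (ours) illustrates Remark 7 in the binomial case: for fixed \(x\), \((S_n(x))\) is decreasing and logarithmically convex, hence \((R_n(x))\) is increasing and concave.

```python
# Empirical check of Remark 7: monotonicity and log-convexity of (S_n(x)),
# and the resulting concavity of the Renyi entropies R_n = -log S_n.
import math

def S(n, x):
    return sum((math.comb(n, k) * x**k * (1 - x)**(n - k))**2 for k in range(n + 1))

x = 0.3
s = [S(n, x) for n in range(0, 15)]
dec = all(s[n + 1] <= s[n] for n in range(14))
logcvx = all(s[n + 1]**2 <= s[n] * s[n + 2] for n in range(13))
R = [-math.log(v) for v in s]
concave = all(R[n + 1] - R[n] >= R[n + 2] - R[n + 1] for n in range(13))
print(dec, logcvx, concave)   # expected: True True True
```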
4. Other Inequalities
Besides their own interest, the next Theorems 8 and 12 will be instrumental in establishing new lower and upper bounds for the information potentials and consequently for the associated Rényi and Tsallis entropies.
Let \(c\neq 0\). According to ([19], Theorem 1), we have the integral representation
\[S_n^{[c]}(x)=\frac{1}{\pi}\int_0^{\pi}\left(1+2cx(1+cx)(1-\cos\theta)\right)^{-\frac nc}d\theta,\qquad x\in I_c. \tag{4.1}\]
Using (4.1) and Chebyshev's inequality for synchronous functions, we can write
For , we use Chebyshev's inequality for asynchronous functions and obtain the reverse inequality. So we have
Theorem 8.
If , then
(4.2)
If , the inequality is reversed.
Now, using ([17], (48)), we have
Therefore, using also (3.11), we get
(4.5)
Theorem 10.
The following inequalities are satisfied:
(4.7)
(4.8)
(4.9)
(4.10)
Proof.
Writing (4.3) for , and multiplying term by term, we get
Remark 11.
The inequalities (4.7)–(4.10) in Theorem 10 provide lower and upper bounds for the information potentials \(S_n\), \(\bar S_n\), \(\tilde S_n\), \(K_n\), and consequently for the associated entropies. They can be compared with other bounds existing in the literature, obtained with other methods. For the moment, let us prove the inequality
\[S_n(x)\ge\frac{1}{n+1},\qquad x\in[0,1], \tag{4.11}\]
and compare it with the first inequality in (4.7). Indeed, by the Cauchy–Schwarz inequality,
\[1=\left(\sum_{k=0}^{n}p_{n,k}^{[-1]}(x)\right)^{2}\le(n+1)\sum_{k=0}^{n}\left(p_{n,k}^{[-1]}(x)\right)^{2}=(n+1)S_n(x).\]
Theorem 12.
The information potential \(S_n^{[c]}\) satisfies the following inequality for all \(m,n\) and \(x\in I_c\):
\[S_{m+n}^{[c]}(x)\ge S_m^{[c]}(x)\,S_n^{[c]}(x). \tag{4.12}\]
Proof.
If \(c\neq 0\), we can use (4.1) to get
\[S_{m+n}^{[c]}(x)=\frac{1}{\pi}\int_0^{\pi}h(\theta)^{-\frac mc}\,h(\theta)^{-\frac nc}\,d\theta,\qquad h(\theta):=1+2cx(1+cx)(1-\cos\theta).\]
The functions \(h^{-\frac mc}\) and \(h^{-\frac nc}\) are synchronous. Applying Chebyshev's inequality for synchronous functions, we obtain
\[S_{m+n}^{[c]}(x)\ge\left(\frac{1}{\pi}\int_0^{\pi}h^{-\frac mc}\,d\theta\right)\left(\frac{1}{\pi}\int_0^{\pi}h^{-\frac nc}\,d\theta\right)=S_m^{[c]}(x)\,S_n^{[c]}(x).\]
For \(c=0\), we have (see [17], (13))
\[\bar S_n(x)=\frac{1}{\pi}\int_0^{\pi}e^{-2nx(1-\cos\theta)}\,d\theta.\]
With the same Chebyshev inequality, one obtains (4.12). ∎
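A direct numerical check of (4.12) in the binomial case \(c=-1\) (an illustration of ours; the cases \(c\ge 0\) behave similarly):

```python
# Check the superadditivity (4.12): S_{m+n}(x) >= S_m(x) S_n(x).
import math

def S(n, x):
    return sum((math.comb(n, k) * x**k * (1 - x)**(n - k))**2 for k in range(n + 1))

for x in (0.1, 0.3, 0.5):
    for m in range(1, 6):
        for n in range(1, 6):
            assert S(m + n, x) >= S(m, x) * S(n, x) - 1e-15
print("(4.12) verified on the sample grid")
```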
From Theorem 12, we derive
Corollary 13.
For the Rényi entropy and the Tsallis entropy, we have
\[R_{m+n}^{[c]}(x)\le R_m^{[c]}(x)+R_n^{[c]}(x), \tag{4.13}\]
\[T_{m+n}^{[c]}(x)\le T_m^{[c]}(x)+T_n^{[c]}(x)-T_m^{[c]}(x)\,T_n^{[c]}(x). \tag{4.14}\]
Remark 15.
For \(c=-1\), (4.12) becomes
\[S_{m+n}(x)\ge S_m(x)\,S_n(x),\qquad x\in[0,1]. \tag{4.15}\]
This inequality has also a purely probabilistic proof. Let \(X\) and \(Y\) be independent binomial random variables with the same parameter \(x\) and with \(m\), respectively \(n\) trials, and let \(X'\), \(Y'\) be independent copies of them. Then \(X+Y\) and \(X'+Y'\) are independent binomial random variables with \(m+n\) trials, and consequently
\[S_{m+n}(x)=P(X+Y=X'+Y')\ge P(X=X',\,Y=Y')=P(X=X')\,P(Y=Y')=S_m(x)\,S_n(x),\]
and this proves (4.15). It would be useful to have purely probabilistic proofs of other inequalities in this specific framework; they would facilitate a deeper understanding of the interplay between analytic proofs/results and probabilistic proofs/results.
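The probabilistic proof above can be illustrated by a simple Monte Carlo experiment (ours): the estimated coincidence probability of the sums should dominate the product of the individual coincidence probabilities.

```python
# Monte Carlo illustration of (4.15):
# P(X + Y = X' + Y') >= P(X = X') P(Y = Y') for independent binomials.
import random

def bin_rv(n, x):
    return sum(random.random() < x for _ in range(n))

m, n, x, N = 4, 7, 0.3, 200000
coincide_sum = coincide_X = coincide_Y = 0
for _ in range(N):
    X, Y, X2, Y2 = bin_rv(m, x), bin_rv(n, x), bin_rv(m, x), bin_rv(n, x)
    coincide_sum += (X + Y == X2 + Y2)
    coincide_X += (X == X2)
    coincide_Y += (Y == Y2)
print(coincide_sum / N, (coincide_X / N) * (coincide_Y / N))
# first estimate (~ S_{m+n}(x)) should dominate the product (~ S_m S_n)
```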
Inequalities similar to (4.15) hold for \(\bar S_n\) (apply (4.12) with \(c=0\)) and for \(K_n\) and \(M_n\). Indeed, according to (2.2),
\[K_n(x)=S_n\!\left(\frac{x}{1+x}\right)\]
for all \(x\ge 0\), which implies
\[K_{m+n}(x)\ge K_m(x)\,K_n(x). \tag{4.16}\]
Similarly, using (2.3) and the logarithmic convexity of \((\tilde S_n(x))\), one obtains
\[M_{m+n}(x)\ge\frac{1+x}{1-x}\,M_m(x)\,M_n(x). \tag{4.17}\]
Let us remark that the inequality (4.17) is stronger than the similar inequalities for \(S_n\), \(\bar S_n\), and \(K_n\).
Corollary 16.
For \(n,k\in\mathbb{N}\), \(x\in[0,1]\), we have
\[S_{kn}(x)\ge\left(S_n(x)\right)^{k}.\]
In particular,
\[S_n(x)\ge\left(S_1(x)\right)^{n}=\left(2x^2-2x+1\right)^{n}. \tag{4.18}\]
Proof.
Starting from (4.15), it suffices to use induction on \(k\). ∎
5. More about \(K_n\)
This section contains some additional properties of the function \(K_n\), defined initially by (1.16). Using the simple relation (2.16) connecting \(K_n\) and \(M_n\), one can easily derive corresponding properties of the function \(M_n\) given by (1.17).
Theorem 18.
(i) \(K_n\) is decreasing on \([0,1]\) and increasing on \([1,+\infty)\).
(ii) \(K_n\) is logarithmically convex on \([0,1]\).
Proof.
Let \(u:=\frac{x}{1+x}\). Then \(u\in\left[0,\tfrac12\right]\) for \(x\in[0,1]\), and (2.2) shows that \(K_n(x)=S_n(u)\). Consequently,
\[K_n'(x)=\frac{1}{(1+x)^2}\,S_n'(u). \tag{5.2}\]
It is known (see [17]) that \(S_n\) is decreasing on \(\left[0,\tfrac12\right]\) and increasing on \(\left[\tfrac12,1\right]\). It follows that \(K_n'\le 0\) on \([0,1]\) and \(K_n'\ge 0\) on \([1,+\infty)\). This proves (i).
To prove (ii), let us remark that \(S_n\) is logarithmically convex on \([0,1]\) (see [31, 32]), and that \(\left(\log S_n\right)'(u)\le 0\) for \(u\in\left[0,\tfrac12\right]\). Combined with (5.2), this yields
\[\left(\log K_n(x)\right)''=\frac{\left(\log S_n\right)''(u)}{(1+x)^4}-\frac{2\left(\log S_n\right)'(u)}{(1+x)^3}\ge 0,\qquad x\in[0,1].\;∎\]
Remark 19.
\(K_n(0)=1\), \(\lim_{x\to\infty}K_n(x)=1\), \(K_n(1)=4^{-n}\binom{2n}{n}\) (see (2.14)). These equalities, Theorem 18, and graphical experiments (see Figure 1) suggest that \(K_n\) is convex on \([0,a_n]\) and concave on \([a_n,+\infty)\) for a suitable \(a_n>1\). It would be interesting to have a proof for this shape of \(K_n\), and to find the value of \(a_n\).
Figure 1. The graph of \(K_n\) (graphical experiment for Remark 19).
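The shape described in Remark 19 can be explored numerically; the sketch below (ours) scans the second difference of \(K_n\), computed via the stable form \(K_n(x)=S_n\left(\frac{x}{1+x}\right)\) from (2.2), and reports the sign change, a candidate for \(a_n\).

```python
# Locate the sign change of K_n'' using central second differences.
import math

def K(n, x):
    # K_n(x) = S_n(x/(1+x)), relation (2.2), in a numerically safe form
    u, v = x / (1 + x), 1 / (1 + x)
    return sum((math.comb(n, k) * u**k * v**(n - k))**2 for k in range(n + 1))

def second_diff(n, x, h=1e-4):
    return (K(n, x + h) - 2 * K(n, x) + K(n, x - h)) / (h * h)

n, x, prev = 6, 0.01, None
while x < 30:
    cur = second_diff(n, x)
    if prev is not None and prev > 0 and cur < 0:
        print(f"K_{n}'' changes sign near x = {x:.2f}")  # candidate value a_n
    prev, x = cur, x + 0.01
```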
In order to compute \(K_n\), we have the explicit expressions (1.16) and (2.14), and the three-term recurrence relation (2.12). In what follows, we provide a two-term recurrence relation.
According to ([31], (2.3)),
Setting again \(u=\frac{x}{1+x}\) and \(K_n(x)=S_n(u)\), we obtain, after some computation,
(5.3)
Multiplying (5.3) by \((1+x)^2\), we obtain
(5.4)
Let and
Theorem 20.
The sequence satisfies the recurrence relation
(5.5)
with
Proof.
This reduces to
and therefore
(5.6)
Remark 22.
Remark 23.
According to (4.9), \(\lim_{n\to\infty}K_n(x)=0\) for each \(x>0\), i.e., the sequence of functions \((K_n)\) is pointwise convergent to zero on \((0,+\infty)\). The convergence is not uniform, because \(\lim_{x\to\infty}K_n(x)=1\) for all \(n\).
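A short illustration (ours) of Remark 23: \(K_n(1)\to 0\), while for every fixed \(n\) the values of \(K_n\) for very large \(x\) stay close to \(1\), so the convergence cannot be uniform.

```python
# Pointwise but non-uniform convergence of K_n to zero on (0, +inf).
import math

def K(n, x):
    u, v = x / (1 + x), 1 / (1 + x)
    return sum((math.comb(n, k) * u**k * v**(n - k))**2 for k in range(n + 1))

for n in (1, 5, 20, 80):
    print(n, K(n, 1.0), K(n, 1e6))   # K_n(1) -> 0, while K_n(10^6) stays near 1
```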
6. Inequalities for the Integral of the Squared Derivative
Integrals of the form \(\int_0^1\left(S_n'(x)\right)^2dx\) are important for several applications; see, e.g., ([33], Section 3.10). In this section, we present bounds for such integrals using the logarithmic convexity of the functions \(S_n\). The results involve some combinatorial identities (see [35]).
Theorem 24.
The following inequalities are valid for all \(n\):
(6.1)
(6.2)
(6.3)
7. Information Potential for the Durrmeyer Density of Probability
The Durrmeyer density of probability is
\[\rho_n(x,t):=(n+1)\sum_{k=0}^{n}b_{n,k}(x)\,b_{n,k}(t),\qquad x,t\in[0,1],\]
where \(b_{n,k}(x):=\binom{n}{k}x^k(1-x)^{n-k}\). According to (1.2), the associated information potential is
\[S_n^D(x):=\int_0^1\rho_n^2(x,t)\,dt=(n+1)^2\sum_{j=0}^{n}\sum_{k=0}^{n}b_{n,j}(x)\,b_{n,k}(x)\,\frac{\binom{n}{j}\binom{n}{k}}{(2n+1)\binom{2n}{j+k}}.\]
Setting \(u:=x(1-x)\), we obtain
\[S_n^D(x)=\sum_{k=0}^{n}a_{n,k}u^k,\]
where the coefficients \(a_{n,k}\) are rational numbers; indeed, \(S_n^D\) is a polynomial of degree \(2n\), symmetric with respect to \(x=\frac12\). It is easy to see that
\[S_n^D(0)=S_n^D(1)=\frac{(n+1)^2}{2n+1}.\]
We recall here Conjecture 4.6 from [24].
Conjecture 26.
([24]) The sequence \((a_{n,k})_{0\le k\le n}\) is convex and, consequently, the function \(S_n^D\) is convex on \([0,1]\).
The following numerical and graphical experiments support this conjecture (see Table 1 and Figure 2).
Table 1. Numerical values for \(k=0,1,\ldots,12\), supporting Conjecture 26 (original values not reproduced).
Figure 2. Graphical experiment supporting Conjecture 26.
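Conjecture 26 can be explored numerically. The sketch below (ours) assumes the standard Durrmeyer density \(\rho_n(x,t)=(n+1)\sum_k b_{n,k}(x)b_{n,k}(t)\) and the Beta-integral formula for \(\int_0^1 b_{n,j}b_{n,k}\,dt\), and tests the convexity of \(x\mapsto S_n^D(x)\) on a grid.

```python
# Information potential of the Durrmeyer density and a grid-based
# convexity test of x -> S_n^D(x).
from math import comb

def b(n, k, x):
    return comb(n, k) * x**k * (1 - x)**(n - k)

def S_durrmeyer(n, x):
    # int_0^1 b_{n,j}(t) b_{n,k}(t) dt = C(n,j) C(n,k) / ((2n+1) C(2n, j+k))
    return (n + 1)**2 * sum(
        b(n, j, x) * b(n, k, x) * comb(n, j) * comb(n, k)
        / ((2 * n + 1) * comb(2 * n, j + k))
        for j in range(n + 1) for k in range(n + 1))

n, h = 8, 0.01
vals = [S_durrmeyer(n, i * h) for i in range(101)]
convex = all(vals[i - 1] - 2 * vals[i] + vals[i + 1] >= -1e-12 for i in range(1, 100))
print("second differences nonnegative:", convex)
```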
8. Concluding Remarks and Future Work
Bounds and convexity-type properties of entropies are important and useful, especially when the entropies are expressed as complicated functions depending on one or several variables. Of course, bounds and convexity properties are presented in terms of inequalities; therefore, their study is a branch of the theory of inequalities, an area of active research. Our paper contains some contributions in this framework. We have obtained analytic inequalities, with analytic methods, but involving certain information potentials and their associated Rényi and Tsallis entropies. The probabilistic flavor is underlined by the purely probabilistic proof of the inequality (4.15). Finding such probabilistic proofs for other inequalities in this context will be a topic for future research. For example, is there a purely probabilistic proof of the subadditivity property (4.13) of the Rényi entropy?
The area in which our results can be placed is delineated by the papers [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13] and the references therein: the titles are expressive by themselves.
Basically, we are concerned with the family of probability distributions \(\left(p_{n,k}^{[c]}\right)\), strongly related to the family of generalized Baskakov positive linear operators. The interplay between the theory of positive linear operators and probability theory is still a rich source of important results. Besides the binomial, Poisson, and negative binomial distributions (corresponding, respectively, to \(c=-1\), \(c=0\), \(c=1\)) and associated with the Bernstein, Szász-Mirakyan, respectively classical Baskakov operators, we consider in our paper, from an analytic point of view, the distributions associated with the Bleimann–Butzer–Hahn, Meyer-König and Zeller, and Durrmeyer operators. Their study from a probabilistic perspective is deferred to a future paper. Another possible direction of further research is to investigate with our methods the distributions associated with other classical or more recent sequences of positive linear operators.
The information potentials \(S_n\), \(\tilde S_n\), \(K_n\), and \(M_n\) have strong relations with the Legendre polynomials. Quite naturally, the recurrence relations satisfied by these polynomials yield similar relations for the information potentials. It should be mentioned that the differential equation characterizing the Legendre polynomials was used in [17] in order to show that these information potentials satisfy Heun-type differential equations, and consequently to obtain bounds for them. Other bounds are obtained in this paper, starting from the important integral representations given in [19]. They can be compared with other bounds from the literature, and this is another possible topic for further research.
For a fixed \(n\), the convexity and even the logarithmic convexity of the function \(S_n\) were established in [17, 31, 32, 27, 34]. In this paper, we prove that for a fixed \(x\), the sequence \((S_n(x))_{n\ge 0}\) is logarithmically convex. Similar results hold for the other information potentials, and they have consequences concerning the associated entropies. However, we think that this direction of research can be continued and developed.
Two conjectures, accompanied by graphical experiments supporting them, are mentioned in our paper.
References
- [1] Harremoës, P.; Topsoe, F. Inequalities between entropy and index of coincidence derived from information diagrams. IEEE Trans. Inf. Theory 2001, 47, 2944–2960.
- [2] Harremoës, P. Binomial and Poisson distribution as maximum entropy distributions. IEEE Trans. Inf. Theory 2001, 47, 2039–2041.
- [3] Hillion, E. Concavity of entropy along binomial convolution. Electron. Commun. Probab. 2012, 17, 1–9.
- [4] Hillion, E.; Johnson, O. A proof of the Shepp-Olkin entropy concavity conjecture. Bernoulli 2017, 23, 3638–3649.
- [5] Knessl, C. Integral representation and asymptotic expansions for Shannon and Renyi entropies. Appl. Math. Lett. 1998, 11, 69–74.
- [6] Adell, J.A.; Lekuona, A.; Yu, Y. Sharp bounds on the entropy of the Poisson law and related quantities. IEEE Trans. Inf. Theory 2010, 56, 2299–2306.
- [7] Melbourne, J.; Tkocz, T. Reversals of Rényi entropy inequalities under log-concavity. IEEE Trans. Inf. Theory 2020, doi: 10.1109/TIT.2020.3024025.
- [8] Shepp, L.A.; Olkin, I. Entropy of the sum of independent Bernoulli random variables and of the multinomial distribution. In Contributions to Probability: A Collection of Papers Dedicated to Eugene Lukacs; Academic Press: London, UK, 1981; pp. 201–206.
- [9] Hillion, E.; Johnson, O. Discrete versions of the transport equation and the Shepp-Olkin conjecture. Ann. Probab. 2016, 44, 276–306.
- [10] Alzer, H. A refinement of the entropy inequality. Ann. Univ. Sci. Bp. 1995, 38, 13–18.
- [11] Chang, S.-C.; Weldon, E. Coding for T-user multiple-access channels. IEEE Trans. Inf. Theory 1979, 25, 684–691.
- [12] Xu, D. Energy, Entropy and Information Potential for Neural Computation. Ph.D. Thesis, University of Florida, Gainesville, FL, USA, 1999.
- [13] Bărar, A.; Mocanu, G. R.; Raşa, I. Bounds for some entropies and special functions. Carpathian J. Math. 2018, 34, 9–15.
- [14] Principe, J.C. Information Theoretic Learning: Rényi's Entropy and Kernel Perspectives; Springer: New York, NY, USA, 2010.
- [15] Acu, A.M.; Başcanbaz-Tunca, G.; Raşa, I. Information potential for some probability density functions. Appl. Math. Comput. 2021, 389, 125578.
- [16] Acu, A.M.; Başcanbaz-Tunca, G.; Raşa, I. Bounds for indices of coincidence and entropies. Submitted.
- [17] Raşa, I. Entropies and Heun functions associated with positive linear operators. Appl. Math. Comput. 2015, 268, 422–431.
- [18] Baskakov, V.A. An instance of a sequence of positive linear operators in the space of continuous functions. Doklady Akademii Nauk SSSR 1957, 113, 249–251.
- [19] Berdysheva, E. Studying Baskakov–Durrmeyer operators and quasi-interpolants via special functions. J. Approx. Theory 2007, 149, 131–150.
- [20] Heilmann, M. Erhöhung der Konvergenzgeschwindigkeit bei der Approximation von Funktionen mit Hilfe von Linearkombinationen spezieller positiver linearer Operatoren. Habilitationsschrift, Universität Dortmund: Dortmund, Germany, 1992.
- [21] Wagner, M. Quasi-Interpolanten zu genuinen Baskakov-Durrmeyer-Typ Operatoren; Shaker: Aachen, Germany, 2013.
- [22] Acu, A.M.; Heilmann, M.; Raşa, I. Linking Baskakov-type operators. In Constructive Theory of Functions, Sozopol 2019; Draganov, B., Ivanov, K., Nikolov, G., Uluchev, R., Eds.; Prof. Marin Drinov Publishing House of BAS: Sofia, Bulgaria, 2020; pp. 23–38.
- [23] Heilmann, M.; Raşa, I. A nice representation for a link between Baskakov and Szász–Mirakjan–Durrmeyer operators and their Kantorovich variants. Results Math. 2019, 74, 9.
- [24] Raşa, I. Rényi entropy and Tsallis entropy associated with positive linear operators. arXiv 2014, arXiv:1412.4971v1.
- [25] Nikolov, G. Inequalities for ultraspherical polynomials. Proof of a conjecture of I. Raşa. J. Math. Anal. Appl. 2014, 418, 852–860.
- [26] Gavrea, I.; Ivan, M. On a conjecture concerning the sum of the squared Bernstein polynomials. Appl. Math. Comput. 2014, 241, 70–74.
- [27] Alzer, H. Remarks on a convexity theorem of Raşa. Results Math. 2020, 75, 29.
- [28] Cover, T.M.; Thomas, J.A. Elements of Information Theory; John Wiley & Sons: Hoboken, NJ, USA, 2006.
- [29] Abramowitz, M.; Stegun, I.A. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables; Dover Publications, Inc.: New York, NY, USA, 1970.
- [30] Bărar, A.; Mocanu, G.; Raşa, I. Heun functions related to entropies. RACSAM 2019, 113, 819–830.
- [31] Raşa, I. Convexity properties of some entropies (II). Results Math. 2019, 74, 154.
- [32] Raşa, I. Convexity properties of some entropies. Results Math. 2018, 73, 105.
- [33] Cloud, M.J.; Drachman, B.C. Inequalities: With Applications to Engineering; Springer: Berlin/Heidelberg, Germany, 2006.
- [34] Abel, U.; Gawronski, W.; Neuschel, T. Complete monotonicity and zeros of sums of squared Baskakov functions. Appl. Math. Comput. 2015, 258, 130–137.
- [35] Gould, H.W. Combinatorial Identities—A Standardized Set of Tables Listing 500 Binomial Coefficient Summations; West Virginia University Press: Morgantown, WV, USA, 1972.
- [36] Altomare, F.; Campiti, M. Korovkin-Type Approximation Theory and Its Applications; Walter de Gruyter: Berlin, Germany; New York, NY, USA, 1994.