Fisher's information measures and truncated normal distributions (II)
DOI: https://doi.org/10.33993/jnaat322-746

Keywords: Fisher information, truncated distribution

Abstract
The aim of this paper is to give some properties of the Fisher information measure when a random variable \(X\) follows a truncated probability distribution. A truncated probability distribution can be regarded as a conditional probability distribution, in the sense that if \(X\) has an unrestricted distribution with probability density function \(f(x)\), then \(f_{a\leftrightarrow b}(x)\) is the probability density function which governs the behavior of \(X\) subject to the condition that \(X\) is known to lie in \([a,b]\).
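For concreteness, a minimal statement of this conditioning (using the standard definition of a truncated density, with \(F\) denoting the distribution function associated with \(f\) and assuming \(F(b)>F(a)\)) is
\[
f_{a\leftrightarrow b}(x)=\frac{f(x)}{F(b)-F(a)}=\frac{f(x)}{\int_a^b f(t)\,dt},\qquad a\le x\le b,
\]
and \(f_{a\leftrightarrow b}(x)=0\) outside \([a,b]\).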
References
Kullback, S., Information Theory and Statistics, Wiley, New York, 1959.
Mihoc, I. and Fătu, C. I., Fisher's Information Measures for the Truncated Normal Distribution (I), in: Analysis, Functional Equations, Approximation and Convexity, Proceedings of the Conference Held in Honour of Professor E. Popoviciu on the Occasion of Her 75th Birthday, Cluj-Napoca, October 15-16, 1999, Editura Carpatica, pp. 171-182, 1999.
Rao, C. R., Linear Statistical Inference and Its Applications, John Wiley and Sons, Inc., New York, 1965.
Rényi, A., Some fundamental questions of information theory, MTA III, Oszt. Közl., 10, pp. 251-282, 1960.
Rényi, A., Probability Theory, Akadémiai Kiadó, Budapest, 1970.
Copyright (c) 2015 Journal of Numerical Analysis and Approximation Theory
Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.