Jensen-Fisher and Jensen-χ2 information measures for finite mixture distributions

Document Type: Original Paper


1 Department of Statistics, Faculty of Mathematical Sciences, Vali-e-Asr University of Rafsanjan, Iran

2 Department of Statistics, Faculty of Intelligent Systems Engineering and Data Science, Persian Gulf University, Bushehr, Iran


In this paper, starting from Fisher information of the parametric type, we first introduce a new information measure based on the Jensen inequality. Then, the Fisher information matrix and the Jensen-Fisher information measure are studied for a finite mixture of probability density functions. Further, another information criterion, Jensen-χ2, is introduced based on a mixture of probability density functions. Generalizations of the Jensen-Fisher and Jensen-χ2 information measures to m probability density functions are presented. We also study the relationship between these two new information criteria, as well as the relationships between Jensen-Fisher information and some known information criteria such as the Jensen-Shannon and Jeffreys information measures.
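The construction described above can be illustrated numerically. As a minimal sketch (not the authors' implementation), assume the nonparametric (shift) Fisher information I(f) = ∫ f'(x)²/f(x) dx and the Jensen-type divergence JF(f₁, f₂) = w·I(f₁) + (1−w)·I(f₂) − I(w·f₁ + (1−w)·f₂), which is nonnegative by the convexity of I and the Jensen inequality; the grid, the test densities, and the function names below are illustrative choices, not from the paper.

```python
import numpy as np

def fisher_information(f_vals, x):
    # Nonparametric (shift) Fisher information I(f) = ∫ f'(x)^2 / f(x) dx,
    # approximated with central differences and the trapezoid rule.
    df = np.gradient(f_vals, x)
    integrand = np.where(f_vals > 1e-300, df**2 / f_vals, 0.0)
    return np.trapz(integrand, x)

def jensen_fisher(f1_vals, f2_vals, x, w=0.5):
    # Jensen-Fisher divergence of a two-component mixture:
    # JF = w*I(f1) + (1-w)*I(f2) - I(w*f1 + (1-w)*f2) >= 0
    # because the Fisher information functional is convex (Jensen inequality).
    mix = w * f1_vals + (1 - w) * f2_vals
    return (w * fisher_information(f1_vals, x)
            + (1 - w) * fisher_information(f2_vals, x)
            - fisher_information(mix, x))

# Two normal densities evaluated on a grid (illustrative example).
x = np.linspace(-12.0, 12.0, 4001)
normal = lambda m, s: np.exp(-(x - m)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
f1, f2 = normal(0.0, 1.0), normal(2.0, 1.5)

print(fisher_information(f1, x))  # ≈ 1.0, since I(N(0, σ²)) = 1/σ²
print(jensen_fisher(f1, f2, x))   # strictly positive, since f1 ≠ f2
```

The same mixture-and-convexity template yields the Jensen-χ2 measure by replacing the Fisher information functional with a χ2-type functional, and Jensen-Shannon by replacing it with Shannon entropy (with the sign of the Jensen gap adjusted for concavity).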


