New Information Inequalities in Terms of One Parametric Generalized Divergence Measure and Application
Authors
K. C. Jain
- Department of Mathematics, Malaviya National Institute of Technology, Jaipur (Rajasthan), India
P. Chhabra
- Department of Mathematics, Malaviya National Institute of Technology, Jaipur (Rajasthan), India
Abstract
In this work, we first introduce a new information divergence measure, characterize it, and establish its mathematical relations with other divergences. We then derive new information inequalities for the new generalized f-divergence measure in terms of the well-known one-parametric generalized divergence. As an application of these inequalities, we obtain bounds on the new divergence and the relative J-divergence using the logarithmic power mean and the identric mean, and verify them numerically for two discrete probability distributions, the Binomial and the Poisson. Finally, approximate relations of the new divergence and the relative J-divergence with the Chi-square divergence are obtained.
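The numerical verification described above can be illustrated with a short computation. The sketch below is not the paper's derivation: it only sets up the two distributions named in the abstract (a Binomial and a Poisson restricted to the same support) and evaluates the Chi-square divergence, sum_i (p_i - q_i)^2 / q_i, together with one common form of the relative J-divergence, sum_i (p_i - q_i) ln((p_i + q_i) / (2 q_i)). The paper's new divergence is not reproduced here, since its definition is not given in the abstract, and the parameters n = 20, p = 0.1 are illustrative only.

```python
# Minimal sketch: compare a Binomial(n, p) distribution with a Poisson(np)
# distribution truncated to the same support, and evaluate two classical
# divergences mentioned in the abstract. Not the paper's actual computation.
import math

def binomial_pmf(n, p):
    return [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def poisson_pmf(lam, n):
    # Poisson probabilities on {0, ..., n}, renormalized to sum to 1.
    raw = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(n + 1)]
    total = sum(raw)
    return [x / total for x in raw]

def chi_square_divergence(P, Q):
    # chi^2(P, Q) = sum_i (p_i - q_i)^2 / q_i
    return sum((p - q) ** 2 / q for p, q in zip(P, Q))

def relative_j_divergence(P, Q):
    # One common form: J_R(P, Q) = sum_i (p_i - q_i) * ln((p_i + q_i) / (2 q_i))
    return sum((p - q) * math.log((p + q) / (2 * q)) for p, q in zip(P, Q))

if __name__ == "__main__":
    n, p = 20, 0.1                  # illustrative parameters only
    P = binomial_pmf(n, p)          # Binomial(20, 0.1)
    Q = poisson_pmf(n * p, n)       # Poisson(2), truncated to {0, ..., 20}
    print("chi-square divergence :", chi_square_divergence(P, Q))
    print("relative J-divergence :", relative_j_divergence(P, Q))
```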
Share and Cite
ISRP Style
K. C. Jain, P. Chhabra, New Information Inequalities in Terms of One Parametric Generalized Divergence Measure and Application, Journal of Mathematics and Computer Science, 15 (2015), no. 1, 1-22
AMA Style
Jain K. C., Chhabra P., New Information Inequalities in Terms of One Parametric Generalized Divergence Measure and Application. J Math Comput SCI-JM. (2015); 15(1):1-22
Chicago/Turabian Style
Jain, K. C., Chhabra, P. "New Information Inequalities in Terms of One Parametric Generalized Divergence Measure and Application." Journal of Mathematics and Computer Science, 15, no. 1 (2015): 1-22
Keywords
- New divergence
- New information inequalities
- Parametric generalized divergence
- Bounds
- Logarithmic power mean
- Identric mean
- Binomial and Poisson distributions
- Asymptotic approximation
MSC