In the present communication, an information-theoretic dependence measure is defined using the maximum entropy principle; it quantifies the amount of dependence among the attributes in a contingency table. A relation between this information-theoretic measure of dependence and the chi-square statistic is discussed, and a generalization of the dependence measure is also studied. Finally, Yates' method and maximum entropy estimation of missing data in the design of experiments are described and illustrated on practical problems with empirical data.
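The two quantitative ideas summarized above can be sketched in code. The snippet below is an illustrative sketch with made-up counts, not the authors' implementation: it computes the mutual-information dependence measure for a contingency table, compares it with the Pearson chi-square statistic via the standard weak-dependence approximation chi-square ≈ 2NI (with I in nats), and applies Yates' classical formula for a single missing plot value in a randomized block design.

```python
import numpy as np

# Hypothetical 2x3 contingency table of observed counts (made-up data).
table = np.array([[30.0, 20.0, 10.0],
                  [15.0, 25.0, 20.0]])
N = table.sum()

# Joint and marginal probability estimates.
p = table / N
p_row = p.sum(axis=1, keepdims=True)   # row marginals, shape (2, 1)
p_col = p.sum(axis=0, keepdims=True)   # column marginals, shape (1, 3)
indep = p_row @ p_col                  # independence model: product of marginals

# Mutual information I (nats): an information-theoretic dependence measure;
# I = 0 exactly when the row and column attributes are independent.
mask = p > 0
I = float(np.sum(p[mask] * np.log(p[mask] / indep[mask])))

# Pearson chi-square statistic for the same table.
expected = N * indep
chi2 = float(np.sum((table - expected) ** 2 / expected))

# Second-order Taylor expansion of I around independence gives
# chi-square ~ 2 * N * I for weak dependence.
print(I, chi2, 2 * N * I)

def yates_missing_value(block_total, treatment_total, grand_total, b, t):
    """Yates' classical estimate of one missing plot value in a randomized
    block design with b blocks and t treatments; the three totals are sums
    over the observed values only."""
    return (b * block_total + t * treatment_total - grand_total) / ((b - 1) * (t - 1))

# Hypothetical design: 4 blocks, 3 treatments, one missing plot.
x = yates_missing_value(block_total=20.0, treatment_total=30.0,
                        grand_total=100.0, b=4, t=3)
print(x)
```

For this table the two statistics agree to within a few percent, which is the sense in which the information-theoretic measure and chi-square are related; the maximum-entropy estimate discussed in the paper generalizes the Yates formula beyond a single missing observation.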
Published in | American Journal of Theoretical and Applied Statistics (Volume 2, Issue 2) |
DOI | 10.11648/j.ajtas.20130202.12 |
Page(s) | 15-20 |
Creative Commons | This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited. |
Copyright | Copyright © The Author(s), 2013. Published by Science Publishing Group |
Maximum Entropy Principle, Contingency Table, Chi-Square Statistic, Lagrange Multipliers, Dependence Measure
[1] | Burg, J.P. (1970). "The relationship between maximum entropy spectra and maximum likelihood", in Modern Spectra Analysis, ed. D.G. Childers, pp. 130-131. |
[2] | Havrda, J. and Charvát, F. (1967). "Quantification method of classification processes: concept of structural a-entropy", Kybernetika, 3: 30-35. |
[3] | Kapur, J.N. and Kesavan, H.K. (1992). Entropy Optimization Principles with Applications. Academic Press, San Diego. |
[4] | Soofi, E.S. and Gokhale, D.V. (1997). "Information theoretic methods for categorical data", in Advances in Econometrics, JAI Press, Greenwich. |
[5] | Watanabe, S. (1969). Knowing and Guessing. John Wiley, New York. |
[6] | Watanabe, S. (1981). "Pattern recognition as a quest for minimum entropy", Pattern Recognition, 13: 381-387. |
[7] | Yates, F. (1933). "The analysis of replicated experiments when the field results are incomplete", Emp. Journ. Exp. Agri., 1: 129-142. |
APA Style
D. S. Hooda, Permil Kumar. (2013). Information Theoretic Models for Dependence Analysis And missing Data Estimation. American Journal of Theoretical and Applied Statistics, 2(2), 15-20. https://doi.org/10.11648/j.ajtas.20130202.12
ACS Style
D. S. Hooda; Permil Kumar. Information Theoretic Models for Dependence Analysis And missing Data Estimation. Am. J. Theor. Appl. Stat. 2013, 2(2), 15-20. doi: 10.11648/j.ajtas.20130202.12
AMA Style
D. S. Hooda, Permil Kumar. Information Theoretic Models for Dependence Analysis And missing Data Estimation. Am J Theor Appl Stat. 2013;2(2):15-20. doi: 10.11648/j.ajtas.20130202.12
@article{10.11648/j.ajtas.20130202.12,
  author  = {D. S. Hooda and Permil Kumar},
  title   = {Information Theoretic Models for Dependence Analysis And missing Data Estimation},
  journal = {American Journal of Theoretical and Applied Statistics},
  volume  = {2},
  number  = {2},
  pages   = {15-20},
  doi     = {10.11648/j.ajtas.20130202.12},
  url     = {https://doi.org/10.11648/j.ajtas.20130202.12},
  eprint  = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ajtas.20130202.12},
  abstract = {In the present communication information theoretic dependence measure has been defined using maximum entropy principle, which measures amount of dependence among the attributes in a contingency table. A relation between information theoretic measure of dependence and Chi-square statistic has been discussed. A generalization of this information theoretic dependence measure has been also studied. In the end Yate's method and maximum entropy estimation of missing data in design of experiment have been described and illustrated by considering practical problems with empirical data.},
  year    = {2013}
}
TY  - JOUR
T1  - Information Theoretic Models for Dependence Analysis And missing Data Estimation
AU  - D. S. Hooda
AU  - Permil Kumar
Y1  - 2013/03/10
PY  - 2013
N1  - https://doi.org/10.11648/j.ajtas.20130202.12
DO  - 10.11648/j.ajtas.20130202.12
T2  - American Journal of Theoretical and Applied Statistics
JF  - American Journal of Theoretical and Applied Statistics
JO  - American Journal of Theoretical and Applied Statistics
SP  - 15
EP  - 20
PB  - Science Publishing Group
SN  - 2326-9006
UR  - https://doi.org/10.11648/j.ajtas.20130202.12
AB  - In the present communication information theoretic dependence measure has been defined using maximum entropy principle, which measures amount of dependence among the attributes in a contingency table. A relation between information theoretic measure of dependence and Chi-square statistic has been discussed. A generalization of this information theoretic dependence measure has been also studied. In the end Yate's method and maximum entropy estimation of missing data in design of experiment have been described and illustrated by considering practical problems with empirical data.
VL  - 2
IS  - 2
ER  -