A Comparative Study of Traditional and Kullback-Leibler Divergence of Survival Functions Estimators for the Parameter of Lindley Distribution

  • Sultan Parveen Aliah University
  • Sanjay Kumar Singh Banaras Hindu University
  • Umesh Singh Banaras Hindu University
  • Dinesh Kumar Banaras Hindu University

Abstract

A new point estimation method based on the Kullback-Leibler divergence of survival functions (KLS), which measures the distance between the empirical and a prescribed survival function, is used to estimate the parameter of the Lindley distribution. Simulation studies have been carried out to compare the performance of the proposed estimator with the corresponding least squares (LS), maximum likelihood (ML) and maximum product of spacings (MPS) estimators.
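For illustration only, the following is a minimal sketch of how a KLS-type estimate of the Lindley parameter can be computed. It assumes the cumulative-residual form of the divergence, KLS = integral of [ Fbar(x) log( Fbar(x) / S(x; theta) ) - ( Fbar(x) - S(x; theta) ) ] dx, taken between the empirical survival function Fbar and the Lindley survival function S(x; theta) = (1 + theta x / (1 + theta)) exp(-theta x). The function names, integration cutoff and search interval below are illustrative choices, not the authors' implementation.

    import numpy as np
    from scipy.integrate import quad
    from scipy.optimize import minimize_scalar

    def lindley_survival(x, theta):
        # Lindley survival function: S(x; theta) = (1 + theta*x/(1+theta)) * exp(-theta*x)
        return (1.0 + theta * x / (1.0 + theta)) * np.exp(-theta * x)

    def kls_divergence(theta, sample, upper):
        # Cumulative-residual KL divergence between the empirical survival
        # function of `sample` and the Lindley survival function (assumed form).
        def integrand(x):
            fbar = np.mean(sample > x)                        # empirical survival at x
            sbar = max(lindley_survival(x, theta), 1e-300)    # guard against underflow
            log_term = fbar * np.log(fbar / sbar) if fbar > 0.0 else 0.0
            return log_term - (fbar - sbar)
        value, _ = quad(integrand, 0.0, upper, limit=200)
        return value

    def kls_estimate(sample, theta_max=50.0):
        # Point estimate of theta obtained by minimising the KLS divergence.
        upper = 5.0 * float(np.max(sample))                   # effective upper integration limit
        result = minimize_scalar(kls_divergence, bounds=(1e-4, theta_max),
                                 args=(sample, upper), method="bounded")
        return result.x

    def lindley_sample(theta, size, rng):
        # Lindley(theta) variates via the mixture representation:
        # Exp(theta) with weight theta/(1+theta), Gamma(2, theta) otherwise.
        use_exp = rng.random(size) < theta / (1.0 + theta)
        return np.where(use_exp,
                        rng.exponential(1.0 / theta, size),
                        rng.gamma(2.0, 1.0 / theta, size))

    # Illustrative use: recover theta from a simulated Lindley sample
    rng = np.random.default_rng(2019)
    data = lindley_sample(theta=1.5, size=100, rng=rng)
    print(kls_estimate(data))

The Lindley variates used in the check are generated through the standard mixture representation of the distribution (an exponential and a gamma component); the optimisation and integration details in the paper's simulation study may differ.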

Author Biographies

Sultan Parveen, Aliah University

Assistant Professor

Department of Statistics & Informatics.

Sanjay Kumar Singh, Banaras Hindu University

Professor

Department of Statistics.

Umesh Singh, Banaras Hindu University

Professor

Department of Statistics.

Dinesh Kumar, Banaras Hindu University

Assistant Professor

Department of Statistics.

References

Rafail Abramov, Andrew Majda, and Richard Kleeman. Information theory and predictability for low-frequency variability. Journal of the Atmospheric Sciences, 62(1):65-87, 2005.

Janos Aczel and Zoltan Daroczy. On measures of information and their characterizations. Academic Press, New York, 1975.

Thomas M Cover and Joy A Thomas. Elements of information theory, 2nd edition. Wiley-Interscience, 2006.

Inderjit S Dhillon, Subramanyam Mallela, and Rahul Kumar. A divisive information-theoretic feature clustering algorithm for text classification. Journal of Machine Learning Research, 3(Mar):1265-1287, 2003.

B Forte and W Hughes. The maximum entropy principle: a tool to define new entropies. Reports on Mathematical Physics, 26(2):227-235, 1988.

ME Ghitany, DK Al-Mutairi, and SM Aboukhamseen. Estimation of the reliability of a stress-strength system from power Lindley distributions. Communications in Statistics-Simulation and Computation, 44(1):118-136, 2015.

ME Ghitany, F Alqallaf, DK Al-Mutairi, and HA Husain. A two-parameter weighted Lindley distribution and its applications to survival data. Mathematics and Computers in Simulation, 81(6):1190-1201, 2011.

ME Ghitany, B Atieh, and S Nadarajah. Lindley distribution and its application. Mathematics and Computers in Simulation, 78(4):493-506, 2008.

Young Kyung Lee and Byeong U Park. Estimation of Kullback-Leibler divergence by local likelihood. Annals of the Institute of Statistical Mathematics, 58(2):327-340, 2006.

Dennis V Lindley. Fiducial distributions and Bayes' theorem. Journal of the Royal Statistical Society. Series B (Methodological), pages 102-107, 1958.

Bruce G Lindsay. Efficiency versus robustness: the case for minimum Hellinger distance and related methods. The Annals of Statistics, pages 1081-1114, 1994.

Juan Liu. Information theoretic content and probability. PhD thesis, 2007.

Pedro J Moreno, Purdy P Ho, and Nuno Vasconcelos. A Kullback-Leibler divergence based kernel for SVM classification in multimedia applications. In Advances in Neural Information Processing Systems, 2003.

Fernando Perez-Cruz. Kullback-Leibler divergence estimation of continuous distributions. In 2008 IEEE International Symposium on Information Theory, pages 1666-1670. IEEE, 2008.

Murali Rao, Yunmei Chen, Baba C Vemuri, and Fei Wang. Cumulative residual entropy: a new measure of information. IEEE Transactions on Information Theory, 50(6):1220-1228, 2004.

Umesh Singh, Sanjay Kumar Singh, and Rajwant Kumar Singh. Product spacings as an alternative to likelihood for Bayesian inferences. Journal of Statistics Applications & Probability, 3(2):179, 2014.

Qing Wang, Sanjeev R Kulkarni, and Sergio Verdu. Divergence estimation of continuous distributions based on data-dependent partitions. IEEE Transactions on Information Theory, 51(9):3064-3074, 2005.

Qing Wang, Sanjeev R Kulkarni, and Sergio Verdu. A nearest-neighbor approach to estimating divergence between continuous random vectors. In 2006 IEEE International Symposium on Information Theory. IEEE, 2006.

Gholamhossein Yari, Alireza Mirhabibi, and Abolfazl Saghafi. Estimation of the Weibull parameters by Kullback-Leibler divergence of survival functions. Appl. Math., 7(1):187-192, 2013.

Published
2019-07-30
How to Cite
Parveen, S., Singh, S. K., Singh, U., & Kumar, D. (2019). A Comparative Study of Traditional and Kullback-Leibler Divergence of Survival Functions Estimators for the Parameter of Lindley Distribution. Austrian Journal of Statistics, 48(5), 45-53. https://doi.org/10.17713/ajs.v48i5.772