In a previous paper, we discussed the mutual information of two random variables and how it can be obtained from entropies. There we considered the Shannon entropy and the nonadditive Tsallis entropy. Here, following the same approach used in the Tsallis case, we propose a method for discussing the mutual information associated with another nonadditive entropy, the Kaniadakis entropy.
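The relation the abstract refers to, for the Shannon case, is I(X;Y) = H(X) + H(Y) − H(X,Y). A minimal sketch of this computation, together with the Kaniadakis κ-entropy S_κ = −Σ p ln_κ(p) with ln_κ(x) = (x^κ − x^(−κ))/(2κ), might look as follows; the function names and the test distributions are illustrative, not taken from the paper:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in nats) of a probability array; zero entries are ignored."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(pxy):
    """Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint distribution."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1)   # marginal of X
    py = pxy.sum(axis=0)   # marginal of Y
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(pxy)

def kaniadakis_entropy(p, kappa):
    """Kaniadakis entropy S_kappa = -sum p * ln_kappa(p),
    with ln_kappa(x) = (x**kappa - x**-kappa) / (2*kappa)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    ln_kappa = (p**kappa - p**(-kappa)) / (2.0 * kappa)
    return -np.sum(p * ln_kappa)

# Independent variables: mutual information vanishes.
pxy_indep = np.outer([0.5, 0.5], [0.25, 0.75])
print(mutual_information(pxy_indep))          # ~ 0.0

# Perfectly correlated variables: I(X;Y) = H(X) = log 2.
pxy_corr = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(pxy_corr))           # ~ 0.693

# As kappa -> 0, the Kaniadakis entropy recovers the Shannon entropy.
print(kaniadakis_entropy([0.5, 0.5], 1e-6))   # ~ 0.693
```

The small-κ check illustrates the limit ln_κ(x) → ln(x), which is why the Shannon case is recovered in that limit.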
Keywords: Mutual Information, Entropy, Tsallis Entropy, Kaniadakis Entropy, Generalized Additivity, Image Registration
- A.C. Sparavigna (2015). Mutual Information and Nonadditive Entropies: The Case of Tsallis Entropy, Int. J. Sci., in print.
- C. Tsallis (1988). Possible Generalization of Boltzmann-Gibbs Statistics, Journal of Statistical Physics 52:479. DOI:10.1007/BF01016429
- G. Kaniadakis (2002). Statistical mechanics in the context of special relativity, Phys. Rev. E 66: 056125. DOI: 10.1103/physreve.66.056125
- G. Kaniadakis (2013). Theoretical foundations and mathematical formalism of the power-law tailed statistical distributions, Entropy 15:3983. DOI: 10.3390/e15103983
- A.C. Sparavigna (2015). On the Generalized Additivity of Kaniadakis Entropy, Int. J. Sci. 4(2):44. DOI: 10.18483/ijSci.627
- A. Rényi (1960). On Measures of Information and Entropy, Proceedings of the Fourth Berkeley Symposium on Mathematics, Statistics and Probability, pp. 547–561.
- S. Furuichi (2006). Information Theoretical Properties of Tsallis Entropies, J. Math. Phys. 47:023302. DOI:10.1063/1.2165744
- S. Abe, A.K. Rajagopal (2000). Nonadditive Conditional Entropy and its Significance for Local Realism, arXiv:quant-ph/0001085, 24 Jan 2000.
- A.C. Sparavigna (2015). Conditional Kaniadakis Entropy: a Preliminary Discussion. PHILICA.COM Article number 524.
International Journal of Sciences is an Open Access journal.
This article is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) License.
The author(s) retain the copyright of this article; publication rights are with Alkhaer Publications.