Generalization of Discrete Chi-Squared Information Measure and Its Application in Image Processing
Omid Kharazmi*, Faezeh Shirazi-Niya
Abstract:
In this paper, discrete versions of the generalized chi-squared information and relative generalized chi-squared information measures are introduced. Generalizations of these quantities based on their convexity property are then presented, and some essential properties of the new measures and their interrelationships are studied. Moreover, the performance of the new measures is investigated for some well-known and widely used models in coding theory and thermodynamics, such as escort distributions and generalized escort distributions. Finally, two applications of the introduced discrete generalized chi-squared information measure are examined in the context of image quality assessment, and the results are compared with the widely used peak signal-to-noise ratio (PSNR) metric. It is shown that the generalized chi-squared divergence measure performs similarly to PSNR and can serve as an alternative metric.
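As a concrete illustration of the quantities discussed above, the sketch below computes the classical discrete chi-squared divergence, chi^2(P, Q) = sum_i (p_i - q_i)^2 / q_i, between the gray-level histograms of a reference image and a distorted copy, together with the PSNR used as the comparison metric, and also forms the escort distribution P_i = p_i^alpha / sum_j p_j^alpha mentioned in the abstract. This is a minimal sketch in Python assuming 8-bit grayscale images; it uses only the classical definitions and a synthetic toy image, and does not reproduce the paper's generalized measures.

import numpy as np

def chi_squared_divergence(p, q, eps=1e-12):
    # Classical chi^2(P, Q) = sum_i (p_i - q_i)^2 / q_i over the common support.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float) + eps  # guard against empty histogram bins
    return np.sum((p - q) ** 2 / q)

def escort(p, alpha, eps=1e-12):
    # Escort distribution P_i = p_i^alpha / sum_j p_j^alpha (Tsallis statistics).
    w = (np.asarray(p, dtype=float) + eps) ** alpha
    return w / w.sum()

def psnr(ref, dist, peak=255.0):
    # Peak signal-to-noise ratio for 8-bit images, in dB.
    mse = np.mean((ref.astype(float) - dist.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def gray_histogram(img, bins=256):
    # Normalized gray-level histogram, treated as a discrete distribution.
    h, _ = np.histogram(img, bins=bins, range=(0, 256))
    return h / h.sum()

# Toy example: a synthetic 8-bit image and a noisy copy (Gaussian noise).
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(128, 128))
dist = np.clip(ref + rng.normal(0, 10, size=ref.shape), 0, 255).astype(int)

p, q = gray_histogram(ref), gray_histogram(dist)
print("chi^2 divergence :", chi_squared_divergence(p, q))
print("chi^2 on escorts :", chi_squared_divergence(escort(p, 2.0), escort(q, 2.0)))  # alpha=2.0 is an arbitrary illustrative choice
print("PSNR (dB)        :", psnr(ref, dist))

Increasing the noise level in the toy example should drive the divergence up and the PSNR down; comparing the two responses across distortion levels is the kind of experiment the paper's image-quality application describes.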
Keywords: Chi-squared divergence measure, Convex function, Jensen's inequality, Escort distribution.
Full-Text [PDF 1672 kb]
Type of Study: Research | Subject: Probability and Applications
Received: 2025/03/05 | Accepted: 2025/04/30
Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
Journal of Statistical Sciences – Scientific-Research Journal of the Iranian Statistical Society