What is it about?

The generalized information theory is a natural extension of Shannon's information theory, and it is also a semantic information theory. The semantic information measure is defined with the log normalized likelihood and reflects Popper's thought on testing. Assessments and optimizations of pattern recognition, predictions, and detection under the generalized information criterion are discussed. Shannon's rate-distortion function R(D) is reformed into the R(G) function, which can be used for data compression and channel's matching. Applications include semantic communication and statistical learning.
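The idea of measuring semantic information by a log normalized likelihood can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the Gaussian truth function, the prior, and all numbers below are assumptions chosen for demonstration. The semantic information of a prediction about x is taken as log(T(theta|x) / T(theta)), where T(theta|x) is a truth (membership) function and T(theta) is its average under the prior, so the measure is positive where the prediction holds and negative where it fails.

```python
import math

# Discretized universe of x and an illustrative prior P(x) (unit Gaussian shape).
xs = [i * 0.05 - 5.0 for i in range(201)]
weights = [math.exp(-0.5 * v * v) for v in xs]
total = sum(weights)
p_x = [w / total for w in weights]

def truth(v, mu=1.0, sigma=0.8):
    """Illustrative Gaussian truth function T(theta|x) for 'x is about mu'."""
    return math.exp(-0.5 * ((v - mu) / sigma) ** 2)

t = [truth(v) for v in xs]

# T(theta): prior average of the truth values (the "normalization" of the likelihood).
t_prior = sum(p * tv for p, tv in zip(p_x, t))

# Semantic information in nats: log normalized likelihood at each x.
semantic_info = [math.log(tv / t_prior) for tv in t]

# Positive near mu (prediction true), negative far from mu (prediction fails),
# echoing Popper: a prediction that survives a severe test carries more information.
print(semantic_info[120] > 0)  # x = 1.0, at mu
print(semantic_info[40] < 0)   # x = -3.0, far from mu
```

The key design point is the normalization: dividing the truth value by its prior average rewards bold, precise predictions (small sigma) when they succeed and penalizes them when they fail.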

Why is it important?

It bridges Shannon's information theory and likelihood method.

Perspectives

This theory has been applied to statistical learning, yielding a new iterative algorithm, the Channel's Matching (CM) algorithm, for maximum likelihood tests, estimations, and mixture models.

Professor Chenguang Lu
Retired

Read the Original

This page is a summary of: A GENERALIZATION OF SHANNON'S INFORMATION THEORY, International Journal of General Systems, December 1999, Taylor & Francis,
DOI: 10.1080/03081079908935247.
