What is it about?

De Bruijn's identity relates two important concepts in information theory: Fisher information and differential entropy. Unlike much of the literature, this paper considers general additive non-Gaussian noise channels in which, more realistically, the input signal and the additive noise need not be independent. It is shown that, for dependent signal and noise in general, the first derivative of the differential entropy is directly related to the conditional mean estimate of the input. Then, using Gaussian and Farlie–Gumbel–Morgenstern copulas, special versions of the result are derived for additive normally distributed noise. The classical result for independent Gaussian noise channels is recovered as a special case. Illustrative examples are also provided.
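To see the classical identity that the paper extends, here is a minimal numerical sketch (not from the paper itself) for the independent Gaussian case: with X ~ N(0, s2) and Z ~ N(0, 1) independent, the output Y_t = X + √t·Z has differential entropy h(Y_t) = ½·ln(2πe(s2 + t)) and Fisher information J(Y_t) = 1/(s2 + t), and de Bruijn's identity states dh(Y_t)/dt = ½·J(Y_t). The script checks this with a finite difference; the variable names are illustrative only.

```python
import math

def diff_entropy(var):
    # Differential entropy of a Gaussian with variance `var`:
    # h = 0.5 * ln(2 * pi * e * var)
    return 0.5 * math.log(2 * math.pi * math.e * var)

def fisher_info(var):
    # Fisher information of a Gaussian with variance `var`: J = 1 / var
    return 1.0 / var

s2, t, eps = 1.0, 0.5, 1e-6  # input variance, noise scale, step size

# Left side: numerical derivative of h(Y_t) with respect to t
lhs = (diff_entropy(s2 + t + eps) - diff_entropy(s2 + t - eps)) / (2 * eps)

# Right side: one half of the Fisher information of Y_t
rhs = 0.5 * fisher_info(s2 + t)

print(abs(lhs - rhs) < 1e-8)
```

The check holds only because X and Z are independent here; the paper's contribution is the corresponding statement when this independence assumption is dropped.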


This page is a summary of: Extension of de Bruijn's identity to dependent non-Gaussian noise channels, Journal of Applied Probability, June 2016, Cambridge University Press, DOI: 10.1017/jpr.2016.5.