What is it about?

Entropic priors, recently revisited within the context of theoretical physics, were originally introduced for image processing and for general statistical inference. They appear to be a very promising approach to objective prior determination when prior information is not available. Attention has so far been mostly limited to continuous parameter spaces; our focus in this work is on applying the entropic prior idea to Bayesian inference with discrete classes in signal processing problems. Unfortunately, it is well known that entropic priors, when applied to sequences, may lead to excessive spreading of the entropy as the number of samples grows. In this paper we show that this spreading of the entropy can be tolerated as long as the posterior probabilities remain consistent. We derive a condition for posterior consistency based on conditional entropies and Kullback-Leibler divergences, using the Asymptotic Equipartition Property (AEP). Furthermore, we show that entropic priors can be modified to enforce posterior consistency by adding a constraint to the joint entropy maximization. Simulations applying entropic priors to a coin-flipping experiment are included.
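
To make the coin-flipping setting concrete, here is a minimal Python sketch, not the paper's code. It assumes the discrete entropic prior takes the standard joint-entropy-maximizing form p(c) proportional to exp(H(X^n | c)), with two illustrative coin classes (a fair and a biased coin) chosen here for demonstration only. It shows how the prior increasingly favors the higher-entropy class as the number of flips grows (the "entropy spreading" issue mentioned above), and how the data likelihood then pulls the posterior back toward the true class; the paper's consistency-constrained modification is not reproduced here.

```python
import numpy as np

def binary_entropy(theta):
    """Entropy (in nats) of a single coin flip with bias theta."""
    return -(theta * np.log(theta) + (1 - theta) * np.log(1 - theta))

def entropic_prior(thetas, n_flips):
    """Entropic prior over coin classes for n_flips i.i.d. flips.

    Assumption: p(c) is proportional to exp(H(X^n | c)), the form obtained by
    maximizing the joint entropy H(C, X^n); for i.i.d. flips
    H(X^n | c) = n_flips * h(theta_c).
    """
    log_prior = n_flips * np.array([binary_entropy(t) for t in thetas])
    log_prior -= log_prior.max()        # stabilize before exponentiating
    prior = np.exp(log_prior)
    return prior / prior.sum()

def posterior(thetas, flips, prior):
    """Posterior over coin classes given an observed 0/1 flip sequence."""
    flips = np.asarray(flips)
    k, n = flips.sum(), flips.size
    log_lik = k * np.log(thetas) + (n - k) * np.log(1 - thetas)
    log_post = np.log(prior) + log_lik
    log_post -= log_post.max()
    post = np.exp(log_post)
    return post / post.sum()

# Two hypothetical coin classes: fair (theta = 0.5) and biased (theta = 0.8).
thetas = np.array([0.5, 0.8])

# The prior drifts toward the higher-entropy (fair) class as n grows.
for n in (1, 10, 100):
    print(n, entropic_prior(thetas, n))

# Posterior for a short sequence actually drawn from the biased coin.
rng = np.random.default_rng(0)
flips = (rng.random(20) < 0.8).astype(int)
print(posterior(thetas, flips, entropic_prior(thetas, flips.size)))
```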


Read the Original

This page is a summary of: Consistency of sequence classification with entropic priors, January 2012, American Institute of Physics. DOI: 10.1063/1.3703652.
