What is it about?

We consider an elementary probability problem: given three events, A, B, and C, how can knowledge of A and B be used to predict C? We find a condition that guarantees predictability based on minimal statistical information. The results are useful for analysis of a wide variety of data types.


Why is it important?

There is a well-known mathematical condition called "conditional independence". When it is satisfied, knowledge of event C simplifies the relationship between events A and B, making their joint occurrence easy to predict. Our main result reveals a symmetry: when conditional independence is satisfied, knowledge of A and B also makes C predictable from limited statistical information. This finding has important practical applications in statistical analysis, which we illustrate with four realistic examples.
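To make the symmetry concrete, here is a minimal sketch of the textbook combination rule that conditional independence licenses: when A and B are conditionally independent given C (and given not-C), the prediction P(C | A and B) can be assembled from just P(C|A), P(C|B), and the base rate P(C). This is an illustration of the general principle, not the paper's own code, and all numbers are made up.

```python
def predict_c(p_c_given_a, p_c_given_b, p_c):
    """Predict P(C | A and B) from P(C|A), P(C|B), and the base rate P(C),
    assuming A and B are conditionally independent given C and given not-C:
        P(C | A, B)  is proportional to  P(C|A) * P(C|B) / P(C)
    """
    pro = p_c_given_a * p_c_given_b / p_c                    # evidence for C
    con = (1 - p_c_given_a) * (1 - p_c_given_b) / (1 - p_c)  # evidence for not-C
    return pro / (pro + con)

# Check against a joint distribution built to satisfy conditional independence.
# Hypothetical parameters: base rate P(C) and per-cue likelihoods
# P(A|C), P(A|not-C), P(B|C), P(B|not-C).
p_c, p_a_c, p_a_nc, p_b_c, p_b_nc = 0.3, 0.8, 0.2, 0.7, 0.4

# Direct answer from the full joint: P(C|A,B) = P(A,B,C) / P(A,B)
direct = (p_c * p_a_c * p_b_c) / (
    p_c * p_a_c * p_b_c + (1 - p_c) * p_a_nc * p_b_nc
)

# Same answer from the single-cue conditionals, obtained via Bayes' rule
p_c_given_a = p_a_c * p_c / (p_a_c * p_c + p_a_nc * (1 - p_c))
p_c_given_b = p_b_c * p_c / (p_b_c * p_c + p_b_nc * (1 - p_c))

print(direct, predict_c(p_c_given_a, p_c_given_b, p_c))  # both ~0.75
```

The check works because the joint was constructed so that conditional independence holds exactly; with real data, the paper's point is to assess whether this condition is a good enough approximation to justify the shortcut.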

Perspectives

The results in this paper solved a nagging problem we had in the lab. We studied the performance of human participants when localizing either a sound or a faint visual stimulus, and wanted to know what to expect when they were presented with both stimuli simultaneously. We wanted a prediction based only on their unisensory performance curves and without making any mechanistic assumptions. The search for a solution led us to a general finding with wide applicability.

Emilio Salinas
Wake Forest University School of Medicine

Read the Original

This page is a summary of: Conditional independence as a statistical assessment of evidence integration processes, PLoS ONE, May 2024, PLOS. DOI: 10.1371/journal.pone.0297792.
