What is it about?
Publication bias - or the "file drawer problem" - is the tendency to preferentially publish research with statistically significant findings. When findings are pooled across studies testing the same hypothesis (a process known as meta-analysis), publication bias distorts the resulting estimates. We reanalyzed over four hundred published meta-analytic datasets to see how the results would change after applying various methods that adjust for publication bias. The typical adjustment in our sample was minimal and would rarely change the conclusion of a meta-analysis. We argue that this minimal change should not be taken as evidence of low levels of publication bias in psychology, and we encourage researchers to use adjustment methods to explore a range of plausible estimates for the effect under study.
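To make the idea of a publication bias adjustment concrete, here is a minimal sketch of one well-known approach, the PET (precision-effect test) regression estimator: effect sizes are regressed on their standard errors, and the intercept serves as the bias-adjusted estimate. This is an illustration of the general technique, not a reproduction of the paper's analysis, and the example data are invented for demonstration.

```python
import numpy as np

def fixed_effect_estimate(effects, ses):
    # Inverse-variance weighted mean: the standard unadjusted pooled estimate.
    w = 1.0 / ses**2
    return float(np.sum(w * effects) / np.sum(w))

def pet_adjusted_estimate(effects, ses):
    # PET: weighted least-squares regression of effect sizes on their
    # standard errors; the intercept is the bias-adjusted estimate
    # (the predicted effect for a hypothetical study with zero error).
    X = np.column_stack([np.ones_like(ses), ses])
    w = 1.0 / ses**2
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * effects))
    return float(beta[0])

# Invented data: smaller (noisier) studies report larger effects, the
# funnel-plot asymmetry that publication bias typically produces.
effects = np.array([0.5, 0.4, 0.3, 0.2])
ses = np.array([0.4, 0.3, 0.2, 0.1])

naive = fixed_effect_estimate(effects, ses)
adjusted = pet_adjusted_estimate(effects, ses)
```

When small, noisy studies show inflated effects, the adjusted estimate falls below the naive pooled estimate; how far it falls, across many real datasets, is what the paper quantifies.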
Why is it important?
By understanding how the better-known adjustments for publication bias perform, and when they fall short, researchers are better placed to correct for publication bias and to judge how other methods (e.g., robust Bayesian meta-analysis) might help. We also discuss how broader systemic changes - such as the adoption of preregistration and registered reports - are more effective at combating publication bias than post-hoc adjustments.
Read the Original
This page is a summary of: "Estimating the change in meta-analytic effect size estimates after the application of publication bias adjustment methods", Psychological Methods, April 2022, American Psychological Association (APA). DOI: 10.1037/met0000470.