“Bayesian Meta-Analysis with Weakly Informative Prior Distributions”

Donny Williams sends along this paper, with Philippe Rast and Paul-Christian Bürkner, and writes:

This paper is similar to the Chung et al. papers on avoiding boundary estimates (here and here), but we use fully Bayesian methods, and specifically the half-Cauchy prior. We show that it performs as well as a fully informed prior based on tau values from psychology. Further, we consider the KL divergence between the estimated meta-analytic distribution and the “true” meta-analytic distribution. Here we show a striking advantage for the Bayesian models, an advantage that has not previously been shown in the context of meta-analysis.

Cool! I love to see our ideas making a difference.
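
For readers who want to see what this looks like in practice, here is a minimal sketch of a random-effects meta-analysis with a half-Cauchy prior on the between-study standard deviation tau. This is just an illustration in PyMC with made-up effect sizes, not the authors' code (they provide their own), and the prior scale of 0.3 is a placeholder rather than the paper's recommendation:

```python
# Sketch of a Bayesian random-effects meta-analysis with a half-Cauchy
# prior on the between-study sd, tau. Effect sizes and standard errors
# below are made up for illustration.
import numpy as np
import pymc as pm

y = np.array([0.30, 0.12, 0.45, 0.05, 0.21])   # hypothetical study effect sizes
se = np.array([0.15, 0.20, 0.18, 0.25, 0.10])  # hypothetical standard errors

with pm.Model() as meta:
    mu = pm.Normal("mu", mu=0.0, sigma=1.0)        # overall (population) effect
    tau = pm.HalfCauchy("tau", beta=0.3)           # weakly informative prior on between-study sd
    theta = pm.Normal("theta", mu=mu, sigma=tau, shape=len(y))  # study-level effects
    pm.Normal("y_obs", mu=theta, sigma=se, observed=y)          # observed effects
    trace = pm.sample(2000, tune=2000, target_accept=0.95)

print(trace.posterior["tau"].mean())  # posterior mean of tau
```

The point of the weakly informative prior is that the posterior for tau stays away from an estimate of exactly zero even when only a handful of studies are available.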

And here’s the abstract to the Williams, Rast, and Bürkner paper:

Developing meta-analytic methods is an important goal for psychological science. When there are few studies in particular, commonly used methods have several limitations, the most notable of which is underestimating between-study variability. Although Bayesian methods are often recommended for small sample situations, their performance has not been thoroughly examined in the context of meta-analysis. Here, we characterize and apply weakly-informative priors for estimating meta-analytic models and demonstrate with extensive simulations that fully Bayesian methods overcome boundary estimates of exactly zero between-study variance, better maintain error rates, and have lower frequentist risk according to Kullback-Leibler divergence. While our results show that combining evidence with few studies is non-trivial, we argue that this is an important goal that deserves further consideration in psychology. Further, we suggest that frequentist properties can provide important information for Bayesian modeling. We conclude with meta-analytic guidelines for applied researchers that can be implemented with the provided computer code.

I completely agree with this remark: “frequentist properties can provide important information for Bayesian modeling.”
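
To give a sense of the Kullback-Leibler criterion mentioned in the abstract: if the meta-analytic distribution of effects is taken to be normal, the divergence between an estimated and a true distribution has a closed form, and a between-study standard deviation estimated as exactly zero sends it to infinity. Here is a small sketch; the direction of the divergence and the numbers are my choices for illustration, not necessarily the paper's.

```python
# Closed-form KL divergence between two normal distributions,
# KL( N(mu_p, sd_p^2) || N(mu_q, sd_q^2) ).
import numpy as np

def kl_normal(mu_p, sd_p, mu_q, sd_q):
    return np.log(sd_q / sd_p) + (sd_p**2 + (mu_p - mu_q)**2) / (2 * sd_q**2) - 0.5

# "True" effect distribution N(0.2, 0.15^2) scored against two estimates:
print(kl_normal(0.2, 0.15, 0.18, 0.12))   # reasonable estimate of tau -> small KL
print(kl_normal(0.2, 0.15, 0.18, 1e-8))   # boundary estimate tau_hat ~ 0 -> KL blows up
```

This is one way to see why boundary estimates of the between-study variance are so costly under this kind of risk.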