Unexamined #priors are not worth having.
@KatyElphinstone This line of thinking entirely makes sense to me, and ties in with my idea of #autism being due to the lack of a hardwired #EnvironmentalYoke that constrains neurotypical interests and engagement far more strongly than our interests and engagement are constrained. One of the ways that constraint could be exercised is by the #EnvironmentalYoke imposing far more dogmatic #priors than we have. Beds contain pillows, not ravioli!

#statstab #466 Bayesian workflow: Prior determination, predictive checks and sensitivity analyses

Thoughts: Maintaining a good Bayesian workflow can be challenging with complex models.

#priors #bayesian #sensitivityanalysis #posterior #ppc #brms

https://pablobernabeu.github.io/2022/bayesian-workflow-prior-determination-predictive-checks-and-sensitivity-analyses/

Bayesian workflow: Prior determination, predictive checks and sensitivity analyses | Pablo Bernabeu

This post presents a run-through of a Bayesian workflow in R. The content is closely based on Bernabeu (2022), which was in turn based on lots of other references, also cited here.

Pablo Bernabeu
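As a minimal illustration of the prior predictive checks the post walks through, here is a Python sketch for a toy linear model with made-up priors (the post itself uses R and brms; the model and priors below are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical simple linear model: y = a + b*x + noise.
n_sims, n_obs = 1000, 50
x = np.linspace(0, 10, n_obs)

# Draw parameters from candidate priors (all invented for illustration)
a = rng.normal(0, 10, size=n_sims)              # prior on intercept
b = rng.normal(0, 1, size=n_sims)               # prior on slope
sigma = np.abs(rng.normal(0, 5, size=n_sims))   # half-normal prior on noise sd

# Simulate datasets implied by the priors alone -- no real data involved
y_sim = a[:, None] + b[:, None] * x + rng.normal(0, sigma[:, None], (n_sims, n_obs))

# Prior predictive check: do the priors generate plausible outcomes?
p5, p95 = np.percentile(y_sim, [5, 95])
print(f"90% of simulated outcomes fall in ({p5:.1f}, {p95:.1f})")
```

If the simulated range is absurd on the scale of the real outcome, the priors need tightening before any data are touched.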

#statstab #459 Getting Comfortable with Expressing Beliefs as Distributions

Thoughts: Bayesian stats requires a good understanding of priors, but these are often unintuitive. Plots help.

#bayesian #priors #ggplot #r #dataviz #learning #education

https://brian-lookabaugh.github.io/website-brianlookabaugh/blog/2025/priors-distributions/

Getting Comfortable with Expressing Beliefs as Distributions – Brian Lookabaugh

Thinking about our beliefs as distributions is not super intuitive for most people, which creates a stumbling block for getting into Bayesian statistics. Check this blog out to break down the mystique!
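A small Python sketch of the idea, using scipy rather than the post's ggplot plots; the stated belief and its Normal(0.2, 0.2) encoding are hypothetical:

```python
from scipy import stats

# Hypothetical belief: "the effect is most likely around 0.2, and I'd be
# surprised if it fell outside roughly (-0.2, 0.6)."
# Candidate encoding: Normal(mean=0.2, sd=0.2).
prior = stats.norm(loc=0.2, scale=0.2)

lo, hi = prior.ppf([0.025, 0.975])   # central 95% interval
print(f"95% of prior mass lies in ({lo:.2f}, {hi:.2f})")
print(f"P(effect < 0) = {prior.cdf(0):.2f}")
```

Checking implied quantiles like this is a quick way to see whether a candidate distribution actually matches the belief you meant to express.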

#statstab #445 What are credible priors and what are skeptical priors?

Thoughts: An excellent thread on prior elicitation by some of the big names in the field (frequentist and Bayesian).

#priors #bayesian #bayes #likelihood #medicine #clinical #debate

https://discourse.datamethods.org/t/what-are-credible-priors-and-what-are-skeptical-priors/580

What are credible priors and what are skeptical priors?

A few weeks ago Dan Scharfstein asked a group of colleagues about how to report an odds ratio of 1.70 with 95% confidence limits of 0.96 and 3.02. Back-calculating from these statistics gives a two-sided P of 0.06 or 0.07, corresponding to an S-value (surprisal, log base 2 of P) of about 4 bits of information against the null hypothesis of OR=1. So, not much evidence against the null from the result, but still favoring a positive association over an inverse one, and so thought worthy of reportin...

Datamethods Discussion Forum
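The back-calculation described in the excerpt can be reproduced directly from the reported odds ratio and confidence limits (a Python sketch of the arithmetic):

```python
import math
from scipy import stats

# Reported: OR = 1.70, 95% CI (0.96, 3.02)
or_hat, ci_lo, ci_hi = 1.70, 0.96, 3.02

# Back out the standard error on the log-odds scale from the CI width
se = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)
z = math.log(or_hat) / se
p = 2 * stats.norm.sf(abs(z))   # two-sided p-value
s = -math.log2(p)               # S-value: bits of information against OR = 1

print(f"z = {z:.2f}, p = {p:.3f}, S = {s:.1f} bits")
```

This recovers the two-sided P of about 0.07 and the roughly 4 bits of information against the null quoted in the thread.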

#statstab #427 Incorporating Historical Control Data Into an RCT

Thoughts: In frequentist stats, historical controls are a dangerous proposition. But in Bayesian analysis, they can serve a more useful purpose.

#historicalcontrol #rct #bayesian #bias #priors #BMA

https://www.fharrell.com/post/hxcontrol/

Incorporating Historical Control Data Into an RCT – Statistical Thinking

Historical data (HD) are being used increasingly in Bayesian analyses when it is difficult to randomize enough patients to study effectiveness of a treatment. Such analyses summarize observational studies’ posterior effectiveness distribution (for two-arm HD) or standard-of-care outcome distribution (for one-arm HD) then turn that into a prior distribution for an RCT. The prior distribution is then flattened somewhat to discount the HD. Since Bayesian modeling makes it easy to fit multiple models at once, incorporation of the raw HD into the RCT analysis and discounting HD by explicitly modeling bias is perhaps a more direct approach than lowering the effective sample size of HD. Trust the HD sample size but not what the HD is estimating, and realize several benefits from using raw HD in the RCT analysis instead of relying on HD summaries that may hide uncertainties.

Statistical Thinking
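A toy Python sketch of the discounting idea, using a conjugate beta-binomial "power prior" style downweight rather than the explicit bias modeling Harrell advocates; all counts are invented for illustration:

```python
from scipy import stats

# Hypothetical one-arm setting: historical controls saw 30 events in 100
# patients; the new RCT control arm sees 8 events in 25 patients.
hist_events, hist_n = 30, 100
rct_events, rct_n = 8, 25

def posterior(a0):
    """Beta posterior for the event rate, with the historical likelihood
    raised to a power a0 in [0, 1]: a0 = 1 trusts the historical data
    fully, a0 = 0 discards it entirely (a power-prior style discount)."""
    a = 1 + a0 * hist_events + rct_events
    b = 1 + a0 * (hist_n - hist_events) + (rct_n - rct_events)
    return stats.beta(a, b)

for a0 in (0.0, 0.5, 1.0):
    post = posterior(a0)
    lo, hi = post.ppf([0.025, 0.975])
    print(f"a0={a0}: mean={post.mean():.3f}, 95% CrI=({lo:.3f}, {hi:.3f})")
```

More historical weight narrows the credible interval, which is exactly why Harrell argues for modeling the bias explicitly instead of just shrinking the effective sample size.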

#statstab #402 On Bayes factors for hypothesis tests

Thoughts: On bsky there were renewed debates about BFs. This paper provides "better" priors (a t-mixture centred on the effect size), plus some p-value-based BFs.

#bayesian #bayesfactor #priors #cohend

https://link.springer.com/article/10.3758/s13423-024-02612-2

On Bayes factors for hypothesis tests - Psychonomic Bulletin & Review

We develop alternative families of Bayes factors for use in hypothesis tests as alternatives to the popular default Bayes factors. The alternative Bayes factors are derived for the statistical analyses most commonly used in psychological research – one-sample and two-sample t tests, regression, and ANOVA analyses. They possess the same desirable theoretical and practical properties as the default Bayes factors and satisfy additional theoretical desiderata while mitigating against two features of the default priors that we consider implausible. They can be conveniently computed via an R package that we provide. Furthermore, hypothesis tests based on Bayes factors and those based on significance tests are juxtaposed. This discussion leads to the insight that default Bayes factors as well as the alternative Bayes factors are equivalent to test-statistic-based Bayes factors as proposed by Johnson (2005, Journal of the Royal Statistical Society Series B: Statistical Methodology, 67, 689–701). We highlight test-statistic-based Bayes factors as a general approach to Bayes-factor computation that is applicable to many hypothesis-testing problems for which an effect-size measure has been proposed and for which test power can be computed.

SpringerLink
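A rough Python sketch of the test-statistic-based Bayes factor idea for a one-sample t test, using a plain Normal prior on the effect size (an illustrative choice, not the paper's alternative mixture-t priors):

```python
import numpy as np
from scipy import stats

def bf10_t(t, n, prior_sd=1.0):
    """Bayes factor for H1: delta ~ Normal(0, prior_sd) vs H0: delta = 0,
    computed from a one-sample t statistic alone (in the spirit of
    Johnson's test-statistic-based Bayes factors). The marginal
    likelihood under H1 averages the noncentral-t density over the
    prior on the standardized effect size delta."""
    df = n - 1
    delta = np.linspace(-6 * prior_sd, 6 * prior_sd, 2001)
    dx = delta[1] - delta[0]
    w = stats.norm.pdf(delta, 0, prior_sd)
    m1 = np.sum(stats.nct.pdf(t, df, delta * np.sqrt(n)) * w) * dx
    m0 = stats.t.pdf(t, df)   # central t density under the point null
    return m1 / m0

print(f"BF10 at t=2.5, n=30: {bf10_t(2.5, 30):.2f}")
print(f"BF10 at t=0.2, n=30: {bf10_t(0.2, 30):.2f}")
```

Only the t statistic and sample size are needed, which is what makes this family of Bayes factors so broadly applicable.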
Certainly too many are, complicating public policy discussions. It is continuously necessary to work around the public's prior assumptions. I am now getting sidetracked skimming: #Priors #PriorAssumptions #PriorProbability en.wikipedia.org/wiki/Prior_p...

Prior probability - Wikipedia
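A tiny worked example of how strongly a prior (here, a base rate) shapes a posterior, with hypothetical screening numbers:

```python
# Hypothetical screening example: why the prior matters.
# Disease prevalence 1%, test sensitivity 90%, specificity 95%.
prior = 0.01
sens, spec = 0.90, 0.95

p_pos = sens * prior + (1 - spec) * (1 - prior)   # total probability of a positive test
posterior = sens * prior / p_pos                  # Bayes' rule

print(f"P(disease | positive test) = {posterior:.3f}")
```

Even with a 90%-sensitive test, the low base rate keeps the posterior near 15%, which is exactly the kind of unexamined-prior trap that derails public discussions.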

'Posterior Concentrations of Fully-Connected Bayesian Neural Networks with General Priors on the Weights', by Insung Kong, Yongdai Kim.

http://jmlr.org/papers/v26/24-0425.html

#priors #sparse #bnn