#statstab #507 Nonlinear models in {flocker}

Thoughts: These seem useful when your theory predicts some natural limit (L) for a process, like memory recall.

#nonlinear #glmm #brms #mem #asymptotic #flocker #r #rstats

https://www.maths.bris.ac.uk/R/web/packages/flocker/vignettes/nonlinear_models.html

Nonlinear models in flocker
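The "natural limit" idea in miniature: a saturating curve that rises toward a ceiling L. This is a hedged sketch in plain Python/scipy, not flocker's actual model; the saturating-exponential form y = L(1 - exp(-kx)) and the parameter values are illustrative assumptions.

```python
# Fit an asymptotic curve of the kind a theory with a natural limit L
# might predict. The functional form and true values are made up here.
import numpy as np
from scipy.optimize import curve_fit

def asymptotic(x, L, k):
    """Approaches the ceiling L as x grows; k sets how fast."""
    return L * (1.0 - np.exp(-k * x))

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 80)
y = asymptotic(x, L=0.9, k=0.6) + rng.normal(0, 0.02, x.size)

(L_hat, k_hat), _ = curve_fit(asymptotic, x, y, p0=[1.0, 1.0])
print(L_hat, k_hat)  # estimates near the true L=0.9, k=0.6
```

In flocker/brms the same curve would be written as a nonlinear model formula with priors on L and k; the point is just that the asymptote is itself an estimated parameter.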

Due to a recent discussion with colleagues on whether and when to use #LinearMixedModels (#LMM), I wrote a blog post comparing LMM to other approaches using simulated data. I thought it might also be useful for others working with hierarchical data structures in #neuroscience and beyond.

๐ŸŒ https://www.fabriziomusacchio.com/blog/2026-01-31-linear_mixed_models/

#Python #Statistics #DataScience #MixedModels #Statsmodels #ANOVA #ANCOVA #GLMM #regression
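A minimal sketch of the kind of comparison the post describes: simulate clustered data (repeated measures per subject) and fit a random-intercept model with statsmodels' MixedLM. The sample sizes and effect values below are made-up illustrations, not the post's actual simulation.

```python
# Simulate hierarchical data and fit a random-intercept LMM.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_per = 30, 20
subj = np.repeat(np.arange(n_subj), n_per)
u = rng.normal(0, 1.0, n_subj)                 # subject-level random intercepts
x = rng.normal(0, 1, n_subj * n_per)
y = 2.0 + 0.5 * x + u[subj] + rng.normal(0, 0.5, n_subj * n_per)

df = pd.DataFrame({"y": y, "x": x, "subject": subj})
fit = smf.mixedlm("y ~ x", df, groups=df["subject"]).fit()
print(fit.params["x"])  # fixed-effect slope estimate near the true 0.5
```

Fitting the same data with plain OLS would give a similar slope here but understate its uncertainty, which is the usual argument for the mixed model on clustered data.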

#statstab #450 Fitting GAMs with brms

Thoughts: Assuming linearity of your continuous predictors is not needed when you can add wiggles!

#gam #glmm #linearmodel #modelling #brms #rstats #bayes #tutorial #splines #r

https://fromthebottomoftheheap.net/2018/04/21/fitting-gams-with-brms/

Fitting GAMs with brms: part 1

Regular readers will know that I have a somewhat unhealthy relationship with GAMs and the mgcv package. I use these models all the time in my research but recently we've been hitting the limits of the range of models that mgcv can fit. So I've been looking into alternative ways...

From the Bottom of the Heap
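The "add wiggles" idea in miniature: on data with a clear bend, a straight line is forced to miss the shape while a smooth is not. This uses scipy's smoothing spline as a stand-in for an mgcv/brms smooth; the smoothing level is an assumption, not what `s()` in mgcv would select.

```python
# Compare a linear fit with a smoothing spline on curved data.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)
x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x) + rng.normal(0, 0.1, x.size)

# Linear fit: cannot follow the curvature.
slope, intercept = np.polyfit(x, y, 1)
sse_linear = np.sum((y - (slope * x + intercept)) ** 2)

# Smoothing spline: the wiggles track the sine shape.
spline = UnivariateSpline(x, y, s=x.size * 0.1 ** 2)
sse_spline = np.sum((y - spline(x)) ** 2)

print(sse_linear, sse_spline)  # the spline fits far better
```

The Bayesian GAM in brms does the same thing in spirit, but chooses the amount of wiggliness via priors on the smooth's variance rather than a fixed smoothing target.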

#statstab #401 Common issues, conundrums, and other things that might come up when implementing mixed models

Thoughts: GLMMs are cool, but come with their own quirks.

#glmm #lmer #brms #mixedeffects #hierarchicalmodels #r

https://m-clark.github.io/mixed-models-with-R/issues.html

Issues | Mixed Models with R

This is an introduction to using mixed models in R. It covers the most common techniques employed, with demonstration primarily via the lme4 package. Discussion includes extensions into generalized mixed models, Bayesian approaches, and realms beyond.

ใ€๐Ÿ’กHigh Cited 2020-2022 ใ€‘
glmm.hp: an R package for computing individual effect of predictors in generalized linear mixed models

#CommonalityAnalysis | #FixedEffect | #GLMM | #HierarchicalPartitioning | #RelativeImportance | #VariancePartitioning

https://doi.org/10.1093/jpe/rtac096
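A toy version of what glmm.hp computes, in the simplest (ordinary linear model) case: each predictor's "individual effect" is its average increase in R² over all orders of entry into the model, the hierarchical-partitioning / commonality idea. glmm.hp itself handles GLMMs; this OLS-only sketch is my illustration, not the package's code.

```python
# Hierarchical partitioning of R^2 among correlated predictors (OLS toy).
import itertools
import numpy as np

def r2(cols, X, y):
    """R^2 of an OLS fit with intercept on the predictor columns in `cols`."""
    Xd = np.column_stack([np.ones(len(y))] + [X[:, j] for j in cols])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return 1 - (resid ** 2).sum() / ((y - y.mean()) ** 2).sum()

def individual_effects(X, y):
    """Average each predictor's R^2 increment over all entry orders."""
    p = X.shape[1]
    contrib = np.zeros(p)
    perms = list(itertools.permutations(range(p)))
    for order in perms:
        done, prev = [], 0.0
        for j in order:
            done.append(j)
            cur = r2(done, X, y)
            contrib[j] += cur - prev
            prev = cur
    return contrib / len(perms)

rng = np.random.default_rng(3)
n = 500
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)   # correlated predictors
y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)

Xmat = np.column_stack([x1, x2])
effects = individual_effects(Xmat, y)
full = r2([0, 1], Xmat, y)
print(effects, effects.sum(), full)  # individual effects sum to the full R^2
```

The telescoping over each ordering guarantees the individual effects sum exactly to the full model's R², which is what makes this a genuine variance partition even with correlated predictors.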

Extremely nice review of REML estimation for generalized linear mixed models. Covers a couple of important papers that I was not aware of (e.g. Schall 1991 and Stiratelli 1984) until today, but also the work of Simon Wood in #rstats mgcv #GAM #GLMM

Bluesky Social
https://buff.ly/405IaB0
Restricted maximum likelihood estimation in generalized linear mixed models

Restricted maximum likelihood (REML) estimation is a widely accepted and frequently used method for fitting linear mixed models, with its principal advantage being that it produces less biased estimates of the variance components. However, the concept of REML does not immediately generalize to the setting of non-normally distributed responses, and it is not always clear the extent to which, either asymptotically or in finite samples, such generalizations reduce the bias of variance component estimates compared to standard unrestricted maximum likelihood estimation. In this article, we review various attempts that have been made over the past four decades to extend REML estimation in generalized linear mixed models. We establish four major classes of approaches, namely approximate linearization, integrated likelihood, modified profile likelihoods, and direct bias correction of the score function, and show that while these four classes may have differing motivations and derivations, they often arrive at a similar if not the same REML estimate. We compare the finite sample performance of these four classes, along with methods for REML estimation in hierarchical generalized linear models, through a numerical study involving binary and count data, with results demonstrating that all approaches perform similarly well in reducing the finite sample bias of variance components. Overall, we believe REML estimation should be more widely adopted by practitioners using generalized linear mixed models, and that the exact choice of which REML approach to use should, at this point in time, be driven by software availability and ease of implementation.

arXiv.org
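The simplest instance of the bias REML corrects: for y_i ~ N(μ, σ²), the ML variance estimate divides by n and is biased low because μ is itself estimated, while the REML estimate divides by n − 1 and is unbiased. The review's subject is how to carry this correction over to GLMMs, where no such closed form exists.

```python
# ML vs REML variance estimation in the one-sample normal model.
import numpy as np

rng = np.random.default_rng(4)
n, sigma2 = 5, 4.0
reps = 20000
samples = rng.normal(0, np.sqrt(sigma2), size=(reps, n))

ml = samples.var(axis=1, ddof=0).mean()    # divides by n: biased low
reml = samples.var(axis=1, ddof=1).mean()  # divides by n - 1: unbiased

print(ml, reml)  # ml near (n-1)/n * sigma^2 = 3.2, reml near 4.0
```

The (n − 1)/n shrinkage looks harmless for large n, but in mixed models the analogous bias hits variance components estimated from few groups, which is exactly where it matters.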

#statstab #204 GLMMadaptive: Generalized Linear Mixed Models using Adaptive Gaussian Quadrature

Thoughts: No clue what this package does, but it seems useful. Maybe someone can explain some use cases.

#glmm #gaussian #modelling #r #stats

https://drizopoulos.github.io/GLMMadaptive/

Generalized Linear Mixed Models using Adaptive Gaussian Quadrature

Fits generalized linear mixed models for a single grouping factor under maximum likelihood approximating the integrals over the random effects with an adaptive Gaussian quadrature rule; Jose C. Pinheiro and Douglas M. Bates (1995) <doi:10.1080/10618600.1995.10474663>.
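What "approximating the integrals over the random effects with a Gaussian quadrature rule" means in miniature: the marginal likelihood of a GLMM requires integrals like E[g(b)] for a random effect b ~ N(0, σ²), and Gauss-Hermite quadrature evaluates them from a handful of nodes. GLMMadaptive additionally re-centers and rescales the nodes per cluster (the "adaptive" part); this sketch uses the plain rule on an integral with a known answer.

```python
# Gauss-Hermite quadrature for an expectation over a normal random effect.
import numpy as np

def gh_expectation(g, sigma, n_nodes=15):
    """E[g(b)] for b ~ N(0, sigma^2), via Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    # change of variables b = sqrt(2) * sigma * node absorbs the normal density
    return (weights * g(np.sqrt(2.0) * sigma * nodes)).sum() / np.sqrt(np.pi)

sigma = 0.8
approx = gh_expectation(np.exp, sigma)
exact = np.exp(sigma ** 2 / 2)  # closed form for E[exp(b)], b ~ N(0, sigma^2)
print(approx, exact)
```

In a real GLMM fit, g would be the conditional likelihood of a cluster's responses given b, and the adaptive rescaling keeps the few nodes where that likelihood actually has mass.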