San Antonio, part one
January, 2026. Digital (Canon R6)

Today a red district in Texas voted for a Democrat. That's what choice looks like when people still have values.
Congratulations to the free people of Texas

#photography #photo #digitalphotography #streetphotography #urbanphotography #cityscape #architecture #urban #citylife #everydayamerica #travelphotography #texas #sanantonio #usa #visualdiary #documentaryphotography #observational #quietmoments #modernlife

#statstab #458 Causal inference for observational data using {modelbased}

Thoughts: IPW, g-computation, and more. Learning observational-study (OS) methods and ways to compute the ATE for (more accurate, but still not great) inference.

#gcomputation #ipw #iptw #observational #inference

https://easystats.github.io/modelbased/articles/practical_causality.html
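The vignette walks through several estimators; as a toy illustration of the IPW idea only (a from-scratch sketch, not the {modelbased} API, with simulated data and the true propensity scores assumed known by construction):

```python
import math
import random
from statistics import mean

random.seed(1)

# Simulate confounded data: X drives both treatment A and outcome Y.
n = 10_000
X = [random.gauss(0, 1) for _ in range(n)]
ps = [1 / (1 + math.exp(-x)) for x in X]          # true propensity scores
A = [1 if random.random() < p else 0 for p in ps]
# Outcome model: the true ATE is 2.0, confounded through X.
Y = [2.0 * a + 1.5 * x + random.gauss(0, 1) for a, x in zip(A, X)]

# Naive difference in means is biased by the confounding.
naive = (mean(y for y, a in zip(Y, A) if a == 1)
         - mean(y for y, a in zip(Y, A) if a == 0))

# Hajek-style IPW estimate of the ATE: reweight each arm by 1/propensity.
w1 = [a / p for a, p in zip(A, ps)]
w0 = [(1 - a) / (1 - p) for a, p in zip(A, ps)]
ipw = (sum(w * y for w, y in zip(w1, Y)) / sum(w1)
       - sum(w * y for w, y in zip(w0, Y)) / sum(w0))

print(f"naive: {naive:.2f}, IPW: {ipw:.2f}")  # IPW lands near the true 2.0
```

In practice the propensities are estimated (e.g. by logistic regression), which is where the "still not great" caveat bites.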


#statstab #446 {causaldata} Packages of Example Data for The Effect

Thoughts: On your journey to learning causal inference, you can use some nice datasets to see how horribly it can all go.

#causalinference #observational #python #r #DAG #OS

https://github.com/NickCH-K/causaldata


#statstab #441 Bayes-by Shower

Thoughts: Comprehensive (read: long) tutorial on Bayesian analysis and how to think about research.

#bayes #rstats #bayesian #tutorial #DAGs #estimand #observational #design

https://betanalpha.github.io/assets/chapters_html/fertility.html


#statstab #440 Computing Statistical Power for the Difference in Differences Design

Thoughts: DiD studies are all the rage in observational research. But how does the concept of power apply to them?

#poweranalysis #DiD #causalinference #samplesize #observational

https://journals.sagepub.com/doi/10.1177/0193841X251380898
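For the simplest 2x2 design, a back-of-the-envelope power calculation can be sketched with a normal approximation (this is my own sketch, not the paper's method; `did_power` and its equal-cell-size, no-clustering assumptions are assumptions of the example):

```python
import math

def norm_cdf(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def did_power(delta: float, sigma: float, n_per_cell: int) -> float:
    """Approximate power for a 2x2 (two groups, two periods) DiD.

    The DiD estimate is a difference of four independent cell means,
    so its standard error is sqrt(4 * sigma^2 / n_per_cell).
    Normal approximation, two-sided alpha = 0.05, equal cells, no clustering.
    """
    se = math.sqrt(4 * sigma**2 / n_per_cell)
    z_crit = 1.959964  # two-sided 5% critical value
    return norm_cdf(abs(delta) / se - z_crit)

# Effect of 0.5 SD with 100 units per cell:
print(round(did_power(delta=0.5, sigma=1.0, n_per_cell=100), 3))  # ~0.705
```

Serial correlation and clustering typically shrink the effective sample size, so real DiD power is usually lower than this naive formula suggests.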

#statstab #409 Sensitivity Analyses for Unmeasured Confounders

Thoughts: Some assumptions of obs. research are untestable. One way around this is testing what could break your inference.

#causalinference #confounder #collider #bias #sensitivityanalysis #observational

https://link.springer.com/article/10.1007/s40471-022-00308-6

Sensitivity Analyses for Unmeasured Confounders - Current Epidemiology Reports

Purpose of Review: This review expands on sensitivity analyses for unmeasured confounding techniques, demonstrating state-of-the-art methods as well as specifying which should be used under various scenarios, depending on the information about a potential unmeasured confounder available to the researcher.

Recent Findings: Methods to assess how sensitive an observed estimate is to unmeasured confounding have been developed for decades. Recent advancements have allowed for the incorporation of measured confounders in these assessments, updating the methods used to quantify the impact of an unmeasured confounder, whether specified in terms of the magnitude of the effect from a regression standpoint, for example, as a risk ratio, or with respect to the percent of variation in the outcome or exposure explained by the unmeasured confounder. Additionally, single number summaries, such as the E-value or robustness value, have been proposed to allow for ease of computation when less is known about a specific potential unmeasured confounder.

Summary: This paper aimed to provide methods and tools to implement sensitivity to unmeasured confounder analyses appropriate for various research settings depending on what is known or assumed about a potential unmeasured confounder. We have provided mathematical justification, recommendations, as well as R code to ease the implementation of these methods.
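The E-value mentioned in the abstract has a simple closed form for a risk ratio (VanderWeele & Ding, 2017) and makes a nice minimal example:

```python
import math

def e_value(rr: float) -> float:
    """E-value for an observed risk ratio.

    The minimum strength of association, on the risk-ratio scale, that an
    unmeasured confounder would need with both exposure and outcome to
    fully explain away the observed estimate: E = RR + sqrt(RR * (RR - 1)).
    """
    rr = max(rr, 1 / rr)  # protective estimates use the reciprocal
    return rr + math.sqrt(rr * (rr - 1))

print(round(e_value(2.0), 2))  # 3.41: a confounder would need RR ~3.4
```

A null estimate (RR = 1) gives an E-value of 1, i.e. any confounding at all could explain it.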
