This guy here... this is a guy. I love this guy 🩷🧸🍷

How do you keep your little gooses on path? 🪿🦮

We're calling it Looking Glass Development. It's what happens when TDD and BDD have a wee bit too much wine while they're over visiting your teddy, and what comes out has a wee bit more computational markdown in it than anyone wants to admit.

🧸🤷🏻‍♀️

You guys are already doing that thing where you use your user guide as the spec, right?

Just add your unit tests at the bottom.

Living markdown.

🔴 ➡️ 🟢 ➡️ 🔄
♥️ 🏹 💚 🏹 🫶🏻

If the guide is accurate enough to direct accurate usage and the tests are passing then the garden has been well tended. 🌱
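Here's a minimal sketch of what that can look like with nothing but the standard library: a made-up guide (the file name guide.md and its contents are hypothetical) whose usage examples double as the unit tests, because Python's doctest runner will scan any text file for `>>>` examples.

```markdown
# Goose Herder User Guide

Split a honk string into individual honks:

    >>> honks = "honk honk honk".split()
    >>> len(honks)
    3

The guide is the spec, and the tests live right here at the bottom.
Run the whole file with `python -m doctest -v guide.md`:

    >>> honks.count("honk")
    3
```

If the examples drift from reality, the doctest run goes red before your users do.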

#ai #platform #living-documentation #computational-markdown

The Controversial Camera with Built-in Generative AI Blew Past Its Crowdfunding Goal

The Caira camera, the rebrand of the camera formerly known as Alice, has blown past its Kickstarter funding goal in its first day.

PetaPixel

@axoaxonic @adredish Fully agree 👍 Horner's framework really begs for a formal dynamical model: defining trajectories, #attractors, and #manifolds within that 3D space. Something that could turn his conceptual #StateSpace into a genuine #computational theory of #memory dynamics.

I didn’t know Redish's book ("Beyond the Cognitive Map") before your comment! Sounds highly relevant and I’ll definitely put it on my reading list 👌
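To make that concrete, here is a toy sketch of what a formal dynamical model over a 3D state space could look like: a few hypothetical memory states as point attractors, a drift field that defines the trajectories, and Euler integration to watch a state settle into a basin. This is purely illustrative and not derived from Horner's or Redish's work; every number and name in it is made up.

```python
# Toy 3D state space with point attractors; all values are illustrative.
import numpy as np

# Hypothetical memory states as point attractors in the 3D space.
attractors = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])

def drift(x, sharpness=8.0):
    """Velocity field pulling the state toward nearby attractors.

    Softmax-weighted attraction: near an attractor its pull dominates,
    in between the basins compete."""
    d2 = np.sum((attractors - x) ** 2, axis=1)            # squared distances
    w = np.exp(-sharpness * d2)
    w /= w.sum()                                           # soft "nearest attractor" weights
    return (w[:, None] * (attractors - x)).sum(axis=0)     # weighted pull

def trajectory(x0, dt=0.05, steps=400, noise=0.02, seed=0):
    """Euler-integrate one noisy trajectory through the state space."""
    rng = np.random.default_rng(seed)
    xs = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x + dt * drift(x) + noise * np.sqrt(dt) * rng.standard_normal(3))
    return np.array(xs)

path = trajectory([0.4, 0.3, 0.35])
end = path[-1]
nearest = int(np.argmin(np.sum((attractors - end) ** 2, axis=1)))
print(f"started at {path[0]}, settled near attractor {nearest}: {attractors[nearest]}")
```

From there you could ask dynamical questions directly: where the basin boundaries sit, how noise moves a trajectory between attractors, and whether the manifold the trajectories live on matches the behavioral data.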

Caira Is an 'AI-Native' Micro Four Thirds Camera With Google's 'Nano Banana' Generative AI Built-In

This new Micro Four Thirds camera from the makers of Alice is all-in on generative AI.

PetaPixel

Proud of Vahid's use of #computational #CogSci to identify and compare #reasoning errors in #Reddit users and communities.

He's presenting it at an #AI + #decisionSci workshop at #CMU : https://www.cmu.edu/ai-sdm/research/human-ai-workshop/2025-programming/index.html

Follow him for alerts about this and more: https://www.researchgate.net/profile/Vahid-Ashrafi

Excited to share our latest work published in Scientific Reports (Nature Portfolio):

"Machine learning based characterization of high risk carriers of HTLV-1-associated myelopathy (HAM)"

This work represents a significant step forward in precision medicine for neglected viral infections. I am grateful to work with an incredible team of clinicians and bioinformaticians.

#Bioinformatics #ComputationalBiology #Biostatistics

Further reading: 🔗 https://www.nature.com/articles/s41598-025-09635-2

Machine learning based characterization of high risk carriers of HTLV-1-associated myelopathy (HAM) - Scientific Reports

HTLV-1-associated myelopathy (HAM) develops in a subset of HTLV-1-infected individuals, while most remain asymptomatic. This complicates the identification of HTLV-1 carriers at elevated risk. In this study, we integrated HTLV-1 proviral load and antibody titers against Tax, Env, Gag p15, p19, and p24 proteins in a machine learning (ML) framework to identify and characterize high-risk individuals likely to develop HAM. We stratified asymptomatic carrier samples employing an anomaly detection model. We further developed and validated classifier models capable of distinguishing three clinical subgroups (carrier, ATL, and HAM) for assessing the anomaly carrier samples as unseen test data. With most anomaly carrier samples (~ 76.47%) predicted as HAM, further statistical and interpretative analysis revealed the ‘HAM-like’ characteristics of the anomaly carrier samples, indicating elevated risk. Additionally, significant heterogeneity in immune response was observed among other asymptomatic carriers. As an exploratory, hypothesis-generating study, our findings are preliminary and aim to propose potential biomarkers and computational strategies that warrant validation in future longitudinal investigations. Our machine learning-based approach offers a novel and insightful tool for identifying and evaluating high-risk characteristics for HAM, providing a holistic view of the complex immune dynamics of asymptomatic carriers of HTLV-1.

Nature
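For anyone curious about the shape of that pipeline, here is a rough, purely illustrative sketch of the workflow the abstract describes: anomaly detection over asymptomatic-carrier profiles, then a carrier/ATL/HAM classifier applied to the anomalous carriers as unseen data. It is not the authors' code; the feature values below are synthetic and the scikit-learn estimators are generic stand-ins for the models used in the paper.

```python
# Illustrative only: synthetic data, generic estimators.
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier

rng = np.random.default_rng(42)

# Hypothetical features: proviral load plus antibody titers (Tax, Env, Gag p15/p19/p24).
n_features = 6
carriers = rng.normal(0.0, 1.0, size=(300, n_features))   # asymptomatic carriers
atl      = rng.normal(1.5, 1.0, size=(80, n_features))    # ATL samples
ham      = rng.normal(-1.5, 1.0, size=(80, n_features))   # HAM samples

# Step 1: flag "anomalous" carriers whose profiles deviate from the majority.
iso = IsolationForest(contamination=0.1, random_state=0).fit(carriers)
is_anomaly = iso.predict(carriers) == -1
anomalous, typical = carriers[is_anomaly], carriers[~is_anomaly]

# Step 2: train a carrier / ATL / HAM classifier on the labeled groups,
# holding the anomalous carriers out as unseen test data.
X = np.vstack([typical, atl, ham])
y = np.array([0] * len(typical) + [1] * len(atl) + [2] * len(ham))  # 0=carrier, 1=ATL, 2=HAM
clf = RandomForestClassifier(random_state=0).fit(X, y)

# Step 3: ask which clinical subgroup the anomalous carriers resemble.
pred = clf.predict(anomalous)
print(f"{len(anomalous)} anomalous carriers, {np.mean(pred == 2):.0%} classified as HAM-like")
```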
rant: i know there are a lot of alternative (typically anti-) definitions of "AI" flying around, but i just thought of another one: Actual Insanity! anyone who has done serious work with finite precision #arithmetic, #linear #algebra, #statistics, #computational #modeling, or #numerical analysis, knows that minimizing the number of parameters is essential for building truly explanatory models. but establishment scientists have collectively signed onto the idea that large models are intelligent
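a tiny worked example of the parameter-counting point (mine, not from the post): fit the same noisy linear data with a straight line and with a 6th-degree polynomial, then score both with BIC, which charges log(n) per extra parameter. the over-parameterized fit squeezes the residuals a little but still loses.

```python
# Toy model-selection example: BIC penalizes parameters that don't explain anything.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 40)
y = 2.0 * x + rng.normal(0.0, 0.1, size=x.size)   # truly linear data + noise

def bic(degree):
    """Gaussian BIC for a least-squares polynomial fit of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1                      # k = number of fitted parameters
    return n * np.log(np.mean(resid ** 2)) + k * np.log(n)

for d in (1, 6):
    print(f"degree {d}: BIC = {bic(d):7.1f}")      # higher BIC = worse model
```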
Via #LLRX #AI #slop and the #destruction of #knowledge – Prof. Iris van Rooij, #Computational #CognitiveScience at the School of #ArtificialIntelligence, shares a thread of her email communications with the #Elsevier Helpdesk detailing her concerns about #AI-generated definitions and links within scholarly articles, and the fact that authors cannot say ‘no’ to their work being used for #AItraining and #AIgenerated texts. https://www.llrx.com/2025/08/ai-slop-and-the-destruction-of-knowledge/ #education #learning #academia #teaching #knowledge
Oxbow makes genomic data ready for high-performance analytics.
File formats are a major source of friction and headache in #computational #biology. Oxbow makes it easier to retrieve and manipulate data stored in conventional genomic formats using modern data tooling.
#bioinformatics
https://oxbow.readthedocs.io/en/latest/
Oxbow — oxbow documentation
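A minimal sketch of the kind of workflow this enables, pulling a region of a BAM file into an Arrow-backed dataframe. The function name read_bam and the region string follow oxbow's early Python examples and may differ in the current release, so treat them as assumptions and check the linked docs for the up-to-date API; the file name is made up.

```python
import io

import oxbow as ox     # assumed API per early oxbow examples; see the docs for the current release
import polars as pl

# Read one genomic region of a BAM file into Arrow IPC bytes...
ipc_bytes = ox.read_bam("sample.bam", "chr1:1-1000000")

# ...then hand the columnar data straight to modern analytics tooling.
df = pl.read_ipc(io.BytesIO(ipc_bytes))
print(df.shape, df.columns)
```

The same idea applies to the other conventional genomic formats oxbow covers: records land as columns you can query, not bytes you have to parse.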

Simulations of beating heart may inform treatments for atrial fibrillation

You may have heard the phrase "my heart skipped a beat" when someone was talking about a romantic encounter. In truth, hearts that beat irregularly are dangerous for your health. Atrial fibrillation (AF) is the most common type of irregular heartbeat, and over time, it can worsen and become a permanent condition, a severe disorder that's the leading preventable cause of ischemic stroke, according to the NIH.