CERN uses tiny AI models burned into silicon for real-time LHC data filtering

https://theopenreader.org/Journalism:CERN_Uses_Tiny_AI_Models_Burned_into_Silicon_for_Real-Time_LHC_Data_Filtering

CERN has developed ultra-small AI models embedded directly into custom chips to filter massive data streams from the Large Hadron Collider in real time, addressing the enormous data challenge of the world’s most powerful particle accelerator.

They used a custom neural network based on an autoencoder with convolutional layers, trained on data from previous experiments.

https://arxiv.org/html/2411.19506v1 (Real-time Anomaly Detection at the L1 Trigger of CMS Experiment)
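
Roughly what that looks like, as a minimal Keras sketch of the general approach (train a convolutional autoencoder on ordinary collisions, then flag events that reconstruct badly); the input shape, layer sizes, and toy data here are invented for illustration, not the actual CMS model:

    # Convolutional autoencoder for anomaly detection (illustrative only).
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    INPUT_SHAPE = (18, 3, 1)  # hypothetical: 18 trigger objects x 3 features

    inp = keras.Input(shape=INPUT_SHAPE)
    # Encoder: compress each event into a small latent vector
    x = layers.Conv2D(8, (3, 3), padding="same", activation="relu")(inp)
    x = layers.Flatten()(x)
    latent = layers.Dense(8, activation="relu")(x)
    # Decoder: reconstruct the event from the latent vector
    x = layers.Dense(18 * 3 * 8, activation="relu")(latent)
    x = layers.Reshape((18, 3, 8))(x)
    out = layers.Conv2D(1, (3, 3), padding="same")(x)
    model = keras.Model(inp, out)
    model.compile(optimizer="adam", loss="mse")

    # Train on "normal" background events only
    background = np.random.rand(1000, *INPUT_SHAPE).astype("float32")  # placeholder
    model.fit(background, background, epochs=5, batch_size=64, verbose=0)

    # Anomaly score = per-event reconstruction error; anomalous events score high
    events = np.random.rand(10, *INPUT_SHAPE).astype("float32")
    scores = np.mean((model.predict(events, verbose=0) - events) ** 2, axis=(1, 2, 3))

For deployment, a model like this gets quantized and converted to FPGA firmware (e.g. with hls4ml) so inference fits the L1 trigger's microsecond latency budget.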

Why is it so hard to elaborate on which AI algorithm / technique they integrated? That would have made this article much better.

I'm half expecting to see "AI model" appearing as a stand-in for "linear regression" at this point in the cycle.
Having worked with people who do that, I can guarantee that's not the case.
See https://ssummers.web.cern.ch/conifer/ and hls4ml; these run BDTs and CNNs.
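
Roughly what the conifer flow looks like, sketched from the usage its docs show (the toy dataset and BDT here are placeholders, and compile() needs a Xilinx HLS toolchain installed):

    # Convert an sklearn BDT to an FPGA HLS project with conifer (illustrative).
    import conifer
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier

    # Train an ordinary BDT on toy data
    X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
    clf = GradientBoostingClassifier(n_estimators=20, max_depth=3).fit(X, y)

    # Convert it to an HLS project targeting a Xilinx FPGA
    cfg = conifer.backends.xilinxhls.auto_config()
    model = conifer.converters.convert_from_sklearn(clf, cfg)
    model.compile()

    # Emulated FPGA scores, to compare against the sklearn reference
    y_hls = model.decision_function(X[:5])
    y_ref = clf.decision_function(X[:5])

The appeal of these tools is that a real (if small) BDT or neural net gets compiled down to fixed-latency FPGA logic, not rebranded as something it isn't.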