CERN uses tiny AI models burned into silicon for real-time LHC data filtering

https://theopenreader.org/Journalism:CERN_Uses_Tiny_AI_Models_Burned_into_Silicon_for_Real-Time_LHC_Data_Filtering


CERN has developed ultra-small AI models embedded directly into custom chips to filter massive data streams from the Large Hadron Collider in real time, addressing the enormous data challenge of the world’s most powerful particle accelerator.

TheOpenReader

They used a custom neural network built around autoencoders with convolutional layers, trained on data from previous experiments.

https://arxiv.org/html/2411.19506v1
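A minimal sketch of the reconstruction-error idea behind autoencoder-based trigger filtering: train on "normal" events, then flag anything the model reconstructs poorly. Toy data and a closed-form linear autoencoder (equivalent to PCA) stand in for the actual trained network here; none of this is the real CMS pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in: "normal" events lie near a low-dimensional subspace;
# anomalous events do not. (Hypothetical data, not real detector inputs.)
n, d, k = 500, 16, 4
basis = rng.normal(size=(k, d))
normal = rng.normal(size=(n, k)) @ basis + 0.05 * rng.normal(size=(n, d))

# Linear autoencoder fitted in closed form (the PCA limit): the top-k
# right singular vectors of the training data act as encoder/decoder.
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
components = vt[:k]  # k x d "decoder"; its transpose is the encoder

def score(x):
    """Reconstruction MSE: small for normal-like events, large otherwise."""
    z = (x - mean) @ components.T      # encode
    recon = z @ components + mean      # decode
    return np.mean((recon - x) ** 2, axis=-1)

# Threshold set so only ~1% of normal events pass (the trigger's budget).
threshold = np.quantile(score(normal), 0.99)

anomalies = rng.normal(scale=3.0, size=(200, d))  # off-subspace events
flag_rate = np.mean(score(anomalies) > threshold)
print(round(flag_rate, 2))  # most anomalies exceed the threshold
```

The real system replaces the linear map with a (convolutional or variational) autoencoder and compiles the network to trigger hardware, but the selection logic is the same: keep events whose reconstruction error is above a cut tuned on known physics.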

Why is it so hard to elaborate on which AI algorithm or technique they integrated? That would have made this article much better.

Real-time Anomaly Detection at the L1 Trigger of CMS Experiment

It seems like most of the implementation runs on FPGAs, which I wouldn't call "physically burned into silicon." That's quite a stretch of language.