At #CERN, in-house AI hardware now handles enormous volumes of data
The LHC detectors produce enormous volumes of data: at 63 TBit/s, the hardware has only a few nanoseconds to determine what matters.
#Science Friday: #CERN finds a new particle + #News alerts for the cosmos
Episode webpage: https://www.wnycstudios.org/podcasts/science-friday
New #openaccess publication #SciPost #Physics
Physics case for low-√s QCD studies at FCC-ee
David d'Enterria, Pier Francesco Monni, Peter Skands, Andrii Verbytskyi
SciPost Phys. 20, 092 (2026)
https://scipost.org/SciPostPhys.20.3.092
Hackaday Links: March 29, 2026
https://fed.brid.gy/r/https://hackaday.com/2026/03/29/hackaday-links-march-29-2026/
Cool
[...] During peak operation, the data stream can reach hundreds of terabytes per second, far exceeding the capacity of any feasible storage or conventional computing system.
Because it is physically impossible to store or process the full dataset, CERN must make split-second decisions at the detector level: which collision events contain potentially groundbreaking scientific value, and which should be discarded forever. This real-time selection process is one of the most demanding computational challenges in modern science.
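The keep-or-discard logic described above can be illustrated with a minimal sketch. This is not CERN's actual trigger: the feature names, cut values, and event records below are invented for illustration; the real system applies far more sophisticated criteria in hardware within nanoseconds.

```python
# Hypothetical sketch of a trigger-style keep/discard decision: each
# "event" is reduced to a few summary features, and a fast threshold
# test decides whether it is stored or dropped forever.
# All names and thresholds are illustrative, not CERN's actual cuts.

def trigger_decision(total_energy_gev: float, n_hits: int,
                     energy_cut: float = 100.0, hit_cut: int = 3) -> bool:
    """Keep the event only if it passes all fast cuts."""
    return total_energy_gev >= energy_cut and n_hits >= hit_cut

events = [
    {"total_energy_gev": 250.0, "n_hits": 7},   # passes both cuts: kept
    {"total_energy_gev": 12.0,  "n_hits": 1},   # routine: discarded
    {"total_energy_gev": 130.0, "n_hits": 2},   # fails hit cut: discarded
]

kept = [e for e in events if trigger_decision(**e)]
print(f"kept {len(kept)} of {len(events)} events")
```

The essential point survives even in this toy: every event not selected is lost permanently, so the selection function itself carries the scientific risk.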
CERN Uses Tiny AI Models Burned into Silicon for Real-Time LHC Data Filtering

CERN has developed ultra-small AI models embedded directly into custom chips to filter massive data streams from the Large Hadron Collider in real time, addressing the enormous data challenge of the world’s most powerful particle accelerator.

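"Tiny AI models burned into silicon" generally means networks small enough that their weights become fixed-point constants in the chip's logic. The following sketch mimics that style in software: an integer-only two-layer network with invented weights, evaluated with nothing but multiply-accumulate and a threshold, the kind of pipeline that can be synthesized into gates. It is a hedged illustration of the general technique, not CERN's model.

```python
# Illustrative integer-only inference for a tiny classifier, in the
# spirit of fixed-point neural networks compiled into FPGA/ASIC logic.
# Weights, sizes, and inputs are invented for this sketch.

W = [[3, -2, 1], [-1, 4, 2]]   # hidden layer: 2 units, 3 integer inputs
b = [-4, 1]                    # hidden biases
V = [2, -3]                    # output layer weights
c = 5                          # output bias

def tiny_nn_keep(x):
    """Return True if the quantized network score exceeds zero."""
    hidden = [max(0, sum(w * xi for w, xi in zip(row, x)) + bi)  # ReLU
              for row, bi in zip(W, b)]
    score = sum(v * h for v, h in zip(V, hidden)) + c
    return score > 0
```

Because every operation is an integer multiply, add, or compare, the same computation maps directly onto fixed hardware with deterministic, nanosecond-scale latency, which is what makes on-detector AI filtering feasible at all.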