MIT researchers have developed a technique called CompreSSM that compresses AI models during training, cutting compute costs without sacrificing performance. The method uses control theory to identify and remove unnecessary components early in the training process, achieving up to 1.5x faster training while maintaining accuracy. https://news.mit.edu/2026/new-technique-makes-ai-models-leaner-faster-while-still-learning-0409 #AIagent #AI #GenAI #AIResearch

New technique makes AI models leaner and faster while they’re still learning
CompreSSM, an algorithm from MIT CSAIL, trims unnecessary complexity from AI models during training, making them leaner and faster as they continue to learn while cutting compute costs. It helps the model find its own efficient structure rather than imposing one after training is complete.
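The post doesn't describe CompreSSM's algorithm, but the classical control-theory tool for deciding which states of a system contribute little and can be removed is balanced truncation, which ranks states by their Hankel singular values. The sketch below is a generic illustration of that idea on a toy discrete-time linear state-space system; the function names, the toy system, and the choice of balanced truncation itself are assumptions for illustration, not details taken from the MIT work.

```python
import numpy as np

def gramian(A, M, iters=300):
    # Discrete-time Gramian: sum_k A^k (M M^T) (A^T)^k, for stable A
    # (spectral radius < 1, so the series converges).
    W = np.zeros((A.shape[0], A.shape[0]))
    Ak = np.eye(A.shape[0])
    for _ in range(iters):
        W += Ak @ M @ M.T @ Ak.T
        Ak = A @ Ak
    return W

def balanced_truncation(A, B, C, r):
    # Keep the r states with the largest Hankel singular values (HSVs);
    # small HSVs mark states that are hard to reach AND hard to observe,
    # i.e. "dead weight" that barely affects the input-output behavior.
    Wc = gramian(A, B)        # controllability Gramian
    Wo = gramian(A.T, C.T)    # observability Gramian
    L = np.linalg.cholesky(Wc)
    s2, U = np.linalg.eigh(L.T @ Wo @ L)
    idx = np.argsort(s2)[::-1]            # sort HSVs in descending order
    hsv = np.sqrt(np.clip(s2[idx], 1e-30, None))
    U = U[:, idx]
    T = L @ U / np.sqrt(hsv)              # balancing transform
    Tinv = (np.sqrt(hsv)[:, None] * U.T) @ np.linalg.inv(L)
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    return Ab[:r, :r], Bb[:r], Cb[:, :r], hsv

# Hypothetical toy system: three states, one only weakly coupled to the
# input and output, so its Hankel singular value is tiny.
A = np.diag([0.8, 0.5, 0.3])
B = np.array([[1.0], [1.0], [1e-3]])
C = np.array([[1.0, 1.0, 1e-3]])
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
```

Dropping the weakly coupled state leaves a 2-state model whose input-output response (the Markov parameters `C A^k B`) matches the original to within the discarded singular value, which is the sense in which such a reduction "sheds complexity without sacrificing performance."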