A tiny new open-source AI model performs as well as powerful big ones
The Allen Institute for Artificial Intelligence (#Ai2) has released a family of open-source models, called #Molmo, that it says perform as well as top proprietary models from OpenAI, Google, and Anthropic. The results suggest that training models on less, but higher-quality, data can lower computing costs.
Ai2 claims its biggest Molmo model, which has 72 billion parameters, outperforms GPT-4o, which is estimated to have over a trillion parameters.
https://www.technologyreview.com/2024/09/25/1104465/a-tiny-new-open-source-ai-model-performs-as-well-as-powerful-big-ones/

MIT Technology Review