• 🧠 #AI2 unveils #opensource #Molmo multimodal model family, competing with top proprietary models

• 🏆 72B-parameter Molmo outperforms #GPT4 in tests of image and document comprehension

• 🎯 7B-parameter version approaches state-of-the-art performance with significantly less data

• 📊 Trained on 600k high-quality, annotated images vs. billions in other models

• 👆 New "pointing" capability allows Molmo to identify specific elements in images

• 🌐 Available for developers on #HuggingFace, promoting open-source #AI development

https://www.technologyreview.com/2024/09/25/1104465/a-tiny-new-open-source-ai-model-performs-as-well-as-powerful-big-ones/

A tiny new open-source AI model performs as well as powerful big ones

The results suggest that training models on less, but higher-quality, data can lower computing costs.

MIT Technology Review