Tuesday, Dec 16, from 1-2pm PST: join an AMA with AI2 researchers (authors of the fully open Olmo & Molmo models). Ask your questions now! #AI2 #Olmo #Molmo #AIResearch #OpenModels #OpenModeling

https://www.reddit.com/r/LocalLLaMA/comments/1pniwfj/ai2_open_modeling_ama_ft_researchers_from_the/

@frescosecco #Molmo is stubbornly refusing to accept my position on the matter ... Maybe the two of you know something I don't 😅

These Mini AI Models Match OpenAI With 1,000 Times Less Data

Jason Dorrier discusses the AI industry's focus on scaling up models and contrasts it with the Allen Institute for AI's (Ai2) approach of building efficient, smaller models like Molmo.
Molmo outperforms larger models thanks to high-quality training data, and it is open source.

#ArtificialIntelligence #LLM #OpenSource #Ai2
#OpenAI #Molmo

https://singularityhub.com/2024/10/04/these-mini-ai-models-match-openai-with-1000-times-less-data/


Ai2's new family of open-source AI models is competitive with state-of-the-art models like OpenAI's GPT-4o, yet an order of magnitude smaller.

Singularity Hub
@codepo8 Multimodal LLMs like #Molmo can even read text from images.
Molmo: Innovation in Multimodal AI Models - TECHNEWSRO

Innovation never ceases to surprise us in the new world of AI. A recent example is the release of the Molmo family of multimodal models by the Allen Institute for AI (Ai2), which marks a significant step forward in the field. Molmo is not just another set of AI models; it is a demonstration of what is possible when

TECHNEWSRO - Passionate about technology

The Allen Institute for AI debuts Multimodal Open Language Model, #Molmo, the most capable #opensource #AI model with visual abilities yet

https://www.wired.com/story/molmo-open-source-multimodal-ai-model-allen-institute-agents/

#Ai2 #multimodal

The Most Capable Open Source AI Model Yet Could Supercharge AI Agents

A compact and fully open source visual AI model will make it easier for AI to take control of your computer—hopefully in a good way.

WIRED

• 🧠 #AI2 unveils #opensource #Molmo #LLM family, competing with top proprietary models

• 🏆 72B-parameter Molmo outperforms #GPT4 in image and document comprehension tests

• 🎯 7B-parameter version approaches state-of-the-art performance with significantly less data

• 📊 Trained on 600k high-quality, annotated images vs. billions in other models

• 👆 New "pointing" capability allows Molmo to identify specific elements in images

• 🌐 Available for developers on #HuggingFace, promoting open-source #AI development

https://www.technologyreview.com/2024/09/25/1104465/a-tiny-new-open-source-ai-model-performs-as-well-as-powerful-big-ones/

A tiny new open-source AI model performs as well as powerful big ones

The results suggest that training models on less, but higher-quality, data can lower computing costs.

MIT Technology Review
A tiny new open-source AI model performs as well as powerful big ones
The Allen Institute for Artificial Intelligence (#Ai2) has released a family of open-source models, called #Molmo, that it says performs as well as top proprietary models from OpenAI, Google, and Anthropic. The results suggest that training models on less, but higher-quality, data can lower computing costs.
Ai2 claims its biggest Molmo model, which has 72B parameters, outperforms GPT-4o, which is estimated to have over a trillion parameters.
https://www.technologyreview.com/2024/09/25/1104465/a-tiny-new-open-source-ai-model-performs-as-well-as-powerful-big-ones/