Min Choi (@minchoi)

Nvidia has released Nemotron-3 Super. The model has 120B parameters, of which 12B are active, using a Mixture of Experts (MoE) architecture, and it is an open-source model designed specifically for agents. Nvidia says the full setup and workflow for training and deployment has also been shared.

https://x.com/minchoi/status/2033216384159690805

#nvidia #nemotron3super #moe #opensource #aiagents


Less than 24 hours ago, Nvidia dropped Nemotron-3 Super. 120B parameters. 12B active (MoE). Open source. Built specifically for AI agents. Here's the full setup + workflow 👇
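The "120B parameters, 12B active" figure comes from the MoE design: a router selects only a few experts per token, so most weights sit idle on any given forward pass. A minimal, illustrative top-k routing sketch (not Nemotron's actual router; the dimensions, expert count, and router are toy assumptions):

```python
import numpy as np

def topk_moe_forward(x, gate_w, experts, k=2):
    """Illustrative top-k MoE layer: only k of the experts run per token."""
    logits = x @ gate_w                        # router scores, one per expert
    topk = np.argsort(logits)[-k:]             # indices of the k best experts
    weights = np.exp(logits[topk])
    weights /= weights.sum()                   # softmax over selected experts only
    # Only the selected experts compute; the rest stay idle --
    # this is why "active" parameters are far fewer than total parameters.
    return sum(w * experts[i](x) for w, i in zip(weights, topk))

rng = np.random.default_rng(0)
d, num_experts = 8, 16
x = rng.standard_normal(d)
gate_w = rng.standard_normal((d, num_experts))
# Each "expert" here is just a toy linear map of the hidden state.
experts = [(lambda W: (lambda v: v @ W))(rng.standard_normal((d, d)))
           for _ in range(num_experts)]
y = topk_moe_forward(x, gate_w, experts, k=2)
print(y.shape)
```

With k=2 of 16 experts, only 1/8 of the expert weights participate per token, which mirrors how a 120B-total model can run with roughly 12B active parameters.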


🚀 Nvidia's new Nemotron 3 Super combines a 3‑arch design with Multi‑Token Prediction (MTP) and speculative decoding, promising to outpace GPT‑OSS and Qwen on Blackwell GPUs. The open‑source community gets a powerful, efficient model to experiment with. Dive in to see how this leap could reshape AI research! #Nvidia #Nemotron3Super #MTP #OpenSourceAI
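The speculative-decoding idea mentioned above: a cheap draft model (with MTP, the model's own extra prediction heads can play this role) proposes several tokens ahead, and the large model verifies them in one pass, keeping the longest agreed prefix. A toy greedy-verification sketch, with made-up stand-in models, not Nemotron's implementation:

```python
def speculative_decode_step(draft_next, target_next, prefix, num_draft=4):
    """Greedy speculative decoding: the draft model proposes num_draft tokens;
    the target model verifies them and keeps the longest matching prefix,
    plus one corrected token at the first disagreement."""
    proposal = list(prefix)
    for _ in range(num_draft):
        proposal.append(draft_next(proposal))      # cheap draft proposals
    accepted = list(prefix)
    for i in range(len(prefix), len(proposal)):
        t = target_next(proposal[:i])              # target's greedy choice here
        accepted.append(t)                          # always keep target's token
        if t != proposal[i]:                        # first mismatch: stop early
            break
    return accepted

# Toy "models" over integer token ids; the draft mostly agrees with the target.
target = lambda seq: (sum(seq) + 1) % 5
draft = lambda seq: (sum(seq) + 1) % 5 if len(seq) % 3 else (sum(seq) + 2) % 5
out = speculative_decode_step(draft, target, [1, 2], num_draft=4)
print(out)
```

When draft and target agree, several tokens are accepted per target-model pass instead of one, which is where the claimed throughput gains come from.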

🔗 https://aidailypost.com/news/nvidias-nemotron-3-super-merges-3arch-design-mtp-outpace-gptoss-qwen