Arcee just dropped Trinity‑Large‑TrueBase, a raw 10‑trillion‑token checkpoint. This open‑source milestone pushes U.S. AI research forward, offering a massive base model for anyone to fine‑tune. Dive into the details of the training process and see how it can accelerate your ML projects. #OpenSourceAI #TrinityLarge #Arcee #BaseModel

🔗 https://aidailypost.com/news/arcee-releases-trinity-large-truebase-raw-10trilliontoken-checkpoint

#Arcee AI, a 30-person #startup, released #Trinity, a 400B-parameter #opensource #LLM. The company aims to compete with #Meta’s #Llama and other large models, particularly appealing to developers and academics. Arcee emphasises its commitment to open source, using the Apache licence, and offers Trinity in various versions for different use cases. https://techcrunch.com/2026/01/28/tiny-startup-arcee-ai-built-a-400b-open-source-llm-from-scratch-to-best-metas-llama/?eicker.news #tech #media #news
Tiny startup Arcee AI built a 400B-parameter open source LLM from scratch to best Meta's Llama | TechCrunch

30-person startup Arcee AI has released a 400B model called Trinity, which it says is one of the biggest open source foundation models from a U.S. company.

AI Leaks and News (@AILeaksAndNews)

Arcee has released Trinity Large. According to the announcement, Trinity Large is an open-source 400B-parameter MoE (Mixture-of-Experts) model that delivers frontier-level performance with only 13B active parameters at inference time. This looks like a significant release in the U.S.-centered open-source AI race.

https://x.com/AILeaksAndNews/status/2016291533918548041

#arcee #trinity #moe #opensource #llm

Arcee have released Trinity Large. The open source 400B parameter MoE model delivers frontier-level performance with only 13B active parameters. America answers back in the open source AI race.

Arcee just open‑sourced two new LLM families under Apache‑2.0: the 26B Trinity Mini and the 6B Trinity Nano preview. Both use AFMoE’s Mixture‑of‑Experts, bringing DeepSeek‑style efficiency to the community. Check out the details on architecture, training tricks, and how they compare to Qwen. A big step for open‑source AI! #Apache2 #Arcee #TrinityMini #AFMoE

🔗 https://aidailypost.com/news/arcee-releases-apache20-trinity-mini-26b-nano-preview-6b-models
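The "13B active parameters out of 400B" claim above comes from how Mixture-of-Experts layers work: a router sends each token to only a few experts, so most of the layer's weights sit idle for any given token. Here is a toy sketch of top-k MoE routing; the sizes, the ReLU feed-forward experts, and the softmax gating are illustrative assumptions, not Arcee's actual AFMoE design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MoE layer: n_experts feed-forward experts, but each token is
# routed to only the top_k highest-scoring ones. Sizes are tiny and
# made up for illustration.
d_model, d_ff = 64, 256
n_experts, top_k = 8, 2

# Per-expert FFN weights plus a routing (gating) matrix.
experts = [(rng.standard_normal((d_model, d_ff)) * 0.02,
            rng.standard_normal((d_ff, d_model)) * 0.02)
           for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_forward(x):
    """Route one token vector through its top-k experts only."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]        # indices of chosen experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over the top-k scores
    out = np.zeros_like(x)
    for w, i in zip(weights, top):
        w_in, w_out = experts[i]
        out += w * (np.maximum(x @ w_in, 0.0) @ w_out)  # ReLU FFN expert
    return out

token = rng.standard_normal(d_model)
y = moe_forward(token)

# Only top_k of n_experts experts run per token, so the active
# parameter count is a fraction of the total -- the same idea behind
# 13B active out of 400B total.
total = n_experts * 2 * d_model * d_ff
active = top_k * 2 * d_model * d_ff
print(f"active fraction per token: {active / total:.2f}")
```

With 2 of 8 experts active, only 25% of this toy layer's FFN parameters participate in each token's forward pass; a real model scales the same ratio to hundreds of billions of parameters.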

Random Old Comic: Rodimus Gets a Dog, Part III https://www.toyboxcomix.com/2019/05/03/rodimus-gets-a-dog-part-iii/ #Rodimus Gets a Dog, Part III No action figures were harmed in the making of this comic. #Arcee #Transformers