I can't help but root for tiny open-source AI model maker Arcee – ButterWord

Arcee is a tiny 26-person U.S. startup that built a high-performing, massive, open source LLM. And it's gaining popularity with OpenClaw users.

ButterWord
Arcee AI has released Trinity-Large-Thinking, a 399-billion parameter open-weight reasoning model under Apache 2.0. The model nearly matches Claude Opus 4.6 on agent benchmarks while running 96% cheaper. https://venturebeat.com/technology/arcees-new-open-source-trinity-large-thinking-is-the-rare-powerful-u-s-made #AIagent #AI #GenAI #AIResearch #Arcee
Arcee AI has released Trinity Large Thinking, a 400B parameter sparse MoE reasoning model under Apache 2.0 license. The model activates only 13B parameters per token while maintaining frontier-class performance on agentic tasks. Designed for long-horizon autonomous agents and multi-turn tool use. https://www.marktechpost.com/2026/04/02/arcee-ai-releases-trinity-large-thinking-an-apache-2-0-open-reasoning-model-for-long-horizon-agents-and-tool-use/ #AIagent #AI #GenAI #AgenticAI #Arcee
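The "activates only 13B parameters per token" behavior described above is the hallmark of sparse mixture-of-experts routing: a gate scores all experts, but only the top-k actually run for each token. A minimal sketch in plain Python with toy sizes — this is a generic top-k MoE gate for illustration, not Arcee's actual implementation:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def top_k_route(gate_logits, k):
    """Pick the k highest-scoring experts and renormalize their weights.

    Returns a list of (expert_index, weight) pairs. Only these experts
    run for this token; the rest stay idle, which is how a 400B-total
    MoE can activate only ~13B parameters per token.
    """
    top = sorted(range(len(gate_logits)),
                 key=lambda i: gate_logits[i], reverse=True)[:k]
    weights = softmax([gate_logits[i] for i in top])
    return list(zip(top, weights))

def moe_forward(x, experts, gate_logits, k=2):
    """Combine the outputs of only the routed experts, weighted by the gate."""
    routed = top_k_route(gate_logits, k)
    return sum(w * experts[i](x) for i, w in routed)

# Toy demo: 8 "experts" that each just scale their input by a constant.
experts = [lambda x, s=s: s * x for s in range(1, 9)]
gate_logits = [0.1, 2.0, -1.0, 0.5, 3.0, -0.3, 0.0, 1.2]
y = moe_forward(1.0, experts, gate_logits, k=2)  # only experts 4 and 1 run
```

With k=2 out of 8 experts, 6 of the 8 expert weight matrices are never touched for this token, which is where the inference-cost savings come from.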
Arcee AI Releases Trinity Large Thinking: An Apache 2.0 Open Reasoning Model for Long-Horizon Agents and Tool Use

MarkTechPost
Random Old Comic: Starring Finn Wolfhard As Chip Chase https://www.toyboxcomix.com/2020/03/02/starring-finn-wolfhard-as-chip-chase/ #Arcee #ChipChase #Ratchet #Rodimus #SpikeWitwicky #Transformers #Wheeljack

Arcee just dropped Trinity-Large-TrueBase, a raw 10-trillion-token checkpoint. This open-source milestone pushes U.S. AI research forward, offering a massive base model for anyone to fine-tune. Dive into the details of the training process and see how it can accelerate your ML projects. #OpenSourceAI #TrinityLarge #Arcee #BaseModel

🔗 https://aidailypost.com/news/arcee-releases-trinity-large-truebase-raw-10trilliontoken-checkpoint

#Arcee AI, a 30-person #startup, released #Trinity, a 400B-parameter #opensource #LLM. The company aims to compete with #Meta’s #Llama and other large models, particularly appealing to developers and academics. Arcee emphasises its commitment to open source, using the Apache licence, and offers Trinity in various versions for different use cases. https://techcrunch.com/2026/01/28/tiny-startup-arcee-ai-built-a-400b-open-source-llm-from-scratch-to-best-metas-llama/?eicker.news #tech #media #news
Tiny startup Arcee AI built a 400B-parameter open source LLM from scratch to best Meta's Llama | TechCrunch

30-person startup Arcee AI has released a 400B model called Trinity, which it says is one of the biggest open source foundation models from a U.S. company.

TechCrunch

AI Leaks and News (@AILeaksAndNews)

Arcee has unveiled Trinity Large. According to the announcement, Trinity Large is an open-source 400B-parameter MoE (mixture-of-experts) model that delivers frontier-class performance with only 13B active parameters at inference time. This looks like a significant release in the U.S.-centered open-source AI race.

https://x.com/AILeaksAndNews/status/2016291533918548041

#arcee #trinity #moe #opensource #llm
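For a rough sense of what "13B active out of 400B total" buys you, here is a back-of-envelope calculation. The 400B and 13B figures come from the posts above; the bytes-per-parameter value is an assumption for illustration only:

```python
total_params = 400e9   # total parameters reported for Trinity Large
active_params = 13e9   # parameters activated per token (sparse MoE)

# Fraction of the network doing work on any given token.
active_fraction = active_params / total_params  # 0.0325, i.e. ~3.25%

# Per-token compute scales roughly with active parameters, so a dense
# 400B model would do ~30x more FLOPs per token than this MoE
# (assuming comparable architectures; routing overhead is ignored).
dense_vs_moe_flops = total_params / active_params

# The weights still have to live somewhere: at 2 bytes per parameter
# (bf16, an assumed precision), the full model is ~800 GB of memory.
weight_bytes = total_params * 2
print(f"{active_fraction:.2%} active, ~{weight_bytes / 1e9:.0f} GB of bf16 weights")
```

The takeaway: sparse MoE cuts per-token compute dramatically, but memory footprint is still set by the total parameter count.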

AI Leaks and News (@AILeaksAndNews) on X

Arcee have released Trinity Large. The open source 400B parameter MoE model delivers frontier-level performance with only 13B active parameters. America answers back in the open source AI race.

X (formerly Twitter)