Treatmybrand

Arcee Launches Trinity Models to Reinvent Open-Source AI in the U.S. Under Apache 2.0 License

VentureBeat

Throughout 2025, the forefront of open-source large language models (LLMs) has been dominated primarily by Chinese labs such as Alibaba's Qwen and Baidu's Ernie, while U.S. efforts have lagged in the open-weight arena. Today, Arcee AI aims to shift this landscape with the release of its Trinity Mini and Trinity Nano Preview models: open-weight Mixture-of-Experts (MoE) models engineered and fully trained in the U.S. Both models are accessible in chatbot form at chat.arcee.ai and available for download and modification via Hugging Face under the enterprise-friendly Apache 2.0 license.

Trinity Mini, with 26 billion parameters, offers robust reasoning and tool-use capabilities, while the smaller 6-billion-parameter Nano Preview emphasizes chat interaction with personality. Arcee collaborated with DatologyAI to curate a high-quality 10-trillion-token training dataset and partnered with Prime Intellect for the state-of-the-art U.S. infrastructure used to train the models. The upcoming Trinity Large, a 420-billion-parameter model designed for enterprise-level AI, is currently in training and expected to launch in January 2026.

Arcee's initiative signals a renewed commitment to model sovereignty and open AI innovation within the U.S., challenging the global dominance of Chinese open models and stressing ownership of the entire training process rather than just fine-tuning.
