Arcee Unveils Trinity Large Thinking, a 400B-Parameter Open-Source AI Rival
Image: TechCrunch


07 April, 2026 · Technology and Science · 7 sources

Key Takeaways

  • Arcee AI released Trinity Large, a 399–400 billion-parameter open-source LLM.
  • Trinity is built from scratch in the United States and released under Apache 2.0.
  • The model aims to compete with Meta’s Llama 4 and open-source peers.

Open Source Ambition

The model is described as the most capable open-weight model ever released by a non-Chinese company.

Image: CryptoRank

Trinity aims to provide Western companies an alternative to Chinese-built models perceived as risky.

Benchmarks show Trinity holds its own against Meta's Llama 4 Maverick 400B.

Arcee trained the model on a $20 million budget across 33 days using 2048 GPUs.
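The reported budget figures imply a rough cost per GPU-hour. A quick sanity check (the per-hour rate is derived here, not stated in any source):

```python
# Back-of-the-envelope check of the reported training budget.
budget_usd = 20_000_000   # stated budget
gpus = 2048               # stated GPU count
days = 33                 # stated training duration

gpu_hours = gpus * days * 24          # total GPU-hours consumed
cost_per_gpu_hour = budget_usd / gpu_hours

print(f"{gpu_hours:,} GPU-hours, ~${cost_per_gpu_hour:.2f}/GPU-hour")
# ~1.6 million GPU-hours at roughly $12/GPU-hour
```

That implied rate sits in the plausible range for rented high-end accelerators, which makes the stated figures internally consistent.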

Sparsity and Scalability

Trinity employs a sparse Mixture-of-Experts architecture with 400 billion parameters but only 13 billion active per token.

This high-sparsity design enables inference speeds two to three times faster than competing models.
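The speedup comes from activating only a small slice of the network per token: roughly 13B of 400B parameters, about 3 percent. A minimal top-k gating sketch illustrates the idea (expert count and k are illustrative, not Arcee's actual configuration):

```python
import numpy as np

def topk_gate(logits, k):
    """Route a token to its top-k experts; all other experts stay inactive."""
    idx = np.argsort(logits)[-k:]        # indices of the k highest-scoring experts
    gates = np.zeros_like(logits)
    w = np.exp(logits[idx] - logits[idx].max())
    gates[idx] = w / w.sum()             # softmax over the selected experts only
    return gates

rng = np.random.default_rng(0)
num_experts, k = 64, 2                   # illustrative numbers only
gates = topk_gate(rng.normal(size=num_experts), k)
active = np.count_nonzero(gates)
print(active, f"{active / num_experts:.1%}")  # 2 experts active, ~3% of capacity
```

Because each token touches only k experts, per-token compute scales with the active parameters rather than the full 400B, which is where the claimed two-to-three-times inference speedup comes from.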

Image: Ecosistema Startup

Trinity currently supports only text input and output, but visual and speech models are planned.

Arcee also released the TrueBase checkpoint, a raw pre-trained model with no instruction data, a rarity among open releases.

This transparency is central to Arcee's strategy of giving users an option that does not depend on the whims of the largest AI companies.

Ecosystem Adoption

Arcee's open licensing contrasts with closed models, such as Anthropic's decision to separate OpenClaw from its subscriptions.

Arcee's API offers an accessible commercial pathway, but the long-term play is open source.

The model's release resonates with startups in Latin America and other regions seeking compliance-friendly solutions.

Arcee aims to push open models from China out of the market.
