The prevailing belief in the AI industry is that foundation models will ultimately be dominated by a small group of Big Tech players: Google, Meta, Microsoft, and a handful of preferred model labs. Arcee AI, a 30-person startup, is openly challenging that assumption.
The company has released Trinity, a permanently open, Apache-licensed foundation model with 400 billion parameters. According to Arcee AI, that makes Trinity one of the largest open source general-purpose language models ever trained and released by a U.S.-based company. The model is positioned as a direct peer to Meta’s Llama 4 Maverick 400B and GLM-4.5, a prominent open model from Z.ai, the Chinese AI company spun out of Tsinghua University.
Benchmark tests conducted on base models with minimal post-training show Trinity performing competitively across coding, mathematics, reasoning, knowledge, and common-sense tasks. In several categories, Arcee claims, Trinity slightly outperforms Llama 4 Maverick. Like other modern large models, Trinity is optimized for code generation and multi-step, agentic workflows. At launch, however, it supports text-only input and output.
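In practice, a text-only open-weight model like this would be consumed the way any other open checkpoint is. The sketch below is a minimal example using Hugging Face Transformers; the repo id is a placeholder assumption, since the actual checkpoint identifiers are not specified here.

```python
# Minimal text-generation sketch with Hugging Face Transformers.
# "arcee-ai/Trinity-Mini" is a hypothetical repo id used for illustration;
# check Arcee AI's Hugging Face organization for the real checkpoint names.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Trinity-Mini"  # placeholder identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Write a Python function that merges two sorted lists."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```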
Multimodal expansion is already underway. CTO Lucas Atkins confirmed that a vision model is currently in development, with speech-to-text capabilities planned for future releases. While Meta’s Llama 4 Maverick already supports images alongside text, Arcee deliberately prioritized building a strong text-based foundation model first, targeting developers and academic researchers as its primary audience.
That audience focus also reflects a broader strategic goal. Arcee aims to attract U.S. companies that are hesitant or unable to adopt open models originating from China. Licensing plays a central role in this positioning. Trinity is released under the Apache 2.0 license, which permits commercial use and modification with few restrictions.
“Ultimately, the winners of this game, and the only way to really win over the usage, is to have the best open-weight model,” Atkins said. “To win the hearts and minds of developers, you have to give them the best.”
Arcee’s leadership has been explicit about why it believes another U.S.-built open model is necessary, even with Meta’s Llama already on the market. Atkins argues that Llama’s Meta-controlled license imposes commercial and usage restrictions that some open source advocates say fall short of true open source standards.
The company’s decision to train its own frontier-scale model marked a significant shift from its original business. Arcee initially focused on post-training and customization of existing open models for enterprise clients, including SK Telecom. The team would adapt models such as Llama, Mistral, and Qwen through reinforcement learning and fine-tuning to suit specific production needs.
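As a rough illustration of that original business, the sketch below shows one common form of such customization: attaching LoRA adapters to an open base model for parameter-efficient fine-tuning. The base model id and adapter settings are placeholder assumptions, not a description of Arcee’s actual production pipelines, which are not public.

```python
# Parameter-efficient fine-tuning sketch with Hugging Face PEFT (LoRA).
# The base model id and adapter settings are illustrative placeholders,
# not a description of Arcee's actual customization pipeline.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_id = "meta-llama/Llama-3.1-8B"  # stand-in for any open base model
model = AutoModelForCausalLM.from_pretrained(base_id)

# Attach low-rank adapters so only a small fraction of weights is trained.
lora = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],  # attention projections, a common choice
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()
# From here, a standard supervised fine-tuning loop (e.g., TRL's SFTTrainer)
# would train the adapters on client-specific data.
```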
As the company’s customer base expanded, reliance on third-party models became a growing concern. At the same time, many of the strongest open models were coming from China, introducing regulatory and procurement challenges for U.S. enterprises. These pressures ultimately pushed Arcee toward pre-training its own foundation model.
The move was far from routine. CEO Mark McQuade noted that very few organizations globally have successfully trained and released models at this scale. The company began cautiously, first building a small 4.5-billion-parameter model in collaboration with DatologyAI. That project’s success laid the groundwork for larger efforts, including Trinity Nano (6B parameters) and Trinity Mini (26B parameters), both released in December.
Training the full Trinity lineup was completed in approximately six months at a total cost of $20 million. The work was carried out using 2,048 Nvidia Blackwell B300 GPUs and funded from the roughly $50 million Arcee AI has raised to date.
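Those figures imply a rough unit economics picture. A back-of-envelope calculation, assuming for simplicity that all 2,048 GPUs ran continuously for the full six months (which overstates real utilization), puts the implied compute cost at roughly $2.26 per GPU-hour:

```python
# Back-of-envelope check on the reported training budget.
# Assumes the whole cluster ran continuously for six months, a simplification;
# real utilization across experiments and restarts would be lower.
gpus = 2048
hours = 6 * 30 * 24            # ~4,320 hours in six months
gpu_hours = gpus * hours       # ~8.85 million GPU-hours
total_cost = 20_000_000        # reported total spend, USD

print(f"Implied cost: ${total_cost / gpu_hours:.2f} per GPU-hour")
# -> Implied cost: $2.26 per GPU-hour
```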
While the investment was substantial for a startup of Arcee’s size, Atkins emphasized that it remains modest compared to the spending of major AI labs. The compressed timeline was intentional.

“We are a younger startup that’s extremely hungry,” Atkins said. “We have a tremendous amount of talent and bright young researchers who, when given the opportunity to spend this amount of money and train a model of this size, we trusted that they’d rise to the occasion. And they certainly did, with many sleepless nights, many long hours.”
With Trinity now in preview and further post-training underway, Arcee AI is positioning itself not just as a service provider, but as a long-term U.S. alternative in the global open source AI ecosystem, one built around permanent openness and competitive scale.