lckr committed · Commit cb50784 · verified · 1 Parent: d2854aa

Update README.md

Files changed (1): README.md (+1 −0)
README.md

```diff
@@ -48,6 +48,7 @@ More details on the training of Trinity Large are available in the [technical re
 The Trinity Large family consists of three checkpoints from the same training run:

 - **Trinity-Large-Base** (this release): Full 17T-token pretrained foundation model with mid-training anneals
+- **[Trinity-Large-Thinking](https://huggingface.co/arcee-ai/Trinity-Large-Thinking)**: Reasoning-optimized, agentic post-training with extended chain-of-thought
 - **[Trinity-Large-TrueBase](https://huggingface.co/arcee-ai/Trinity-Large-TrueBase)**: 10T-token pre-anneal checkpoint with no instruction data
 - **[Trinity-Large-Preview](https://huggingface.co/arcee-ai/Trinity-Large-Preview)**: Lightly post-trained, chat-ready model undergoing active RL
```