Anneketh Vij committed
Commit ccc8d5b · verified · 1 parent: 2f394c5

Update README.md

Files changed (1): README.md (+2 −1)
README.md CHANGED
@@ -55,9 +55,10 @@ More details on the training of Trinity Large are available in the [technical re
 
 ## Model Variants
 
-The Trinity Large family consists of three checkpoints from the same training run:
+The Trinity Large family consists of four checkpoints from the same training run:
 
 - **[Trinity-Large-Preview](https://huggingface.co/arcee-ai/Trinity-Large-Preview)**: Lightly post-trained, chat-ready model undergoing active RL
+- **[Trinity-Large-Thinking](https://huggingface.co/arcee-ai/Trinity-Large-Thinking)**: Reasoning-optimized, agentic post-training with extended chain-of-thought
 - **[Trinity-Large-TrueBase](https://huggingface.co/arcee-ai/Trinity-Large-TrueBase)**: 10T-token pre-anneal pretraining checkpoint
 - **[Trinity-Large-Base](https://huggingface.co/arcee-ai/Trinity-Large-Base)**: Full 17T-token pretrained foundation model with mid-training anneals
 
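For reference, the four repo IDs listed in the updated README can be collected into a small lookup table. A minimal sketch: the dictionary name and helper function are illustrative conveniences, not part of the README; only the repo IDs themselves come from the diff above.

```python
# Map each Trinity Large variant to its Hugging Face repo ID.
# Repo IDs are taken from the README diff; the helper itself is a
# hypothetical convenience (e.g. for passing a repo ID to a loader).
TRINITY_LARGE_VARIANTS = {
    "preview": "arcee-ai/Trinity-Large-Preview",    # chat-ready, active RL
    "thinking": "arcee-ai/Trinity-Large-Thinking",  # reasoning, extended chain-of-thought
    "truebase": "arcee-ai/Trinity-Large-TrueBase",  # 10T-token pre-anneal checkpoint
    "base": "arcee-ai/Trinity-Large-Base",          # full 17T-token base with anneals
}

def trinity_repo(variant: str) -> str:
    """Return the Hugging Face repo ID for a Trinity Large variant name."""
    try:
        return TRINITY_LARGE_VARIANTS[variant.lower()]
    except KeyError:
        raise ValueError(
            f"unknown variant {variant!r}; choose from {sorted(TRINITY_LARGE_VARIANTS)}"
        )

print(trinity_repo("thinking"))  # → arcee-ai/Trinity-Large-Thinking
```

The lookup is case-insensitive on the variant name, so "Preview" and "preview" resolve to the same repo ID.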