# 30% Smaller, +2.4% Better

Qwen2.5-1.5B pruned by 30% and retrained for the general domain through Experiential Plasticity.

2.50 → 2.44 perplexity · 3 cycles

Every claim on this card can be independently verified.

Trust: self-attested · 1 benchmark · 2 devices tested

ForgeAlloy chain of custody · Download alloy · Merkle-chained

Qwen2.5-1.5B with cryptographic provenance via the ForgeAlloy chain of custody.
## Benchmarks

| Benchmark | Result | Verified |
|---|---|---|
| Perplexity (general) | 2.44 | Self-reported |
## What Changed (Base → Forged)

| Metric | Base | Forged | Delta |
|---|---|---|---|
| Perplexity (general) | 2.50 | 2.44 | -2.4% ↓ |
| Pruning | None | 30% heads (magnitude) | -30% params ↓ |
| Training | General | General, 1000 steps | LR 2e-4, 3 cycles |
| Pipeline | None | prune → train | 3 cycles |
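The -2.4% delta is simply the relative change in perplexity between the base and forged checkpoints; as a quick sanity check:

```python
# Relative change in perplexity: (forged - base) / base
base, forged = 2.50, 2.44
delta_pct = (forged - base) / base * 100
print(f"{delta_pct:+.1f}%")  # -2.4%
```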
## Runs On

| Device | Format | Size | Speed |
|---|---|---|---|
| MacBook Air 8GB | fp16 | – | Verified |
| MacBook Pro 16GB | fp16 | – | Verified |
| MacBook Pro 32GB | fp16 | 8.0GB | Expected |
| MacBook Air 16GB | Q8_0 | ~4.0GB | Expected |
| MacBook Air 8GB | Q4_K_M | ~2.5GB | Expected |
| iPhone / Android | Q4_K_M | ~2.5GB | Expected |
## Quick Start

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the forged checkpoint; device_map="auto" places it on GPU/MPS when available.
model = AutoModelForCausalLM.from_pretrained(
    "continuum-ai/qwen2.5-1.5b-general-forged",
    torch_dtype="auto",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("continuum-ai/qwen2.5-1.5b-general-forged")

inputs = tokenizer("def merge_sort(arr):", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
## Methodology

Produced via head pruning. Full methodology, ablations, and per-stage rationale are in the methodology paper and the companion MODEL_METHODOLOGY.md in this repository. The pipeline ran as prune → train over 3 cycles on a MacBook Air 8GB.
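The card describes magnitude-based head pruning. A minimal sketch of what that can look like is below; this is illustrative only, not the ForgeAlloy implementation. `head_importance` and `heads_to_prune` are hypothetical names, and the sketch scores each attention head by the L2 norm of its column slice of the output projection:

```python
import torch

def head_importance(attn_weight: torch.Tensor, num_heads: int) -> torch.Tensor:
    """Score each head by the L2 magnitude of its slice of a projection matrix.

    attn_weight: (hidden, hidden) weight, e.g. an attention output projection,
    whose input columns are the concatenated per-head outputs.
    """
    hidden = attn_weight.shape[1]
    head_dim = hidden // num_heads
    # Group columns into per-head blocks, then take the L2 norm of each block.
    per_head = attn_weight.view(attn_weight.shape[0], num_heads, head_dim)
    return per_head.norm(dim=(0, 2))

def heads_to_prune(attn_weight: torch.Tensor, num_heads: int, ratio: float = 0.30):
    """Return indices of the lowest-magnitude heads to remove (30% by default)."""
    scores = head_importance(attn_weight, num_heads)
    k = int(num_heads * ratio)
    return torch.argsort(scores)[:k].tolist()
```

Under a prune → train loop, a step like this would run once per cycle, with retraining in between to recover quality before the next round of pruning.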
## Chain of Custody

Scan the QR code or verify online. Download the alloy file to verify independently.

| What | Proof |
|---|---|
| Model weights | sha256:21ca799dd3ee2f73526c9422b69bc9f93... |
| Code that ran | sha256:legacy-pre-alloy-... |
| Forged on | MacBook Air 8GB, 2026-03-27T09:01:22-05:00 |
| Trust level | self-attested |
| Spec | ForgeAlloy · Rust/Python/TypeScript |
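Independent verification of the weights hash needs nothing beyond the standard library. A sketch, assuming you have downloaded the weight file (the filename below is hypothetical; the full digest, truncated on this card, is in the alloy file):

```python
import hashlib

def sha256_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large weight shards never sit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the prefix printed on this card:
# digest = sha256_file("model.safetensors")  # hypothetical filename
# assert digest.startswith("21ca799dd3ee2f73526c9422b69bc9f93")
```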
## Make Your Own

Forged with Continuum, a distributed AI world that runs on your hardware.

The Factory configurator lets you design and forge custom models visually: context extension, pruning, LoRA, quantization, and vision/audio modalities. Pick your target devices, and the system figures out what fits.

GitHub · All Models · Forge-Alloy
## License

apache-2.0
