# BinaryLLM (HF export)

A tokenizer-free / base-N language model, exported in Hugging Face Transformers format.

## Load

```python
from transformers import AutoModelForCausalLM

# trust_remote_code=True is required: the custom model class ships with the repo.
m = AutoModelForCausalLM.from_pretrained(
    "./hf_binaryllm_repo",
    trust_remote_code=True,
)
```
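Since the model is tokenizer-free, inputs are raw symbol ids in some base N rather than learned subword tokens. The exact encoding this export uses is not documented here; as a hedged illustration, the sketch below shows the simplest case, base-256, where each UTF-8 byte is one id (`encode`/`decode` are hypothetical helpers, not part of the repo):

```python
# Illustration only: base-256 ("byte-level") encoding, one id per UTF-8 byte.
# A tokenizer-free model consumes such id sequences directly, with no
# tokenizer vocabulary to download or train.

def encode(text: str) -> list[int]:
    """Map text to a sequence of base-256 ids (its UTF-8 bytes)."""
    return list(text.encode("utf-8"))

def decode(ids: list[int]) -> str:
    """Invert encode(): bytes back to text."""
    return bytes(ids).decode("utf-8")

ids = encode("Hi")
print(ids)                    # [72, 105]
assert decode(ids) == "Hi"    # lossless round trip
```

For a different base N, the same idea applies with ids in `range(N)`; check the repo's remote code for the encoding this model actually expects.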
Model repository: PhysiQuanty/Wiki-Test2