Instructions for using trl-internal-testing/tiny-LlamaForSequenceClassification-3.2 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use trl-internal-testing/tiny-LlamaForSequenceClassification-3.2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="trl-internal-testing/tiny-LlamaForSequenceClassification-3.2")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("trl-internal-testing/tiny-LlamaForSequenceClassification-3.2")
model = AutoModelForSequenceClassification.from_pretrained("trl-internal-testing/tiny-LlamaForSequenceClassification-3.2")
```

- Notebooks
- Google Colab
- Kaggle
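Once loaded, the pipeline can be called directly on text. Since this is a tiny internal test model with an untrained classification head, the labels and scores are structurally valid but not meaningful; the sketch below is illustrative only:

```python
from transformers import pipeline

# Tiny test model: output is structurally valid but not meaningful
pipe = pipeline("text-classification", model="trl-internal-testing/tiny-LlamaForSequenceClassification-3.2")

result = pipe("The quick brown fox jumps over the lazy dog.")
print(result)  # a list of {'label': ..., 'score': ...} dicts
```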
- Xet hash: `da070f8626ee36c29d81d14f7a18b0a943ba6477aaa86b433d0f865f98bf8392`
- Size of remote file: 17.2 MB
- SHA256: `6b9e4e7fb171f92fd137b777cc2714bf87d11576700a1dcd7a399e7bbe39537b`
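To verify a downloaded file against the SHA256 digest above, a minimal `hashlib` sketch works; the local path used here is a hypothetical placeholder, not a path the page specifies:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 1 MiB chunks so large files never load fully into memory
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# "model.safetensors" is a hypothetical local filename for the downloaded file
# print(sha256_of("model.safetensors"))
```

Compare the printed digest to the SHA256 listed above; a mismatch indicates a corrupted or incomplete download.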
Xet efficiently stores large files inside Git by splitting them into unique chunks, which accelerates uploads and downloads.