How to use DDSC/roberta-base-scandinavian with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="DDSC/roberta-base-scandinavian")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("DDSC/roberta-base-scandinavian")
model = AutoModelForMaskedLM.from_pretrained("DDSC/roberta-base-scandinavian")
```
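Once loaded, the model can be queried with a masked sentence. Below is a minimal usage sketch; the Danish example sentence is illustrative only, and the snippet assumes the tokenizer uses the standard RoBERTa `<mask>` token.

```python
# Hypothetical usage sketch: fill in a masked token with the pipeline above.
# The example sentence is an assumption for illustration, not from the model card.
predictions = pipe("Der ligger en bog på <mask>.")
for p in predictions:
    # Each prediction contains the proposed token and its probability score.
    print(f"{p['token_str']}: {p['score']:.3f}")
```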
This is a sample reference model for Flax/JAX training, trained only on the mC4 dataset. It was trained for roughly three days on a TPU v3-8. Training procedure...
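Since the checkpoint was trained with Flax/JAX, it can also be loaded with the Flax model classes. This is a minimal sketch assuming Flax weights are published on the Hub for this checkpoint (if only PyTorch weights exist, `from_pt=True` can be passed to convert them); the example sentence is illustrative only.

```python
# Minimal sketch: load the checkpoint with the Flax/JAX model class.
# Assumes Flax weights are available on the Hub for this checkpoint.
from transformers import AutoTokenizer, FlaxAutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("DDSC/roberta-base-scandinavian")
model = FlaxAutoModelForMaskedLM.from_pretrained("DDSC/roberta-base-scandinavian")

# Example sentence is an assumption for illustration.
inputs = tokenizer("Det här är en <mask> mening.", return_tensors="np")
logits = model(**inputs).logits  # shape: (batch, sequence_length, vocab_size)
```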