Paper: RoBERTa: A Robustly Optimized BERT Pretraining Approach (arXiv:1907.11692)
HeavyBERTa is a specialized protein language model built on the RoBERTa architecture, pre-trained on unpaired heavy-chain antibody sequences from the Observed Antibody Space (OAS) database.
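RoBERTa-style pretraining is a masked-language-model objective: a fraction of input tokens is masked at random (dynamically, per epoch) and the model learns to recover them. A minimal sketch of that masking step for an antibody sequence, assuming character-level (per-residue) tokenization and a simplified scheme that always substitutes the mask token (the actual tokenizer, mask rate, and 80/10/10 replacement details may differ in HeavyBERTa):

```python
import random

def mask_sequence(seq, mask_token="<mask>", mask_prob=0.15, rng=None):
    """Randomly mask residues in an amino-acid sequence, in the style of
    RoBERTa's dynamic masked-language-model pretraining objective.
    Character-level tokenization is an assumption for illustration."""
    rng = rng or random.Random()
    tokens = list(seq)
    labels = [None] * len(tokens)  # prediction targets; None = not masked
    for i, residue in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = residue      # original residue becomes the target
            tokens[i] = mask_token   # input position is replaced by <mask>
    return tokens, labels

# Illustrative heavy-chain fragment (not a real OAS entry)
seq = "EVQLVESGGGLVQPGGSLRLSCAAS"
tokens, labels = mask_sequence(seq, rng=random.Random(0))
```

Calling `mask_sequence` with a fresh random state each epoch reproduces the "dynamic masking" idea: the same sequence yields different masked positions every time it is seen.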
For further details and access to the code, visit our GitHub repository.