XLM-RoBERTa models with continued pretraining on the MultiLegalPile
Joel Niklaus (joelniklaus)
AI & ML interests: Pretraining, Instruction Tuning, Domain Adaptation, Benchmarks, Legal Datasets
Models (40 total; 10 shown)

joelniklaus/legal-swiss-longformer-base • Fill-Mask • 19 downloads • 2 likes
joelniklaus/legal-swiss-roberta-base • Fill-Mask • 94 downloads
joelniklaus/legal-swiss-roberta-large • Fill-Mask • 224 downloads • 1 like
joelniklaus/legal-croatian-roberta-base • Fill-Mask • 32 downloads • 1 like
joelniklaus/legal-english-longformer-base • 1 download
joelniklaus/legal-english-roberta-large • Fill-Mask • 25 downloads
joelniklaus/legal-english-roberta-base • Fill-Mask • 26 downloads
joelniklaus/legal-xlm-roberta-base • Fill-Mask • 126 downloads • 1 like
joelniklaus/legal-xlm-roberta-large • Fill-Mask • 23 downloads • 4 likes
joelniklaus/legal-portuguese-roberta-base • Fill-Mask • 99 downloads
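Most of the checkpoints above are tagged Fill-Mask on the Hugging Face Hub, so they can be loaded with the standard `transformers` pipeline. A minimal sketch, assuming `transformers` is installed (the model IDs come from the listing above; the pipeline call downloads weights on first use, so it is kept behind a helper here):

```python
# Sketch: loading one of the legal fill-mask models listed above.
# Model IDs are taken verbatim from the listing.
LEGAL_FILL_MASK_MODELS = [
    "joelniklaus/legal-xlm-roberta-base",
    "joelniklaus/legal-xlm-roberta-large",
    "joelniklaus/legal-swiss-roberta-base",
    "joelniklaus/legal-english-roberta-base",
]

def load_fill_mask(model_id: str):
    """Build a fill-mask pipeline; downloads the checkpoint on first call."""
    from transformers import pipeline  # imported lazily to keep the sketch light
    return pipeline("fill-mask", model=model_id)

# Usage (XLM-R tokenizers use <mask> as the mask token):
# fill = load_fill_mask(LEGAL_FILL_MASK_MODELS[0])
# print(fill("The contract was <mask> by both parties."))
```

The lazy import keeps the module importable even where `transformers` is absent; only calling the helper triggers the download.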
Datasets (26 total; 11 shown)

joelniklaus/Multi_Legal_Pile • 6.32k downloads • 40 likes
joelniklaus/Multi_Legal_Pile_Commercial • 50 downloads • 6 likes
joelniklaus/legalnero • 4 downloads • 1 like
joelniklaus/greek_legal_ner • 10 downloads
joelniklaus/legal-mc4 • 8 downloads
joelniklaus/MultiLegalPile_Chunks_4000
joelniklaus/eurlex_resources • 2 downloads • 6 likes
joelniklaus/lextreme • 69 downloads • 17 likes
joelniklaus/MultiLegalPileWikipediaFiltered • 8 downloads • 3 likes
joelniklaus/EU_Wikipedias • 1 download
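The corpora above are hosted as Hugging Face datasets, and the MultiLegalPile in particular is large, so streaming is the usual way to sample it without downloading everything. A hedged sketch using the `datasets` library (the configuration name is a placeholder; the dataset card defines the real language/source configurations):

```python
# Sketch: streaming the first few records of a Multi_Legal_Pile configuration.
# The config name passed in is a placeholder; check the dataset card for the
# actual configuration names the repository defines.
from itertools import islice

def stream_legal_pile(config: str, n: int = 5):
    """Return the first `n` records of a Multi_Legal_Pile configuration."""
    from datasets import load_dataset  # imported lazily; requires `datasets`
    ds = load_dataset(
        "joelniklaus/Multi_Legal_Pile",
        config,
        split="train",
        streaming=True,  # iterate without materializing the full corpus
    )
    return list(islice(iter(ds), n))

# Usage:
# for record in stream_legal_pile("<config-name>"):
#     print(record)
```

Streaming mode yields records lazily over the network, which is the practical choice for a multi-gigabyte pretraining corpus.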