Continued Pretraining
ChristBERT
A domain-adaptive German medical RoBERTa model, exploring continued pretraining and from-scratch training with specialized vocabularies.
GeistBERT
A continued-pretraining extension of GottBERT, developed during a period of transition and finalized as a preprint before being presented at GlobalNLP@RANLP 2025.