Low-Resource NLP

HalleluBERT

A focused Hebrew RoBERTa project delivering the first large Hebrew model with state-of-the-art benchmark performance, pre-trained on a TPU v4-128 pod and evaluated on private hardware.