Standard Soft Prompt (SPT) for XLM-R (large) — SIB-200
This model is released as part of the paper:
Cross-Prompt Encoder for Low-Performing Languages
Findings of IJCNLP–AACL 2025; preprint at arXiv:2508.10352.
The paper studies cross-lingual transfer learning for low-performing languages using parameter-efficient, prompt-based methods on the SIB-200 benchmark.
This repository provides the trained Standard Soft Prompt (SPT) adapter used in the study. It is a parameter-efficient soft-prompt model designed to be loaded on top of a frozen XLM-R (large) backbone, and contains the learned:
- Soft Prompt
- Classification Head
Model Details
- Adaptation: Parameter-Efficient Fine-Tuning (PEFT), Standard Soft Prompt (SPT)
- Backbone: FacebookAI/xlm-roberta-large
- Task: Multilingual Topic Classification
- Benchmark: Davlan/sib200
- Source Language Group: Joshi5 (7 Highest Resource Languages)
Seeds
This repository includes 10 models trained with different random seeds.
- The main branch corresponds to seed-01.
- Models for other seeds are available as branches: seed-02, seed-03, …, seed-10.
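If a specific seed is needed, the checkpoint can be fetched by branch name. The snippet below is a minimal sketch using the huggingface_hub library; it assumes the repository id of this model card and uses the standard `revision` argument to select a branch.

```python
# Minimal sketch: download the seed-02 checkpoint by selecting its branch.
# seed-01 is on the default `main` branch, so no `revision` is needed for it.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="mikaberidze/xlmr-large-sib200-peft-spt-joshi5",
    revision="seed-02",  # any of seed-02 … seed-10
)
print(local_dir)
```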
Usage
This model is part of the experimental framework introduced in the paper and is intended to be loaded and used via its canonical codebase.
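For a quick illustration only, the sketch below shows how such an adapter could be attached to a frozen XLM-R (large) backbone with the Hugging Face PEFT library. It assumes the adapter is stored in PEFT's prompt-tuning format and that the classification head covers SIB-200's 7 topic labels; these assumptions may not match the canonical codebase, which remains the recommended path.

```python
# Hedged sketch, not the paper's canonical pipeline: assumes a PEFT-format
# prompt-tuning adapter and a 7-way SIB-200 topic classification head.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import PeftModel

base_id = "FacebookAI/xlm-roberta-large"
adapter_id = "mikaberidze/xlmr-large-sib200-peft-spt-joshi5"  # this repository

tokenizer = AutoTokenizer.from_pretrained(base_id)
backbone = AutoModelForSequenceClassification.from_pretrained(base_id, num_labels=7)

# Attach the learned soft prompt and classification head; the backbone stays frozen.
model = PeftModel.from_pretrained(backbone, adapter_id)
model.eval()

inputs = tokenizer("The election results were announced on Monday.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())
```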
Related Resources
- Paper Preprint: Cross-Prompt Encoder for Low-Performing Languages
- Code Repository: bmikaberidze/XPE
- Benchmark: Davlan/sib200
- Preprocessed Dataset: mikaberidze/sib200-xlmr-tokenized
- Related Models:
Citation
If you use this model, please cite:
@misc{mikaberidze2025crosspromptencoderlowperforminglanguages,
title = {Cross-Prompt Encoder for Low-Performing Languages},
author = {Beso Mikaberidze and Teimuraz Saghinadze and Simon Ostermann and Philipp Müller},
year = {2025},
eprint = {2508.10352},
archivePrefix = {arXiv},
primaryClass = {cs.CL},
url = {https://arxiv.org/abs/2508.10352},
}
Contact
Beso Mikaberidze — beso.mikaberidze@gmail.com
Philipp Müller — mueller@is.mpg.de