Standard Soft Prompt (SPT) for XLM-R (large) — SIB-200

This model is released as part of the paper:
Cross-Prompt Encoder for Low-Performing Languages
Findings of IJCNLP–AACL 2025; preprint at arXiv:2508.10352.

The paper studies cross-lingual transfer learning for low-performing languages using parameter-efficient, prompt-based methods on the SIB-200 benchmark.

This repository provides the trained Standard Soft Prompt (SPT) adapter used in the study. It is a parameter-efficient soft-prompt module designed to be loaded on top of a frozen XLM-R (large) backbone and contains the learned:

  • Soft Prompt

  • Classification Head


Model Details

  • Adaptation: Parameter-Efficient Fine-Tuning (PEFT), Standard Soft Prompt (SPT)
  • Backbone: FacebookAI/xlm-roberta-large
  • Task: Multilingual Topic Classification
  • Benchmark: Davlan/sib200
  • Source Language Group: Joshi5 (the 7 highest-resource languages)

Seeds

This repository includes 10 models trained with different random seeds.
The main branch corresponds to seed-01.
Models for the other seeds are available as branches: seed-02, seed-03, …, seed-10.


Usage

This model is part of the experimental framework introduced in the paper and is intended to be loaded and used via its canonical codebase.
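For intuition, the standard soft-prompt mechanism can be sketched independently of the canonical codebase: a small matrix of trainable "virtual token" embeddings is prepended to the input embeddings, the frozen backbone encodes the extended sequence, and a classification head reads out a pooled representation. The sketch below is illustrative only; the prompt length, pooling, and the identity stand-in for the frozen XLM-R encoder are assumptions, not details of the released adapter.

```python
import numpy as np

rng = np.random.default_rng(0)

hidden_size = 1024    # hidden size of XLM-R (large)
prompt_length = 16    # number of virtual tokens (illustrative choice)
num_labels = 7        # SIB-200 has 7 topic categories

# Trainable parameters of the SPT adapter: the soft prompt and the head.
soft_prompt = rng.normal(size=(prompt_length, hidden_size))
head_w = rng.normal(size=(hidden_size, num_labels)) * 0.01
head_b = np.zeros(num_labels)

def forward(token_embeddings: np.ndarray) -> np.ndarray:
    """Prepend the soft prompt, run the (stubbed) frozen encoder,
    and classify from a mean-pooled representation."""
    # (seq_len, hidden) -> (prompt_length + seq_len, hidden)
    extended = np.concatenate([soft_prompt, token_embeddings], axis=0)
    # Stand-in for the frozen XLM-R encoder (identity here): only the
    # prompt and head would receive gradients during training.
    encoded = extended
    pooled = encoded.mean(axis=0)
    return pooled @ head_w + head_b

logits = forward(rng.normal(size=(12, hidden_size)))
print(logits.shape)  # (7,)
```

During training, only `soft_prompt`, `head_w`, and `head_b` are updated; the backbone's weights stay fixed, which is what makes the adapter parameter-efficient.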


Citation

If you use this model, please cite:

@misc{mikaberidze2025crosspromptencoderlowperforminglanguages,
  title         = {Cross-Prompt Encoder for Low-Performing Languages},
  author        = {Beso Mikaberidze and Teimuraz Saghinadze and Simon Ostermann and Philipp Muller},
  year          = {2025},
  eprint        = {2508.10352},
  archivePrefix = {arXiv},
  primaryClass  = {cs.CL},
  url           = {https://arxiv.org/abs/2508.10352},
}

Contact

Beso Mikaberidze: beso.mikaberidze@gmail.com
Philipp Muller: mueller@is.mpg.de
