---
language: en
datasets:
- squad_v2
license: cc-by-4.0
---

# roberta-base distilled into tinyroberta

## Overview
**Language model:** roberta-base
**Language:** English
**Training data:** The Pile
**Infrastructure:** 4x Tesla V100
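
## Usage

This checkpoint can be loaded like any other RoBERTa encoder. A minimal sketch with Hugging Face Transformers, assuming the checkpoint id `deepset/tinyroberta-6l-768d` referenced in the Distillation section below:

```python
# Load tinyroberta as a plain encoder for feature extraction.
from transformers import AutoModel, AutoTokenizer

model_name = "deepset/tinyroberta-6l-768d"  # checkpoint id from this card
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("Haystack is an open-source AI framework.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, seq_len, 768)
```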
## Hyperparameters

```
batch_size = 96
n_epochs = 4
max_seq_len = 384
learning_rate = 1e-4
lr_schedule = LinearWarmup
warmup_proportion = 0.2
teacher = "deepset/roberta-base"
```
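
With `warmup_proportion = 0.2`, the `LinearWarmup` schedule increases the learning rate linearly over the first 20% of training steps and then decays it linearly. A minimal sketch of reproducing this schedule with Transformers (the stand-in model and step count are illustrative assumptions, not values from this card):

```python
# Illustrative LinearWarmup schedule: 20% warmup, then linear decay.
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(768, 768)  # stand-in for the student model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

total_steps = 10_000                   # assumed; n_epochs * steps_per_epoch in practice
warmup_steps = int(0.2 * total_steps)  # warmup_proportion = 0.2

scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=warmup_steps, num_training_steps=total_steps
)

for _ in range(total_steps):
    # ...forward and backward pass of the training loss would go here...
    optimizer.step()
    scheduler.step()
    optimizer.zero_grad()
```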
## Distillation
This model was distilled using the TinyBERT approach described in [this paper](https://arxiv.org/pdf/1909.10351.pdf) and implemented in [haystack](https://github.com/deepset-ai/haystack).
We performed intermediate-layer distillation with roberta-base as the teacher, which resulted in [deepset/tinyroberta-6l-768d](https://huggingface.co/deepset/tinyroberta-6l-768d).
This model has not been distilled for any specific task. If you are interested in using distillation to improve its performance on a downstream task, you can take advantage of haystack's [distillation functionality](https://haystack.deepset.ai/guides/model-distillation). You can also check out [deepset/tinyroberta-squad2](https://huggingface.co/deepset/tinyroberta-squad2) for a model that has already been distilled for an extractive QA downstream task.
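
For intuition, TinyBERT-style intermediate-layer distillation trains the student so that its hidden states match the teacher's under a fixed layer mapping. A simplified sketch of the core loss, assuming the six student layers are mapped to every second teacher layer (an illustration of the technique, not the exact haystack implementation):

```python
# Simplified intermediate-layer distillation loss in the TinyBERT style.
# The layer mapping (student i -> teacher 2i) is an illustrative assumption.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("deepset/roberta-base")
teacher = AutoModel.from_pretrained("deepset/roberta-base", output_hidden_states=True)
student = AutoModel.from_pretrained("deepset/tinyroberta-6l-768d", output_hidden_states=True)

def intermediate_layer_loss(texts):
    inputs = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
    with torch.no_grad():
        t_states = teacher(**inputs).hidden_states  # embeddings + 12 layers
    s_states = student(**inputs).hidden_states      # embeddings + 6 layers

    # Both models share hidden size 768, so no projection matrix is needed here.
    loss = torch.zeros(())
    for i in range(1, len(s_states)):               # student layers 1..6
        loss = loss + F.mse_loss(s_states[i], t_states[2 * i])
    return loss
```

In practice, the haystack distillation functionality linked above wraps this kind of training loop for you, including task-specific distillation.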
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/>
</div>
</div>

[deepset](http://deepset.ai/) is the company behind the production-ready open-source AI framework [Haystack](https://haystack.deepset.ai/).

Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT](https://deepset.ai/german-bert), [GermanQuAD and GermanDPR](https://deepset.ai/germanquad), [German embedding model](https://huggingface.co/mixedbread-ai/deepset-mxbai-embed-de-large-v1)
- [deepset Cloud](https://www.deepset.ai/deepset-cloud-product), [deepset Studio](https://www.deepset.ai/deepset-studio)
## Get in touch and join the Haystack community

<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>.

We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community">Discord community open to everyone!</a></strong></p>

[Twitter](https://twitter.com/Haystack_AI) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://haystack.deepset.ai/) | [YouTube](https://www.youtube.com/@deepset_ai)

By the way: [we're hiring!](http://www.deepset.ai/jobs)