Dataset: hf-internal-testing/tokenizers-test-data
Maintained by the Hugging Face Internal Testing Organization
Modalities: Text
Formats: text
Size: < 1K
Libraries: Datasets, Croissant
Files and versions (branch: main)
tokenizers-test-data: 51.6 MB, 2 contributors, 3 commits
Contributors: ArthurZ (HF Staff), mcpotato (HF Staff)
Latest commit: feat: upload `xnli.txt` to assert validity of the whitespace split (#2), bf38880, about 16 hours ago
File | Size | Last commit | Last updated
.gitattributes | 2.61 kB | feat: upload `xnli.txt` to assert validity of the whitespace split (#2) | about 16 hours ago
albert-base-v1-tokenizer.json | 1.31 MB | Add all tokenizers test artifacts | 21 days ago
bert-base-uncased-vocab.txt | 232 kB | Add all tokenizers test artifacts | 21 days ago
bert-wiki.json | 439 kB | Add all tokenizers test artifacts | 21 days ago
big.txt | 6.49 MB | Add all tokenizers test artifacts | 21 days ago
dummy-unigram-special_tokens-train.txt | 1.11 kB | Add all tokenizers test artifacts | 21 days ago
gpt2-merges.txt | 456 kB | Add all tokenizers test artifacts | 21 days ago
gpt2-vocab.json | 1.04 MB | Add all tokenizers test artifacts | 21 days ago
llama-3-tokenizer.json | 17.2 MB | Add all tokenizers test artifacts | 21 days ago
openai-gpt-merges.txt | 458 kB | Add all tokenizers test artifacts | 21 days ago
openai-gpt-vocab.json | 816 kB | Add all tokenizers test artifacts | 21 days ago
roberta-base-merges.txt | 456 kB | Add all tokenizers test artifacts | 21 days ago
roberta-base-vocab.json | 899 kB | Add all tokenizers test artifacts | 21 days ago
roberta.json | 1.36 MB | Add all tokenizers test artifacts | 21 days ago
small.txt | 7.44 kB | Add all tokenizers test artifacts | 21 days ago
tokenizer-wiki.json | 668 kB | Add all tokenizers test artifacts | 21 days ago
tokenizer.json | 4.1 kB | Add all tokenizers test artifacts | 21 days ago
unigram.json | 259 kB | Add all tokenizers test artifacts | 21 days ago
unigram_wagahaiwa_nekodearu.txt | 1.12 MB | Add all tokenizers test artifacts | 21 days ago
xnli.txt | 18.4 MB | feat: upload `xnli.txt` to assert validity of the whitespace split (#2) | about 16 hours ago
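Any file in the listing above can be fetched individually with the `huggingface_hub` client. A minimal sketch, assuming `huggingface_hub` is installed and network access to the Hub is available; `small.txt` is just one example filename from the table:

```python
from huggingface_hub import hf_hub_download

# Download a single test artifact from this dataset repository.
# repo_type="dataset" is required because this is a dataset repo, not a model repo.
path = hf_hub_download(
    repo_id="hf-internal-testing/tokenizers-test-data",
    filename="small.txt",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded file
```

The file lands in the local Hub cache, so repeated calls reuse the cached copy instead of re-downloading.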