# Reason-ModernColBERT-ONNX

ONNX export of lightonai/Reason-ModernColBERT for fast CPU inference.

## Model Details

### Files

| File | Description |
|------|-------------|
| `model.onnx` | FP32 ONNX model |
| `tokenizer.json` | Tokenizer configuration |
| `config_sentence_transformers.json` | Model configuration |

## Usage with colbert-onnx (Rust)

```rust
use colbert_onnx::Colbert;

let mut model = Colbert::from_pretrained("path/to/model")?;
let embeddings = model.encode_documents(&["Hello world"])?;
```
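The per-token embeddings returned above are scored with ColBERT-style late interaction (MaxSim): for each query token embedding, take the maximum dot product over all document token embeddings, then sum over query tokens. A minimal sketch in plain Rust, assuming unit-normalized embeddings; `maxsim_score` and the toy vectors are illustrative, not part of the `colbert_onnx` API:

```rust
/// Late-interaction MaxSim: sum over query tokens of the maximum
/// dot product against any document token embedding.
/// (Hypothetical helper for illustration, not provided by colbert_onnx.)
fn maxsim_score(query: &[Vec<f32>], doc: &[Vec<f32>]) -> f32 {
    query
        .iter()
        .map(|q| {
            doc.iter()
                .map(|d| q.iter().zip(d).map(|(a, b)| a * b).sum::<f32>())
                .fold(f32::MIN, f32::max)
        })
        .sum()
}

fn main() {
    // Toy unit-norm token embeddings; real ones come from the encode_* calls.
    let query = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let doc = vec![vec![1.0, 0.0], vec![0.7, 0.7], vec![0.0, 1.0]];
    // Each query token has an exact match in the document, so the score is 2.
    println!("{}", maxsim_score(&query, &doc)); // prints "2"
}
```

Because MaxSim only needs dot products over precomputed token embeddings, documents can be encoded once and scored against many queries cheaply.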

## Export Tool

This model was exported using pylate-onnx-export:

```shell
pip install "pylate-onnx-export @ git+https://github.com/lightonai/next-plaid.git#subdirectory=onnx/python"
pylate-onnx-export lightonai/Reason-ModernColBERT --push-to-hub Novadata-Technologies/Reason-ModernColBERT-ONNX
```