Qwen / Qwen3.5-0.8B
Information
⚠️ Requires ExLlamaV3 v0.0.23 (or the v0.0.22 `dev` branch)
EXL3 quantizations of Qwen3.5-0.8B, produced with exllamav3.
This repo was generated automatically with ezexl3.
Repo Data
CLI Download
huggingface-cli download UnstableLlama/Qwen3.5-0.8B-exl3 --revision "4.00bpw" --local-dir ./