Barcenas E2B
Based on Google's Gemma 4 E2B-it, and trained on the claude-opus-4.6-10000x dataset created by Roman Yemelyanov.
This is my first fine-tuning of a Mixture-of-Experts (MoE) LLM.
The goal is a small LLM capable of running on a smartphone, with its capabilities boosted by fine-tuning on ultra-high-quality data generated by Claude Opus 4.6.
Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽