# Judicia AI Law – Legal Conversational LLM
`judiciaailaw` is a fine-tuned LLaMA 3.2 1B Instruct model designed for:
- Legal question answering
- Document summarization
- Legal chat/dialogue
- Case-law reasoning
- Drafting simple legal text
It is fine-tuned on the TechMaestro369 Indian Legal Texts dataset and adapted for conversational use.
## Model Architecture
- Base Model: unsloth/Llama-3.2-1B-Instruct
- Type: Causal Language Model (decoder-only)
- Task: Text Generation
- Framework: HuggingFace Transformers
- Format: GGUF + HF Transformers
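Since the base model is a Llama 3.2 Instruct variant, prompts follow the Llama 3 chat template. As an illustration only (in practice `tokenizer.apply_chat_template` builds this string for you), a minimal sketch of the format:

```python
# Sketch: assemble a Llama 3-style chat prompt by hand.
# Shown purely to illustrate the structure the instruct model expects;
# use tokenizer.apply_chat_template in real code.
def build_llama3_prompt(system: str, user: str) -> str:
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama3_prompt(
    "You are a helpful assistant for Indian law.",
    "What is anticipatory bail?",
)
print(prompt)
```

The trailing assistant header leaves the prompt open for the model to generate its reply.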
## Example Inference (Python)
```python
from huggingface_hub import InferenceClient

# Connect to the hosted model via the Hugging Face Inference API.
client = InferenceClient(
    "hridika/judiciaailaw",
    token="YOUR_HF_TOKEN",
)

# Single-turn text generation.
response = client.text_generation(
    "What is anticipatory bail?",
    max_new_tokens=200,
    temperature=0.7,
)
print(response)
```
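For multi-turn legal dialogue, the same client exposes an OpenAI-style chat interface. A hedged sketch below; the network call is commented out because it needs a valid token, and availability of `chat_completion` depends on how the repo is served:

```python
# Messages in the format accepted by InferenceClient.chat_completion.
messages = [
    {"role": "system", "content": "You are a concise assistant for Indian law."},
    {"role": "user", "content": "What is anticipatory bail?"},
]

# Uncomment to query the hosted model (requires a valid HF token):
# from huggingface_hub import InferenceClient
# client = InferenceClient("hridika/judiciaailaw", token="YOUR_HF_TOKEN")
# response = client.chat_completion(messages=messages, max_tokens=200)
# print(response.choices[0].message.content)
print(len(messages))
```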
## Model Tree
- Base model: meta-llama/Llama-3.2-1B-Instruct
- Fine-tuned from: unsloth/Llama-3.2-1B-Instruct