πŸ§‘β€βš–οΈ Judicia AI Law β€” Legal Conversational LLM

judiciaailaw is a fine-tuned Llama 3.2 1B Instruct model designed for:

  • Legal question answering
  • Document summarization
  • Legal chat/dialogue
  • Case-law reasoning
  • Drafting simple legal text

It is fine-tuned on the TechMaestro369 Indian Legal Texts dataset and adapted for conversational use.


🚀 Model Architecture

  • Base Model: unsloth/Llama-3.2-1B-Instruct
  • Type: Causal Language Model (Decoder-only)
  • Task: Text Generation
  • Framework: Hugging Face Transformers
  • Format: GGUF + HF Transformers
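Because the base checkpoint is an instruct-tuned Llama 3.2 model, conversational prompts are expected to follow the Llama 3 chat template. Below is a minimal sketch of building such a prompt by hand, assuming the standard Llama 3 instruct format; the helper name `build_llama3_prompt` is illustrative, and in practice you should prefer the tokenizer's own `apply_chat_template`:

```python
# Sketch: hand-build a Llama 3 instruct chat prompt.
# Assumes the model uses the standard Llama 3 special tokens
# (<|begin_of_text|>, <|start_header_id|>, <|eot_id|>); verify against
# the repository's tokenizer config before relying on this exact format.
def build_llama3_prompt(system: str, user: str) -> str:
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama3_prompt(
    "You are a helpful assistant for Indian legal questions.",
    "What is anticipatory bail?",
)
print(prompt)
```

The trailing assistant header leaves the prompt open for the model to generate its reply; when using the Transformers tokenizer directly, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` produces the equivalent string.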

📌 Example Inference (Python)

from huggingface_hub import InferenceClient

# Point the client at the hosted model; replace YOUR_HF_TOKEN with a
# valid Hugging Face access token.
client = InferenceClient(
    "hridika/judiciaailaw",
    token="YOUR_HF_TOKEN"
)

# Generate up to 200 new tokens; temperature=0.7 adds mild sampling variety.
response = client.text_generation(
    "What is anticipatory bail?",
    max_new_tokens=200,
    temperature=0.7
)

print(response)