GPT-2 Hacker Password Generator (Medium)

This model was fine-tuned from GPT-2 Medium.

About the fine-tuning process

Number of epochs: 1

A dataset of 50,000 passwords was used for fine-tuning, with sequences truncated to 128 tokens.

Total loss: 0.524064

Training time: 40 minutes (Google Colab free, T4 GPU)
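To put these figures in perspective, the training scale can be estimated with a bit of arithmetic. The batch size below is an assumption for illustration; only the dataset size and epoch count come from the card:

```python
# Rough training-scale arithmetic for the figures above.
num_examples = 50_000   # dataset size (from the card)
epochs = 1              # from the card
batch_size = 16         # assumed; not stated in the card

steps_per_epoch = -(-num_examples // batch_size)  # ceiling division
total_steps = steps_per_epoch * epochs
print(total_steps)  # -> 3125

# At 40 minutes of training, that would be roughly this many seconds per step:
seconds_per_step = 40 * 60 / total_steps  # ~0.77 s/step, if batch_size were 16
```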

Using the model

Use this code:

from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "CodeferSystem/GPT2-Hacker-password-generator-Medium"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "User: generate a hacker password\nAssistant:"

inputs = tokenizer(prompt, return_tensors="pt")

output = model.generate(
    **inputs,
    max_length=60,           # total length, prompt included
    do_sample=True,          # sample instead of greedy decoding
    temperature=0.9,         # raise for more varied output
    top_p=0.95,              # nucleus sampling
    no_repeat_ngram_size=2,
)

print(tokenizer.decode(output[0], skip_special_tokens=True))

Example output:

(1) User: generate a hacker password
Assistant: 7-Zs_?~?JNz2

(2) User: generate a hacker password
Assistant: Y>Z7fB&j9c*q<&

(3) User: generate a hacker password
Assistant: #Nc<w~2hfJ4<

(4) User: generate a hacker password
Assistant: Zg0qV%X-!z=j5j

(5) User: generate a hacker password
Assistant: t~5^>6hVhxQ$yY
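As the examples show, the decoded text contains the prompt as well as the generated password. A small helper (hypothetical, not part of the model card) can strip the prompt back out and keep just the password:

```python
PROMPT = "User: generate a hacker password\nAssistant:"

def extract_password(decoded: str, prompt: str = PROMPT) -> str:
    """Strip the prompt from a decoded generation and return the first line."""
    if decoded.startswith(prompt):
        decoded = decoded[len(prompt):]
    # Keep only the first line in case the model keeps generating.
    return decoded.strip().split("\n", 1)[0]

print(extract_password("User: generate a hacker password\nAssistant: 7-Zs_?~?JNz2"))
# -> 7-Zs_?~?JNz2
```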

Small model

A smaller variant is also available: GPT2-Hacker-password-generator Small.

Dataset

The dataset on which the model was trained will be published later.
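Until the dataset is published, a structurally similar training file (prompt/response pairs with random strong passwords) can be sketched with Python's standard library alone. This is purely an illustration of the expected format, not the actual dataset:

```python
import secrets
import string

# Character set resembling the example outputs above (an assumption).
ALPHABET = string.ascii_letters + string.digits + "~!@#$%^&*<>?_-=+"

def random_password(length: int = 14) -> str:
    """Cryptographically random password, similar in shape to the examples."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

def make_example() -> str:
    """One training example in the prompt/response format used by the model."""
    return f"User: generate a hacker password\nAssistant: {random_password()}"

samples = [make_example() for _ in range(5)]
print(samples[0])
```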

Model size: 0.4B parameters (safetensors, F32)