---
language:
  - en
license: gemma
library_name: transformers
tags:
  - tool-calling
  - mcp
  - browser-automation
  - lora
  - ccmcp
  - mlx
base_model: google/functiongemma-270m-it
datasets:
  - custom
pipeline_tag: text-generation
---

# functiongemma-270m-ccmcp-v1

FunctionGemma 270M fine-tuned for Claude Chrome MCP tool calling.

**Attribution:** Gemma is provided under and subject to the Gemma Terms of Use found at [ai.google.dev/gemma/terms](https://ai.google.dev/gemma/terms).

## Model Description

Fine-tuned for MCP (Model Context Protocol) tool calling with the Claude Chrome extension. The model generates tool calls for browser automation tasks.

## Training Details

- **Base Model:** google/functiongemma-270m-it
- **Method:** LoRA fine-tuning on Apple Silicon (MLX)
- **Dataset:** 1,782 MCP browser automation examples
- **Validation Loss:** 0.027
- **Iterations:** 500
- **Naming Convention:** `{base}-{size}-ccmcp-{version}`
  - `ccmcp` = Claude Chrome MCP
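The naming convention above can be expressed as a one-line helper; this is an illustrative sketch (the `ccmcp_model_name` function is hypothetical, not part of this repo):

```python
def ccmcp_model_name(base: str, size: str, version: str) -> str:
    """Build a checkpoint name following the {base}-{size}-ccmcp-{version} convention."""
    return f"{base}-{size}-ccmcp-{version}"

# The name of this model under the convention:
print(ccmcp_model_name("functiongemma", "270m", "v1"))
# functiongemma-270m-ccmcp-v1
```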

## Files

- `adapters.safetensors` - LoRA adapter weights
- `adapter_config.json` - LoRA configuration
- `functiongemma-270m-ccmcp-v1-f16.gguf` - GGUF F16 format for llama.cpp/Ollama
- `checkpoints/` - Training checkpoints

## Usage

### With MLX (Apple Silicon)

```python
from mlx_lm import load, generate
from mlx_lm.sample_utils import make_sampler

model, tokenizer = load(
    "mlx-community/functiongemma-270m-it-4bit",
    adapter_path="pierretokns/functiongemma-270m-ccmcp-v1"
)

messages = [
    {"role": "system", "content": "You are a browser automation assistant with MCP tools."},
    {"role": "user", "content": "Go to google.com"}
]

prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
sampler = make_sampler(temp=0.1)

response = generate(model, tokenizer, prompt=prompt, max_tokens=150, sampler=sampler)
print(response)
```
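The generated text should contain a tool call for the browser task. The exact syntax depends on FunctionGemma's chat template; as a hedged illustration, a JSON-style call could be extracted from the response like this (the sample `response` string and the extraction pattern are assumptions, not the model's guaranteed output format):

```python
import json
import re

# Hypothetical model output; the real tool-call syntax may differ.
response = 'call: {"name": "navigate", "arguments": {"url": "https://google.com"}}'

# Grab the JSON object embedded in the response and parse it.
match = re.search(r"\{.*\}", response)
if match:
    tool_call = json.loads(match.group(0))
    print(tool_call["name"], tool_call["arguments"])
```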

### With Ollama

```bash
# Download the GGUF from this repo, then create a Modelfile:
cat > Modelfile << 'EOF'
FROM ./functiongemma-270m-ccmcp-v1-f16.gguf
PARAMETER num_ctx 8192
PARAMETER temperature 0.1
SYSTEM "You are a browser automation assistant with MCP tools."
EOF

# Create and run
ollama create functiongemma-270m-ccmcp-v1 -f Modelfile
ollama run functiongemma-270m-ccmcp-v1 "Go to google.com"
```

### With Claude Code + Ollama

```bash
ANTHROPIC_BASE_URL=http://localhost:11434 \
ANTHROPIC_AUTH_TOKEN=ollama \
ANTHROPIC_API_KEY=ollama \
claude --model functiongemma-270m-ccmcp-v1
```

## MCP Tools

The model was trained on 16 MCP browser automation tools: `navigate`, `read_page`, `find`, `computer`, `form_input`, `get_page_text`, `screenshot`, `javascript_tool`, `tabs_context_mcp`, `tabs_create_mcp`, `gif_creator`, `upload_image`, `read_console_messages`, `read_network_requests`, `shortcuts_list`, `shortcuts_execute`
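MCP servers describe each tool to the model with a name, a description, and a JSON Schema for its input. As a sketch of what one of the tools above might look like in that shape (the description and parameter names here are illustrative assumptions, not the extension's actual definitions):

```python
# Hypothetical MCP-style definition for the `navigate` tool.
navigate_tool = {
    "name": "navigate",
    "description": "Navigate the browser to a URL.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "url": {"type": "string", "description": "Destination URL"},
        },
        "required": ["url"],
    },
}

# A matching call the model would be expected to produce:
example_call = {"name": "navigate", "arguments": {"url": "https://google.com"}}
assert example_call["name"] == navigate_tool["name"]
```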

## License

This model is subject to the [Gemma Terms of Use](https://ai.google.dev/gemma/terms).

**Important:** By using this model, you agree to the Gemma Terms of Use. If you redistribute this model or its derivatives, you must include these terms.