Note: this repository is gated. It is publicly accessible, but you must agree to share your contact information and accept the conditions on Hugging Face to access its files and content.

PulseLM: A Foundation Dataset and Benchmark for PPG-Text Learning


Quick Start

# pip install "transformers>=4.46.0" "accelerate>=1.0.1" "peft>=0.13.2" safetensors
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("Manhph2211/PulseLM", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "Manhph2211/PulseLM",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    device_map="auto"
)
Model size: 8B params · Tensor type: BF16 · Format: Safetensors

Model tree for Manhph2211/PulseLM

Base model: Qwen/Qwen2.5-7B, finetuned into this model.

Dataset used to train Manhph2211/PulseLM