MetappuccinoLLModel: Per-category LoRA adapters for SRA metadata extraction

This repository contains LoRA adapters (one folder per category) trained for SRA metadata extraction and inference in the Metappuccino project. These adapters are not general-purpose dialogue models.

Important: Base model weights are not included. To use Metappuccino, also download the official base model, mistralai/Mistral-7B-Instruct-v0.3.

Version

v1.0.0

Quickstart

Download the adapters for use with Metappuccino:

from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="chumphati/MetappuccinoLLModel",
    local_dir="<OUT_DIR_URL>/MetappuccinoLLModel",  # path to the output directory
    resume_download=True,  # resume partially downloaded files if interrupted
    max_workers=4          # number of parallel download workers
)
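
Once the adapters and the base model are available locally, a per-category adapter can be attached to the base model with PEFT. The following is a minimal sketch, assuming transformers and peft are installed; <CATEGORY_FOLDER> is a placeholder for one of the per-category adapter folders in this repository, not an actual folder name.

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the official base model (not included in this repository)
base_model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")

# Attach one per-category LoRA adapter from the downloaded snapshot
model = PeftModel.from_pretrained(
    base_model,
    "<OUT_DIR_URL>/MetappuccinoLLModel/<CATEGORY_FOLDER>"  # placeholder adapter folder
)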

Hyperparameters

Hyperparameters for each adapter are documented in the adapter_config.json file inside its folder, as shown in the sketch below.
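
To compare adapters at a glance, the standard PEFT LoRA fields (e.g. r, lora_alpha, target_modules) can be read directly from each adapter_config.json. A minimal sketch, assuming the repository was downloaded to <OUT_DIR_URL> as in the Quickstart:

import json
from pathlib import Path

adapter_root = Path("<OUT_DIR_URL>/MetappuccinoLLModel")

# Print key LoRA hyperparameters for every per-category adapter folder
for config_path in sorted(adapter_root.glob("*/adapter_config.json")):
    with open(config_path) as f:
        cfg = json.load(f)
    print(config_path.parent.name, cfg.get("r"), cfg.get("lora_alpha"), cfg.get("target_modules"))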

How to cite

If you use this repository in your work, please cite:

Related tool: Metappuccino (https://github.com/chumphati/Metappuccino)
