gemma-3-1b-it-toxicity

#808
by mangojesus - opened

It's queued! :D

You can check for progress at http://hf.tst.eu/status.html or regularly check the model
summary page at https://hf.tst.eu/model#gemma-3-1b-it-toxicity-GGUF for quants to appear.

Seems like it requires a tokenizer.model and not a tokenizer.json. This is actually an issue for many models. I once wrote a script to generate tokenizer.model based on tokenizer.json, but I wasn't satisfied with the result, so we never ended up using it. I will look into it again if I find time to do so.
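For anyone hitting the same failure locally: a quick pre-flight check before running the converter can tell you whether a repo ships the SentencePiece `tokenizer.model` that the Gemma path in convert_hf_to_gguf.py requires, or only the `tokenizer.json` that tokenizers-based repos provide. This is just a hypothetical helper sketch (the function name and demo directory are made up, not part of llama.cpp):

```python
import json
from pathlib import Path

def check_tokenizer_files(model_dir: str) -> dict:
    """Report which tokenizer files a local HF model directory provides.

    llama.cpp's SentencePiece vocab path needs tokenizer.model; repos that
    ship only tokenizer.json fail conversion with FileNotFoundError.
    """
    d = Path(model_dir)
    return {
        "tokenizer.model": (d / "tokenizer.model").is_file(),
        "tokenizer.json": (d / "tokenizer.json").is_file(),
    }

# Demo: a directory with only tokenizer.json, like the failing model above.
demo = Path("demo-model")
demo.mkdir(exist_ok=True)
(demo / "tokenizer.json").write_text(json.dumps({"model": {"type": "BPE"}}))

print(check_tokenizer_files("demo-model"))
# {'tokenizer.model': False, 'tokenizer.json': True}
```

If `tokenizer.model` is missing, one workaround is copying it from the base model the fine-tune was derived from (here, presumably google/gemma-3-1b-it), since fine-tunes rarely change the tokenizer.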

CONVERT hf
INFO:hf-to-gguf:Loading model: gemma-3-1b-it-toxicity
INFO:gguf.gguf_writer:gguf: This GGUF file is for Little Endian only
INFO:hf-to-gguf:Exporting model...
INFO:hf-to-gguf:gguf: loading model part 'model.safetensors'
Traceback (most recent call last):
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 5386, in <module>
    main()
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 5380, in main
    model_instance.write()
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 3401, in write
    super().write()
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 440, in write
    self.prepare_tensors()
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 299, in prepare_tensors
    for new_name, data_torch in (self.modify_tensors(data_torch, name, bid)):
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 3449, in modify_tensors
    vocab = self._create_vocab_sentencepiece()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 821, in _create_vocab_sentencepiece
    raise FileNotFoundError(f"File not found: {tokenizer_path}")
FileNotFoundError: File not found: gemma-3-1b-it-toxicity/tokenizer.model
yes: standard output: Broken pipe
job finished, status 1
job-done<0 gemma-3-1b-it-toxicity noquant 1>

https://huggingface.co/enochlev/gemma-3-1b-it-toxicity

Nico, your commitment to the craft is admirable and appreciated. Always love catching sparks of insight from your explanations when things crash out. Appreciate you taking the time to explain things more in depth too!