---
datasets:
- togethercomputer/RedPajama-Data-V2
language:
- de
pipeline_tag: text-generation
library_name: transformers
license: other
---
# LLäMmlein 7B
This is a 7B German LLaMA language model, trained from scratch on the German portion of RedPajama V2 using our adapted TinyLlama codebase. Find more details on our page and in our preprint!
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("LSX-UniWue/LLaMmlein_7B")
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_7B")
```
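Once the model and tokenizer are loaded, text can be generated with the standard `transformers` generation API. The snippet below is a minimal sketch; the German prompt and the generation parameters (`max_new_tokens`, greedy decoding) are illustrative choices, not recommendations from the model authors.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("LSX-UniWue/LLaMmlein_7B")
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_7B")

# Tokenize an example German prompt and generate a continuation.
inputs = tokenizer("Die Würzburger Residenz ist", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On GPU hardware you would typically also pass `torch_dtype` and `device_map` to `from_pretrained` and move the inputs to the model's device before calling `generate`.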