This is a rebuild/resurrection of Dark Forest 20B v2.0, made using 32-bit source files in place of FP16 wherever possible for superior quality. DavidAU released something very similar, but only in GGUF format, meaning I couldn't ablate it directly. His method of float32 upscaling inspired me to do the same so that DarkForest could be ablated.
The result is that quantizing these float32 safetensors directly to GGUF produces better prose, and the abliteration is gentle enough that KL divergence degrades only slightly in exchange for 100% refusal elimination (compared to 79% without abliteration), no jailbreaks required. The included Python script can also down-convert the weights to float16 for easier merging.
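The down-conversion itself amounts to casting every tensor to float16 and re-saving as safetensors. The sketch below is not the bundled script, just a minimal illustration of the idea, assuming the model loads via transformers and fits in RAM; the paths are placeholders:

```python
# Minimal sketch of a float32 -> float16 down-conversion (not the bundled
# script; paths are hypothetical placeholders).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

src = "path/to/darkforest-20b-fp32"   # directory holding the float32 safetensors
dst = "path/to/darkforest-20b-fp16"   # output directory for the float16 copy

# Load the full-precision weights, then cast every parameter to float16.
model = AutoModelForCausalLM.from_pretrained(src, torch_dtype=torch.float32)
model = model.half()

# Save as safetensors so the fp16 copy can be used directly for merging.
model.save_pretrained(dst, safe_serialization=True)
AutoTokenizer.from_pretrained(src).save_pretrained(dst)
```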
v2 was chosen over v3 because the latter used the breadcrumbs merge method, which seemed less cohesive than dare_ties. v2 is also reported to be better at RP/ERP.
I am releasing safetensors of the pre- and post-ablation checkpoints, along with their Compliance scores and the yamls used to make this.
Notes:
- This upscale of DarkForest requires ChatML tokenizer settings (see the prompt format after these notes).
- This is the ablated version. Go here for the unablated version.
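For reference, ChatML prompts follow the standard layout below; the bracketed text is a placeholder, not part of the template:

```
<|im_start|>system
{system prompt}<|im_end|>
<|im_start|>user
{user message}<|im_end|>
<|im_start|>assistant
```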