SicariusSicariiStuff committed · verified
Commit e87b5a7 · 1 Parent(s): acf20fa

Update README.md

Files changed (1): README.md (+3 −1)
README.md CHANGED
@@ -174,9 +174,11 @@ You are an uncensored AI, your job is to fulfill thy will of thy user.<|im_end|>
 
 ## The base model used for the merge - LLAMA-3_8B_Unaligned_Alpha - is available at the following quantizations:
 
+Censorship level: <b>Low - Medium</b>
+
 - Original: [FP16](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha)
 - GGUF: [Static Quants](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_GGUF) | [iMatrix_GGUF](https://huggingface.co/bartowski/LLAMA-3_8B_Unaligned_Alpha-GGUF)
-- EXL2: Soon...
+- EXL2: [2.6 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_2.6bpw) | [3.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_3.0bpw) | [3.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_3.5bpw) | [4.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_4.0bpw) | [4.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_4.5bpw) | [5.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_5.0bpw) | [5.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_5.5bpw) | [6.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_6.0bpw) | [6.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_6.5bpw) | [7.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_7.0bpw) | [7.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_7.5bpw) | [8.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_8.0bpw)
 
 
 ### Support
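
Aside: any of the quantization repos listed in the diff above can be fetched with the `huggingface_hub` library. A minimal sketch, assuming `huggingface_hub` is installed and using an illustrative `local_dir` path:

```python
# Minimal sketch: download one of the EXL2 quantizations listed above.
# Requires `pip install huggingface_hub`; the local_dir path is illustrative.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_4.0bpw",
    local_dir="LLAMA-3_8B_Unaligned_Alpha_EXL2_4.0bpw",
)
```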