---
base_model: []
library_name: transformers
tags:
- mergekit
- merge

---


<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/Tn9MBg6.png" alt="MidnightMiqu" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>

# Midnight-Miqu-103B-v1.5-exl2-2.5bpw-rpcal

This is a 2.5bpw EXL2 quant of [FluffyKaeloky/Midnight-Miqu-103B-v1.5](https://huggingface.co/FluffyKaeloky/Midnight-Miqu-103B-v1.5).
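
If you have not used EXL2 quants before, the sketch below shows one way to load and prompt this model with the exllamav2 Python API. The class and method names follow the library's example scripts and may differ between exllamav2 versions; the model path is a placeholder for wherever you download the quant.

```python
# Minimal inference sketch with exllamav2 (illustrative; API names follow the
# library's example scripts and may vary between versions).
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/models/Midnight-Miqu-103B-v1.5-exl2-2.5bpw-rpcal"  # placeholder path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # lazy cache so autosplit can place layers across GPUs
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple("Describe a rainy night in a coastal town.", settings, 200))
```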

The PIPPA file used for calibration is optimised for roleplay. The measurement file is included in this repository's files if you want to make your own quants.
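
If you want to produce a different bitrate yourself, the bundled measurement file lets you skip the measurement pass of exllamav2's `convert.py`. The sketch below wraps a typical invocation in Python; the flag names follow exllamav2's conversion documentation, while the paths, target bitrate, and calibration file name are placeholders you would adjust.

```python
# Re-quantization sketch using the bundled measurement.json (illustrative;
# paths, bitrate, and the calibration parquet name are placeholders).
import subprocess

subprocess.run(
    [
        "python", "convert.py",                    # script from the exllamav2 repository
        "-i", "/models/Midnight-Miqu-103B-v1.5",   # fp16 source model
        "-o", "/tmp/exl2-work",                    # scratch directory for intermediate files
        "-cf", "/models/Midnight-Miqu-103B-v1.5-exl2-3.0bpw",  # finished quant output
        "-b", "3.0",                               # target bits per weight
        "-m", "measurement.json",                  # measurement file from this repo
        "-c", "pippa.parquet",                     # roleplay-oriented calibration data (placeholder)
    ],
    check=True,
)
```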

Details about the model and the merge can be found at the fp16 model link above.