# merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged with the DARE TIES merge method, using mlabonne/AlphaMonarch-7B as the base model.
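For intuition, here is a minimal sketch of what a DARE TIES step does to a single weight tensor: each fine-tuned model's delta from the base is randomly dropped and rescaled by its density (DARE), then a per-parameter sign is elected and only the agreeing, weight-scaled contributions are summed back onto the base (TIES). This is a conceptual illustration only, not mergekit's actual implementation, and the array shapes and values are made up.

```python
# Conceptual sketch of a DARE TIES merge on one weight tensor.
# Illustration only -- not mergekit's implementation.
import numpy as np

rng = np.random.default_rng(0)

def dare_ties(base, finetuned, densities, weights):
    deltas = []
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base
        # DARE: randomly drop (1 - density) of the delta, rescale the rest.
        keep = rng.random(delta.shape) < density
        deltas.append(weight * np.where(keep, delta / density, 0.0))

    stacked = np.stack(deltas)
    # TIES: elect a sign per parameter, keep only agreeing contributions.
    elected = np.sign(stacked.sum(axis=0))
    merged_delta = np.where(np.sign(stacked) == elected, stacked, 0.0).sum(axis=0)
    return base + merged_delta

base = rng.normal(size=(4, 4))
tuned = [base + rng.normal(scale=0.1, size=(4, 4)) for _ in range(2)]
print(dare_ties(base, tuned, densities=[0.53, 0.53], weights=[0.3, 0.3]))
```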
### Models Merged

The following models were included in the merge:
* Test157t/Kunocchini-7b-128k-test
* KatyTheCutie/SlushySlerp-7B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: mlabonne/AlphaMonarch-7B
    # No parameters necessary for base model
  - model: Test157t/Kunocchini-7b-128k-test
    parameters:
      density: 0.53
      weight: 0.3
  - model: KatyTheCutie/SlushySlerp-7B
    parameters:
      density: 0.53
      weight: 0.3
merge_method: dare_ties
base_model: mlabonne/AlphaMonarch-7B
parameters:
  int8_mask: true
dtype: bfloat16
```
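To reproduce the merge, the configuration above can be saved to a YAML file and run through mergekit. The snippet below is a sketch using the Python entry points shown in the mergekit README (MergeConfiguration, run_merge, MergeOptions); if your installed version differs, the mergekit-yaml command-line tool does the same job. File paths here are placeholders.

```python
# Sketch: run the merge via mergekit's Python API (names as in the mergekit
# README; verify against your installed version). Paths are placeholders.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above.
with open("merge_config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Write the merged checkpoint to a local directory.
run_merge(
    merge_config,
    out_path="./merged-model",
    options=MergeOptions(cuda=False, copy_tokenizer=True, lazy_unpickle=True),
)
```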