Upload folder using huggingface_hub (#2)
- 399ab7d1c9166f89339367c9f00638f6a20bb898d97d50f812e34d60eb40d01c (ee9bbf42eba47450bbfe59529c3e9085c1192e78)
- f7279d8a2d6f5979b4a65db7e94f6dbefddeee1cb1ea9b5c8aa80385febd44ba (dba597abc9c0cf5b1e3fbacc09276ac85127d9ea)
- 45e17865eb66ed52349da89b83e0024cb704c9cf3e53404a385eb4ca5cf97054 (8b0c81a58cae464956bd36f94b884286a1026215)
- 142a6eeb96ee7853880a7ce682d2cbc3155b86e6f44ac49c11a6f7b3320186b8 (a0c8f82e5accd4623a4548c88fbc46a0f5a58d6d)
- f0947e5d47c20c656362368345012b12ea7552a794733277d9c89766536e1353 (20e2a7af7d4a26dce24bf72d4af223165ef92d3e)
- 27c3b559de15564779e18ee13bc57aa35b285e274135f16868d73d27b03af4af (bff0c60a927848569635f9649b2af70582a7a2c6)
- 54fdfc9f329d2b8173d8d671098ab3c660a383fc587be1597c7b0d66a7fb8bc3 (a9b197a313a951bb83915546d3ce256a71bb4694)
- fa7eb666b38d803283a5ff359acd8a95cacf8226384ef7046083d9f859860379 (e7534a1eb3fa1bc286d34542c9672d13f4742369)
- 5e4fa0be0f8fdcd04b3cfa57c7f65e8aa798e181a77522e307a87576180edb4a (e6937533642d9bc71230322b3cfbca7fca3ba553)
- 18937d8555f0f3a370b2d46a6cc8cb66148a01fcb42834106420ff434f085d74 (75c322b4e154f89fb58f62f376401f24f5762e81)
- 32357b8b404cf0adfae5f0eee4f905f9d07aefe55bccc3ef614aacfbd77af368 (bc1e0ccc4eca92bbed01300908a41bbd9cbca85c)
- 525ef712d5aab6ca41d6be82f3f9c14a0687da30e2abb7477ee89f3a00d65f72 (970f47c8a0fa38bcf1bd03e2fec65c11c07afccf)
- fbf6fb45962d217cb6014ea125a7cfdfe6026459d37520809425a90e3f45b159 (f1a26887a6a707235049aaecfd4c66f2cbfa0566)
- 3f11ec5216f81804ebeae466dbe10f809fbaefdd6d536ee43f86d71eba9d3767 (ced93bb490b11b54ded3108c7987f230ecaca81a)
- .gitattributes +13 -0
- README.md +46 -3
- Yi-Coder-9B-Chat-GGUF_imatrix.dat +3 -0
- Yi-Coder-9B-Chat.IQ1_M.gguf +3 -0
- Yi-Coder-9B-Chat.IQ1_S.gguf +3 -0
- Yi-Coder-9B-Chat.IQ2_XS.gguf +3 -0
- Yi-Coder-9B-Chat.IQ3_XS.gguf +3 -0
- Yi-Coder-9B-Chat.IQ4_XS.gguf +3 -0
- Yi-Coder-9B-Chat.Q2_K.gguf +3 -0
- Yi-Coder-9B-Chat.Q3_K_L.gguf +3 -0
- Yi-Coder-9B-Chat.Q3_K_M.gguf +3 -0
- Yi-Coder-9B-Chat.Q3_K_S.gguf +3 -0
- Yi-Coder-9B-Chat.Q4_K_M.gguf +3 -0
- Yi-Coder-9B-Chat.Q5_K_M.gguf +3 -0
- Yi-Coder-9B-Chat.fp16.gguf +3 -0
.gitattributes
@@ -35,3 +35,16 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
 Yi-Coder-9B-Chat.Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
 Yi-Coder-9B-Chat.Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.fp16.gguf filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat-GGUF_imatrix.dat filter=lfs diff=lfs merge=lfs -text
+Yi-Coder-9B-Chat.IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
README.md
@@ -1,3 +1,46 @@
----
-
-
+---
+tags:
+- quantized
+- 2-bit
+- 3-bit
+- 4-bit
+- 5-bit
+- 6-bit
+- 8-bit
+- GGUF
+- text-generation
+- text-generation
+model_name: Yi-Coder-9B-Chat-GGUF
+base_model: 01-ai/Yi-Coder-9B-Chat
+inference: false
+model_creator: 01-ai
+pipeline_tag: text-generation
+quantized_by: MaziyarPanahi
+---
+# [MaziyarPanahi/Yi-Coder-9B-Chat-GGUF](https://huggingface.co/MaziyarPanahi/Yi-Coder-9B-Chat-GGUF)
+- Model creator: [01-ai](https://huggingface.co/01-ai)
+- Original model: [01-ai/Yi-Coder-9B-Chat](https://huggingface.co/01-ai/Yi-Coder-9B-Chat)
+
+## Description
+[MaziyarPanahi/Yi-Coder-9B-Chat-GGUF](https://huggingface.co/MaziyarPanahi/Yi-Coder-9B-Chat-GGUF) contains GGUF format model files for [01-ai/Yi-Coder-9B-Chat](https://huggingface.co/01-ai/Yi-Coder-9B-Chat).
+
+### About GGUF
+
+GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
+
+Here is an incomplete list of clients and libraries that are known to support GGUF:
+
+* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
+* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
+* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
+* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
+* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
+* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
+* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
+* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
+* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
+* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
+
+## Special thanks
+
+🙏 Special thanks to [Georgi Gerganov](https://github.com/ggerganov) and the whole team working on [llama.cpp](https://github.com/ggerganov/llama.cpp/) for making all of this possible.
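The README added above names llama-cpp-python as one of the libraries that can consume these GGUF files. A minimal, illustrative sketch of that path (not part of the commit): it assumes `huggingface_hub` and `llama-cpp-python` are installed and the repo is reachable through the standard Hub client; the choice of the Q4_K_M file and the `n_ctx` / `n_gpu_layers` values are arbitrary examples, not recommendations from the repo.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch one of the quantized files added in this commit (Q4_K_M chosen arbitrarily).
gguf_path = hf_hub_download(
    repo_id="MaziyarPanahi/Yi-Coder-9B-Chat-GGUF",
    filename="Yi-Coder-9B-Chat.Q4_K_M.gguf",
)

# Context size and GPU offload are illustrative; tune them to the available memory.
llm = Llama(model_path=gguf_path, n_ctx=4096, n_gpu_layers=-1)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a Python function that reverses a string."}]
)
print(out["choices"][0]["message"]["content"])
```

The file sizes recorded in the LFS pointers below give a rough guide to the trade-off: the IQ1/IQ2 quantizations sit around 2 GB, the mid-range K-quants between roughly 3.4 and 6.3 GB, and the unquantized fp16 file at about 17.7 GB.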
Yi-Coder-9B-Chat-GGUF_imatrix.dat
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:44f1eaefcc5304736ff6676e8c444fd7d24a5a92ac5115a4ecbac6063d26d9bf
+size 6843266
Yi-Coder-9B-Chat.IQ1_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1bbf60a4ca28905465b02abe85ea80b66fa20258be6189efae6f3a56399bc7ba
+size 2181640992
Yi-Coder-9B-Chat.IQ1_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:32c08b88f221cfbfbd341de972d19ace892e318e4366ddb7a77144177e32cf96
+size 2014573344
Yi-Coder-9B-Chat.IQ2_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7840c65649fed23c2407671956567d964e46e235a71be2afce155d267cf9b11f
+size 2708009760
Yi-Coder-9B-Chat.IQ3_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:bd5a58ce56aa8679a57e076a02d68748e62d78ab403cb6d9b8f043ff6c3bf632
+size 3717935904
Yi-Coder-9B-Chat.IQ4_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:eeb40eee5e3a1bded5586be3e34025cc9105de3b7cb2ba3c6edead01d7fe6442
+size 4785009440
Yi-Coder-9B-Chat.Q2_K.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6c577f1921fdd349678cc10fb0c503c8dc8caa989bce5d32c25e8f184d55c55c
+size 3354325792
Yi-Coder-9B-Chat.Q3_K_L.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5000851b9fb41e9349f39ee6e427654fb97e9e74eaaa9eda9c28a58cda7b816f
+size 4690752288
Yi-Coder-9B-Chat.Q3_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:74f8338c2bc02d56d42839a5dc6f16a52360ee55b050c4e7d741f804138833ca
+size 4324406048
Yi-Coder-9B-Chat.Q3_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a999a57db1751903d071950f987e39edf9fd50007a95eff1db57b83456e047e0
+size 3899208480
Yi-Coder-9B-Chat.Q4_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4da8426bd1a502426af3ef3d03b035d2a578c930b5966667d7493b317f20d181
+size 5328958240
Yi-Coder-9B-Chat.Q5_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:97fc107f1cac00e82eca61818f29d4dc69ab8dd9a9dd85f911053fc2700b4ec4
+size 6258258720
Yi-Coder-9B-Chat.fp16.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c017e4159c0c6fbb5a4beb50cda79fc8594a59df455e96ed83abfdc5ba9fc8a3
+size 17661112896
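Each pointer above follows the Git LFS pointer format: the `oid sha256:` value is the SHA-256 of the actual file content and `size` is its byte length, so a downloaded artifact can be checked against its pointer. A minimal sketch of that check, assuming the file already sits in the working directory (the path is illustrative, and the oid/size values are copied from the imatrix pointer in this commit):

```python
import hashlib
import os

def matches_lfs_pointer(path: str, expected_sha256: str, expected_size: int) -> bool:
    """Return True if the local file matches the oid/size recorded in its LFS pointer."""
    if os.path.getsize(path) != expected_size:
        return False
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks so multi-gigabyte GGUF files are not read into memory at once.
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256

# Values taken from the Yi-Coder-9B-Chat-GGUF_imatrix.dat pointer above.
print(matches_lfs_pointer(
    "Yi-Coder-9B-Chat-GGUF_imatrix.dat",
    "44f1eaefcc5304736ff6676e8c444fd7d24a5a92ac5115a4ecbac6063d26d9bf",
    6843266,
))
```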