btlm-7b-base-v0.2 / README.md
---
license: mit
library_name: transformers
tags:
- bittensor
- decentralization
- subnet 9
datasets:
- tiiuae/falcon-refinedweb
---
<img src="https://cdn-uploads.huggingface.co/production/uploads/655a0bdf3ff5ba1b1b1c01b7/y1dKBZh8UhII6wtbs5boj.png" alt="drawing" width="512"/>
# 🚀 **BTLM-7B v0.2**
BTLM (Bittensor Language Model) is a collection of pretrained generative text models. This repository hosts the 7B pretrained base model, converted to the HF中国镜像站 Transformers format.
### Model Details
Bittensor's decentralized subnet 9 facilitated the development and release of the first version of the BTLM-7B model, a large language model designed for a variety of applications. In creating this model, significant effort was made to ensure its effectiveness and safety, setting a new standard in the decentralized open-source AI community.
**This is a pretrained base model, which should be further fine-tuned for most use cases.**

**Training subnetwork:** 9

**Checkpoint:** 06-06-2024
[**Subnet 9 Network Leaderboard**](https://huggingface.co/spaces/macrocosm-os/pretraining-leaderboard)
[**Top Bittensor Model Checkpoint**](https://huggingface.co/jw-hf-test/jw2)
### Inference
```python
import torch
import transformers
from transformers import AutoTokenizer

model = "CortexLM/btlm-7b-base-v0.2"
tokenizer = AutoTokenizer.from_pretrained(model)

# Build a text-generation pipeline in bfloat16.
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
)

sequences = pipeline(
    "Tell me about decentralization.",
    max_length=200,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```
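The `top_k=10` argument above restricts sampling to the 10 highest-probability tokens at each step, which trades some diversity for coherence. As a minimal, self-contained sketch of what top-k filtering does (using NumPy rather than the pipeline's internals, which this model card does not specify):

```python
import numpy as np

def top_k_filter(logits, k):
    """Keep the k largest logits; set the rest to -inf so softmax zeroes them.
    (Ties at the k-th value may keep slightly more than k entries.)"""
    kth = np.sort(logits)[-k]
    return np.where(logits >= kth, logits, -np.inf)

def sample_top_k(logits, k, rng):
    """Sample one token index from the top-k renormalized distribution."""
    filtered = top_k_filter(np.asarray(logits, dtype=float), k)
    probs = np.exp(filtered - filtered.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

rng = np.random.default_rng(0)
logits = [2.0, 1.0, 0.5, -1.0, -3.0]
token = sample_top_k(logits, k=2, rng=rng)  # only index 0 or 1 can be drawn
```

With `k=2`, tokens outside the two largest logits receive zero probability, so the sampler never selects them.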
### Benchmark
| Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
| --- | --- | --- | --- | --- | --- | --- |
Results for this checkpoint can be reproduced with the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness).
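As a sketch, the tasks in the table above can be run with the harness's CLI. The task names below are assumed from the harness's standard suite, and exact flags vary between harness versions, so check `lm_eval --help` for your install:

```shell
pip install lm-eval

lm_eval --model hf \
    --model_args pretrained=CortexLM/btlm-7b-base-v0.2,dtype=bfloat16 \
    --tasks arc_challenge,hellaswag,mmlu,truthfulqa,winogrande,gsm8k \
    --batch_size auto
```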
## License
BTLM-7B is licensed under the [MIT License](https://opensource.org/license/mit), a permissive license that allows for reuse with virtually no restrictions.