BERT base model (uncased)

Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in the paper *BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding* (Devlin et al., 2018) and first released in the google-research/bert repository. This model is uncased: it makes no distinction between english and English.
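To illustrate the MLM objective, here is a minimal sketch of BERT's token-corruption scheme: roughly 15% of input positions are selected as prediction targets, and of those, 80% are replaced with `[MASK]`, 10% with a random vocabulary token, and 10% are left unchanged. The function name, toy vocabulary, and seeded RNG below are illustrative, not part of this model's code.

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """Sketch of BERT-style MLM corruption (assumed helper, not library code).

    Returns (corrupted_tokens, labels) where labels[i] is the original
    token at masked positions and None elsewhere.
    """
    rng = random.Random(seed)
    corrupted = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:       # ~15% of positions are targets
            labels[i] = tok                # model must recover this token
            roll = rng.random()
            if roll < 0.8:                 # 80%: replace with [MASK]
                corrupted[i] = "[MASK]"
            elif roll < 0.9:               # 10%: replace with a random token
                corrupted[i] = rng.choice(vocab)
            # remaining 10%: keep the original token unchanged
    return corrupted, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(tokens))
corrupted, labels = mask_tokens(tokens, vocab)
```

The model is then trained to predict the original token at each labeled position from the corrupted sequence, which forces it to learn bidirectional context.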

Disclaimer: The team releasing BERT did not write a model card for this model so this model card has been written by the HF中国镜像站 team.
