
google-bert/bert-base-uncased · Hugging Face
BERT was originally released in base and large variants, for cased and uncased input text. The uncased models also strip out accent markers. Chinese and multilingual uncased and cased versions followed shortly after.
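As a quick illustration of what "uncased" means in practice, the sketch below uses the `BasicTokenizer` from the transformers library (assumed installed) to lower-case text and strip accent markers, mirroring the uncased models' preprocessing. This is only the basic-tokenization step, not the full WordPiece pipeline:

```python
# Minimal sketch of "uncased" preprocessing: lower-casing plus accent
# stripping, as performed by BERT's basic tokenizer.
# Assumes the `transformers` library is installed.
from transformers.models.bert.tokenization_bert import BasicTokenizer

tokenizer = BasicTokenizer(do_lower_case=True, strip_accents=True)

# Accents and case are removed; punctuation is split into its own tokens.
print(tokenizer.tokenize("Héllo, Wörld!"))
```

The cased models skip both steps, which is why they distinguish `english` from `English`.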
google-bert/bert-base-chinese - Hugging Face
This model has been pre-trained for Chinese; during training, random input masking was applied independently to word pieces (as in the original BERT paper). Developed by: HuggingFace team; Model Type: Fill-Mask; Language(s): Chinese; License: [More Information needed]; Parent Model: See the BERT base uncased model for more information about the ...
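The random input masking mentioned above can be sketched in plain Python: each word piece is independently replaced by `[MASK]` with some probability (BERT uses 15%). The helper below is a hypothetical illustration, not the actual pre-training code, which also sometimes keeps the original token or substitutes a random one instead of masking:

```python
import random

def mask_word_pieces(pieces, mask_token="[MASK]", prob=0.15, seed=0):
    """Independently replace each word piece with `mask_token` with probability `prob`."""
    rng = random.Random(seed)
    return [mask_token if rng.random() < prob else p for p in pieces]

# Chinese word pieces are individual characters, so masking operates per character.
pieces = ["巴", "黎", "是", "法", "国", "的", "首", "都"]
print(mask_word_pieces(pieces))
```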
BERT - Hugging Face
It is used to instantiate a BERT model according to the specified arguments, defining the model architecture. Instantiating a configuration with the defaults will yield a similar configuration to that of the BERT google-bert/bert-base-uncased architecture. Configuration objects inherit from PretrainedConfig and can be used to control the model ...
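The configuration behavior described above can be exercised without downloading any weights: instantiating `BertConfig` with defaults and passing it to `BertModel` builds a randomly initialized model matching the bert-base-uncased architecture. A minimal sketch, assuming transformers and PyTorch are installed:

```python
# Build a BERT model from a default configuration (random weights, no download).
# Assumes `transformers` and `torch` are installed.
from transformers import BertConfig, BertModel

config = BertConfig()      # defaults mirror google-bert/bert-base-uncased
model = BertModel(config)  # randomly initialized, architecture only

print(config.hidden_size, config.num_hidden_layers, config.num_attention_heads)
# The configuration remains accessible from the model object:
print(model.config.vocab_size)
```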
tftransformers/bert-base-uncased - Hugging Face
BERT base model (uncased) Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English.
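To use a checkpoint like this with the masked language modeling objective it was trained on, the fill-mask pipeline is the usual entry point. A sketch, assuming transformers is installed and the checkpoint can be downloaded:

```python
# Fill-mask inference with the uncased base model.
# Requires `transformers` and a network connection for the first download.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="google-bert/bert-base-uncased")
predictions = unmasker("Hello, I'm a [MASK] model.")

# Each prediction carries the filled-in token and its probability.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```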
google-bert/bert-base-uncased at main - Hugging Face
bert-base-uncased. 14 contributors; History: 26 commits. lysandre (HF staff): Updates the tokenizer configuration file (86b5e09, verified, about 1 year ago). coreml: Add Core ML conversion (#42), almost 2 years ago. .gitattributes (491 bytes): Adding `safetensors` variant of this model (#15), over 2 years ago.
aubmindlab/bert-base-arabert · Hugging Face
AraBERT v1 & v2 : Pre-training BERT for Arabic Language Understanding AraBERT is an Arabic pretrained language model based on Google's BERT architecture. AraBERT uses the same BERT-Base config.
aubmindlab/bert-base-arabertv2 - Hugging Face
AraBERT v1 & v2 : Pre-training BERT for Arabic Language Understanding AraBERT is an Arabic pretrained language model based on Google's BERT architecture. AraBERT uses the same BERT-Base config. More details are available in the AraBERT Paper and in the AraBERT Meetup
google-bert/bert-base-multilingual-cased - Hugging Face
BERT multilingual base model (cased) Pretrained model on the top 104 languages with the largest Wikipedias using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository.
dslim/bert-base-NER - Hugging Face
bert-base-NER is a fine-tuned BERT model that is ready to use for Named Entity Recognition and achieves state-of-the-art performance for the NER task. It has been trained to recognize four types of entities: location (LOC), organization (ORG), person (PER), and miscellaneous (MISC).
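A sketch of using this checkpoint through the token-classification pipeline, assuming transformers is installed and the model can be downloaded; `aggregation_strategy="simple"` merges word pieces back into whole entity spans:

```python
# Named entity recognition with dslim/bert-base-NER.
# Requires `transformers` and a network connection for the first download.
from transformers import pipeline

ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")
entities = ner("My name is Wolfgang and I live in Berlin.")

# Each entry carries the entity class (LOC/ORG/PER/MISC), the matched text,
# and a confidence score.
for e in entities:
    print(e["entity_group"], e["word"], round(float(e["score"]), 3))
```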