migtissera/HelixNet · Hugging Face
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
this paper and first released in
this repository. This model is trained on uppercase amino acids: it only works with capital letter amino acids.
Model description
ProtBert is based on the Bert model, which was pretrained on a large corpus of protein...
Rostlab/prot_bert · Hugging Face
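A minimal sketch of querying the model with the transformers fill-mask pipeline (not part of the excerpt above; the example sequence is made up). Per the note in the card, amino acids must be uppercase, and the space-separated single-letter format follows the model's tokenizer convention.

```python
from transformers import BertForMaskedLM, BertTokenizer, pipeline

# Load ProtBert; do_lower_case=False keeps the uppercase amino-acid tokens intact.
tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
model = BertForMaskedLM.from_pretrained("Rostlab/prot_bert")
unmasker = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Uppercase amino acids separated by spaces; [MASK] marks the residue to predict.
# The sequence below is a hypothetical example.
print(unmasker("D L I P T S S K L V V [MASK] D T S L Q V K"))
```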
The text embedding set trained by the Jina AI Finetuner team.
Intended Usage & Model Info
jina-embeddings-v2-base-en is an English, monolingual embedding model supporting a sequence length of up to 8192 tokens.
It is based on a Bert architecture (JinaBert) that supports the symmetric bidirectional variant of ALiBi to allow longer sequence lengths.
The backbone...
jinaai/jina-embeddings-v2-base-en · Hugging Face
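A minimal sketch of producing embeddings with this model via transformers. The custom JinaBert code, including the encode() helper used here, is pulled in with trust_remote_code=True; the example sentences and the explicit max_length value are assumptions for illustration, within the advertised 8192-token limit.

```python
from transformers import AutoModel

# trust_remote_code=True loads the JinaBert implementation shipped with the model.
model = AutoModel.from_pretrained(
    "jinaai/jina-embeddings-v2-base-en", trust_remote_code=True
)

# encode() returns one embedding vector per input sentence.
embeddings = model.encode(
    ["How is the weather today?", "What is the current weather like today?"],
    max_length=8192,  # assumed setting; the model supports sequences up to 8192 tokens
)
print(embeddings.shape)  # (2, 768): the base model produces 768-dimensional embeddings
```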
The Nemotron-3 8B family is available in the Azure AI Model Catalog, HuggingFace, and the NVIDIA AI Foundation Model hub on the NVIDIA NGC Catalog. It includes base, chat, and question-and-answer (Q&A) models that are designed to solve a variety of downstream tasks. Table 1 shows the full family of foundation models.
Model | Variant | Key Benefit
Base  | N...