migtissera/HelixNet · Hugging Face
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
this paper and first released in
this repository. This model was trained on uppercase amino acids and therefore only works with capital-letter sequences.
Model description
ProtBert is based on the BERT model, which was pretrained on a large corpus of protein...
Rostlab/prot_bert · Hugging Face
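The ProtBert excerpt above stresses that the model only accepts uppercase amino acids. As a minimal sketch of the preprocessing this implies, the helper below (a hypothetical function, not part of the model card) uppercases a sequence, maps the rare amino acids U, Z, O, and B to X as the ProtBert card describes, and space-separates the residues so each one becomes its own token:

```python
import re

def format_for_protbert(seq: str) -> str:
    """Uppercase a protein sequence, map rare amino acids (U, Z, O, B)
    to X, and insert spaces so each residue is tokenized individually.
    Sketch based on the preprocessing described in the ProtBert card."""
    seq = seq.upper()
    seq = re.sub(r"[UZOB]", "X", seq)
    return " ".join(seq)

print(format_for_protbert("mktu"))  # M K T X
```

The resulting string is what you would pass to the model's tokenizer; without the spaces, a BERT-style tokenizer would not split the sequence residue by residue.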
Is DNA all you need?
In new work, we report Evo, a genomic foundation model that learns across the fundamental languages of biology: DNA, RNA, and proteins. Evo is capable of both prediction tasks and generative design, from molecular to whole genome scale. https://t.co/BPo9ggHhmp
Patrick Hsu · x.com
The text embedding set trained by Jina AI, Finetuner team.
Intended Usage & Model Info
jina-embeddings-v2-base-en is an English, monolingual embedding model supporting a sequence length of 8192 tokens.
It is based on a BERT architecture (JinaBert) that supports the symmetric bidirectional variant of ALiBi to allow longer sequence lengths.
The backbone...
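The JinaBert excerpt mentions a symmetric bidirectional variant of ALiBi. As a rough sketch of that idea (an illustration of ALiBi's position bias, not JinaBert's actual implementation), the code below builds per-head attention biases of the form -slope * |i - j|, which penalize attention by distance symmetrically in both directions instead of masking one side as the original causal ALiBi does:

```python
import numpy as np

def alibi_slopes(num_heads: int) -> np.ndarray:
    # Geometric slope schedule from the ALiBi paper for power-of-two
    # head counts: head h (1-indexed) gets slope 2 ** (-8 * h / num_heads).
    return np.array([2.0 ** (-8.0 * (h + 1) / num_heads) for h in range(num_heads)])

def symmetric_alibi_bias(seq_len: int, num_heads: int) -> np.ndarray:
    """Bias added to attention logits: -slope * |i - j| for each head.
    Symmetric in i and j, so every token attends in both directions."""
    pos = np.arange(seq_len)
    dist = np.abs(pos[None, :] - pos[:, None])        # (L, L) distance matrix
    slopes = alibi_slopes(num_heads)                  # (H,) one slope per head
    return -slopes[:, None, None] * dist[None, :, :]  # (H, L, L)

bias = symmetric_alibi_bias(4, 2)
```

Because the bias depends only on relative distance, no learned position embeddings are needed, which is what lets such models extrapolate to longer sequence lengths like 8192.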