# distillation-model

Here are 16 public repositories matching this topic...

A transformer-based masked language model for learning amino acid sequence representations. The model uses self-attention with custom gating and incorporates protein features for richer sequence understanding. It is trained using BERT-style masking on peptide sequences to learn contextual amino acid embeddings (a minimal illustrative sketch follows this entry).

  • Updated Mar 7, 2025
  • Python
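
The entry above describes a common recipe: embed residues, encode them with gated self-attention, and train by predicting masked positions. Below is a minimal, hypothetical sketch of that pattern in PyTorch. It is not the repository's code: the vocabulary layout, the sigmoid gate, the hyperparameters, and the simplified masking (always replacing with [MASK], omitting BERT's 80/10/10 split and the extra protein features) are all illustrative assumptions.

```python
import torch
import torch.nn as nn

# 20 standard amino acids plus two special tokens (layout is an assumption).
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
PAD, MASK = 0, 1
VOCAB = {aa: i + 2 for i, aa in enumerate(AMINO_ACIDS)}
VOCAB_SIZE = len(VOCAB) + 2


class GatedPeptideMLM(nn.Module):
    """Transformer encoder with a learned sigmoid gate on the encoder output."""

    def __init__(self, d_model=128, nhead=4, num_layers=2, max_len=512):
        super().__init__()
        self.tok_emb = nn.Embedding(VOCAB_SIZE, d_model, padding_idx=PAD)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, nhead, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.gate = nn.Linear(d_model, d_model)     # stand-in for "custom gating"
        self.head = nn.Linear(d_model, VOCAB_SIZE)  # per-position residue logits

    def forward(self, ids):
        pos = torch.arange(ids.size(1), device=ids.device)
        h = self.tok_emb(ids) + self.pos_emb(pos)
        h = self.encoder(h, src_key_padding_mask=(ids == PAD))
        h = h * torch.sigmoid(self.gate(h))         # elementwise gating
        return self.head(h)


def mask_tokens(ids, mask_prob=0.15):
    """Simplified BERT-style masking: hide ~15% of residues behind [MASK];
    only those positions contribute to the loss (labels elsewhere are -100)."""
    labels = ids.clone()
    chosen = (torch.rand(ids.shape) < mask_prob) & (ids != PAD)
    if not chosen.any():          # guarantee at least one training target
        chosen[0, 0] = True
    labels[~chosen] = -100
    masked = ids.clone()
    masked[chosen] = MASK
    return masked, labels


# Toy usage: one training step on a single peptide sequence.
seq = torch.tensor([[VOCAB[aa] for aa in "MKTAYIAKQR"]])
model = GatedPeptideMLM()
inputs, labels = mask_tokens(seq)
logits = model(inputs)                              # (batch, seq_len, vocab)
loss = nn.functional.cross_entropy(
    logits.view(-1, VOCAB_SIZE), labels.view(-1), ignore_index=-100
)
loss.backward()
```

The gate here is a single linear-plus-sigmoid applied to the encoder output; the repository's "custom gating" is likely integrated inside the attention layers themselves, which this sketch does not attempt to reproduce.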
