neoTCR
Popular repositories
- bertviz (forked from jessevig/bertviz, Jupyter Notebook)
  Tool for visualizing attention in the Transformer model (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.)
- ProtTrans (forked from agemagician/ProtTrans, Jupyter Notebook)
  ProtTrans provides state-of-the-art pre-trained models for proteins, trained on thousands of Summit GPUs and hundreds of Google TPUs using Transformer models.
- SeqVec (forked from Rostlab/SeqVec, Python)
  Modelling the Language of Life: deep learning on protein sequences
- tape (forked from songlab-cal/tape, Python)
  Tasks Assessing Protein Embeddings (TAPE), a set of five biologically relevant semi-supervised learning tasks spread across different domains of protein biology.
- tape-neurips2019 (forked from songlab-cal/tape-neurips2019, Python)
  Deprecated: Tasks Assessing Protein Embeddings (TAPE), a set of five biologically relevant semi-supervised learning tasks spread across different domains of protein biology.
- provis (forked from salesforce/provis, Python)
  Official code repository of "BERTology Meets Biology: Interpreting Attention in Protein Language Models"
Repositories
- MiniFold (forked from hypnopump/MiniFold)
  MiniFold: deep learning for protein structure prediction, inspired by DeepMind's AlphaFold algorithm
- TorchProteinLibrary (forked from lamoureux-lab/TorchProteinLibrary)
  PyTorch library of layers acting on protein representations
- MolDQN-pytorch (forked from aksub99/MolDQN-pytorch)
  A PyTorch implementation of "Optimization of Molecules via Deep Reinforcement Learning"
- deepchem (forked from deepchem/deepchem)
  Democratizing deep learning for drug discovery, quantum chemistry, materials science, and biology
- deepmind-research (forked from google-deepmind/deepmind-research)
  Implementations and illustrative code to accompany DeepMind publications
People
This organization has no public members.