MSA Transformer
Axial attention. Multiple Sequence Alignments (MSAs).
Written by Liam Bai, who lives and works in Boston trying to build useful things. He's on LinkedIn and Twitter.
Learning protein representations. Representation learning, transfer learning, masked language models, BERT on proteins.
Predicting protein structure and function. Multiple Sequence Alignments (MSAs), the protein folding problem, the Potts model, Direct Coupling Analysis (DCA), EVCouplings.