This is a repo for tracking my daily paper reading habit.
Date | Title | First Author | One Sentence Description |
---|---|---|---|
5-7-2021 | Generating novel protein sequences using Gibbs sampling of masked language models | S.R. Johnson et al. | The authors describe a framework that uses unsupervised language models such as ProtBert and ESM to sample novel protein sequences via Gibbs sampling; they evaluate the approach purely computationally (see the sketch below the table). |
11-7-2021 | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | Jacob Devlin et al. | The original BERT paper, demonstrating the use of bidirectional Transformers for various language understanding tasks. BERT is an encoder-only model and is mostly useful for creating embedding vectors for downstream learning tasks. |
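
As a reminder of the Gibbs-sampling idea from the 5-7-2021 entry, here is a minimal sketch, not the authors' code: it assumes the Hugging Face `transformers` library and the `Rostlab/prot_bert` checkpoint (both my assumptions), and simply masks one residue at a time and resamples it from the model's conditional distribution.

```python
# Minimal sketch of Gibbs sampling from a masked protein language model.
# Assumes the Rostlab/prot_bert checkpoint; ProtBert expects space-separated
# amino acids, one token per residue, with a [CLS] token prepended.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("Rostlab/prot_bert")
model = AutoModelForMaskedLM.from_pretrained("Rostlab/prot_bert")
model.eval()

def gibbs_step(sequence, position, temperature=1.0):
    """Mask one residue and resample it from the model's conditional distribution."""
    tokens = sequence.split()
    tokens[position] = tokenizer.mask_token
    inputs = tokenizer(" ".join(tokens), return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # +1 skips the [CLS] token the tokenizer prepends.
    probs = torch.softmax(logits[0, position + 1] / temperature, dim=-1)
    new_id = torch.multinomial(probs, num_samples=1).item()
    # For brevity this does not filter out special tokens like [PAD].
    tokens[position] = tokenizer.convert_ids_to_tokens(new_id)
    return " ".join(tokens)

# Arbitrary seed sequence; sweep over all positions a few times.
seq = " ".join("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
for sweep in range(5):
    for pos in range(len(seq.split())):
        seq = gibbs_step(seq, pos)
print(seq.replace(" ", ""))
```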