I've been catching up with recent progress and wanted to share brief summaries of the results I found interesting. Training (mainly) on amino acid sequences * Large language models generate functional protein sequences across diverse families (Jul'21, Salesforce, 43 citations) enables sequence generation conditioned on things like cellular...
In May 2018 (almost 3 years ago) OpenAI published their "AI and Compute" blog post, where they highlighted the trend of increasing compute spent on training the largest AI models and speculated that the trend might continue into the future. This...
Hi everyone! Here I link to a sketch of my thoughts on how recent advances in language modeling may connect to, or lead to, future advances in developing machine learning models with abstract reasoning capabilities. This was done as a side project last year during my research fellowship at the...