You are viewing revision 1.6.0, last edited by MiguelDev

Archetypal Transfer Learning (ATL) is a proposal by @whitehatStoic for what is argued by the author to be a fine tuning approach that "uses archetypal data" to "embed Synthetic Archetypes". These Synthetic Archetypes are derived from patterns that models assimilate from archetypal data, such as artificial stories. The method yielded a shutdown activation rate of 29.33% allowing GPT2-xl. 

The team, consisting of @MiguelDev, @marc/er, @Abhay Chowdhry, Mazianni and @Linda Linsefors is working to improve the build to a 100%. 

 ...

(Read More)