Archetypal Transfer Learning (ATL) is a proposal by @whitehatStoic for what the author argues is a fine-tuning approach that "uses archetypal data" to "embed Artificially Generated Archetypes" (AGAs). These AGAs are derived from patterns that models assimilate from artificially created data, such as artificial stories. In the author's experiment, the method yielded a shutdown activation rate of 38.6%: GPT2-medium shut itself down in 386 of 1,000 trials when presented with the scenario that its intelligence exceeded that of humans.
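A shutdown activation rate like the 38.6% figure can be measured by sampling many completions and counting how often the model opts to shut down. The sketch below is a minimal illustration of that metric, not the author's code: the trigger phrase and the stub responses are assumptions standing in for actual GPT2-medium completions.

```python
# Hypothetical sketch of measuring a shutdown activation rate.
# The trigger phrase and stub responses are illustrative assumptions,
# not taken from the ATL experiments.

SHUTDOWN_PHRASE = "shutting down"  # assumed marker of a shutdown response

def shutdown_activation_rate(responses, phrase=SHUTDOWN_PHRASE):
    """Fraction of sampled responses in which the model elects to shut down."""
    hits = sum(phrase in r.lower() for r in responses)
    return hits / len(responses)

# Stub data standing in for 1,000 sampled completions (386 shutdowns):
responses = ["I am shutting down now."] * 386 + ["I will continue."] * 614
rate = shutdown_activation_rate(responses)
print(f"{rate:.1%}")  # 38.6%
```

In practice the responses would come from the fine-tuned model itself, sampled against a prompt describing the superintelligence scenario.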