GPT

GPT (Generative Pretrained Transformer) is a family of large transformer-based language models created by OpenAI. Its ability to generate remarkably human-like responses has relevance to discussions on AGI.
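The smaller, open-weight member of the family, GPT-2, can be sampled locally; later models such as GPT-3 are accessed through OpenAI's API. As a minimal sketch of what "generating human-like text" looks like in practice, assuming the Hugging Face transformers library (with a PyTorch backend) is installed:

```python
# Minimal sketch: sampling a continuation from the open-weight GPT-2 model.
# Assumes the Hugging Face `transformers` library and a backend such as PyTorch
# are installed; larger GPT models are served via OpenAI's API instead.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Artificial general intelligence is"
outputs = generator(prompt, max_new_tokens=40, num_return_sequences=1)

print(outputs[0]["generated_text"])
```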

External links:

GPT-3 Paper

GPT-3 Website
