GPT

Edited by Ruby, Multicore, Ben Pace, A_donor, et al. last updated 27th Aug 2022

GPT (Generative Pre-trained Transformer) is a family of large transformer-based language models created by OpenAI. Its ability to generate remarkably human-like text makes it relevant to discussions of AGI.

External links:

GPT-3 Paper

GPT-3 Website

Posts tagged GPT
Karma · Title · Author(s) · Age · Comments (Q = question post, Ω = AI Alignment Forum crosspost)

89 · Collection of GPT-3 results (Ω) · Kaj_Sotala · 5y · 24
70 · To what extent is GPT-3 capable of reasoning? (QΩ) · TurnTrout, Daniel Kokotajlo · 5y · 73
65 · GPT-3: a disappointing paper · nostalgebraist · 5y · 43
63 · GPT-3 Fiction Samples · gwern · 5y · 15
56 · $1000 bounty for OpenAI to show whether GPT3 was "deliberately" pretending to be stupider than it is (Ω) · Bird Concept · 5y · 39
54 · Two Small Experiments on GPT-2 · jimrandomh · 7y · 28
111 · Alignment As A Bottleneck To Usefulness Of GPT-3 (Ω) · johnswentworth · 5y · 57
72 · How "honest" is GPT-3? (QΩ) · abramdemski, gwern · 5y · 18
39 · 'This Waifu Does Not Exist': 100,000 StyleGAN & GPT-2 samples · gwern · 7y · 6
37 · Does GPT-2 Understand Anything? · Douglas Summers-Stay · 6y · 23
37 · 345M version GPT-2 released · [anonymous] · 6y · 0
29 · Replicating the replication crisis with GPT-3? · skybrian · 5y · 10
19 · How well can the GPT architecture solve the parity task? (Q) · FactorialCode, gwern · 5y · 3
260 · larger language models may disappoint you [or, an eternally unfinished draft] (Ω) · nostalgebraist · 4y · 31
140 · Developmental Stages of GPTs (Ω) · orthonormal · 5y · 72
(Showing 15 of 377 tagged posts.)