GPT-3 is over-hyped | George Hotz and Lex Fridman

GPT-3 has no persistent memory beyond its context window. Yet it has shown a surprising capacity for task generalization, with pretty good results on few-shot learning. That warrants at least some benefit of the doubt that a larger pre-trained transformer could continue to get better at generalization. A sketch of what few-shot prompting looks like in practice follows below.
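
Few-shot learning here means in-context learning: a handful of task demonstrations are placed directly in the prompt, and the model completes the pattern without any gradient updates. The sketch below shows how such a prompt might be assembled; the sentiment task, the example labels, and the commented-out `complete` call are hypothetical placeholders standing in for an actual GPT-3 completion request.

```python
# Minimal sketch of few-shot (in-context) prompting: labeled demonstrations
# are concatenated into a single prompt and the model is asked to continue
# the pattern. No fine-tuning or weight updates are involved.


def build_few_shot_prompt(demonstrations, query):
    """Join labeled examples and an unlabeled query into one prompt string."""
    lines = [f"Input: {text}\nLabel: {label}" for text, label in demonstrations]
    lines.append(f"Input: {query}\nLabel:")
    return "\n\n".join(lines)


# Hypothetical sentiment-classification demonstrations (not from the source).
demos = [
    ("The movie was fantastic.", "positive"),
    ("I want my money back.", "negative"),
    ("An instant classic.", "positive"),
]

prompt = build_few_shot_prompt(demos, "Two hours I will never get back.")
print(prompt)

# completion = complete(prompt)  # hypothetical model call; the model is
#                                # expected to continue with " negative"
```

The point of the sketch is that the "learning" happens entirely at inference time, which is why scaling the pre-trained model is the main lever for improving it.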