
If the debate seems recent, that’s because it is (writing from 2020): the notorious GPT-2 model was announced by OpenAI in February 2019, but it wasn’t fully released until nearly nine months later (although it was replicated before that).

Approximate size comparison of GPT-2, represented by a human skeleton, and GPT-3 approximated by the bones of a Tyrannosaurus rex.

Whatever the mechanism, GPT-3 is so immense, and trained on so large a corpus, that prompts alone let it do things GPT-2 could manage only comparatively poorly, and only after substantive fine-tuning of its weights.
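To make that contrast concrete, here is a minimal sketch of few-shot prompting against the OpenAI completion API as it existed around GPT-3’s 2020 beta. The engine name, the translation task, and the in-prompt examples are illustrative assumptions, not details from this article; the point is simply that the "training" happens inside the prompt rather than in the weights.

```python
# A minimal sketch of few-shot prompting with GPT-3, assuming the
# 2020-era OpenAI Python client (`pip install openai`) and an API key.
# The engine name and example task are illustrative, not prescriptive.
import openai

openai.api_key = "sk-..."  # your key here

# With GPT-3, a handful of in-prompt examples can stand in for the
# weight fine-tuning that GPT-2 would need for the same task.
prompt = (
    "Translate English to French.\n"
    "sea otter => loutre de mer\n"
    "cheese => fromage\n"
    "plush giraffe => girafe en peluche\n"
    "hello world =>"
)

response = openai.Completion.create(
    engine="davinci",   # GPT-3 base engine in the 2020 beta
    prompt=prompt,
    max_tokens=16,
    temperature=0.0,    # near-deterministic output for a lookup-style task
    stop="\n",          # stop at the end of the completed line
)
print(response["choices"][0]["text"].strip())
```

Doing the equivalent with GPT-2 would mean assembling a labeled dataset and running a fine-tuning pass over the model’s weights before it could attempt the task at all.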

Many startups, researchers, and tinkerers already had ambitious projects built on GPT-2, and many of them have since switched to GPT-3, with mixed results.
