GPT-2 was released in 2019; GPT-3, which followed in 2020, is a major leap forward. The model has 175 billion parameters (the values a neural network optimizes during training), compared with GPT-2's already vast 1.5 billion.
Developers have built, among other things:
* an all-purpose Excel function
* a recipe generator
* a layout generator that translates natural language to JSX
* a search engine
* a tool that has GPT-3 generate code for a machine learning model from a plain description of the dataset and required output, an early step toward no-code AI
* a demo in which GPT-3 writes Python code and explains its functions
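Most of the demos above share one underlying pattern: wrap the user's natural-language request in a few-shot prompt (a handful of input/output demonstrations followed by the new task) and let the model complete the text. The sketch below illustrates that prompt construction for the Excel-function case; the helper name `build_prompt` and the example pairs are hypothetical, not taken from any of the projects listed.

```python
# Illustrative sketch of few-shot prompt construction, the pattern
# behind many GPT-3 demos: show the model a few input/output pairs,
# then append the new request and let it complete the text.
# All names and example pairs here are hypothetical.

EXAMPLES = [
    ("Sum column A", "=SUM(A:A)"),
    ("Average of B2 to B10", "=AVERAGE(B2:B10)"),
]

def build_prompt(request: str) -> str:
    """Assemble a few-shot prompt: demonstrations first, then the new task."""
    lines = ["Translate the description into an Excel formula.", ""]
    for description, formula in EXAMPLES:
        lines.append(f"Description: {description}")
        lines.append(f"Formula: {formula}")
        lines.append("")
    lines.append(f"Description: {request}")
    lines.append("Formula:")  # the model completes the text from here
    return "\n".join(lines)

print(build_prompt("Count non-empty cells in column C"))
```

The resulting string would be sent as the `prompt` of a completion request; the model, having seen the demonstration pairs, tends to continue with a formula in the same format.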