“
OpenAI’s 2019 model GPT-2 had 1.5 billion parameters,[96] and despite flashes of promise, it did not work very well. But once transformers grew past 100 billion parameters, they unlocked major breakthroughs in AI’s command of natural language, and could suddenly answer questions on their own with intelligence and subtlety. GPT-3 used 175 billion parameters in 2020,[97] and a year later DeepMind’s 280-billion-parameter model Gopher performed even better.[98] Also in 2021, Google debuted a 1.6-trillion-parameter transformer called Switch, releasing it as open source for anyone to freely apply and build on.[99]
”