Artificial Intelligence

Google’s new trillion-parameter AI language model is almost 6 times bigger than GPT-3 | The Next Web

Read more at thenextweb.com

A trio of researchers from the Google Brain team recently unveiled the next big thing in AI language models: a massive one-trillion-parameter transformer system. The next biggest model out there, as far as we’re aware, is OpenAI’s GPT-3, which uses a measly 175 billion parameters. Background: Language models are capable of performing a variety […]
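The headline’s “almost 6 times bigger” follows directly from the two parameter counts quoted above. As a minimal sketch, assuming the round figures from the excerpt (one trillion vs. 175 billion parameters), the back-of-the-envelope ratio works out as follows:

```python
# Rough size comparison using the round parameter counts quoted in the article.
google_model_params = 1_000_000_000_000  # ~1 trillion parameters (Google Brain model)
gpt3_params = 175_000_000_000            # 175 billion parameters (OpenAI's GPT-3)

ratio = google_model_params / gpt3_params
print(f"Size ratio: {ratio:.1f}x")       # ~5.7x, i.e. "almost 6 times bigger"
```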