100-trillion-parameter GPT-3-style models will be a thing by 2023, according to Nvidia CEO Jensen Huang.
He made the prediction in his GTC keynote speech. I've posted the YouTube video, and it should start at the relevant timestamp.
As Jensen points out, the human brain has roughly 125 trillion synapses, so by some measures, especially in language processing, machines could soon be approaching human-level capability. Document summarization, phrase completion, and the ability to produce code from plain English are a few examples.
As Jensen makes clear elsewhere in the video, computers writing the software that runs computers is a major goal of the research. With the ability both to understand language input and, in response, to code their own software, computers will be showing many of the hallmarks of intelligence while outperforming people at numerous tasks.
Assuming they're not too busy mining bitcoin.