Definition: AI scaling


The common approach to building more capable AI models is to use more data, more GPUs and larger datacenters. OpenAI co-founder Ilya Sutskever, widely considered a central figure in AI, claims that scaling is reaching diminishing returns and that more research should go into fundamentally new ways of building models.
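
To illustrate the diminishing-returns argument (an illustrative sketch, not a figure attributed to Sutskever), empirical scaling laws typically model a network's test loss as a power law in training compute, roughly L(C) ≈ a · C^(-b), where a and b are fitted constants and b is small. Under that assumption, each doubling of compute shrinks the loss by only a fixed factor of 2^(-b), so ever-larger amounts of data, GPUs and datacenter capacity buy progressively smaller improvements.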

Sutskever does not deny that more scaling yields more advanced systems; however, he says the keys to future success will be more advanced training methods, not just more training, and more efficient algorithms, not just more passes through the data. See AI training vs. inference and neural network.