Definition: AI hyperparameter


A value that configures the machine learning process. Hyperparameters are selected by the neural network designer before any training is done and may be adjusted between training runs. Examples are the number of hidden layers in the network, the number of neurons per layer and the number of training epochs (passes through the dataset). The batch size (the number of training samples processed before the parameters are updated) and the learning rate (the size of the steps taken) are also hyperparameters. See AI training vs. inference.
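
To make this concrete, the short Python sketch below (not part of the original definition; all names and values are illustrative) shows how these hyperparameters might be declared up front, before training starts.

# Illustrative hyperparameters, fixed before training begins.
hyperparameters = {
    "hidden_layers": 2,       # number of hidden layers in the network
    "neurons_per_layer": 64,  # width of each hidden layer
    "epochs": 10,             # passes through the training dataset
    "batch_size": 32,         # training samples processed per parameter update
    "learning_rate": 0.01,    # size of the steps taken during optimization
}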

Parameters Are the Weights and Biases
Both hyperparameters and parameters (weights and biases) are set before training begins; however, AI engineers adjust the hyperparameters between training runs, whereas the weights and biases (the parameters) are continuously updated by the training software itself. Large language models (LLMs) can have billions or even trillions of parameters. See neural network and AI weights and biases.
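
The following Python sketch, again illustrative rather than taken from the source, trains a single-neuron model by gradient descent. The hyperparameters (learning rate and number of epochs) are chosen by the engineer and never change; the parameters (a weight and a bias) are updated automatically on every pass.

import numpy as np

# Hyperparameters: chosen by the engineer, untouched by the loop below.
learning_rate = 0.1
epochs = 200

# Toy data: y = 3x + 1 with a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 1.0 + rng.normal(0, 0.05, size=100)

# Parameters: initialized once, then updated by the training code itself.
weight, bias = 0.0, 0.0

for _ in range(epochs):
    pred = weight * x + bias
    error = pred - y
    # Gradients of the mean squared error with respect to weight and bias.
    grad_w = 2.0 * np.mean(error * x)
    grad_b = 2.0 * np.mean(error)
    # Parameter update: this is what training changes.
    weight -= learning_rate * grad_w
    bias -= learning_rate * grad_b

print(f"learned weight ~ {weight:.2f}, bias ~ {bias:.2f}")  # roughly 3.00 and 1.00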