A value, selected by the neural network designer before any training is done, that directs the machine learning process. Unlike the model's parameters, hyperparameters are not adjusted by the training software itself. Examples are the number of hidden layers in the network, the number of neurons per layer and the number of training epochs (passes through the dataset). The batch size (number of training samples processed before the weights are updated) and the learning rate (how much the weights are changed) are also hyperparameters. See
AI training vs. inference and
neural network.
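The idea can be sketched in a few lines of Python. This is a minimal illustration with hypothetical names, not a real framework API: the hyperparameters are collected up front and remain fixed while a (stubbed) training loop runs.

```python
# Hypothetical sketch: hyperparameters are chosen by the designer up front
# and stay fixed for the duration of a training run.
hyperparameters = {
    "hidden_layers": 2,       # number of hidden layers in the network
    "neurons_per_layer": 64,  # width of each hidden layer
    "epochs": 10,             # passes through the dataset
    "learning_rate": 0.01,    # how much the weights change per update
}

def train(hp):
    """Run a stubbed training loop using the fixed hyperparameters."""
    epochs_run = 0
    for _ in range(hp["epochs"]):
        epochs_run += 1  # one full pass through the dataset would go here
    return epochs_run

print(train(hyperparameters))  # → 10
```

Frameworks differ in how such settings are passed (constructor arguments, config files, command-line flags), but the principle is the same: these values are inputs to training, not outputs of it.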
Parameters Are the Weights and Biases
Both hyperparameters and parameters (weights and biases) are set at the beginning of training; the difference is in who changes them. AI engineers adjust the hyperparameters between training runs, whereas the weights and biases (parameters) are continuously updated by the training software during the training stages. Large language models (LLMs) can have billions of parameters. See
AI weights and biases.
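The contrast can be shown with a tiny gradient-descent loop, sketched here under simple assumptions (a single linear neuron fitting y = 2x + 1). The learning rate and epoch count are hyperparameters the engineer fixes; the weight and bias are parameters the loop updates on every step.

```python
# Hyperparameters: set by the engineer, never touched by the training loop.
learning_rate = 0.1
epochs = 200

# Parameters: updated automatically on every training step.
w, b = 0.0, 0.0

# Toy dataset sampled from the target function y = 2x + 1.
data = [(x, 2 * x + 1) for x in [0.0, 1.0, 2.0, 3.0]]

for _ in range(epochs):
    for x, y in data:
        pred = w * x + b
        error = pred - y
        # Gradient descent: nudge each parameter against its gradient,
        # scaled by the fixed learning rate.
        w -= learning_rate * error * x
        b -= learning_rate * error

print(round(w, 2), round(b, 2))  # w approaches 2.0, b approaches 1.0
```

After training, the parameters have moved close to the target values (w near 2, b near 1) while the hyperparameters are exactly as the engineer set them, which is the distinction the entry describes.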