Definition: AI weights and biases


Numerical values applied to the inputs and neurons of a neural network during the training phase to adjust the outcome. Initially, weights are randomly assigned to each input, and using a "gradient descent" algorithm, they are continually adjusted to increase or decrease the importance of that input and generate the best output. See neural network.
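
The following is a minimal sketch, not from the source, of how gradient descent nudges a single weight toward a value that produces the best output. The starting weight, learning rate and training pairs are hypothetical values chosen only for illustration.

```python
def train_single_weight(inputs, targets, learning_rate=0.1, epochs=50):
    weight = 0.5          # start from an arbitrary value standing in for a random one
    for _ in range(epochs):
        for x, target in zip(inputs, targets):
            prediction = weight * x             # neuron output for this input
            error = prediction - target         # how far off the output is
            gradient = error * x                # slope of the squared error w.r.t. the weight
            weight -= learning_rate * gradient  # step against the gradient
    return weight

# Learn the mapping y = 2x from a few example points; the weight approaches 2.0.
print(train_single_weight([1, 2, 3], [2, 4, 6]))
```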




Weights and Biases
Weights adjust the network to create accurate outcomes. For example, in a system designed to recognize letters and digits, each pixel becomes an input to the network. Pixels in more critical locations of the image can be assigned higher weights. Biases are used to fine-tune the network and add flexibility, as in the sketch below.
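
The following is a minimal sketch, not from the source, of one neuron combining pixel inputs. The pixel values, weights and bias are made-up numbers; the higher weights mark the pixel positions treated as more important, and the bias shifts the threshold the neuron must exceed.

```python
import math

def neuron_output(pixels, weights, bias):
    # Weighted sum of the pixel values, shifted by the bias, squashed to the 0..1 range.
    total = sum(w * p for w, p in zip(weights, pixels)) + bias
    return 1 / (1 + math.exp(-total))   # sigmoid activation

pixels  = [0.0, 1.0, 1.0, 0.0]   # a tiny four-pixel "image"
weights = [0.1, 0.9, 0.9, 0.1]   # center pixels weighted more heavily
bias    = -0.5                   # fine-tunes when the neuron activates
print(neuron_output(pixels, weights, bias))
```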






The Network Has Hidden Layers
Neural networks can have many hidden layers, all of which are mathematically tied together. This is a character recognition example showing how pixels are turned into neurons. See convolutional neural network.
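
The following is a minimal sketch, not from the source, of a forward pass through one hidden layer. The layer sizes, the 3x3 "image" and the random weights are illustrative assumptions; a real character recognizer would have one input per pixel and far larger layers.

```python
import math
import random

def layer(inputs, weights, biases):
    # Each output neuron: sigmoid of (weighted sum of all inputs + its bias).
    return [1 / (1 + math.exp(-(sum(w * x for w, x in zip(row, inputs)) + b)))
            for row, b in zip(weights, biases)]

random.seed(0)
n_pixels, n_hidden, n_classes = 9, 4, 2    # 3x3 image, 4 hidden neurons, 2 output classes

w_hidden = [[random.uniform(-1, 1) for _ in range(n_pixels)] for _ in range(n_hidden)]
b_hidden = [0.0] * n_hidden
w_out    = [[random.uniform(-1, 1) for _ in range(n_hidden)] for _ in range(n_classes)]
b_out    = [0.0] * n_classes

image  = [0, 1, 0, 1, 1, 1, 0, 1, 0]       # pixels of a tiny "+" shape become the inputs
hidden = layer(image, w_hidden, b_hidden)  # hidden layer ties all pixel inputs together
print(layer(hidden, w_out, b_out))         # two untrained class scores
```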