Backpropagation

What is backpropagation?
Backpropagation is a technique used to train certain classes of neural networks: it is essentially a principle that allows a machine learning model to adjust its weights based on the errors in its past outputs.

Backpropagation is sometimes referred to as 'backpropagation of errors'.

As a technique, backpropagation uses gradient descent: it calculates the gradient of the loss function at the network's output and propagates it backwards through the layers of a deep neural network. The result is an adjusted weight for each neuron. Although backpropagation can be applied in both supervised and unsupervised settings, it is generally considered a supervised learning method, since it needs a known target output to compute the error in the first place.
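To make these mechanics concrete, here is a minimal sketch in Python with NumPy: a tiny two-layer network trained on the XOR problem. The layer sizes, sigmoid activation, learning rate, and epoch count are illustrative assumptions, not part of any standard recipe.

```python
# Minimal backpropagation sketch: a 2-4-1 network trained on XOR.
# All hyperparameters here are illustrative choices.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy supervised data: inputs X and known target outputs y.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialized weights for the two layers.
W1 = rng.normal(size=(2, 4))
W2 = rng.normal(size=(4, 1))
lr = 0.5  # learning rate for gradient descent

for epoch in range(10000):
    # Forward pass: data flows in one direction through the network.
    h = sigmoid(X @ W1)    # hidden layer activations
    out = sigmoid(h @ W2)  # network output

    # Error at the output (derivative of a squared-error loss).
    err = out - y

    # Backward pass: propagate the gradient through each layer
    # via the chain rule, yielding a gradient for every weight.
    d_out = err * out * (1 - out)       # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)  # gradient at the hidden layer

    # Gradient descent step: adjust each weight against its gradient.
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h

print(out.round(2))  # outputs should approach the targets [0, 1, 1, 0]
```

Each training iteration repeats the same pattern the text describes: a forward pass produces an output, the loss gradient is computed at that output, and the backward pass distributes it layer by layer to update the weights.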

With the advent of simple feedforward neural networks, where data flows in only one direction, engineers found that they could use backpropagation to adjust the network's weights after each pass. Backpropagation can thus be viewed as a way to train a system based on its own activity, tuning how accurately the neural network processes certain inputs or how it reaches some other desired state.
