Sunday, October 2, 2022

What is backpropagation?

Backpropagation is a strategy for computing the gradient of the loss in a neural network; it does not itself update the weights. The updates are performed by the training algorithm, such as Gradient Descent, Stochastic Gradient Descent, or Adaptive Moment Estimation (Adam).
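To make the separation concrete, here is a minimal sketch (an assumed toy setup, not from the reference): a single linear neuron y = w*x + b with a squared-error loss. The backward pass computes the gradients via the chain rule; a separate gradient-descent step then applies the update.

```python
import numpy as np

# Toy setup (assumption for illustration): one linear neuron, squared-error loss.
rng = np.random.default_rng(0)
x, target = 2.0, 10.0
w, b = rng.normal(), rng.normal()
lr = 0.05  # learning rate belongs to the update rule, not to backpropagation

for step in range(200):
    # Forward pass
    y = w * x + b
    loss = (y - target) ** 2

    # Backward pass (backpropagation): chain rule gives the gradients
    dloss_dy = 2 * (y - target)
    dloss_dw = dloss_dy * x   # dy/dw = x
    dloss_db = dloss_dy * 1   # dy/db = 1

    # Update step (gradient descent) -- a separate algorithm entirely
    w -= lr * dloss_dw
    b -= lr * dloss_db
```

Swapping gradient descent for SGD or Adam would change only the last two lines; the backward pass stays the same.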


Lastly, since backpropagation is a general technique for calculating gradients, we can use it for any differentiable function, not just neural networks. Additionally, backpropagation isn't restricted to feedforward networks; we can apply it to recurrent neural networks as well.
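As a sketch of that generality (the function here is my own example, not from the reference): backpropagation is just the chain rule applied in reverse over recorded intermediates, so it works on an arbitrary differentiable function such as f(a, b) = sin(a·b) + a², with no network in sight.

```python
import math

def f_and_grads(a, b):
    # Forward pass: compute f(a, b) = sin(a * b) + a**2, recording intermediates
    p = a * b
    s = math.sin(p)
    q = a ** 2
    f = s + q
    # Backward pass: propagate df/d(node) from the output back to the inputs
    df_ds, df_dq = 1.0, 1.0
    df_dp = df_ds * math.cos(p)        # d sin(p)/dp = cos(p)
    df_da = df_dp * b + df_dq * 2 * a  # a feeds both p and q
    df_db = df_dp * a
    return f, df_da, df_db

f, da, db = f_and_grads(1.5, 0.5)

# Sanity check against a central finite difference in a
eps = 1e-6
num_da = (f_and_grads(1.5 + eps, 0.5)[0] - f_and_grads(1.5 - eps, 0.5)[0]) / (2 * eps)
```

This is exactly what automatic-differentiation libraries do under the hood, just with the bookkeeping automated.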


It is also worth noting the difference between feedforward neural networks and backpropagation: the former term refers to a type of network without feedback connections forming closed loops, while the latter is a way of computing the partial derivatives during training.

References:

https://www.baeldung.com/cs/neural-networks-backprop-vs-feedforward
