Monday, August 8, 2022

Forward and Backward Pass in a Neural Network

 The "forward pass" refers to calculation process, values of the output layers from the inputs data. It's traversing through all neurons from first to last layer.

A loss is then calculated from the output values using a loss function.
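For example, with mean squared error (one common choice; the post doesn't commit to a particular loss function):

```python
def mse_loss(y_pred, y_true):
    # Mean squared error: average of the squared differences
    return np.mean((y_pred - y_true) ** 2)

y_true = np.array([1.0])  # made-up target for the sketch above
loss = mse_loss(y, y_true)
```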

And then "backward pass" refers to process of counting changes in weights (de facto learning), using gradient descent algorithm (or similar). Computation is made from last layer, backward to the first layer.

Together, one forward pass and one backward pass make up one "iteration".

During one iteration, you usually pass a subset of the data set, called a "mini-batch" or "batch" (although "batch" can also mean the entire data set, hence the prefix "mini").

"Epoch" means passing the entire data set in batches.

One epoch therefore contains (number_of_items / batch_size) iterations.
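A sketch of that bookkeeping, with made-up numbers (1000 items, batch size 32); when the sizes don't divide evenly, the leftover items usually form a smaller final batch or are dropped:

```python
import numpy as np

X = np.random.rand(1000, 3)  # hypothetical data set: 1000 items, 3 features
batch_size = 32
iterations_per_epoch = len(X) // batch_size  # 1000 // 32 = 31 full batches

for epoch in range(3):
    order = np.random.permutation(len(X))  # reshuffle each epoch
    for i in range(iterations_per_epoch):
        batch = X[order[i * batch_size : (i + 1) * batch_size]]
        # ...one iteration: forward pass, loss, backward pass on `batch`
```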

Backpropagation. The aim of backpropagation (the backward pass) is to distribute the total error back through the network and update the weights so as to minimize the cost function (the loss).

In simple terms, after each forward pass through a network, backpropagation performs a backward pass while adjusting the model's parameters (weights and biases).
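Tying the sketches above together into a minimal training loop (a single sample and a made-up epoch count, just to show the rhythm of forward pass, loss, backward pass):

```python
for epoch in range(100):
    h, y = forward(x)                  # forward pass
    loss = mse_loss(y, y_true)         # loss from the output values
    backward(x, h, y, y_true, lr=0.1)  # backward pass: adjust weights/biases
```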

