Alternatives to Backpropagation

Remembering that the update of the weights is done by

$$w_j \to w_j - \eta \frac{\partial C}{\partial w_j}$$

and that $\frac{\partial C}{\partial w_j}$ is nothing more than

$$\frac{\partial C}{\partial w_j} = \lim_{\epsilon \to 0} \frac{C(w + \epsilon e_j) - C(w)}{\epsilon},$$

we can empirically calculate it if we take a really small $\epsilon$ (here $e_j$ is the unit vector along the $j$-th weight).
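As a quick sketch of this idea (with an assumed toy quadratic cost for a single linear neuron, not a full network), the finite-difference estimate can be written as:

```python
import numpy as np

def cost(w, x, y):
    # Toy quadratic cost for a single linear neuron: C = (w.x - y)^2 / 2
    return 0.5 * (np.dot(w, x) - y) ** 2

def numerical_gradient(w, x, y, eps=1e-6):
    # Estimate dC/dw_j by nudging each weight by a small eps
    grad = np.zeros_like(w)
    for j in range(len(w)):
        w_plus = w.copy()
        w_plus[j] += eps
        grad[j] = (cost(w_plus, x, y) - cost(w, x, y)) / eps
    return grad

w = np.array([0.5, -0.3])
x = np.array([1.0, 2.0])
y = 1.0
print(numerical_gradient(w, x, y))
# Analytic gradient for this cost, for comparison: (w.x - y) * x
print((np.dot(w, x) - y) * x)
```

For a small enough `eps` the two printed vectors agree to several decimal places, which is why this trick is commonly used as a gradient check.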

But is it any good? Actually no: the backpropagation algorithm has complexity $O(W)$, while empirically calculating the derivative has complexity $O(W^2)$, where $W$ is the number of weights in the network.

This is due to the fact that backpropagation computes all $W$ derivatives with a single feedforward pass followed by a single backward pass, each costing $O(W)$.

While to empirically calculate the derivatives you need to re-run the feedforward propagation once per weight (calculate $C(w + \epsilon e_j)$ for every $j = 1, \dots, W$), and each forward pass itself costs $O(W)$.
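The cost comparison above can be made concrete with a small back-of-the-envelope count (the figure of $W = 10^6$ weights is an assumption for illustration):

```python
# Compare the total work of the two approaches for a network with W weights.
W = 1_000_000
forward_cost = W  # one forward pass touches every weight once: O(W)

# Backpropagation: one forward pass + one backward pass of similar cost: O(W)
backprop_total = 2 * forward_cost

# Numerical estimate: one forward pass per weight, plus the baseline C(w): O(W^2)
numerical_total = (W + 1) * forward_cost

print(numerical_total // backprop_total)  # roughly W/2 times more work
```

For a million weights the empirical approach does about half a million times more work, which is why it is only used as a sanity check, never for training.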