After all that work, it's finally time to train our neural network. We'll use the BFGS numerical optimization algorithm and have a look at the results.
Supporting Code:
[Link]
Yann LeCun's Efficient BackProp paper: [Link]
More on BFGS:
[Link]
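
Roughly what the training step looks like: below is a minimal sketch (not the supporting code linked above) of fitting a 2-3-1 network with scipy's BFGS optimizer. The NeuralNetwork class, helper names, toy data, and iteration limit here are illustrative assumptions, not the series' actual code.

import numpy as np
from scipy import optimize


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def sigmoid_prime(z):
    return np.exp(-z) / (1.0 + np.exp(-z)) ** 2


class NeuralNetwork:
    def __init__(self, input_size=2, hidden_size=3, output_size=1):
        # Randomly initialize the two weight matrices.
        self.W1 = np.random.randn(input_size, hidden_size)
        self.W2 = np.random.randn(hidden_size, output_size)

    def forward(self, X):
        # Propagate inputs through the network.
        self.z2 = X @ self.W1
        self.a2 = sigmoid(self.z2)
        self.z3 = self.a2 @ self.W2
        return sigmoid(self.z3)

    def cost(self, X, y):
        yHat = self.forward(X)
        return 0.5 * np.sum((y - yHat) ** 2)

    def cost_gradients(self, X, y):
        # Backpropagation: gradients of the cost w.r.t. W1 and W2.
        yHat = self.forward(X)
        delta3 = -(y - yHat) * sigmoid_prime(self.z3)
        dJdW2 = self.a2.T @ delta3
        delta2 = (delta3 @ self.W2.T) * sigmoid_prime(self.z2)
        dJdW1 = X.T @ delta2
        return dJdW1, dJdW2

    # Pack/unpack the weights into a single vector for scipy.
    def get_params(self):
        return np.concatenate((self.W1.ravel(), self.W2.ravel()))

    def set_params(self, params):
        n1 = self.W1.size
        self.W1 = params[:n1].reshape(self.W1.shape)
        self.W2 = params[n1:].reshape(self.W2.shape)


def train(net, X, y):
    # Objective for scipy: return the cost and the flattened gradient.
    def objective(params):
        net.set_params(params)
        dJdW1, dJdW2 = net.cost_gradients(X, y)
        return net.cost(X, y), np.concatenate((dJdW1.ravel(), dJdW2.ravel()))

    result = optimize.minimize(objective, net.get_params(), jac=True,
                               method="BFGS", options={"maxiter": 200})
    net.set_params(result.x)
    return result


if __name__ == "__main__":
    # Toy data: hours of sleep / hours of study -> test score, normalized.
    X = np.array([[3, 5], [5, 1], [10, 2]], dtype=float)
    y = np.array([[75], [82], [93]], dtype=float)
    X = X / np.max(X, axis=0)
    y = y / 100.0

    net = NeuralNetwork()
    train(net, X, y)
    print("Final cost:", net.cost(X, y))
    print("Predictions:", net.forward(X).ravel())

Because BFGS uses both the cost and its gradient, the backpropagation work from the earlier parts plugs straight into the optimizer via jac=True.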
In this series, we will build and train a complete artificial neural network in Python. New videos every other Friday.
Part 1: Data + Architecture
Part 2: Forward Propagation
Part 3: Gradient Descent
Part 4: Backpropagation
Part 5: Numerical Gradient Checking
Part 6: Training
Part 7: Overfitting, Testing, and Regularization
Follow me on Twitter for updates:
@stephencwelch