Commit dd44e08 (merge of 2 parents: dd8fecf + e2b52e6)

# Conflicts:
#	ch09-artificial_neural_networks/README.md

1 file changed: 1 addition & 2 deletions

File tree

ch09-artificial_neural_networks/README.md

@@ -15,7 +15,6 @@ The general flow for forward propagation includes the following steps:
![Forward propagation 2](readme_assets/Ex-ANN-exercise-solution-2.png)

## Back Propagation
Phase A: Setup
1. Define the ANN architecture: This involves defining the input nodes, the output nodes, the number of hidden layers, the number of neurons in each hidden layer, the activation functions used, and more. We will dive into some of these details in the next section. For now, we will stick to the same ANN architecture that we used in the previous section.
2. Initialize the ANN weights: The weights in the ANN must be initialized to some value. There are various approaches to this; the key principle is that the weights will be continually adjusted as the ANN learns from training examples, but we need to start somewhere.
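Phase A can be sketched in a few lines of NumPy. The layer sizes and the small-random-normal initialization scheme below are illustrative assumptions, not choices prescribed by the chapter:

```python
import numpy as np

# Hypothetical architecture, assumed for illustration:
# 2 input nodes, one hidden layer of 3 neurons, 1 output node.
layer_sizes = [2, 3, 1]

rng = np.random.default_rng(seed=42)

# Initialize weights with small random values and biases with zeros;
# they will be adjusted during training, but we need a starting point.
weights = [rng.normal(loc=0.0, scale=0.1, size=(n_in, n_out))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n_out) for n_out in layer_sizes[1:]]

print([w.shape for w in weights])  # [(2, 3), (3, 1)]
```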
Phase B: Forward propagation: This is the same process that we covered in the previous section. The same calculations are carried out; however, the predicted output is compared with the actual class of each example in the training set in order to train the network.
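Forward propagation in Phase B can be sketched in the same NumPy setting. The sigmoid activation and the tiny hand-picked weights here are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    """Propagate one example through the network, layer by layer."""
    a = x
    for w, b in zip(weights, biases):
        a = sigmoid(a @ w + b)
    return a

# Tiny hypothetical network: 2 inputs -> 2 hidden neurons -> 1 output.
weights = [np.array([[0.5, -0.5], [0.3, 0.8]]), np.array([[1.0], [-1.0]])]
biases = [np.zeros(2), np.zeros(1)]

x = np.array([1.0, 0.0])   # one training example
y_true = 1.0               # its actual class
y_pred = forward(x, weights, biases)[0]
error = y_true - y_pred    # compared against the actual class during training
```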
@@ -27,4 +26,4 @@ Phase C: Training
![Back propagation](readme_assets/ANN-backpropagation-chain-calc-adjust.png)

## Summary
![Chapter 9 summary](readme_assets/Ch9-Summary.png)
