Forward & Backpropagation — How Networks Learn · Page 1 of 2
Forward Propagation
Forward Pass (Prediction)
The forward pass moves data through the network, layer by layer:
Input → Layer 1 → Layer 2 → Layer 3 → Output
Example: 3-Layer Network
Architecture:
- Input layer: 3 neurons
- Hidden layer 1: 4 neurons
- Hidden layer 2: 2 neurons
- Output layer: 1 neuron
Forward Pass:
1. Input: x = [0.5, 0.2, 0.8]
2. Layer 1:
z1 = W1 · x + b1 (matrix multiplication)
a1 = ReLU(z1) (activation)
3. Layer 2:
z2 = W2 · a1 + b2
a2 = ReLU(z2)
4. Output:
z3 = W3 · a2 + b3
output = sigmoid(z3)
5. Prediction: output = 0.73
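The steps above can be sketched in NumPy. The weights here are random placeholders for illustration, not the actual values that produced 0.73:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

rng = np.random.default_rng(0)  # placeholder weights, for illustration only
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)  # Layer 1: 3 inputs -> 4 neurons
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)  # Layer 2: 4 -> 2
W3, b3 = rng.normal(size=(1, 2)), np.zeros(1)  # Output:  2 -> 1

x = np.array([0.5, 0.2, 0.8])   # input from step 1
a1 = relu(W1 @ x + b1)          # z1 = W1·x + b1, then ReLU
a2 = relu(W2 @ a1 + b2)         # z2 = W2·a1 + b2, then ReLU
output = sigmoid(W3 @ a2 + b3)  # z3 = W3·a2 + b3, then sigmoid
print(output)                   # a single value between 0 and 1
```

Because the final activation is a sigmoid, the output is always in (0, 1), which is why it can be read as a probability.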
Vectorization
In practice, we process batches of samples at once:
# Single sample (slow)
for sample in dataset:
    output = forward(sample)

# Batch (fast, what GPUs do)
outputs = forward(batch_of_samples)  # All at once!
Why? GPUs are designed for matrix operations.
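A minimal demonstration of why batching works: looping over samples and multiplying the whole batch by the weight matrix compute exactly the same numbers, but the batched version is a single matrix operation the hardware can parallelize. The layer size and batch size below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))        # one layer: 3 inputs -> 4 outputs
batch = rng.normal(size=(100, 3))  # 100 samples, 3 features each

# Loop: one matrix-vector product per sample
looped = np.stack([W @ x for x in batch])

# Batch: a single matrix-matrix product for all samples at once
batched = batch @ W.T

print(np.allclose(looped, batched))  # same results, one fused operation
```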
Loss Function
After the forward pass, we calculate how wrong the prediction is:
Loss = measure of how far prediction is from truth
For binary classification:
Loss = -y×log(ŷ) - (1-y)×log(1-ŷ) ← Cross-entropy
For regression:
Loss = (y - ŷ)² ← Squared error (averaged over a batch: mean squared error)
Goal: Minimize loss by adjusting weights.
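Both formulas translate directly to code. Plugging in the prediction 0.73 from the forward pass with an assumed true label y = 1:

```python
import numpy as np

def binary_cross_entropy(y, y_hat):
    # -y·log(ŷ) - (1-y)·log(1-ŷ)
    return -y * np.log(y_hat) - (1 - y) * np.log(1 - y_hat)

def squared_error(y, y_hat):
    # (y - ŷ)²; averaging this over a batch gives mean squared error
    return (y - y_hat) ** 2

y, y_hat = 1.0, 0.73  # assumed true label; prediction from the forward pass
print(binary_cross_entropy(y, y_hat))  # ≈ 0.3147 — lower is better
print(squared_error(y, y_hat))         # (1 - 0.73)² ≈ 0.0729
```

Note that for y = 1 the cross-entropy reduces to -log(ŷ): it is near 0 when the prediction is confidently correct and grows without bound as ŷ → 0.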