Linear Regression from Scratch

The Hypothesis & Cost Function

The Equation

We assume a linear relationship between the features (X) and the target (y):

y = (weight * X) + bias
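The hypothesis can be written as a one-line function. A minimal sketch (the function name `predict` and the sample values are illustrative, not from the lesson):

```python
import numpy as np

def predict(X, weight, bias):
    """Hypothesis: straight-line prediction for each sample in X."""
    return weight * X + bias

# Example: weight=2, bias=1 maps X=[1, 2, 3] to [3, 5, 7]
X = np.array([1.0, 2.0, 3.0])
print(predict(X, weight=2.0, bias=1.0))  # → [3. 5. 7.]
```

Because `X` is a NumPy array, the same expression predicts for every sample at once.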

The Cost Function (MSE)

How do we know if our line is good? We measure the error using Mean Squared Error (MSE):

MSE = (1/n) * Σ(actual - predicted)²

Our goal is to find the weight and bias that minimize this error.
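The MSE formula above translates directly into code. A minimal sketch (the function name `mse` and the example numbers are illustrative):

```python
import numpy as np

def mse(actual, predicted):
    """Mean Squared Error: the average of the squared residuals."""
    return np.mean((actual - predicted) ** 2)

actual = np.array([3.0, 5.0, 7.0])
predicted = np.array([2.5, 5.0, 8.0])
# Residuals are 0.5, 0, -1, so MSE = (0.25 + 0 + 1) / 3 ≈ 0.4167
print(mse(actual, predicted))
```

Squaring the residuals keeps positive and negative errors from canceling out, and penalizes large misses more heavily.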

Gradient Descent

To minimize the error, we take small steps down the "error curve":

  1. Calculate the gradients (the slope of the error with respect to weight and bias).
  2. Update each parameter: weight = weight - (learning_rate * gradient), and likewise for bias.
  3. Repeat until convergence.
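The three steps above can be sketched as a training loop. This is one possible implementation, not the lesson's exact code: the gradients come from differentiating the MSE, and the function name `fit`, the learning rate, the epoch count, and the synthetic data are all illustrative choices.

```python
import numpy as np

def fit(X, y, learning_rate=0.05, epochs=1000):
    """Fit weight and bias by gradient descent on the MSE."""
    weight, bias = 0.0, 0.0
    n = len(X)
    for _ in range(epochs):
        predicted = weight * X + bias
        error = y - predicted
        # Gradients of MSE with respect to weight and bias
        grad_w = (-2.0 / n) * np.sum(X * error)
        grad_b = (-2.0 / n) * np.sum(error)
        # Step downhill: subtract learning_rate * gradient
        weight -= learning_rate * grad_w
        bias -= learning_rate * grad_b
    return weight, bias

# Synthetic data generated by y = 2x + 1; training should recover these values
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * X + 1.0
w, b = fit(X, y)
print(round(w, 2), round(b, 2))  # ≈ 2.0 and 1.0
```

Each epoch nudges both parameters a small step in the direction that reduces the error, so the recovered weight and bias converge toward the values that generated the data.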
