Lab 4: Gradient Descent for Best-Fit Line

CILO: 2
Weeks: 17–20

Aim

Implement batch gradient descent to learn slope and intercept for linear regression on toy data.

Objectives

  1. Express simple linear regression as minimising the mean squared error (MSE) of a line y = mx + c.
  2. Implement the batch gradient-descent update rules for m and c.
  3. Verify that the learned parameters recover the true slope and intercept of the toy data.

Algorithm / Procedure

  1. Define X and y for a noise-free linear relation (here y = 2x).
  2. Initialise m = 0 and c = 0, and choose a learning rate.
  3. For each epoch: compute predictions, compute the gradients of the MSE with respect to m and c, and update both parameters.
  4. After training, print the learned m and c.
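
The update rules in step 3 follow from differentiating the mean squared error with respect to each parameter. For predictions ŷᵢ = m xᵢ + c:

\[
E = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - (m x_i + c)\bigr)^2
\]
\[
\frac{\partial E}{\partial m} = -\frac{2}{n}\sum_{i=1}^{n} x_i\,(y_i - \hat{y}_i),
\qquad
\frac{\partial E}{\partial c} = -\frac{2}{n}\sum_{i=1}^{n} (y_i - \hat{y}_i)
\]

These two partial derivatives are exactly the quantities D_m and D_c computed in the code below, and each update step moves m and c a small distance (the learning rate) against its gradient.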

Python Code

import numpy as np

# Toy data with an exact linear relation: y = 2x (slope 2, intercept 0).
X = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 6, 8, 10], dtype=float)

m, c = 0.0, 0.0   # initial slope and intercept
L = 0.01          # learning rate
epochs = 1000     # number of gradient-descent iterations

for _ in range(epochs):
    y_pred = m * X + c                              # current predictions
    D_m = (-2 / len(X)) * np.sum(X * (y - y_pred))  # dE/dm (gradient of MSE w.r.t. slope)
    D_c = (-2 / len(X)) * np.sum(y - y_pred)        # dE/dc (gradient of MSE w.r.t. intercept)
    m -= L * D_m                                    # step against the gradient
    c -= L * D_c

print("Slope (m):", round(m, 4), "Intercept (c):", round(c, 4))

Sample Output (expected)

Slope (m): ≈ 2.0 Intercept (c): ≈ 0.0

With this learning rate and epoch count the parameters approach, but do not exactly reach, m = 2 and c = 0; the printed values should be close to 2 and 0, and move closer still if you increase the number of epochs.
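
As a sanity check, one way to validate the gradient-descent result is to compare it against NumPy's closed-form least-squares fit (`np.polyfit` with degree 1). The sketch below reuses the same data and hyperparameters as the lab code; both methods should agree closely on this noise-free data.

```python
import numpy as np

X = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 6, 8, 10], dtype=float)  # y = 2x

# Re-run the gradient-descent loop from the lab.
m, c = 0.0, 0.0
L, epochs = 0.01, 1000
for _ in range(epochs):
    y_pred = m * X + c
    m -= L * (-2 / len(X)) * np.sum(X * (y - y_pred))
    c -= L * (-2 / len(X)) * np.sum(y - y_pred)

# Closed-form least-squares fit: polyfit returns [slope, intercept] for degree 1.
slope_cf, intercept_cf = np.polyfit(X, y, 1)

print(f"gradient descent: m={m:.4f}, c={c:.4f}")
print(f"closed form:      m={slope_cf:.4f}, c={intercept_cf:.4f}")
```

If the two sets of parameters differ noticeably, the usual suspects are a learning rate that is too large (divergence) or too small (not enough epochs to converge).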