Linear Regression
Implement linear regression trained with gradient descent.
Functions to implement
1. predict(X, weights, bias)
Make predictions with the linear model: y_hat = w · x + b.
- Input: Features X (2D), weights (list), bias (float)
- Output: Predictions (list)
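One way to sketch `predict` in plain Python (no NumPy assumed): a dot product of each row with the weights, plus the bias.

```python
def predict(X, weights, bias):
    """Return the linear model output w . x + b for each row of X."""
    return [sum(w * x for w, x in zip(weights, row)) + bias for row in X]
```

For example, `predict([[1], [2]], [2.0], 1.0)` returns `[3.0, 5.0]`.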
2. compute_loss(y_true, y_pred)
Compute mean squared error loss.
- Input: True values and predictions
- Output: MSE loss value
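A minimal sketch of `compute_loss`: MSE is the average of squared residuals over the n samples.

```python
def compute_loss(y_true, y_pred):
    """Mean squared error: (1/n) * sum of (true - pred)^2."""
    n = len(y_true)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
```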
3. compute_gradients(X, y_true, y_pred)
Compute gradients of MSE with respect to weights and bias.
- Input: Features, true values, predictions
- Output: (weight_gradients, bias_gradient)
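For MSE = (1/n) Σ (ŷ_i − y_i)², the gradients are dL/dw_j = (2/n) Σ (ŷ_i − y_i) · X[i][j] and dL/db = (2/n) Σ (ŷ_i − y_i). A sketch following those formulas:

```python
def compute_gradients(X, y_true, y_pred):
    """Gradients of MSE with respect to each weight and the bias.

    dL/dw_j = (2/n) * sum_i (y_pred_i - y_true_i) * X[i][j]
    dL/db   = (2/n) * sum_i (y_pred_i - y_true_i)
    """
    n = len(y_true)
    errors = [p - t for p, t in zip(y_pred, y_true)]
    n_features = len(X[0])
    weight_gradients = [
        (2 / n) * sum(e * row[j] for e, row in zip(errors, X))
        for j in range(n_features)
    ]
    bias_gradient = (2 / n) * sum(errors)
    return weight_gradients, bias_gradient
```

Note the sign convention: gradients point uphill, so the training loop subtracts them.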
4. train(X, y, lr, n_epochs)
Train linear regression using gradient descent.
- Input: Features X, targets y, learning rate, number of epochs
- Output: (weights, bias, loss_history)
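A self-contained sketch of `train` (helpers inlined so it runs on its own), using batch gradient descent with zero-initialized weights as the notes specify:

```python
def train(X, y, lr, n_epochs):
    """Batch gradient descent for linear regression.

    Weights start at zero; each epoch uses all samples for one update.
    Returns (weights, bias, loss_history).
    """
    n = len(X)
    n_features = len(X[0])
    weights = [0.0] * n_features  # initialize weights to zeros
    bias = 0.0
    loss_history = []
    for _ in range(n_epochs):
        # forward pass: y_hat = w . x + b for every sample
        y_pred = [sum(w * x for w, x in zip(weights, row)) + bias for row in X]
        errors = [p - t for p, t in zip(y_pred, y)]
        loss_history.append(sum(e * e for e in errors) / n)  # MSE this epoch
        # descend along the MSE gradients for every weight and the bias
        for j in range(n_features):
            grad_j = (2 / n) * sum(e * row[j] for e, row in zip(errors, X))
            weights[j] -= lr * grad_j
        bias -= lr * (2 / n) * sum(errors)
    return weights, bias, loss_history
```

On the example below (y = 2x + 1 with lr=0.1, 100 epochs) this converges to weights near [2.0] and bias near 1.0.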
Examples
# Simple 1D regression: y = 2x + 1
X = [[1], [2], [3], [4]]
y = [3, 5, 7, 9]
weights, bias, losses = train(X, y, lr=0.1, n_epochs=100)
# weights ≈ [2.0], bias ≈ 1.0
predictions = predict(X, weights, bias)
# predictions ≈ [3, 5, 7, 9]
Notes
- Initialize weights to zeros
- X is (n_samples, n_features)
- Use batch gradient descent (all samples per update)