Logistic Regression

medium · classification, logistic, sigmoid

Implement binary logistic regression trained with gradient descent.

Functions to implement

1. sigmoid(z)

Compute the sigmoid function.

  • Input: A number or a list of numbers
  • Output: sigmoid(z) = 1 / (1 + exp(-z)), applied element-wise when z is a list
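
One possible pure-Python sketch (the scalar-vs-list dispatch and the numerically stable branch for negative z are implementation choices, not part of the spec):

import math

def sigmoid(z):
    # Apply the sigmoid element-wise when z is a list.
    if isinstance(z, list):
        return [sigmoid(v) for v in z]
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    # For large negative z, exp(-z) overflows; this branch avoids that.
    e = math.exp(z)
    return e / (1.0 + e)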

2. predict_proba(X, weights, bias)

Predict probabilities using the logistic model.

  • Input: Features X, weights, bias
  • Output: List of probabilities (0 to 1)
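
A sketch of how this might look, assuming X is a list of rows and reusing the sigmoid above:

def predict_proba(X, weights, bias):
    # Linear score w·x + b per row, then squash through the sigmoid.
    scores = [sum(w * x for w, x in zip(weights, row)) + bias for row in X]
    return sigmoid(scores)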

3. predict(X, weights, bias, threshold=0.5)

Predict binary labels.

  • Input: Features X, weights, bias, threshold
  • Output: List of 0s and 1s
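
A one-line sketch on top of predict_proba (mapping ties at the threshold to 1 is an assumption):

def predict(X, weights, bias, threshold=0.5):
    # Label 1 when the predicted probability meets or exceeds the threshold.
    return [1 if p >= threshold else 0 for p in predict_proba(X, weights, bias)]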

4. compute_loss(y_true, y_pred_proba, eps=1e-15)

Compute binary cross-entropy loss.

  • Input: True labels, predicted probabilities, clipping constant eps
  • Output: Loss value
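
The loss is BCE = -(1/n) * sum(y * log(p) + (1 - y) * log(1 - p)). One way to sketch it, with eps used to clip each probability away from 0 and 1:

import math

def compute_loss(y_true, y_pred_proba, eps=1e-15):
    # Clip each probability into [eps, 1 - eps] so log() never sees 0 or 1.
    total = 0.0
    for y, p in zip(y_true, y_pred_proba):
        p = min(max(p, eps), 1.0 - eps)
        total += y * math.log(p) + (1 - y) * math.log(1.0 - p)
    return -total / len(y_true)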

5. train(X, y, lr, n_epochs)

Train logistic regression.

  • Input: Features X, labels y, learning rate, epochs
  • Output: (weights, bias, loss_history)
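
A batch gradient-descent sketch built on the pieces above (zero initialization and recording the loss before each update are assumptions the spec leaves open):

def train(X, y, lr, n_epochs):
    n, d = len(X), len(X[0])
    weights, bias = [0.0] * d, 0.0
    loss_history = []
    for _ in range(n_epochs):
        probs = predict_proba(X, weights, bias)
        loss_history.append(compute_loss(y, probs))
        errors = [p - t for p, t in zip(probs, y)]  # probs - y
        # dL/dw_j = (1/n) * sum_i X[i][j] * errors[i]; dL/db = mean(errors)
        grad_w = [sum(X[i][j] * errors[i] for i in range(n)) / n for j in range(d)]
        grad_b = sum(errors) / n
        weights = [w - lr * g for w, g in zip(weights, grad_w)]
        bias -= lr * grad_b
    return weights, bias, loss_history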

Examples

# Simple classification
X = [[1], [2], [3], [4]]
y = [0, 0, 1, 1]

weights, bias, losses = train(X, y, lr=0.5, n_epochs=100)
probs = predict_proba(X, weights, bias)
preds = predict(X, weights, bias)
# preds ≈ [0, 0, 1, 1]

Notes

  • Use sigmoid to convert scores to probabilities
  • Gradients: dL/dw = (1/n) * X^T @ (probs - y); dL/db = (1/n) * sum(probs - y)
  • Clip probabilities to avoid log(0)
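
Quick sanity check of the gradient on the example data: with weights and bias at zero, every probability is 0.5, so probs - y = [0.5, 0.5, -0.5, -0.5]. Then dL/dw = (1*0.5 + 2*0.5 + 3*(-0.5) + 4*(-0.5)) / 4 = -0.5 and dL/db = 0, so the first update increases the weight, moving toward classifying larger x as 1.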