Loss Functions

easy · loss, mse, cross-entropy

Implement common loss functions used in machine learning.

Functions to implement

1. mse_loss(y_true, y_pred)

Compute Mean Squared Error loss.

  • Input: Two lists of numbers (true values and predictions)
  • Output: Average of squared differences
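
A minimal sketch, assuming both lists are the same nonzero length:

def mse_loss(y_true, y_pred):
    # Mean of squared differences between targets and predictions
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)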

2. mae_loss(y_true, y_pred)

Compute Mean Absolute Error loss.

  • Input: Two lists of numbers
  • Output: Average of absolute differences
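
A matching sketch under the same equal-length assumption:

def mae_loss(y_true, y_pred):
    # Mean of absolute differences between targets and predictions
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)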

3. binary_cross_entropy(y_true, y_pred, eps=1e-15)

Compute binary cross-entropy loss.

  • Input: Lists of true labels (0 or 1) and predicted probabilities
  • Output: Average cross-entropy loss
  • Use eps to avoid log(0)
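
One way to write it; the clipping step mirrors the eps note above:

import math

def binary_cross_entropy(y_true, y_pred, eps=1e-15):
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)  # clip so log(0) never happens
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)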

4. categorical_cross_entropy(y_true, y_pred, eps=1e-15)

Compute categorical cross-entropy loss for multi-class classification.

  • Input: Lists of one-hot vectors (y_true) and probability distributions (y_pred)
  • Output: Average cross-entropy loss
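
A possible sketch, assuming y_true and y_pred are lists of equal-length inner lists (one per sample):

import math

def categorical_cross_entropy(y_true, y_pred, eps=1e-15):
    total = 0.0
    for true_vec, pred_vec in zip(y_true, y_pred):
        # Cross-entropy of one sample: -sum over classes of t * log(p)
        total += -sum(t * math.log(min(max(p, eps), 1.0))
                      for t, p in zip(true_vec, pred_vec))
    return total / len(y_true)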

5. softmax(logits)

Convert logits to probabilities.

  • Input: A list of numbers (logits)
  • Output: A list of probabilities that sum to 1
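
One way to write it; subtracting the max logit before exponentiating is a standard numerical-stability trick, not something the spec requires:

import math

def softmax(logits):
    m = max(logits)  # shift by the max logit so math.exp never overflows
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]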

Examples

mse_loss([1, 2, 3], [1, 2, 4])       # 0.333...
mae_loss([1, 2, 3], [1, 2, 4])       # 0.333...
binary_cross_entropy([1, 0], [0.9, 0.1])  # 0.105...
softmax([1, 2, 3])                   # [0.090, 0.245, 0.665]

Notes

  • Clip predictions in cross-entropy to avoid log(0)
  • Use math.log for natural logarithm
  • Use math.exp for exponential