Linear Algebra Essentials

Vectors

  • Ordered list of numbers
  • Represent data points, weights, directions
  • Operations: addition, scalar multiplication, dot product
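
A minimal sketch of these operations, using NumPy as an assumed (not prescribed) library:

  import numpy as np

  a = np.array([1.0, 2.0, 3.0])
  b = np.array([4.0, 5.0, 6.0])

  print(a + b)     # elementwise addition -> [5. 7. 9.]
  print(2.0 * a)   # scalar multiplication -> [2. 4. 6.]
  print(a @ b)     # dot product: 1*4 + 2*5 + 3*6 = 32.0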

Dot Product

  • a · b = sum(a[i] * b[i])
  • Measures similarity/alignment
  • Core of neural network computations
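
To make the similarity point concrete, a small sketch assuming NumPy; the cosine_similarity helper is illustrative, not part of these notes:

  import numpy as np

  def cosine_similarity(a, b):
      # 1.0 when vectors point the same way, 0.0 when orthogonal, -1.0 when opposite
      return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

  print(cosine_similarity(np.array([1.0, 0.0]), np.array([1.0, 0.0])))  # 1.0
  print(cosine_similarity(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # 0.0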

Norms

  • L2: sqrt(sum of squares) - Euclidean distance
  • L1: sum of absolute values - Manhattan distance
  • Used in loss functions and regularization
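
A quick check of both norms, assuming NumPy; np.linalg.norm gives the same values as the explicit formulas:

  import numpy as np

  x = np.array([3.0, -4.0])

  l2 = np.sqrt(np.sum(x ** 2))   # 5.0  (Euclidean length)
  l1 = np.sum(np.abs(x))         # 7.0  (Manhattan length)

  assert np.isclose(l2, np.linalg.norm(x, ord=2))
  assert np.isclose(l1, np.linalg.norm(x, ord=1))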

Matrices

  • 2D arrays, dimensions: rows × columns
  • Datasets, transformations, weight matrices
  • Transpose: swap rows and columns
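
A small sketch of a matrix as a 2D array and its transpose (NumPy assumed; the values are arbitrary):

  import numpy as np

  A = np.array([[1, 2, 3],
                [4, 5, 6]])   # shape (2, 3): 2 rows, 3 columns

  print(A.shape)   # (2, 3)
  print(A.T)       # transpose, shape (3, 2): rows and columns swapped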

Matrix Multiplication

  • (m×n) × (n×p) = (m×p)
  • Inner dimensions must match
  • Not commutative: AB ≠ BA in general
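
A sketch of the shape rule and non-commutativity (NumPy assumed; the matrices are arbitrary):

  import numpy as np

  A = np.random.rand(2, 3)   # (m×n) = (2×3)
  B = np.random.rand(3, 4)   # (n×p) = (3×4)

  C = A @ B
  print(C.shape)             # (2, 4): the inner dimension 3 must match and drops out

  X = np.array([[1, 2], [3, 4]])
  Y = np.array([[0, 1], [1, 0]])
  print(np.allclose(X @ Y, Y @ X))   # False: order matters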

Special Matrices

  • Identity: 1s on diagonal, 0s elsewhere; neutral for multiplication (AI = IA = A)
  • Symmetric: A = A^T (covariance matrices)
  • Diagonal: only diagonal elements non-zero
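
A sketch checking each special-matrix property (NumPy assumed; the example matrices are arbitrary):

  import numpy as np

  I = np.eye(3)                     # identity: 1s on the diagonal, 0s elsewhere
  A = np.array([[2.0, 1.0, 0.0],
                [0.0, 3.0, 4.0],
                [5.0, 0.0, 6.0]])
  print(np.allclose(A @ I, A))      # True: identity is neutral for multiplication

  S = A + A.T                       # one way to build a symmetric matrix
  print(np.allclose(S, S.T))        # True: S = S^T

  D = np.diag([1.0, 2.0, 3.0])      # diagonal: only diagonal elements non-zero
  print(D)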

Eigenvalues/Eigenvectors

  • A · v = λ · v for some nonzero vector v
  • Directions the transformation only scales (by λ), never rotates
  • Foundation for PCA
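
A minimal sketch verifying A · v = λ · v with np.linalg.eig, using a small symmetric matrix like the covariance matrices PCA works with (the matrix itself is made up):

  import numpy as np

  A = np.array([[2.0, 1.0],
                [1.0, 2.0]])              # symmetric, so eigenvalues are real

  eigvals, eigvecs = np.linalg.eig(A)     # columns of eigvecs are the eigenvectors

  for i in range(len(eigvals)):
      v = eigvecs[:, i]
      lam = eigvals[i]
      print(np.allclose(A @ v, lam * v))  # True: the direction of v is only scaled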