Autograd Value: Build the Graph

medium · autograd, computational-graph, gradients

Implement the scalar Value class that builds a computational graph during the forward pass.

What you are building

Each Value stores:

  • data: float
  • grad: float (initially 0.0)
  • _prev: set[Value] (parents)
  • _op: str (debug label)
  • _backward: Callable[[], None] (initially a no-op)

The forward pass should create new Value objects and wire their _prev sets.
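The field list above can be sketched as a minimal class skeleton (a sketch of the layout described, not the reference solution):

```python
class Value:
    """Scalar node in a computational graph."""

    def __init__(self, data, _children=(), _op=''):
        self.data = float(data)        # stored payload as a float
        self.grad = 0.0                # gradient, stays 0.0 until backward exists
        self._backward = lambda: None  # no-op placeholder closure
        self._prev = set(_children)    # parent Values that produced this node
        self._op = _op                 # human-readable debug label

v = Value(3)  # accepts an int, stores a float
```

Note that `_prev` is a set of `Value` objects; the default object identity hash makes this work without defining `__hash__`.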

Methods to implement

1) __init__(self, data, _children=(), _op='')

  • Store data as a float
  • Initialize grad = 0.0
  • _backward = lambda: None
  • _prev = set(_children)
  • _op = _op

2) Arithmetic ops

Implement these to return a new Value:

  • __add__, __mul__, __pow__, __neg__, __sub__, __truediv__

Rules:

  • Accept Value or Python numbers (int/float)
  • Convert numbers to Value
  • Set _op to a readable string (e.g., "+", "*", f"**{n}")
  • Set _backward to propagate gradients (use +=)
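One way the pattern might look for __add__ and __mul__ (a sketch assuming the field layout above; the closure name `_backward` shadows nothing and is an implementation choice):

```python
class Value:
    def __init__(self, data, _children=(), _op=''):
        self.data = float(data)
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)
        self._op = _op

    def __add__(self, other):
        # Coerce plain numbers so Value(2.0) + 3 builds a graph node too
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other), '+')

        def _backward():
            # += so a node reused elsewhere in the graph accumulates
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other), '*')

        def _backward():
            # Product rule: each parent's grad scales by the other's data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

a, b = Value(2.0), Value(-3.0)
c = a * b + 1.0
```

The closures capture `self`, `other`, and `out`, so each node knows how to push its own gradient to its parents once a full backward pass exists.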

3) Reverse ops

Implement:

  • __radd__, __rmul__, __rsub__, __rtruediv__

These are called when the left operand is a Python number.
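The reverse ops can simply delegate to the forward ops, since those already coerce numbers. A partial sketch (assuming __add__, __mul__, and __neg__ as above; __rtruediv__ follows the same delegation pattern once __pow__ exists):

```python
class Value:
    def __init__(self, data, _children=(), _op=''):
        self.data = float(data)
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)
        self._op = _op

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), '+')

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other), '*')

    def __neg__(self):
        return self * -1

    # Python calls these when the left operand is a plain number
    def __radd__(self, other):  # other + self
        return self + other

    def __rmul__(self, other):  # other * self
        return self * other

    def __rsub__(self, other):  # other - self
        return (-self) + other

x = Value(4.0)
y = 10.0 - x  # invokes __rsub__, since float.__sub__ can't handle Value
```

Because addition and multiplication are commutative, __radd__ and __rmul__ just flip the operands; __rsub__ negates first.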

Notes

  • This problem does not implement backward() yet.
  • Gradients remain 0.0 after forward-only graphs.
  • Use += in _backward because a node can be used multiple times.
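The third note can be demonstrated by wiring a node into the graph twice and invoking its closure by hand (seeding out.grad manually here only stands in for the backward() pass that a later problem adds):

```python
class Value:
    def __init__(self, data, _children=(), _op=''):
        self.data = float(data)
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)
        self._op = _op

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other), '+')

        def _backward():
            self.grad += out.grad   # += accumulates repeated uses
            other.grad += out.grad
        out._backward = _backward
        return out

a = Value(2.0)
d = a + a      # a appears twice as a parent of d
d.grad = 1.0   # seed, as if backward() had reached d
d._backward()  # a.grad gets two contributions of 1.0 each
```

With plain `=` instead of `+=`, the second contribution would overwrite the first and a.grad would end up at 1.0 instead of 2.0.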

Example

a = Value(2.0)
b = Value(-3.0)
c = Value(10.0)

# Forward pass builds the graph
f = (a * b + c) ** 2

assert f.data == 16.0
assert a in (a * b)._prev

Hints

  • Division can be implemented as self * other**-1.
  • Subtraction can be implemented as self + (-other).
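Both hints amount to defining the new ops in terms of ones that already install their own _backward closures, so no extra gradient code is needed. A sketch assuming __add__, __mul__, and a numeric-exponent __pow__:

```python
class Value:
    def __init__(self, data, _children=(), _op=''):
        self.data = float(data)
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)
        self._op = _op

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), '+')

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other), '*')

    def __pow__(self, other):
        # Restricting exponents to plain numbers keeps the gradient simple
        assert isinstance(other, (int, float))
        return Value(self.data ** other, (self,), f'**{other}')

    def __neg__(self):              # -self
        return self * -1

    def __sub__(self, other):       # self - other == self + (-other)
        return self + (-other)

    def __truediv__(self, other):   # self / other == self * other**-1
        return self * other ** -1

a, b = Value(8.0), Value(2.0)
```

Deriving __sub__ and __truediv__ this way means their gradients fall out of the add, mul, and pow closures for free once those are complete.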