Learn how neural networks work from the ground up! Build, train, and visualize your own networks with automatic differentiation (autograd) - just like PyTorch and TensorFlow!
Micrograd is a tiny autograd engine that implements backpropagation (reverse-mode autodiff) over a dynamically built DAG (Directed Acyclic Graph). It's like a super-simplified version of PyTorch's autograd!
Autograd (automatic differentiation) computes gradients for you. Every math operation you perform is recorded in a computation graph, and calling backward() then "backpropagates" through that graph to find how much each input affects the output. Those values are the gradients!
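To make that concrete, here is a minimal sketch of how a Value node could track the graph and backpropagate through it. It uses the Value, add, mul, grad, and backward names from the example below; the internal names (data, _children, _backward) are illustrative assumptions, not necessarily how micrograd implements them:

```javascript
// Minimal sketch of an autograd Value node (illustrative, not the full micrograd class).
class Value {
  constructor(data, children = []) {
    this.data = data;           // the scalar this node holds
    this.grad = 0;              // d(output)/d(this), filled in by backward()
    this._children = children;  // nodes this value was computed from
    this._backward = () => {};  // how to push gradients to the children
  }

  add(other) {
    const out = new Value(this.data + other.data, [this, other]);
    out._backward = () => {
      this.grad += out.grad;    // d(a+b)/da = 1
      other.grad += out.grad;   // d(a+b)/db = 1
    };
    return out;
  }

  mul(other) {
    const out = new Value(this.data * other.data, [this, other]);
    out._backward = () => {
      this.grad += other.data * out.grad; // d(a*b)/da = b
      other.grad += this.data * out.grad; // d(a*b)/db = a
    };
    return out;
  }

  backward() {
    // Topologically sort the graph, then apply the chain rule node by node.
    const topo = [];
    const visited = new Set();
    const build = (v) => {
      if (!visited.has(v)) {
        visited.add(v);
        v._children.forEach(build);
        topo.push(v);
      }
    };
    build(this);
    this.grad = 1; // d(output)/d(output) = 1
    topo.reverse().forEach((v) => v._backward());
  }
}
```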
Click through these examples to see micrograd in action:
```javascript
const a = new Value(2.0);
const b = new Value(3.0);
const c = a.mul(b).add(new Value(1));

// Compute gradients
c.backward();

console.log(a.grad); // How much 'a' affects 'c'
console.log(b.grad); // How much 'b' affects 'c'
```
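Here c = a*b + 1 = 7, so backward() gives a.grad = 3 (because dc/da = b) and b.grad = 2 (because dc/db = a).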