Micrograd Neural Network Lab

Learn how neural networks work from the ground up! Build, train, and visualize your own networks with automatic differentiation (autograd), the same core mechanism behind PyTorch and TensorFlow!

Welcome to Micrograd

Micrograd is a tiny autograd engine that implements backpropagation (reverse-mode autodiff) over a dynamically built DAG (Directed Acyclic Graph). It's like a super-simplified version of PyTorch's autograd!

What is Autograd?

Autograd (automatic differentiation) computes gradients for you. As you perform math operations, it records each one in a computation graph. It can then "backpropagate" through that graph to find how much each input affects the output - those derivatives are the gradients!
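
To make that concrete, here is a minimal sketch of a Value class in the style of micrograd. The lab's actual implementation may differ; the class name Value matches the examples on this page, but the internal names (_children, _backward) are assumptions for illustration. Each operation records its inputs and a small chain-rule step; backward() replays those steps from the output back to the inputs.

class Value {
  constructor(data, children = []) {
    this.data = data;            // the scalar stored at this node
    this.grad = 0;               // d(output)/d(this), filled in by backward()
    this._children = children;   // the nodes this value was computed from
    this._backward = () => {};   // local chain-rule step for this node's op
  }

  add(other) {
    const out = new Value(this.data + other.data, [this, other]);
    out._backward = () => {
      // d(out)/d(this) = 1 and d(out)/d(other) = 1
      this.grad += out.grad;
      other.grad += out.grad;
    };
    return out;
  }

  mul(other) {
    const out = new Value(this.data * other.data, [this, other]);
    out._backward = () => {
      // d(out)/d(this) = other.data and d(out)/d(other) = this.data
      this.grad += other.data * out.grad;
      other.grad += this.data * out.grad;
    };
    return out;
  }

  backward() {
    // Topologically sort the graph, then run each node's chain-rule
    // step in reverse order, from the output back to the inputs.
    const topo = [];
    const visited = new Set();
    const build = (v) => {
      if (visited.has(v)) return;
      visited.add(v);
      v._children.forEach(build);
      topo.push(v);
    };
    build(this);
    this.grad = 1; // d(output)/d(output) = 1
    for (const v of topo.reverse()) v._backward();
  }
}

Note that gradients are accumulated with +=, so a value used in several places collects contributions from each path through the graph.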

Interactive Examples

Click through these examples to see micrograd in action:

Example Code
const a = new Value(2.0);
const b = new Value(3.0);
const c = a.mul(b).add(new Value(1.0)); // c = a*b + 1 = 7

// Compute gradients via backpropagation
c.backward();

console.log(a.grad); // dc/da = b = 3 (how much 'a' affects 'c')
console.log(b.grad); // dc/db = a = 2 (how much 'b' affects 'c')
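
With gradients in hand, one step of learning is just nudging each input against its gradient to decrease c. This is the heart of every training loop. A minimal sketch, continuing from the example above; the learning rate of 0.01 is an arbitrary choice for illustration:

// One gradient-descent step: move inputs against their gradients to lower c.
const lr = 0.01;       // learning rate (arbitrary, for illustration)
a.data -= lr * a.grad; // 2.0 - 0.01 * 3 = 1.97
b.data -= lr * b.grad; // 3.0 - 0.01 * 2 = 2.98
// Recomputing c = a*b + 1 with the new values gives ~6.87, down from 7.

In a full training loop you would also reset each .grad to 0 before the next backward() call, since gradients accumulate across calls.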