All Tools

Recursion & Call Stack Visualizer

Step through recursive functions one call at a time. Watch the call stack grow and shrink, trace the recursion tree, and see how memoization transforms exponential algorithms into linear ones.

Reference Guide

How Recursion Works

A recursive function calls itself with a smaller version of the same problem. Every recursive function needs two parts: a base case that stops the recursion, and a recursive case that breaks the problem into smaller pieces.

When a function calls itself, the computer pushes a new frame onto the call stack. Each frame stores the function's parameters and local variables. When the base case is reached, frames start popping off the stack as return values flow back up.

Factorial is the classic example: factorial(5) calls factorial(4), which calls factorial(3), and so on until factorial(1) returns 1. Then each waiting frame multiplies by its own n and returns.
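The walk-through above can be sketched in a few lines of Python (the language is an assumption here; the idea is the same in any language). The indentation of the trace mirrors the stack depth, so the push-then-pop pattern is visible in the output:

```python
def factorial(n, depth=0):
    # Indent the trace by the current stack depth so the
    # grow-then-shrink pattern of the call stack is visible.
    print("  " * depth + f"call factorial({n})")
    if n <= 1:                      # base case: answered directly
        result = 1
    else:                           # recursive case: shrink n by 1
        result = n * factorial(n - 1, depth + 1)
    print("  " * depth + f"return {result}")
    return result

factorial(5)  # returns 120 after 5 nested calls
```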

Call Stack and Stack Frames

The call stack is a data structure that tracks which function is currently running and which functions are waiting for results. Each entry is called a stack frame.

A stack frame holds the function name, its arguments, any local variables, and the return address (where to continue when this function finishes). Frames follow Last-In-First-Out (LIFO) order.

Stack overflow errors happen when recursion goes too deep and the call stack runs out of memory. For factorial(5), the maximum stack depth is 5 frames. For fibonacci(7) without memoization, the maximum depth is 7, but it creates 41 total calls.
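Those counts can be reproduced with a small instrumented sketch (Python assumed; the `stats` dictionary is a hypothetical helper for counting, not part of the tool). It tracks every call and the deepest stack level reached:

```python
def fib(n, depth=1, stats=None):
    # Record every call and the maximum stack depth reached.
    if stats is None:
        stats = {"calls": 0, "max_depth": 0}
    stats["calls"] += 1
    stats["max_depth"] = max(stats["max_depth"], depth)
    if n < 2:                       # base cases: fib(0)=0, fib(1)=1
        return n, stats
    a, _ = fib(n - 1, depth + 1, stats)
    b, _ = fib(n - 2, depth + 1, stats)
    return a + b, stats

value, stats = fib(7)
print(value, stats)  # 13 {'calls': 41, 'max_depth': 7}
```

The deepest chain is fib(7) → fib(6) → … → fib(1), which is where the maximum depth of 7 comes from, even though 41 calls happen in total.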

Base Case vs Recursive Case

The base case is the condition that stops recursion. Without it, the function would call itself forever (causing a stack overflow). Good base cases handle the simplest possible input directly.

For factorial, the base case is n = 0 or n = 1 (both return 1). For fibonacci, the base cases are n = 0 (returns 0) and n = 1 (returns 1). For binary search, the base case is when the search range is empty (lo > hi).

The recursive case must always make progress toward the base case. Factorial decreases n by 1 each call. Binary search halves the search range. Merge sort halves the array. If your recursive case does not shrink the problem, recursion will never terminate.
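Binary search shows both parts side by side: an empty range as the base case, and a halved range as the recursive case. A minimal recursive sketch (Python assumed):

```python
def binary_search(arr, target, lo=0, hi=None):
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:                     # base case: empty range, target absent
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:           # recursive case: search right half
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid - 1)  # search left half

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```

Because each call halves the range, the recursion is guaranteed to reach the lo > hi base case, and the stack depth is only O(log n).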

Memoization and Dynamic Programming

Memoization stores the results of expensive function calls and returns the cached result when the same inputs occur again. It is the top-down approach to dynamic programming.

Naive fibonacci is the best example. Without memoization, fib(7) makes 41 recursive calls because it recomputes the same subproblems many times. With memoization, it makes only 13 calls because each unique subproblem is computed once and cached.
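The 13-call figure can be verified with a memoized sketch (Python assumed; the `stats` counter is illustrative, not part of the tool). A cache hit returns immediately and prunes the entire subtree beneath that call:

```python
def fib(n, memo, stats):
    stats["calls"] += 1
    if n in memo:                   # cache hit: prunes the whole subtree
        return memo[n]
    if n < 2:                       # base cases: fib(0)=0, fib(1)=1
        memo[n] = n
    else:
        memo[n] = fib(n - 1, memo, stats) + fib(n - 2, memo, stats)
    return memo[n]

stats = {"calls": 0}
print(fib(7, {}, stats), stats["calls"])  # 13 13
```

(It is a coincidence that fib(7) and the memoized call count are both 13; without the memo dictionary the same computation takes 41 calls.)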

Memoization transforms the time complexity from O(2^n) to O(n) for fibonacci. Toggle the memoization checkbox to see the recursion tree shrink dramatically as cached results prune entire subtrees.