Algorithms & Big-O Notation Cheat Sheet
A printable reference covering algorithm steps, Big-O notation, time complexity, space complexity, loops, recursion, and common growth rates for grades 10-12.
Algorithms are step-by-step procedures for solving problems, and Big-O notation describes how their resource use grows as input size increases. This cheat sheet helps students compare algorithms without focusing on a specific computer or programming language. It is useful for coding interviews, data structures, sorting, searching, and understanding why one solution may scale better than another. The most important ideas are time complexity, space complexity, input size n, and worst-case growth. Common Big-O classes include O(1), O(log n), O(n), O(n log n), O(n^2), O(2^n), and O(n!). Students should learn how loops, nested loops, recursion, and data structure operations affect complexity.
Key Facts
- Big-O notation gives an upper bound on growth rate, so O(n^2) means the work grows no faster than a constant times n^2 for large n.
- Constant time is O(1), which means the number of operations does not depend on input size n.
- A single loop that runs n times is usually O(n), such as for i = 1 to n.
- Two nested loops that each run n times are usually O(n^2), because n times n = n^2 operations.
- Binary search is O(log n) because each step cuts the remaining search space roughly in half.
- Efficient comparison sorting algorithms such as merge sort and heap sort run in O(n log n) time even in the worst case.
- Recursive algorithms often have time complexity based on a recurrence, such as T(n) = 2T(n/2) + O(n) for merge sort.
- Big-O ignores constant factors and lower-order terms, so a running time of 3n^2 + 10n + 5 simplifies to O(n^2).
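The binary search fact above can be checked by counting halving steps directly. A minimal Python sketch (the function name is illustrative, not part of the sheet):

```python
def binary_search_steps(n):
    """Count how many times n items can be halved until one remains: about log2(n) -> O(log n)."""
    steps = 0
    while n > 1:
        n //= 2  # each step cuts the remaining search space roughly in half
        steps += 1
    return steps

print(binary_search_steps(1024))  # 10, since 2^10 = 1024
print(binary_search_steps(8))     # 3, since 2^3 = 8
```

Doubling the input adds only one step, which is why O(log n) algorithms scale so well.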
Vocabulary
- Algorithm
- An algorithm is a finite set of clear steps used to solve a problem or complete a task.
- Big-O Notation
- Big-O notation describes the upper-bound growth rate of an algorithm as input size becomes large.
- Time Complexity
- Time complexity measures how the number of operations an algorithm performs grows with input size.
- Space Complexity
- Space complexity measures how much extra memory an algorithm uses as input size grows.
- Input Size
- Input size, usually written as n, is the amount of data the algorithm must process.
- Recurrence Relation
- A recurrence relation is an equation that describes the running time of a recursive algorithm in terms of smaller inputs.
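The recurrence T(n) = 2T(n/2) + O(n) from the Key Facts can be seen directly in merge sort's structure: two recursive calls on halves, plus a linear-time merge. A hedged Python sketch (helper names are my own):

```python
def merge_sort(items):
    """Sort a list. The recurrence T(n) = 2T(n/2) + O(n) appears in the code below."""
    if len(items) <= 1:               # base case: O(1)
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # T(n/2): sort the left half
    right = merge_sort(items[mid:])   # T(n/2): sort the right half
    return merge(left, right)         # O(n): linear-time merge

def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])   # append any leftovers from either half
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 9, 1, 7, 3]))  # [1, 2, 3, 5, 7, 9]
```

Solving that recurrence gives the O(n log n) running time listed under Key Facts: the recursion is log n levels deep, and each level does O(n) total merge work.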
Common Mistakes to Avoid
- Keeping constant factors in Big-O, such as writing O(2n) instead of O(n), is wrong because Big-O focuses on long-term growth rate.
- Combining loop costs incorrectly is wrong because the costs of sequential loops are added, while the costs of nested loops are multiplied.
- Assuming every recursive function is O(log n) is wrong because recursion depth and work per call both affect total complexity.
- Confusing best case with worst case is wrong because Big-O analysis usually describes an upper bound, often the worst-case behavior.
- Ignoring space complexity is a mistake because an algorithm can be fast but still use too much extra memory.
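The add-versus-multiply rule for loops can be illustrated by counting operations; a small sketch, assuming each loop body does one unit of work (function names are illustrative):

```python
def sequential_loops(n):
    """Two loops one after the other: n + n = 2n operations -> still O(n)."""
    ops = 0
    for _ in range(n):
        ops += 1
    for _ in range(n):
        ops += 1
    return ops

def nested_loops(n):
    """One loop inside another: n * n operations -> O(n^2)."""
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

print(sequential_loops(50))  # 100 = 50 + 50, linear growth
print(nested_loops(50))      # 2500 = 50 * 50, quadratic growth
```

Note that 2n is still O(n) after dropping the constant factor, which is exactly the simplification rule from Key Facts.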
Practice Questions
1. A loop runs from i = 1 to n and prints one value each time. What is the time complexity in Big-O notation?
2. An algorithm has running time 5n^2 + 20n + 100. Simplify this using Big-O notation.
3. A binary search starts with 1024 sorted items and cuts the remaining search space in half each step. About how many steps are needed in the worst case?
4. Two algorithms solve the same problem: Algorithm A is O(n^2), and Algorithm B is O(n log n). Explain which one is usually better for very large n and why.