This document provides an overview of algorithmic complexity and big O notation. It defines computational complexity as a measure of the resources an algorithm requires relative to the size of its input, and introduces big O notation as a way to describe the growth rate of an algorithm's runtime. Common time complexities are listed for tasks such as sorting, searching, and matrix operations. The document also warns of hidden complexity, showing how easily inefficient code can be written unintentionally, and presents benchmark results comparing naive and optimal implementations of algorithms such as cumulative sums and word counting.
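The cumulative-sum comparison mentioned above can be sketched as follows. This is an illustrative Python sketch, not the document's own benchmark code: the naive version resums the prefix at every position, giving quadratic time, while the optimal version carries a running total in a single linear pass.

```python
def cumulative_sum_naive(xs):
    # O(n^2): recomputes the entire prefix sum for every position
    return [sum(xs[: i + 1]) for i in range(len(xs))]


def cumulative_sum_running(xs):
    # O(n): one pass, carrying a running total
    out, total = [], 0
    for x in xs:
        total += x
        out.append(total)
    return out


data = [3, 1, 4, 1, 5]
print(cumulative_sum_naive(data))    # both produce the same prefix sums
print(cumulative_sum_running(data))
```

Both functions return identical results; only their growth rates differ, which is exactly the kind of hidden cost the document's benchmarks are meant to expose.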