Big-O notation gives you a way to describe how fast your code will run when given a large problem. It remains relevant because it doesn’t depend on how fast or slow your computer actually is. Computers get faster all the time, and they span a huge range of capabilities, from a simple microcontroller, to an average laptop, to the largest supercomputer. Big-O notation works on all of these because it doesn’t care what type of processor you have. It just looks at the steps needed to solve the problem and how they scale as the problem gets bigger.
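
As a rough sketch of what “counting steps” means (the function and variable names here are just for illustration, not from the episode), here is a linear search that tallies how many comparisons it makes. The count depends only on the size of the input, never on the machine running it, and doubling the input roughly doubles the worst-case count.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// Linear search that reports how many comparisons it performed.
// The comparison count depends only on the input size, not on
// how fast the processor happens to be.
std::size_t countComparisons(const std::vector<int>& values, int target)
{
    std::size_t comparisons = 0;
    for (int value : values)
    {
        ++comparisons;
        if (value == target)
        {
            break;
        }
    }
    return comparisons;
}

int main()
{
    // Doubling the problem size roughly doubles the worst-case step count.
    std::vector<int> smallInput(1'000, 0);
    std::vector<int> largeInput(2'000, 0);
    std::cout << countComparisons(smallInput, 1) << '\n';  // 1000 comparisons
    std::cout << countComparisons(largeInput, 1) << '\n';  // 2000 comparisons
}
```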

This episode also introduces two new terms, algorithms and asymptotes.

An algorithm is just the set of steps that a program takes to turn some input into some output. There are different ways that a problem can be solved, and some of them are much more efficient than others.
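
As one hypothetical illustration (not from the episode), here are two algorithms that produce the same answer. The first takes a number of additions proportional to n, while the second uses the closed-form formula and takes the same handful of steps no matter how large n gets.

```cpp
#include <cstdint>
#include <iostream>

// Adds 1 + 2 + ... + n one term at a time: the number of additions
// grows in direct proportion to n.
std::uint64_t sumByLoop(std::uint64_t n)
{
    std::uint64_t total = 0;
    for (std::uint64_t i = 1; i <= n; ++i)
    {
        total += i;
    }
    return total;
}

// Uses the formula n * (n + 1) / 2: always the same few steps
// regardless of how large n is.
std::uint64_t sumByFormula(std::uint64_t n)
{
    return n * (n + 1) / 2;
}

int main()
{
    std::cout << sumByLoop(1'000'000) << '\n';     // a million additions
    std::cout << sumByFormula(1'000'000) << '\n';  // same answer, a few steps
}
```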

An asymptote, in geometry, is a line that a curve converges toward. The curve may start out far from the line but gets closer and closer. It might even cross the line and drift farther away again for a while, but eventually it starts approaching the line once more and keeps getting closer. In the end, all the initial deviations fade away and the line dominates everything.
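
A simple example (my own, not from the episode): the curve f(x) = x + 1/x has the line y = x as an asymptote, because the gap between them shrinks toward zero as x grows.

```latex
f(x) = x + \frac{1}{x}, \qquad
f(x) - x = \frac{1}{x} \longrightarrow 0 \quad \text{as } x \to \infty .
```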

That’s how Big-O notation works. As the problem gets bigger, the speed of one computer vs. another becomes less and less of an issue until only the line remains. That line represents the steps the code takes to solve the problem. And some approaches to the problem will have asymptotic lines that are much faster than others once the problem size becomes large enough.
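
As a made-up numeric illustration of that last point: suppose a fast machine runs a quadratic algorithm while a machine 1000 times slower runs a linear one. The slower machine does about 1000n units of work versus n² units, so once n passes 1000 the linear algorithm wins no matter how much faster the other computer is.

```latex
\underbrace{n^2}_{\text{fast machine, quadratic algorithm}}
\;>\;
\underbrace{1000\,n}_{\text{1000x slower machine, linear algorithm}}
\quad \text{whenever } n > 1000 .
```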