Big-O notation gives you a way to describe how fast your code will run when given a large problem. It stays relevant because it doesn’t depend on how fast or slow your computer actually is. Computers get faster all the time, and they span a huge range of capabilities, from a simple microcontroller, to an average laptop, to the largest supercomputer. Big-O notation works on all of them because it doesn’t care what type of processor you have. It just looks at the steps needed to solve the problem and how they scale as the problem gets bigger.

This episode also introduces two new terms: algorithms and asymptotes.

An algorithm is just the set of steps that a program takes to turn some input into some output. Problems can be solved in different ways, and some of those ways are much more efficient than others.
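To make the input-to-output idea concrete, here is a small sketch (my own example, not from the episode) of two algorithms that produce the same output from the same input but take very different numbers of steps:

```python
# Two algorithms for the same problem: summing the numbers 1 through n.

def sum_by_loop(n):
    # Takes n steps: one addition per number.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_by_formula(n):
    # Takes a constant number of steps no matter how big n gets,
    # using the closed-form formula n * (n + 1) / 2.
    return n * (n + 1) // 2

print(sum_by_loop(100))     # 5050
print(sum_by_formula(100))  # 5050
```

Both produce the same answer, but the first one's step count grows with the problem size while the second one's does not. That difference in how the steps scale is what Big-O notation captures.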

In geometry, an asymptote is a line that describes what some curve converges upon. The curve may start out far from the line but gets closer. The curve might even cross the line and, for a while, get farther away again, but it will eventually start approaching the line once more. It will keep getting closer. Eventually all the initial deviations fade away and the line dominates everything.

That’s how Big-O notation works. As the problem gets bigger, the speed of one computer vs. another becomes less and less of an issue until only the line remains. The line represents the steps the code takes to solve the problem. And some approaches to a problem have asymptotic lines that are much faster than others once the problem size becomes large enough.
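Here is one way to sketch that effect in code, using entirely hypothetical numbers of my own: count abstract time units instead of seconds, and pit a computer that is 1000 times slower per step but runs an n·log₂(n) algorithm against a fast computer running an n² algorithm.

```python
import math

# Hypothetical cost model: the "fast" computer spends 1 time unit per
# step on a quadratic algorithm; the "slow" computer spends 1000 time
# units per step on an n * log2(n) algorithm.

def fast_computer_time(n):
    return n * n                      # n^2 steps at 1 unit each

def slow_computer_time(n):
    return 1000 * n * math.log2(n)    # n*log2(n) steps at 1000 units each

for n in (10, 1_000, 1_000_000):
    winner = "fast" if fast_computer_time(n) < slow_computer_time(n) else "slow"
    print(f"n={n}: the {winner} computer wins")
```

At small sizes the raw machine speed wins, but by a million items the slower machine's better algorithm dominates. That is the "only the line remains" effect in miniature.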

Listen to the full episode or read the full transcript below.

## Transcript

Think of it like this. A car has a specific top speed that depends on the type of car and what modifications have been made to it. A computer is like a car in this sense because the speed at which a computer can run code depends on the type of processor and what other modifications have been made. Big-O notation is not concerned with how fast any specific car or computer can go but with how fast your code can go.

The speed of a car also depends on the skill of the driver. This doesn’t mean that a racing champion will win a race in an old clunker. It’s more that, given any specific car, a racing champion will be able to drive that car faster around a course than somebody with less skill. Your code is more like the driver of a car than the car itself because it can run on different computers just like a driver can drive different cars. Asking how long it will take a champion driver to complete a course when the type and condition of the car can change is meaningless.

Here’s where programming has an answer. With Big-O notation, asking how long it will take to run your code actually means something even without knowing anything at all about the speed or capabilities of the computer. You can compare code that was written 50 years ago with code from today and get value from that comparison. And you can do this even though a handheld calculator today has more power than the best computers back then.

How is it possible to ignore the speed of a computer and focus on just what the code does? It comes down to how we examine an algorithm. And let me stop here for a moment. An algorithm is not the same thing as a logarithm even though both words end with a “rithm” sound. An algorithm is just a fancy word for how something is done. In other words, what steps need to be taken to solve a problem. An algorithm is also concerned with what input is provided, and by solving the problem, it produces some form of output. Okay, so when examining an algorithm, we can ignore specific factors such as the speed of the computer because we take the problem to the largest size imaginable. This is called a limit, and studying problems at this scale means we’re studying asymptotic efficiency.

An asymptote in geometry is a line that a curve gets closer and closer to as the curve and the line stretch out to infinity. The curve might sometimes cross the line and momentarily get a little farther away. But it will eventually get closer to the line again. If not, then the line doesn’t represent an asymptote. The line becomes the limit of the curve. What this means is that the running time of an algorithm may be affected drastically by specifics, such as how fast the machine happens to be, when the problem size is small. But as the problem size becomes larger, the limit dominates everything and nothing else matters anymore.

Think about it this way. A mountain is big and heavy, right? Full of rock. Now imagine a computer big and fast enough to eat that mountain for breakfast. I know, computers don’t eat rocks. Never mind, this one does. That’s a fast computer, and compared to one from 50 years ago, the old computer would be lucky to finish a handful of sand. It seems like the speed of the computer really makes a difference here. But let’s take it to the limit. Because while a mountain is big, there are things bigger still. What if we gave it an entire continent? How about a small moon? The earth itself? The thing about lines and curves that stretch to infinity is they can get pretty far out there. If we take this to the limit of all the stars, planets, and matter in the universe, then the difference between the two computers becomes insignificant. What seemed like a huge difference at first ends up with both computers looking practically identical.

I have another example for you but before that, let’s take a moment to thank our sponsor.

( Message from Sponsor )

Alright, let’s look at this as if it were a race. In lane one, you have the current world’s fastest car. You know the type, bright colors, loud engine, and a whole crew that can change tires faster than you can sneeze. In lane two, you have an average compact car with a hatchback and an engine that needs a little extra encouragement. If this was a quarter mile race, then there would be no contest. And in fact, if both cars are running the same race, then of course the faster car will win. This is not what Big-O notation is all about.

Big-O notation lets you analyze how a problem is solved. So going back to the race, let’s say that this is a 500 mile race. Something that would take the race car a few hours to complete and the average car maybe three times longer. This would apply if both cars were running a linear race, where a mile is a mile and finishing requires all miles to be driven. Whether it takes 3 hours or 9 hours doesn’t matter. What matters is what happens when we take this to the limit. If it takes 3 hours for the fast car to travel 500 miles, then it should take 6 hours to go a thousand miles and 60 hours to go ten thousand miles.
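The linear case can be written down directly. Here is a tiny sketch using the episode's numbers (the function name is my own):

```python
def linear_race_hours(miles, hours_per_500=3.0):
    # In a linear race, time scales in direct proportion to distance:
    # 500 miles -> 3 hours, 1000 miles -> 6 hours, and so on.
    return miles * hours_per_500 / 500.0

print(linear_race_hours(500))     # 3.0
print(linear_race_hours(1_000))   # 6.0
print(linear_race_hours(10_000))  # 60.0
```

Doubling the distance doubles the time; multiplying it by 20 multiplies the time by 20. That straight-line scaling is what "linear" means in Big-O terms.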

But let’s say that the average car going at a slower speed has a very different set of rules that it follows. In other words, its code follows different steps. It has a special engine that, even though it might run slower, allows it to cover the first 250 miles in just the first lap. The second lap takes the slow car another 125 miles. Let’s say that it needs to slow down a bit as it approaches the finish line so it doesn’t miss the winning ceremony. The third lap takes the slow car about another 60 miles. Even considering some downtime, the slow car finishes all 500 miles in just 10 laps.
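Covering half of the remaining distance each lap is the same halving pattern behind logarithmic algorithms such as binary search. This quick sketch (my own, not from the episode) counts how many halvings it takes before less than a mile remains:

```python
def laps_to_finish(total_miles):
    # Each lap covers half of whatever distance remains, so the
    # remaining distance after k laps is total_miles / 2**k.
    # Count laps until less than one mile is left -- roughly
    # log2(total_miles) laps in total.
    remaining = total_miles
    laps = 0
    while remaining >= 1:
        remaining /= 2
        laps += 1
    return laps

print(laps_to_finish(500))        # 9 laps for a 500 mile race
print(laps_to_finish(1_000_000))  # 20 laps for a million miles
```

That lands close to the episode's figure of about 10 laps once you allow for the slowdowns near the finish. Notice how gently the lap count grows: multiplying the race length by 2,000 only doubles the number of laps.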

This is the essence of Big-O notation. It allows you to express how code runs by comparing vastly different rules that the code can follow. This is why it doesn’t matter how fast the computer is or how fast the car is. A car that can jump around the track will always win a race against a faster car if the track is long enough. You could have a race between a bicycle and a supersonic jet and the bicycle would win if the race was long enough and the bicycle could somehow jump around like I just described.
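The bicycle-versus-jet claim can be checked with a toy cost model where every number is my own assumption: give the jet a time that is linear in distance, and give the jumping bicycle one hour per lap with the halving rule, which works out to roughly log₂ of the distance in hours.

```python
import math

# Hypothetical cost model: the jet's time is linear in distance, while
# the "jumping" bicycle halves the remaining distance every lap, so its
# lap count grows like log2 of the distance.

def jet_hours(miles, mph=700):
    return miles / mph

def bicycle_hours(miles, hours_per_lap=1.0):
    return math.ceil(math.log2(miles)) * hours_per_lap

for miles in (1_000, 100_000, 10_000_000):
    winner = "jet" if jet_hours(miles) < bicycle_hours(miles) else "bicycle"
    print(f"{miles} miles: the {winner} wins")
```

On a thousand-mile race the jet wins easily, but stretch the race far enough and the bicycle's logarithmic growth overtakes the jet's linear growth, no matter how fast the jet flies.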

We’ll continue talking about Big-O notation tomorrow and I’ll explain several different standard running times that you’ll likely encounter when programming.
