Algorithm complexity — or how good is an algorithm?

Manon Jacquin
6 min read · May 6, 2019
pic by Paweł Czerwiński

This week I wanted to explain algorithm complexity to myself (and to you, reader!). This subject caught my attention last summer when I attended a ‘Girls Develop It’ meetup in San Francisco on algorithms.

What’s an algorithm — and why should we care?

Thank you, Dictionary. In other words, an algorithm describes a specific procedure for achieving a certain goal in a repeatable way. I’ve always felt that ‘algorithm’ is a pretty fancy word for something actually quite simple.

If we think about it, our lives are shaped around algorithms and we use them all the time. Take this daily life situation for instance:

  • open the fridge
  • take one egg
  • close the fridge
  • take a pan
  • turn on the gas
  • put the pan on the gas
  • break the egg
  • wait for a few minutes

We can safely call this a make-an-egg algorithm! We followed a specific set of rules, a procedure, in order to solve a problem: we were hungry and we wanted to cook an egg.

In computer science, we deal more with sequences of computational steps, and these steps are meant to achieve an input/output relationship:
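As a small sketch of that input/output relationship (the function and example are mine, not from the meetup), here is a procedure that always maps the same input to the same output:

```python
def sort_numbers(numbers):
    """Input: a list of numbers. Output: the same numbers, sorted."""
    result = list(numbers)
    # Insertion sort: a repeatable sequence of computational steps
    # that produces the same output for the same input, every time.
    for i in range(1, len(result)):
        current = result[i]
        j = i - 1
        while j >= 0 and result[j] > current:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = current
    return result

print(sort_numbers([3, 1, 2]))  # [1, 2, 3]
```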

A beautiful algorithm schema by me inspired by Khan Academy

An algorithm has, therefore, two main goals:

  • solve a problem
  • in a repeatable way

So far, so good.

But why do we care? Because, ultimately, we want our program to be fast! And for it to be fast, we need to build the best algorithm possible: the procedure that produces our result the fastest.

Great plan!! But how can I measure whether my algorithm is good enough?

Thanks to two complementary techniques:

  • Time efficiency: how long the algorithm takes
  • Space efficiency: how much space the algorithm uses

Time efficiency: drop off 50 kids with a school bus

  1. Drive bus with all kids to school
  2. Drop off all kids at once

In this example, we are saving time, but it requires you to have a bus. We have a lower time complexity but a higher space complexity.

Space efficiency: drop off 50 kids with a small car

  1. Drive a small car with two kids to school
  2. Drop off two kids
  3. Repeat until all kids are in school

Here it takes a long time but doesn’t require a bus. We have a higher time complexity and a lower space complexity.
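The bus-vs-car trade-off above can be sketched in a few lines of Python (the function name and numbers are mine, purely illustrative):

```python
def trips_needed(num_kids, capacity):
    """How many trips to school a vehicle of a given capacity needs."""
    return -(-num_kids // capacity)  # ceiling division

# Bus (capacity 50): one trip -> low time cost, high space cost.
bus_trips = trips_needed(50, 50)
# Small car (capacity 2): many trips -> high time cost, low space cost.
car_trips = trips_needed(50, 2)

print(bus_trips, car_trips)  # 1 25
```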

Mind=blown

Now that we know what efficiency means, let’s see how we can measure it.

Optimization — where efficiency matters

To be really fair, I am not yet really into optimization when I write code! I am more into making my program work, which is actually already a great victory to me when it does!!

Nevertheless, I know the day I’ll need to optimize my code will come soon (right?!). So, what is optimization?

Optimization is the act of designing your code so that your algorithms are both correct and efficient:

  • Correctness: you are able to solve the problem (aka: where I am now 🤓)
  • Efficiency: you do it quickly (time efficiency) and it does not take too much space (space complexity)

Correctness is pretty straightforward: if you solve the problem, you’re good. The difficulty appears when you need to deal with efficiency. Do we optimize to complete the task in the shortest time, or using the least amount of space? But, more to the point: how do I measure efficiency anyway??

Thanks to a technique called asymptotic analysis. That’s a pretty fancy word for something that, this time, is a little more complex to define than an algorithm!

Asymptotic Notation — Measuring computational complexity

The asymptotic notation is here to help us answer different questions:

  • Estimate how long a program will run.
  • Estimate the largest input that can reasonably be given to the program.
  • Compare the efficiency of different algorithms.
  • And finally, choose which algorithm to use!

Simply put, we use the asymptotic notation to better understand the computational complexity of an algorithm. And this computational complexity is expressed by something called Big O Notation.

Big O Notation — expressing computational complexity

Big O Notation is the language programmers use to describe the complexity of an algorithm. It is an approximation of how quickly space or time complexity grows relative to the input size. Often, developers talk about worst-case scenarios, meaning that they look at how the function behaves when n is very large.

I found it very hard to understand Big O without a specific example. Imagine that we are building an algorithm for hanging up laundry (I know, very useful).

Version 1: Drying my (clean) socks: socks.drop()

  1. Dump the socks on the floor

Here the only thing we are doing is dumping our socks on the floor. How long will this algorithm take?

Let’s assume that the act of dumping takes 2 seconds. It won’t matter if we have 2, 4, 6, or 100 socks: it will always take us 2 seconds to dump them on the floor. So if N is the number of socks, it takes N*0 + 2 = 2 seconds to drop the items, whatever N is.

How does this translate into Big O Notation? If Big O Notation looks at how the algorithm grows relative to its input size, here the amount of time will never change regardless of the input. The time will be constant:

Big O — Constant time
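As a toy sketch of constant time (the step counter is my own illustrative device), the dump algorithm performs the same number of steps no matter how many socks we own:

```python
def dump_socks(socks):
    """O(1): one dump action — no loop over the socks."""
    steps = 1  # the cost does not depend on len(socks)
    return steps

# Same cost for 2 socks or 100 socks:
print(dump_socks(["sock"] * 2))    # 1
print(dump_socks(["sock"] * 100))  # 1
```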

Version 2: Drying my socks — the normal way

  1. Dump the socks on the floor
  2. Pick up each sock and make one pile for men’s socks and one for women’s socks
  3. Pick up each sock from each pile and hang it up

Here our algorithm has more complexity because we have to iterate twice over our pile of socks:

Assuming that putting a sock in a pile takes 10 seconds and hanging it up takes 30 seconds, N socks take N*10 + N*30 = 40N seconds in total.

This time, the more socks we have, the more time it will take to hang them up. Because the algorithm grows relative to its input size, the amount of time will increase relative to the number of socks we have. The time will be linear:

Big O — Linear time
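The two-pass sock routine can be sketched as follows (the sock data and step counter are made up for illustration). Note that two passes over N socks is 2N steps, which Big O still calls linear, because constant factors are dropped:

```python
def hang_socks(socks):
    """O(N): two passes over the socks — 2N steps total."""
    steps = 0
    piles = {"men": [], "women": []}
    for sock in socks:            # first pass: sort into piles
        piles[sock["type"]].append(sock)
        steps += 1
    for pile in piles.values():   # second pass: hang each sock
        for sock in pile:
            steps += 1
    return steps

socks = [{"type": "men"}] * 3 + [{"type": "women"}] * 2
print(hang_socks(socks))  # 10 steps for 5 socks — grows with N
```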

We can visualize the rate of growth as such:

From Girls Develop It ❤

Expressing computational complexity takes practice. If you want to dig deeper check out Khan Academy.

That’s it for now!

Manon Jacquin

Fullstack Developer ~ Writer - Perpetual Adventurer