Let’s consider how much work this algorithm does in order to sort the array. What do we mean by work? Let’s count how many operations this algorithm performs! We’ll consider any time we compare two numbers to be one operation.

This algorithm makes a lot of comparisons using our two `for` loops! Given our sample data `[2, 7, -7, 20, 5]`, on our first pass we are going to make four comparisons. `2` will first be compared to `7`. `2` is smaller. Then `2` will be compared to `-7`. `-7` becomes our smallest number. We’ll then compare `-7` to `20` and then again to `5`. We’ll make four total comparisons on this pass through the data.

On our next pass, the `-7` has already been sorted, so we only have four items in our unsorted sublist. We’ll have to make three comparisons to find the smallest. On our third pass there will be two comparisons, and so on. When we add all of these up, for a list of size 5, that is a total of 10 comparisons.
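The passes above can be sketched in code. The original doesn’t include an implementation, so this `selection_sort` helper and its comparison counter are our own illustration of the process being described:

```python
def selection_sort(arr):
    """Sort arr in place with selection sort, returning the comparison count."""
    comparisons = 0
    for i in range(len(arr)):
        smallest = i
        # Scan the unsorted sublist for the smallest value.
        for j in range(i + 1, len(arr)):
            comparisons += 1  # comparing two numbers is one operation
            if arr[j] < arr[smallest]:
                smallest = j
        arr[i], arr[smallest] = arr[smallest], arr[i]
    return comparisons

data = [2, 7, -7, 20, 5]
count = selection_sort(data)
print(data)   # [-7, 2, 5, 7, 20]
print(count)  # 10 comparisons for a list of size 5 (4 + 3 + 2 + 1)
```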

While that doesn’t seem like a ton of comparisons, imagine the number of comparisons we would need to make with ten pieces of data: `[2, 7, -7, 20, 5, -3, 5, 18, 4, -50]`. Our first pass alone would make nine comparisons. Now, imagine a hundred pieces of data!

In fact, when thinking about the runtime of algorithms, we only care about how they do on absurdly large lists (think infinitely long). It turns out that as we get larger and larger lists, the number of comparisons starts to approach `n^2` (where `n` is the length of the list).

We can see this happen in our example lists. With a list of size 5, we needed 10 comparisons. That’s pretty far away from 25 (5^2). When we have a list of size 10, we need to do 45 total comparisons. That’s getting closer to 10^2. We won’t go through the math here, but as `n` grows, the number of comparisons needed grows on the order of `n^2`.
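One way to check those counts is with the closed-form sum: each pass makes one fewer comparison than the last, so the total is (n - 1) + (n - 2) + ... + 1, which sums to n(n - 1)/2. A quick sketch (the `comparisons_needed` name is ours, not from the text):

```python
def comparisons_needed(n):
    # (n - 1) + (n - 2) + ... + 1 comparisons, which sums to n * (n - 1) / 2
    return n * (n - 1) // 2

for n in [5, 10, 100]:
    print(n, comparisons_needed(n), n ** 2)
# size 5: 10 vs 25; size 10: 45 vs 100; size 100: 4950 vs 10000
```

As `n` grows, the exact count stays at about half of `n^2`, which is why the comparison count is said to grow on the order of `n^2`.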

`n^2` comparisons is pretty bad. And the bad news keeps on coming! Some sorting algorithms can cut out some of their comparisons depending on the array they’re trying to sort. This isn’t the case for selection sort.

Consider the best case for this sorting algorithm: we give selection sort an already sorted array. Can selection sort cut out any comparisons? Unfortunately, the answer is no. We still need to look through all of our values to find the smallest value. It just so happens that we don’t need to swap it because it’s already in the right place.
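We can verify this by counting comparisons directly. This sketch (the `count_comparisons` helper is our own, not part of the text) shows that a sorted, a reversed, and a shuffled input all cost the same:

```python
def count_comparisons(values):
    """Run selection sort on a copy of values; return only the comparison count."""
    arr = list(values)
    total = 0
    for i in range(len(arr)):
        smallest = i
        for j in range(i + 1, len(arr)):
            total += 1  # every pair inspection counts, even if no swap follows
            if arr[j] < arr[smallest]:
                smallest = j
        arr[i], arr[smallest] = arr[smallest], arr[i]
    return total

print(count_comparisons([-7, 2, 5, 7, 20]))  # already sorted: still 10
print(count_comparisons([20, 7, 5, 2, -7]))  # reversed: also 10
print(count_comparisons([2, 7, -7, 20, 5]))  # shuffled: also 10
```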

Whether we’re given the best case, worst case, or any random list, selection sort will always make the same number of comparisons, on the order of `n^2`.

Nested loops are generally an indicator of quadratic complexity. This means that as the number of elements `n` increases, the running time increases quadratically: if `n` doubles, sorting time will quadruple.
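We can see the quadrupling in the comparison counts themselves, using the closed-form count n(n - 1)/2 from the sum (n - 1) + (n - 2) + ... + 1:

```python
# Doubling n roughly quadruples the number of comparisons selection sort makes.
sizes = [10, 20, 40]
counts = [n * (n - 1) // 2 for n in sizes]
print(counts)  # [45, 190, 780] -- each entry is roughly 4x the previous one
```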

If you’re intimidated by all this math, just remember we need to make around `n` comparisons to find the smallest item in the array. And we need to do that `n` times. So we end up at a runtime of `n^2`.