You may find yourself asking why we go through all the bother of binary numbers when the decimal system has worked so well for humans for so long.
Unfortunately, computers only understand two states of being, off and on, represented by the bits 0 and 1 respectively. Computer hardware would be incredibly large, expensive, and resource-intensive if it were made to handle ten different states of data.
Binary data, when run through hardware, is seen as power applied or power not applied. An incredible level of precision and regulation would have to be built into the hardware to modify the applied electrical voltage so minutely as to fluctuate between ten levels of power.
Binary data also typically comes in specific lengths: for example, eight bits is called a Byte, and two Bytes (16 bits) is called a Word. When the incoming data follows these guidelines, it is easy for the hardware to process it and compute the desired result.
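These standard lengths can be sketched in a few lines of code. This is a minimal illustration, assuming the 16-bit Word used in this text (word size actually varies by computer architecture); the names are my own, not standard library functions.

```python
# Unit sizes described above.
BITS_PER_BYTE = 8
BITS_PER_WORD = 16  # two Bytes, as defined in this text

def bits_to_bytes(bits):
    """Convert a count of bits into Bytes."""
    return bits / BITS_PER_BYTE

print(bits_to_bytes(BITS_PER_WORD))  # a 16-bit Word is 2.0 Bytes
```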
One common place you might see this is internet speed, which is typically expressed in kilobits or Megabits per second (kb/s or Mb/s).
File storage on your computer, however, is referenced in Byte size. For example, your favourite app might be 250 MegaBytes (MB).
When you buy Gig-speed internet (1 Gb/s), you may think you can download your favourite app in 0.25 seconds, but remember that a Byte (file size) is eight times bigger than a bit (internet speed), so it will really take you 2.00 seconds, or eight times as long as you thought.
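The arithmetic above can be checked directly. This sketch uses the values from the text (a 250 MB app and a 1 Gb/s connection) and ignores real-world overhead, so it only demonstrates the unit conversion:

```python
# Download-time arithmetic from the text.
file_size_mb = 250        # MegaBytes (file size)
speed_gbps = 1            # Gigabits per second (internet speed)

file_size_megabits = file_size_mb * 8   # 1 Byte = 8 bits -> 2000 Mb
speed_mbps = speed_gbps * 1000          # 1 Gb/s = 1000 Mb/s

naive_seconds = file_size_mb / speed_mbps        # mistaking MB for Mb
actual_seconds = file_size_megabits / speed_mbps

print(naive_seconds)   # 0.25
print(actual_seconds)  # 2.0
```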
At the end of the day, it is all just a combination of 0s and 1s, off and on, that represents every piece of digital information you interact with, from the text on this screen to the color of the pixels in your profile picture.
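As a small illustration of this idea (my own example, not from the text), both a letter of text and a pixel's color reduce to patterns of bits:

```python
# The letter 'A' is stored as the Byte 01000001.
letter = "A"
print(format(ord(letter), "08b"))  # 01000001

# A pure-red pixel in 24-bit RGB color: one Byte per channel.
red, green, blue = 255, 0, 0
print(" ".join(format(c, "08b") for c in (red, green, blue)))
# 11111111 00000000 00000000
```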
Take a look at the GIF to the right. Binary data can impact the speed at which a CPU can process information. When good data (words, nibbles, bytes) passes through the CPU, the meter (which represents CPU processing speed) moves toward the right. When bad data (unknown binary values) passes through, the meter moves toward the left.