(Cover Artist: Jada Moncur)

Is zero less than one? To most people, the answer is obvious: of course it is!

But humans aren’t born with the concept of zero. In fact, many children under the age of four struggle to tell that a paper with zero dots has fewer dots than a paper with a single dot. When you count something, you start with 1, 2, 3, and so on. For most of history, that was how numbers were used. If a farmer was counting horses, it made sense to have one, two, or three horses. If they had no horses, instead of saying the number of horses they had was zero, they simply wouldn’t count horses at all. For a long time, many civilizations had no concept of zero at all, including the Romans, whose empire stretched over thousands of miles.

The earliest documented use of zero comes from ancient Mesopotamia, around 5,000 years ago, where it served as a placeholder, just as it does for us: to distinguish between, say, 222 and 2022. A placeholder clearly marks place value, which makes arithmetic easier to perform. When subtracting, for example, if the digits in one place don’t subtract easily, you can always ‘borrow’ a one from the next greater place value.
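To make the placeholder idea concrete, here is a small sketch (not part of the original history, just an illustration) showing how the zero in the hundreds place of 2022 carries real place-value information, which is exactly what separates it from 222:

```python
# Break a number into the value each digit contributes.
# The 0 in 2022 contributes nothing itself, but it holds the
# hundreds place open so the leading 2 can mean two thousand.
def place_values(n):
    digits = [int(d) for d in str(n)]
    return [d * 10 ** i for i, d in enumerate(reversed(digits))][::-1]

print(place_values(2022))  # [2000, 0, 20, 2]
print(place_values(222))   # [200, 20, 2]
```

Without the placeholder, the digits of 2022 would read simply as “2 2 2”, indistinguishable from 222.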

It’s hard to chart the course of a concept, but it was in India that zero evolved from a mere placeholder into a numeral in its own right. In 628 C.E., the Indian mathematician and astronomer Brahmagupta set down basic rules for arithmetic with zero (for example, a number multiplied by zero equals zero, and a number subtracted from itself equals zero). The decimal system was also developed in India, setting up the notation we use to this day to precisely measure things like money and temperature.
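Brahmagupta’s two rules mentioned above are simple enough to state directly in code; this brief sketch (an illustration, not from the original text) just checks them for a handful of sample numbers:

```python
# Brahmagupta's rules: for any number a,
#   a * 0 == 0   (a number multiplied by zero equals zero)
#   a - a == 0   (a number subtracted from itself equals zero)
for a in [7, -3, 0, 2.5]:
    assert a * 0 == 0
    assert a - a == 0
print("Brahmagupta's rules hold for all samples")
```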

From there, Arab traders doing business at Indian ports likely carried the concept back to their own cities and towns, where it eventually reached the Persian mathematician Mohammed ibn-Musa al-Khowarizmi, who, in the 800s, worked on equations that equal zero. That idea is essential to algebra, which is still taught and used today. The numerals we use to this day are called Arabic numerals, and they carry the influence of both the earlier Indian mathematicians and the later Arab ones.

By then, the Umayyad Caliphate was well on its way to controlling present-day Spain and Portugal, and it introduced al-Khowarizmi’s work to Europe. Within a few hundred years, the concept of zero and the operations built on it had spread across the continent. In the 1600s, the French mathematician René Descartes developed the Cartesian coordinate system. Anyone familiar with coordinate planes knows they center on an origin point of (0, 0), which cannot exist without the concept of zero. Without zero, negative numbers wouldn’t have been possible either, because we very rarely find negative quantities in the real world.

Zero is also fundamental to calculus, which paved the way for much of modern mathematics and its adjacent fields. Take computers, for example. Whatever device you’re reading this on, be it a computer, a phone, or a tablet, almost certainly runs on binary code: sequences of ones and zeros that encode processing instructions, text, and much more. At its simplest, a one turns a signal on and a zero turns it off. Zero, the numerical representation of nothingness, is what allows binary code, and your devices, to work.
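You can see those ones and zeros for yourself. This short sketch (an illustration added here, not from the original) prints the binary pattern a computer uses to store each letter of the word “zero”:

```python
# Every character is stored as a pattern of ones and zeros.
# ord() gives the character's numeric code; format(..., "08b")
# writes that code as an eight-digit binary string.
for ch in "zero":
    print(ch, format(ord(ch), "08b"))
```

Fittingly, even the word “zero” itself is mostly zeros and ones under the hood.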

Nowadays, you probably don’t think twice about zero. It’s just a number like any other, hardly something worth a whole history. And yet it’s the basis of so much of what our world runs on. From computers to finance to scientific measurement, all of it would have been impossible without zero, and it’s a testament to human ingenuity that we could take something completely theoretical, nothingness, and build it into something intrinsic to human society.