An imaginary number is a number whose square is a negative real number. Imaginary numbers are written using the letter i, which stands for the square root of -1, a definition captured by the equation i² = -1. Any imaginary number can be expressed as a real multiple of i. For example, the square root of -4 is 2i.
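This definition can be checked directly in Python, which builds imaginary numbers into the language: the suffix j plays the role of i (a notation borrowed from electrical engineering). The short sketch below verifies that i² = -1 and that the square root of -4 is 2i, using the standard-library cmath module:

```python
import cmath

# i is written as 1j in Python; squaring it gives -1 (as a complex number)
print((1j) ** 2)        # (-1+0j)

# The square root of -4 is 2i; cmath.sqrt handles negative inputs
print(cmath.sqrt(-4))   # 2j
```

Note that the ordinary math.sqrt function would raise an error on -4; cmath exists precisely to extend such functions to imaginary and complex results.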
When imaginary numbers were first defined by Rafael Bombelli in 1572, mathematicians believed that they did not really exist, hence their name. Descartes coined the term imaginary in reference to these numbers in his 1637 book, La Géométrie. However, imaginary numbers are as real as any other numbers and have gradually come to be accepted by the mathematical community and the world at large. The work of the mathematicians Leonhard Euler and Carl Friedrich Gauss in the 18th and 19th centuries was instrumental in this change.
While imaginary numbers are meaningless in the "real world" of most individuals, they are indispensable in such fields as quantum mechanics, electrical engineering, computer programming, signal processing, and cartography. For perspective, consider that negative numbers were also once considered fictitious, and that such concepts as fractions and square roots might be considered meaningless to a person who does not need them in everyday life, though they are quite real to others.
To better understand imaginary numbers, geometry may be helpful. Picture a standard number line: zero is in the center, positive numbers are found to the right of zero and negative numbers are found to the left. At the zero point, visualize another line perpendicular to the first, stretching up and down rather than right and left. This is the imaginary axis, analogous to the y-axis in geometry, while the "standard number line" is the real axis, analogous to the x-axis; together the two axes form what is called the complex plane. Positive imaginary numbers extend up from the zero point, and negative imaginary numbers extend down. Zero is the only number that is considered both real and imaginary.
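The two-axis picture above corresponds directly to how complex numbers are stored in software: a number with both a real and an imaginary part is simply a point with an x-coordinate (real part) and a y-coordinate (imaginary part). A minimal sketch in Python, using the hypothetical point 3 + 4i as an example:

```python
# The number 3 + 4i sits 3 units right on the real axis
# and 4 units up on the imaginary axis
z = complex(3, 4)

print(z.real)   # 3.0 -- position along the horizontal (real) axis
print(z.imag)   # 4.0 -- position along the vertical (imaginary) axis

# Its distance from zero follows from the Pythagorean theorem:
# sqrt(3^2 + 4^2) = 5
print(abs(z))   # 5.0
```

A purely imaginary number such as 2i is a point on the vertical axis (real part 0), just as a purely real number is a point on the horizontal axis (imaginary part 0).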