A Brief History of Thermometry

All thermometers work on the same basic principle: some physical property of a material changes in a predictable way as its temperature changes.

As early as 220 BC, Philo of Byzantium noted the expansion and contraction of air with hot and cold. By the 16th and 17th centuries AD, European scientists had used this principle to create the earliest thermal instruments by trapping air in glass tubes that were closed at one end, with the open end submerged in water.

These early "thermoscopes," as they were called, displayed the rise and fall of the water line relative to the contraction and expansion of the air trapped inside the tube. The Venetian physician Santorio Santorio is credited with being the first to put measured markings with a numerical scale on the sides of one of these air "thermoscopes," effectively creating the first thermometer.

The famed astronomer Galileo Galilei was among the first to experiment with the expansion and contraction of substances other than air. Galileo filled glass spheres with wine and alcohols of different densities, suspending them in water and noting their relative rise and fall when exposed to hot or cold. Then, in about 1654, Ferdinando II de' Medici, the Grand Duke of Tuscany, sealed alcohol inside of tubes by closing both ends, effectively creating the first modern thermometer by isolating the expansion of the alcohol from the variability of barometric air pressure.

Key innovations came from Christiaan Huygens, who in 1665 first suggested using the melting and boiling points of water as standards, and Carlo Renaldini, who in 1694 proposed using them as fixed points on a universal scale.

Finally, in 1724, Gabriel Fahrenheit, a German-born instrument maker working in the Netherlands, introduced a standard scale for the sealed mercury thermometers he manufactured. Mercury expands more uniformly with heat than alcohol and remains liquid across a wider range of temperatures. Taking a tip from Sir Isaac Newton, who had suggested 12 points or degrees between the melting point of water and the temperature inside the human mouth, Fahrenheit devised a universal scale for his thermometers with the temperature of a mixture of water, ice, and sea salt as his zero and the temperature inside an adult male's mouth as his 96 (which is 8 x 12). A slightly modified version of this scale is still used today in the United States.

Almost two decades later, in 1742, the Swedish astronomer Anders Celsius proposed a universal scale with 100 degrees of difference between the melting point of water and its boiling point (although Celsius originally ran the scale the other way, with 100 at the melting point and 0 at the boiling point). His 100-degree scale (or centigrade) is now commonly used throughout the world, though many scientific applications prefer a scale proposed in 1848 that keeps the same degree size as Celsius but sets its zero at the theoretical "absolute zero" where all molecular activity stops. This Kelvin scale, devised by the Belfast-born physicist William Thomson, Lord Kelvin, therefore has no negative temperatures, since nothing can be colder than its zero point.
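Because all three scales describe the same physical quantity, converting between them is simple arithmetic. A minimal sketch in Python (the function names are just illustrative):

def fahrenheit_to_celsius(f):
    """Celsius = (Fahrenheit - 32) * 5/9."""
    return (f - 32) * 5 / 9

def celsius_to_kelvin(c):
    """Kelvin = Celsius + 273.15; 0 K is absolute zero."""
    return c + 273.15

# The fixed points of the scales line up as expected:
print(fahrenheit_to_celsius(32))     # 0.0   -> water melts
print(fahrenheit_to_celsius(212))    # 100.0 -> water boils
print(celsius_to_kelvin(-273.15))    # 0.0   -> absolute zero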

Eventually, scientists found other physical properties that respond reliably to heat and cold besides the expansion and contraction of liquids inside a tube. Dial thermometers depend upon the expansion and contraction of metal. Electronic thermometers calculate temperature from electrical effects: a thermistor's resistance changes with temperature, while a thermocouple generates a small voltage at the junction of two dissimilar metals. Infrared thermometers measure the emission of infrared radiation. Still other thermometers measure the effect of heat and cold on sound waves, photoluminescence, fluorescence, magnetism, gamma rays, and many other physical phenomena.
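To make the thermistor idea concrete, here is a minimal sketch of how a resistance reading might be turned into a temperature using the widely used beta-parameter model. The part values (R0 = 10 kOhm at 25 C, beta = 3950 K) are illustrative assumptions for a typical NTC thermistor, not values from the text above:

import math

def thermistor_temp_c(resistance_ohms, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Beta-parameter model: 1/T = 1/T0 + (1/beta) * ln(R/R0), with T in kelvin.

    r0, t0_c, and beta are hypothetical datasheet values for a common
    NTC thermistor; real parts specify their own.
    """
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(resistance_ohms / r0) / beta
    return 1.0 / inv_t - 273.15  # convert back to Celsius

# At the nominal resistance the model returns the nominal temperature:
print(thermistor_temp_c(10_000))             # 25.0
# Resistance falls as an NTC thermistor warms up:
print(round(thermistor_temp_c(5_000), 1))    # about 41.5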