The Standard Meter


Article taken from "Backsights" Magazine published by Surveyors Historical Society


The French originated the meter in the 1790s as one ten-millionth of the distance from the equator to the North Pole along the meridian through Paris.  This length was physically represented by the distance between two marks on an iron bar kept in Paris.  The International Bureau of Weights and Measures, created in 1875, upgraded the bar to one made of an alloy of 90 percent platinum and 10 percent iridium.

In 1960 the meter was redefined as 1,650,763.73 wavelengths, in a vacuum, of the orange-red light emitted by the krypton-86 isotope.  More recently (1983), the General Conference on Weights and Measures defined the meter as the distance light travels, in a vacuum, in 1/299,792,458 of a second, with time measured by a cesium-133 atomic clock, which emits pulses of radiation at very rapid, regular intervals.
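The two definitions can be checked against each other with a little arithmetic: dividing one meter by the stated wavelength count recovers the orange-red krypton-86 line near 605.78 nanometers (a well-known value, not stated in the article), and fixing the speed of light at 299,792,458 m/s makes the light-travel meter exact by construction.  A small sketch:

```python
# 1960 definition: 1 meter = 1,650,763.73 wavelengths of Kr-86 light,
# so one wavelength = 1 / 1,650,763.73 meters.
wavelength_m = 1 / 1_650_763.73
wavelength_nm = wavelength_m * 1e9
print(f"Kr-86 line: {wavelength_nm:.4f} nm")   # ~605.7802 nm, orange-red light

# 1983 definition: the speed of light is fixed at exactly 299,792,458 m/s,
# so light travels exactly one meter in 1/299,792,458 of a second.
c = 299_792_458                      # m/s, exact by definition
meter = c / 299_792_458              # = 1.0 m, exact by construction
print(f"light-travel meter: {meter} m")
```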

None of these redefinitions changed the length of the meter; each merely allowed that length to be reproduced more precisely.

Our English foot has not been so constant.  The U. S. Congress legalized the use of the metric system in 1866 on the basis that one meter is exactly equal to 39.37 inches.  In 1959 a number of English-speaking countries agreed that an inch is exactly equal to 2.54 centimeters so that the International foot is exactly equal to 0.3048 meters.  The United States retained the old 1866 equivalency and called it the U. S. Survey foot so that 1 U. S. Survey foot equals 1.000002 International feet.
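The two foot definitions differ only in the sixth decimal place, as the conversion arithmetic above makes explicit:

```python
# 1866 equivalency: 1 meter = 39.37 inches exactly,
# so the U.S. Survey foot is 12/39.37 meters.
survey_foot_m = 12 / 39.37               # ≈ 0.3048006 m

# 1959 agreement: 1 inch = 2.54 cm exactly,
# so the International foot is 12 * 2.54 cm = 0.3048 m exactly.
international_foot_m = 12 * 2.54 / 100   # = 0.3048 m

ratio = survey_foot_m / international_foot_m
print(f"1 U.S. Survey foot = {ratio:.6f} International feet")  # 1.000002
```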
