[net.physics] Meter standard

skip@gatech.UUCP (10/27/83)

Do you (anyone) know why they changed the basis for the standard?
-- Skip Addison {emory,allegra,msdc,rlgvax}!gatech!skip

bill@utastro.UUCP (William H. Jefferys) (10/31/83)

Skip Addison (skip@gatech) asks why the definition of
the meter is being changed to one based on the speed of light.
Let me quote from an article by Arthur L. Robinson, found on p. 1367
of the 24 June, 1983 issue of Science Magazine.

	"It is currently possible to measure time
	 from the frequency of transitions in atomic
	 clocks much more accurately than it is to
	 measure distance from the wavelength of
	 optical radiation.  Hence, the proposal by the
	 Consultative Committee for the Definition of
	 the Meter (CCDM) opens the way to a much improved
	 determination of the meter."

	"Some metrologists have wondered if it would be possible
	 to replace the seven base standards (time, distance,
	 mass, temperature, current, voltage and resistance)
	 with a single base standard - time.  The redefinition
	 of the meter is the first step in this process."

	"In principle, the most accurate meter could then
	 be determined from the best atomic clock, cesium-133,
	 which is accurate to about 8 parts in 10^14.  The present
	 meter standard is accurate to about 4 parts in 10^9."

The article goes on to discuss some new techniques that would
considerably improve the measurement of the meter based on the
new definition.  Clearly the whole affair is driven by practical
considerations: attaining the greatest accuracy possible with the
available technology.  Another advantage is that once the new
definition is adopted, it should never again be necessary to
redefine the meter to keep up with advancing technology.
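To make the new definition concrete: it fixes the speed of light in
vacuum at exactly 299,792,458 m/s, so the meter becomes the distance
light travels in 1/299,792,458 of a second.  Here is a small
illustrative sketch of that relationship (the function name is my own,
not anything from the CCDM proposal):

```python
# Under the light-based definition, c is an exact defined constant,
# not a measured quantity.
C = 299_792_458  # speed of light in vacuum, m/s, exact by definition

def meters_from_travel_time(seconds):
    """Distance in meters that light travels in vacuum in `seconds`."""
    return C * seconds

# One meter corresponds to a light-travel time of 1/C seconds
# (about 3.34 nanoseconds):
t_one_meter = 1.0 / C
distance = meters_from_travel_time(t_one_meter)
```

Any improvement in clocks (like the cesium-133 standard quoted above)
then carries over directly to length measurement, since length is now
derived from time rather than from an independent wavelength standard.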

	Bill Jefferys  8-%
	Astronomy Dept, University of Texas, Austin TX 78712   (Snail)
	{ihnp4,kpno,ctvax}!ut-sally!utastro!bill   (uucp)
	utastro!bill@utexas-20   (ARPANET)