[sci.electronics] How are power line voltages determined?

irwin@uiucdcsb.cs.uiuc.edu (05/12/88)

I may be able to shed some light on your question. Years ago, I worked
for Illinois Power Co. In our state, the Illinois Commerce Commission is
the body that sets the standards for what is acceptable.

The current minimum acceptable voltage to the customer is 113 VAC and
the maximum is 127 VAC. Since the mean of the two is 120 VAC, that is
the value that should be stamped on electrically operated devices
designed to run on the current standard.
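
As a quick check of the arithmetic, here is a short Python sketch; the
percentage figure is my own calculation from the two limits above, not a
tariff number:

    # Nominal service voltage implied by the Illinois limits quoted above.
    v_min = 113.0    # minimum acceptable service voltage, VAC
    v_max = 127.0    # maximum acceptable service voltage, VAC

    v_nominal = (v_min + v_max) / 2               # -> 120.0 VAC
    tolerance = (v_max - v_nominal) / v_nominal   # -> ~0.058, roughly +/- 6%

    print(v_nominal, round(100 * tolerance, 1))   # 120.0 5.8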

Many moons ago, 110 VAC was the standard, and though it is no longer
acceptable, old habits die hard and people still refer to "110/220".

I do not believe there is a federal standard; there is probably a state
standard in each state, and the standards are probably all the same because
the utilities are linked nationwide, forming a "power grid". Utilities
purchase and sell power to each other, and the study of the grid is an
interesting subject.

When it gets dark on the West coast, people in the East have gone to bed
and much of the lighting load has shut down, so power flows to the West
through the grid. As people in the East are turning on lights, it is still
light in the West, so power can flow to the East. The grid has been known
to have its problems, however; there have been times when a massive failure
put several states into the dark. Safeguards are built in, but it can still
get into trouble from time to time.

jbn@glacier.STANFORD.EDU (John B. Nagle) (05/12/88)

      Part of the answer to this question goes back to the early history of
the electric industry.  In the early days of electric lamp manufacture,
Edison's lamp plants were unable to manufacture lamps of uniform resistance.
The plants of the 1880s and 1890s manufactured lamps with carbonized paper
filaments.  The carbonization process produced widely varying product.
The original target operating voltage was 100 volts DC, but the lamp 
manufacturing process resulted in lamps with proper operating voltages
between about 85 and 135 volts.  This resulted in a serious yield problem.
Only about half of the lamps would give full life at full output
at 100 volts.  This yield problem was doubling lamp production cost.
The solution chosen was one very familiar in the semiconductor industry --
part selection.  Lamps were tested with a manual photometer as the voltage
was adjusted, and were thus sorted by operating voltage.
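
In modern terms, the sorting step is a simple binning pass.  A minimal
sketch in Python follows; the bin edges and the measurement step are
hypothetical stand-ins rather than historical values, there only to
illustrate the idea of selecting parts by a measured parameter:

    # Hypothetical sketch of part selection by measured operating voltage,
    # in the spirit of the lamp sorting described above.  The bin edges are
    # illustrative; measured voltages would come from the photometer test.
    from collections import defaultdict

    BIN_EDGES = [85, 95, 105, 115, 125, 135]   # volts, illustrative only

    def bin_for(voltage):
        """Return the (low, high) bin a measured operating voltage falls into."""
        for low, high in zip(BIN_EDGES, BIN_EDGES[1:]):
            if low <= voltage < high:
                return (low, high)
        return None   # outside the usable range

    def sort_lamps(measurements):
        """Group lamp ids by voltage bin -- i.e. grade parts after testing."""
        bins = defaultdict(list)
        for lamp_id, voltage in measurements:
            bins[bin_for(voltage)].append(lamp_id)
        return bins

    # Three lamps whose filaments came out with rather different resistances:
    print(sort_lamps([(1, 98.0), (2, 117.5), (3, 131.0)]))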

      This created a marketing problem.  What to do with the off-voltage
lamps?  The answer was to provide some diversity in system voltages.  Bear
in mind that in the early days, most power systems were isolated, and most
electric lamps were provided by the power company (a tradition carried into
the 1970s in a few areas of the U.S.).  So various systems ran at different
voltages, ranging from about 90 to 130 volts.

      Standardization came later.  Through the 1950s, there were large
installed systems with non-standard voltages, and even non-standard
frequencies; parts of Boston, for example, ran on 25Hz into the 50s.
Large parts of New York and Chicago were on DC through the 50s.  
Electronics of that vintage could usually cope; the standard radio was
"AC-DC", and many early transformerless TV sets would run happily on
DC.  In an era when 20% resistors were normally 20% and 5% was "precision",
all TV sets had large numbers of front and back panel adjustments, and 
could usually be tweaked into operating despite voltage variations.

      See "History of the Early Electrical Manufacturers", published by
the Harvard Business School, for more than you ever wanted to know 
on this subject.  MBAs may find the business plan for Edison's first
power plant fascinating.

					John Nagle