jgk@osc.COM (Joe Keane) (12/13/90)
I have some incandescent lamp curves in front of me, and they pretty much agree with the formulas others have posted. At 110% of rated voltage the power consumed is 115% of normal, the light output is 140% of normal, and the expected life is 40% of normal. Conversely, at 90% of rated voltage the power consumed is 85% of normal, the light output is 70% of normal, and the expected life is 400% of normal.

You can see that it's easy to get a very large increase in life by dropping the supply voltage a bit. This fact is frequently re-discovered, as if no one knows it and the light bulb companies are conspiring to make you buy more. It's good to know if you only care about lamp life, say if the bulbs are in a remote location and hard to change.

The problem is the resulting loss in efficiency; notice that at 90% of rated voltage the lumens per watt are down to about 80% of normal. If you're thinking economically, you should keep in mind that the cost of a light bulb is much lower than the cost of the power it consumes during its lifetime.

The conclusion is that if you're dropping the supply voltage, or equivalently using a bulb with a higher rated voltage, you're making a mistake. If you're happy with the reduced light output, you should instead use a lower-wattage bulb of the right voltage. You'll use less power and also get better color.

Some sort of inrush current limiter and/or voltage regulator would be a good idea, since start-up and high-voltage periods consume a disproportionate amount of the lamp's life. But that's a lot more complicated than a simple rectifier.
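For anyone who wants to play with the numbers, here is a minimal sketch of the usual power-law approximations. The exponents (roughly 1.6 for power, 3.4 for light output, and about -13 for life) are common rule-of-thumb values, not taken from the curves above, and the function name and the dollar figures in the cost comparison are made up for illustration.

# A rough sketch of the usual incandescent-lamp scaling laws.  The exponents
# are common rule-of-thumb values, NOT taken from the curves quoted above:
# power scales roughly as V^1.6, light output as V^3.4, and life as V^-13.

def lamp_scaling(v_ratio, k_power=1.6, k_light=3.4, k_life=-13.0):
    """Return (power, light, life) relative to rated values, for a supply
    voltage of v_ratio times the rated voltage."""
    return v_ratio ** k_power, v_ratio ** k_light, v_ratio ** k_life

for v in (0.90, 1.00, 1.10):
    p, lum, t = lamp_scaling(v)
    print(f"V={v:.0%}: power={p:.0%} light={lum:.0%} "
          f"efficacy={lum / p:.0%} life={t:.0%}")

# Rough lifetime-cost comparison with made-up figures ($1 bulb, 100 W,
# 1000 h rated life, $0.10/kWh): the electricity dwarfs the bulb.
bulb_cost = 1.00                        # dollars
energy_cost = 100 / 1000 * 1000 * 0.10  # kW * hours * $/kWh
print(f"bulb: ${bulb_cost:.2f}, electricity over its life: ${energy_cost:.2f}")

The power and light columns come out close to the percentages quoted above; the life exponent varies a lot between sources (roughly 10 to 14), so don't expect that column to match the curves exactly.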