Why such a high voltage?
Why power lines are at thousands of volts rather than normal mains voltage
When you see electric power lines on high pylon towers strung out across the country, the signs warn of Danger: High Voltage! 33,000 volts! or 132,000 volts! or 400,000 volts! etc. Such enormously high voltages, and such big towers, are surely an impressive and imposing sign of power. But why are the voltages so high? Not for safety reasons, and not to keep the birds off the lines. Nor is it because high power requires high voltage: electricity can be at a high power without being at a high voltage (a car starter motor is as powerful as many appliances that would be plugged into a 240 volt 13 amp socket, and yet it runs off 12 volts). Anyway, here's the reason why electricity is distributed and transported around at such a high voltage: to save money.
It makes sense, and saves money, because of the way the science works. Electricity cables have resistance, and some of the electricity is lost en route in the form of heat because of the resistance of the cables. However, this loss can be minimised, because it has more to do with current than with voltage (see the explanation of volts and amps). By using transformers, the voltage and current can be changed while keeping the power the same: step-up transformers increase the voltage and lower the current, and step-down transformers reduce the voltage and increase the current. The power remains about the same, because power = volts × amps (P = V × I, where I is the symbol for current).
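To put some illustrative numbers on that, here's a small sketch (the 100 megawatt figure is just an assumed example load, not from the article) showing how the current needed to carry a fixed power falls as the voltage rises:

```python
# Power is (roughly) conserved across an ideal transformer: P = V * I.
# So for a fixed power, raising the voltage lowers the current.
def current_for_power(power_w, voltage_v):
    """Current (amps) needed to deliver a given power at a given voltage."""
    return power_w / voltage_v

P = 100_000_000  # 100 MW: an assumed, illustrative transmission load

print(current_for_power(P, 33_000))    # ~3030 A at 33 kV
print(current_for_power(P, 400_000))   # 250 A at 400 kV
```

Same power, roughly a twelfth of the current, just by stepping the voltage up from 33 kV to 400 kV.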
The formula for loss in power lines is: I²R (current squared times resistance)
In effect the electricity is "taxed" by the laws of physics at a rate proportional to the current, and then proportional to the current again (hence current squared), as well as proportional to the cable resistance. A well-founded tax-avoidance strategy is therefore to reduce the current as much as possible while keeping the power the same, and that means increasing the voltage. So, that's why electricity power lines are at such a high voltage: it gives better energy efficiency. Keeping the current low keeps the energy wastage low. Even so, over a few hundred miles, 10% of the power can be lost. Incidentally, the energy that's "lost" is turned into heat.
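The I²R "tax" can be computed directly. In this sketch the 100 megawatt load and the 1 ohm cable resistance are assumed round numbers for illustration; the voltages are the ones on the warning signs:

```python
def line_loss_w(power_w, voltage_v, resistance_ohm):
    """Power dissipated in the cable: I squared times R, with I = P / V."""
    current = power_w / voltage_v
    return current * current * resistance_ohm

# Assumed illustrative figures: 100 MW carried over a cable of 1 ohm total.
P = 100e6
R = 1.0
for v in (33_000, 132_000, 400_000):
    loss = line_loss_w(P, v, R)
    print(f"{v:>7} V: {loss / 1e6:.3f} MW lost ({100 * loss / P:.2f}%)")
```

With the same power and the same cable, stepping up from 33 kV to 400 kV cuts the loss by a factor of about 147 — the square of the voltage ratio.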
You can see this in an experiment. If you get a long mains extension lead, plug it into a power outlet, and put a 100 watt mains lightbulb on the end, it lights up perfectly well. Now instead connect the same lead to a 12 volt car battery, with a 100 watt car headlamp bulb on the other end. This will not light so brightly, because quite a lot of the energy is lost in the cable. Notice how it's the same power in watts, and yet the loss is much worse at the lower voltage. That's because at the lower voltage the resistance of the cable is more significant, being in the same sort of range as the resistance of the lamp itself.
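A rough model of that experiment treats the cable and the bulb as two resistors in series (a simplification: a real bulb's resistance changes with temperature, and the 0.5 ohm cable resistance here is an assumed figure for a long extension lead):

```python
def delivered_fraction(supply_v, bulb_power_w, cable_r_ohm):
    """Fraction of the bulb's rated power it actually receives, modelling
    the bulb as a fixed resistance R = V^2 / P in series with the cable."""
    bulb_r = supply_v ** 2 / bulb_power_w   # bulb resistance at its rating
    current = supply_v / (bulb_r + cable_r_ohm)
    return (current ** 2 * bulb_r) / bulb_power_w

CABLE_R = 0.5  # assumed resistance of a long extension lead, in ohms

print(delivered_fraction(240, 100, CABLE_R))  # mains bulb: ~0.998
print(delivered_fraction(12, 100, CABLE_R))   # car headlamp bulb: ~0.55
```

The 240 volt bulb barely notices the cable, because its own resistance (576 ohms) dwarfs the cable's. The 12 volt bulb's resistance is only 1.44 ohms, so the cable steals a large share of the power.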
This situation has significance in the history of electrical science. In the early times, there was considerable rivalry between the supporters of AC and the supporters of DC. Westinghouse, who was in favour of AC, met with considerable criticism from Edison, who believed strongly that DC was the way electricity should be done. The battle was political, and Edison said that AC was dangerous and that prisoners to be executed in the electric chair should be "Westinghoused". Such brilliant rhetoric did very little against the science, and the truth is that both AC and DC are dangerous in different ways. More significantly, DC has the disadvantage that it cannot be stepped up and down using transformers, and so for distribution on a mains grid, AC prevails.
Also see electrical engineering, AC and DC, voltage and current, energy and power, etc.