Post History
#2: Post edited
Recently, my home suffered a partial power outage, and out of curiosity and a desire to learn more about residential AC power, I'm trying to understand how the event that took place resulted in the symptoms I observed.

My home is served by split-phase power from a nearby transformer: single-phase power stepped down to 240V with a center tap. For the sake of explanation, I'll refer to the wires running from the transformer to the main junction box on my house as L1, L2, and G. L1 and L2 are the "hot" wires, between which the voltage should ideally be 240V. G is the center-tap wire, which is grounded and should ideally show a 120V difference to either L1 or L2.
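(If my mental model is right, the two leg voltages are in phase when traversed in the same direction around the loop, so their magnitudes simply add: V(L1→L2) = V(L1→G) + V(G→L2) = 120V + 120V = 240V. I'll lean on that assumption below.)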
When the outage occurred, I observed the following:

* The L1/G voltage difference was 120V (good).
* The G/L2 voltage difference was 90V (bad).
* The L1/L2 voltage difference was 30V (very bad!).
Although I took these measurements with a generic digital multimeter with a 10 MΩ input impedance (which I'm told is not reliable in a situation like this), I believe them to be accurate, because the power company repairman also took measurements and confirmed that his matched mine.

The repairman investigated and determined that someone had been digging nearby and had nicked the insulation of the L2 cable running from the transformer to my house. He said this damage led to corrosion and a build-up of mineral residue from electrolysis, and that as a result the impedance of the L2 cable increased drastically. (The cable is actually a cluster of several thick wires, and he said that in cases like this the corrosion is sometimes so bad that only two of the wires in the cluster remain fully intact.)
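To check my intuition about the "drastically increased impedance" part, here's the naive voltage-divider picture I had in mind, as a minimal sketch. The impedance values are pure guesses for illustration, not anything I measured:

```python
# Naive model: the corroded run of L2 acts as a series resistance that
# forms a voltage divider with whatever load hangs on the L2 leg.
Z_FAULT = 10.0  # ohms -- hypothetical resistance of the damaged L2 run
Z_LOAD = 30.0   # ohms -- hypothetical aggregate load on the L2 leg
V_LEG = 120.0   # volts -- healthy leg voltage at the transformer

v_panel = V_LEG * Z_LOAD / (Z_LOAD + Z_FAULT)
print(f"G/L2 seen at the panel: {v_panel:.0f} V")  # prints 90 V
```

That reproduces the 90V magnitude easily enough, but a plain resistive divider like this can only scale the voltage down; it can't explain what comes next.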
What I don't understand is how that can result in those voltage measurements. I can understand how high impedance on L2 could cause a drop in the L1/L2 voltage, and how L1/G could remain 120V since the high impedance of L2 would not be a part of the L1/G circuit. But I'm thoroughly puzzled by how the L1/L2 voltage of 30V can be *lower* than the G/L2 voltage of 90V; it seems like both voltages to the center tap, namely L1/G and G/L2, should sum up to L1/L2... unless some kind of really weird voltage phase shift is going on.
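To make the puzzle concrete, here's the phasor arithmetic I'm assuming, with L1/G taken as the 0° phase reference and the relative phase of G/L2 simply swept over a few values (the angles are hypothetical, not measurements):

```python
import cmath
import math

# L1-to-G leg, taken as the 0-degree phase reference.
v_l1g = cmath.rect(120, 0)

# Healthy case: G-to-L2 is in phase around the loop, so magnitudes add.
print(f"healthy |L1/L2| = {abs(v_l1g + cmath.rect(120, 0)):.0f} V")  # 240 V

# Fault case: 90 V on the G/L2 leg at a few relative phases.
for deg in (0, 90, 180):
    v_gl2 = cmath.rect(90, math.radians(deg))
    v_l1l2 = v_l1g + v_gl2  # KVL: V(L1->L2) = V(L1->G) + V(G->L2)
    print(f"G/L2 shifted {deg:3d} deg: |L1/L2| = {abs(v_l1l2):.0f} V")
```

If I've done this right, the sweep prints 210V, 150V, and 30V, so the only way to get my 30V reading from a 120V leg and a 90V leg is a full 180° flip on the G/L2 leg, which is exactly the "really weird phase shift" I can't account for.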
Any answers that could educate me on the aspects of AC electrical power that might explain this would be greatly appreciated!
#1: Initial revision
How could a damaged wire in split-phase power delivery create these voltages?
Recently, my home suffered a partial power outage, and due to curiosity and a desire to learn more about residential AC power, I'm trying to understand how the event that took place resulted in the symptoms I observed.

My home is in the served split-phase power from a nearby transformer, single-phase power transformed down to 240V with a center tap. For the sake of explanation I refer to the wires coming from the transformer to the main junction box on my house as L1, L2 and G. L1 and L2 are the "hot" wires, the voltage between which should ideally be 240V. G is the center tap wire, which is grounded and should ideally have a voltage difference of 120V to either of L1 or L2.

When the outage occurred, I observed the following:

* The L1/G voltage difference was 120V (good).
* The G/L2 voltage difference was 90V (bad).
* The L1/L2 voltage difference was 30V (very bad!).

Despite my taking these measurements with a generic digital multimeter with a 10MOhm input impedance (which I'm told is not reliable for this case), I believe them to be accurate as the power company repairmen also made measurements and confirmed that theirs matched my own.

The power company repairman investigated and determined that someone had been digging nearby and nicked the insulation to the L2 cable going from the transformer to my house. He said that this damage resulted in corrosion and build-up of mineral residue from electrolysis, and that the result was that impedance of the L2 cable increased drastically. (This cable was actually a cluster of several thick wires, and he said that sometimes in cases like this the corrosion is so bad that only 2 of the wires in the cluster remain fully intact.)

What I don't understand is how that can result in those voltage measurements. I can understand how high impedance on L2 could cause a drop in the L1/L2 voltage, and how L1/G could remain 120V since the high impedance of L2 would not be a part of the L1/G circuit. But I'm thoroughly puzzled by how the L1/L2 voltage of 30V can be *lower* than the G/L2 voltage of 90V; it seems like both voltages to the center tap, namely L1/G and G/L2, should sum up to L1/L2... unless some kind of really weird voltage phase shift is going on.

Any answers that could educate me in the aspects of AC electrical power that might explain this would be greatly appreciated!