Post History

How could a damaged wire in split-phase power delivery create these voltages?

3 answers · posted 3y ago by Joel Lathrop · edited 3y ago by Olin Lathrop

#2: Post edited by Olin Lathrop · 2020-08-04T13:35:42Z (over 3 years ago)

Recently, my home suffered a partial power outage, and out of curiosity and a desire to learn more about residential AC power, I'm trying to understand how the event that took place resulted in the symptoms I observed.

My home is served by split-phase power from a nearby transformer: single-phase power stepped down to 240V with a center tap. For the sake of explanation I refer to the wires coming from the transformer to the main junction box on my house as L1, L2 and G. L1 and L2 are the "hot" wires, the voltage between which should ideally be 240V. G is the center-tap wire, which is grounded and should ideally have a voltage difference of 120V to either L1 or L2.
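
To make the ideal relationships concrete, here is a minimal phasor sketch (Python's complex numbers standing in for RMS phasors; the 180° offset between the two hot legs is the standard split-phase convention):

```python
import cmath

# Ideal split-phase service, measured relative to the grounded center tap G.
# The two hot legs sit at opposite ends of the winding, 180 degrees apart.
V_L1 = 120 * cmath.exp(1j * 0)         # L1: 120 V RMS at 0 degrees
V_L2 = 120 * cmath.exp(1j * cmath.pi)  # L2: 120 V RMS at 180 degrees
V_G = 0                                # center tap, grounded

print(abs(V_L1 - V_G))   # L1/G  -> 120 V
print(abs(V_G - V_L2))   # G/L2  -> 120 V
print(abs(V_L1 - V_L2))  # L1/L2 -> 240 V
```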

When the outage occurred, I observed the following:

* The L1/G voltage difference was 120V (good).
* The G/L2 voltage difference was 90V (bad).
* The L1/L2 voltage difference was 30V (very bad!).

Despite my taking these measurements with a generic digital multimeter with a 10 MΩ input impedance (which I'm told is not reliable for this case), I believe them to be accurate, as the power company repairmen also made measurements and confirmed that theirs matched my own.
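
As I understand the reliability concern, a high-impedance meter on a nearly disconnected conductor forms a voltage divider with whatever stray coupling remains, so it can display a "phantom" voltage. A toy calculation with made-up numbers:

```python
# Hypothetical illustration of a "phantom" reading: the meter's 10 Mohm input
# impedance divides against an assumed stray coupling impedance (both values
# here are invented for illustration, not measured).
Z_stray = 5e6     # ohms, assumed stray coupling to a live conductor
Z_meter = 10e6    # ohms, typical DMM input impedance
V_live = 120.0    # volts RMS on the nearby live conductor

V_displayed = V_live * Z_meter / (Z_stray + Z_meter)
print(V_displayed)  # -> 80.0 V shown on an otherwise "dead" wire
```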

The power company repairman investigated and determined that someone had been digging nearby and nicked the insulation of the L2 cable running from the transformer to my house. He said that this damage resulted in corrosion and a build-up of mineral residue from electrolysis, and that as a result the impedance of the L2 cable increased drastically. (This cable was actually a cluster of several thick wires, and he said that in cases like this the corrosion is sometimes so bad that only 2 of the wires in the cluster remain fully intact.)
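
One scenario I tried to reason through (the load values are purely hypothetical): if the damaged L2 conductor is close to open, the house-side L2 node mostly floats and gets pulled between L1 and G by whatever loads still bridge it:

```python
# If L2's feed is nearly open, the house-side L2 node floats on a divider
# formed by the remaining loads. Purely resistive loads are assumed, with a
# hypothetical 1:3 impedance ratio chosen only to match the readings above.
V_L1 = 120 + 0j  # L1 relative to G (phasor, RMS volts)
Z_12 = 1.0       # relative impedance of loads between L1 and L2 (240 V loads)
Z_2G = 3.0       # relative impedance of loads between L2 and neutral/G

V_L2 = V_L1 * Z_2G / (Z_12 + Z_2G)  # L2 node pulled to 90 V, in phase with L1

print(abs(V_L1))         # L1/G  -> 120.0
print(abs(V_L2))         # G/L2  ->  90.0
print(abs(V_L1 - V_L2))  # L1/L2 ->  30.0
```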

What I don't understand is how that can result in those voltage measurements. I can understand how high impedance on L2 could cause a drop in the L1/L2 voltage, and how L1/G could remain 120V, since the high impedance of L2 would not be part of the L1/G circuit. But I'm thoroughly puzzled by how the L1/L2 voltage of 30V can be *lower* than the G/L2 voltage of 90V; it seems like the two voltages to the center tap, namely L1/G and G/L2, should sum to L1/L2... unless some kind of really weird voltage phase shift is going on.
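
If the sum rule is read as a phasor identity, $V_{L1/L2} = V_{L1/G} + V_{G/L2}$ as complex quantities rather than magnitudes, the numbers would work out exactly as measured, provided the G/L2 voltage has somehow flipped into phase opposition with L1/G:

$$\left|\,120\angle 0^\circ + 90\angle 180^\circ\,\right| = |120 - 90| = 30\ \text{V}$$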

Any answers that could educate me in the aspects of AC electrical power that might explain this would be greatly appreciated!

#1: Initial revision by Joel Lathrop · 2020-08-02T16:07:07Z (over 3 years ago)
How could a damaged wire in split-phase power delivery create these voltages?
Recently, my home suffered a partial power outage, and due to curiosity and a desire to learn more about residential AC power, I'm trying to understand how the event that took place resulted in the symptoms I observed.

My home is in the served split-phase power from a nearby transformer, single-phase power transformed down to 240V with a center tap.  For the sake of explanation I refer to the wires coming from the transformer to the main junction box on my house as L1, L2 and G.  L1 and L2 are the "hot" wires, the voltage between which should ideally be 240V.  G is the center tap wire, which is grounded and should ideally have a voltage difference of 120V to either of L1 or L2.

When the outage occurred, I observed the following:

* The L1/G voltage difference was 120V (good).
* The G/L2 voltage difference was 90V (bad).
* The L1/L2 voltage difference was 30V (very bad!).

Despite my taking these measurements with a generic digital multimeter with a 10MOhm input impedance (which I'm told is not reliable for this case), I believe them to be accurate as the power company repairmen also made measurements and confirmed that theirs matched my own.

The power company repairman investigated and determined that someone had been digging nearby and nicked the insulation to the L2 cable going from the transformer to my house.  He said that this damage resulted in corrosion and build-up of mineral residue from electrolysis, and that the result was that impedance of the L2 cable increased drastically.  (This cable was actually a cluster of several thick wires, and he said that sometimes in cases like this the corrosion is so bad that only 2 of the wires in the cluster remain fully intact.)

What I don't understand is how that can result in those voltage measurements.  I can understand how high impedance on L2 could cause a drop in the L1/L2 voltage, and how L1/G could remain 120V since the high impedance of L2 would not be a part of the L1/G circuit.  But I'm thoroughly puzzled by how the L1/L2 voltage of 30V can be *lower* than the G/L2 voltage of 90V; it seems like both voltages to the center tap, namely L1/G and G/L2, should sum up to L1/L2... unless some kind of really weird voltage phase shift is going on. 

Any answers that could educate me in the aspects of AC electrical power that might explain this would be greatly appreciated!