John Miller
Hello,
This question is more to do with electricity than electronics per se - I
hope that's okay!
The quality of the electricity supply to my house is terrible (North
Carolina, standard single phase). For *ages*, light bulbs have been popping
very prematurely. Last week, I bought a used 'scope from eBay and, looking
for a quick signal to test it with, stuck a x10 probe into a power outlet.
I did not see a nice sine wave but a very distorted one - like you'd get if
you overdrive a vacuum tube. This also explained why an old NAD amplifier
I was testing was consistently yielding DC rail voltages approx. 20% higher
than specified.
Although the peak line voltage is about right (153V, which would be approx.
108V rms for a sine wave), the true rms value is about 130V. I've reported
this to the
local utility company. The person in the Power Quality section didn't
understand the concept of peak vs. rms and also stated that an engineer
would come out to run some tests on the neutral bonding but that they don't
have the means to look at the shape of the incoming waveform (surely not!).
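The peak-vs-rms arithmetic above can be sketched quickly. For a pure sine, rms = peak/√2, so a 153V peak implies roughly 108V rms; a flat-topped (clipped) wave of the same peak packs more energy and so reads higher on a true-rms meter. The 85% clipping level below is an arbitrary illustration, not a measurement:

```python
import math

def rms(samples):
    """True RMS of a sampled waveform."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

N = 10000
peak = 153.0  # volts, the scope reading
sine = [peak * math.sin(2 * math.pi * k / N) for k in range(N)]

# Flat-topped version: clip excursions at 85% of peak (arbitrary level),
# then rescale so the flattened top sits back at 153 V.
clip = 0.85 * peak
flat = [max(-clip, min(clip, s)) * (peak / clip) for s in sine]

print(f"pure sine: peak {peak:.0f} V -> rms {rms(sine):.1f} V")  # ~108 V
print(f"flat-top : peak {peak:.0f} V -> rms {rms(flat):.1f} V")  # higher rms, same peak
```

Either way, a peak of 153V with a true rms of 130V (crest factor ~1.18, versus 1.414 for a sine) points at a badly distorted waveform rather than a meter error.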
Anyway, I grew up in the UK and so I don't know the regulations in the US
specifying the quality of power supply (given the never-ending brown-outs,
etc., I suspect they are not as strict).
So - the question: what's the right way to get this seen to and does anyone
have a link to the corresponding NFPA/NEC regulations?
Thanks,
John.