I recently purchased two little voltmeters. They look like the sort that would go in a control or instrument panel, and they are connected with just two wires, which also provide the operating supply (they light up green and red). However, the green one states it will work between 20 and 500 volts and the red one between 60 and 480 volts. When they are both on, the green one normally indicates around 241 volts while the red one shows 235 volts. Why the discrepancy? I know it's not much, but it makes you wonder if one of them is lying.

Secondly, I've noticed that the green one tracks voltage changes faster than the red one, and a few times the green one has jumped down to 238 then back up to 241 several times while the red one stayed the same; I think I can see a slight flicker in my filament lamps when this is happening. Incidentally, both meters are connected to the same plug, a 2-pin 5-amp one.
Sounds like some digital meters I had a look at on a famous internet auction site...
A 6V difference in readings could mean that each is only 3V off, which on a 500V scale is just 0.6% of f.s.d. and probably isn't too bad (or equally they could both be off by a much larger amount). If they're digital, they'll sample the voltage at intervals, and the analogue-to-digital converter (ADC) will work in certain sized steps, usually the full range divided by some power of 2, e.g. into 256, 1024, 2048, or 4096 steps. The greater the number of steps, the more expensive the ADC, so you get what you pay for. Supplying the meters with a variable voltage (say, through an old-fashioned variac) might be educational.

Likewise, the sample rate could differ between the two meters, which might explain why one is more likely to respond to short-duration glitches. There could also be some 'smoothing' going on in there.
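If you want a feel for how much the ADC step size alone contributes, here's a rough Python sketch. The 0-500V range and the bit depths are purely illustrative assumptions (we don't know the actual spec of either meter), and it models an ideal converter with no calibration error:

```python
# Minimal sketch: how ADC resolution limits what a cheap digital panel meter can display.
# Assumes a 0-500 V input range and an ideal, perfectly calibrated ADC.

def adc_reading(true_volts, full_scale=500.0, steps=1024):
    """Quantise a voltage to the nearest ADC step, as a simple digital meter might."""
    step_size = full_scale / steps           # volts per ADC count
    counts = round(true_volts / step_size)   # the count the ADC reports
    return counts * step_size                # the voltage the display would show

for bits in (8, 10, 12):
    steps = 2 ** bits
    print(f"{bits}-bit ADC: step = {500/steps:.3f} V, "
          f"241.3 V reads as {adc_reading(241.3, steps=steps):.2f} V")
```

Even an ideal 8-bit converter can only resolve to the nearest ~2V on a 500V range, so a few volts of disagreement between two cheap meters isn't surprising even before you add calibration error on top.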