Maximum voltage drop on SWA runs to outbuildings: am I over-engineering the solution?

I do quite a few EV charger installs, and more and more of them seem to end up involving long SWA cable runs to garages etc., with voltage drop driving the cable size.

In many cases I am running an EV charger, other bits in the garage and garage lighting circuits from the same cable run.

According to my understanding of the regulations, lighting circuits are only allowed a 3% voltage drop between the incoming supply and the accessory, compared with 5% for power. Voltage drop on the garage lighting therefore ends up being the driver for the size of the garage supply cable.

I get the impression that others bend the rules, and I have certainly found some installs that do not comply even with the 5% guidance. I appreciate that most of the time we can get away with it, but I don't want to end up with upset customers and perhaps having to replace an expensive cable.

But by calculating this way, am I being unnecessarily stringent? I realise this assumes the worst-case supply voltage from the grid, which I feel I have to stick with, but are typical modern LED light fittings more tolerant of a low supply voltage than filament lamps were, making the guidance in BS 7671 out of date?
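As a rough sketch of the sort of check I am doing (the run length, load current and mV/A/m figures below are illustrative assumptions of the kind tabulated in BS 7671 Appendix 4, not a real design):

```python
# Rough volt-drop sketch for a single SWA sub-main feeding an EV charger plus
# the garage lighting. The mV/A/m figures are illustrative values of the kind
# tabulated in BS 7671 Appendix 4 for copper conductors; use the real tables
# (plus grouping and temperature factors) for an actual design.

NOMINAL_V = 230.0        # nominal supply voltage
LIMIT_LIGHTING = 0.03    # 3% permitted drop for lighting
LIMIT_POWER = 0.05       # 5% permitted drop for other uses

# Assumed volt-drop figures (mV per amp per metre) for two-core copper SWA.
MV_PER_A_PER_M = {6: 7.3, 10: 4.4, 16: 2.8, 25: 1.75}

def volt_drop(csa_mm2: int, current_a: float, length_m: float) -> float:
    """Approximate drop in volts: (mV/A/m * I * L) / 1000."""
    return MV_PER_A_PER_M[csa_mm2] * current_a * length_m / 1000.0

run_length_m = 40.0      # assumed run to a detached garage
design_current_a = 32.0  # a 7.2 kW charger dominates the load

for csa in sorted(MV_PER_A_PER_M):
    drop = volt_drop(csa, design_current_a, run_length_m)
    frac = drop / NOMINAL_V
    print(f"{csa:>2} mm2: {drop:5.2f} V ({frac:4.1%})  "
          f"power(5%): {'OK' if frac <= LIMIT_POWER else 'FAIL'}  "
          f"lighting(3%): {'OK' if frac <= LIMIT_LIGHTING else 'FAIL'}")
```

On these assumed figures 6 mm² clears the 5% limit but fails the 3% lighting limit, so sharing the lighting on the same sub-main pushes the cable up a size, and that is before any drop on the lighting final circuit itself is added.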

  • Just found this article 

    https://www.electronicspoint.com/forums/threads/running-led-bulb-on-lower-voltage.275364/ 

    Based on this, with a very simple driver a 10% difference in voltage has a very limited impact on output, but as we approach a 20% to 25% drop the impact on output is considerable.

    Therefore there is a risk of problems, and the question is what voltage range the device has been optimised for. If it's centred on 230V there is probably quite a lot of wiggle room, but if it's 240V there is much less. If I were designing a low-cost LED light I would certainly be thinking about reliability as well as performance over the required voltage range, and might design for a higher voltage to give better reliability, sacrificing light output at lower voltages, which are unusual in the UK at least. A rough worst-case calculation is sketched below.
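Assuming the UK statutory supply tolerance of 230V +10%/-6% at the origin, the cable drop limits discussed above, and two illustrative design-centre voltages:

```python
# Worst-case voltage at a garage lamp, assuming the UK statutory supply
# tolerance of 230 V +10% / -6% at the origin, with the permitted volt drop
# expressed as a fraction of nominal voltage. The "design centre" voltages
# are just the two cases discussed above.

NOMINAL = 230.0
SUPPLY_MIN = NOMINAL * (1 - 0.06)          # 216.2 V at the origin, worst case

for cable_frac in (0.03, 0.05):            # 3% lighting limit vs 5% power limit
    drop_v = NOMINAL * cable_frac          # the limit is a fraction of nominal
    at_lamp = SUPPLY_MIN - drop_v
    for centre in (230.0, 240.0):          # what the lamp was optimised for
        shortfall = 1 - at_lamp / centre
        print(f"{cable_frac:.0%} cable drop: lamp sees {at_lamp:.1f} V, "
              f"{shortfall:.1%} below a {centre:.0f} V-centred design")
```

On these figures a 230V-centred lamp only just stays within a 10% shortfall at the 3% limit, a 240V-centred one is past 10% in every case, and everything remains well short of the 20% to 25% region described in the linked thread.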

  • Indeed, the lamps that take 'universal' mains (really meaning terrestrial, but let's not quibble), say 85 to 250V, have an SMPS inside that regulates for constant LED current, not LED voltage, and these are pretty bombproof.

    But the simpler ones (230V only) are closer to the internals of these ones, which are a Philips special for Dubai, but not that different from their Euroland offering, except for being a bit better engineered.

    I have the Osram variants of this LED filament design, and one of these is flickery, exacerbated I think because the LEDs have a very steep forward voltage-current curve, so a small change in voltage is a large change in current, and as photons are almost proportional to electrons, a large change in light output too (a rough sketch of that steepness follows below).

    However, the fact that two 'identical' lamps behave differently suggests that the design is not exactly fixed yet.

    (and I have only cut one open, the other may not be the same inside.)

    Mike
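A very crude illustration of that steepness, using the ideal (Shockley) diode equation with invented values for saturation current, ideality factor and forward voltage; a real lamp's series resistance and any current limiting will soften this considerably, but the direction is the point:

```python
# Crude illustration of the steep LED forward V-I curve using the ideal
# (Shockley) diode equation. Saturation current, ideality factor and the
# nominal junction voltage are invented values; a real lamp's series
# resistance and any current limiting soften this considerably.
import math

I_S = 1e-12     # assumed saturation current (A)
N = 2.0         # assumed ideality factor
V_T = 0.02585   # thermal voltage at roughly 25 C (V)

def led_current(v_junction: float) -> float:
    """Ideal-diode current for a single junction."""
    return I_S * (math.exp(v_junction / (N * V_T)) - 1)

V_NOM = 2.9                        # assumed nominal forward voltage per junction
for delta in (0.0, -0.02, -0.05):  # 0%, -2%, -5% change in junction voltage
    ratio = led_current(V_NOM * (1 + delta)) / led_current(V_NOM)
    print(f"{delta:+.0%} on the junction voltage -> {ratio:.0%} of nominal "
          f"current (and roughly the same fraction of light)")
```

Even a couple of percent off the junction voltage, on this idealised model, cuts the current (and so the light) dramatically, which is why a design with no current regulation at all tends to flicker with the mains.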