
Maximum voltage drop on SWA runs to outbuildings: am I over-engineering the solution?

I do quite a few EV charger installs, and more and more seem to end up involving long SWA cable runs to garages etc., with voltage drop driving cable size.

In many cases I am running an EV charger, other bits in the garage, and garage lighting circuits from the same cable run.

According to my understanding of the regulations, lighting circuits are only allowed a 3% voltage drop between the incoming supply and the accessory, compared with 5% for power. Therefore voltage drop for the garage lighting ends up being the driver for the size of the garage supply cable.
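As a rough illustration of how this drives cable size (a sketch with assumed figures, not from any particular install: a 32A single-phase charger, a 50m run, and typical BS 7671 mV/A/m values for two-core copper cable):

```python
# Rough single-phase voltage-drop check for an SWA run to an outbuilding.
# mV/A/m figures are the typical BS 7671 tabulated values for two-core
# 70 degC copper cable; verify against the actual table for your cable.
MV_PER_A_M = {4: 11.0, 6: 7.3, 10: 4.4, 16: 2.8}

def volt_drop(size_mm2, current_a, length_m, nominal_v=230.0):
    """Return (volts dropped, percent of nominal)."""
    vd = MV_PER_A_M[size_mm2] / 1000 * current_a * length_m
    return vd, 100 * vd / nominal_v

for size in (10, 16):
    vd, pct = volt_drop(size, current_a=32, length_m=50)
    print(f"{size}mm2: {vd:.2f} V drop ({pct:.1f}% of 230 V)")
# 10mm2: 7.04 V drop (3.1% of 230 V)  -> over the 3% lighting limit, under 5%
# 16mm2: 4.48 V drop (1.9% of 230 V)  -> within 3%
```

Against a 3% limit of 6.9V on 230V, the 10mm² run just fails, while against the 5% power limit (11.5V) it is comfortably inside, which is exactly the situation where the lighting circuit ends up dictating the cable size.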

I get the impression that others bend the rules, and I have certainly found some installs that do not comply even with the 5% guidance. I appreciate that most of the time we can get away with it, but I don't want to end up with upset customers and maybe having to replace an expensive cable.

But by calculating this way, am I being unnecessarily stringent? I realise this assumes worst-case supply voltage from the grid, which I feel I have to stick with, but, for example, are typical modern LED light fittings more tolerant of a low supply voltage than filament lamps, so that the guidance in 7671 is actually out of date?
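To put numbers on that worst case (this assumes the UK statutory supply tolerance of 230V +10%/-6% at the origin):

```python
# Worst-case voltage at a light fitting, assuming the UK statutory
# supply tolerance of 230 V +10% / -6% at the origin.
nominal = 230.0
supply_min = nominal * 0.94                    # 216.2 V at the cutout
for limit in (0.03, 0.05):
    at_accessory = supply_min - nominal * limit
    print(f"{limit:.0%} drop: {at_accessory:.1f} V at the accessory")
# 3% drop: 209.3 V
# 5% drop: 204.7 V
```

Many LED drivers are specified for a wide input range, so a fitting seeing around 205V will usually work in practice, but that is an argument about consequences rather than about what 7671 permits.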

  • I agree that Broadage has slipped up with the decimal places, but I also agree the cost of the losses should be a consideration.

    I am talking about situations where using armoured cable is the only practical solution because the outbuildings are a long way from the house, or getting the cable through the house is a significant challenge. In most cases the voltage drop with just the charger running is within 3%, but then there are other loads on a socket circuit: tumble dryers, power tools and, worst case, hot tubs. In most cases these run for less time than the charger, so the cost of their losses is less of an issue.

    I agree that for short runs on dedicated charger circuits, moving from, say, 4mm² to 6mm² cable makes good sense, but if we are looking at a 50m cable run and deciding between 10mm² and 16mm² cable, with over £100 cost difference, the decision is less clear.

    With the attached calculation, if someone drives 12,000 miles per year they will spend about 417 hours charging; a 5% cable loss equates to about £46 worth of electricity per year. If a larger cable were used to bring the loss down to 3%, they would save about £18.
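    The spreadsheet itself isn't reproduced here, but a minimal sketch that roughly matches those figures, with the charger power, vehicle efficiency and tariff all being my assumptions rather than values from the sheet:

    ```python
    # Rough annual cost of cable losses on an EV charger run.
    # Assumptions (mine, not from the attached sheet): 7.4 kW charger,
    # 3.9 miles/kWh vehicle efficiency, 30 p/kWh tariff. Treats a 5%
    # volt drop as roughly 5% of the energy being lost in the cable.
    miles_per_year = 12_000
    energy_kwh = miles_per_year / 3.9            # ~3077 kWh/year
    hours_charging = energy_kwh / 7.4            # ~416 hours/year
    cost_5pc = 0.05 * energy_kwh * 0.30          # ~£46/year lost at 5%
    saving_3pc = 0.02 * energy_kwh * 0.30        # ~£18/year saved at 3%
    payback_years = 100 / saving_3pc             # ~5.4 years on a £100 upgrade

    print(f"{hours_charging:.0f} h charging, £{cost_5pc:.0f} lost at 5%, "
          f"£{saving_3pc:.0f}/yr saved at 3%, ~{payback_years:.1f} yr payback")
    ```

    On those assumptions, a £100 step up in cable size pays back in roughly five to six years at 12,000 miles per year, and more than ten years at 6,000.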

    From talking to customers, at the moment most are probably closer to 6,000 miles per year, although this may change as people get back into more normal travel patterns post-Covid. I also suspect that high-mileage drivers still view EVs as impractical, which I assume will change as range increases.

    [Attachment: XLSX spreadsheet]

    So, in conclusion, if my calculations are correct:

    Moving from 4mm² to 6mm² for shorter cable runs probably makes sense (especially if looking at non-SWA cables).

    Beyond that, from a cost-of-voltage-drop-losses point of view, selecting a larger cable doesn't make sense in my opinion, although having a 10mm² earth run out to outbuildings has some advantages if there is a desire to export a TN-C-S earth now or in the future.

    I also agree that it's important to stay within the 5% and to make allowance for future loads, especially as open-PEN detectors can be upset by low supply voltages.
