

Volt Drop

Hmm...Good day everybody.

A couple of questions regarding volt drop on a sub main cable.

1st) Is 5% the correct figure?

2nd) For a 3 phase supply would the volt drop be calculated on 3 single phase supplies? i.e. 230V per phase.

3rd) Sorry, there are three.

The sub main is a 4-core 70mm² XLPE cable on a 150m run.

The calc I've done is: max drop 11.5V, volt drop 0.6 mV/A/m, length 150m, which gives a maximum design current of 127.8 Amps.
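
Here's the arithmetic spelled out (a minimal check, taking the quoted 0.6 mV/A/m figure for this cable as given):

```python
# Rearranging the volt-drop formula Vd = (mV/A/m) x I x L / 1000
# to find the maximum current for a permitted drop.
mv_per_a_per_m = 0.6     # tabulated volt-drop figure quoted above
length_m = 150.0         # run length
max_drop_v = 0.05 * 230  # 5% of 230V nominal = 11.5V

max_current_a = max_drop_v / (mv_per_a_per_m * length_m / 1000)
print(f"Maximum design current: {max_current_a:.1f} A")  # 127.8 A
```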

The thing that's always puzzled me is that the voltage at the supply is around 245V.

So even with an 11.5V drop it's not even 230V.

So how does this work?


Thank you.

  • In my view, the voltage drop should be calculated for the entire consumer's installation, from the origin to the far point of the longest final circuit, and not just for the submain. The total voltage drop should not normally exceed 5% for power and 3% for lighting.

    It is up to the designer as to how this is apportioned between the sub main and the final circuits. 2% in the sub main and then 1% in lighting final circuits and 3% in power final circuits might be a start.

    These figures are not absolute requirements, and in some circumstances engineering judgement may be applied to permit a larger voltage drop. If the supply is from private generating plant, or from an HV DNO supply via a private transformer, then a larger figure may well be acceptable.

    For an installation supplied at low voltage from public mains, I would normally keep within the accepted figures. The fact that the present supply voltage is generous is of little relevance for a public LV supply. What if the DNO reduces it to 220 volts, which it might?


    For a three phase submain feeding single phase loads, the intention should be to ensure that the single phase line-to-neutral voltage at the far end of the "worst" final circuit will be within the accepted figures of 3% or 5%. If the load is reasonably balanced then one might reasonably calculate the voltage drop in the sub main on a 3-phase basis. Alternatively, calculate on a single-phase basis, but allow only for the voltage drop in the phase conductor, not the neutral. With a reasonably balanced load, the current in the neutral core, and therefore the drop therein, should be very small.
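
    As a minimal sketch of that end-to-end check (the submain figures are taken from the question; the final-circuit numbers are made up purely for illustration):

    ```python
    # End-to-end volt-drop check as percentages: submain on a 3-phase basis
    # (against 400V) plus the "worst" single-phase final circuit (against 230V).
    # For a balanced load the % drop is the same line-to-line and line-to-neutral,
    # so the two percentages can simply be added.
    # The 18 mV/A/m, 20A and 12m final-circuit figures are illustrative only.

    def drop_v(mv_per_a_per_m, current_a, length_m):
        # BS 7671-style volt drop: (mV/A/m) x I x L / 1000
        return mv_per_a_per_m * current_a * length_m / 1000

    submain_pct = drop_v(0.6, 100.0, 150.0) / 400 * 100  # balanced 3-phase submain
    final_pct = drop_v(18.0, 20.0, 12.0) / 230 * 100     # single-phase final circuit

    total_pct = submain_pct + final_pct
    print(f"submain {submain_pct:.2f}% + final {final_pct:.2f}% "
          f"= {total_pct:.2f}% (against a 5% limit for power)")
    ```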

  • 1st) Is 5% the correct figure?



    You need to consider the overall voltage drop from the origin to the appliance/luminaire - you can't just apply the percentage to one length of cable in the middle. So if your submain fed lighting you might need to achieve a maximum of 3% voltage drop overall - so you might for example have to share the 3% (or 6.9V) between the submain and final circuit - perhaps 1% for the final circuit and 2% for the submain - or some other split depending on the relative lengths of the circuits.
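
    As a quick sketch of that budget in volts (the 2% + 1% split is just one possible apportionment):

    ```python
    # 3% lighting budget at 230V nominal, shared between submain and final circuit.
    nominal_v = 230.0
    print(f"3% overall lighting budget: {0.03 * nominal_v:.1f}V")  # 6.9V
    print(f"  2% for the submain:       {0.02 * nominal_v:.1f}V")  # 4.6V
    print(f"  1% for the final circuit: {0.01 * nominal_v:.1f}V")  # 2.3V
    ```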


    Note however that the 3%, 5% etc. in the regulations are simple "deemed to comply" values - it is permitted to use other (larger) values if you know the appliances will cope with the resulting (low) voltage. The 3% value for lighting relates historically to filament lamps and probably isn't that relevant if you know all your lighting is LED or HF fluorescent with ballasts that'll easily correct for lower supply voltages.

     

    2nd) For a 3 phase supply would the volt drop be calculated on 3 single phase supplies? i.e. 230V per phase.



    If you were feeding balanced 3-phase loads - so there'd be no current flowing in the N conductor and only the line conductors would suffer voltage drop - then you could use the 3-phase v.d. figures from the tables based on a percentage of 400V. If however you had a mix of single phase loads which could be unbalanced - in the worst case all the loads on one phase switched on and none on the others - so the N carried as much current as the L - then yes, treat it as if you were supplying a single phase load on each line and use the single phase voltage drop figures against a percentage of 230V.
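
    A minimal sketch of the two cases (assuming, for the same cable, a single-phase mV/A/m figure of 2/√3 times the three-phase figure, which holds where reactance is negligible; check the actual tabulated values for a real design):

    ```python
    import math

    # Balanced 3-phase vs worst-case unbalanced single-phase volt drop for the
    # same submain. The 0.6 mV/A/m three-phase figure is from the question.
    mv3 = 0.6                     # three-phase mV/A/m
    mv1 = mv3 * 2 / math.sqrt(3)  # ~0.69 mV/A/m single-phase (assumption: L and
                                  # N both carry I, reactance negligible)
    i_a, l_m = 100.0, 150.0       # illustrative load current and run length

    balanced_pct = (mv3 * i_a * l_m / 1000) / 400 * 100    # % of 400V line voltage
    unbalanced_pct = (mv1 * i_a * l_m / 1000) / 230 * 100  # % of 230V line-to-neutral

    print(f"balanced 3-phase: {balanced_pct:.2f}%, "
          f"worst-case 1-phase: {unbalanced_pct:.2f}%")
    ```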

     

    The thing that's always puzzled me is that the voltage at the supply is around 245V.

    So even with an 11.5V drop it's not even 230V.

    So how does this work?



    Supplies are 230V nominal (i.e. in name only) - and will vary considerably with time and circumstances. Legislation says that public supplies must be within the range 230V +10% / -6% - i.e. between 216.2V and 253V. Most UK supplies tend to be a little over 230V due to the DNOs originally designing for 240V (+/- 6%) - but nothing's guaranteed within those ranges. When you or other consumers are drawing large currents the voltage drop in the supply cables will increase and your supplied voltage will drop. Likewise customers further away from the substation will tend to have lower supply voltages than those closer. So for the sake of standardization and robustness (and simplicity) we just design against the nominal voltage.


    You might find tomorrow that your supply comes in at 220V and your 11.5V v.d. brings it down to 208.5V at the appliance.
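
    As a quick check of that band and that scenario:

    ```python
    nominal_v = 230.0
    low_v = nominal_v * (1 - 0.06)   # -6%  -> 216.2V
    high_v = nominal_v * (1 + 0.10)  # +10% -> 253.0V
    print(f"Permitted public supply range: {low_v:.1f}V to {high_v:.1f}V")

    supply_v, vd = 220.0, 11.5
    print(f"220V supply minus 11.5V drop = {supply_v - vd:.1f}V at the appliance")
    ```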


       - Andy.