
Volt Drop

Hmm... Good day, everybody.

Couple of questions regarding volt drop on a sub main cable.

1st) Is 5% the correct figure?

2nd) For a 3 phase supply, would the volt drop be calculated as for 3 single phase supplies, i.e. 230V per phase?

3rd) Sorry, there are three.

The sub main is a 4-core 70mm² XLPE cable on a 150m run.

Calcs I've done are: max drop 11.5V, mV/A/m 0.6, length 150m, which gives a maximum design current of 127.8A.
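In sketch form, that working is just a rearrangement of the usual mV/A/m formula, with the values above:

```python
# Volt drop (V) = tabulated mV/A/m x current (A) x length (m) / 1000,
# rearranged to find the largest current that stays within the limit.

max_drop_v = 11.5      # 5% of 230V
mv_per_a_m = 0.6       # tabulated volt-drop figure for the cable
length_m = 150

max_current_a = max_drop_v * 1000 / (mv_per_a_m * length_m)
print(f"max current: {max_current_a:.1f}A")   # 127.8A
```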

Thing that's always puzzled me is that the voltage at the supply is around 245V.

So even with an 11.5V drop it's not even down to 230V.

So how does this work?


Thank you.


  • 1st) Is 5% the correct figure?



    You need to consider the overall voltage drop from the origin to the appliance/luminaire - you can't just apply the percentage to one length of cable in the middle. So if your submain fed lighting, you might need to achieve a maximum of 3% voltage drop overall, and would then have to share that 3% (or 6.9V) between the submain and the final circuit - perhaps 1% for the final circuit and 2% for the submain, or some other split depending on the relative lengths of the circuits.
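    A minimal sketch of that sharing, assuming the example 2%/1% split (the names and split are just for illustration; any split whose total stays within the overall limit is fine):

```python
# Sharing a 3% overall lighting allowance between a submain and a
# final circuit, against the 230V nominal voltage.

NOMINAL_V = 230

overall_pct = 3      # overall limit, origin to luminaire
submain_pct = 2      # example share for the submain
final_pct = 1        # example share for the final circuit

overall_v = NOMINAL_V * overall_pct / 100    # 6.9V
submain_v = NOMINAL_V * submain_pct / 100    # 4.6V
final_v = NOMINAL_V * final_pct / 100        # 2.3V

# Integer percentages keep the check exact
assert submain_pct + final_pct <= overall_pct
print(f"submain {submain_v}V + final {final_v}V within {overall_v}V overall")
```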


    Note however that the 3%, 5% etc. in the regulations are simply "deemed to comply" values - it is permitted to use other (larger) values if you know the appliances will cope with the resulting (lower) voltage. The 3% value for lighting relates to filament lamps and probably isn't that relevant if you know all your lighting is LED or HF fluorescent with ballasts that'll easily correct for lower supply voltages.

     

    2nd) For a 3 phase supply, would the volt drop be calculated as for 3 single phase supplies, i.e. 230V per phase?



    If you were feeding balanced 3-phase loads - so there'd be no current flowing in the N conductor, and only the line conductors would suffer voltage drop - then you could use the 3-phase v.d. figures from the tables, based on a percentage of 400V. If however you had a mix of single phase loads which could be unbalanced - in the worst case all the loads on one phase switched on and none on the others, so the N carried as much current as the L - then yes, treat it as if you were supplying a single phase load on each line and use the single phase voltage drop figures against a percentage of 230V.
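    A sketch contrasting the two cases - note the mV/A/m figures below are illustrative placeholders, not values taken from a specific table:

```python
# Balanced 3-phase case: drop assessed against 400V, line conductors only.
# Worst-case single-phase case: drop assessed against 230V, L and N both
# carrying the full current.

current_a = 127.8
length_m = 150

mv_3ph = 0.52   # hypothetical 3-phase tabulated figure
drop_3ph_v = mv_3ph * current_a * length_m / 1000
print(f"balanced 3-phase: {drop_3ph_v:.1f}V "
      f"({100 * drop_3ph_v / 400:.1f}% of 400V)")

mv_1ph = 0.60   # hypothetical single-phase tabulated figure
drop_1ph_v = mv_1ph * current_a * length_m / 1000
print(f"worst-case single-phase: {drop_1ph_v:.1f}V "
      f"({100 * drop_1ph_v / 230:.1f}% of 230V)")
```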

     

    Thing that's always puzzled me is that the voltage at the supply is around 245V.

    So even with an 11.5V drop it's not even down to 230V.

    So how does this work?



    Supplies are 230V nominal (i.e. in name only) - and will vary considerably with time and circumstances. Legislation says that public supplies must be within the range 230V +10% / -6% - i.e. between 216.2V and 253V. Most UK supplies tend to be a little over 230V due to the DNOs originally designing for 240V (+/- 6%) - but nothing's guaranteed within those ranges. When you or other consumers are drawing large currents, the voltage drop in the supply cables will increase and your supplied voltage will fall. Likewise customers further away from the substation will tend to have lower supply voltages than those closer. So for the sake of standardization and robustness (and simplicity) we just design against the nominal voltage.


    You might find tomorrow that your supply comes in at 220V, and your 11.5V v.d. brings it down to 208.5V at the appliance.
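    A quick sketch of that arithmetic, assuming the statutory +10% / -6% limits above (the 220V day is hypothetical):

```python
# Statutory limits on a nominal 230V supply, and the worst case the
# appliance might then see after the submain's 11.5V drop.

nominal_v = 230
lower_v = nominal_v * (1 - 0.06)   # 216.2V
upper_v = nominal_v * (1 + 0.10)   # 253.0V
print(f"permitted supply range: {lower_v:.1f}V to {upper_v:.1f}V")

supply_today_v = 220.0             # the hypothetical low-voltage day
drop_v = 11.5
print(f"at the appliance: {supply_today_v - drop_v:.1f}V")   # 208.5V
```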


       - Andy.