
Loss of mains for type tested generation units

Hi all,


I'm reviewing the Loss of Mains protection for a number of old (pre-2018) G59 type-tested inverters. While site-commissioned units are required to have the LOM detection method noted, the "G59 certificate" only records the operation time. I haven't (yet) managed to find a statement that a particular method must be used; G59 only seems to say that the parameters must be XX for each method (depending on age) and leaves the choice to the manufacturer. Does anyone know where it states which method is used? I'm also trying manufacturers, but not all of them still exist!


It's always been a niggle at the back of my mind, but it's not been an issue... Now with the ENA's requirement to update old settings it's relevant.


Ta,

Jam
  • I've a feeling I've got the wrong end of the stick here, but I'll try anyway.... (it'll bring the thread to the top of the list at least)


    As I see it, an inverter will disconnect itself from the grid whenever it decides that the mains has been lost - and there are a number of criteria for making that decision: voltage, frequency, rate-of-change-of-frequency (RoCoF), vector shift and so on. The inverter will continually monitor all such variables and disconnect when any one (or deemed combination) goes out of bounds.


    When commissioning on site, however, I would have thought it would be difficult to simulate many of those conditions (you obviously can't mess with the real grid, so you'd almost need your own little power plant with controllable frequency to simulate nasty grid conditions) - so I would have imagined that the test would just be to open a handy disconnector between the inverter and the grid and time how long it takes the inverter to notice and disconnect - in effect simulating an extreme case for all the monitored variables at the same time. So the on-site test produces a time, but it can't be related to any particular detection method.


        - Andy.
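To make the "monitor everything, trip on any out-of-bounds variable" idea above concrete, here's a rough Python sketch. All the thresholds (voltage window, frequency window, RoCoF and vector-shift limits) are illustrative placeholders, not actual G59 settings - real inverter firmware, stage timings and limits vary by manufacturer and by the issue of G59 the unit was tested against.

```python
# Illustrative sketch of a combined Loss of Mains check: trip if ANY
# monitored variable leaves its band. Thresholds are placeholders only,
# not actual G59 protection settings.

NOMINAL_HZ = 50.0

def lom_trip(voltage_pu, freq_hz, prev_freq_hz, dt_s,
             rocof_limit_hz_s=0.2, vector_shift_deg=None,
             vs_limit_deg=6.0):
    """Return True if any monitored variable is out of bounds."""
    if not (0.87 <= voltage_pu <= 1.10):          # under/over-voltage band
        return True
    if not (47.5 <= freq_hz <= 51.5):             # under/over-frequency band
        return True
    rocof = abs(freq_hz - prev_freq_hz) / dt_s    # rate of change of frequency
    if rocof > rocof_limit_hz_s:
        return True
    if vector_shift_deg is not None and abs(vector_shift_deg) > vs_limit_deg:
        return True
    return False

# Healthy grid: stays connected
print(lom_trip(1.00, 50.00, 50.00, 0.02))
# Sudden frequency jump between samples: RoCoF check trips
print(lom_trip(1.00, 50.50, 50.00, 0.02))
```

This also illustrates Andy's point about the on-site test: opening a disconnector collapses voltage and frequency together, so several of these checks fire at once and the recorded time tells you nothing about which method detected it.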