I assume that you are able to access information from National Grid and the other power companies? ERA Technology (with its roots in the supply industry) would be an obvious point of contact.
From 1980 to 1984, when I worked for CEGB Transmission (132 kV and above), now National Grid, "Thermo-vision" surveys were being introduced in substations and, from helicopters, for transmission lines. Hot spots would be investigated at the first outage opportunity. In those days, if you wanted to measure the energy being wasted, Ductor (low-resistance ohmmeter) readings taken before and after a repair would give the before-and-after I²R losses. Since the price of the energy is known, a cost-benefit analysis incorporating the cost of repair is fairly simple, as would be, with a modern sensibility, the CO2 contribution or any other derivative; a rough sketch of that sum follows below. "System losses" were not a particularly high priority at the time; security (aka reliability) of supply was, especially during the intensely political times of 1984.
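As an illustration of that cost-benefit sum, here is a minimal sketch in Python. All of the figures (load current, Ductor readings, energy price, repair cost, emission factor) are hypothetical placeholders assumed for the example, not real plant data; the only substance is the I²R arithmetic itself.

```python
# Minimal sketch: cost-benefit of repairing a hot joint from before/after
# Ductor (low-resistance ohmmeter) readings. All figures are hypothetical
# placeholders, not real CEGB/National Grid data.

LOAD_CURRENT_A = 1000.0       # assumed continuous load current through the joint (A)
R_BEFORE_OHM = 250e-6         # assumed Ductor reading before repair (250 micro-ohms)
R_AFTER_OHM = 50e-6           # assumed Ductor reading after repair (50 micro-ohms)
ENERGY_PRICE_PER_KWH = 0.10   # assumed energy price (currency per kWh)
REPAIR_COST = 2000.0          # assumed repair cost, including the outage
CO2_KG_PER_KWH = 0.4          # assumed grid emission factor (kg CO2 per kWh)
HOURS_PER_YEAR = 8760

def i2r_loss_watts(current_a: float, resistance_ohm: float) -> float:
    """P = I^2 * R: resistive loss dissipated in the joint."""
    return current_a ** 2 * resistance_ohm

loss_before_w = i2r_loss_watts(LOAD_CURRENT_A, R_BEFORE_OHM)
loss_after_w = i2r_loss_watts(LOAD_CURRENT_A, R_AFTER_OHM)
saved_kw = (loss_before_w - loss_after_w) / 1000.0

annual_kwh_saved = saved_kw * HOURS_PER_YEAR
annual_saving = annual_kwh_saved * ENERGY_PRICE_PER_KWH
annual_co2_kg = annual_kwh_saved * CO2_KG_PER_KWH
payback_years = REPAIR_COST / annual_saving if annual_saving > 0 else float("inf")

print(f"Loss before repair: {loss_before_w:.0f} W, after: {loss_after_w:.0f} W")
print(f"Energy saved per year: {annual_kwh_saved:.0f} kWh")
print(f"Annual saving: {annual_saving:.2f}, CO2 avoided: {annual_co2_kg:.0f} kg")
print(f"Simple payback: {payback_years:.1f} years")
```

With these made-up numbers the joint wastes 250 W before repair and 50 W after, saving about 1,750 kWh a year; the same structure extends to any "derivative" you care to price, CO2 included.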
I assume that you are seeking temperature-versus-reliability studies, where the risk of failure or degradation can be weighed against the cost of actions. On large supply plant such as I referred to, catastrophic failures were rare, but the costs and potential risks of a "hot spot" in an oil-filled device were considerable, as were those of anything that could cause catastrophic mechanical failure, sudden release of electrical fault energy, etc. That weighing can be caricatured as below. This is at the extreme and rather obvious end of thermal work; hopefully others will point you at something in electronics etc. (after all, we had exploding batteries recently).
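Purely as an illustration of the risk-versus-cost weighing just described, here is a minimal expected-cost comparison, again in Python. The failure probability, consequence cost, and repair cost are all assumed placeholder figures, not drawn from any real study.

```python
# Sketch of weighing hot-spot risk against the cost of action: compare the
# expected annual cost of leaving the defect in service with the cost of a
# planned repair. All figures are hypothetical placeholders.

P_FAILURE_PER_YEAR = 0.02    # assumed annual probability of catastrophic failure
FAILURE_COST = 5_000_000.0   # assumed consequence cost (plant damage, outage, safety)
REPAIR_COST = 20_000.0       # assumed cost of a planned repair at the next outage

expected_annual_risk = P_FAILURE_PER_YEAR * FAILURE_COST

print(f"Expected annual risk cost if left in service: {expected_annual_risk:,.0f}")
print(f"Planned repair cost: {REPAIR_COST:,.0f}")
if expected_annual_risk > REPAIR_COST:
    print("Repair at the first outage opportunity is justified on risk grounds.")
else:
    print("Risk cost does not yet justify the repair; keep monitoring.")
```

Real studies would of course tie the failure probability to measured temperature and degradation data rather than a flat assumed figure, which is exactly the kind of correlation you are presumably after.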