Oliver Marshall of Frauscher UK began by giving a brief overview of the company, best known for its railway wheel detection and axle counting equipment. He then described the principles of what the oil and gas industries call Distributed Acoustic Sensing (DAS), which the company is developing for the railway industry as Frauscher Acoustic Sensing.
This technique sends a pulse of coherent light into an optical fibre, the pulse being absorbed in a termination at the far end. Because of inevitable minor variations in refractive index along the length of the fibre, Rayleigh scattering takes place, causing some of the light energy to be reflected back along the fibre, interfering with the advancing pulse in an additive or subtractive fashion depending on the relative phase of the two waves. The reflected light is then detected at the transmission end. By analysing the nature of the reflected light and its time of arrival, it is possible to relate it to the state of the fibre at specific points along its length. In effect this is a form of optical time domain reflectometry, similar to the methods used to locate faults in copper cables. However, it has been found that the geometry of optical fibres is minutely influenced by physical vibration, which in turn modulates the form of the Rayleigh scattering.
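The relationship between echo arrival time and position along the fibre is simple time-of-flight geometry. The sketch below (an illustration only, not Frauscher's implementation; the group refractive index is an assumed typical value for silica fibre) shows how a round-trip delay maps to a distance along the fibre.

```python
# Minimal sketch of the time-of-flight relation behind optical time domain
# reflectometry: light travels at c/n inside the fibre, so a backscatter echo
# arriving t seconds after launch originated at z = (c/n) * t / 2 along it.

C = 299_792_458.0        # speed of light in vacuum, m/s
GROUP_INDEX = 1.468      # assumed group refractive index of silica fibre

def scatter_position(round_trip_time_s: float, n: float = GROUP_INDEX) -> float:
    """Distance (m) along the fibre at which a backscatter echo originated."""
    return (C / n) * round_trip_time_s / 2.0

if __name__ == "__main__":
    # An echo received 100 microseconds after the pulse was launched
    # corresponds to a point roughly 10 km down the fibre.
    print(f"{scatter_position(100e-6) / 1000:.1f} km")
```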
Since the first use of this phenomenon in the 1960s, advances in laser technology, processing power and signal processing have made new applications possible.
Several railways are involved in projects to develop the concept for railway use. A typical project can make use of existing spare optics located in the trackside ducting, so-called ‘dark fibre’. Transmission/detection units are connected to 40 km of fibre on each side and fire a pulse equivalent to a 20 m length of fibre at a rate of 2,500 pulses per second. This provides sufficient resolution by distance and the ability to detect vibrations in the 10 Hz – 1 kHz range.
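A back-of-envelope check (illustrative only; the fibre refractive index is an assumption) shows why these figures hang together: the round trip to the end of 40 km of fibre takes a little under 400 µs, which caps the pulse rate at roughly 2,500 per second, and sampling each location at 2.5 kHz gives a Nyquist limit comfortably above the 1 kHz top of the band of interest.

```python
# Why 40 km of fibre, 2,500 pulses per second and a 1 kHz upper band fit together.

C = 299_792_458.0        # speed of light in vacuum, m/s
GROUP_INDEX = 1.468      # assumed group refractive index
FIBRE_LENGTH_M = 40_000  # 40 km monitored on each side of the unit

v = C / GROUP_INDEX                      # light speed inside the fibre
round_trip_s = 2 * FIBRE_LENGTH_M / v    # far-end echo must return before
max_pulse_rate_hz = 1 / round_trip_s     # the next pulse is launched

print(f"round trip for 40 km of fibre : {round_trip_s * 1e6:.0f} us")
print(f"maximum pulse rate            : {max_pulse_rate_hz:.0f} pulses/s")  # ~2,550
print(f"Nyquist limit at 2.5 kHz      : {2500 / 2:.0f} Hz")                 # 1,250 Hz
```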
The raw data from the detector is stored and then subjected to a Fast Fourier Transform, to extract frequency information from the time-domain signal, and then ‘morphologically cleaned’. This process is probably best understood by viewing the data in graphic form, which can be cleaned up by removing ‘outlier’ points and infilling gaps. The final stage involves setting appropriate thresholds for raising alarm conditions.
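A rough sketch of such a processing chain is shown below, using NumPy and SciPy: per-location FFT, band energy, a crude alarm threshold, and morphological opening/closing of the resulting distance-time image to remove isolated outliers and infill small gaps. The array shapes, band edges, threshold and structuring element are illustrative assumptions, not Frauscher's actual parameters.

```python
import numpy as np
from scipy import ndimage

PULSE_RATE_HZ = 2500          # samples per second at each fibre location
BAND = (10.0, 1000.0)         # frequency band of interest, Hz

def band_energy(raw: np.ndarray) -> np.ndarray:
    """raw: (locations, samples) backscatter signal for one time window.
    Returns per-location energy in the 10 Hz - 1 kHz band."""
    spectrum = np.abs(np.fft.rfft(raw, axis=1)) ** 2
    freqs = np.fft.rfftfreq(raw.shape[1], d=1.0 / PULSE_RATE_HZ)
    in_band = (freqs >= BAND[0]) & (freqs <= BAND[1])
    return spectrum[:, in_band].sum(axis=1)

def morphological_clean(detections: np.ndarray) -> np.ndarray:
    """detections: boolean (locations, windows) image of threshold crossings.
    Opening removes isolated 'outlier' pixels; closing infills small gaps."""
    opened = ndimage.binary_opening(detections, structure=np.ones((3, 3)))
    return ndimage.binary_closing(opened, structure=np.ones((3, 3)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(PULSE_RATE_HZ) / PULSE_RATE_HZ
    windows = []
    for w in range(10):                        # ten one-second windows
        raw = rng.normal(size=(500, PULSE_RATE_HZ))
        if 3 <= w <= 7:                        # a 120 Hz vibration appears at
            raw[200:210] += 5 * np.sin(2 * np.pi * 120 * t)  # locations 200-209
        windows.append(band_energy(raw))
    energy = np.stack(windows, axis=1)         # (locations, windows)
    detections = energy > 10 * np.median(energy)   # crude alarm threshold
    cleaned = morphological_clean(detections)
    print("detections before/after cleaning:", detections.sum(), cleaned.sum())
```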
There then followed a brief overview of some of the applications being developed, such as flat wheel detection, rail anomalies, train tracking, rock falls, intruder detection and catenary flashovers. The trial systems sometimes produced false alarms, reporting rail fractures that were later revealed to be missing chair keys or voids in the ballast, faults that had previously gone unnoticed. The ongoing projects should enable the system to be fine-tuned and establish what information can be reliably extracted and how best it can be presented and used by the railways.
This topic certainly raised a lot of interest, judging by the very large number of questions, which Oliver and his colleague addressed very well. I think there was a degree of scepticism ("Interesting idea, but why do we need it?"), but I am sure many will be pondering its potential.
It occurs to me that, crudely speaking, the Victorian age was the true ‘digital age’, certainly a ‘binary age’: the Morse key was either down or up, the safety valve open or closed. Techniques such as DAS are becoming more common, ‘rich data’ from multiple sources rather than just one, patterns extracted from the ‘noise’. Do we move from systems of (perhaps false) ‘certainty’ to those of ‘probability’ (hopefully very high)? (I am reminded of a very old joke where a level crossing keeper is asked why the gates are half-open. He says, “Well I’m half expecting a train”).