
The Future Is Looking Cloudy – Chippenham 6 February 2019: Summary & Comments.

Dr. Gary McGuire from Siemens Rail Automation described some of the challenges facing the railway industry, in particular that of dealing with the growth in passenger traffic, which had been fairly constant in the latter half of the 20th century but has been rising steadily since 1995. In 1947 this amounted to 37 × 10⁹ passenger-km (23 × 10⁹ passenger-miles) and was now in excess of 56 × 10⁹ passenger-km (35 × 10⁹ passenger-miles). At the same time passenger expectations had increased, partly as a result of the introduction of new travel services such as Uber.

Within the industry 'digitisation' has been proposed as one way of meeting these challenges. Of course the railways have been using digital techniques for communications and management functions for decades; 'digitisation' is intended to go far beyond that.

A parallel was drawn with the way we watch films at home. First they were available as TV broadcasts, at a time of the broadcaster's choosing; then on VHS tape, then on DVD, and now from streaming services such as Netflix. Having a tape or DVD to hand allowed the viewer to choose the film and when to view it; the streaming service greatly extends that choice, but the source is now out 'in the cloud'. Could the same be done for the services supporting the transport industry?

The US National Institute of Standards and Technology (NIST) definition of cloud computing was briefly described: "cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction."

The signalling of a railway was the first service considered. Conventionally, control apparatus grouped together to form an interlocking is connected via trackside cabinets to signals, point machines and vehicle detection equipment. The interlocking might consist of relays or of software running on proprietary hardware. By emulating the operation of the proprietary equipment, the function of the interlocking can be migrated onto generic computing equipment that in itself just provides a computing service. That service does not need to be local (i.e. in an interlocking control room), or even provided by the railway administration, but could be delivered by a contractor or a cloud service provider (Amazon! Netflix!). Ultimately the fixed rail signalling infrastructure could be reduced to little more than the point machines. Instead of funding the capital expenditure of a proprietary signalling installation, the railway operator would pay for a cloud computing service.

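The talk naturally showed no code, but the core idea – that interlocking rules are just logic which any computer can evaluate – is easy to sketch. The following minimal Python fragment is purely illustrative; the names and structure are my own assumptions, not Siemens' design.

```python
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    points_required: dict    # point machine id -> required position
    track_sections: list     # track sections that must be clear

class Interlocking:
    """Toy interlocking: all names and rules here are illustrative only."""
    def __init__(self):
        self.point_positions = {}       # point id -> "normal" or "reverse"
        self.occupied_sections = set()  # sections reported occupied by train detection
        self.locked_routes = set()

    def can_set_route(self, route):
        points_ok = all(self.point_positions.get(p) == pos
                        for p, pos in route.points_required.items())
        track_clear = not any(s in self.occupied_sections
                              for s in route.track_sections)
        return points_ok and track_clear and route.name not in self.locked_routes

    def set_route(self, route):
        """Lock the route only if every interlocking condition holds."""
        if self.can_set_route(route):
            self.locked_routes.add(route.name)
        return route.name in self.locked_routes

ixl = Interlocking()
ixl.point_positions = {"P1": "normal"}
print(ixl.set_route(Route("A1", {"P1": "normal"}, ["TC1"])))  # True: conditions hold
```

Nothing in that logic cares where it runs – a relay room, a local server or a remote data centre – which is precisely the point being made.
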
A small system based on these principles is now being trialled at Achau, Austria, controlling 12 point machines, 16 main signals and a level crossing. A key part of digitisation is the ability to obtain and distribute data, so in this trial the data used by the existing installation is also fed to the trial installation, which runs in parallel in a non-active state while its correct operation is verified.

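As a rough illustration of that parallel, non-active arrangement – my own sketch with hypothetical function names, not the trial's actual architecture – the same field inputs drive both systems, only the legacy output controls the railway, and any disagreement is logged:

```python
def shadow_compare(inputs, legacy_logic, trial_logic, log):
    """Run both systems on identical inputs; the trial output is observe-only."""
    mismatches = 0
    for event in inputs:
        expected = legacy_logic(event)    # this output actually drives the railway
        candidate = trial_logic(event)    # this output is merely checked
        if candidate != expected:
            mismatches += 1
            log.append((event, expected, candidate))
    return mismatches

# Trivial stand-ins for the two systems, just to show the shape of the check:
events = ["TC1 occupied", "TC1 clear", "route A1 requested"]
log = []
same = lambda e: e.upper()
print(shadow_compare(events, same, same, log))   # 0 mismatches expected
```
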
Data obtained from a real railway can be used in simulations, and the University of Birmingham has established a UK Rail Data centre for this purpose, initially handling just static data. Siemens also has a partnership with the University of Sheffield to develop the Siemens-Sheffield Advanced Multimodal Simulator. Such simulators can model passenger movements within existing or planned stations. A 'digital twin' can be created and fed with live data, on which various scenarios can be imposed. By using artificial intelligence to process the system data, problems can be anticipated and methods developed to mitigate them. Examples were given of traffic peaks related to events such as football matches, and of failures correlated with particular maintenance teams. Ultimately a data-rich environment would be created that could be used to optimise the operation of the railway and enhance the passenger experience.

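To make the 'anticipation' idea concrete, here is a toy sketch of my own – none of these names, numbers or thresholds come from the talk. Live readings are compared against the twin's predictions, and sustained divergence raises a flag before a hard failure:

```python
def monitor(live_readings, twin_prediction, tolerance=0.1, window=3):
    """Flag when `window` consecutive readings diverge from the twin's
    prediction by more than `tolerance` (fractional). All parameters
    are hypothetical choices for illustration."""
    consecutive = 0
    for t, (live, predicted) in enumerate(zip(live_readings, twin_prediction)):
        if abs(live - predicted) > tolerance * abs(predicted):
            consecutive += 1
            if consecutive >= window:
                return t    # index at which the anomaly is declared
        else:
            consecutive = 0
    return None

# Toy example: platform crowding creeping above the model's expectation.
live = [100, 102, 130, 135, 140]
model = [100, 101, 102, 103, 104]
print(monitor(live, model))   # -> 4: flagged after three divergent samples
```
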
In the questions that followed it was clear that many people were concerned about data security, resilience and confidence in the safety of such cloud-based systems. It was admitted that in the past the security of data links had not been a concern (the railway owned the wire) but that security was now part of the design process: data transmission is encrypted and confirmed by encrypted 'hand-shake' responses. Resilience comes partly from the way data centres are constructed and partly from the use of multiple communication links. Safety is to some extent assured by having the existing control algorithms emulated by multiple cloud processors; confidence, however, can only be built up gradually, by first using the techniques locally and then migrating them further into 'the cloud'.

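The talk did not name a protocol, but TLS is one widely used way of getting exactly that encrypted-handshake behaviour: the peers authenticate each other during the handshake and everything afterwards is encrypted. A minimal Python sketch, with a placeholder host name:

```python
import socket
import ssl

context = ssl.create_default_context()     # verifies the server's certificate
# context.load_cert_chain("client.pem")    # add mutual (client) auth if required

with socket.create_connection(("example.com", 443)) as raw:
    with context.wrap_socket(raw, server_hostname="example.com") as tls:
        # The TLS handshake has now completed: the peer is authenticated
        # and every byte from here on is encrypted.
        print(tls.version(), tls.cipher())
```
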
This was certainly a fascinating talk, a mix of what is being done now and what might be done in the future. I am not sure that I have done that part justice here!

As for comments, I feel a whole article could be written, as some of the issues here are not just technical but social and political in nature. Let me just say that the casual remark that 'passengers could be prompted to change their walking pace' nudges the line between helpful and authoritarian.

The parallel with watching films at home struck me too. Here the 'data' moved from the centre, to the home, and back to the centre. I see the same pattern with computers: they start with a 'personal' machine (OK, it fills a room); it gets bigger and more expensive and is time-shared with remote clients; it gets smaller and cheaper and becomes the 'PC'; it gets connected to the internet and the processing goes out to the cloud. Home, centre, home, centre – is there a data 'tide' there?

I must admit data security worries me. Will we have a new breed of software engineers that 'think security'? It really goes against the grain. The hardware engineer builds a prison with walls and reluctantly, carefully, installs doors. The software engineer thinks doors are what it is all about! “You want locks on them? Really?” World Wide Web, Internet of Things, data sharing – everything about that screams “No doors!” The network has to be secure, but against what? A hardware example: a bank has a safe; a bank robber has a copy of that safe, and night and day the robber practises on it. The bank 'knows' its safe is secure – no-one has cracked it in ten years – then one day our robber does it in the space of five seconds. How do we know that we aren't the only ones with a 'digital twin'?

My other concern is optimisation. The 'Big Data' and 'Smart City' ideas are all about using up the slack, trying to hit peak efficiency. Consider the Smart Motorway and all that unused hard shoulder. So we 'sweat the assets' of a three-lane motorway, gain a little by using the hard shoulder as a fourth lane, and put off creating real capacity. Then the traffic builds up, vehicles overheat, and the pseudo four-lane motorway becomes a three-lane car park with a clogged hard shoulder. All 'efficient' systems behave like this: they don't degrade when stretched, they collapse.

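A toy queueing model makes that point quantitatively – my illustration, not the speaker's. In the textbook M/M/1 queue the mean time in the system is 1/(μ − λ), which explodes as the arrival rate λ approaches the service capacity μ:

```python
def mean_time_in_system(arrival_rate, service_rate):
    """Mean time in an M/M/1 queue: 1 / (mu - lambda)."""
    if arrival_rate >= service_rate:
        return float("inf")            # the queue grows without bound
    return 1.0 / (service_rate - arrival_rate)

service_rate = 100.0                   # e.g. vehicles per minute of capacity
for utilisation in (0.5, 0.8, 0.9, 0.95, 0.99):
    w = mean_time_in_system(utilisation * service_rate, service_rate)
    print(f"{utilisation:.0%} utilised -> mean delay {w * 60:.1f} s")
```

At 50% utilisation the delay is trivial; at 99% it is fifty times worse, and at 100% it never recovers. That nonlinearity is why squeezing out the last few percent of 'slack' turns graceful degradation into collapse.
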
An interesting talk, some good ideas in there, and very thought-provoking!

---



External links

NIST Cloud Computing Definition
Birmingham Centre for Railway Research and Education
Siemens-Sheffield Advanced Multimodal Simulator
Siemens Highlights Connected Mobility