A new flyer on safely managing the emergent properties of complex systems

Developing a safe system requires a sufficient understanding of its properties. For a complex system, this must include its emergent properties; without them the understanding is incomplete and confidence in the system's safety cannot be claimed. Our new flyer has been created to help managers and engineers understand complexity and emergent properties, so that they can guide systems more clearly and safely through their life cycles. This gives greater potential to develop safe products that are fit for purpose, produced efficiently, and supported effectively. Download the flyer for free: Safely managing the emergent properties of complex systems

The diagram below demonstrates how to navigate complex systems safely:

Our flyer also presents the objectives for engineering managers, which include sustainable thinking and exploiting technology for deeper management insights, as well as the objectives for engineers, which include a better understanding of emergent properties and of when to take action.


All feedback on this paper is welcome. Log in to your IET EngX account and leave your comments below.

  • Hmm...ok, since you ask so nicely (and hence apologies if some of these seem a bit negative, I will try to be constructive!):

    1. I may be wrong, but: how many engineers / engineering managers know what is meant by "emergent properties"? I would suggest clarifying this. I'll be honest, I had to look it up to be sure (and it didn't mean quite what I thought it did!), and safety of complex systems is my field!
    2. Managers' Objective 1 I do like; personally I might add "...with responsibility for ensuring that future stewards have all the information they need to continue managing the system safely". Ok, probably too many words for this, but it's a really important point to capture.
    3. Managers' Objective 2, absolutely, but there are (at least) two separate issues in here. I would most certainly pull out "prepare for low probability catastrophic risks" into its own Objective (or other clear statement), as it's SO important and, in the context of this flyer, a key paradigm-shift requirement for engineering managers not used to safety issues. Particularly those with a lot of experience of high-reliability systems, who sometimes struggle to realise that systems with superb failure rates may still be many orders of magnitude away from acceptable safe failure rates.
    4. Still on Objective 2, I suspect the word "sustainable" in the heading could cause confusion. I agree it is literally correct, but it is so strongly associated with environmental sustainability now that an alternative might be better - personally I usually work the words "safety culture" into similar phrases somehow, but that may not be the best phrasing either.
    5. Similarly with Managers' Objective 3, I'm not sure "reality" really conveys the intended concept here? Perhaps something like "...a responsibility to nurture openness and accountability"?
    6. Engineers' Objective 2 - but the problem is that engineers DO "judge when understanding can turn into action", but get it wrong (too early or too late). So to be useful this objective really needs to be more about using the full range of resources, including users and experts in all implications of the system, to ensure this judgement is made correctly. Probably most of us who work in independent assurance spend most of our time checking this judgement; engineers really need more guidance and support in how to achieve it.
    7. Engineers' Objective 3 - Yes please!!!! But crucially I would add "...should UNDERSTAND and fully support..." 
    8. I have no idea what Figure 1 is trying to convey. Sorry. I can see there was a good idea there, but I suspect it got tweaked a lot on the way to the final graphic design. (Ok, I can sort of work it out but that's not the point - a picture should instantly convey something, otherwise it might be better to use words.)
    9. There is a tiny note on Figure 1 "Use teams. Promote diverse thinking." This is vital and should be given huge prominence. And further, teams must be wide ranging, representing anyone who might interface with or have expertise on the project. The major risk which this flyer doesn't seem to clearly pick up is engineers and engineering managers who "don't know what they don't know".
    10. Figure 2 is correct, but is it useful? And surely it's true for all systems?
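    The point in item 3 about high-reliability systems can be made concrete with a rough, back-of-the-envelope sketch. The numbers below are illustrative assumptions, not figures from the flyer: a component failure rate that looks superb from a reliability standpoint can still be several orders of magnitude away from a typical tolerable rate for catastrophic events.

    ```python
    # Illustrative, assumed figures only - not from the flyer.
    component_failure_rate = 1e-5  # failures per operating hour: a very reliable part
    catastrophic_target = 1e-9     # tolerable catastrophic-event rate per hour,
                                   # an order of magnitude seen in some safety standards

    # Ratio between "superb reliability" and "acceptably safe"
    shortfall = component_failure_rate / catastrophic_target
    print(f"Shortfall: roughly {shortfall:,.0f}x")  # about four orders of magnitude
    ```

    On these assumed numbers, a part failing only once per 100,000 hours is still ten-thousand-fold short of the safety target - which is exactly the paradigm shift the comment describes.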

    I spend a fair bit of my time these days supporting the type of people (individuals and organisations) who I think this flyer is aimed at: those who previously worked on "simple" (let's say broadly understood by a single engineer) systems and are now moving into complex systems, and those who have previously worked on complex systems with no safety impact and are now applying those systems in ways that have a safety impact. And sometimes both. Would I say "put this up on your office wall, your team might find it helpful"?  Not right now because of those points above.

    BUT it's a REALLY good idea, so despite all the comments above thanks to those who've worked hard on this, it covers some really important areas which really are not well understood and appreciated, so perhaps a mk 2??? As it says "Prepare for the next project, learn for the future"!



  • I second that it would be good to perhaps add something by way of definition.


    Emergence concerns new properties produced as the system grows, which is to say ones which are not shared with its components when considered in isolation or in prior, simpler states. Also, it must be assumed that some of the system properties are supervenient (that is to say, interlinked in such a way that a change in one thing is required for there to be a change in the other) rather than metaphysically primitive (that is to say, indivisible and totally independent entities or concepts).

    That should do it... Pillow and a mug of cocoa, anyone?

    I think in the world of engineering we worry about things known and discussed in the specialist vernacular as surprises, or unforeseen occurrences.

    The art of allowing for these is the art of management of risk.

    A chap called Rumsfeld had something pithy to say about unknown unknowns as well.

    A rewrite in plainspeak would help - also, it seems that true (strong) emergence is incompatible with the laws of physics, so I cannot support it ;-)


  • as surprises, or unforeseen occurrences.

    I'm currently working on a Canadian project which uses the North American military term "mishaps". To our UK ears it sounds highly entertaining to refer to the potential event where e.g. two trains collide with multiple fatalities as a "mishap" - "oh dear, what a pity, that was a bit of a mishap wasn't it, tut tut"! But then plenty of things we say sound strange to Canadian ears...

  • A chap called Rumsfeld had something pithy to say about unknown unknowns as well

    Quite. I quote him a lot when assessing HAZIDs and what comes afterwards!

  • No worse than 'near miss', used locally to mean things that to the non-specialist sound not even close to a "miss" at all...

    To the man in the street a near miss is when a projectile ruffles your hair but does no injury, and is a serious thing - as in a few inches removed from a fatality or life-changing accident.

    To some H and S types a near miss is an extension lead in the stock cupboard with an out-of-date test label, or a cable that no one tripped over but perhaps they might have if anyone had walked there.

    The problem is that 'erring on the safe side' can overstate very minor risks and cheapen the language of the more serious things we should be looking at.

    This is in much the same way as the proliferation of warnings. "Do not touch the blade" is a very sensible warning on a chainsaw, but putting an identical standard warning on a penknife immediately lowers the perceived risk of the chainsaw to the same level as the knife, which it really isn't.

    Language matters, and must match the intended audience - and if, as is the modern trend, you have managers of the type who have an MBA but precious little knowledge of the nuts and volts of what they are managing, then it becomes really very important indeed.

    Actually we have a great many special terms we use to describe things going wrong, with great finesse of resolution as to the severity of the problem, but these are not in the official lexicon. Sadly I think most of them would not survive the auto-censor, though I suspect many practical places of work are the same.


  • I find it hugely difficult to say much useful. I know personally two people on the Engineering Safety PP, whom I assume contributed (one is on a mailing list I run, on which I distributed the link to the flyer, but there has been little traffic so far). I have my difficulties with general, vague feel-good statements, but agree that it is good to inform people of things of which they might be unaware. This is about management. I am not sure what needs the Panel think they are addressing. I do know that a main issue with complex systems, and with people coming from simple systems to complex heterogeneous systems, is the HRA. I would support keeping on top of that (and keeping it well maintained), because doing it for complex systems is a world different from doing it for simple systems, but there is no word about it in the flyer.

  • Andy, thank you very much for your comments.  As the IET Engineering Safety Policy Panel Lead, I am going to refer all the points raised, and the rest of the thread, to the group of volunteers that have compiled this 2-pager.  It is very much a 'starter for 10' - the intention has always been to supplement this document with a more substantial paper that delves more into its intricacies, and then to promote discussion on the topic amongst engineering communities.  We have already started work on that more detailed paper.  If you want to get more involved in shaping this work - and this equally applies to others adding their comments - do get in touch with me directly at arylah@theiet.org.

  • Andy, legalism and misrepresenting risk is a real problem.  Possibly the IET should somehow have a committee that advises lawyers????


  • legalism and misrepresenting risk is a real problem

    I am not sure to what exactly you are referring. There are many conceptions of risk used in engineering (I introduced five in a recent talk). The IEC for many years tried to "harmonise" its various definitions of risk and has now acknowledged that it is a bigger issue than they thought.

    The more knowledgeable law firms know how to get expert advice on engineering safety and reliability, and are usually very selective in how they engage experts. I don't really see a role for the IET there.

  • I think the problem here would be "a little knowledge is a dangerous thing". What the IET could do is to brief law firms with the outlines but then with the strict caveat that "this is only scratching the surface, actually to know whether enough risk assessment has been done you need to employ an expert to investigate that particular case" (i.e. as Peter says). But in practice I'd fully agree with Peter that I've never come across this as an issue; whenever I've been called in to help with a legal (or potentially legal) case, on either side of the fence, everyone involved has been pretty clear that they need expert help. I like working with lawyers because they leave us to get on with it!

    The people who "don't know what they don't know" tend to come much earlier in the process, it's the engineering and project management teams, they're the ones who do need the information and support (including, conversely, about what their legal obligations are). Hence it's great that the IET are reaching out like this. Sort that out and companies don't need to worry about the lawyers anyway. 

    By the way, a declaration of interest to be squeaky clean: I do deliver training and mentoring on this stuff, but that's very much a side part of my "real" job, which is checking that engineering projects have done it properly. And I'm near enough to retirement age that I would be delighted if all engineering teams suddenly "got it" and I was left with no work to do!!! Just because the world would be a safer place. (My business managers won't like me saying this but: it is a wonderful feeling when you're working with a client and suddenly all their safety processes start working and they don't need you any more - makes it all worthwhile.) But equally I've no worries about, sadly, there being plenty of work in the real world to keep me going for many years to come. The principles of safety engineering are actually very easy, but truly embedding them in an organisation is blooming difficult.