
Should the IET ask its members to pledge not to help build killer robots?

I read an interesting article in the online E&T (above) that reports on a pledge not to assist in the development of so-called killer robots. Should the IET take a stance, be the first PEI to endorse the pledge, and furthermore expect/encourage its members to sign up too?
  • Interesting. I would actually have to go further than Alasdair and say no and no, for the reason described later in the article:

    At the Farnborough airshow this week, defence secretary Gavin Williamson announced a multibillion pound project to develop a new RAF fighter – the Tempest – which will be capable of flying unmanned and autonomously hitting targets, as well as using concentrated energy beams to inflict damage; the government has stated that human operators will always have oversight over all weapons systems.

    The Tempest is an excellent example: I am sure many figures in the IET are involved with it, and they would argue that it is not for them to make the moral judgement as to how it is used or operated. So where do you draw the line?

    HOWEVER, I personally feel rather differently about this (a brief look at my LinkedIn page will reveal that I have never worked anywhere in the field of armaments, and I don't intend to). I do feel that "the IET" (whatever that is) has a duty to explore and explain the implications of the technology the engineering community is developing and is capable of developing. So I do think it has a duty to expose the facts, but it's difficult to see where it can draw the line on "withdrawing labour". There are plenty of conventional weapons, used under human control, designed and built in (for example) the UK, that are used by repressive regimes against civilians. I can even imagine an argument that, if these autonomous systems are going to be developed, it's better that they are developed by professional engineers rather than "unprofessional" engineers.


    We really should be discussing ethics in engineering far more in the IET. It is the huge elephant in the room for engineering.


    Here's a thought: I think it would be credible for the IET to propose that engineers should have the right to refuse to work on technology which they consider "morally unacceptable" without risk of reprisals (i.e. dismissal). But again it's phenomenally difficult to put into a solid legal code. We are in a slightly odd position in UK employment law: cases where work is deemed unacceptable to an individual for religious reasons are reasonably well covered (although complex!) in law, but as far as I know there is no employment law to cover moral/ethical decisions of an individual for personal (non-faith) reasons. So, as far as I know, if someone refused to work on a project because they thought it was "morally wrong" (although legal), they could simply be dismissed. You can imagine a scenario where a weapons engineer is working on Tempest believing it is a human-controlled system, and it then becomes clear that it is actually going to become an autonomous system which they are unhappy with; there's not much they can actually do other than hope they can find another job. So maybe there's something the IET can do there.


    I must admit I had a look here in a break in case Lisa had posted one of her fun Friday postings; this was a bit the opposite! But thanks for posting this, Mark and Alasdair. It is a really important subject which needs airing, and actually this is just the tip of the iceberg. With very little thought I'm sure we can think of many other things engineers have done, often with the best of intentions, that cause greater loss of life or health than autonomous weapons will. How much can we really blame the purchaser and the user every time? Should we be looking at what we do much more carefully? How much can we trust our own moral judgements anyway; do we really understand the big picture? Personally I've been directly wrestling with these problems since 1982, when I went for a job interview which turned out to be for work on atomic weapons (which thankfully I wasn't offered, given it was the one time in my life when I was pretty desperate for a job), and I have never found an answer yet.


    Fortunately trains don't kill many people :) Even better, my job is to make sure they kill even fewer people :) :) I know what you're thinking, "smug ***" :) :) :)


    Happy weekend!


    Andy
