
Flexible futures

Interesting post from UR

Flexible Futures



  • Tricky for those working in some areas of defence where the end product is designed to cause destruction,

    It can indeed be a slippery slope. I recall, probably back in the 1990s, that one professional society had quite a debate over whether it should accept members who worked in the tobacco industry. The establishment of the society didn't seem to see a problem, but some of the younger members saw an obvious conflict between the evidence of harm caused by smoking (at the time no doubt encouraged by tobacco industry advertising) and the society's rules about 'doing no harm' (I don't recall the exact phrasing). These days similar arguments could be raised about anything from salt and sugar to fossil fuels to cars. Almost anything that benefits some people can have some detrimental effect on others. Where can you draw the line?

       - Andy.

  • Quite. The problem is (we're getting very philosophical now) whether you can define "right" and "wrong", as compared to "actions" and "consequences". For example, my personal reading of the current scientific expertise means I want to work in areas which result in carbon emission reduction, and other personal views I hold mean that I do not want to work in areas associated with military activities. So that's steered the areas I've chosen to work in. But it's very clear from discussions with any mixed group of engineers, including of course these forums, that other engineers hold diametrically opposed views as to whether these are the "right" positions to hold. Who's to decide which is right? I think it's got to be down to individual decisions, although I do think the PEIs can have a role in providing information to allow engineers to make informed decisions about their work and its consequences.

    (Also remembering that engineers, like all human beings, are very very good at finding ways of justifying why what they wanted to do anyway is morally the right thing to do!!) 

    Which I think brings us back to the point (good grief!) - that it would be really useful for the IET to support research into the impact of AI, robotics and other automation on society so that engineers can consider whether it's a field they wish to support (or, perhaps more practically, which they wish to try to steer in particular directions).

    Cheers,

    Andy

  • it would be really useful for the IET to support research into the impact of AI, robotics and other automation on society so that engineers can consider whether it's a field they wish to support

    Interesting point, Andy. I remember doing an engineering degree by night. One of the classes was "The responsibility of engineers in society". While it had a long-winded name, it was one of the classes that got the most engagement from the students. The class reviewed case studies (plane crashes in the 1970s, for instance) and discussed the responsibility or culpability of the engineers in each case.
    It would indeed be good for industrial and academic institutes to fund studies on the social benefit of AI, automation or robotics. The key challenge is ensuring that these studies are impartial. It can be difficult to prove impartiality, even in government, which is heavily lobbied by industry. I don't think we can leave it to the merchants to self-regulate either. I believe regulation needs to be informed by research carried out in collaboration between academia, community, government and industry. Unfortunately it is a slow process of negotiation, which can often lead to hiatus, as was evident at COP27. But it still needs to be done. In the meantime, regarding social benefit, we need to make personal choices about what we produce and what we buy.

    #robotics

