Economist–As robots grow more autonomous, society needs to develop rules to manage them

As they become smarter and more widespread, autonomous machines are bound to end up making life-or-death decisions in unpredictable situations, thus assuming, or at least appearing to assume, moral agency. Weapons systems currently have human operators “in the loop”, but as they grow more sophisticated, it will be possible to shift to “on the loop” operation, with machines carrying out orders autonomously.

As that happens, they will be presented with ethical dilemmas. Should a drone fire on a house where a target is known to be hiding, which may also be sheltering civilians? Should a driverless car swerve to avoid pedestrians if that means hitting other vehicles or endangering its occupants? Should a robot involved in disaster recovery tell people the truth about what is happening if that risks causing a panic? Such questions have led to the emergence of the field of “machine ethics”, which aims to give machines the ability to make such choices appropriately; in other words, to tell right from wrong…

Read it all.


Posted in * Culture-Watch, * Economics, Politics, Ethics / Moral Theology, Law & Legal Issues, Politics in General, Science & Technology, Theology

4 comments on “Economist–As robots grow more autonomous, society needs to develop rules to manage them”

  1. BlueOntario says:

    The programmers should read Azimov.

  2. sophy0075 says:

    Better yet, they should read Asimov.

  3. BlueOntario says:

    oops. 😉

  4. BillB says:

    Seconded. The Three Laws of Robotics are brilliant and forward looking.