Context
The French defence ethics committee has identified guiding principles and made recommendations on methodology, research, use and design, as well as training, concerning the development of techniques contributing to autonomy in weapon systems. The proposal is being scrutinised by the Ministry.
Background
Automated devices have long been used in weapon systems employed in military operations on land, at sea and in the air. However, the concept of "autonomous weapon systems", currently invoked in civil society to condemn the very principle of such weapons or, in certain international fora, to restrict their use, evokes in all cases the idea of a clear disruption in both technology and ethics.
Given the prospects opened up by developments in robotics, the use of lethal weapon systems described as "autonomous" raises ethical questions linked to the very foundations of the military function.
Analysis
What is a Lethal Autonomous Weapon System (LAWS)?
What is a Partially Autonomous Lethal Weapon System (PALWS)?
Reasons for renouncing the use of LAWS
What are the associated ethical issues?
What should be the guiding principles for the use of these weapons?
Conclusion
India should foster research in the fields of defence artificial intelligence and weapon-system automation for several reasons: first, to avoid losing ground in the scientific and technological fields; second, to counter the development of LAWS by adversaries; and finally, to be able to defend against this type of weapon in the likely event of its use by an enemy State or a terrorist group against our troops or population. Such research must be governed by a strict ethical and legal framework and be conducted in compliance with legal review mechanisms.