Following a debate on autonomous weapons systems in the House of Lords, a Ministry of Defence minister has refused to rule out the future use of lethal autonomous weapons.
As the UK faces calls to regulate new technologies in this area, it is becoming vital to know where the government stands on them. The minister said the UK would always engage internationally with experts on the matter but did not provide details on the role autonomous weapons would play.
It was noted, however, that using artificial intelligence (AI) in modern warfare could raise moral challenges, and that it was essential that humans always be involved in overseeing the technology. The minister confirmed that human responsibility for the use of a system cannot be removed, whatever the level of autonomy in that system or the use of enabling technologies.
It was also reported that the UK government does not use lethal weapons without human oversight. The minister added that robust application of that framework is seen as the best way of ensuring the lawful and ethical use of force and autonomous weapons in all circumstances.
It was concluded that the UK needed to show global leadership on lethal autonomous weapons and reaffirm its commitment to ethical AI. The Ministry of Defence agreed to publish a defence AI strategy by autumn 2021, which will set out a vision of the UK as effective and trustworthy in this field.