This paper presents a novel approach to the ethics, methodologies, and dichotomies of autonomous military robot systems, technologies that are dynamically changing how twenty-first-century warfare is conducted and how it is judged from ethical, moral, and legal perspectives, with the aim of developing a new conceptual framework through a scientific survey.
The U.S. military has started to construct and deploy robotic weapons systems. Although human controllers may still be monitoring the functioning of the technology, the next logical step is to transfer incrementally more of the decision-making power to the robots themselves. Thus, this article seeks to examine the ethical implications of the creation and use of "autonomous weapons systems."
Journal of Indo-Pacific Affairs, 2022
Both corporate leaders and military commanders turn to ethical principle sets when they search for guidance concerning moral decision making and best practice. In this article, after reviewing several such sets intended to guide the responsible development and deployment of artificial intelligence and autonomous systems in the civilian domain, we propose a series of 11 positive ethical principles to be embedded in the design of autonomous and intelligent technologies used by armed forces. In addition to guiding their research and development, these principles can enhance the capability of the armed forces to make ethical decisions in conflict and operations. We examine the general limitations of principle sets, refuting the charge that such ethical theorizing is a misguided enterprise and critically addressing the proposed ban on military applications of artificial intelligence and autonomous weapon systems.
Philosophy & Technology, 2011
Ethical reflections on military robotics can be enriched by a better understanding of the nature and role of these technologies and by putting robotics into context in various ways. Discussing a range of ethical questions, this paper challenges the prevalent assumptions that military ...
2014
Military robots are gradually entering the theater of war in many guises. As the capabilities of these robots move toward increased autonomous operation, a number of difficult ethical and legal issues must be considered, such as appropriate rules of engagement and even notions of robot ethics. In the distant future, as military "artificial beings" that draw on expected advances in cyborg and android technologies are developed, further issues of conscience, consciousness, personhood, and moral responsibility also arise.
International Scientific Conference “Defense Resources Management in the 21st Century, 2017
Robotic technology offers great benefits to humanity. However, one of the first uses of this technology is for war. We already have seen the actual use of armed unmanned aerial vehicles to destroy enemy targets in war zones. Some experts believe that robots will change the nature of wars. Even with the current artificial intelligence technology, it is possible to task a robot to kill a human without human intervention. There are many ethical issues surrounding robotic warfare. Current warfare ethics is shaped by the principles of Law of Armed Conflict (LOAC). These principles are formed based on the experiences of earlier wars. These wars were fought only by humans. However, in the robotic warfare era, there are new types of combatants. Therefore, we may need a new set of principles for LOAC. Currently, there are international campaigns for a ban on Lethal Autonomous Weapon Systems (LAWS), in other words, killer robots. Some experts advocate regulating the use of military robots rather than calling for an international ban. In this study, we discuss the current developments regarding the call for a ban and the arguments for a regulation on killer robots. In addition, we introduce a set of principles to guide the development of a Law of Robotic Armed Conflict (LORAC).
Mechanisms and Machine Science, 2025
This article investigates the growing use of robots and automation in military operations, emphasizing the ethical challenges posed to international humanitarian law. The Iraq War marked a key shift, transforming robots from tools viewed skeptically to vital military assets. By 2006, robots had executed over 30,000 missions, and demand for unmanned aerial vehicles (UAVs) surged. These technologies span military branches, including the navy's use of unmanned submarines. The focus is on Lethal Autonomous Weapon Systems (LAWS), which can independently make combat decisions. Nations like the U.S., China, and Russia are advancing LAWS, raising ethical concerns about autonomous warfare. The study aims to clarify issues surrounding LAWS, examine international arms control discourse, and propose regulatory strategies. Key areas of discussion include defining LAWS, reviewing debates under the Convention on Certain Conventional Weapons (CCW), addressing regulatory challenges, and suggesting regulation methods for dual-use technology weapons. The article stresses the need for preemptive arms control to limit LAWS development and anticipates future ethical and military landscapes shaped by these technologies. It calls for aligning future LAWS regulations with existing frameworks to manage their impact effectively.
Current Robotics Reports, 2020
Abstract Purpose of Review: To provide readers with a compact account of ongoing academic and diplomatic debates about autonomy in weapons systems, that is, about the moral and legal acceptability of letting a robotic system unleash destructive force in warfare and take attendant life-or-death decisions without any human intervention. Recent Findings: A précis of current debates is provided, which focuses on the requirement that all weapons systems, including autonomous ones, should remain under meaningful human control (MHC) in order to be ethically acceptable and lawfully employed. Main approaches to MHC are described and briefly analyzed, distinguishing between uniform, differentiated, and prudential policies for human control of weapons systems. Summary: The review highlights the crucial role played by the robotics research community in starting ethical and legal debates about autonomy in weapons systems. A concise overview is provided of the main concerns emerging in those early debates: respect for the laws of war, responsibility-ascription issues, violation of the human dignity of potential victims of autonomous weapons systems, and increased risks for global stability. These various concerns have jointly been taken to support the idea that all weapons systems, including autonomous ones, should remain under meaningful human control. Finally, it is emphasized that the MHC idea looms large over shared-control policies to adopt in other ethically and legally sensitive application domains for robotics and artificial intelligence. Keywords: Autonomous weapons systems, Roboethics, International humanitarian law, Human-robot shared control, Meaningful human control
2009
War robots clearly hold tremendous advantages: from saving the lives of our own soldiers, to safely defusing roadside bombs, to operating in inaccessible and dangerous environments such as mountainside caves and underwater. Without emotions and other liabilities on the battlefield, they could conduct warfare more ethically and effectively than human soldiers, who are susceptible to overreactions, anger, vengeance, fatigue, low morale, and so on. But the use of robots, especially autonomous ones, raises a host of ethical and risk issues. This paper offers a survey of such emerging issues in this new but rapidly advancing area of technology.
Ethics and Information Technology, 2010
Telerobotically operated and semiautonomous machines have become a major component in the arsenals of industrial nations around the world. By the year 2015, the United States military plans to have one-third of its combat aircraft and ground vehicles robotically controlled. Although there are many reasons for the use of robots on the battlefield, perhaps the most interesting assertion is that these machines, if properly designed and used, will result in a more just and ethical implementation of warfare. This paper focuses on these claims by looking at what has been discovered about the capability of humans to behave ethically on the battlefield, and then comparing those findings with the claims made by robotics researchers that their machines are able to behave more ethically on the battlefield than human soldiers. Throughout the paper we explore the philosophical critique of this claim and also look at how the robots of today are affecting our ability to fight wars in a just manner.
Journal of Military Ethics, 2015
Ethics and Robotics, 2009
Challenges to national defence in contemporary geopolitical situation, 2020
The Changing Scope of Technoethics in Contemporary Society, 2018
Royakkers, L., & Olsthoorn, P. (2014). Military Robots and the Question of Responsibility. International Journal of Technoethics (IJT), 5(1), 1-14.
Fordham International Law Journal, 2021