AMR Three-Layer Architecture

Hybrid Deliberative and Reactive Three-Layer Architecture

The hybrid model is a framework that enables the development and implementation of robust intelligent robots. Hybrid architectures are also robust because they allow the reconfiguration of reactive component systems based on the knowledge of the world that the robot is able to construct through its facility to analyse behavioural elements (Badano, 2008). This approach is composed of the hierarchical and reactive pattern shown in the figure below.

Hybrid Deliberative and Reactive Three-Layer Architecture with Sensors

The online public survey's total responses are made up of various demographics, which are shown on figure 3: national defence organisations (3.9% - 13), armed forces and security services (4.3% - 6), roboticists (2.6% - 6), policy makers (2.1% - 7) and the public.

Figure 3: Distribution of Survey Demographic by Community Category.
Figure 3.1: Distribution of Survey Demographic by Gender Type.

Because the survey was part of an academic dissertation research project, distributed widely within the Sheffield University departments, the majority of the responses (87.1%) came from the public. For clarity and accuracy, it should be emphasised that some of these participants are, or happened to be, students engaged with academic research related to the problem domain. The gender gap was vast, with the greatest number of responses coming from males (68.6% - 223) compared with females (31.4% - 102); this is shown on figure 3.1 below.

Figure 3.2: Distribution of Survey Demographic by Education Type.

The level of education among all the demographics was divided as follows: 96.0% of respondents had completed their college education, and 0.8% were working towards it.
In the category of bachelor's degrees, 50.5% had completed one, as opposed to 48.4% who were working towards it; 34.0% had completed a postgraduate degree, as against 36.4% who were working towards one, as shown on figure 3.2. The total number of respondents was 330. Table 3.3 below shows the variety in level of education contrasted with the age-type differentiator among all the groups.

Figure 3.4 Characteristics Questions: Acceptance of Community Types for the Computational Characteristics.
Figure 3.5 Characteristics Questions by Community Type.
Figure 3.6 Ethical Acceptability by Gender Type.

Note that, between the two gender types, males appear the most likely to agree with the prospect of an autonomous military robot being constrained by the ethical standards established by individuals. This section explores some ethical considerations. It is composed of two questions, the difference between them being the ethical object (whether an autonomous military robot or an autonomous robot) and the computational rule (whether it can be constrained by ethical standards or computer programs). This is shown on figure 3.6 and figure 3.7, which are compared based on the gender demographic (male or female). The choices for these two questions ('3-4' of the survey) were: Strongly disagree, Disagree, Neither agree nor disagree, Agree and Strongly agree. These two questions assessed the views of the participants in relation to the two different types of computational ethical constraining algorithms that might be implemented to allow robots to behave ethically. In addition, these two questions were also presented in contrast with the community-type demographics (national defence organisations, armed forces and security services, roboticists, policy makers, and the public), with a total of 325 respondents altogether, as shown on figures 3.8 to 3.11.
Figure 3.7 Ethical Acceptability by Gender Type.

In the case of figure 3.7 the result is very similar, with males still being the most agreeable, as opposed to the females, on the subject matter.

Figure 3.9 Ethical Acceptability by Various Community Types. This figure shows the percentage values for each community type in figure 3.8.
Figure 3.8 Ethical Acceptability by Various Community Types.

The choices made by these various community types differ greatly, with the most agreeable community type being the armed forces and security services, followed by the policy makers. The table below shows the actual percentage for each community type. The total number of respondents for this question was 327 participants.

Figure 3.10 Ethical Acceptability by Various Community Types.
Figure 3.11 Ethical Acceptability by Various Community Types.

The behaviour of autonomous armed robots will be ethically acceptable in a war zone where a civilian population is present.

Figure 3.12 Ethical Acceptability of a Robot in a War Zone where Civilians are Present.
Figure 3.13 Ethical Acceptability of a Robot in a War Zone where only Armed Forces are Present.

The overall number of participants was 318.

Figure 3.14 Ethical Situations by Level of Autonomy by Gender Community Type.

The following three questions, shown on figure 3.14, explore the alternative of having an armed autonomous robot as the main selector of targets, with a human confirming before lethal force is applied. The total number of responses for the first question (9) was 314 contributors. The responses collected show both gender types agreeing with the envisaged rationale of having a human in the loop, confirming a selected target before lethal action is taken. The responses were gathered using the Likert-type-scale model.
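As an illustration of how Likert-type-scale responses such as these can be tallied into the percentages reported throughout this section, a minimal sketch follows. The response counts used here are hypothetical placeholders, not the survey's actual data.

```python
# Minimal sketch: tallying Likert-type responses into percentages.
# The sample counts below are illustrative, not the survey's real data.
from collections import Counter

LIKERT = ["Strongly disagree", "Disagree",
          "Neither agree nor disagree", "Agree", "Strongly agree"]

def likert_percentages(responses):
    """Return {category: percentage of responses}, one entry per Likert point."""
    counts = Counter(responses)
    total = sum(counts.values())
    return {cat: round(100 * counts.get(cat, 0) / total, 1)
            for cat in LIKERT}

# Hypothetical set of 314 responses to a single survey question.
sample = (["Agree"] * 150 + ["Strongly agree"] * 80 +
          ["Neither agree nor disagree"] * 40 +
          ["Disagree"] * 30 + ["Strongly disagree"] * 14)
print(likert_percentages(sample))
```

Rounding each category independently means the percentages may not sum to exactly 100, which is also why the figures in this section occasionally total slightly above or below 100%.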
Having a human in the loop to confirm, or a robot under the supervision of humans, is the acceptable narrative, which is also noted in question (10), shown on figure 3.15.

An autonomous robot should act only under direct and explicit orders from a human being.

Figure 3.15 Ethical Situations by Level of Autonomy by Gender Community Type.

When participants are asked implicitly about the issue of autonomy (whether a robot should have a greater level of autonomy, to the point of operating entirely independently of humans), a shift in the views collected from both gender types can be seen. The views shifted to the point of placing both gender types on the same Likert-scale point, 'strongly agreeing'. The main contrast the respondents show is that 12.2% of males still agree with the concept presented. There was a total of 329 responses for this question (11).

Figure 3.16 Ethical Situations by Level of Autonomy by Gender Community Type.
Figure 3.17 Conforming to the Rules of Engagement and Laws of War.

The figure below for question (19) shows the same question contrasted by age type, with all the age types in agreement with the narrative behind the question, apart from a small margin of difference. Respondents under 20 years of age show a greater level of acceptance than the other two age types, with 31.1% of them strongly agreeing with the possibility of an autonomous robot conforming to the laws of war and rules of engagement.

The next two questions highlight the idea of equipping robots with lethal hardware, as shown on figure 3.19. Each respondent had five choices, and the responses were split into two community types, males and females. The choices ranged from machine guns to nuclear weapons; in total there were 324 participants for question (24), shown below on figure 3.19, and 324 for question (25), presented on figure 3.20.
Based on the choices presented to the respondents, it is clear that 'machine gun' is the most acceptable lethal hardware, with 95.1% from males and 91.4% from females. The only difference between questions (24) and (25) was that the object in question (25) is autonomous, whereas in question (24) the object is remotely controlled by a soldier.

Figure 3.19 Acceptance of Military Lethal Hardware.

Nuclear weapons were the least acceptable option, although a few females viewed them as acceptable; they were the least acceptable choice among males.

Figure 3.21 Following Directives.
Figure 3.20 Acceptance of Military Lethal Hardware.
Figure 3.22 Following Directives.

Both of these questions had a total of 329 participants, and the results are shown on figure 3.21. The most acceptable notion was 'the one asking the most moral requests', with the least acceptable being 'the individual it likes most'. The choice 'the individual it likes most' implies that these machines somehow possess some sort of sentience, which is not true, as discussed in many of the preceding chapters.

The following two questions explore the concept of classifying what is perceived as behaving ethically during a conflict ('war zone').

Figure 3.24 Behaving Ethically During a Conflict ('War').
Figure 3.25 Moral Standards for Robots.

The next question, shown on figure 3.25 below, creates a dilemma around the hypothesis of ethical standards for robots, since it has been suggested in the literature connected to the problem domain that robots should somehow be treated ethically differently. The results below indicate that they should be treated the same, or, where possible, held to higher ethical standards than humans. A total of 323 respondents made their choice, and their age type known, when answering question (32) below.
Participants were given six choices, with the largest share (39.6%) coming from the under-40 age group, followed by the under-65 group (38.2%) and the under-20 group (32.8%). Most unexpectedly, 12.2% of participants in the under-20 and under-40 age groups would see robots held to a lower ethical standard than human soldiers are held to.

Figure 3.26 Ethical and Legal Responsibility.

These two questions (33 and 34) had a total of 321 participants, both female and male. The major contrast between the two questions lies in the fact that one entity is remote controlled and the other is an autonomous military robot. The answer choices were: military robot, military robot developer, politicians, high-ranking military chief, and software engineer/military robot electro-mechanic. As shown on figure 3.28 below, 42.6% (43) of the 101 females considered the high-ranking military chief the most responsible when such an AI object malfunctions to the extent of taking a human life, whereas 31.8% (70) of the 220 males considered the military robot developer the most responsible for such a failure. However, a large proportion of males also considered the high-ranking military chief the most responsible for such system failures.

Figure 3.27 Ethical and Legal Responsibility.

This next question examines the benefits of implementing and deploying the technology in question, or substituting it as the alternative military soldier for lethal purposes. The answer choices were based on the five-point Likert scale shown below, ranging from 'Strongly Agree' to a 'No opinion/Don't know' choice, which was not considered in the analysis but is shown for clarity. All the respondents who participated in this research answered this question, resulting in a total of 328 participants.
All the respondents had three choices, as shown on figure 3.28 above and figure 3.29 below, which shows all the results in more detail, with the most popular choice being 'saving human lives (soldiers and civilians)'. This is one of the main arguments used to advance the implementation and deployment of such killer machines. Of all the participants, 41.4% agree with the supposition illustrated previously, 'saving human lives (soldiers and civilians)'. A further 28.1% of respondents disagree with the idea of military robots 'eradicating friendly fire accidents', 24.4% of members of all the communities agree that AMR will 'reduce the inhuman treatment of enemy combatants', and finally 23.5% neither agree nor disagree with the claim that military robots may 'reduce the inhuman treatment of civilians in warfare'.

Figure 3.29 Benefits of Using Robots as the Alternative Military Soldier.

Finally, the last question of this research explores the various situations in which it may be ethically acceptable to use or deploy robots. All the respondents were given four choices, as shown on figure 3.29. The choices ranged from 'In hospitals, as medical robots to help perform complex operations such as heart operations in the operating theatre', 'In elderly care homes, as robot caregivers caring for the elderly', 'In the military "battlefield" as robot soldiers', and 'In the military as a spy and surveillance tool aiding the gathering of information'. For this specific question there was a total of 330 participants. The least popular acceptable situation was 'In elderly care homes, as robot caregivers caring for the elderly', which gathered 3.3% of responses. The most acceptable choice, with 48.8% of responses, was 'In hospitals, as medical robots to help perform complex operations such as heart operations in the operating theatre', and more than half of this percentage viewed 'In the military as a spy and surveillance tool aiding the gathering of information' as being acceptable for the use or deployment of robots of this nature.

22. In which, if any, of the following roles should an autonomous military robot be employed:

Appendix B: Supplementary Analysis for Section 6.6.2 Ethical Acceptability

Appendix C.1: Supplementary Analysis for Section 6.6.4 Warfare Dilemmas

Appendix C: Supplementary Analysis for Section 6.6.4 Warfare Dilemmas
It is right and acceptable for an autonomous robot to be given, and be entitled to, the same rights as a human?
Given the technological means available today, could a robot realistically be a real ethical agent?

Appendix C.3: Supplementary Analysis for Section 6.6.4 Warfare Dilemmas
It is adequate to codify or embody ethics into a robot that is directly controlled by a human?

Appendix C.5: Supplementary Analysis for Section 6.6.4 Warfare Dilemmas
It is adequate to codify or embody ethics into an autonomous robot?
It is a good idea for an armed robot to have full autonomy?
In which, if any, of the following roles should a military robot under the full control or supervision of a human being work:

Appendix D.1: Supplementary Analysis for Section 6.6.5 Rules of Engagement and Laws of War
In which, if any, of the following roles should an autonomous military robot be employed:
In which, if any, of the following roles should an autonomous robot be employed as the alternative soldier:
Question 3 One-Way ANOVA: Strongly Agree, Agree, Neither Agree/Nor Disagree, Disagree, Strongly Disagree

Appendix D.2: Supplementary Analysis for Section 6.6.5 Rules of Engagement and Laws of War
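The appendix entries above refer to a one-way ANOVA over the Likert responses to question 3. As a sketch of what such a test computes, the following codes the five Likert points as 1-5 and compares group means; the group samples shown are hypothetical, not the survey's actual responses.

```python
# Minimal sketch of a one-way ANOVA over Likert responses coded 1-5
# (1 = Strongly Disagree ... 5 = Strongly Agree). The F statistic compares
# variance between respondent groups with variance within them.
# The two groups below are hypothetical illustrative data.

def one_way_anova(groups):
    """Return the F statistic for a list of numeric samples (one per group)."""
    k = len(groups)                       # number of groups
    n = sum(len(g) for g in groups)       # total observations
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group means vs the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: observations vs their own group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

males = [5, 4, 4, 5, 3, 4, 5, 4]      # hypothetical coded responses
females = [3, 4, 2, 3, 4, 3, 2, 3]
print(round(one_way_anova([males, females]), 2))  # prints 11.67
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates the group means differ by more than within-group noise would suggest; a statistics library would then convert F to a p-value.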