Call to Action based on Opinion on Responsible Dual Use from the Human Brain Project

Please consider the following recommendations for dealing with dual use concerns relating to the misuse of brain research and new computing technologies. This is a call to action to follow these recommendations. Signers of this petition support the recognition and application of these recommendations in all countries.

All credit for the recommendations that this petition supports goes to the Ethics and Society group of the Human Brain Project (HBP) Subproject 12. Their paper can be found at this link: https://sos-ch-dk-2.exo.io/public-website-production/filer_public/f8/f0/f8f09276-d370-4758-ad03-679fa1c57e95/hbp-ethics-society-2018-opinion-on-dual-use.pdf


Index
1. Introduction
2. Dual use in a changing context: defence and security in the European Union
3. Responsible Dual Use
4. Political, Security, Intelligence and Military Research of Concern
5. Stakeholder and Citizen Views on Dual Use Research of Concern
6. Conclusion
7. Recommendations for the Human Brain Project
8. Recommendations for the European Union
9. Recommendations for Other Social Actors

1. Introduction

1.1
Current and newly emerging insights and technologies arising from research in brain sciences increase capabilities to access, assess and affect thought, emotion and behaviour. While much of this research and development is directed towards clinical use, it also has applications in other settings, notably in the political, security, intelligence and military (PSIM) domains. This is often referred to in terms of ‘Dual Use’. Many of these potential uses raise important social and ethical questions which demand the attention of all those involved in the research, administration, management and regulation of neuroscience research and related technological developments, including those in information and communication technologies and robotics.
1.2
For this reason, the Ethics and Society division of the Human Brain Project undertook research, organized a series of consultations, webinars, workshops and surveys with citizens, experts, scientists and engineers and other stakeholders, and developed a number of scoping reports to identify current and potential applications of brain research and brain inspired technologies in the above-mentioned domains and to analyse their social and ethical implications. In these activities, we explored the strengths and weaknesses of existing definitions of dual use, undertook conceptual clarification of the issues involved, described the scope of existing regulation in the EU and elsewhere and identified key ambiguities in those regulations and guidelines, including the undertakings that researchers are required to make before receiving EC funding. These reports form the basis of this Opinion and its recommendations to the Human Brain Project, to the wider neuroscience and ICT community, to authorities and industry concerned with political, security, intelligence and military research and development in neuroscience, neurotechnology and brain ICT, and to EU member states and the European Union.
1.3
In regulations concerning EU funding of research, it has been conventional to define ‘dual use’ as the military use of technologies developed for civilian purposes and to specify that such funding can only be provided for research with exclusively civil applications. A further set of regulations concerns the question of ‘misuse’ and requires applicants for research funding, and evaluators of research proposals, to consider the potential of the research for ‘misuse’, by which is meant “research involving or generating materials, methods, technologies or knowledge that could be misused for unethical purposes” despite the benign intentions of the researchers. A third set of regulations concerns export controls upon items with a potential for ‘dual use’; these controls are placed on the export of items outside the territory of the EU if it is believed they may be used in connection with a biological, chemical or nuclear weapons programme or a ballistic missile programme, in violation of an arms embargo, or in contravention of UN treaties and other legislation.
1.4
Other organizations, for example the World Health Organization and the US National Institutes of Health, focus on ‘dual use research of concern’, combining questions of dual use and misuse. Thus, for the US NIH, dual use research of concern is “life sciences research that, based on current understanding, can be reasonably anticipated to provide knowledge, information, products, or technologies that could be directly misapplied to pose a significant threat with broad potential consequences to public health and safety, agricultural crops and other plants, animals, the environment, materiel, or national security.” However, concerns will not always be related to ‘misapplication’, and the classification of a piece of research as ‘of concern’ will always be a matter of judgement subject to dispute. Hence it is not self-evident who should be empowered to make such a decision, how they should make it, and with what consequences.
1.5
In this Opinion, we suggest that we can increase our ability to identify which programmes and projects of research, development and innovation are ‘of concern’ by applying the principles of Responsible Research and Innovation (RRI) to the concept of ‘dual use’ and distinguishing between ‘responsible’ and ‘irresponsible’ systems of research and technological development. We therefore use the term ‘dual use research of concern’ (DURC) to refer to neuroscience research and technological innovations, and brain inspired developments in information and communication technologies, for use in the political, security, intelligence and military domains, which are either directly of concern because of their potential for use in ways that threaten the peace, health, safety, security and well-being of citizens, or are undertaken without responsible regard to such potential uses.
1.6
We focus here on those brain inspired neuro- and ICT technologies that are already in use or in advanced stages of development, for example, in warfighter ‘enhancement’, intelligence gathering, image analysis, threat detection, deception detection, manipulation of emotional states, incapacitation of adversaries, and the development of autonomous or semi-autonomous weapons, or weaponized robots using artificial intelligence technologies and machine learning algorithms for target detection and elimination.10 It is also important to note that some of these technologies are ‘of concern’ because they pose threats to the health, safety and security of those military, intelligence or security personnel who are required to deploy them. These developments are already affecting training and deployment of military, security and intelligence personnel, reshaping intelligence and surveillance activities, being used in control of terrorist incidents and civil unrest, being implemented in the battlefield of the present, being developed by non-State actors, and underpinning imaginations and strategies of nation states concerning the battlefield of the future.

2. Dual use in a changing context: defence and security in the European Union

2.1
The Lisbon Treaty on European Union states, in Article 3.1, that “The Union’s aim is to promote peace, its values and the well-being of its peoples” and emphasizes elsewhere the centrality of the Union’s contribution to peace and security.12 The Union also commits itself to respect the principles of the United Nations Charter, which states, in Article 1, that the purposes of the United Nations are to maintain peace and security, to prevent and remove threats to peace, to suppress acts of aggression, and to settle disputes by peaceful means. Nonetheless, Article 42 of the Treaty states that “common security and defence policy shall be an integral part of the common foreign and security policy. It shall provide the Union with an operational capacity drawing on civilian and military assets” and requires member states to “make civilian and military capabilities available to the Union…. Member States shall undertake progressively to improve their military capabilities. The Agency in the field of defence capabilities development, research, acquisition and armaments (hereinafter referred to as ‘the European Defence Agency’) shall identify operational requirements, shall promote measures to satisfy those requirements, shall contribute to identifying and, where appropriate, implementing any measure needed to strengthen the industrial and technological base of the defence sector”. Some commentators consider that, taken together, these obligations in the Lisbon Treaty produce an inescapable tension between the fundamental commitment of the European Union to the peaceful resolution of conflicts and the pressure for military innovation, which must draw upon scientific and technological innovation, including, today, innovation inspired by neuroscientific research and by developments in information and communication technologies and artificial intelligence. This tension sets the context for current dilemmas concerning dual use.
2.2
The research policy of the European Union is set out in successive Framework Programmes for Research and Technological Development – funding programmes whose aim is to support and foster research in the European Research Area. The current Framework Programme is called ‘Horizon 2020’ and the Human Brain Project is a Horizon 2020 Future and Emerging Technology Flagship Project. The European Commission specifies that “only research and innovation activities focusing on civil applications are eligible for funding under Horizon 2020.” However, in setting out which research can be funded, the Commission clarifies that this does not prohibit collaborations with defence or military related organizations, or research on defence-related subjects, so long as the “aims are exclusively focused on civil applications”. It seeks to clarify eligibility in a note of guidance which states that “In order to determine whether a project or proposal meets the conditions laid down in the regulation, the objective(s) of the proposed activity have to be assessed. If the technologies/products/services concerned are intended to be used in non-military activities or aim to serve non-military purposes, they will be considered as having an exclusive focus on civil applications.” However, since the objectives, aims and intentions of researchers do not delimit the potential uses of their research, these protocols make it difficult to identify exactly which kinds of research would be ineligible for funding within the context of Horizon 2020.
2.3
Further, the European Union, in common with many other organizations, places increasing emphasis on ‘open science’ through its vision of ‘Open Innovation, Open Science, Open to the World’.16 In urging researchers and innovators to make not only their results but also their data freely and openly available to the global scientific community, unless there are very specific reasons to the contrary, it also makes it difficult if not impossible for those undertaking research focussed on civilian uses to prevent or restrict their data and findings from being used for undesirable or irresponsible political, security, intelligence or military purposes.
2.4
Crucially, the context of dual use research in the EU is changing in response to growing concerns about terrorism, cyber-threats and potential challenges to military and defence capabilities posed by artificial intelligence and robotics. The European Commission launched the European Defence Fund in 2017, and will be offering grants for research into defence-related products and technologies, specifically within the areas of cybersecurity, big data, artificial intelligence, robotics and supercomputing. These developments demonstrate an increasing need to define the relationship between the EC’s civil research funding through its successive Framework Programmes and its emerging funding programmes for research with military, security and defence applications, especially as they apply to neurobiological research and brain inspired technologies.
2.5
Since the Second World War, the research and technological developments with which we are concerned in this Opinion have increasingly been undertaken by commercial companies. Such private corporations undertake their own research, but also make use of publicly funded and openly available research. Products embodying such research are then sold to State and non-State actors. This military-industrial complex, made famous through the farewell address of President Dwight D. Eisenhower in January 1961,17 is undoubtedly most highly developed in the United States, but military-commercial relations of this sort are present in all advanced industrial societies. Military, security and intelligence organizations sometimes simply ‘harvest’ commercial technologies and adapt them for their own purposes, but more substantively, most national security and defence agencies depend for many of their capabilities on the procurement of key elements of their technological requirements from commercial organizations. In the context of a pervasive technological arms race, companies seek commercial gain by developing the most advanced technological products that will attract such military, security and defence contracts. Many of these products already embody innovations inspired by neuroscientific research, and by research in information and communication technologies, robotics and artificial intelligence.21 These powerful commercial drivers create complex challenges for policies and practices to regulate the use of neuroscience research and neurotechnological innovation for political, security, intelligence and military purposes.

3. Responsible Dual Use

3.1
Through its social and ethical research, public engagement and ethics management, the Ethics and Society division promotes Responsible Research and Innovation (RRI) practices within the HBP. In the 2014 Rome Declaration on Responsible Research and Innovation in Europe, RRI is defined as an “on-going process of aligning research and innovation to the values, needs and expectations of society”;22 in the HBP, the AREA framework - Anticipate, Reflect, Engage and Act - is used to help implement RRI in relation to emerging social and ethical concerns.23 Responsibility, here, does not simply refer to responsible conduct by individuals, which is established by professional codes of ethics, by regulations requiring ethical approval of research by relevant ethics committees, and by a framework of laws and regulations surrounding different kinds of scientific research: in the HBP, this aspect of RRI is overseen by an apparatus of ethics management. More broadly, in the context of RRI, responsibility refers not to fixed norms but to processes and practices within research and development systems, and to the extent to which these encourage or constrain the capacity of all those involved in the management and operation of research to reflect upon, anticipate and consider the potential social and ethical implications of their research and to encourage open discussion of them, with a view to ensuring that their research and development does indeed contribute to the health and well-being of citizens, and to peace and security.
3.2
There are many features of research and development organizations that can reduce their capacity to meet the challenge of responsibility, leading to the neglect of ethical principles and a lack of precaution and foresight. These features are particularly pronounced in a context where State and non-State actors seek to prevail over adversaries through technological superiority. They include ‘technology push’ – that is to say, the wish to find a market need that a given novel technology might fulfil – and ‘market pull’ – that is to say, demands from potential customers for a technology that will meet their needs. While these are of particular relevance in the commercial domain, they also shape publicly funded research in a context in which universities seek to enhance their income through the licensing of valuable intellectual property and the creation of ‘spin-out’ companies. Even where research is not explicitly directed to commercial ends, research funders increasingly demand promises of rapid impact, and this can militate against careful and considered reflection and anticipation of the potential consequences of new technologies.
3.3
A policy of responsible innovation thus aims to introduce social and ethical reflection into the innovation process, to mitigate systemic forces that work against responsibility, and to open up the decision-making process at all stages to a wider range of social actors. In order to achieve this, those involved in the research, and in its management and direction, must be able to explain clearly and openly what they are doing, why they are doing it, what the potential benefits and risks are, and how the risks are to be controlled. We suggest that introducing these principles of RRI can clarify what might count as Dual Use Research of Concern, and can thus help underpin a framework for governing the relationship between brain research and development for civilian applications and its uses in the political, security, intelligence and military domains.
3.4
Some scientists and ethicists take the view that any use of scientific research for military purposes is unethical, violating the principle that scientific research should solely be undertaken for the purposes of peace and well-being. They argue that even when justified in the name of defence, such research actually reduces the possibility of the peaceful resolution of conflicts. When articulated within an ethic of pacifism, such a stance is internally coherent and it is not the role of this Opinion to adjudicate upon it. However, for the present and foreseeable future, armed conflicts between nations will endure, asymmetrical warfare between State and non-State actors is likely to increase, demands will continue for novel technologies to enhance internal and external security, and a powerful arms industry will seek to develop and market technologically enhanced products drawing on research in neuroscience and information and communication technologies. This Opinion seeks to set out a framework to promote and enhance responsibility among those involved in the research and development of such technologies.
3.5
It is important to acknowledge that research and development conducted in the security, military and defence domains can bring important social benefits, but also to recognise that such developments can generate new dilemmas. Consider, for example, the Internet and GPS, which were both developed by the US military, the development of neuroprosthetics for war veterans, or research into treatments for Post-Traumatic Stress Disorder. Each has generated great social benefits, yet each can also be applied in ways that raise social and ethical concerns. Thus while the Internet has revolutionized communication, it also facilitates global surveillance of the activities of individuals without their consent. The development of prosthetics controlled by brain-computer interfaces also facilitates the control of robotic weapons from locations remote from the battlefield, which can insulate military personnel from awareness of and responsibility for the consequences of their decisions. Psycho-pharmaceuticals such as propranolol can aid in the treatment of post-traumatic stress disorder among war veterans, but it has also been suggested that they could be used to facilitate extreme methods of interrogation, both by mitigating the effects on suspects and by reducing the consequences for interrogators, through blocking the reconsolidation of their traumatic memories of the events in which they have participated.
3.6
It is also important to recognise that the identification of research ‘of concern’ is seldom straightforward. For example, the accuracy of targeting of weaponized drones might be increased by brain inspired guidance technologies, and weaponized robotic devices might be used to defend against terrorist attacks; in each of these cases such technologies might lower the threshold for a decision to attack adversaries, but might also minimise both military and civilian casualties. There will inevitably be debate about whether research and development that enables such uses is ‘of concern’; application of the principles of RRI does not seek to eliminate such debate but to enable it, to build capacity to reflect on the issues involved, and to engage researchers and other stakeholders in the decision process.
3.7
In any event, there are no clear divisions between civilian research and military, defence or security research. In many universities, there is considerable interpenetration between civilian research and research that is funded by the security, military and intelligence sectors. This is despite the fact that the constitutions of universities in some countries seek to restrict research funded by, or directed to, the military. For example, many US universities have policies that restrict or prohibit “classified” research, and some German universities have a “civil clause” that requires the university to conduct research exclusively for peaceful (civilian) purposes and excludes military research. However, evidence demonstrates that leading US and German universities receive large sums in grants from military organizations and defence departments, despite the reservations of many of their academics. The same is certainly true in the universities of many other nation states.
3.8
For these reasons, simple attempts to draw a clear ethical distinction between civilian and military research on the basis of the aims, objectives or intentions of the researchers, or on the basis of the organization or institution where it is carried out, are unhelpful. It is also not always appropriate to draw this distinction in terms of military or civilian funding sources; for example, a considerable proportion of the portfolio of research in the US BRAIN Initiative was channeled through the Defense Advanced Research Projects Agency (DARPA), on the basis that the outcomes would have major civilian benefits in understanding brain function, as well as potential applications in the military, ranging from rehabilitation of wounded warfighters to more effective weapons systems.26 Protocols and regulations that utilize such simplistic distinctions are thus likely to fail to prevent the problems that they purport to address. Challenging as it may be, it will always be necessary to use judgement to identify those dimensions of research and technology development that are of concern because they are being developed or deployed in systems, organizations and practices that militate against responsibility, that is to say, that do not permit, encourage or respond effectively to consideration of their potential for uses that threaten the peace, security, health and well-being of citizens and societies.

4. Political, Security, Intelligence and Military Research of Concern

4.1
In this Opinion, we focus on four domains of application of brain-inspired research and innovation that may raise dual use issues of concern: Political, Security, Intelligence and Military. While many nations are engaged in the development of such technologies, the United States remains the world leader in terms of publicly acknowledged investments in this area. Thus Tennison and Moreno argue that “during the past decade [the first decade of the twenty-first century], the US national security establishment has come to see neuroscience as a promising and integral component of its 21st century needs”.27 The authors estimate that in fiscal year 2011 DARPA invested about US$240 million in such research,28 the Army invested around US$55 million, the Navy some US$34 million, and the Air Force approximately US$24 million. The US Intelligence Advanced Research Projects Activity (IARPA) also funds security-related neuroscience and artificial intelligence research projects, for instance those seeking to understand, and access, the ways in which knowledge is represented and stored in the brain.29 While much information on such research and development is freely available in the US, it is safe to assume that analogous research and development is being pursued in many other nation states.
4.2
By political uses, we refer to the use of neuroscience or neurotechnologies by state authorities to govern or manage the conduct of individuals, groups or populations, for example by changing or manipulating attitudes, beliefs, opinions, emotions or behaviour.30 Thus, for example, neuroscientific research on the non-conscious determinants of decisions can be used to manipulate individuals’ choices without their consent. Substances such as the hormone and neurotransmitter oxytocin can be used to inspire trust and facilitate ‘pro-social’ conduct. There are many other, somewhat speculative, accounts of how neuroscience research can underpin methods to covertly shape decision making in desired directions.
4.3
There are also many ways in which neuroscience and neurotechnologies can be deployed in the name of security, that is to say in the strategies of pre-emption and preclusion that increasingly characterise the activities of the security apparatus in the name of protecting the nation against perceived internal threats arising from civil disobedience, terrorism, and associated risks. For example, in the US, IARPA funds projects such as Knowledge Representation in Neural Systems (KRNS), which seeks insights into the brain’s representation of conceptual knowledge, and Machine Intelligence from Cortical Networks (MICrONS), which aims to reverse-engineer the ‘algorithms of the brain’ to revolutionise machine learning.34 A further example in the domain of security relates to the use of neuroscientific research to develop ‘non-lethal’ or ‘less lethal’ nerve agents, such as ‘calmatives’, whose use is prohibited in warfare under various treaties, but which some consider to fall outside the scope of these treaties when used to control crowds of demonstrators in the name of national or homeland security.
4.4
Neuroscience and neurotechnologies initially devised for civilian brain research or clinical use are also increasingly utilized in the work of intelligence agencies.36 Brain imaging technologies are being employed for the detection of lies or deception in interrogation, despite the concerns of many lawyers and ethicists.37 Neurofeedback and brain-machine interfaces are being used to augment the performance of intelligence analysts.38 Machine learning is being used in the analysis of data from surveillance systems, and also to identify and apprehend individuals on suspicion that they might commit terrorist acts.39 Civilian researchers themselves have raised concerns about partnerships between commercial companies and military organizations that seek to weaponize the capacities of artificial intelligence.40
4.5
There are also many military applications of contemporary developments in neuroscience and neurotechnology that are already in use, and many more are in development or under consideration.41 While there is a long history of the use of brain altering pharmaceuticals in warfare, there is considerable interest in the potential of novel psychopharmaceuticals to modulate cognitive capacities and emotions, for example the use of drugs such as modafinil, developed to treat sleep disorders, to prevent the degradation of performance arising from sleep deprivation.42 Many of the technologies for ‘warfighter enhancement’ have raised particular concerns.43 Developments in this domain include non-invasive neuromodulation to enhance threat detection,44 and the use of implanted brain-computer interfaces to read and modulate neural activity. DARPA’s SUBNETS programme, announced in 2014 as part of the BRAIN Initiative, was directed towards clinical applications, seeking “to create an implanted, closed-loop diagnostic and therapeutic system for treating, and possibly even curing, neuropsychiatric illness.”45 In 2016, in a further development with implications for warfighter enhancement, DARPA announced a $60M project, also part of the BRAIN Initiative: the Neural Engineering System Design (NESD) programme, which “aims to develop an implantable neural interface able to provide unprecedented signal resolution and data-transfer bandwidth between the human brain and the digital world. The interface would serve as a translator, converting between the electrochemical language used by neurons in the brain and the ones and zeros that constitute the language of information technology”: its aim was thus not only to ‘read’ the brain but to enable its activity to be digitally modulated.46
4.6
Research that draws on technologies developed to control prosthetic limbs, and seeks to combine these with artificial intelligence technologies and neural networks in the development of robots and autonomous weapons, is particularly controversial, not least because of the risk of action being taken by such autonomous agents without due deliberation about key issues of ethics, proportionality and consequences. These matters have been of particular concern to AI researchers, leading to the formation of the International Committee for Robot Arms Control (ICRAC);47 many governments have discussed these concerns in meetings of the UN Convention on Certain Conventional Weapons (CCW),48 and the United Nations has considered potential pathways to ban ‘Lethal Autonomous Weapons Systems’ (LAWS), especially in a context where some countries have said they will ignore any such ban.

5. Stakeholder and Citizen Views on Dual Use Research of Concern

5.1
In preparing this Opinion, we carried out interviews, face-to-face engagement and in-depth online surveys with stakeholders and citizens, in line with the aspirations of RRI to open up the research and innovation process to debate and discussion. Input from these consultations has informed our analysis and recommendations, and we highlight below the major themes that emerged.
5.2
Stakeholders included representatives of neuro-related industry, researchers, peace NGOs, military academies and human rights actors.50 Concerns were expressed about the risks of a new arms race based on technologies which, by their nature, could be difficult to control. On the other hand, some stakeholders emphasized that some of these technologies could reduce the consequences of conflict, for example by avoiding collateral damage. There was a call for stronger societal influence in the steering of dual use related research. Some stakeholders drew attention to the contradiction between, on the one hand, the active use of semi-autonomous systems in current conflicts and, on the other, simultaneous calls for strong implementation of rules of engagement. Some believed that the best path was to resolve this contradiction through stronger regulation of the design of the technology itself, while others believed that the way forward was through stronger restrictions in rules of engagement.
5.3
Several stakeholders argued that there was a strong need for education on dual use and related ethical and societal considerations. Concerns were raised about the fact that curricula in research areas of relevance to dual use are lacking or at least very weak on these issues.51 Calls were made for the development of concrete education programmes involving ready-made and easily accessible educational materials and group learning processes.
5.4
Citizens believed there were insufficient powers available for preventing or controlling the development of dual use technologies.52 Their concerns were about civilian use as well as dual use in political, security, intelligence and military domains. Most believed that the Human Brain Project should not allocate funds directly for dual use research, but many accepted collaboration with institutions doing dual use research. This reflects a sense that there was a need for stronger separation between civil research and dual use research.
5.5
When citizens were asked to consider three areas in which neuroscience and neurotechnologies could be used - brain-computer interfaces, medicine and artificial intelligence - their concerns focussed on the following: changing or controlling personality, the mind and free will; increased surveillance and infringement of privacy; and the risks of hacking future dual use related technologies. As with stakeholders, citizens supported the development of international regulation, steering and control mechanisms, concretely in the form of an EU standing committee. A majority of their suggestions for change were directed towards policy-making. Despite concerns about dual use, the consultations revealed strong support for continued investment in neuroscience, mainly because of the belief that it would increase the options for new treatments.

6. Conclusion

6.1
The boundaries between civilian and non-civilian uses of neuroscience and neurotechnology are increasingly difficult to delineate. It is not only that the openness of scientific research makes it almost impossible to control its irresponsible uses, or its use by those with malign intent. It is also that it is increasingly difficult to disentangle the respective contributions of civilian research and research funded with a direct military objective in relation to military technology development and use.
6.2
While there are tremendous civilian benefits to advances in these fields, it is imperative to distinguish between the responsible and irresponsible development and deployment of neuroscience and neurotechnology within the political, security, intelligence and military domains. Recognizing that such developments occur along a complex pathway,53 it is necessary to distinguish between appropriate measures at the different stages of that pathway – basic research, development of applications, processes of adoption, regulation of uses - in order to identify those aspects that are of concern because of inadequate consideration of their potential for uses that threaten the peace, security and well-being of citizens, including the well-being of military, intelligence and security personnel themselves, and hence that require special attention or regulation.
6.3
The requirement in existing European Commission Framework Programme funding for an ‘exclusive focus on civilian applications’ thus needs to be considered in light of the potential of many innovations in neuroscience and ICT to be developed or deployed within political, security, intelligence or military domains. Hence, there is a need to re-examine the relationship between civil and military research funded by the European Union and its agencies, recognizing that this distinction does not adequately identify those kinds of research and innovation that potentially raise dual use concerns as defined in this Opinion and therefore should be the subject of particular scrutiny.
6.4
We consider that these issues require ongoing attention, not only within the Human Brain Project itself, but also in the European Union - its member states, the Commission and the European Parliament - and in the wider neuroscience and ICT community. Our recommendations are proposed to that end.

7. Recommendations for the Human Brain Project

We note that, in the light of the issues discussed in this Opinion, the Human Brain Project Governance Bodies have mandated Subproject 12 (Ethics and Society) to lead an HBP Working Group to develop an action plan on Dual Use, for approval by the Governance Bodies, and for implementation during and after the HBP project period. The Working Group is mandated to engage with the European Commission, with other relevant expert and stakeholder groups and with the public, and to propose actions to the Human Brain Project and other stakeholder groups on the following recommendations.
7.1
We recommend that the Human Brain Project evaluates the potential implications for dual use research of concern of the HBP programme as a whole, as well as examining its individual components. That is to say, we recommend examining whether individual elements of the HBP that have no obvious dual-use implications on their own may present dual-use concerns in combination with others.
a. We recommend that this be a key role for the Ethics Rapporteurs in each Sub-Project, and hence that the Co-Design Projects should also engage in the Ethics Rapporteur process.
b. We recommend that appropriate mechanisms for ongoing periodic review should be implemented.
7.2
We recommend that all those accessing and using the Human Brain Project platforms, as a condition of access, must:
a. explicitly affirm their commitment to the principles of responsible research and innovation in their research, and their commitment to ensure, to the best of their abilities, that their work will not be used in ways that threaten the peace, security, health and well-being of citizens, by signing a statement of intent or a functional equivalent.
b. make this ethical intent explicit in a formal statement to be included in publications and proposals of scientific work.
7.3
We recommend that the Human Brain Project gives careful consideration to the potential political, security, intelligence and military uses of concern arising from Pre-Commercial Procurement from private companies, and ensures that such private contractors are required to put in place transparent and auditable processes of ethical governance, demonstrating a commitment to the principles of responsible research and innovation as they apply to dual use research of concern, and to confirm their adherence by establishing a formal policy.
7.4
We recommend that, in the light of considerations of dual use research of concern, the Human Brain Project Science and Infrastructure Board considers:
a. whether and on which conditions to partner with institutions and projects that receive military funding, so as to ensure that any research funded by the HBP does not contribute to dual use research of concern.
b. whether and on which conditions to provide platform access to individuals or institutions with funding or other ties to defence agencies.
7.5
We recommend that the Human Brain Project develops an educational programme concerning the political, security, intelligence and military uses of brain inspired research and development:
a. that provides ongoing seminars/symposia, webinars, publications, and online informational material addressing the potential irresponsible uses of HBP research in political, security, intelligence and military domains
b. that directs this educational material to all HBP researchers as well as members of the public, governmental agencies, policy makers, regulators, research funders, etc.
c. that requires all HBP research personnel to participate in these educational activities on a defined, regular basis.

8. Recommendations for the European Union

8.1
We recommend that the European Commission extends its policies on dual use research, beyond a focus on the aims, objectives and intentions of the researchers, to ensure that adequate processes are established across the research and development pathways for proper and transparent consideration and effective mitigation of the risks that may be posed to the peace, security and well-being of citizens, including the well-being of military, intelligence and security personnel themselves.
8.2
We recommend that the European Commission addresses the tension between the policy of ‘Open Innovation, Open Science, Open to the World’ and the need to regulate and restrict dual use research of concern.
8.3
We recommend that future large-scale Future and Emerging Technology Flagship projects and the mission-oriented elements of Framework Programme 9 should include specific activities to ensure RRI and ethical governance, including assessment of potential dual use research of concern.
8.4
We recommend that dual use research of concern becomes a research theme in EU research programmes, an element in the RRI cross-cutting issues and in sub-programmes such as the Horizon 2020 Science with and for Society (SwafS) programme.
8.5
Given that there are a number of active conventions and treaties in this area, we recommend that the EC should ensure that all Framework Programme partner countries are signatories of all relevant international treaties and conventions, have ratified them, and are in compliance with them.
8.6
In order to address these issues as they arise in the present and the near future, we recommend that the European Commission should establish a Standing Committee or High Level Advisory Board, which has a multi-actor composition, to have oversight of all EC funded research with political, security, intelligence and military potentials, to review the existing regulations in the light of the issues raised in this Opinion, and to report to the Commission and to Parliament on these issues, formulating recommendations concerning future EC funding and related issues as appropriate.

9. Recommendations for Other Social Actors

9.1
In a context in which the lines between law enforcement and national security are increasingly blurred, we recommend that work is undertaken by relevant international bodies to refine and extend existing treaties such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention, to remove the ambiguity over the ‘domestic’ use of ‘calmatives’ and ‘less-lethal’ agents, and to address issues raised by novel methods such as gene editing that may enable the weaponization of neurobiology.
9.2
We recommend that all higher education training and post-university professional training of neuroscientists and neurotechnology engineers, including researchers in robotics and artificial intelligence, should be required to include education in the social and ethical issues raised by their work, including questions of dual use.
9.3
We recommend that universities and other organizations engaged in neuroscientific, neurotechnological and neurorobotics research, including those funded from military and defence sources, establish policies for self-regulation in relation to dual use research of concern and related processes for technology assessment, together with transparent processes of ethical governance. These should include the education of managers and researchers in the principles of Responsible Research and Innovation, the screening and monitoring of research and development, and the mitigation of dual use issues of concern as and when they arise.
9.4
We recommend that industry and corporations involved in artificial intelligence, machine learning and autonomous intelligent systems establish ethical review or oversight panels as part of transparent processes of ethical governance, and seek to build on existing developments to raise capacity for ethical awareness among their researchers.