A new report on artificial intelligence from the Congressional Research Service says the technology will have a revolutionary impact on warfare.
The report also says that AI development presents unique challenges for military acquisitions and suggests that the Defense Acquisition Process may need to be adapted for acquiring systems like AI. It states that AI acquisition faces a number of challenges, primarily cultural ones, which create friction with the companies developing the technology and a military aversion to adapting weapons systems and processes.
The U.S. military is already integrating AI into combat through Project Maven, which is using the technology to identify insurgents in Iraq and Syria. The report highlights a range of areas in which the military is looking to integrate AI, including intelligence and surveillance, command and control, lethal autonomous weapons systems, autonomous vehicles and cyberspace.
The study, by Daniel Hoadley and Nathan Lucas, points out that the U.S. is under immense pressure from international rivals competing to develop innovative military AI applications. Last year, China released its own plan to capture the global lead in AI development by 2030, and Russia is also highly active in developing the technology, particularly in the area of robotics.
The study also weighs the risks associated with AI against its benefits. For example, analysts believe the technology is unpredictable and vulnerable to manipulation, which presents challenges for human-machine interaction. SpaceX CEO Elon Musk sent a letter to the UN last year, co-signed by 114 leaders, which warned that autonomous weapons fueled by AI will “permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans comprehend.”
According to the report, there are several issues Congress needs to consider for military AI development, including the right balance of commercial and government funding for AI, and what influence Congress should have on Defense Acquisition reform initiatives as they apply to AI.
Funding for AI has been a particular area of concern, with one expert telling the Senate Committee on Commerce, Science, and Transportation last year that “federal funding for AI research and development has been relatively flat, even as the importance of the field has dramatically increased.”
Lieutenant General John Shanahan, who leads the Pentagon’s most prominent AI program, also identified funding as a barrier to AI development, and a study from Govini found that DoD spent $800 million on AI in FY2017.
Those opposed to increased federal funding for the technology, though, say that the military ought to take greater advantage of the research conducted by the commercial sector. Guidance from OMB directs DoD to “identify existing R&D programs that could progress more efficiently through private sector R&D, and consider their modification or elimination where federal involvement is no longer needed or appropriate.”
The study also highlights the need for Congress to fully implement AI oversight, with several experts calling for separate AI bodies within DoD. For example, in 2016 the Defense Innovation Board, chaired by then-Alphabet executive chairman Eric Schmidt, recommended the creation of an AI and Machine Learning Center of Excellence inside DoD “to spur innovation and transformational change.” Lethal autonomous weapons systems (LAWS) are another area of contention: many experts worry that efforts to control LAWS will stifle the development of other useful military AI, because strict controls could extend to any application that might be put to use in a lethal system, even one not fundamentally designed for that purpose.
It goes on to point out some of the acquisition challenges DoD faces in implementing AI, such as the potential for human manipulation and the greater risk to human life posed by military applications of AI algorithms compared with other sectors. Beyond these technological adaptation challenges, the study notes that DoD may need to adjust its acquisition timelines and processes to match those of commercial companies. “Defense acquisition processes might not be agile enough for fast-paced software systems like AI,” it says.
While the governing DoD Instruction 5000.02 stipulates a five-phase process that takes an average of 91 months to reach operational capability, commercial companies might deliver an AI product in six to nine months. The report cites a GAO survey of 12 U.S. companies that chose not to do business with DoD; all 12 stated that the complexity of the Defense Acquisition Process was the reason.
The report concludes that Congress needs to take action to address some of these issues in relation to AI and national security: “Congressional action on AI funding, acquisitions legislation, development of AI norms and standards, and issues of international competition has the potential to significantly shape the trajectory of AI technology, and experts agree that continuous evaluation of legislative actions may be necessary to keep this technology pointed in a direction that preserves U.S. national security.”