House Armed Services Digs Into Information Warfare

First, my first Lawfare article, on data brokering and national security, has been posted.

Second, this piece has been cross-posted at The Wavelength, my subscription newsletter. Subscribe today if you want to receive these posts in your inbox.

The House Armed Services Committee’s Cyber, Innovative Technologies, and Information Systems Subcommittee conducted a 30 April hearing titled “Technology and Information Warfare: The Competition for Influence and the Department of Defense.”

Twitter

Will the U.S. catch up to Russia and the PRC on information operations?

Cocktail Party

A House committee continues to bang the drum on military information operations. A panel of experts advised the committee on how the Department of Defense (DOD) in particular and the United States (U.S.) in general can better conduct information warfare. There was agreement that the Russian Federation and People’s Republic of China (PRC) were far more advanced in this realm and that the U.S. has not yet begun to take this new area of competition seriously.

Meeting

The subcommittee chair, Representative James Langevin (D-RI), intimated the DOD is not moving fast enough or effectively enough to combat Russian, Chinese, and other information operations. He also pinned blame on the last administration and called on the Biden Administration to articulate an effective strategy. Langevin conceded that Congress may not have provided a sufficient level of resources for the DOD.

Langevin’s counterpart, Ranking Member Elise Stefanik (R-NY), was even more direct in her disapproval of how the Pentagon has used the authority on information operations Congress has provided and its disregard of Congressional intent.

The witnesses offered a buffet of options as to how the U.S. and DOD can pull even with their adversaries in information operations, much of which requires buy-in from and direction to the Biden Administration.

Geek Out

Chair James Langevin (D-RI) (watch his opening statement) contended the United States (U.S.) is at a disadvantage to the Russian Federation, the People’s Republic of China (PRC), and terrorist organizations in the information warfare realm. He asserted these entities use information warfare to seek and gain asymmetrical advantage against the U.S. and to undermine the international order and democratic values. Langevin stated the recently released Intelligence Community’s Worldwide Threats Assessment made clear U.S. adversaries are using information warfare to undermine the U.S. by sowing discord among citizens. He noted these efforts have turned what was once a U.S. strength, its informational advantage, into a weakness.

Langevin claimed that at present the U.S. military is at an immense disadvantage in the information environment. He stressed that the U.S. is under threat even though adversaries do not need to engage physically or cross U.S. borders. Langevin predicted these threats will grow as artificial intelligence, machine learning, and other technologies advance, allowing the speed and scope of these operations to increase exponentially. He quoted the National Security Commission on Artificial Intelligence (NSCAI), which warned that AI will make information operations much harder to counter.

Langevin declared his solution to information operations entails a forceful U.S. response to deter bad actors, investment in robust public diplomacy, and education of U.S. residents and servicemembers. He added the U.S. must articulate a vision for the information environment and delineate thresholds that will trigger a response. Langevin lauded the NSCAI recommendation that the U.S. develop a new strategy to counter disinformation while investing in technology to counter AI-enabled warfare.  Langevin stated the subcommittee will examine how the DOD is structured to confront information operations and disinformation, cyber threats, the electromagnetic spectrum, military information operations, deception, and operational security.

Langevin noted the committee has pushed the DOD to adapt to the new and evolving information environment and asserted the DOD has a key role to play. Langevin said the committee and Congress have pushed the Pentagon to adapt to the weaponized information environment, including by creating the position of the Principal Information Operations Advisor. He voiced his concern that the DOD has been slow to adapt to the current information operations realm. Langevin quoted a memorandum most of the combatant commanders wrote asking for more assistance and greater action in pushing the DOD to evolve along with current conditions.

Ranking Member Elise Stefanik (R-NY) (watch her opening statement) asserted information warfare is one of the most complex and important missions the DOD undertakes. She claimed that just as in the large-scale wars of the past and today’s gray zone operations, shaping the information environment is frequently critical. Stefanik stated that while it is important to target and erode support for adversaries, winning hearts and minds remains the ultimate objective of information operations. She quoted a former senior advisor to a Secretary of Defense who said victory comes when the enemy speaks your language and embraces your ideas. Stefanik declared Russia, the PRC, Iran, and non-state actors are weaponizing information to undermine the U.S. and its interests, deploying asymmetric information capabilities rather than engaging in traditional military competition. She asserted the U.S. must not only fend off these efforts but also deploy its own capabilities to exploit and shape the environment.

Stefanik remarked that the media and online world of today are much different than in the past, with new technology allowing words and ideas to spread faster and wider than ever before. She said that in the future, international competition, diplomacy, and military operations will be ever more based on “human-centric networks and patterns.” Stefanik claimed the DOD and IC recognize this and are adapting to this new landscape. She noted Congress has given the Pentagon clear authority to conduct information operations. Stefanik said Congress expects the DOD to use this authority and argued it is not feasible to rely only on Special Forces to conduct information operations. She asserted the entire DOD must use information operations that are effective and positively shape the environment.

Stefanik remarked that Congress required the DOD to assess and report on its information operations two years ago, but Congress is still awaiting this briefing and strategy. She noted the subcommittee has jurisdiction over these matters, but without input and cooperation from the DOD, it is difficult for the subcommittee to support the Pentagon. Stefanik noted Congress has created the position of Principal Information Operations Advisor to establish one person as the overseer of military information operations. She expressed disappointment that this position was “layered under” the Under Secretary of Defense for Policy contrary to Congressional intent. Stefanik stated this position was not established to be part of the larger bureaucracy but to provide an agile, unified approach to information operations.

Stefanik stated the NSCAI recommended reforms to address AI-enabled information threats as well as increased coordination with the Department of State’s Global Engagement Center to counter propaganda targeted towards the U.S. She said she wanted to hear how the DOD could work with the IC to better handle and conduct information operations and how the department can protect the U.S. as adversaries continue to wage a persistent information war on U.S. interests abroad and in the U.S.

Former National Security Agency General Counsel Glenn Gerstell (watch his opening statement and read his full written testimony) argued:

  • We know disinformation is already a big problem, and we fear it could be even worse, so why haven’t we done something about it? As with any complex problem, there are many answers.
  • There are steps we can take to start to fix the problem. No one solution is at hand, but we have tools at our disposal to use and they will, bit by bit, make a difference. I’ll mention just three that will help attenuate the threats to our national security.
  • Probably the most obvious tool is the law, but we first have to get over what seems like a big obstacle. We want neither government nor the private sector to be the final arbiter of the truth or the decider of what we hear and see. Yet allowing the private sector to profit from manipulating what we view online without regard to its truthfulness or the consequences of viral dissemination is simply not sensible public policy. But it’s not all or nothing; there is room for some thoughtful regulation. After all, the First Amendment applies only to government and not to private businesses.
  • So there’s room for Congress to act in tightening rules on political campaign ads, perhaps by making certain knowing or intentional falsehoods illegal, such as deliberately spreading incorrect information about polling places – much in the way that the law prevents someone from filing a false police report. Admittedly, there is a delicate line between a prank or spoof and something clearly malicious and potentially illegal. But the mere fact that the line may be difficult to draw need not preclude legislation that provides a framework for that process. As has been the subject of recent Congressional attention, some amendment of Section 230 of the Communications Decency Act could be helpful. However well-intentioned at the time of its adoption, the law has come to insulate the business models of social media platforms that are the source of information for billions of people around the globe. These ad-driven models rely on secret, complex algorithms that micro-target users, facilitating the forwarding of material without regard to its accuracy, thus allowing falsehoods to go viral, and often amplifying problematic material.
  • Another obvious tool is the technology itself. The very technology that helps spawn the problem can be used to correct it too, with AI helping social media platforms spot lies in the first place, identify doctored videos and photographs, and track the dissemination of falsehoods by both domestic and foreign users. And after social media was awash in disinformation during the pandemic and this last election, the platforms finally abandoned their hands-off approach and were more muscular in blocking objectionable content and taking down sham or malevolent accounts. True, there will always be difficulty in deciding what’s sufficiently objectionable or incorrect to warrant labeling or even removal – but again, just because it’s tough to draw the line doesn’t mean we shouldn’t even start. One helpful step would be greater transparency about how such decisions are made, and how a platform’s algorithms make recommendations and curate what we see and hear.
  • Finally, there’s a whole range of other steps that can be taken beyond regulation of social media platforms. For example, we could promote international coordination to stop the export of disinformation or to bring cross-border cyber criminals to justice. We could do a much better job of organizing our federal government in a coherent way to fight disinformation, perhaps by setting up a national disinformation center within our intelligence community, just the way we’ve successfully done with the national counterterrorism center. The Intelligence Community could work in a more integrated way with the military to counter adversaries’ ongoing malign influence campaigns. Saving the potentially most profound step for last, we would garner rich benefits by teaching digital literacy and putting civic education back in our schools – so that disinformation, whether foreign or domestic, will be less likely to take hold in an educated and cyber-sophisticated populace.

Wilson Center Disinformation Fellow Nina Jankowicz (watch her opening statement and read her full testimony) argued:

To meet the challenge of perpetual information competition, the Department of Defense and broader United States Government should organize themselves around a posture of Enduring Information Vigilance. This framework sets out how the USG, through the “three Cs”—capability building, inter-office and interagency coordination, and international cooperation—can work more effectively to detect the vulnerabilities that adversaries exploit, manage those attempts, and ultimately deny adversaries any benefit.

1. Capability: Beyond Discrete Campaigns

In ensuring that the DOD workforce is capable of proactively monitoring and identifying informational vulnerabilities that U.S. adversaries might use in information operations, the old military adage “don’t operate the equipment, equip the operator” is prescient. Tools for detecting online campaigns and inauthentic activity have developed rapidly in recent years, and parts of the national security infrastructure have adopted them, but none of these tools is a panacea without skilled staff and a baseline of resilience in the general population.

Enduring Information Vigilance relies on skilled people with a nuanced understanding of the threat who are capable of applying the full range of tools and techniques for monitoring, detecting, and responding to information operations. Section 589E of the 2021 NDAA, which “establish[es] a program for training members of the Armed Forces and civilian employees of the Department of Defense regarding the threat of foreign malign influence campaigns targeted at such individuals and the families of such individuals, including such campaigns carried out through social media” is an excellent starting point for these efforts, given that active-duty personnel and veterans have both been targets of state-sponsored information operations in the recent past; veterans were also a key contingent among those who stormed the Capitol on January 6. As this program is implemented, it is critical that training is produced together with nonpartisan subject matter and pedagogical experts and is engaging and well-resourced. This broad-based training would reach the 2.75 million active-duty, reserve, and civilian employees of the Department of Defense and could also be rolled out to all civil servants and their families across the Federal Government; a bill providing for such a program is being spearheaded by the Task Force on Digital Citizenship and the Office of Congresswoman Jennifer Wexton.

Beyond such a broad resilience-building program, it is critical to equip specialists with the training and tools they need. The National Security Commission on Artificial Intelligence (NSCAI) suggests the establishment of a “Digital Service Academy to train current and future employees,” though other nations’ efforts suggest such training need not be relegated to a standalone body. Instead, a more agile and responsive training program might be integrated into employees’ regular professional development activities. U.S. allies have adopted a similar approach; the UK Government trains its public-sector communications personnel on the “RESIST” toolkit, which emphasizes the importance of understanding the objectives of information operations when formulating appropriate responses. Critically, the toolkit points out:

The speed and agility of your response is crucial in countering disinformation. This can mean working to faster deadlines than is usual and developing protocols for responding that balance speed with formal approval from senior officials.

This is not the DOD’s—or the Federal Government’s—strong suit. Proactive, creative communications are often stymied and stifled by government clearance processes, resulting in ineffective and even embarrassing products that have little chance of countering sometimes-slick adversarial operations.

2. Coordination: All Sectors, At All Times

The breadth of activity related to hostile state information operations, whether Russian campaigns or China’s “three warfares” approach, spans the remit of multiple government agencies. The Department of Defense and wider USG must break out of siloed national security thinking, coordinate more effectively, and provide space for cross-sector cooperation. From hard security and defense to cultural activity and media, as well as many other realms of society not typically situated at the forefront of foreign interference, hostile states have the potential to exploit the government’s difficulty in working effectively across traditional departmental boundaries. This “bureaucratic vulnerability” can lead to poor information flow, competition for resources and influence, or the exclusion of key stakeholders.

These shortcomings emphasize the need to work more effectively across government. Newly built capabilities required for monitoring, detecting, and understanding the multiple elements of hostile information activities must be integrated to advance a shared view of what adversaries are doing, whom they are targeting, and whether these activities are effective.

In its report, the NSCAI recommends the creation of a Joint Interagency Task Force bringing together the Departments of “State, Defense, Justice, and Homeland Security, and the [Office of the] Director of National Intelligence to stand-up an operations center to counter foreign-sourced malign information…survey the landscape of relevant public and private actors, coordinate among them, and act in real time to counter foreign information campaigns.”

While I agree with the NSCAI’s conclusion that the Federal Government requires a central node for the monitoring and coordination of intelligence and policymaking around disinformation, ideally in the White House, my research across Central and Eastern Europe suggests it is necessary to involve nontraditional security departments via leads with the necessary security clearances in such efforts as well. Building this situational awareness across the government will enable the prioritized coordination of effective responses in the short term and beyond. Policy and operational levers for ameliorating vulnerabilities and building resilience against information threats in the long term lie with departments of education and health, and at local levels; they require policies that ensure a thriving and pluralistic media, societal awareness of the threat, robust media and digital literacy, and an understanding of civics.

3. Cooperation: International Partnership

Hostile influence activities have never occurred at such a scale before. Any deterrent effect of Enduring Information Vigilance is augmented by demonstrating resolve and denying benefit to adversaries through a collective stance against their activities, including better sharing of information and knowledge to identify threats, tactics, and tools, and the formulation of effective responses. In the wake of the attempted assassination of Sergei Skripal in the United Kingdom in 2018, the coordinated expulsion of over 140 Russian diplomatic personnel from allied nations demonstrated how a well-coordinated response can impose costs on a threat actor. Building cross-border resilience and reducing vulnerability to deny benefit, however, requires enduring cooperation and demonstrations of shared capability and resolve.

The NSCAI suggests that one way to build this resolve is through an international task force to counter and compete against disinformation, led by the Global Engagement Center (GEC) at the Department of State. In principle, this is an operable suggestion, though I would add some nuance to its implementation. To begin with, the GEC’s remit is too large, budget too small, and reputation within the interagency and international community too uncertain to add such a task force to its portfolio. Currently, the GEC conducts open source intelligence analysis in addition to its coordination, policymaking, and programmatic work. I recommend that intelligence gathering and analysis be left to the Intelligence Community and shared within the interagency. While the GEC should benefit from such analysis, its limited resources are better allocated in coordinating with embassies and other agencies in establishing and implementing policy and program priorities.

Finally, while the idea of a task force for international coordination is a noble one, the United States must be careful not to reinvent the wheel in its desire to engage on issues related to information operations. We are arriving late to this party and should seek to use American convening power to augment, not upstage, existing task forces and coordination efforts, particularly those spearheaded by close allies, such as the International Partnership for Countering State-Sponsored Disinformation (led by the United Kingdom in cooperation with the GEC) and the G7 Rapid Response Mechanism (led by Canada).

Stanford University’s Center for International Security and Cooperation Senior Research Scholar Dr. Herb Lin (watch his opening statement and read his full written testimony) stated:

  • The general thrust of my remarks is that the Department of Defense is poorly authorized, structured, and equipped to cope with the information warfare threat facing the United States as a whole, although it can make meaningful contributions in addressing a portion of the problem.
  • The DOD can pursue offensive and defensive activities with respect to information warfare, but it must be realized that offensive activities will not help to defend the U.S. population against the information warfare threat. Moreover, since our information warfare adversaries are authoritarian entities, they already exercise a great deal of control and influence over the information that flows through their borders or into their spheres of influence. Thus, offensive information warfare activities of the United States would be pitted against a strong suit of authoritarian governments.
  • Nevertheless, should the DOD wish to prosecute the offensive side of information warfare against foreign adversaries, I begin with the observation that the DOD cyber operators appear to be expanding their purview into the information warfare space. However, the expertise of DOD cyber forces to this point in time has focused on the information delivery side of cyber-enabled psychological operations. Prosecuting information warfare requires content as well, and it is by virtue of long experience in executing influence operations that U.S. Special Operations Command has developed its extensive psychological and cultural expertise on the information content side of psychological operations.
  • Thus, DOD should establish a standing operational entity that can integrate specialists in psychological operations and in cyber operations as co-equal partners. This entity would bring “to bear the respective expertise of each command [Cyber Command for cyber expertise, Special Operations Command for psychological operations] should . . . enhance the synergies possible between cyber-enabled psychological operations and offensive cyber operations, and it would be most desirable if the two commands could partner rather than compete over the cyber-enabled psychological operations mission.” The “standing” part of this entity is essential, as it would recognize the continuing need to conduct such operations against adversaries who believe that open conflict need not have been declared or even started for hostile activity in information space to begin.
  • Perhaps the most important policy matter in pursuing the offensive side of information warfare is the extent to which DOD offensive information warfare operations are constrained by a need to be truthful and not misleading. A long tradition of U.S. efforts in this regard, especially those undertaken during the Cold War, reflects a deeply-held belief that as long as the United States presents truthful information against adversaries that lie and mislead, it will prevail. But the Cold War ended before the advent of the Internet, social media, search engines and other information technologies that have changed the information environment by many orders of magnitude. The very successes of our information warfare adversaries today have demonstrated that truth does not always prevail, in part because lies spread faster than truth and because the first message to get through has significant advantages. What may have been true about likely winners and losers in the past may not be so true today and in the future.
  • How and to what extent, if any, the United States and DOD should adopt the tactical approaches of our information warfare adversaries against them is an open question. As an American citizen, I am very uneasy with the idea of my government using deception and misdirection as tools of its defense and foreign policy, and yet I wonder if relying only on truths that move at a snail’s pace in cyberspace leaves us at a fundamental disadvantage with respect to our adversaries. Sometimes we do accept disadvantage as a matter of principle—it is our stated policy to adhere to the laws of armed conflict whether or not our adversaries do so. But the ethics of how to conduct information warfare ourselves is perhaps a different issue that is way above my pay grade to address.
  • Addressing the defensive side of information warfare conducted against the populace of the United States is also complex. DOD’s freedom of action is constrained by policy and public concerns about DOD actions that directly affect the information available to U.S. citizens. Nevertheless, DOD is well positioned to address the cyber-enabled information warfare threat for at least one important segment of the U.S. populace—the U.S. armed forces and their families. Consider that:
    • Every member of the U.S. military swears an oath to “support and defend the Constitution of the United States against all enemies, foreign and domestic.” But DOD offers essentially zero training on what it means in a practical or operational sense to “support and defend” the Constitution and how to identify an “enemy, foreign or domestic.”
    • Section 589E of the FY2021 National Defense Authorization Act called for the DOD to establish a training program regarding foreign malign influence campaigns for U.S. military personnel and their families. Although the legislation provided no specifics on the content of the training program, it is hard to imagine that it would not try to teach/educate U.S. military personnel how to identify and resist the influence of hostile information warfare campaigns.
    • Section 589F of the FY2021 National Defense Authorization Act called for DOD to assess aspects of the foreign information warfare threat to members of the U.S. armed forces and their families, although the legislative language used somewhat different terms than are used in this testimony.
  • Secretary of Defense Austin has taken action to counter extremism in the Department of Defense, including the military personnel within DOD. The scope, nature, and extent of extremism within the U.S. armed forces is unknown at this time, and Secretary Austin’s actions will shed some light on these matters. Nevertheless, to the extent that extremism is a problem, it is clear that information warfare operations and exposure to disinformation contribute in some ways to the problem.

Government Accountability Office Defense Capabilities and Management Team Director Dr. Joseph Kirschbaum (watch his opening statement and read his full testimony) said:

  • GAO found, in 2019, that DOD had made limited progress in implementing the 2016 DOD IO strategy and faced a number of challenges in overseeing the IO enterprise and integrating its IO capabilities. Specifically:
    • In seeking to implement the strategy, DOD had not developed an implementation plan or an investment framework to identify planning priorities to address IO gaps.
    • DOD had established department-wide IO roles and responsibilities and assigned most oversight responsibilities to the Under Secretary of Defense for Policy. The Under Secretary had exercised some responsibilities, such as establishing an executive steering group. However, the Under Secretary had not fulfilled other IO oversight responsibilities, such as conducting an assessment of needed tasks, workload, and resources. Instead, the Under Secretary delegated these responsibilities to an official whose primary responsibilities are focused on special operations and combatting terrorism.
    • DOD had integrated information-related capabilities in some military operations, but had not conducted a posture review to assess IO challenges. Conducting a comprehensive posture review to fully assess challenges would assist DOD in effectively operating while using information-related capabilities.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

