
Cyberattack as a Tool of U.S. Policy


Presentation Transcript


  1. Cyberattack as a Tool of U.S. Policy The National Academies Project supported by the MacArthur Foundation, Microsoft, and the National Research Council March 17, 2010 University of California, Berkeley Presentation supported by PCCI of the National Academies 1

  2. Committee and report Military WILLIAM A. OWENS, co-chair (USN Retired, fmr VCJCS) CARL G. O’BERRY, The Boeing Company (USAF Ret) WILLIAM O. STUDEMAN, USN Retired (fmr NSA Director) Foreign Relations and Diplomacy KENNETH W. DAM, co-chair, University of Chicago SARAH SEWALL, Harvard University Information technology THOMAS A. BERSON, Anagram Laboratories DAVID D. CLARK, MIT RICHARD L. GARWIN, IBM Fellow Emeritus (technology) JEROME H. SALTZER, MIT, (retired) MARK SEIDEN, MSB Associates International and National Security Law JACK L. GOLDSMITH, Harvard Law School GERHARD CASPER, Stanford University WALTER B. SLOCOMBE, Caplin and Drysdale MICHAEL A. VATIS, Steptoe & Johnson LLP Herbert S. Lin, Study Director 2

  3. The broad context for cybersecurity • Nations (and their military forces) are increasingly dependent on information technology • Military information technology • Command and control • Logistics • Weapons • Civilian information technology • Transportation (e.g., air traffic control systems) • Financial system (e.g., banking, stock markets) • Communications (Internet communications, telephone switching) • Hospital systems that monitor patients and dispense drugs • Manufacturing process control systems • Power distribution and other utilities • Systems used by retailers, consumers, homeowners • Thus, important IT functionality must be protected. • Cybersecurity: measures taken to protect or preserve a computer system or network and the information it holds. 3

  4. Defensive aspect of cybersecurity • Two public foci of cybersecurity: • Measures that likely targets can take to strengthen their resistance to cyberattack (passive defenses) • Anti-virus and intrusion detection software • Better password security • Greater attack resistance in software • More robust law enforcement mechanisms • Cybercrimes unit in FBI • Convention on Cybercrime • Defensive aspects of cybersecurity have received a great deal of attention: • Many reports, conferences, publications • White House czar for cybersecurity • White House Cyberspace Policy Review • Proposed bipartisan legislation (e.g., Rockefeller-Snowe) 4

  5. Offensive aspect of cybersecurity • A generally classified subject (usually Secret, often higher) • Classified National Security Presidential Directive 16 (July 2002) allegedly ordered the U.S. government to develop national-level guidance for determining when and how the United States would launch cyber-attacks against enemy computer networks. • Very little public debate about the offensive aspects of cybersecurity, especially in contrast to the discussion of defensive aspects of cybersecurity. • Similar to the situation 50 years ago, before Herman Kahn’s “On Thermonuclear War” and “Thinking About the Unthinkable” • Study undertaken to show that it is possible to discuss important issues regarding cyberattack policy in an unclassified environment. 5

  6. Basic facts about cyber operations • Two categories of operations of interest: • Cyberattack: action to destroy, degrade, disrupt adversary IT or information therein • Cyberexploitation: action to (very quietly) obtain information from adversary IT • Note public conflation between cyberattack and cyberexploitation • Can be undertaken for both offensive and defensive purposes. “Offensive” in “offensive cyber operations” refers to the effect of an operation, and not its purpose. • Technical operations • Remote (virus, DOS attack, attacks over the Internet) • Close-access (supply chain attack, compromise of 3rd party supplier (antivirus vendor) or service provider (ISP)) • Social operations • Trick, bribe, extort, turn system operator • Often much easier than a technical operation • Cyberattack and cyberexploitation are technically very similar—both depend on remote/close-access, social/technical approaches. (Both require vulnerability and access, differ only in payload.) • Cyberexploitations are different from cyberattacks primarily in their objectives and in the legal constructs surrounding them (e.g., US Code Title 50 vs Title 10). 6
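
To make slide 6 concrete, here is a minimal illustrative sketch in Python. Everything in it (the `CompromisedHost` class, the payload functions, the Elbonian host name) is invented for illustration and is not from the report; the point it mirrors is that finding a vulnerability and gaining access are common to both kinds of operation, and only the payload distinguishes exploitation from attack.

```python
# Hypothetical sketch, not any real tool: cyberattack and cyberexploitation
# share the same structure (a vulnerability plus access) and differ only in
# the payload that is ultimately delivered.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class CompromisedHost:
    """Stand-in for a system on which access has already been obtained."""
    name: str
    files: Dict[str, str]

def exploit_payload(host: CompromisedHost) -> Dict[str, str]:
    """Cyberexploitation: quietly copy information and leave the system intact."""
    return dict(host.files)

def attack_payload(host: CompromisedHost) -> None:
    """Cyberattack: destroy, degrade, or disrupt information on the system."""
    host.files.clear()

def run_operation(host: CompromisedHost, payload: Callable):
    # Steps 1-2 (find a vulnerability, gain access) are identical for both
    # kinds of operation and are elided here; only step 3, the payload, differs.
    return payload(host)

if __name__ == "__main__":
    target = CompromisedHost("elbonia-c2", {"plans.txt": "deployment schedule"})
    stolen = run_operation(target, exploit_payload)   # exploitation: files survive
    run_operation(target, attack_payload)             # attack: files destroyed
    print("exfiltrated:", stolen, "| remaining on host:", target.files)
```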

  7. Important characteristics of cyberattack • Indirect effects are almost always more consequential than direct effects. • Effects can span an enormous range; NOT of lesser significance even if “only” the computer is attacked. • Cyberattacks (and cyberexploitations too!) are inherently difficult to attribute and easy to conduct with plausible deniability. • Cyberattacks span an enormous range of scale, impact, and complexity • Unilateral cyberattack by USG to achieve military or political goal (Iranian nuclear computers) • Active cyber-response by USG to cyber-attack from abroad (i.e., defense as offense) • Information operations before or in kinetic war • Cyberexploitation as related operation • Outcomes of a cyberattack are highly contingent. • Identifying what targets to strike • Limiting collateral damage and predicting cascading effects may be hard when computers interconnect • Conducting battle damage assessment? How do you know what you did? • Success depends on • which systems are actually connected to which other systems, • what security measures are in place and actually operational, • what intelligence has been conducted in advance to guide attack planning (and/or to prepare the systems for attack); this need for advance intelligence creates a bias toward early use in a conflict. • Since the base technologies used in the computers and networks involved are mostly commercial, the private sector may be involved to unprecedented degrees (e.g., use of a commercial ISP for a DOS attack) 7
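
As a rough illustration of why the slide flags cascading effects and collateral damage as hard to bound, the following sketch (hypothetical systems and dependencies, not drawn from the report) walks a dependency graph to show how an attack aimed at one industrial control node can propagate to civilian systems.

```python
# Hypothetical sketch: systems are modeled as a dependency graph (all names
# invented); a breadth-first traversal shows how disabling one node cascades.
from collections import deque

# Edges read: "key depends on each item in its list."
DEPENDS_ON = {
    "air-defense-radar":   ["regional-power-grid"],
    "regional-power-grid": ["grid-scada"],
    "hospital-network":    ["regional-power-grid"],
    "civil-atc":           ["regional-power-grid"],
}

def cascade(initial_target: str) -> set:
    """Return every system plausibly affected if `initial_target` is disabled."""
    affected, queue = {initial_target}, deque([initial_target])
    while queue:
        down = queue.popleft()
        for system, deps in DEPENDS_ON.items():
            if down in deps and system not in affected:
                affected.add(system)
                queue.append(system)
    return affected

if __name__ == "__main__":
    # The intended target is one SCADA node; the cascade reaches hospitals and
    # air traffic control, i.e., indirect effects dominate the direct one.
    print(cascade("grid-scada"))
```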

  8. U.S. national security policy today 8

  9. (parts of) DOD policy re cyberwarfare • DOD seeks dominance in the cyber domain--the state in which U.S. and friendly forces have complete freedom of action in the domain and adversary forces have no freedom of action. • DOD implied declaratory policy on cyberattack: The United States acquires cyberattack capabilities as part of its overall deterrent posture, which is based on full spectrum dominance—the ability to control any situation or defeat any adversary across the range of military operations. Cyberattack capabilities provide the U.S. military and intelligence communities with additional options for action and use, and are thus intended for use just as any other weapons could be used in support of U.S. military or intelligence objectives. Cyberattack capabilities are to be fully integrated into U.S. military operations when appropriate, and distinctions between cyberattack and kinetic force are not meaningful except in an operational context. Cyberattack capabilities may be particularly useful to the United States in many conflict scenarios short of all-out war. • DOD has publicly announced policy re cyberattack in one case • U.S. Strategic Command asserts authority to conduct an active threat neutralization (aka CND RA) to protect military computer systems and networks whose mission performance has been compromised by a cyberattack. USAF seeking automated response capabilities. 9

  10. Intelligence community has responsibilities for exploitation and covert action • Intelligence collection (including cyberexploitation) undertaken to further the interests of the United States outside CONUS – unlimited except if US persons involved. Not a violation of international law. • Intelligence collection on behalf of specific US companies – not undertaken as a matter of US policy (not true for some other nations, e.g., France) • Covert action – regulated by US statute: “activities of the U.S. government to influence political, economic, or military conditions abroad, where it is intended that the role of the U.S. government will not be apparent or acknowledged publicly.” Must be authorized by findings of the President, and reported to appropriate individuals in the U.S. Congress. Note alignment of plausible deniability requirement and technical characteristics of cyberattack. 10

  11. Illustrative applications of cyberattack (some military, some covert action) • Suppression of adversary air defenses. • Influencing the outcome of a foreign election using electronic voting machines. • Altering electronic medical records of adversary military leaders. • Disruption of adversary plans for military deployment. • Disruption of adversary infrastructure for censorship. • Altering traffic patterns to sow delay and confusion during crisis. Cyberattack may well work best in connection with coordinated kinetic action. Note well – these applications are only illustrative, and public information about actual cyberattacks conducted by the United States is nearly non-existent. 11

  12. One public story regarding alleged US cyberattack on the Soviet Union • Soviet Union actively sought to obtain Western technology (including pipeline control software). US discovered the list of sought-after technologies. • In 1982, the U.S. spiked software that was subsequently obtained by the Soviet Union. The software was “programmed to go haywire, [and] after a decent interval, to reset pump speeds and valve settings to produce pressures far beyond those acceptable to pipeline joints and welds.” • The result -- a large explosion in a Siberian natural gas pipeline (visible from space, looked like a 3 kiloton nuclear blast) • Beyond the immediate effect, “the Soviets came to understand [over time] that they had been stealing bogus technology, but now what were they to do? By implication, every cell of the Soviet leviathan might be infected. They had no way of knowing which equipment was sound, which was bogus. All was suspect, which was the intended endgame for the entire operation.“ • Source: Thomas Reed, At the Abyss: An Insider's History of the Cold War, Ballantine Books, New York, NY, 2004 12

  13. On cyberdeterrence 13

  14. The why and how of deterrence • Deterrence seems like the obvious inevitable choice in an offense-dominant world. • Passive defense is inadequate and eventually will fail; • Law enforcement actions are too slow and uncertain in outcome. • Deterrence of nuclear threats in the Cold War establishes the paradigm – largely successful. Based on a credible threat to impose unacceptable costs and to deny benefits of an attack. • Will deterrence work for cyberattacks? That is, how can we persuade adversaries to refrain from launching cyberattacks against U.S. interests? 14

  15. Applying deterrence to cyberconflict • Deterrence = “credible threat to impose unacceptable costs on an adversary and to deny the adversary benefits of an attack” • Many issues arise: • Attribution of attack to adversary • What system, which actor? • Ubiquitous technology and knowledge not limited to nation-states • Knowing that an attack has happened • Noisy background • Ambiguous effect (Exploitation? Delayed effect?) • What magnitude of effect counts as attack? • Plausible deniability • Slow forensics (may be months, not minutes) • Credibility of (secret) cyber capabilities • Possible nuclear response to certain kinds of cyberattack? • Uncertain utility of denying benefits (cf. first use – attacker can simply try again, especially if not detected) • Catalytic conflict and ease of false-flag operations. 15
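
The attribution items in the list above can be illustrated with a small sketch (all hosts, addresses, and countries are invented; the addresses come from reserved documentation ranges): the victim's logs show only the last hop in a chain of compromised relays, and unwinding each earlier hop can take weeks of forensic and diplomatic effort.

```python
# Hypothetical sketch: technical attribution sees the last hop, not the origin.
ATTACK_PATH = [
    {"host": "198.51.100.7", "country": "Country A", "role": "true origin"},
    {"host": "203.0.113.44", "country": "Country B", "role": "compromised relay"},
    {"host": "192.0.2.19",   "country": "Country C", "role": "compromised relay"},
]

def apparent_source(path):
    """What the victim's logs show: only the final hop, not the true origin."""
    return path[-1]

def forensic_traceback(path, hops_unwound: int):
    """Each hop may take weeks of cooperation (ISPs, foreign governments) to unwind."""
    index = max(0, len(path) - 1 - hops_unwound)
    return path[index]

if __name__ == "__main__":
    print("victim sees:           ", apparent_source(ATTACK_PATH))
    print("after unwinding 2 hops:", forensic_traceback(ATTACK_PATH, 2))
```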

  16. On escalation and termination • Deterring escalation is just as important (perhaps more so) as deterring onset of conflict. • Unintended escalation particularly dangerous when • operational actions are less visible to senior decision makers • outcomes of actions are more uncertain (e.g., cascading effects) • What are the connections between kinetic conflict and cyberconflict? (cyber as early use; conditions (if any) under which inhibition of cyber can inhibit escalation to kinetic conflict) • How can cyberconflict be terminated? • Noisy background of criminal and hacker (and perhaps 3rd nation) cyberattacks • Requirements for “termination” – how to de-mine? • How to suppress patriotic hackers? 16

  17. The reality of uncertainty in the operational environment • When inexperienced human beings with little hard information are placed into unfamiliar situations in a general environment of tension…. • “I have seen too many situations where government officials claimed a high degree of confidence as to the source, intent, and scope of an attack, and it turned out they were wrong on every aspect of it. That is, they were often wrong, but never in doubt.” (Former senior DOJ official) 17

  18. Bear in mind… • Cyberwarfare not separate from other spheres of potential conflict. • Options for responding to cyberattacks on the United States span a broad range and include a mix of dynamic changes in defensive postures, law enforcement actions, diplomacy, cyberattacks, and kinetic attacks. • Impact of U.S. cyberattack capabilities on other nations’ willingness to use comparable techniques against the US is uncertain. • Cyberwarfare is not just relevant to US government, and issues arise in deterring attacks on private sector entities. 18

  19. International law and offensive cyber operations 19

  20. Some difficult legal issues: key terms not defined • UN Charter prohibits “threat or use of force against the territorial integrity or political independence of any state” (Art. 2(4)) • “Force” not defined. By practice, it • includes conventional weapon attacks that damage persons or property • excludes economic or political acts (e.g., sanctions) that damage persons or property • UN Charter Art. 51 - “Nothing in the present Charter shall impair the inherent right of individual or collective self-defence if an armed attack occurs against a Member of the United Nations...” • “Armed attack” not defined, even for kinetic force. 20

  21. When is a cyberattack a “use of force” or “an armed attack”? • Easier: • Exploitation w/o damage or degradation (no); cyberattack that causes physical damage akin to kinetic attack (yes); use of cyberattack during acknowledged armed conflict (not covered by Art. 2(4) but subject to LOAC jus in bello). • Harder: • Economic damage without physical damage • Temporary, reversible interference with computer system • “Mere” data destruction or degradation • Introduction of Trojan horse software agents • Payload with exploitation and attack capabilities? (cf. human spy skilled in sabotage?) • Payload to accept a future upgrade with unknown capabilities? • Destructive payload with delayed action capability? (cf., pre-planted remotely detonatable mine) • Empty payload – a shell that can be remotely upgraded in the future • Cyberattack that has effects comparable to a kinetic armed attack is also an armed attack, but few good analogies to past kinetic precedents. (Bombing of Libya was asserted to be self-defense against a Libyan action that killed 2 U.S. servicepersons and wounded 60.) 21

  22. Some complicating questions 22

  23. Exploitation or attack? • Hypothetical scenario – U.S. conducts a cyberexploitation inside the Elbonian nuclear command and control system. • U.S. explanation and rationale for the exploitation is to gain greater insight into Elbonian nuclear operations so that when they take various actions, we do not overreact. So U.S. exploitation is benign and stabilizing. • If Elbonia detects agents of U.S. cyberexploitation, will they draw the same conclusion? • Broader question -- How will the victim know if an operation is exploitation or attack? Is a probe of networks a prelude to attack? • Especially consider: • During a crisis, Elbonia undertakes special and more intense security sweeps and discovers U.S. cyberexploitation agents that have been in place for a long time. How will these agents be interpreted? 23

  24. Espionage & Use of Force? • A spy is one who acts under false pretenses to obtain information regarding another nation. • International law does not prohibit intelligence collection by spies • Can espionage (an unfriendly act) rise to a “use of force”? • Introduction of software agent with destructive capacity? (cf. human agent skilled in sabotage) • Introduction of remotely programmable software agent? (cf. pre-planted remotely detonatable mine) • Introduction of download shell to receive subsqeuent payload? • Repeated and continuing probes of military/DIB networks? 24

  25. The meaning of neutrality? • Nation A, sending bombers to attack Nation B but flying through the air space of Nation C, must obtain C’s permission to do so. C may not be regarded as neutral if A does indeed fly through C’s airspace. • Nation A, sending messages that direct its forces to attack Nation B but using the telecommunications facilities of Nation C, need not obtain C’s permission to do so (as long as C allows all nations to do so). C is neutral even if it allows A’s messages to be transmitted through C’s telecommunications facilities. Which is the right model for an Internet-based attack by A against B that transits C’s networks? 25

  26. Jus in Bello Basics • Principle of Non-Perfidy • Cannot pretend to be legally protected entity • Hard case in traditional war: distinction between ruse of war (e.g., use of misinformation to mislead adversary) and perfidy (e.g., pretending that a military installation is a hospital). • Principle of Proportionality • Collateral damage on civilian targets acceptable if not disproportionate to the military advantage gained. • Hard cases in traditional war: human shields, chemical plant in suburbs, etc. • Principle of Distinction • Military operations only against “military objectives” and not against civilian targets • Hard cases in traditional war: Serbian television station, Baghdad electrical grid, etc. 26

  27. Non-perfidy • Requirement for identification of USG cyberattacks? • USAF insignia on airplanes and cruise missiles. • Military personnel in distinctive uniforms. • Trojan horses with distinctive identifiers “This agent is a bona fide weapon of the US government”? • Public infrastructure so that any victim can verify the authenticity of such an identifier? • Requirement for identifying military and civilian targets in cyberspace? • Nations have obligations to enable identification of military assets (distinctive vehicles with insignias) and are entitled to identify entities legally immune to attack (Red Cross on ambulances, white flags). • What must be done to identify military computers/networks? IT assets of hospitals and religious institutions? Who will verify the latter? (International Red Cross?) 27
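
One way the slide's hypothetical "bona fide weapon of the US government" identifier could be made verifiable, offered purely as a sketch of the idea and not as anything the report proposes, is a digital signature checked against a published government public key. The example below uses the third-party Python `cryptography` package; the marker text, key handling, and the notion of a trusted publication channel are all assumptions for illustration.

```python
# Hypothetical sketch: a government signs an identifier embedded in its cyber
# weapon, and a victim checks it against a published public key.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

MARKER = b"This agent is a bona fide weapon of the US government"

# In practice the private key would be held by the government and the public
# key published through some trusted international infrastructure (the open
# question the slide raises); here both are generated locally for illustration.
signing_key = Ed25519PrivateKey.generate()
published_public_key = signing_key.public_key()

signature = signing_key.sign(MARKER)   # shipped inside the agent alongside MARKER

def victim_verifies(marker: bytes, sig: bytes) -> bool:
    """A victim checks whether the identifier really came from the claimed signer."""
    try:
        published_public_key.verify(sig, marker)
        return True
    except InvalidSignature:
        return False

print(victim_verifies(MARKER, signature))                 # True: authentic marker
print(victim_verifies(b"forged identifier", signature))   # False: does not verify
```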

  28. Proportionality: uncertainty regarding outcome of a cyberattack • Outcomes highly uncertain (how should commanders account for uncertainty?) • Indirect, cascading effects • Collateral damage difficult to calculate • Uncertainty amplified by the need to gather intelligence promptly in many tactical situations; this points to early use, when there is still time to collect intelligence 28

  29. Distinction—legitimacy of attacks that disable computer-dependent civilian services or communications? • Large fraction of Elbonian military communications take place over the Internet, and the Elbonian military is dependent to some extent on commercial power grid. Are the Elbonian Internet (e.g., routers) and power grid valid military targets? • To what extent are computer-dependent civilian services or communications “essential” to life in a modern society? Does disruption in these services rise to the level of causing death and destruction? 29

  30. Distinction: the meaning of “combatant” • A legal combatant is one who takes a direct role in hostilities under the command of the armed forces of a nation. • Consider two cases: • IO squadron commander’s brother is an Elbonian professor of computer science teaching courses in cybersecurity. Students conduct cyberattacks against U.S. systems. • U.S. reservist is a cyber weapons operator on weekends, security/systems administrator in her day job. Reservist takes actions against Elbonia in evenings. • Who counts as a combatant and why? 30

  31. Distinction and proportionality: Automated responses? • Neutralization of an incoming cyberattack may require rapid response. • Automated identification, characterization of incoming attack may be needed. • To respond quickly, return cyber-strike may need to be automated as well. • What are the dangers of such automation? • Strike innocent party? (Catalytic conflict possible as well) • Cause unexpected collateral damage (e.g., incoming cyberattack originating from a hospital)? • Provoke escalation? 31
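
A minimal sketch of the automation danger raised above (the threshold, host name, and response strings are all invented): once the decision rule fires without a human in the loop, the counter-strike goes to whatever address the attack appears to come from, which may be a spoofed or innocent source such as a hospital network.

```python
# Hypothetical sketch of an automated response rule and the risk it carries.
from dataclasses import dataclass

@dataclass
class IncomingAttack:
    apparent_source: str
    packets_per_second: int

AUTO_RESPONSE_THRESHOLD = 50_000   # above this, respond without a human in the loop

def automated_response(event: IncomingAttack) -> str:
    if event.packets_per_second < AUTO_RESPONSE_THRESHOLD:
        return "log and refer to human analyst"
    # Danger: the apparent source may be spoofed or an innocent third party
    # (e.g., a compromised hospital network), and an automatic counter-strike
    # may hit the wrong party, cause collateral damage, or provoke escalation.
    return f"launch counter-strike against {event.apparent_source}"

print(automated_response(IncomingAttack("st-marys-hospital.example.net", 120_000)))
```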

  32. Private Sector Equities 32

  33. Google and China • Google raised two issues (“Operation Aurora”) • Attempts to compromise email accounts of Chinese human rights activists • Penetrations of 34 companies (mostly in Silicon Valley) to obtain corporate data and software source code. • China held responsible by Google for these actions. • Targeted attack against specific individuals, using previously unknown vulnerability in Internet Explorer that allows remote code execution. • Google undertook its own forensic investigation, gaining access to a computer in Taiwan and monitoring its operations to identify penetration targets. • Attribution to China made largely on the basis of attack’s technical sophistication and breadth and the targets of the cyber operations. • Some reports indicate that malware used in latter penetration employed an algorithm contained in a technical report published only on Chinese-language Web sites. • Non-circumstantial evidence is scarce—highlights difference between technical attribution and political decision to hold a nation accountable based on all sources of information. • Subsequent Google action to un-censor its China search engine • Some actions traced to elite Chinese IT schools • Many possible/plausible explanations (gov’t sanctioned activity, overly enthusiastic students, contest, final exam) 33

  34. Some questions raised by Google/China engagement • Google action to uncensor its search engines - retaliation for Chinese actions? • How and to what extent, if any, should private entities be allowed to shoot back? Does private shoot-back increase or decrease likelihood that a private entity will be attacked? • How and to what extent, if any, should private entities be allowed to conduct their own forensic investigations (which may involve some degree of hack-back)? • Private actors in U.S. engaging in cross-border offensive operations (patriotic hackers, U.S. corporations acting in self-defense) have legal implications for the U.S. • U.S. responsibility potentially implicated if private actions rise to “use of force” • Possible interference with US government cyber operations 34

  35. More broadly… • Certain cyberattacks undertaken by the United States are likely to have significant operational implications for the U.S. private sector. • Internet-based attack may require cooperation of U.S./Allied ISPs (ISPs usually asked to suppress cyberattacks – what about shutting down a US attack?) • Shaping the cyber battlefield may require cooperation of U.S./Allied IT vendors and service providers. • Adversary response to a U.S. cyberattack may affect U.S. ISPs and critical infrastructure 35

  36. Some Broad Observations and Issues 36

  37. Discussing policy regarding cyberattack in the open • Secrecy has impeded widespread understanding and debate about the nature and implications of cyberattack. Secrecy has been responsible in part for: • an ill-formed state of policy regarding cyberattack • a dearth of public scrutiny and congressional oversight • an increased likelihood that personnel with minimal background knowledge will be given important cyber responsibilities • an increase in the likelihood that policy will be formulated with narrow parochial or short-term interests foremost in mind • inhibition of nongovernmental research and investigation regarding cyberattack • The open literature is rich – and all of those concerned about emerging national security concerns should learn about it. 37

  38. A broad range of conflict scenarios… • Cyberattacks can vary across an enormous range of effects. • Wide temporal and spatial scales of effect possible, depending on the cyberattack • May have strategic significance • May connect with kinetic and/or nuclear issues • Cyberattack for • non-lethal operations • support of other cyber operations (e.g., exploitation) • traditional military operations • strategic attack with national scope 38

  39. Multiple equities… • Diplomatic • e.g., coordination with Allied cyberoperations • Law enforcement • Should cyberattack be regarded as law enforcement or national security matter? • Economic/private sector • Cooperation of private sector entities needed to prepare the battlefield? May affect business prospects. • Cooperation of private sector entities needed to execute an attack? 39

  40. Dominance in cyberspace: not possible • Enduring unilateral dominance in cyberspace is neither realistic nor achievable by the United States. • Many cyberattack technologies are inexpensive and easily available to non-state actors, including individuals, and these technologies include some that are as capable of doing great harm as those available to governments. • Much of the expertise needed to wield cyberattack weapons effectively is widespread. • The U.S. information technology infrastructure is likely to remain vulnerable to cyberattack for the foreseeable future. 40

  41. C2 for offensive cyber operations • Early use of cyberattack may be easy to contemplate in a pre-conflict situation, so a greater degree of operational oversight for cyberattack and cyberexploitation may be needed compared to use of other options. • Confusion on adversary’s part regarding intent of cyber operation – an exploitation may be seen as an attack. • Operational footprint left by cyberattack activities is small, and routine activities may be less visible to senior decision makers. • Automated response to attack may go awry, be uncalibrated to the actually-existing security threat and thus be provocative. • CND response action authorized by U.S. Strategic Command to protect military computer systems and networks whose mission performance has been compromised by a cyberattack – how will STRATCOM factor in diplomatic or political considerations? Or will its decision take into account only military or local tactical considerations for protecting the mission capability of U.S. military networks? 41

  42. Nuclear conflict as analogy for cyber • Many superficially obvious connections • Role of deterrence • WMD/strategic significance • But deeper analysis suggests a poor fit • Many of the same questions/issues arise in cyber as in nuclear • Answers to these questions are mostly very different • Some suggest biological weapons are a better metaphor. 42

  43. An NRC prize competition • Seeking contributed papers on cyberdeterrence – persuading adversaries to refrain from launching serious cyberattacks on the United States. • Abstracts due April 1, 2010. • First draft (4500-7500 words) due May 21, 2010. • http://sites.nationalacademies.org/CSTB/CSTB_056215 • $1,000 prize for great papers. 43

  44. For more information… Herbert Lin Chief Scientist, Computer Science and Telecommunications Board National Research Council 202-334-3191 hlin@nas.edu www.cstb.org (report, summary, press release) PLEASE FILL OUT ASSESSMENT QUESTIONNAIRE!! 44

  45. US view of deterrence • Deterrence [seeks to] convince adversaries not to take actions that threaten US vital interests by means of decisive influence over their decision-making. Decisive influence is achieved by credibly threatening to deny benefits and/or impose costs, while encouraging restraint by convincing the actor that restraint will result in an acceptable outcome. –Deterrence Operations: Joint Operating Concept, Version 2.0, December 2006; 45

  46. DOD – A U.S. nuclear response to cyberattack? • “Nuclear capabilities [of the United States] continue to play an important role in deterrence by providing military options to deter a range of threats, including the use of WMD/E and large-scale conventional forces. Additionally, the extension of a credible nuclear deterrent to allies has been an important nonproliferation tool that has removed incentives for allies to develop and deploy nuclear forces.” • “The term WMD/E relates to a broad range of adversary capabilities that pose potentially devastating impacts. WMD/E includes chemical, biological, radiological, nuclear, and enhanced high explosive weapons as well as other, more asymmetrical “weapons”. They may rely more on disruptive impact than destructive kinetic effects. For example, cyber attacks on US commercial information systems or attacks against transportation networks may have a greater economic or psychological effect than a relatively small release of a lethal agent.” • Source: National Military Strategy, 2004. Joint Chiefs of Staff 46
