
DOUT Deception Operations in Urban Terrain


Presentation Transcript


  1. DOUT: Deception Operations in Urban Terrain. Douting* the Asymmetric Disadvantage. Rand Waltzman. *dout: verb, to snuff out, to put out like a candle (Oxford English Dictionary)

  2. “I make the enemy see my strengths as weaknesses and my weaknesses as strengths while I cause his strengths to become weaknesses and discover where he is not strong … I conceal my tracks so that none can discern them; I keep silence so that none can hear me.” Sun Tzu, The Art of War, c. 500 BC. “Though fraud [deception] in other activities be detestable, in the management of war it is laudable and glorious, and he who overcomes an enemy by fraud is as much to be praised as he who does so by force.” Niccolò Machiavelli, Discourses, 1517. Quoted in Joint Publication 3-13.4, Military Deception, Joint Chiefs of Staff, July 2006.

  3. Background • Asymmetric strategies aim at reducing an opponent's strengths and exposing his weaknesses. • Examples: battle on friendly urban terrain; use of deception. • Most of our urban operations are on unfriendly terrain. • That is an asymmetric disadvantage. • Solution: effective use of deception as an asymmetric strategy to counter our otherwise asymmetric disadvantage.

  4. Potentiating Interrelationship Between Urban Terrain and Deception • The scope of deception is greater in the urban environment than in any other. • The cacophonous “background noise” of urban environments hampers counterdeception faculties. • Cities offer a rich trove of materials with which to conduct deception. • Decision making is generally degraded in urban environments relative to other environments. • The presence and proximity of noncombatants complicate the intelligence picture at all operational levels. • Urban clutter attenuates the leverage of technology.

  5. Long Term Program Objectives • Encourage, and show the way to, greater emphasis on deception and its tighter integration into military tactics, techniques and procedures, operations, and technological development, a clear benefit in a world of asymmetrically minded opponents. • Bring about a cultural revolution in the practice of tactical deception planning and execution. • Lift the shroud of mystery surrounding deception planning. • Show how it can be done effectively by intelligent operations officers who do not possess exceptional artistic/creative abilities. • Widespread distribution of deception knowledge among operations officers. • Quality control in the application of deception knowledge. • DOUT as an expertise multiplier. • Provide a framework for incorporating new technological advances in deception techniques as they are developed. • Help increase the speed of deployment of new capabilities. • Provide tools for the development of deception knowledge as a military asset.

  6. Program Goals • Provide tools to organically grow deception planning and execution capabilities and practice. • Produce an interactive deception planning system for use in military operations by Captains and Majors.

  7. Program Tasks • Develop knowledge representation techniques for deception plans and actions with explicit representation of context. • Build a Machine Readable Case Library that reflects deception theory and best practices. • Build an interactive deception planner based on theory and best practices. • Build a Deception Management System (DMS). • Develop tool support for cost/benefit & risk/failure analysis and management for deceptive actions. • Build an intelligent lessons learned system to manage acquisition, distribution and active incorporation of deception experience. • Integrate with battle command systems (e.g., CPoF, FBCB2). • Test and evaluate.

  8. DOUT ARCHITECTURE [Architecture diagram. Components: Interactive Deception Planner (drawing on COA assumptions/strengths, the Case Library, desired adversary actions, deception goals, deconfliction analysis, risk analysis, and lessons learned; producing a CODA and termination criteria) and the Deception Manager (cost/benefit and risk/failure analysis), connected to the environment (other echelons, FBCB2, CPoF, PASS).]
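
To make the data flow in the diagram concrete, here is a minimal Python sketch of the two main components. The slide names the components but not their interfaces, so every class, field, and method signature below is an assumption.

```python
from dataclasses import dataclass

# Illustrative sketch of the top-level DOUT data flow. All names and
# interfaces are assumptions; the slide only names the components.

@dataclass
class COA:
    name: str
    assumptions: list        # COA assumptions/strengths fed to the planner

@dataclass
class CODA:
    deception_goals: list
    desired_adversary_actions: list
    termination_criteria: list

class InteractiveDeceptionPlanner:
    """Turns a friendly COA into a coherent course of deceptive action,
    drawing on the case library (deconfliction/risk analysis omitted)."""
    def __init__(self, case_library):
        self.case_library = case_library

    def plan(self, coa: COA) -> CODA:
        goals = [f"protect assumption: {a}" for a in coa.assumptions]
        return CODA(goals, desired_adversary_actions=[], termination_criteria=[])

class DeceptionManager:
    """Executes a CODA with cost/benefit and risk/failure analysis,
    coordinating with the environment (other echelons, FBCB2, CPoF, PASS)."""
    def execute(self, coda: CODA, environment: dict) -> None:
        for goal in coda.deception_goals:
            print("managing:", goal)

planner = InteractiveDeceptionPlanner(case_library=[])
manager = DeceptionManager()
manager.execute(planner.plan(COA("river_crossing", ["enemy watches bridge A"])), {})
```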

  9. Knowledge Representation & Machine Readable Case Library • Deception ontology. • Descriptions should contain information about context, including prerequisite conditions and the types of intelligence that are required. • Description techniques should take into account biologically inspired forms of memory being explored in, for example, the DARPA BIC program. • E.g., the idea that concepts in memory can be more like perceptual simulations or reenactments of the original experiences and less like symbols. • Semi-automated markup schemes for annotating/representing cases. • Similarity measures based on the ontology as a basis for case retrieval and comparison. • Similarity measures should take into account context and the availability of intelligence of various types.
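
As one illustration of the retrieval idea, the sketch below implements a Wu-Palmer-style similarity over a toy concept taxonomy and discounts cases whose required intelligence is unavailable. The taxonomy, concept names, and penalty scheme are all invented for illustration; the slide does not specify them.

```python
# Hypothetical deception ontology as a concept taxonomy (child -> parent).
TAXONOMY = {
    "feint": "physical_deception",
    "decoy": "physical_deception",
    "physical_deception": "deception",
    "disinformation": "information_deception",
    "information_deception": "deception",
}

def ancestors(concept):
    """Path from a concept up to the taxonomy root."""
    path = [concept]
    while concept in TAXONOMY:
        concept = TAXONOMY[concept]
        path.append(concept)
    return path

def concept_similarity(a, b):
    """Wu-Palmer-style similarity: a deeper shared ancestor means more similar."""
    pa, pb = ancestors(a), ancestors(b)
    common = next((c for c in pa if c in pb), None)
    if common is None:
        return 0.0
    depth = lambda c: len(ancestors(c))
    return 2 * depth(common) / (depth(a) + depth(b))

def case_similarity(query, case, intel_available):
    """Discount similarity when the case needs intelligence we lack."""
    s = concept_similarity(query["technique"], case["technique"])
    missing = set(case["required_intel"]) - set(intel_available)
    return s * (0.5 ** len(missing))  # illustrative penalty per intel gap

print(case_similarity(
    {"technique": "feint"},
    {"technique": "decoy", "required_intel": ["SIGINT"]},
    intel_available=["HUMINT"],
))  # -> 0.333...: similar techniques, discounted for missing SIGINT
```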

  10. Machine Readable Case Library • Hard Problem • Representation for deception plans and actions. • Use context models to document factors and interdependencies that have a significant impact on the deception process. • Use context models for automatic analysis and reasoning. • Capture dynamic aspects of plan execution. • Allow plans to be compared on different levels, including measures of similarity. • Case-based reasoning to reuse experience, with adaptation of processes. • Why we think we can get there. • Recent work in the use of context models in representation and reasoning (e.g., Koldehoff, Minor, Kofod-Petersen).
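
A minimal retrieve-and-reuse sketch of case-based reasoning over such a library follows. The case schema and the context-matching rule are assumptions; a real system would reason over interdependent context factors rather than counting exact matches.

```python
from dataclasses import dataclass

@dataclass
class Case:
    plan: str
    context: dict     # explicit context model, e.g. {"terrain": ..., "echelon": ...}
    outcome: str

# Invented case library entries for illustration.
LIBRARY = [
    Case("fake_convoy_route", {"terrain": "urban", "echelon": "battalion"}, "success"),
    Case("dummy_radio_net", {"terrain": "desert", "echelon": "brigade"}, "failure"),
]

def context_match(query: dict, ctx: dict) -> float:
    """Fraction of shared context factors that agree (a stand-in for
    richer reasoning over interdependent factors)."""
    keys = query.keys() & ctx.keys()
    return sum(query[k] == ctx[k] for k in keys) / max(len(keys), 1)

def retrieve(query: dict, k: int = 1):
    """Retrieve step of the CBR cycle: best-matching past cases."""
    return sorted(LIBRARY, key=lambda c: context_match(query, c.context),
                  reverse=True)[:k]

best = retrieve({"terrain": "urban", "echelon": "battalion"})[0]
print(best.plan, best.outcome)  # reuse step: adapt this plan to the new context
```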

  11. Interactive Deception Planner • Based on deception theory and best practices. • Inputs: • Planned operation COAs. • Intelligence information concerning: • Overall situation. • Adversary key decision makers, including all available information relating to their background and psychological profiles. • The way adversary decision makers, their staffs, and trusted advisors perceive friendly capabilities and intentions, and how the adversary is likely to react. • Adversary detection and collection capabilities. • Current possible and probable adversary COAs and the adversary’s rationale for taking those actions. • Outputs: • Analysis and determination of prerequisite conditions and types of intelligence required in order for real-world operation COAs to succeed. • Estimation of the strengths of the given COAs’ underlying assumptions based on the current and expected future situation. • Set of alternative adversary actions that would strengthen and support the given COAs’ underlying assumptions. • Set of alternative deceptive actions that could be taken to induce the adversary to take the actions referred to above. • Set of alternative analyses of how suggested deceptive actions could adversely affect friendly forces that are not read into the deception. • Set of alternative coherent courses of deceptive action (CODA). • Set of alternative CODA termination criteria. • Set of alternative CODA cost/benefit & risk/failure analyses.
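
The slide's input/output contract can be captured as a typed interface. The sketch below paraphrases the bullets as Python dataclasses; all field names and types are assumptions, and the planner body is a placeholder for the analyses discussed on the next slide.

```python
from dataclasses import dataclass

@dataclass
class PlannerInputs:
    coas: list                  # planned operation COAs
    situation: dict             # overall intelligence picture
    adversary_profiles: list    # key decision makers, psychology, perceptions
    adversary_sensors: list     # detection and collection capabilities
    adversary_coas: list        # possible/probable adversary COAs and rationale

@dataclass
class PlannerOutputs:
    prerequisites: list         # conditions/intel each COA needs to succeed
    assumption_strengths: dict  # COA -> strength estimate of its assumptions
    desired_adversary_actions: list
    candidate_deceptions: list  # actions to induce those adversary actions
    fratricide_analyses: list   # effects on friendly forces not read in
    codas: list                 # coherent courses of deceptive action
    termination_criteria: list
    risk_analyses: list         # cost/benefit & risk/failure per CODA

def plan_deception(inputs: PlannerInputs) -> PlannerOutputs:
    # Placeholder: the analyses behind each output field are the
    # planner's hard problems, discussed on the next slide.
    return PlannerOutputs([], {}, [], [], [], [], [], [])
```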

  12. Interactive Deception Planner • Hard Problem • Analysis of COAs to determine the underlying assumptions required for their success, the strengths of those assumptions, and adversary actions that would strengthen and support those assumptions. • Determine the set of friendly actions that would induce the adversary to take actions that would strengthen and support COA assumptions. • Why we think we can get there. • Existing body of literature on plan recognition and plan understanding. • Existing body of literature on automated abductive reasoning as applied to persuasion. • Garvey & Lunt: • Use of abductive models of inference channels for creating cover stories to reduce the risk due to an inference channel. • Use of abduction for generating cover stories. • Use of evidential reasoning to evaluate the effectiveness of a proposed cover story.
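
To illustrate the abductive step, the sketch below searches for a smallest set of friendly actions whose observable effects cover everything we want the adversary to believe. The rule base and search strategy are invented for illustration and are not the Garvey & Lunt method.

```python
from itertools import combinations

# Causal rules: friendly action -> observables the adversary's sensors
# would pick up. All entries are invented for illustration.
EFFECTS = {
    "stage_decoy_armor": {"armor_signatures_in_sector_A"},
    "increase_radio_traffic": {"command_post_activity"},
    "move_real_armor_at_night": set(),  # intended to stay unobserved
}

def abduce(target_beliefs: set) -> list:
    """Abduction: find a smallest set of actions whose combined
    observable effects explain everything the adversary should believe."""
    actions = list(EFFECTS)
    for size in range(1, len(actions) + 1):
        for combo in combinations(actions, size):
            covered = set().union(*(EFFECTS[a] for a in combo))
            if target_beliefs <= covered:
                return list(combo)  # first (smallest) explanation found
    return []

print(abduce({"armor_signatures_in_sector_A", "command_post_activity"}))
# -> ['stage_decoy_armor', 'increase_radio_traffic']
```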

  13. Deception Management System • Helps manage key elements of the deception execution cycle. • Need-to-know management: keep track of who should know what and when. • Monitor progress of the operation: • Deception story communication methods are still appropriate/effective for the target. • Vertical and horizontal coordination ensures up-to-date integration between real-world operations and deception operations. • All participants are kept properly up to date during all phases of an operation. • Deception operations are in accordance with current rules of engagement, force protection issues, etc. • Termination criteria. • Data from the COP DB. • Supporting technology: • Automatically anticipate needed data by reasoning about the deception operation and available data. • Dynamically reason about and modify data requests in response to changing conditions. • Learn about data needs from user requests. • Early warning system that detects when things are about to go wrong, monitoring the COP DB for: • A change-of-mission scenario in which the overall operational situation changes and the circumstances that prompted the operation no longer pertain. • Changes in the operational situation that result in increased risk and/or cost to friendly forces. • Signs the operation might succeed, but not along a timeline synchronous with other parallel IO or other aspects of the campaign. • Evidence that the operation has become known to the adversary. • Opportunity detection system, monitoring the COP DB for: • Evidence that a change in the choice of target, objectives, or information conduit will increase the probability of success or the impact of the operation.
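
A minimal sketch of the early warning idea: poll the COP DB for the abort conditions listed above and report any that trip. The predicate names, thresholds, and COP schema are all assumptions.

```python
# Abort conditions mirroring the slide's early-warning bullets.
# Field names and thresholds are invented for illustration.
ABORT_CHECKS = {
    "mission_changed": lambda cop: cop["mission_id"] != cop["deception_mission_id"],
    "risk_increased": lambda cop: cop["friendly_risk"] > cop["risk_threshold"],
    "timeline_slipped": lambda cop: cop["eta_hours"] > cop["io_sync_window_hours"],
    "deception_blown": lambda cop: cop["adversary_awareness"] >= 0.5,
}

def early_warning(cop: dict) -> list:
    """Return the names of all tripped abort conditions in a COP snapshot."""
    return [name for name, check in ABORT_CHECKS.items() if check(cop)]

snapshot = {
    "mission_id": 7, "deception_mission_id": 7,
    "friendly_risk": 0.6, "risk_threshold": 0.4,
    "eta_hours": 30, "io_sync_window_hours": 48,
    "adversary_awareness": 0.1,
}
print(early_warning(snapshot))  # -> ['risk_increased']
```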

  14. Cost/Benefit & Risk/Failure Analysis and Management • Risk of Deception Failure • Failure or exposure of the deception can significantly affect the friendly commander’s operational activities. • Commander must understand risk associated with basing the success of any mission on the assumed success of a deception. • Typical failure modes. • Target does not receive the story. • Target does not believe the story. • Target is unable to act. • Target is indecisive even if story is believed. • Risk of Exposure of Means or Feedback Channels • Risks to third parties. • Neutral or friendly forces not read into the deception may receive and act on deception information intended for the target.
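
Since the four failure modes form a chain (the target must receive, believe, be able to act on, and decisively act on the story), the success probability is their product. A simple expected-value sketch follows, with all numbers invented for illustration.

```python
def p_deception_success(p_receive, p_believe, p_can_act, p_decisive):
    """The slide's four failure modes chain: all must go right."""
    return p_receive * p_believe * p_can_act * p_decisive

def expected_value(p_success, benefit, cost_of_exposure, p_exposure):
    """Weigh the payoff of success against the cost of being found out."""
    return p_success * benefit - p_exposure * cost_of_exposure

p = p_deception_success(0.9, 0.7, 0.8, 0.6)  # ~0.30: weakest link dominates
ev = expected_value(p, benefit=100, cost_of_exposure=250, p_exposure=0.1)
print(round(p, 3), round(ev, 2))  # -> 0.302 5.24
```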

  15. Intelligent Lessons Learned System • Manages acquisition, distribution and active incorporation of deception experience. • Explicitly stated knowledge based on experience. • Knowledge that is automatically learned from observed actions. • As new technologies become available for creating deceptions, reason and learn about new ways of using the technologies based on previous cases. • E.g., “if we had technology X, we could have done Y and that would have produced effect Z.” • Learn about types of data that are required to support an operation by studying user requests in past operations.
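
As a sketch of the last bullet, the code below infers likely data needs for a new operation by counting user requests in similar past operations. The request log and operation type names are invented for illustration.

```python
from collections import Counter

# Invented log of past user data requests, keyed by operation type.
PAST_REQUESTS = [
    {"op_type": "urban_feint", "data": "adversary_sensor_coverage"},
    {"op_type": "urban_feint", "data": "civilian_movement_patterns"},
    {"op_type": "urban_feint", "data": "adversary_sensor_coverage"},
    {"op_type": "rural_decoy", "data": "terrain_masking"},
]

def likely_data_needs(op_type: str, top_n: int = 2) -> list:
    """Rank data types by how often they were requested in past
    operations of the same type."""
    counts = Counter(r["data"] for r in PAST_REQUESTS if r["op_type"] == op_type)
    return [data for data, _ in counts.most_common(top_n)]

print(likely_data_needs("urban_feint"))
# -> ['adversary_sensor_coverage', 'civilian_movement_patterns']
```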
