
Overview of DETER


Presentation Transcript


  1. The Challenges of Repeatable Experiment Archiving – Lessons from DETER
     Stephen Schwab, SPARTA, Inc. d.b.a. Cobham Analytic Solutions
     May 25, 2010

  2. Overview of DETER
     DETER Highlights
     • 3 distributed clusters, ~500 nodes
     • Combination of DETER-developed software and legacy Emulab
     DETER Capabilities
     • Federation
     • Security Experimentation Environment (SEER)
     • Templates
     [Architecture diagram labels: DETER Emulab and WAIL Emulab sites, GMPLS/DRAGON provisioned connectivity over the Internet, SEER, Federator, CEDL, credentials, USERS, plug-ins to configure federants]

  3. DDoS Experiment on DETER (circa 2005)
     • Background Traffic: REPLAY | NTCG | HARPOON (high-fidelity traffic)
     • Topology: BUILDING-BLOCKS | JUNIPER ROUTER CORE (realistic connectivity and scale-down)
     • Attack Traffic: DETER-integrated attack scripting (automation of the variety of scenarios under study)
     • Instrumentation: packet and host statistics capture | spectral analysis | metrics calculation | integrated visualization
     • SEER: toolbox for rigorous investigation of results
     [Topology diagram labels: CORE (AS-11357), ATTACK TRAFFIC, BACKGROUND TRAFFIC]

  4. Security Experiment Methodology & Tools (circa 2005)
     DETER -- integrated workbench & tools for experimenters…
     • Experimenters select from a palette of predefined elements: topology, background and attack traffic, and data capture and instrumentation
     • Our methodology frames standard, systematic questions that guide an experimenter in selecting and combining the right elements
     • Experiment automation increases repeatability and efficiency by integrating the process within the DETER testbed environment
     [Workflow diagram labels: TOPOLOGY, TRAFFIC, ATTACK, DATA-CAPTURE palettes; METHODOLOGY & GUIDANCE; EXPERIMENT AUTOMATION]
     … but this level of abstraction leads to major drawbacks
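To make the palette idea concrete, here is a minimal, purely illustrative Python sketch of how an experimenter might combine topology, traffic, attack, and instrumentation elements into one scenario description, loosely modeled on the 2005 DDoS experiment above. This is not DETER's or SEER's actual API; the Element and Scenario names and all parameters are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical palette entry: a named, parameterized experiment element.
@dataclass
class Element:
    kind: str            # "topology", "background_traffic", "attack", "instrumentation"
    name: str            # which palette item was selected
    params: dict = field(default_factory=dict)

# Hypothetical scenario: the experimenter's selection from each palette.
@dataclass
class Scenario:
    elements: list

    def describe(self):
        for e in self.elements:
            print(f"{e.kind:>22}: {e.name} {e.params}")

# Example selection mirroring the element categories of the 2005 DDoS experiment.
ddos = Scenario(elements=[
    Element("topology", "building_blocks_core", {"core": "juniper", "scale_down": True}),
    Element("background_traffic", "harpoon", {"flows": 5000}),
    Element("attack", "ddos_flood_script", {"rate_pps": 20000}),
    Element("instrumentation", "packet_and_host_stats", {"spectral_analysis": True}),
])

ddos.describe()
```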

  5. Worm/Botnet Experiment (2009)
     831 virtual nodes on 63 physical PCs

  6. Experiment Specification
     • Large and complex experiments are more suitably constructed
       • by combining abstract elements
       • modeling different aspects (topology, traffic, networking devices, etc.)
       • with constraints on behavior
     • Example of such an experiment on the previous slide
       • hand-crafted (e.g. hand-compiled) experiment from abstract elements
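A hedged sketch of what "combining abstract elements with constraints on behavior" could look like as data rather than a hand-compiled script. The schema, element names, and property names below are invented for illustration and are not a DETER or CEDL format.

```python
# Illustrative only: an abstract experiment expressed as elements plus
# behavioral constraints, independent of any concrete realization.
experiment_spec = {
    "elements": {
        "enterprise_net": {"aspect": "topology", "model": "two_tier_lan", "hosts": 800},
        "edge_router":    {"aspect": "networking_device", "model": "bgp_router"},
        "worm":           {"aspect": "traffic", "model": "scanning_worm", "scan_rate": 50},
        "telemetry":      {"aspect": "instrumentation", "model": "flow_capture"},
    },
    "constraints": [
        {"element": "edge_router", "property": "forwarding_rate_pps", "min": 100_000},
        {"element": "worm",        "property": "infection_fraction",  "max": 0.9},
        {"element": "telemetry",   "property": "sampling_loss",       "max": 0.01},
    ],
}

def check(spec, observations):
    """Report whether each behavioral constraint holds for the observed values."""
    for c in spec["constraints"]:
        value = observations.get((c["element"], c["property"]))
        ok = (value is not None
              and value >= c.get("min", float("-inf"))
              and value <= c.get("max", float("inf")))
        print(f"{c['element']}.{c['property']}: {'OK' if ok else 'VIOLATED'} (value={value})")

# Example: pretend measurements taken during one run of the experiment.
check(experiment_spec, {
    ("edge_router", "forwarding_rate_pps"): 250_000,
    ("worm", "infection_fraction"): 0.95,
    ("telemetry", "sampling_loss"): 0.002,
})
```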

  7. Initial Approach: Archiving it All
     • Intuition drawn from analogy with physical (discovery) sciences…
     • Record all aspects of experiment to ensure (ideal) reproducibility
     • Software
       • Artifacts being investigated (often the researcher’s new system!)
       • Operating environment (OS, standard software on clients, servers in experiment scenario, network routers, firewalls, etc.)
       • Experiment & test infrastructure (initialization, control, data collection, data reduction, data analysis, data visualization, …)
     • Hardware
       • All end-systems and routers/switches
       • All firmware
       • All chips/chipset variants (Tulip 21140As are not Tulip 21140Es!)
     • Procedures
       • All scripts and manual interactions required to run the experiment
     • … networked systems require large and growing (unbounded) detail to describe precisely… which ideal reproducibility would seem to demand
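As a thought experiment, the "archive everything" approach might be approximated by a manifest like the one sketched below. The fields and helper names are hypothetical, not a DETER archive format; even this toy version hints at how quickly the required detail grows.

```python
import hashlib
import json
import platform
import sys

def file_digest(path):
    """SHA-256 of one artifact so the archive can later verify the exact bits."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical manifest: software, hardware and procedural detail all recorded.
manifest = {
    "software": {
        "artifact_under_test": {"name": "new_ids_prototype", "version": "0.3.1"},
        "operating_environment": {"os": platform.platform(), "python": sys.version},
        "test_infrastructure": ["init.sh", "collect_stats.py", "plot_results.py"],
    },
    "hardware": {
        "nodes": [{"model": "pc3000", "nic_chipset": "Tulip 21140A"}],  # the variant matters!
        "firmware": {"bios": "unknown"},  # often not even observable from software
    },
    "procedures": {
        "scripts": {},  # name -> digest, filled in below
        "manual_steps": ["power-cycle node-7 before run 3"],
    },
}

for script in manifest["software"]["test_infrastructure"]:
    try:
        manifest["procedures"]["scripts"][script] = file_digest(script)
    except FileNotFoundError:
        manifest["procedures"]["scripts"][script] = "MISSING"

print(json.dumps(manifest, indent=2))
```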

  8. Challenges to Ideal Archiving
     Separating Invariants from Contingencies
     • An experiment requires certain properties; these are the essence of the experiment
     • But every configuration detail must be specified; these are contingencies, merely choices (perhaps important to record)
     • Repeatability should be primarily defined with respect to explicit invariants
     Experiment Internals
     • Publications do not capture full details because the increasing complexity of software, hardware and networking technologies results in (exponential?) growth in the description of these aspects
     • The peer-review process does not provide incentives to capture full details (noted in other position papers)
     • Funding agencies do not provide sufficient funding to do so (how much detail can, will and should be demanded? Where is the limit on returns for dollars invested?)
     Granularity of Reuse
     • Individual researchers are interested in examining, studying and re-running different elements of any given experiment
     • Experiments that archive everything do not clearly delineate the various pieces
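One way to act on this distinction, sketched under assumed names and values (not DETER code): record contingencies for the archive, but define and check repeatability only against the explicit invariants.

```python
# Illustrative split of an experiment description into invariants
# (the essence of the experiment) and contingencies (recorded choices).
invariants = {
    "attack_rate_pps_min": 10_000,           # the attack must actually stress the target
    "background_load_fraction": (0.4, 0.6),  # plausible ambient traffic level
    "measurement_window_s": 300,
}

contingencies = {
    "nic_chipset": "Tulip 21140A",           # worth recording, but not the essence
    "kernel_build": "2.6.18-custom",
    "node_names": ["pc041", "pc077", "pc112"],
}

def is_repeat(run_a, run_b, invariants):
    """Two runs count as repetitions iff both satisfy every explicit invariant."""
    def satisfies(run):
        lo, hi = invariants["background_load_fraction"]
        return (run["attack_rate_pps"] >= invariants["attack_rate_pps_min"]
                and lo <= run["background_load_fraction"] <= hi
                and run["measurement_window_s"] == invariants["measurement_window_s"])
    return satisfies(run_a) and satisfies(run_b)

run1 = {"attack_rate_pps": 12_000, "background_load_fraction": 0.50, "measurement_window_s": 300}
run2 = {"attack_rate_pps": 15_500, "background_load_fraction": 0.45, "measurement_window_s": 300}
print("contingencies recorded for the archive:", contingencies)
print("repeatable with respect to invariants:", is_repeat(run1, run2, invariants))
```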

  9. Future DETER Capability & Vision
     DETER is developing the capability to
     • Specify experiments declaratively
     • Reason about the software (or hardware) alternatives that may be available to realize each element in a testbed
     • Select implementations that are sufficient to perform the experiment correctly
       • … and ensure detection of fidelity loss through the use of monitored invariants
     • Resolve global conflicts among local element-to-implementation mappings
     The DETER vision is to foster
     • Reuse through sharing of tools, technology, results & ideas among researchers
       • … and to promote this vision by providing abstractions, models and elements that are supported by our experiment life-cycle framework and tools
     • Support for individuals and researchers focused on specific topics in creating their own abstractions, models and elements
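A minimal sketch of the selection-and-monitoring idea, under invented names and data (this is not the DETER implementation): each abstract element has candidate realizations, a resolver picks ones that jointly satisfy the experiment's requirements and a global node budget, and a monitored invariant flags fidelity loss at run time.

```python
# Hypothetical candidate realizations for two abstract elements.
candidates = {
    "router_core": [
        {"impl": "hardware_router", "max_pps": 1_000_000, "nodes_needed": 4},
        {"impl": "software_router", "max_pps":   150_000, "nodes_needed": 1},
    ],
    "end_hosts": [
        {"impl": "physical_pcs",  "max_hosts": 500,  "nodes_needed": 500},
        {"impl": "virtual_nodes", "max_hosts": 1000, "nodes_needed": 63},
    ],
}

# What the experiment actually requires of each element, plus a global budget.
requirements = {"router_core": {"max_pps": 200_000},
                "end_hosts":   {"max_hosts": 831}}
node_budget = 80

def resolve(candidates, requirements, budget):
    """Greedy sketch: per element, pick the cheapest realization meeting its
    requirements, then check the global node budget across all choices."""
    chosen, total = {}, 0
    for element, options in candidates.items():
        need = requirements[element]
        feasible = [o for o in options
                    if all(o.get(k, 0) >= v for k, v in need.items())]
        if not feasible:
            raise ValueError(f"no realization satisfies {element}")
        pick = min(feasible, key=lambda o: o["nodes_needed"])
        chosen[element] = pick
        total += pick["nodes_needed"]
    if total > budget:
        raise ValueError(f"global conflict: {total} nodes needed, only {budget} available")
    return chosen

def fidelity_ok(observed_pps, required_pps):
    """Monitored invariant: flag fidelity loss if the realized core falls short at run time."""
    return observed_pps >= required_pps

plan = resolve(candidates, requirements, node_budget)
print({element: choice["impl"] for element, choice in plan.items()})
print("fidelity preserved:", fidelity_ok(observed_pps=240_000, required_pps=200_000))
```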
