
Overhead analysis of Discrete Event models execution
by Ezequiel Glinsky, Research Assistant, University of Buenos Aires, Argentina
Supervisor: Prof. Gabriel A. Wainer, SCE, Carleton University
ESG Seminars, Thursday, November 15th, 2001

Presentation Transcript


  1. Overhead analysis of Discrete Event models execution
  by Ezequiel Glinsky, Research Assistant, University of Buenos Aires, Argentina
  Supervisor: Prof. Gabriel A. Wainer, SCE, Carleton University
  ESG Seminars, Thursday, November 15th, 2001

  2. Seminar topics will include...
  • Introduction to the DEVS formalism
  • Performance analysis of different DEVS tools
  • RT-DEVS extension to the formalism
  • Development of enhancements (work-in-progress)

  3. DEVS Modeling & Simulation Framework (Introduction)
  • DEVS = Discrete Event System Specification
  • Provides a sound formal M&S framework
  • Supports the full range of dynamic system representation capability
  • Supports hierarchical, modular model development
  (Zeigler, 1976/84/90/00)

  4. DEVS Modeling & Simulation Framework (contd.)
  • Discrete-event formalism: time advances over a continuous time base.
  • Basic models can be coupled to build complex simulations.
  • Abstract simulation mechanism.
  [Figure: sample model]
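To make the "basic models" idea concrete, here is a minimal sketch of a DEVS atomic model in Python. This is illustrative only, not CD++'s actual API: the `Processor` class and its method names are assumptions; the four pieces (time advance, external transition, output, internal transition) are the standard DEVS atomic-model functions.

```python
# Sketch of a DEVS atomic model: a processor that accepts a job,
# stays busy for `service_time`, then emits it. Names are illustrative,
# not CD++'s real interface.
INFINITY = float("inf")

class Processor:
    def __init__(self, service_time):
        self.service_time = service_time
        self.job = None              # state: job currently in service (or None)

    def time_advance(self):
        # ta(s): time until the next internal event in the current state
        return self.service_time if self.job is not None else INFINITY

    def external_transition(self, elapsed, job):
        # delta_ext(s, e, x): react to an input event
        if self.job is None:
            self.job = job

    def output(self):
        # lambda(s): output produced just before the internal transition
        return self.job

    def internal_transition(self):
        # delta_int(s): state change when the scheduled internal event fires
        self.job = None

p = Processor(service_time=5.0)
p.external_transition(0.0, "job-1")
assert p.time_advance() == 5.0       # busy for 5 time units
out = p.output()
p.internal_transition()
assert p.time_advance() == INFINITY  # idle: passive until the next input
```

An abstract simulator drives such models by repeatedly asking `time_advance()`, collecting `output()`, and applying the appropriate transition; coupled models route outputs of one component to inputs of another.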

  5. Testing & Performance Analysis
  "Performance analysis of different DEVS environments"
  We need a synthetic model generator to represent different possible model configurations.
  Why do we need a model generator?
  • Detect bottlenecks
  • Characterize the tool's overhead
  • Test automatically and thoroughly
  • Assess the current overhead, and therefore the feasibility of RT simulation execution

  6. A model generator
  Available parameters:
  • Depth
  • Width
  • Dhrystone code in transition functions
  • Model type

  7. A model generator (contd.)
  • Depth: number of levels of the modeling hierarchy.

  8. A model generator (contd.)
  • Width: number of children belonging to each intermediate coupled component.

  9. A model generator (contd.)
  • Dhrystone code in transition functions: allows us to execute time-consuming code inside both the internal and external transition functions.

  10. A model generator (contd.)
  • Model type: different types of models can be generated, with different behavior, coupling and interconnections.

  11. A model generator (contd.)
  Sample model parameters:
  DEPTH = 4
  WIDTH = 3
  Time in internal transition = 50 ms
  Time in external transition = 50 ms
  Model type = 1
  (coupled components #1 and #2 are not shown)
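A quick way to see how depth and width drive model size is to count components. The layout assumed below (each intermediate coupled component has `width` children, exactly one of which is the next coupled component, and the deepest one has `width` atomic children) is an assumption about the generator, not its documented specification.

```python
def generated_model_size(depth, width):
    """Count components in a synthetic model under an ASSUMED layout:
    every intermediate coupled component has `width` children, one of
    which is coupled; the deepest coupled component has `width` atomic
    children."""
    if depth < 2:
        # degenerate case: a single coupled component full of atomics
        return {"coupled": 1, "atomic": width}
    coupled = depth - 1
    atomic = (depth - 2) * (width - 1) + width
    return {"coupled": coupled, "atomic": atomic}

# Under this assumption, DEPTH = 4 and WIDTH = 3 give
# 3 coupled components and 7 atomic models:
print(generated_model_size(4, 3))
```

Varying the two parameters independently lets the test sweep separate the cost of hierarchy depth from the cost of fan-out.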

  12. Testing & Performance Analysis
  Available environments:
  • Original CD++ simulator
  • Parallel CD++ with NoTime kernel
  • Parallel CD++ with TimeWarp kernel
  • Real-Time CD++

  13. Testing & Performance Analysis
  • Original CD++ simulator: provides stand-alone simulation only; doesn't need to rely on any intermediate layer.

  14. Testing & Performance Analysis
  • Parallel CD++ with NoTime kernel: uses a middleware to allow parallel simulation; NoTime is an unsynchronized kernel.

  15. Testing & Performance Analysis
  • Parallel CD++ with TimeWarp kernel: uses a middleware to allow parallel simulation; TimeWarp follows an optimistic approach.

  16. Testing & Performance Analysis
  Obtained results
  [Chart: X-axis: executed simulation; Y-axis: total execution time]
  All results were obtained using centralized model execution.

  17. Testing & Performance Analysis
  Conclusions:
  • Original CD++ executes with minimum overhead on stand-alone simulation
  • Each technique induces a different overhead on the simulation, which remains stable under normal conditions
  • Whenever parallel execution is needed, the NoTime kernel outperforms the TimeWarp kernel
  • To analyze distributed performance, testing should be done on distributed RT simulations

  18. Real-Time DEVS (RT-DEVS)
  What is RT-DEVS?
  • RT-DEVS is an extension to the DEVS formalism
  Why do we need RT-DEVS?
  • To run models interacting in a real-time environment
  • To study real-time performance with the designed models

  19. RT-DEVS (contd.)
  Main differences between the usual approach and RT-DEVS:
  Usual approach:
  • Time is not linked to a clock at all; instead, virtual time is used (a logical clock)
  • No timing constraints
  RT-DEVS:
  • Time advance is linked to the wall-clock all along the simulation
  • Timing constraints are checked against the wall-clock at given checkpoints
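The wall-clock checkpoint idea can be sketched in a few lines: run a transition, measure elapsed real time with a monotonic clock, and compare against the constraint. This is an illustrative sketch, not CD++ code; the function name `fire_with_deadline` is invented here.

```python
import time

def fire_with_deadline(transition, deadline_s):
    """Run one transition and check its completion time against the
    wall clock, as RT-DEVS does at checkpoints (illustrative only)."""
    start = time.monotonic()          # monotonic: immune to clock adjustments
    transition()
    elapsed = time.monotonic() - start
    return elapsed, elapsed <= deadline_s

elapsed, met = fire_with_deadline(lambda: None, deadline_s=1.0)
```

In contrast, a virtual-time simulator would simply advance its logical clock by the model's `time_advance()` value and never consult `time.monotonic()` at all.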

  20. RT-DEVS (contd.)
  What do we measure when executing RT-DEVS, and why?
  • Worst-case response time
  • Number of missed deadlines
  Besides...
  • The log provides detailed information about the message passing
  • The output results briefly show the most important timing information: (wall-clock time, deadline, port, output value)

  21. RT-DEVS - Sample Model: Alarm Clock
  • Simple timed-model design; no modification required to run under RT-DEVS
  • Timing performance can be easily validated
  • Can be seen as a component of a time-critical system:
    • Flight-control systems
    • Industrial plants
    • Complex high-speed communications & systems
  Designed by Christian Jacques, SCE, Carleton University

  22. Testing Real-Time performance
  • A new tool is needed: an event generator
  Testing technique:
  • Different model types, sizes and time-consuming transitions
  • Different frequencies and associated deadlines
  Goal:
  • Obtain a detailed characterization of the tool's overhead
  • Performance analysis
  Parameters: time between events, associated deadline
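An event generator with those two parameters can be sketched as follows. The tuple layout `(time, deadline, port, value)` mirrors the timing information the slides mention; the exact CD++ event-file syntax is not reproduced here, and `generate_events` is an invented name.

```python
def generate_events(n, inter_event_ms, deadline_ms, port="in"):
    """Sketch of an event generator: n events with a fixed inter-event
    time, each carrying a deadline relative to its injection time."""
    events = []
    for i in range(n):
        t = i * inter_event_ms                  # injection time
        events.append((t, t + deadline_ms, port, i))
    return events

# e.g. 100 events, 8200 ms apart, deadline = 2x the theoretical
# execution time of 8200 ms:
events = generate_events(100, inter_event_ms=8200, deadline_ms=16400)
```

Sweeping `inter_event_ms` and `deadline_ms` across runs produces the workload grid used for the missed-deadline measurements that follow.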

  23. Testing Real-Time performance (contd.)
  Worst-case execution time - Obtained results (1)
  Model type = 1, Width = 12, Internal transition = 50 ms, External transition = 50 ms
  [Chart: X-axis: model's depth; Y-axis: worst-case time (ms)]

  24. Testing Real-Time performance (contd.)
  Worst-case execution time - Obtained results (2)
  [Chart 1: X-axis: model's depth; Y-axis: difference between theoretical and actual execution times]
  [Chart 2: X-axis: model's depth; Y-axis: % of overhead incurred by the RT simulator]
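The "% of overhead" plotted above is just the relative excess of the measured time over the theoretical execution time, which can be stated in one line:

```python
def overhead_percent(actual_ms, theoretical_ms):
    """Relative overhead of the RT simulator: how much the measured
    execution time exceeds the theoretical execution time, in percent."""
    return 100.0 * (actual_ms - theoretical_ms) / theoretical_ms

# a run that took 110 ms against a 100 ms theoretical time -> 10% overhead
print(overhead_percent(110.0, 100.0))
```

Reporting the percentage rather than the raw difference is what lets the conclusions claim the overhead "remains nearly stable" as model complexity grows.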

  25. Testing Real-Time performance (contd.)
  Missed deadlines - Obtained results (1)
  Width = 10, Depth = 10, Internal function = 50 ms, External function = 50 ms
  Number of events = 100, Time between events = 8200 ms, Theoretical execution time = 8200 ms
  [Chart: X-axis: associated deadlines; Y-axis: % of success]

  26. Testing Real-Time performance (contd.)
  Missed deadlines - Obtained results (2)
  Width = 10, Depth = 10, Internal function = 50 ms, External function = 50 ms
  Number of events = 100, Associated deadline = 2 × theoretical execution time, Theoretical execution time = 8200 ms
  [Chart: X-axis: time between events; Y-axis: % of success]
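The "% of success" on the Y-axis of both charts is the fraction of events that met their deadline. A minimal sketch of the metric:

```python
def success_rate(response_times_ms, deadline_ms):
    """Percentage of events whose response time met the deadline
    (the '% of success' Y-axis on the missed-deadline charts)."""
    met = sum(1 for r in response_times_ms if r <= deadline_ms)
    return 100.0 * met / len(response_times_ms)

# two of four events respond within the 25 ms deadline -> 50% success
print(success_rate([10, 20, 30, 40], deadline_ms=25))
```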

  27. Testing Real-Time performance (contd.)
  Conclusions
  • Increasing complexity → increasing response times
  • Nevertheless, the percentage of overhead remains nearly stable → simulations can be carried out properly
  Bottom line: after thorough testing, we can say the real-time simulator is able to execute simulations properly even under difficult conditions (high workload and mid- to large-scale models)

  28. Flattened Simulator (work-in-progress)
  Why do we need a flattened simulator?
  → To increase the tool's performance and successfully simulate even more complex models with higher workload

  29. Flattened Simulator
  Existing hierarchical simulator:
  • Intermediate coordinators associated with each coupled component
  • High number of messages exchanged along the simulation
  • This induces more overhead!
  [Figures: model hierarchy; associated hierarchical simulator]

  30. Flattened Simulator
  Proposed flattened simulator:
  • Must keep the separation between model and actual simulator
  • Reduce the number of intermediate coordinators
  • Simplify the hierarchy and reduce message exchange along the simulation
  • Less overhead expected!

  31. Flattened Simulator
  [Figures: existing hierarchical simulator vs. proposed non-hierarchical flattened simulator]
  • Only one coordinator exists, and it centralizes more responsibilities
  • Important reduction of exchanged messages
  • Simplified hierarchy
  • Keeps the separation between model and actual simulator
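A back-of-the-envelope model shows why flattening reduces traffic. Assume (simplification, not measured CD++ behavior) that delivering one event to an atomic simulator costs one message per coordinator hop down, and routing its output back costs one message per hop up; synchronization traffic is ignored.

```python
def messages_per_event(depth, flattened=False):
    """Rough message count to deliver one event to an atomic simulator
    and route its output back out.  Hierarchical: the event crosses one
    coordinator per hierarchy level in each direction; flattened: a
    single coordinator mediates both directions.  (A simplification:
    real simulators also exchange synchronization messages.)"""
    hops = 1 if flattened else depth
    return 2 * hops  # one message down + one message up per hop

# a 4-level hierarchy: 8 messages hierarchical vs. 2 flattened
print(messages_per_event(4), messages_per_event(4, flattened=True))
```

Under this assumption the hierarchical cost grows linearly with depth while the flattened cost stays constant, which is the "important reduction of exchanged messages" the slide anticipates.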

  32. Further work
  • Finish the flattened simulator's design and development
  • Execute overhead and performance analysis using the new flattened simulator
  More information: http://www.sce.carleton.ca/faculty/wainer/wbgraf/index.html

  33. Overhead analysis of Discrete Event models execution
  by Ezequiel Glinsky, Research Assistant, University of Buenos Aires, Argentina
  Supervisor: Prof. Gabriel A. Wainer, SCE, Carleton University
  Questions?
  Thursday, November 15th, 2001
