Dynamic Data-Driven Application Simulation (DDDAS) Clay Harris Jay Hatcher Cindy Burklow
General Simulation • Calculations are predefined • Boundary conditions are predefined • Initial data is given • Time step is predefined • Additional data input at predetermined times • Results are recorded and often studied later
DDDAS • Calculations may change depending upon the incoming data • Boundary conditions may be updated during the simulation • Initial data is given, but may be corrected at a later time • Time step may change depending upon incoming data values • Additional data arrives at any time, possibly out of order • Frequently the results are monitored in real time
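The contrast with a general simulation can be sketched as a minimal data-driven loop in which incoming measurements correct the running state and adjust the time step. This is only an illustration: the sensor model, nudging weights, and constants are all hypothetical.

```python
import random

random.seed(42)  # deterministic for illustration

def read_sensor():
    """Stand-in for a real-time sensor: a noisy reading, or None if no data."""
    return random.gauss(20.0, 0.5) if random.random() < 0.7 else None

def simulate(state, dt):
    """One predefined simulation step: simple relaxation toward 20.0."""
    return state + dt * (20.0 - state) * 0.1

state, dt, t = 15.0, 1.0, 0.0
for _ in range(10):
    measurement = read_sensor()
    if measurement is not None:
        # DDDAS twist: incoming data corrects the running state (nudging) ...
        state = 0.8 * state + 0.2 * measurement
        dt = 0.5  # ... and may shrink the time step while data is fresh
    else:
        dt = 1.0  # no data: revert to the predefined step
    state = simulate(state, dt)
    t += dt
```

In a general simulation the `if measurement` branch would not exist: the algorithm, boundary conditions, and time step would stay fixed for the whole run.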
What is DDDAS? • A DDDAS is one where data is fed into an executing application either as the data is collected or from a data archive [1, p. 662]. • The data is then used to steer the measurement process for additional data the simulation may require. [1] Frederica Darema. “Dynamic Data Driven Applications Systems: A New Paradigm for Application Simulations and Measurements”. International Conference on Computational Science (ICCS 2004), pp. 662–669, 2004.
Dynamic Predictions • Wildfire Forecasting • Tsunami Forecasting • Traffic Jam Forecasting • Weather Forecasting • Global Warming – El Niño • Ocean Modeling • Cyclone Movement Prediction • Threat Management in Urban Water Supplies • Fault Diagnosis of Wind Turbine Systems • Operational Control for Manufacturing • Brain-Machine Interfaces • Landscape Biophysical Change
Keep in mind with DDDAS • Typically approximating a nonlinear time dependent partial differential equation – nontraditional convergence • Perturbations from incoming data • Inaccurate data • Propagation of error • Boundary conditions are rarely known
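The error-propagation caveat above can be made concrete with a toy nonlinear update: a small perturbation in the incoming data grows rapidly under repeated nonlinear steps. The logistic map here is only a stand-in for a real nonlinear time-dependent model; the numbers are illustrative.

```python
def step(x, r=3.9):
    """One step of the logistic map, a classic chaotic (nonlinear) update."""
    return r * x * (1.0 - x)

a, b = 0.500, 0.501   # true value vs. a slightly inaccurate measurement
max_err = 0.0
for _ in range(20):
    a, b = step(a), step(b)
    max_err = max(max_err, abs(a - b))
# max_err is now far larger than the original 1e-3 perturbation
```

This is why convergence in a DDDAS is "nontraditional": every data injection is a perturbation whose effect must be tracked, not assumed away.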
Traditional Simulation Infrastructure [Diagram: Initial Conditions and Initial Algorithm feed the CPU, which produces Graphical Output]
DDDAS Infrastructure [Diagram: Initial Algorithm feeds the CPU, which produces Graphical Output, with Real-Time Sensors feeding data into the running CPU]
What is DDDAS (Frederica Darema, NSF) [Diagram: the OLD paradigm (serialized and static) — Theory (First Principles) and Simulations (Math. Modeling, Phenomenology) on one side, Experiment, Measurements, Field-Data, and the User on the other — contrasted with the NEW PARADIGM (Dynamic Data-Driven Simulation Systems), which adds Observation Modeling and Design to the simulations and closes a Dynamic Feedback & Control Loop between simulations and measurements. Challenges: Application Simulations Development, Algorithms, Computing Systems Support]
A DDDAS Model (Dynamic, Data-Driven Application Systems) [Diagram: sensors & actuators spread across a spectrum of physical systems — cosmological (~10^-20 Hz) through human (~3 Hz) to subatomic (~10^+20 Hz) — discover, ingest, and interact with models and computations, loading behavior into a computational infrastructure (grids, perhaps?). Craig Lee, IPDPS panel, 2003]
DDDAS Research • Data injection methods • 2-way communication with sensors • Quick methods for static simulation conversion to DDDAS • Infrastructure support for dynamic methods – including communications support, data driven technologies, and OS software
Data Determines Everything • The algorithm used • Additional data collected • Simulation restart (cold or warm) • Output correction • Communications with people • The Result!
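As a minimal illustration of "data determines the algorithm," a sketch in which an incoming gradient value selects which solver runs next. Both solvers and the switching rule are hypothetical.

```python
def solve_smooth(x):
    """Cheap scheme, adequate while the field varies slowly."""
    return 0.5 * x

def solve_shock(x):
    """More robust (and costlier) scheme for steep gradients."""
    return 0.1 * x

def choose_solver(gradient):
    # Steep incoming gradients switch the simulation to the robust scheme;
    # the 10.0 threshold is invented for illustration.
    return solve_shock if abs(gradient) > 10.0 else solve_smooth

solver = choose_solver(50.0)  # steep data selects the shock solver
```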
Dynamic Work Flows • Flexible event handling system notifies appropriate recipients of relevant events • Dynamic workflow handling system coordinates and schedules actions in response to known events
Dynamic Work Flows • Events delivered using a publish/subscribe model or based on content • Decision makers receive event notification and make an appropriate response decision • Responses are executed by a workflow engine that schedules data transfer and process execution
Dynamic Work Flows • Besides events causing an initial response, subsequent events may alter an existing workflow • Current amount of workflow completed must be determined • Current tasks on the “leading edge” of the workflow must be terminated or allowed to complete • Status and disposition of data referenced by data handles must be determined • Storage management issues • Dangling references to no data or stale data • Inaccessible data referenced by no one
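The publish/subscribe delivery described in these slides can be sketched with an in-memory event bus; real DDDAS middleware is distributed and far richer, and the topic names and payloads here are illustrative.

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe bus: handlers register per topic."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Deliver the event to every subscriber of the topic
        for handler in self.subscribers[topic]:
            handler(payload)

bus = EventBus()
responses = []
# A decision maker subscribes to sensor events and schedules a response
# (here, a request to resample at the reporting node).
bus.subscribe("sensor/contamination",
              lambda data: responses.append(("resample", data["node"])))
bus.publish("sensor/contamination", {"node": 17, "level": 0.9})
```

In a full system the appended response would go to a workflow engine, which could also cancel or reroute tasks when a later event alters an in-flight workflow.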
Data Driven Design Optimization Methodology (DDDOM) • DDDOM uses DDDAS to find an optimal solution to an engineering design problem • Used in Multi-criteria Design Optimization (MDO) problems • Uses Rapid Prototyping, Grid Computing, and other advanced technologies to perform simultaneous experimentation and simulation to achieve optimal designs
DDDOM Application – Cooling of Electronic Components • Optimize the design of a cooling system • Maximize heat transfer and minimize pressure drop • Increasing heat transfer also increases pressure drop, so there is no single optimal solution, but rather a set of good trade-off solutions • The problem is therefore an MDO problem
DDDOM Application – Cooling of Electronic Components • Select 25 sampling points from the design space • Perform computations at these 25 points on supercomputers • Collect data from physical experiments • Combine experiment and simulation data to build a Surrogate Model (SM) • Optimize the SM and obtain the Pareto Set
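The final step — extracting the Pareto set from the combined samples — can be sketched as follows. Heat transfer is to be maximized and pressure drop minimized; the sample values are invented for illustration.

```python
# Each sample is (heat_transfer, pressure_drop); values are hypothetical.
samples = [
    (10.0, 1.0), (12.0, 3.0), (9.0, 0.5), (12.0, 2.0), (11.0, 2.5),
]

def dominates(p, q):
    """p dominates q: no worse in both objectives, strictly better in one."""
    return (p[0] >= q[0] and p[1] <= q[1]) and (p[0] > q[0] or p[1] < q[1])

# The Pareto set is every sample that no other sample dominates.
pareto = [p for p in samples if not any(dominates(q, p) for q in samples)]
```

The dominated points (such as the design with the same heat transfer but a higher pressure drop) fall out, leaving only the trade-off frontier the slide calls the Pareto Set.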
DDDOM Application – Cooling of Electronic Components • Two optimization methods were used: • Epsilon constraint method • Multi-Objective Switching Genetic Algorithm (OSGA) • Results are comparable, with OSGA giving more data points
O’SOAP • A web services framework for DDDAS applications. • Geographically distributed set of application components • Reduces the effort required to develop DDDAS applications
O’SOAP – Advantages over traditional monolithic applications • Developer only needs to implement a program component on a single local platform • Loosely-coupled nature of the components facilitates reuse in new simulations • Allows simultaneous use by multiple research projects
O’SOAP • Current web service technologies are inadequate for DDDAS applications • They are generally geared for more interactive applications • Often have a learning curve that is steep enough to discourage computational scientists from experimenting with a remote DDDAS system
O’SOAP • DDDAS developers must consider: • Generating Interface Documentation • Data management concerns • Asynchronous Interactions • Authentication, Authorization and Accounting (AAA) • Job Scheduling • Performance
O’SOAP • Current web service technologies present the developer with a blank slate • For a novice, developing a web serviced DDDAS application is a difficult undertaking • O’SOAP provides a framework for designing DDDAS applications with minimal interface code and developer effort
O’SOAP - Implementation • Applications deployed as distributed components • Services automatically documented with WSDL • Asynchronous communication supported by sending a job ID for the remote application back to the client, which periodically checks the remote application’s status
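The asynchronous job-ID pattern above can be sketched with in-memory stand-ins for the remote SOAP calls. Class and method names are illustrative, not O'SOAP's actual API.

```python
import itertools

class RemoteService:
    """Stand-in for a remote O'SOAP-style service endpoint."""
    _ids = itertools.count(1)

    def __init__(self):
        self.jobs = {}

    def submit(self, request):
        # The service starts the job and immediately returns an ID,
        # not the result — that is what makes the interaction asynchronous.
        job_id = next(self._ids)
        self.jobs[job_id] = "RUNNING"
        return job_id

    def status(self, job_id):
        # A real service would report genuine progress; for this sketch
        # jobs simply finish by the first poll.
        self.jobs[job_id] = "DONE"
        return self.jobs[job_id]

service = RemoteService()
job = service.submit({"problem": "pipe", "size": "small"})
while service.status(job) != "DONE":
    pass  # the client periodically re-checks the remote application's status
```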
O’SOAP - Implementation • Supports small and large data sizes: • If the data is small, or the programmer so requests, it is included in the SOAP envelope as XML (pass by value) • If the data is large, or the programmer so requests, a URL pointing to the data is sent in the envelope (pass by reference)
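A sketch of that value-vs-reference decision, assuming a hypothetical size threshold; O'SOAP's actual cutoff and envelope layout may differ.

```python
SIZE_THRESHOLD = 1024  # bytes; illustrative cutoff, not O'SOAP's real value

def build_envelope(data: bytes, data_url: str, force_reference: bool = False):
    """Choose pass-by-value or pass-by-reference for a SOAP-style payload."""
    if force_reference or len(data) > SIZE_THRESHOLD:
        # Pass by reference: ship only a pointer to the data
        return {"type": "reference", "href": data_url}
    # Pass by value: embed the data itself in the envelope
    return {"type": "value", "payload": data.decode("utf-8", "replace")}

small = build_envelope(b"x" * 10, "http://example.org/data/1")
large = build_envelope(b"x" * 10_000, "http://example.org/data/2")
```

Pass-by-reference keeps large datasets out of the message path, which matters when components are geographically distributed.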
O’SOAP – Implementation • Performance measured with the Pipe Problem • Simulates an idealized segment of a rocket engine, modeled after one of NASA’s experimental rocket designs • Three different sizes of the Pipe Problem were used to evaluate how performance scales
Some Characteristics of DDDAS Projects • Managing complex scenarios • Predicting high-risk areas & safety • Affects large populations of people • Involves the natural environment • Impact on the overall economy • Needs a multi-disciplined team • Real-time analysis is critical
Threat Management in Urban Water Distribution Systems Situation… • Highly interconnected water transport system • Frequent flow fluctuations • Highly dynamic transport paths • Single point of contamination can quickly spread
Contamination Threat Management of drinking H2O involves… • Real-time characterization of contaminant source & plume • Identification of control strategies • Design of incremental data sampling schedules
Why use DDDAS… • Requires dynamic integration of time-varying measurements of flow, pressure, and contaminant concentration • The analytical modules are highly compute-intensive, requiring multi-level parallel processing on computer clusters
Project’s DDDAS infrastructure • Develop a cyber-infrastructure system that will both adapt to and control changing needs in data, models, computing resources, and management choices, facilitated by a dynamic workflow design • Virtual Simulations • Field Studies
Fault Diagnosis of Wind Turbine Systems Current Situation… • Current practices are non-dynamic & non-robust in their modeling, data collection, & processing strategies • Clean wind energy cannot yet compete with traditional energy sources • High financial cost compared to other energy sources • High maintenance cost • Low confidence in the diagnosis technology • Need to enable cost-effective generation of wind electricity
Involves… • Development of diagnosis system for wind turbines • Fault diagnosis of blades and gearboxes • Utilizes historical & online signals • Employs novel de-noising & sensor anomaly removal algorithms
Why use DDDAS… • Involves collaborative research that is multidisciplinary • Benefits a wide range of industries, such as the power generation, automobile, aerospace, and engine industries • Affects the overall general population through clean-air issues • Affects energy economic costs
Project’s DDDAS infrastructure • 2 robust data pre-processing modules for highlighting fault features and removing sensor anomalies • 3 interrelated, multi-level models that describe different details of the system behavior • 1 dynamic strategy for robust local interrogation that allows measurements to be taken adaptively according to specific physical conditions and the associated risk level • Overall, incorporates both historical data and on-line signals into the system modeling
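As a toy stand-in for the de-noising and anomaly-removal modules above, a moving-median filter that suppresses a single-sample sensor spike; the project's actual algorithms are more sophisticated.

```python
from statistics import median

def despike(signal, window=3):
    """Moving-median filter: robust to isolated sensor anomalies."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(median(signal[lo:hi]))  # median of the local window
    return out

raw = [1.0, 1.1, 9.9, 1.2, 1.0]   # 9.9 is a one-sample sensor spike
clean = despike(raw)               # the spike no longer dominates
```

A median (rather than a mean) is the usual choice here because a single outlier cannot drag the window's median far from the true signal level.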
Production Planning & Operational Control for Distributed Enterprise • Society depends upon many interacting large-scale dynamic systems • Too complex for mathematical analysis • Behavior of system networks depends on their linkages and the environment
Involves… • Focus on hierarchical production • Logistics planning • Control in highly capitalized discrete manufacturing system networks
Why use DDDAS… • Requires complex simulations • Needs dynamic reaction to various situations • Utilizes centralized control • High cost & financial risk involved
Project’s DDDAS infrastructure • Multi-scale federation of interwoven simulations • Decision models for planning • Control with capability for dynamic updating through sensors • Capacity to use off-line performance testing • Integrated architecture for distributed computing • Utilizes sensors, transducers, and actuators • Web service technology
Brain-Machine Interfaces (BMI) • Brain receives & uses sensory feedback to learn & generate signals to produce purposeful motion. • Addresses a chief problem in current BMI research: paraplegics cannot train their own network models because they cannot move their limbs.
Involves… • Cognitive brain modeling from experiments with live subjects • Design of brain-inspired assistive systems to help human beings with severe motor behavior limitations (e.g. paraplegics) through brain-machine Interfaces (BMIs). • BMI uses brain signals to directly control devices such as computers and robots.
Why use DDDAS… • Complexity of relationship between the brain & nervous system • Learning occurs simultaneously for the subject and the control models in a synergistic manner • Selective use of many computational models • Interdisciplinary team
Project’s DDDAS infrastructure • Develop models • Implement algorithms • Deploy computational architecture • All of the above will utilize recently proposed advanced brain models of motor control.
Sensor Networks – Enabling Measurement, Modeling & Prediction of Biophysical Change in a Landscape • Collecting environmental data is challenging • Deployed in remote locations • No access to infrastructure (e.g. power) • Wide range of sampling time scales
Involves… • Understanding how biodiversity & carbon storage are influenced by global change • Wireless sensor network • Models of tree growth & resource allocation • Adaptive sampling across diverse time & space scales
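Adaptive sampling across time scales can be sketched as an interval controller that samples faster when recent readings change quickly; the intervals and threshold below are illustrative, not from the project.

```python
def next_interval(recent, base=3600.0, fast=60.0, threshold=0.5):
    """Return seconds until the next sample, given the recent readings.

    Stable readings -> sample at the slow base rate (saves sensor power);
    fast-changing readings -> tighten the sampling interval.
    """
    if len(recent) < 2:
        return base  # not enough history to judge; use the default
    change = abs(recent[-1] - recent[-2])
    return fast if change > threshold else base

quiet = next_interval([20.0, 20.1])   # stable: sample hourly
active = next_interval([20.0, 21.5])  # changing: sample every minute
```

Power-starved remote deployments motivate exactly this trade: dense sampling only when the landscape is actually changing.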