

National Applications DSS Program Support. Assimilation of NASA Earth Science Results and Data in National Decision Support Systems: A Guidebook. Verne Kaupp, ICREST — University of Missouri-Columbia; Chuck Hutchinson, ARSC — University of Arizona-Tucson. 1 April 2004.







  1. National Applications DSS Program Support. Assimilation of NASA Earth Science Results and Data in National Decision Support Systems: A Guidebook. Verne Kaupp, ICREST — University of Missouri-Columbia; Chuck Hutchinson, ARSC — University of Arizona-Tucson. 1 April 2004

  2. Assimilation of NASA Earth Science Results and Data in National Decision Support Systems: A Guidebook

  3. Decision Making Considerations
  [Figure: a National Application's partnering-agency Decision Making System containing Decision Support Systems 1 through n, each with a Problem Processing System and a Knowledge System]
  • Decision-Making Systems:
  • Decisions are the “finished products” of a decision-making system (DMS).
  • In a DMS, the user is the decision maker and the information processing system is the decision support system (DSS).
  • Taken together, the DSS and its user constitute a DMS.
  • Service of a National Application:
  • The DMS activates its DSSs to obtain necessary information.
  • The available DSSs – a DMS may incorporate multiple DSSs – search for relevant problem-domain information in the KS(s) and produce outcome decision support information in the PPS(s).
  • These outcomes are passed to the DMS, where the decision maker considers the information provided together with other pertinent information and makes an appropriate decision.
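The DMS/DSS/KS/PPS relationship just described can be sketched as a minimal object model. This is an illustrative sketch only; the class and method names are invented here, not taken from the guidebook.

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeSystem:
    """KS: the body of knowledge available about a problem domain."""
    facts: dict = field(default_factory=dict)

    def search(self, query: str):
        # Return any stored knowledge relevant to the query.
        return self.facts.get(query)

@dataclass
class DecisionSupportSystem:
    """DSS: pairs a PPS (here, just the process() method) with its KS."""
    ks: KnowledgeSystem

    def process(self, problem: str):
        # PPS: combine a problem statement with KS content to
        # produce decision-support information.
        knowledge = self.ks.search(problem)
        return f"support info for {problem!r}: {knowledge}"

@dataclass
class DecisionMakingSystem:
    """DMS: the human decision maker plus one or more DSSs."""
    decision_maker: str
    dss_list: list = field(default_factory=list)

    def decide(self, problem: str):
        # The DMS activates every available DSS; the outcomes go
        # back to the decision maker, who weighs them with other
        # pertinent information to reach a decision.
        outcomes = [dss.process(problem) for dss in self.dss_list]
        return {"decider": self.decision_maker, "evidence": outcomes}
```

The key structural point the sketch captures is that the DSS is a component of the DMS, not the other way around: the human plus the DSS together form the decision-making system.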

  4. How About Informing the User – OTAG Experience
  • A generalized model for a DMS contains five fundamental parts:
  • (1) the User, who formulates the problem and produces a decision from the DSS output and other pertinent information,
  • (2) the Input Interface, designed to facilitate HCI (human-computer interaction),
  • (3) the Problem Processing System (PPS), which takes an input problem statement and information from the Knowledge System (KS) to produce information that supports (enhances, or makes possible) a decision process,
  • (4) the KS, which contains the body of knowledge available about a problem domain (databases, background and historical information, problem-domain knowledge, algorithms, etc.), and
  • (5) the Output Interface, which lets the user retrieve the processed information in an understandable, visualizable format.

  5. The Knowledge System (KS)
  • The KS is where the complete domain of knowledge about a particular problem, or class of problems, resides.
  • The KS is where NASA will focus the majority of its DSS assimilation and enhancement work (as shown in the figure), whereas the partnering agency, in upgrading its DSS to the enhanced operational State 2, will concentrate its efforts on the interfaces and the PPS.
  • Note that this means that, to provide a DSS enhancement, NASA’s assimilation result must function within the partnering agency’s DMS. This burdens the NASA effort with understanding the high-level needs and requirements – the operating environment – and the low-level status of both the DSS and the DMS.

  6. Information Needs
  • The low-level status information needed includes, but is not limited to:
  • (1) data types and sources,
  • (2) data formats,
  • (3) spatial, spectral, and temporal resolution,
  • (4) spatial and temporal coverage,
  • (5) data cost,
  • (6) availability and timeliness of data,
  • (7) volume of data to process, and
  • (8) geospatial, numerical, and thematic accuracy.
  • The high-level considerations include knowing (each briefly discussed in the following):
  • (1) the framework,
  • (2) the knowledge type,
  • (3) the management paradigm, and
  • (4) the decision-making typology.

  7. A: Problem Framework
  • Framework—Use of a “framework,” or matrix, describing management decision-making activities is common in the literature. The table proposes a decision-support framework matrix for use in this DSS work.
  • The degree of problem complexity is listed across the top. The degree of structure – with the continuum of structure quantized into the three choices listed – is along the left. Cell values are particular decision types. Note that it is the decision context that is unstructured, not the DSS itself.
  • The more structured a problem, the more amenable it is to solution by computerized algorithms in a DSS. The simplest case is the structured/modeled problem, noted as a routine decision type.
  • The more unstructured a problem, the more a decision must rely on human judgment and intuition. Thus, the unstructured/novel problem is noted as requiring a custom-tailored solution.
  • A semi-structured problem has both structured and unstructured phases and elements. As the decision-making system is developed, the intent is to make it less idiosyncratic and more consistent – to move decisions from the bottom right toward the upper left of the framework.
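The framework can be read as a lookup from a (structure, complexity) pair to a decision type. A minimal sketch follows; only the two cells actually named in the text are filled in, and the remaining cells would come from the full matrix, which is not reproduced here.

```python
# Decision-support framework: (degree of structure, problem complexity)
# -> decision type. Only the cells stated in the text are filled in;
# the rest are placeholders for the full framework matrix.
FRAMEWORK = {
    ("structured", "modeled"): "routine decision",
    ("unstructured", "novel"): "custom-tailored solution",
}

def decision_type(structure: str, complexity: str) -> str:
    # Unknown cells fall through to a placeholder answer rather
    # than a guess at the table's contents.
    return FRAMEWORK.get((structure, complexity),
                         "see full framework matrix")

print(decision_type("structured", "modeled"))  # routine decision
```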

  8. B: Types of Knowledge
  • Types of Knowledge—Different types of knowledge are recognized in the literature, each representing a different management challenge. The figure numbers them in opposing pairs: explicit (1) vs. tacit (2), declarative (3) vs. procedural (4), esoteric (5) vs. exoteric (6), and deep (7) vs. shallow (8). Briefly, the types of knowledge are:
  • Tacit knowledge—is contained within a person’s head and is difficult to express,
  • Explicit knowledge—can be readily articulated, written down, and shared,
  • Declarative knowledge—consists of facts about the state of the world,
  • Procedural knowledge—involves “how to” do something,
  • Esoteric knowledge—is highly specialized and applicable to narrow domains,
  • Exoteric knowledge—is applicable to broad domains and might be considered “common sense”,
  • Deep knowledge—consists of formal theories of the behavior of phenomena,
  • Shallow knowledge—is found in social domains and is less well organized and codified than in scientific domains.

  9. C: Knowledge Management Paradigm
  • Knowledge Management Paradigms—The following three perspectives on knowledge management span the spectrum of choice:
  • Functional paradigm—the one most often adopted in practice, especially in the software industry,
  • Interpretive paradigm—consists of fostering communication between individuals, sharing and enriching interpretations, and coordinating actions,
  • Critical paradigm—concerned primarily with social conflict and antagonistic relationships among various stakeholders and special-interest groups.

  10. D: Decision Making System
  • Typology—Over the past 50 years, decision-making and decision-making environments have been studied extensively. In the classical literature, there are five types of decision-making systems, or inquiring organizations: Leibnizian (Type 1), Lockean (Type 2), Kantian (Type 3), Hegelian (Type 4), and Singerian (Type 5), each based on the philosophy of its namesake. Decision-making systems can be classified analogously; thus, we adopt these definitions for characterizing DSSs but use the more generic labels Type 1 – Type 5. A brief description of each follows:
  • Type 1—Analytic–Deductive Approach—creates knowledge by using formal logic and mathematical analysis to make inferences about cause-and-effect relationships, and maintains that everything needed to solve a problem is already contained within its boundaries. As closed systems, Type 1 systems have access only to knowledge generated internally.
  • Type 2—Consensual–Inductive Approach—creates knowledge from empirical information gathered from external observations and used inductively to build a representation of the world, with a set of labels (or properties) assigned to the observations. The decision style is clearly group-oriented and open.
  • Type 3—Empirical–Theoretical Approach—recognizes that there may be many different perspectives on a problem, or at least many different ways of modeling it. All the perspectives, however, rest on the belief that problems can be modeled analytically. The system ultimately chooses the model that best explains the data.
  • Type 4—Conflict–Synthesis Approach—is based on the belief that the most effective way to create knowledge is by observing a debate between two diametrically opposed viewpoints about a topic, a thesis and an antithesis, from which a synthesis is constructed as the worldview.
  • Type 5—Multiple Perspectives–Holistic Approach—views the world as a holistic system in which everything is connected to everything else. Solving complex problems may require knowledge from any source and knowledgeable people from any discipline or profession. The multiple-perspectives approach does not end with the technical, organizational, and personal perspectives; it also explicitly brings ethics and aesthetics into play.

  11. Classification Concept

  12. Decision-Making System Classification
  • The figure illustrates a straightforward A-B-C-D upper-level classification scheme for a decision-making system and its DSS that will be used to characterize all States of a DMS. It provides an explicit representation of a classification based on the scheme presented above in the section on Decision-Making Considerations. It encapsulates the problem framework, the types of knowledge, the knowledge management paradigm, and the typology of the decision-making system, making a DMS simple and straightforward to classify. A description of each category follows:

  13. Example - PECAD
  • As an example of the use of this classification scheme, the figure illustrates a possible State 1 classification of a DMS. This example classification shows the system to:
  • function within the decision-support framework as a semi-structured/complex problem [A = 5],
  • require explicit, declarative, and esoteric knowledge [B = 1, 3, 5],
  • operate in the interpretive knowledge management paradigm [C = 2], and
  • be embedded in a consensual decision-making environment [D = 2].
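The example classification can be written down as a simple record using the numeric codes from the A–D categories above. The encoding below is an illustrative sketch, not a format the guidebook prescribes.

```python
# Code tables from the classification scheme:
#   B: 1=explicit, 2=tacit, 3=declarative, 4=procedural,
#      5=esoteric, 6=exoteric, 7=deep, 8=shallow
#   C: 1=functional, 2=interpretive, 3=critical
#   D: 1..5 = Type 1 (Leibnizian) .. Type 5 (Singerian)
state1 = {
    "A": 5,            # semi-structured / complex problem
    "B": {1, 3, 5},    # explicit, declarative, esoteric knowledge
    "C": 2,            # interpretive knowledge-management paradigm
    "D": 2,            # consensual (Type 2) decision-making environment
}

# Decode the knowledge-type codes back into labels.
B_LABELS = {1: "explicit", 3: "declarative", 5: "esoteric"}
print(sorted(B_LABELS[b] for b in state1["B"]))
# ['declarative', 'esoteric', 'explicit']
```

A record like this would let the State 1 and State 2 classifications of the same DMS be compared field by field, which is what the qualitative, system-level deltas discussed later amount to.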

  14. Benchmarking

  15. Benchmarking definition: Within the scope of the DSS assimilation activity, benchmarking is defined as: a continuous, systematic process for evaluating the algorithms and the input and output products of a DSS for the purposes of comparison and enhancement of the DSS. (after Spendolini, 1992)

  16. [Figure] (After Andersen and Petterson, 1996)

  17. Approach

  18. [Process-flow figure: preliminary work followed by assimilation work]
  • Preliminary: Identify DSS (Federal agency, point of contact); Conduct Zero-Order Assessment (review current state, define requirements, consider upgrade plans, evaluate DMS); Examine DSS (characterize current state); Conduct risk analysis (use DDP Tool Suite, consider models/data, identify assimilation potential); Obtain agreement (NASA, Federal partner); Initiate DSS assimilation.
  • Assimilation work: Start benchmark process; Determine NASA assimilation components (data, science results, observations); Develop enhanced DSS (assimilation, V&V, operations); Examine enhanced DSS (characterize State 2); Measure improvements (State 1 – State 2); Complete benchmark process (apply metrics, determine lessons learned, prepare final report).
  • Inputs to the flow: State 0 development plans; State 1 upgrade plans.

  19. Four Steps
  • The figure illustrates the approach recommended in this booklet for the assimilation process.
  • This generalized approach can be broken into four steps, flowing from Step 1 to Step 4.
  • The process need not be strictly serial; some degree of parallel activity is expected to yield cost savings and reduce the time needed to complete the enhancement of a DSS.
  • The four steps are:
  • (1) establish the need and develop relationships;
  • (2) characterize the current DSS status (known as State 1 or State 0);
  • (3) identify possible NASA solutions and develop an enhanced DSS (known as State 2); and
  • (4) measure improvements and transition the enhanced DSS to operations.
  • These steps define the approach for this booklet, draw from the figure, and are discussed in the following.

  20. The First Step
  • The first step is to conduct the set of tasks shown as the preliminary work on the left side of the figure, thereby developing the basis for an agreement with another Federal agency to upgrade its DSS.
  • First step—five tasks to establish the need and develop relationships:
  • Identify DSS
  • Form assimilation team
  • Conduct “zeroth-order” feasibility assessment
  • Conduct risk analysis
  • Obtain agreement

  21. The Second Step
  • The second step is to initiate the assimilation process by formally characterizing the current operational state (State 1) of a candidate DSS and benchmarking its performance. In the figure, this is represented in part by the second vertical column from the left, consisting of four blocks.
  • Second step—four tasks to characterize the current DSS status:
  • Start benchmark process
  • Form benchmark team
  • Characterize the DSS current status
  • Determine DSS objectives

  22. The Third Step
  • Third step—eleven tasks to identify possible NASA solutions and develop an enhanced DSS:
  • Plan for DSS enhancement for State 2
  • Determine needs/requirements for DSS upgrade/enhancement
  • Prepare requirements document
  • Brief program manager
  • Solicit existing results
  • Determine suitable NASA capabilities
  • Propose assimilation plans
  • Develop enhanced DSS—State 2
  • Conduct verification & validation
  • Perform DSS demonstrations
  • Transition/benchmark testing
  • The third step has three major goals:
  • (1) identify possible NASA results,
  • (2) develop an enhanced DSS, and
  • (3) transition it to operations.
  • In the normal case, completing this step may require significant resources and a major effort expended over multiple fiscal years.

  23. The Fourth Step
  • The fourth step is to measure the improvements achieved in the assimilation process and to transition the enhanced DSS to operations.
  • Refer back to the general assimilation-process flow figure one last time. The bottom three blocks represent the end of the assimilation activity for a particular DSS:
  • (1) Examine enhanced DSS,
  • (2) Measure improvements, and
  • (3) Complete benchmark process.
  • The first, second, and third tasks of this step, respectively, address those blocks.
  • Fourth step—five tasks to measure the improvements achieved and to transition the enhanced DSS to operations:
  • Characterize the State 2 DSS
  • Evaluate metrics (States 1 & 2)
  • Complete benchmark process
  • Document results
  • Transfer responsibility

  24. Types of Outcomes (Deltas)
  • Quantitative, component level:
  • Accuracy
  • Utility
  • Cost
  • Scope
  • Qualitative, system level:
  • Problem framework (more structured)
  • Knowledge management paradigm (more functional)
  • Types of knowledge
  • Type of DMS
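A quantitative, component-level delta is simply the change in each benchmarked metric between the State 1 and State 2 evaluations. A minimal sketch follows; the metric names and values are made up for illustration and are not from the guidebook.

```python
def benchmark_deltas(state1: dict, state2: dict) -> dict:
    """Compute State 2 minus State 1 for every metric measured in both states."""
    return {m: state2[m] - state1[m] for m in state1 if m in state2}

# Hypothetical benchmark results for one DSS component.
state1 = {"accuracy": 0.78, "cost_per_run": 120.0}
state2 = {"accuracy": 0.85, "cost_per_run": 95.0}

deltas = benchmark_deltas(state1, state2)
# accuracy delta is about +0.07; cost_per_run delta is -25.0
```

The qualitative, system-level deltas (framework position, knowledge types, paradigm, DMS type) would instead be read off as before/after changes in the A-B-C-D classification record.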

  25. End of Process?
