
Multimodal Intelligent Interactive Development Environment





  1. Multimodal Intelligent Interactive Development Environment: Application Programming Interface, 8/12/2009

  2. System Overview
  (architecture diagram) The MIIDE Whiteboard (MIDOS) gets context from Model & Knowledge Management, which holds Task Models, Device Models, Context Models, Preferences & Constraints, and Inference Rules. The Model Reasoner evaluates requests and returns results & rationale; Deploy and Simulate (PAMS LTAE Simulator) returns validation results.

  3. MIIDE Plug-In Architecture
  • MIDOS – user interaction core
  • MKM – models and tools
  • Continuous two-way interaction between MIDOS and MKM
  • Domain-specific plug-ins
  (diagram labels: MIIDE, MIDOS, Model & Knowledge Management, Model Compiler, Simulator/Validator, Generated Artifact, domain meta model, user models; domain-independent vs. domain-specific perspectives)

  4. Challenges
  • Incremental interaction between MIDOS and MKM to guide the user
  • Intelligently narrowing down the future options
  • Allowing individual users to develop domain dialects for interacting with MIDOS
  • Evolving the domain meta model

  5. Component API
  (diagram) The MIIDE Whiteboard (MIDOS) calls getOntology(), evaluate(), and deploy() on Model & Knowledge Management; the Model Reasoner evaluates against the models (S-expression, Ontology.xsd), and deploy() hands off to the simulators (PAMS LTAE, SDR Simulator (Testsys)).
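The three calls named on this slide can be sketched as a single interface. A minimal sketch, assuming a Python rendering: the method names (getOntology, evaluate, deploy) come from the slides, but the signatures and return shapes are illustrative assumptions, not the real API.

```python
from abc import ABC, abstractmethod
from typing import Any, List


class ModelKnowledgeManagement(ABC):
    """Hypothetical sketch of the MKM-facing component API."""

    @abstractmethod
    def getOntology(self) -> Any:
        """Return the ServiceDescription (the domain objects) for MIDOS."""

    @abstractmethod
    def evaluate(self, request: Any, context: Any) -> List[Any]:
        """Return 0..n valid configurations with reasons."""

    @abstractmethod
    def deploy(self, configuration: Any) -> bool:
        """Hand a configuration to the simulator; report whether it is valid."""
```

A concrete subclass would wrap the actual reasoner; MIDOS would only depend on this interface.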

  6. getOntology()

  7. getOntology() results
  • An instance of a ServiceDescription defines all of the domain objects to be used by MIDOS.
  • A set of components defines a system that will provide one or more capabilities.

  8. Service Description
  (schema fields) typesOfCapabilities, typesOfEnvironments, typesOfResources

  9. Service Description – Components
  (schema fields) typesOfComponents; per component: environmentConstraints, resourcesSupplied, resourcesUsed
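The ServiceDescription fields named on the two slides above can be pictured as plain data classes. This is an assumption about the shape of the schema (the real definition lives in Ontology.xsd); only the field names are taken from the slides.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Component:
    """One deployable component, per the Components slide (shape assumed)."""
    name: str
    environmentConstraints: Dict[str, str] = field(default_factory=dict)
    resourcesSupplied: Dict[str, float] = field(default_factory=dict)
    resourcesUsed: Dict[str, float] = field(default_factory=dict)


@dataclass
class ServiceDescription:
    """Domain objects returned by getOntology() (shape assumed)."""
    typesOfCapabilities: List[str] = field(default_factory=list)
    typesOfEnvironments: List[str] = field(default_factory=list)
    typesOfResources: List[str] = field(default_factory=list)
    typesOfComponents: List[Component] = field(default_factory=list)
```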

  10. getOntology() results: CapabilityGroup
  (capability-group diagram) Mission Type: Kinetic (Weapon Delivery: Yes; Precision Guidance: Yes), Recon, Close Air Support (Video Recon: Standard; Retransmission: Yes), Destroy (Target Track: Yes). Audio: Audio Recon: Yes; Onboard Storage: Yes. Video: High-Definition (HD Video Recon: Yes; HD Onboard Storage: Yes), Standard (Video Recon: Yes; Retransmission: Yes).
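A small illustration of a CapabilityGroup as the diagram above suggests: a named group whose entries map capability names to values, with optional subgroups. The "Video" and "Audio" entries mirror the slide; the nested-dict representation itself is an assumption.

```python
# Hypothetical dict rendering of two capability groups from the slide.
capability_groups = {
    "Video": {
        "High-Definition": {"HD Video Recon": "Yes", "HD Onboard Storage": "Yes"},
        "Standard": {"Video Recon": "Yes", "Retransmission": "Yes"},
    },
    "Audio": {"Audio Recon": "Yes", "Onboard Storage": "Yes"},
}

# Look up whether the Standard video subgroup supports retransmission.
print(capability_groups["Video"]["Standard"]["Retransmission"])  # Yes
```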

  11. evaluate()
  evaluate() will be called each time the user selects or removes a capability (either implicitly or explicitly). It returns to MIDOS 0..n valid configurations with reasons, which allows MIDOS to tailor the information it presents to the user.

  12. evaluate() input: Request and Context
  Each request has a set of required components and capabilities and a set of weighted capability preferences. The context can contain values for each resource (Power, Weight, etc.) and values for the environment (Weather: Cloudy, Terrain: Mountain, etc.).
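The inputs just described can be sketched as two small data classes. Field names follow the slide vocabulary (requiredComponents, requiredCapabilities, capabilityPreferences, resources, environment); the concrete Python types and the example values are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Request:
    """What the user requires and prefers (shape assumed)."""
    requiredComponents: List[str] = field(default_factory=list)
    requiredCapabilities: List[str] = field(default_factory=list)
    # capability name -> preference weight
    capabilityPreferences: Dict[str, float] = field(default_factory=dict)


@dataclass
class Context:
    """Resource and environment values for evaluation (shape assumed)."""
    resources: Dict[str, float] = field(default_factory=dict)    # e.g. Power, Weight
    environment: Dict[str, str] = field(default_factory=dict)    # e.g. Weather, Terrain


# Example values echoing the slide's vocabulary (figures assumed).
req = Request(requiredCapabilities=["Video Recon"],
              capabilityPreferences={"Retransmission": 0.8})
ctx = Context(resources={"Power": 100.0, "Weight": 50.0},
              environment={"Weather": "Cloudy", "Terrain": "Mountain"})
```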

  13. evaluate() results
  The ReasoningResults returned from the evaluate() call will contain an ordered list of 0..n results, plus either an explanation of why no configuration was selected or a summary of why the n configurations were selected. Each result will contain:
  • validConfiguration: a configuration containing all of the required capabilities and components, and possibly some of the preferred capabilities.
  • comparisons: 0..n comparisons between the validConfiguration and other configurations, detailing the differences between the two and why the validConfiguration ranks higher.
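The result structure above can likewise be sketched as data classes. validConfiguration, comparisons, and ReasoningResults are named on the slide; the fields of a Comparison and all concrete types are assumptions.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Comparison:
    """Why validConfiguration outranks another configuration (fields assumed)."""
    otherConfiguration: str
    difference: str
    whyHigher: str


@dataclass
class Result:
    # All required capabilities/components, possibly plus preferred ones.
    validConfiguration: List[str]
    comparisons: List[Comparison] = field(default_factory=list)


@dataclass
class ReasoningResults:
    results: List[Result] = field(default_factory=list)  # ordered, best first
    explanation: str = ""  # why none was selected, or a summary of the n selected
```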

  14. Storyboard Example The user requests to add image filtering and sets the priority of the capabilities.

  15. Example Request
  (diagram fields) requiredComponents, requiredCapabilities, capabilityPreferences (left, right)

  16. Example Context
  (diagram fields) resources, environment
  Other resource constraints will be determined by what is supplied by the selected components in a configuration. In this example, the system might know that an additional 20 Watts is being provided by solar panels mounted on the Predator.
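The resource bookkeeping described above can be shown with a tiny worked example: available power is the base context resource plus what the selected components supply (the 20 W solar figure is from the slide; the other numbers are assumed), and a configuration is resource-feasible only if component draw stays within that total.

```python
from typing import Dict


def power_feasible(base_power: float,
                   supplied: Dict[str, float],
                   used: Dict[str, float]) -> bool:
    """True if total draw fits within base power plus component-supplied power."""
    available = base_power + sum(supplied.values())
    return sum(used.values()) <= available


base = 100.0                           # Watts from the platform (assumed figure)
supplied = {"solar panels": 20.0}      # extra supply from a selected component
used = {"camera": 60.0, "radio": 55.0} # assumed component draws

print(power_feasible(base, supplied, used))  # 115 W used <= 120 W available -> True
```

Without the solar panels the same configuration would draw 115 W against a 100 W budget and be rejected.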

  17. Reasoning Results
  (diagram) Result 1 and Result 2 shown side by side.

  18. deploy()
  deploy() can be called multiple times to test different configurations. Each call presents a simulation to the user and runs generated unit tests that report whether the configuration is valid.
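Usage of deploy() as described above might look like the following sketch: try several candidate configurations and keep the ones whose simulation and generated unit tests pass. The `deploy` callable and the stub acceptance rule here are stand-ins, not the real MKM call.

```python
from typing import Callable, List


def pick_valid(configurations: List[List[str]],
               deploy: Callable[[List[str]], bool]) -> List[List[str]]:
    """Deploy each candidate; keep those the simulator reports as valid."""
    valid = []
    for config in configurations:
        if deploy(config):  # runs the simulation and generated unit tests
            valid.append(config)
    return valid


# Stub deploy for illustration: accept configurations with a ground station.
stub_deploy = lambda cfg: "ground station" in cfg

print(pick_valid([["uav"], ["uav", "ground station"]], stub_deploy))
# [['uav', 'ground station']]
```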

  19. Possible Extensions
  • Add a new CapabilityGroup
  • Reason about preferences at the CapabilityGroup level
  • Allow the user to add a new capability and use the PAMS DTA for SDR to dynamically generate valid configurations
  • Use the DESERT domain constraints to provide better reasons why configurations are not valid
  • Drive dialog from the set of potential configurations
