
Lesson Distribution Gap



Presentation Transcript


  1. Lesson Distribution Gap David W. Aha Rosina Weber Héctor Muñoz-Avila Leonard A. Breslow Kalyan Moy Gupta Navy Center for Applied Research in Artificial Intelligence, Naval Research Laboratory Booth # 214

  2. Outline • Introduction • Contributions • Context: • lessons learned systems, process, organizations • Lesson distribution gap • How to bridge this gap? Monitored Distribution • Example • Evaluation, Results • Next Steps Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA

  3. Contributions • Describe lessons learned process • Identify gap in lesson distribution • Propose Monitored Distribution • Test hypothesis in evaluation • Monitored Distribution can improve plan quality • Plan evaluator

  4. Knowledge management context • Three types of KM initiatives • knowledge repositories • knowledge access and transfer • knowledge environment (from Davenport & Prusak (1998): Working Knowledge) • Types of knowledge repositories • industry oriented (alert systems, best practices) • organization oriented (lessons learned systems) • for example…

  5. Organizations with lessons learned systems • non-government (industry): Construction Industry Inst., Honeywell, GM, Hewlett Packard, Bechtel Jacobs Company, Lockheed Martin E. Sys, Inc, DynMcDermott Petroleum Co., Xerox, IBM, BestBuy, Siemens • government, non-military (US: Department of Energy: SELLS, NASA (Ames, Goddard); int'l: European Space Agency, Italian (Alenia), French (CNES), Japanese (NASDA), United Nations) • government, military (US: Air Force, Army, Coast Guard, Joint Forces, Marine Corps, Navy; int'l: Canadian Army Lessons Learned Centre)

  6. Lessons learned systems • Lessons learned systems are repositories of a knowledge artifact called lessons learned

  7. Lessons learned definition… …or organizational lessons, lessons, lessons identified. Definition: “A lesson learned consists of knowledge or understanding gained by experience. The experience may be positive, as in a successful test or mission, or negative, as in a mishap or failure. A lesson must be significant in that it has a real or assumed impact on operations; valid in that it is factually and technically correct; and applicable in that it identifies a specific design, process, or decision that reduces or eliminates the potential for failures and mishaps, or reinforces a positive result.” (Secchi et al., 1999)

  8. Lessons learned process: RETRIEVE • REUSE • REVISE • RETAIN

  9. Lessons learned representation • indexing elements (case problem): applicable task, preconditions • reuse elements (case solution): lesson suggestion, rationale
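The case structure above can be sketched as a small data type; the class and field names below are illustrative assumptions, not HICAP's actual representation:

```python
from dataclasses import dataclass

@dataclass
class Lesson:
    """A lesson learned stored as a case (field names are hypothetical)."""
    applicable_task: str  # indexing element: the task the lesson applies to
    preconditions: str    # indexing element: when the lesson is relevant
    suggestion: str       # reuse element: the recommended action
    rationale: str        # reuse element: why the suggestion matters

# The Best Buy example from slide 11, encoded as a case:
speaker_lesson = Lesson(
    applicable_task="Installing custom stereo speakers",
    preconditions="The car is the Porsche Boxster",
    suggestion="Distinguish the speaker wires from the side-airbag wires",
    rationale="Cutting the wrong wire can set off the airbag",
)
```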

  10. [figure-only slide]

  11. Lessons learned example • applicable task: Installing custom stereo speakers. • preconditions: The car is the Porsche Boxster. • lesson suggestion: Make sure you distinguish the wires leading to the speakers from the wires leading to the side airbag. • rationale: Somebody cut the wrong wire because the wires look alike, and the airbag went off with explosive force. This means spending several thousand dollars to replace the airbag, in addition to being a potential hazard. From the article “Learning from Mistakes” about Best Buy in Knowledge Management magazine, April 2001.

  12. Lessons learned process

  13. Lesson distribution methods • Push: broadcasting (bulletins, doctrine), active casting (list servers, information gathering tools) • Pull: passive (standalone repository)

  14. Problems with lesson distribution methods • Distribution is divorced from targeted organizational processes. • Users may not know or be reminded of the repository, as they need to access a standalone tool to search for lessons. • Users may not be convinced of the potential utility of lessons. • Users may not have the time and skills to retrieve and interpret textual lessons. • Users may not be able to apply lessons successfully.

  15. Organization’s members Repository of lessons learned Here is the gap Organizational processes

  16. Organization’s members Repository of lessons learned How to bridge this gap? Organizational processes


  25. Organization’s members Repository of lessons learned Monitored distribution Organizational processes

  26. Organization’s members Repository of lessons learned Monitored distribution • Lesson repository is in the same context as targeted processes • Organizational processes

  27. Problems & Solutions • Distribution is divorced from targeted organizational processes. • Users need to access a standalone tool to search for lessons. • Solutions: Lessons are distributed to users in the context of the organizational processes. Users don’t need to access a standalone tool. • Distributing lessons in the same context does not suffice!

  28. Problems & Solutions • Users may not have the time and skills to retrieve relevant lessons. • Users may not be convinced of the potential utility of lessons. • Users may not be able to apply lessons successfully. • Intrusive methods may cause more problems than solutions. • Solutions: No significant additional time or skills are required. Users can assess the potential utility of lessons easily. Whenever possible, an ‘apply’ button allows the lesson to be automatically executable. Distribution is tightly integrated with the targeted processes.

  29. Monitored Distribution Characteristics • Distribution is tightly integrated with the targeted processes so that lessons are distributed when and where they are needed. • Represent lessons as cases (knowledge modeling). • Lessons are indexed by their applicability. • Additional benefits: • Case representation facilitates interpretation. • Users assess potential utility with lesson rationale. • Whenever possible, an ‘apply’ button allows the lesson to be automatically executable.
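The matching step can be sketched as follows, assuming lessons are indexed by an applicable task and a set of precondition facts; the dictionary keys and matching rule are assumptions for illustration, not HICAP's actual interface:

```python
def monitored_distribution(lessons, current_task, state):
    """Return the lessons whose applicable task equals the user's current
    task and whose preconditions all hold in the current planning state."""
    return [
        lesson for lesson in lessons
        if lesson["task"] == current_task
        and lesson["preconditions"] <= state   # subset test on a set of facts
    ]

lessons = [
    {"task": "select forces",
     "preconditions": {"clandestine SOF selected"},
     "suggestion": "Clandestine SOF should not be used alone",
     "rationale": "The enemy might infer that SOF are involved, exposing them."},
]

# When the user refines the "select forces" task, the matching lesson surfaces
# without any standalone search:
hits = monitored_distribution(lessons, "select forces",
                              {"clandestine SOF selected", "NEO plan"})
```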

  30. Noncombatant Evacuation Operations (NEO) • Military operations to evacuate noncombatants whose lives are in danger and move them to a safe haven

  31. [diagram: NEO site, assembly point, intermediate staging base, safe haven, campaign headquarters]

  32. Example in HICAP • HICAP is a plan authoring tool suite • Users interact with HICAP by refining an HTN (hierarchical task network) through decompositions • http://www.aic.nrl.navy.mil/hicap • Muñoz-Avila et al., 1999
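HTN refinement can be illustrated with a toy method table; the tasks and methods below are invented for illustration, and HICAP's actual task networks are far richer:

```python
# Toy HTN: each compound task maps to an ordered list of subtasks.
methods = {
    "evacuate NEO site": ["assemble evacuees", "transport to safe haven"],
    "transport to safe haven": ["select transport mode", "move evacuees"],
}

def refine(task, plan):
    """Recursively decompose a task; tasks with no method are primitive steps."""
    if task not in methods:
        plan.append(task)
        return plan
    for subtask in methods[task]:
        refine(subtask, plan)
    return plan

steps = refine("evacuate NEO site", [])
# steps == ["assemble evacuees", "select transport mode", "move evacuees"]
```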

  33. [map: safe haven and NEO site]

  34. Selecting the Suggested Case…

  35. Expanding yields…

  36. And the user is notified of a lesson • RATIONALE: TYPE: advice • Clandestine SOF should not be used alone • WHY: The enemy might be able to infer that SOF are involved, exposing them.

  37. After applying the lesson

  38. Evaluation • Hypothesis • Using lessons will improve plan quality • Methodology • Simulated HICAP users generated NEO plans with and without lessons • Plan evaluator implemented plans • Plan total duration • Plan duration before medical assistance • Casualties: evacuees, friendly forces, enemies

  39. Plan evaluator • non-deterministic (100 plans, 10 runs each) • 30 variables, 12 random (e.g., weather, airports) • plan length: 18 steps (e.g., transportation mode, supplies, team) • planning space size: 3,000,000 • 13 actual lessons

  40. Plan implementation • Plans where evacuees were transported by land modes have an increased chance of being attacked by enemies. • When an attack happens it increases the number of casualties among evacuees and friendly forces (in proportion to # of evacuees).
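A toy version of that non-deterministic model, as a sketch; the probabilities and casualty rate below are invented, and the paper's evaluator draws on 30 variables rather than two:

```python
import random

def implement(plan, rng):
    """One run of a plan: land transport raises the chance of an enemy attack,
    and an attack causes casualties in proportion to the number of evacuees.
    (Illustrative numbers only.)"""
    attack_prob = 0.5 if plan["transport"] == "land" else 0.1
    if rng.random() < attack_prob:
        return int(0.05 * plan["evacuees"])  # casualties from the attack
    return 0

rng = random.Random(0)                 # seeded so the sketch is reproducible
plan = {"transport": "land", "evacuees": 200}
runs = [implement(plan, rng) for _ in range(10)]  # each plan is run 10 times
average_casualties = sum(runs) / len(runs)        # results reported as averages
```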

  41. Results (*values are averages)

                                       no lessons   with lessons   reduction
  NEO plan total duration*             39h50        32h48          18 %
  duration until medical assistance*   29h37        24h13          18 %
  casualties among evacuees            11.48        8.69           24 %
  casualties among friendly forces     9.41         6.57           30 %
  casualties among enemies             3.08         3.14           -2 %
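The reduction column follows from the averages in the two conditions; a quick check, computing each reduction as (without lessons - with lessons) / without lessons:

```python
def reduction_pct(no_lessons, with_lessons):
    """Percentage reduction relative to the no-lessons condition."""
    return round(100 * (no_lessons - with_lessons) / no_lessons)

assert reduction_pct(39 + 50/60, 32 + 48/60) == 18  # total duration, in hours
assert reduction_pct(29 + 37/60, 24 + 13/60) == 18  # time to medical assistance
assert reduction_pct(11.48, 8.69) == 24             # evacuee casualties
assert reduction_pct(9.41, 6.57) == 30              # friendly-force casualties
assert reduction_pct(3.08, 3.14) == -2              # enemy casualties rose slightly
```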

  42. Next Steps • Collection tool • Verify methods, reasoning • Integration of informal groups’ and users’ individual features • Evaluation with human subjects (simulated users in HICAP), letting human subjects decide on applying lessons • Extend MD to other decision support systems and other knowledge artifacts • Investigate distribution of experiential knowledge with training knowledge

  43. David W. Aha, Navy Center for Applied Research in Artificial Intelligence, Naval Research Laboratory, Washington, DC • Rosina Weber, Department of Computer Science, University of Wyoming (Fall 2001 at Drexel University, PA) • Héctor Muñoz-Avila, Department of Computer Science, University of Maryland (Fall 2001 at Lehigh University, PA) • Leonard A. Breslow, Navy Center for Applied Research in Artificial Intelligence, Naval Research Laboratory, Washington, DC • Kalyan Moy Gupta, IIT Industries, AES Division, Alexandria, Virginia • Questions?

  44. [figure-only slide]

  45. [diagram: CBR Cycle and Knowledge Processes, with a distribute step added] (Aamodt & Plaza, 1994)

  46. Discussion • Intrusive methods require good precision • Knowledge representation is costly, and so are lives! • What’s the worth of 35,000 unused lessons? • Knowledge representation can also support validation • Good news: collect lessons into case representation.

  47. Plan evaluator: example lesson • applicable task: Assign security element. • conditions for applicability: There are hundreds or more evacuees, enough to justify a security effort. • lesson suggestion: Recommend that EOD* personnel are utilized in the security element. • rationale: Success. EOD two DET ten personnel were employed in a force protection role and assisted USS Nassau security teams in identifying and investigating suspect items brought aboard by evacuees. • *EXPLOSIVE ORDNANCE DISPOSAL

  48. Clarification • How is monitored distribution (MD) different from Clippie? • In MD, case base task is applicability • MD distributes experiential knowledge collected from users in similar roles as the potential reuser • Clippie is activated by single word • Clippie distributes general instructions/information

  49. Additional lesson • Conditions for applicability: There are representatives of different branches assigned to participate. • Lesson suggestion: Assign representatives of all forces to plan. • Rationale: Lack of representatives prevents good communication, causing delays and miscommunication.
