  1. VACE Executive Brief for MLMI • Dennis Moellman, VACE Program Manager

  2. Briefing Outline • Introduction • Phase II • Evaluation • Technology Transfer • Phase III • Conclusion

  3. Introduction What is ARDA/DTO/VACE? • ARDA – Advanced Research and Development Activity • A high-risk/high-payoff R&D effort sponsored by the US DoD/IC • ARDA is taking on a new identity • In FY2007, under the DNI • Reports to: ADNI(S&T) • Renamed: Disruptive Technology Office • VACE – Video Analysis and Content Extraction • A three-phase initiative begun in 2000 and ending in 2009 • Winding down Phase II • Entering Phase III

  4. Context Video Exploitation Barriers • Problem Statement: • Video is an ever-expanding source of imagery and open-source intelligence, and it now commands a place in all-source analysis. • Research Problem: • Robust software automation tools to assist human analysts are lacking: • Human operators must manually monitor video signals • Human intervention is required to annotate video for indexing • Content-based routing driven by automated processing is lacking • Flexible ad hoc search and browsing tools do not exist • Video Extent: • Broadcast News; Surveillance; UAV; Meetings; and Ground Reconnaissance

  5. Research Approach Video Exploitation • Research Objectives: • Basic technology breakthroughs • Video analysis system components • Video analysis systems • Formal evaluations: procedures, metrics, and data sets • Evaluate Success: Quantitative Testing

     Metric   | Current     | Need
     Accuracy | < Human     | >> Human
     Speed    | > Real time | << Real time

  • Technology Transition: • Over 70 technologies identified as deliverables • 50% have been delivered to the government • Over 20 undergoing government evaluation

  6. Management Approach Geared for Success Management Philosophy – NABC • N – Need • A – Approach • B – Benefit • C – Competition

  7. System View [Diagram: source video, depicted as a bitstream, passes through technology enhancement filters into an extraction engine, recognition engine, and understanding engine; their output feeds intelligent content services and visualization for the language/user. Other diagram labels: Concept Applications, Reference Interests.]
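  A minimal sketch of the dataflow this diagram implies, with the stage names taken from the slide's labels; the composition itself is an illustrative assumption, not a VACE deliverable:

```python
# Hypothetical composition of the system view: source video passes through
# enhancement filters, then the extraction, recognition, and understanding
# engines, and the result feeds intelligent content services for the user.
from typing import Callable, Iterable, List

Stage = Callable[[object], object]

def system_view(frames: Iterable,
                enhancement_filters: List[Stage],
                extraction_engine: Stage,
                recognition_engine: Stage,
                understanding_engine: Stage,
                content_services: Stage):
    for f in enhancement_filters:               # stabilization, enhancement, ...
        frames = f(frames)
    features = extraction_engine(frames)        # raw content: objects, text, motion
    entities = recognition_engine(features)     # labeled objects and events
    semantics = understanding_engine(entities)  # scene/event interpretation
    return content_services(semantics)          # indexing, search, visualization
```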

  8. VACE Interests Technology Roadmap [Roadmap charting capabilities across Phase 1, Phase 2, Phase 3, and Future] • Content Extraction: object detection & tracking; object/scene classification; object recognition; object modeling; simple event detection; event recognition; complex event detection; scene modeling; event understanding; mensuration • Intelligent Content Services: indexing; video browsing; summarization; filtering; advanced query/retrieval using Q&A technologies; content-based routing; video mining; change detection; video monitoring • Enabling Technologies: image enhancement/stabilization; camera parameter estimation; multi-modal fusion; integrity analysis; motion analysis; event ontology; event expression language; automated annotation language; evaluation

  9. Funding Commitment to Success [Two pie charts, FY06 allocations and FY07 allocations; one divides 39%, 36%, 11%, 10%, and 4%, the other 64%, 20%, 12%, and 4%; category labels lost in transcription.]

  10. Phase II Programmatics • Researcher Involvement: • Fourteen contracts • Researchers represent a cross section of industry and academia throughout the U.S. partnering to reach a common goal • Government Involvement: • Tap technical experts, analysts and COTRs from DoD/IC agencies • Each agency is represented on the VACE Advisory Committee, an advisory group to the ARDA/DTO Program Manager

  11. Phase II Demographics Prime Contractors (14) and Subcontractors (14) [Map of performer sites]: Carnegie Mellon Univ. (2) (Robotics Inst.; Informedia); IBM T. J. Watson Center; Univ. of Washington; Wright State Univ.; Univ. of Chicago; Univ. of Illinois-Urbana-Champaign (2); Univ. of Illinois-Urbana-Champaign; Boeing Phantom Works; Purdue Univ.; Virage; TASC; AFIT; MIT; BBN; SRI; Salient Stills; Alphatech; Columbia Univ.; Univ. of Southern California; Sarnoff Corp. (2); Univ. of Maryland; Univ. of Maryland (2); Univ. of Southern California / Info. Science Inst.; Georgia Inst. of Tech.; Telcordia Technologies; Univ. of Central Florida

  12. Phase II Projects

  13. Phase II Projects

  14. Phase II Projects

  15. Evaluation Goals • Programmatic: • Inform ARDA/DTO management of progress/challenges • Developmental: • Speed progress via iterative self testing • Enable research and evaluation via essential data and tools – build lasting resources • Key is selecting the right tasks and metrics • Gear evaluation tasks to research suite • Collect data to support all research

  16. Evaluation The Team • NIST • USF • Video Mining

  17. Evaluation NIST Process [Process flow from planning through results] • Planning: determine sponsor requirements; assess required/existing resources; identify data; develop detailed plans with researcher input • Plan Products: task definitions; protocols/metrics; schedule; evaluation resources (training data, development data, evaluation data); ground truth and other metadata; scoring and truthing tools • Evaluation: dry-run shakedown; rollout; formal evaluation • Results: technical workshops and reports; recommendations

  18. Evaluation NIST Mechanics [Diagram: video data feeds both the algorithms under test, producing system output, and the annotation process, producing ground truth; the evaluation applies measures to compare system output against ground truth and produce results.]
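  A minimal sketch of the loop this diagram describes, assuming per-frame detections and ground truth stored as parallel lists; the function name and data shapes are illustrative, not the NIST tooling:

```python
# Hypothetical harness mirroring the slide: the same video feeds the
# algorithm under test (system output) and human annotation (ground
# truth); a measure compares the two, frame by frame, into results.
def run_evaluation(frames, algorithm, ground_truth, measure):
    system_output = [algorithm(frame) for frame in frames]
    scores = [measure(gt, out)
              for gt, out in zip(ground_truth, system_output)]
    return sum(scores) / len(scores) if scores else 0.0
```

  With a detection measure such as the per-frame FDA sketched under slide 20, this skeleton becomes a complete scorer.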

  19. Evaluation 2005-2006 Evaluations [Table of evaluation tasks by domain; legend: P = Person, F = Face, V = Vehicle, T = Text]

  20. Evaluation Quantitative Metrics • Evaluation Metrics: • Detection: SFDA (Sequence Frame Detection Accuracy) • Metric for determining the accuracy of a detection algorithm with respect to space, time, and the number of objects • Tracking: STDA (Sequence Tracking Detection Accuracy) • Metric for determining detection accuracy along with the ability of a system to assign and track the ID of an object across frames • Text Recognition: WER (Word Error Rate) and CER (Character Error Rate) • In-scene and overlay text in video • Focused Diagnostic Metrics (11)
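  To make these metrics concrete, here is a hedged sketch of the per-frame FDA, its sequence aggregate SFDA (summed IoU of matched ground-truth/detection pairs over the mean object count, per the published VACE formulation), and the standard edit-distance WER. The greedy matching is a simplification of the formal metric's optimal one-to-one assignment, and the box format and function names are assumptions for illustration:

```python
# Hedged sketch of the VACE detection metrics; greedy matching stands in
# for the optimal assignment used by the formal definition.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1]) +
             (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def fda(gt_boxes, det_boxes):
    """Frame Detection Accuracy: summed IoU of matched ground-truth/detection
    pairs, normalized by the mean of the two object counts."""
    if not gt_boxes and not det_boxes:
        return None                        # empty frames do not count
    overlap_sum, remaining = 0.0, list(det_boxes)
    for g in gt_boxes:                     # greedy best-IoU matching
        if not remaining:
            break
        best = max(remaining, key=lambda d: iou(g, d))
        if iou(g, best) > 0:
            overlap_sum += iou(g, best)
            remaining.remove(best)
    return overlap_sum / ((len(gt_boxes) + len(det_boxes)) / 2)

def sfda(gt_by_frame, det_by_frame):
    """Sequence FDA: mean FDA over frames containing any objects."""
    scores = [fda(g, d) for g, d in zip(gt_by_frame, det_by_frame)]
    scores = [s for s in scores if s is not None]
    return sum(scores) / len(scores) if scores else 0.0

def wer(ref, hyp):
    """Word Error Rate: Levenshtein distance between word sequences,
    divided by the reference length (CER: same over characters)."""
    d = list(range(len(hyp) + 1))          # row for the empty reference
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i               # prev = dist(ref[:i-1], hyp[:j-1])
        for j, h in enumerate(hyp, 1):
            cur = min(d[j] + 1,            # delete r
                      d[j - 1] + 1,        # insert h
                      prev + (r != h))     # substitute or match
            prev, d[j] = d[j], cur
    return d[-1] / max(len(ref), 1)
```

  A perfect detector scores SFDA = 1.0; missed or spurious objects enlarge each frame's denominator and pull the score down. STDA applies the same overlap idea but additionally requires the system to keep each object's ID consistent across frames.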

  21. Evaluation Phase II Best Results

  22. Evaluation Face Detection: BNews (Score Distribution)

  23. Evaluation Text Detection: BNews (SFDA Score distribution)

  24. Evaluation Open Evaluations and Workshops – International • Benefits of open evaluations • Knowledge of others' capabilities, and community feedback • Increased competition → faster progress • Benefits of evaluation workshops • Encourage peer review and information exchange; minimize "wheel reinvention"; focus research on common problems; provide a venue for publication • Current VACE-related open evaluations • VACE: Core Evaluations • CLEAR: Classification of Events, Activities, and Relationships • RT: Rich Transcription • TRECVID: Text Retrieval Conference Video Track • ETISEO: Évaluation du Traitement et de l'Interprétation de Séquences Vidéo

  25. Evaluation Expanded

  26. Evaluation Schedule

  27. TECH TRANSFER DTO Test and Assessment Activities Purpose: Move technology from lab to operation • Technology Readiness Activity • An independent repository for test and assessment • Migrate technology out of lab environment • Assess technology maturity • Provide recommendations to DTO and researchers

  28. TECH TRANSFER DoD Technology Readiness Levels (TRL)

  29. Technology Transfer Applying TRL [Diagram: DoD technology risk scale running from HIGH risk at TRL 1-3 (contractor test facility) through TRL 4-5 (Info-X test facility, unclassified) and TRL 6-7 (IC/DoD test facilities, unclassified and classified) to LOW risk at TRL 8-9 (production); DTO influence governs the middle of the scale, DTO control the top.] Used in assessing a project's • technology maturity • risk level • commercialization potential

  30. Technology Transfer TRA Maturity Assessments

  31. Phase III BAA Programmatics • Contracting Agency: DOI, Ft. Huachuca, AZ • DOI provides the COR • ARDA/DTO retains DoD/IC agency COTRs and will add more • Currently in the proposal review process • Spans 3 fiscal years and 4 calendar years • Remains open through 6/30/08 • Funding objective: $30M over the program life • Anticipated to grow in FY07 and beyond • Addresses the same data source domains as Phase II • Will conduct formal evaluations • Will conduct maturity evaluations and tech transfer

  32. Phase III BAA Programmatics • Emphasis on technology and system approaches • Move up the technology path where applicable • Stress ubiquity • Divided into two tiers: • Tier 1: one-year base with an option year • Technology focus • Open to all – US and international • More awards at lower funding levels • Tier 2: two-year base with option year(s) • Comprehensive component/system-level initiative • Prime must be US • Fewer awards at greater funding levels

  33. Phase III BAA Schedule

  34. Summary Take-Aways • VACE is interested in: • Solving real problems with risky, radical approaches • Processing multiple data domains and multimodal data • Developing technology point solutions as well as component/system solutions • Formally evaluating technology • Transferring technology into the user's space

  35. Conclusion Potential DTO Collaboration • Invitations: • You are welcome to participate in VACE Phase III • You are welcome to participate in the VACE Phase III evaluations

  36. Contacts • Dennis Moellman, Program Manager • Phones: 202-231-4453 (Dennis Moellman); 443-479-4365 (Paul Matthews); 301-688-7092 (DTO Office); 800-276-3747 (DTO Office) • Fax: 202-231-4242 (Dennis Moellman); 301-688-7410 (DTO Office) • E-mail: dennis.moellman@dia.mil (Internet mail); pmmatth@nsa.gov • Location: Room 12A69, NBP #1, Suite 6644, 9800 Savage Road, Fort Meade, MD 20755-6644
