
Stevens-USC SERC: A DoD University Affiliated Research Center (UARC)



  1. Stevens-USC SERC: A DoD University Affiliated Research Center (UARC)
  Barry Boehm, USC
  SERC-Internal Kickoff Tasks Workshop, January 29, 2009

  2. Outline
  • Nature of UARCs
  • SERC Overview
    • SERC organization and vision
    • SERC research strategy
  • Initial tasks
    • SysE Effectiveness Measures Assessment
    • SysE Methods, Processes, Tools Evaluation
  • Workshop objectives and approach

  3. What is a UARC?
  • University Affiliated Research Centers are “not-for-profit, private sector organizations affiliated with, or part of, universities or colleges that maintain essential research, development and engineering capabilities needed by sponsoring DoD components.”
  • They maintain long-term, strategic relationships with sponsoring DoD components in specific core areas and operate in the public interest, free from real or perceived conflicts of interest.
  • UARCs are financed through long-term, non-competitive contracts awarded by sponsoring DoD components for specific core work.

  4. Several Existing UARCs
  • Johns Hopkins University APL: 4,300 people; annual funding $680M
  • UC Santa Cruz with NASA Ames: information technology, biotechnology, nanotechnology, computer science, aerospace operations, astrobiology, and fundamental biology. Has a Systems Teaching Institute with San Jose State University and UCSC to teach through hands-on experience on research projects.
  • Penn State University Applied Research Laboratory for the Navy: focus on undersea missions and related areas; strategic partner with NAVSEA and ONR; established 1945; has >1,000 faculty and staff
  • University of Washington APL: acoustic and oceanographic studies; established in 1943
  • UC Santa Barbara Institute for Collaborative Biotechnologies: Army Research Office; partnered with MIT and Caltech; focus on biologically derived and biologically inspired materials, sensors, and information processing …
  • University of Texas UARC: started in 1945; focuses on sonar, acoustics, software system research, satellite geodesy, active sonar, …; now has 600 people on staff
  • USC Institute for Creative Technologies: US Army; focus on virtual reality and multimedia applications for training and C4ISR

  5. SERC Organization
  Lead organizations: Stevens Institute of Technology and the University of Southern California.
  Members:
  • Auburn University
  • Air Force Institute of Technology
  • Carnegie Mellon University
  • Fraunhofer Center at UMD
  • Massachusetts Institute of Technology
  • Missouri University of Science and Technology (S&T)
  • Naval Postgraduate School
  • Pennsylvania State University
  • Southern Methodist University
  • Texas A&M University
  • Texas Tech University
  • University of Alabama in Huntsville
  • University of California at San Diego
  • University of Maryland
  • University of Massachusetts
  • University of Virginia
  • Wayne State University
  As the DoD's University Affiliated Research Center for systems engineering research, the SERC will be responsible for systems engineering research that supports the development, integration, testing and sustainability of complex defense systems, enterprises and services. Its members are located in 11 states, near many DoD facilities and all DAU campuses.

  6. SERC Organization - II
  • Dr. Dinesh Verma, Executive Director
  • Dr. Art Pyster, Deputy Executive Director
  • Julie Norris (acting), Director of Operations
  • Dr. Barry Boehm, Director of Research
  • Pool of more than 140 senior researchers and hundreds of research faculty and graduate students from across the members
  Stevens’ School of Systems and Enterprises will host the SERC at Stevens’ Hoboken, NJ, campus. Stevens’ faculty engagement will be complemented by a critical mass of systems engineering faculty at USC. A fundamental tenet of the SERC is its virtual nature: each of its 18 members will be a nexus of research activities. All research projects will be staffed by the best available researchers and graduate students from across the members.

  7. Rough Financial Model
  • Minimum of $2M/year
  • Can add more funded tasks within contract
  • First year: two specified tasks
    1. SysE Effectiveness Measurement: EM (USC lead; $500K)
    2. SysE Methods, Processes, Tools Assessment: MPT (Stevens lead; $350K)
  • Further tasks TBD
  • Other sponsors can sole-source through the UARC
    • Procedures being worked out
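  A quick sanity check on the numbers, assuming the two specified first-year tasks draw on the $2M/year minimum (the slide does not say so explicitly): $2.0M − ($0.5M EM + $0.35M MPT) = $1.15M would remain for further TBD tasks in the first year.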

  8. SERC Vision and Perspective
  Vision: DoD and IC systems achieving mission outcomes, enabled by research leading to transformational SE methods, processes, and tools.
  Perspective: The SERC will be the primary engine for defense and intelligence community SE basic research. In doing so, the SERC will:
  • Transform SE practice throughout the DoD and IC communities by creating innovative methods, processes, and tools that address critical challenges to meeting mission outcomes (what),
  • Become the catalyst for community growth among SE researchers by enabling collaboration among many SE research organizations (who),
  • Accelerate SE competency development through rapid transfer of its research to educators and practitioners (how).

  9. Research must respond to challenges in many life cycles, activities, and attributes
  • Supervisory functions: SE planning, SE managing, SE assessing, SE controlling, …
  • System types: “classic” systems, weapon platforms, systems of systems, network-centric services, enterprise systems
  • Design attributes: designing for reliability, security, maintainability, supportability, resilience, scalability, manufacturability, adaptability, …

  10. SERC Research Methodology

  11. Example: Mission and Stakeholder Requirements and Expectations
  • Can we develop a transformative, interactive, and graphical environment to bring stakeholders (warfighters and analysts) together with SEs to develop a graphical/visual conops in an extremely agile manner?
  • Every study of failed projects cites inadequate requirements and inadequate understanding of the “real” problem, on large projects and small, defense or commercial.
  • Understand and synthesize advances in multimedia technologies and interactive graphics, gaming technologies, and real options theory.

  12. Technical Task Order (TTO) Submittals
  • Prepare draft TTOs using the template that Doris Schultz will send you.
  • Proposed TTO funding can be either from core OSD/DoD or from other agencies.
  • The template is based on how the government is asking us to submit information to them.
  • Officially, the PMO proposes research to which the SERC responds. They recognize, of course, that they need our help in identifying the proper research to perform, and have asked for informal proposals.
  • The template asks what you want to do, why it is important, what previous research you are basing this work on, who will perform the work, what competencies it requires, what will be delivered, and how much it will cost.

  13. SERC Research Initiation Strategy: FY09 Focus
  Inputs: SE external factors/context and mission drivers.
  FY09 foundation: baseline SE MPTs; determine SE effectiveness and value measures.
  Parallel research threads, each cycling through:
  • Research to address specific SE issues
  • Determine modified MPTs to address gaps
  • Validate MPT effectiveness
  Early focus on a solid baseline and quantifiable, observable measures to enable future demonstration of improvement.

  14. Outline
  • Nature of UARCs
  • SERC Overview
    • SERC organization and vision
    • SERC research strategy
  • Initial tasks
    • SysE Effectiveness Measures Assessment
    • SysE Methods, Processes, Tools Evaluation
  • Workshop objectives and approach

  15. First Two Task Orders
  • “Assessing Systems Engineering Effectiveness in Major Defense Acquisition Programs (MDAPs)”
    • Government lead: OUSD(AT&L)/SSE
    • Barry Boehm (USC) task lead, with support from Stevens, the Fraunhofer Center, and the University of Alabama in Huntsville
  • “Evaluation of Systems Engineering Methods, Processes, and Tools (MPT) on Department of Defense and Intelligence Community Programs”
    • Government lead: DoD
    • Mike Pennotti (Stevens) task lead, with support from USC, the University of Alabama in Huntsville, and the Fraunhofer Center

  16. Coordinated Approach
  • Coordinated management
    • Common collaborators, battle rhythm with regular e-meetings, shared workshops
    • Research Integrity Team
  • Coordinated technical approaches
    • Definitions, evaluation criteria and methods
    • Cross-feed/peer review of ongoing results
  • Synergistic contribution to sponsor’s SysE effectiveness goals
    • Best practices + progress monitoring and improvement
  • Common context
    • Domains of interest and levels of organization

  17. Workshop Objectives and Approach
  • Review, improve on EM and MPT task objectives
    • Utility of results to DoD constituencies
    • Identification of needs-capabilities gaps
    • Identification of promising research directions
  • Review, improve on EM and MPT task approaches
    • Survey and evaluation approaches, criteria, and instruments
    • Involvement of key stakeholders: contractors, program/portfolio managers, oversight organizations
    • Coverage of DoD application domains
      • Initial EM priority: weapons platforms
      • Initial MPT priority: net-centric services
    • Test and evaluation of results
  • Capture participants’ relevant experience
