
Information Management Capacity Check NRCan Case Study: A Baseline for Success


Presentation Transcript


  1. Information Management Capacity Check NRCan Case Study: A Baseline for Success. ERPANET Workshop – Antwerpen – April 2004. Bob Provick – Library and Archives Canada

  2. Agenda • The IMCC and Methodology – A Quick Review • Case Study: The NRCan Experience • Audit/Evaluation – Preservation – What’s the Hook? • Questions and Discussion

  3. Intellectual Property • The IM Capacity Check Tool may only be used in accordance with the following: • The IM Capacity Check Tool has been designed for the use of federal departments and agencies, or other parties working on their behalf. This condition does not preclude third party organizations providing chargeable services utilizing this product in support of the federal government IM Capacity Check self-assessment. Third parties may utilize the IM Capacity Check for self-assessment but no third party may use this product for commercial gain outside the intended use for the federal government. • Use of the IM Capacity Check Tool must acknowledge and identify BearingPoint (formerly KPMG Consulting LP) as the owner of this product. Departments and agencies have the right to adapt the product, and could do a self-assessment on their own or engage the services of consultants to help them carry out an assessment. Any adaptation must still continue to acknowledge and identify BearingPoint (formerly KPMG Consulting LP) as a source of this product.

  4. Elements of IM Capacity

  5. Element Descriptions
  • Organizational Context – Defines criteria to assess an organization’s capacity to support, sustain and strengthen IM capabilities. Criteria: Culture; Change Management; External Environment.
  • Organizational Capabilities – Defines the criteria to assess an organization’s capacity to develop the people, process and technology resources required for sound IM. Criteria: IM Community; Expert Advice; IM Tools; Technology Integration; Portfolio Management; Project Management; Relationship Management.
  • Management of IM – Defines criteria to assess an organization’s capacity to effectively manage activities in support of IM as it relates to the effective delivery of programs and services. Criteria: Leadership; Strategic Planning; Principles, Policies and Standards; Roles and Responsibilities; Program Integration; Risk Management; Performance Management.
  • Compliance and Quality – Defines the criteria to assess the organization’s capacity to ensure its information holdings are not compromised. Criteria: Information Quality; Security; Privacy; Business Continuity; Compliance.
  • Records and Information Life Cycle – Defines the criteria to assess the organization’s capacity to support each phase of the records and information life cycle. Criteria: Planning; Collect, Create, Receive and Capture; Organization; Use and Dissemination; Maintenance, Protection and Preservation; Disposition; Evaluation.
  • User Perspective – Defines the criteria to assess the organization’s capacity to meet the information needs of all users. Criteria: User Awareness; User Training and User Support; User Satisfaction.
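For readers who want to work with these elements programmatically (for example, to attach ratings per criterion), the breakdown above can be held as a simple data structure. The following Python sketch is illustrative only and is not part of the IMCC tool; the variable name IMCC_ELEMENTS is invented for the example.

```python
# Illustrative only: the six IMCC elements and their criteria, as listed on
# slide 5, held as a plain dictionary so that ratings can later be attached
# to each criterion.
IMCC_ELEMENTS = {
    "Organizational Context": [
        "Culture", "Change Management", "External Environment",
    ],
    "Organizational Capabilities": [
        "IM Community", "Expert Advice", "IM Tools", "Technology Integration",
        "Portfolio Management", "Project Management", "Relationship Management",
    ],
    "Management of IM": [
        "Leadership", "Strategic Planning", "Principles, Policies and Standards",
        "Roles and Responsibilities", "Program Integration", "Risk Management",
        "Performance Management",
    ],
    "Compliance and Quality": [
        "Information Quality", "Security", "Privacy", "Business Continuity",
        "Compliance",
    ],
    "Records and Information Life Cycle": [
        "Planning", "Collect, Create, Receive and Capture", "Organization",
        "Use and Dissemination", "Maintenance, Protection and Preservation",
        "Disposition", "Evaluation",
    ],
    "User Perspective": [
        "User Awareness", "User Training and User Support", "User Satisfaction",
    ],
}

# Example use: list every criterion that will need an "as is" and "to be" rating.
for element, criteria in IMCC_ELEMENTS.items():
    for criterion in criteria:
        print(f"{element}: {criterion}")
```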

  6. Level/Scale Descriptions • Capacity 1 – Initial (No systematic or formal approach exists for this capacity. Processes and practices are fragmented or non-existent. Where processes and practices exist, they are applied in an ad-hoc manner.) • Capacity 2 – Defined (Processes and practices are defined to varying degrees and are not applied consistently. Basic management controls and disciplines for the capacity are in place.) • Capacity 3 – Repeatable (Processes and practices are defined, well understood and used consistently across the organization. Processes and practices are also well documented.) • Capacity 4 – Managed (A well-defined framework exists for this capacity. Processes and practices are measured and managed to ensure delivery of desired results. Processes and practices are embedded in the values of the organization and are coordinated in an integrated manner.) • Capacity 5 – Optimizing (Focus on continuous improvement of the capacity. The concepts of innovation, organizational learning and continuous improvement of the capacity are incorporated into the values of the organization and are consistently applied.)
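The five capacity levels can likewise be captured as a small enumeration. This is an illustrative sketch only, not part of the IMCC itself; the class and member names are invented for the example, with the labels taken from the slide above.

```python
from enum import IntEnum

class CapacityLevel(IntEnum):
    """The IMCC 1-to-5 scale; a higher value means greater maturity, not 'goodness'."""
    INITIAL = 1      # fragmented or non-existent, applied ad hoc
    DEFINED = 2      # defined to varying degrees, not applied consistently
    REPEATABLE = 3   # well understood, documented, used consistently
    MANAGED = 4      # measured and managed within a well-defined framework
    OPTIMIZING = 5   # continuous improvement embedded in organizational values

print(CapacityLevel.MANAGED.name, CapacityLevel.MANAGED.value)  # MANAGED 4
```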

  7. Overall methodology and timeline for assessment (cont’d) – 3 to 4 months. Phases: 1 – Project planning; 2 – Data collection; 3 – Consolidate findings; 4 – Validation; 5 – Action plan. Participants: a core project team with experts in IM, program delivery, information technology, the organizational context and the user context, plus organizational managers who are knowledgeable of the organization’s IM practices. Indicative effort: Project Team: 0.5–1 days; Project Team and Senior Management: 1–2 days; Project Team: 1–2 days; Project Team: 2–2.5 days.

  8. Step 3.1 - Assessing the Capabilities – “As Is” and “To Be” assessment • Current capabilities are assessed based on key elements of the IM Capacity Check, and criteria provided for each key element. • The capabilities depicted within the criteria represent different states or plateaus that the organization may strive to achieve. The descriptions are incremental. • The capability descriptions are based on generally recognized best practices, but have been customized to reflect the Government of Canada context. • The organization identifies which level of "maturity" would be the most appropriate in support of its business needs and priorities, and consistent with its capabilities. • A rating system of “1” to “5” is used. A rating of “5” does not necessarily mean “goodness”, but rather maturity of capability. The ideal maturity rating for any area is dependent on the needs of the organization. [Diagram: existing maturity versus future capability – where the organization may strive to be in the future.]
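The “as is” versus “to be” comparison amounts to a per-criterion gap calculation on the 1-to-5 scale. The sketch below illustrates that idea; the function name and the sample ratings are hypothetical and are not taken from the IMCC tool or the NRCan results.

```python
def capability_gaps(as_is, to_be):
    """Return the 'to be' minus 'as is' gap for each criterion rated on the 1-5 scale.

    A positive gap means the organization wants to raise its maturity for that
    criterion; zero means the current level already matches the target.
    """
    return {criterion: to_be[criterion] - as_is[criterion] for criterion in as_is}

# Hypothetical ratings, purely for illustration.
as_is = {"Culture": 1, "Information Quality": 2, "Disposition": 2}
to_be = {"Culture": 4, "Information Quality": 4, "Disposition": 3}

# Print criteria with the largest gaps first.
for criterion, gap in sorted(capability_gaps(as_is, to_be).items(),
                             key=lambda item: item[1], reverse=True):
    print(f"{criterion}: gap of {gap} level(s)")
```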

  9. Step 3.1 - “As-Is” and “To-Be” Assessments overview. [Chart: “As-Is” and “To-Be” ratings on the 1–5 scale for every criterion, grouped by element: Organizational Context, Organizational Capabilities, Management of IM, Compliance and Quality, Records and Information Life Cycle and User Perspective.]

  10. Step 5 - Contents of Assessment Report • Executive Summary • Key Themes • Summary of Findings • Highlights of Findings • Projects • Action Plan • Background • Overview • Objectives of the Capacity Check • Key Characteristics • Key IM Elements Examined • The Mechanics of the Capacity Check • Project Objectives, Scope and Process Overview • Summary of IM Capacity Check Assessment findings/opportunities (by criteria) • Lessons Learned • Appendix A - Background Information • Interviews • Workshops • Documents Reviewed

  11. What you get! • Does: • Assessment of all high level elements of IM • Assessment of your current state of IM and desired future state • Gap Analyses • Identification of best practices to leverage • Engages all stakeholders in process • Results in a prioritized action plan that speaks to Senior Managers • Does Not: • Tell you how to move from one level to the next (what not how) • Cost the effort required to move to desired state • Make the Business Case for IM (what not why)

  12. Case Study: The NRCan Experience

  13. IM Capacity Check Pilot at NRCan – Context… • Clear mandate to disseminate information about natural resources and sustainable development • Science-based organization • Many domains requiring long-term access to data/information • Need to create new knowledge • Demographic issues • Information management lacking • resource issues • awareness of importance • Increased focus on information government-wide

  14. IM Capacity Check Pilot at NRCan – Resources… • 15 Project Team members • Representation from IM (corporate and sectors), IT, NRCan-on-line, Audit, Libraries, Library & Archives Canada • 13 individual interviews • 2 workshops – 37 participants in total • 6 validators • Timeline – May 13 to July 26 • Project Planning (May 13 – 27) • Data Collection (June 3 – 21) • Consolidation of Findings (June 24 – July 2) • Validation (July 4 – 12) • Action Planning (July 15 – 26)

  15. IM Capacity Check Pilot at NRCan – Background… • I-Governance framework development underway • E-Mail Guidelines • NRCan Best Practices for Information Management • Subject Classification Structure being developed • IM Issues Action Plan • Clean-up Procedures and Disposition of Information Held in Private Offices • NRCan Metadata Standards • IM Requirements for NRCan’s Personal Information Holdings • IM Compliance Assessment and Risk Analysis • IM in an Electronic Environment • ATIP Information Collection Guidelines • RDIMS Readiness at NRCan • Draft IM Policy and Access to Knowledge Policy • Program Integrity for both IM and IT • IM Readiness Survey

  16. Summary of IM maturity levels at NRCan. The chart indicates the “As Is” and “To Be” capability level for each of the 30 criteria of the IM Capacity Check. The “as is” level represents the current assessment of NRCan’s capabilities for each criterion. The “to be” level represents the desired capability level that could realistically be achieved within the next three years. A higher capability level is not necessarily better than a lower one; the ideal capability level for any criterion is dependent on the needs and goals of the organization. “To be” levels of 4 or higher represent those management practices where NRCan needs to excel: Culture and Information Quality. Level 5 reflects best practices, and is therefore the exception. [Chart: “As-Is” and “To-Be” ratings on the 1–5 scale for all 30 criteria, grouped by element.]

  17. Top Priority Opportunities • Enhance and formalize IM Leadership within the Department. • Establish a current Departmental vision for IM. • Develop a business case for the IM Portfolio. • Build the momentum to enable a transformation to an IM culture. • Strengthen the IM capacity and further develop the IM competency.

  18. Top Priority Opportunities (cont’d) • Establish a formal Governance and Accountability framework to clarify and promote IM roles and responsibilities. • Develop, update and implement a formal framework of IM policies, principles and standards. • Develop effective IM tools for users and IM practitioners. • Formalize a departmental strategic planning process for IM. • Develop and implement a corporate IM communications strategy.

  19. Summary of Priorities and Opportunities (cont’d). To facilitate the prioritization of the projects, we have graphed them in the chart below, based on two factors: level of effort to implement, and expected impact that the initiative will have on NRCan. Those of low effort and high impact may be likely candidates to begin with, to gain some initial successes. [Chart: projects plotted by effort (low/medium/high) against impact (low/medium/high), with quadrants labelled Administrative, Low Hanging Fruit, Question Mark and Major Change. Projects plotted include Cultural Transformation, Building IM Competency, Building IM Capacity, IM Tools, Business Case for IM Portfolio, Clarify IM Roles and Responsibilities, IM Policy and Standards Framework, Formalize IM Leadership, Communications Strategy, IM Vision and Strategic Planning Framework.]
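The slide’s rule of thumb (start with low-effort, high-impact projects) can be expressed as a simple filter over effort and impact scores. The sketch below is purely illustrative; the scores shown are placeholders rather than the actual NRCan ratings, and the helper names are invented for the example.

```python
# Map the low/medium/high labels used on the chart to ordinal scores.
SCORE = {"low": 1, "medium": 2, "high": 3}

def is_quick_win(effort, impact):
    """True for 'low hanging fruit': low effort to implement, high expected impact."""
    return SCORE[effort] == 1 and SCORE[impact] == 3

# Placeholder effort/impact ratings, for illustration only.
projects = {
    "Formalize IM Leadership": ("medium", "high"),
    "Cultural Transformation": ("high", "high"),
    "IM Tools": ("low", "high"),
}

quick_wins = [name for name, (effort, impact) in projects.items()
              if is_quick_win(effort, impact)]
print("Likely starting candidates:", quick_wins)
```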

  20. Transition Map – Strengthening the IM Foundation. [Chart: opportunities mapped against timing – Year 1 (short term), Year 2 (medium term) and Year 3 (long term) – with work already started flagged. Opportunities shown include IM Leadership, IM Roles & Responsibilities, Strategic Planning Framework, IM Vision, Cultural Transformation, Sustainability, Business Case for IM Portfolio, Build IM Capacity, IM Competency/Skills Development, IM Policies & Standards, IM Tools and Communications Strategy.]

  21. Key Priority IM Project Opportunities • IM Governance Framework • Business Planning • Information Life-Cycle Management • IM Tool Set • IM Awareness & Communication • IM Community Development • IM Skill Development Strategy • IM Security & Privacy Review • Information Holdings • IM Monitoring

  22. IM Governance Framework – Formalizing the IM governance model: assigning governance and accountability for IM and information stewardship throughout all levels of the Department. Staff: 3.25 FTE. Prof Services: $0K. Status of Progress: Low–Medium. Goal: Developing a structured approach by assigning governance and accountability of IM and information stewardship throughout all levels of the Department. Benefits: Using resources more effectively, improving decision making, increasing awareness of committee mandates and responsibilities throughout the IM life-cycle, clarifying accountabilities, and complying with MGI requirements. Key elements: • Formalizing a department-wide approach to committee management • Defining committee mandates, timetables, roles and responsibilities • Integrating efforts across committees • Communicating throughout the department and with stakeholders • Complying with the MGI framework. Proposed action items: • Establishing and implementing the I-Governance framework and communicating its associated matrix of accountabilities • Strengthening IM leadership by establishing an IM Steward/Champion and formalizing an IM vision for the Department • Integrating and evaluating IM roles and responsibilities as part of senior management and staff accountabilities • Creating a recognition and reward program for IM stewardship. Risks of not doing it: • Inconsistent management of horizontal issues will increase risk associated with delays, lack of decision making and lack of visibility • Inability to leverage the investment of resources and knowledge across the Department to meet business goals • Not complying with the MGI.

  23. IM Capacity Check Pilot at NRCan – Benefits • Assess the state and capabilities of current IM practices at NRCan against a common standard / best practices • Establish a mechanism to identify a reasonable “end state” • Identify priority areas for improvement • Provides a lead towards developing the IM, IT and NOL Strategy • Use key results to support an IM Program • Prioritization of current and planned activities • Brought people together as a community • Positions NRCan to improve its IM capacity • Increase awareness and understanding of IM issues • Highlight risk areas and provide a basis for mitigating risk • Established the basis for costing IM to meet legislative and business requirements • In-house capability to re-assess

  24. IM Capacity Check Pilot at NRCan – Lessons Learned… • Broad buy-in before starting is key • A common understanding of IM is useful and could help in debate at the assessment stage • Involve people with knowledge and opinions about IM throughout the project…need a full mix of participants • Involve key business people and senior management • Best if momentum continues – keep timeframes tight, yet reasonable • Ensure all participants are clear about time commitments • Truly need to review what you have in place and build from this common point • Communicate, communicate, communicate

  25. IM Capacity Check at NRCan • Key Next Steps for 2004-2005: • Separate IM – IT Advisory Committees • departmental Infostructure Committees • Create an Accountability Matrix • Employee IM Awareness Initiative (including protection/safeguarding info) • Workgroup on preservation and disposition of paper and electronic records

  26. Audit/Evaluation – Preservation: What’s The Hook?

  27. Audit/Evaluation – Preservation: What’s The Hook? • The IMCC self-assessment approach for IM works at all levels in an organization. • The IMCC approach can be used to inform the development of an IM audit/evaluation program for any organization. • Several IMCC criteria relate directly to the issues of preservation, e.g. security, quality, compliance, maintain/preserve/protect, risk management and evaluation…

  28. Audit/Evaluation IM Criteria in the GoC – A Work in Progress • IMCC self-assessment criteria • MGI Policy compliance indicators • Performance measurement indicators for the management of information

  29. Questions - Discussion Bob Provick Senior Project Officer Government Records Branch Library and Archives Canada Telephone: (613) 947-1511 Fax: (613) 947-1500 E-mail: bprovick@archives.ca

  30. Where you can find stuff on the web • www.archives.ca • www.archives.ca/06/060 e.html • How to get in touch with our Call Centre • imgi@archives.ca • or (613) 944-4644
