
Academic Research Performance and Outcome Measurement



Presentation Transcript


  1. Academic Research Performance and Outcome Measurement INORMS 2014

  2. Presentation Overview • Setting the Stage: Context-dependency of Performance Measurement • Pilot Assessment of the University of Ottawa Strategic Plan • Collaborative Approaches for Collection and Use of Performance Information

  3. Presenters • Dr. Tim McTiernan, President and Vice-Chancellor of the University of Ontario Institute of Technology (UOIT). Dr. McTiernan has held numerous academic and leadership roles throughout Canada in the university, college and government sectors. He has more than 25 years of senior-level leadership and administrative experience spanning the areas of innovation; research administration and commercialization; social and economic development; and post-secondary education. • Dr. Lorna Jean Edmonds, Vice Provost for Global Affairs and Professor, College of Health Sciences and Professions, Ohio University. She was previously Executive Advisor to the Vice-President, Research, at the University of Ottawa, where she was responsible for conducting a university-wide assessment of the value of the University of Ottawa strategic plan for research performance and its contribution to society. Before joining the University of Ottawa, Lorna Jean held leadership and academic roles at Western University, the University of Toronto and Queen’s University in research administration, international relations and international development. • Laura Hillier, Director of Evaluation and Outcome Assessment at the Canada Foundation for Innovation. Laura leads a team responsible for the assessment and analysis of the outcomes and impacts of CFI investments in research infrastructure. She has over 15 years of experience in different roles within the academic research enterprise, the last 10 of which have focused primarily on evaluation of research and development of related methodologies.

  4. Setting the Stage: Context-dependency of Performance Measurement Measuring what for whom? Observations on the evolving frames of reference for performance measurement Dr. Tim McTiernan University of Ontario Institute of Technology (UOIT)

  5. Performance Measurement is Context-Dependent • In our work on research performance metrics, we operate within several related but not fully aligned micro- and macro-level, quality- and volume-based frameworks • In a world of: global rankings, we aggregate institutionally; academia as an instrument of economic policy, we focus on technology-transfer intensity; digital connectivity, we value collaborative science; the explosion of life-science research, benchmarks for the scale of research projects have shifted • In the process, the “prof” behind prof-centric metrics is sometimes rendered invisible behind his or her home institution’s aggregate metrics

  6. Canadian Frames of Reference for Performance and Outcome Measurement • 3 Significant Frames of Reference and a 4th Dimension: • Government Policy Objectives – federal/provincial • Granting Agency Programmes – federal/provincial • Disciplinary Standards of Excellence • 4th Dimension: Community/Partner Expectations • Performance and Outcome Measures are Weighted Differently, explicitly and implicitly, in each of these frameworks

  7. Evolving Government Policy Objectives: Past 20 Years in Canada • From Enabling Research to Emphasizing Application • From Increasing Overall Capacity to Targeting Priority Fields • From Block Funding of Granting Agencies to Purpose-specific Funding of Granting Agencies • From Government Research Engaged with Academic Research to Government Research Refocused on Industry/Economic Priorities

  8. Evolving Government Research Policy Objectives: Implications for the Canadian Research Ecosystem • Granting Agencies have had to Redesign Programmes, Expand Project Evaluation Criteria and Augment Traditional Performance Metrics • Academic Disciplines Challenged to Define Relevance, discounting Intrinsic Value • Institutions Required to Engage Strategically, Marrying Top-Down Frameworks to the Collective Strength of their Research Communities

  9. Evolving Government Research Policy Objectives: Implications for the Canadian Research Ecosystem • Faculties and Disciplines are Aggregate Clusters in the Evaluation of Research Performance and Output • Researchers are Required to Recontextualize their Research in Extra-disciplinary Public Policy Frameworks • Post-docs and Graduate Students are Trained with Reference to the Application of Research as well as the Process of Science

  10. Evolving Government Research Policy Objectives: Implications for the Canadian Research Ecosystem • Different Expectations of Funding Partners in Complex Research Funding Arrangements Increase the Engagement of Institutional Accountability Processes with the Flow of Research in the Lab • Evolving Funding Frameworks Create Conditions for Rebalancing the Dynamic Tensions that always Exist within any Research Ecosystem

  11. Canadian Research Ecosystem: Balancing Dynamic Tensions • Funding Agencies must Balance Responsiveness to Quickly Changing Public Policy Objectives with Reinforcing Academic Excellence within and Across Disciplines • Institutions must Engage with Public Policy and Funding Partner Priorities while Supporting Academic Integrity • Faculties and their Researchers must learn to be Bilingual: in the Language and Performance Metrics of their Disciplines and in the Discourse of Public, Economic and Social Benefits • Researchers must balance the academic disciplinary requirements for excellence to achieve tenure and promotion with the grant application requirements for impact and engagement with external partners that will sustain funding

  12. Changing Environments - Case Studies: Ontario-Based, Broadly Applicable Across the Canadian Research Landscape • Ontario Research Fund/CFI • Research - telescopes and digital research on early English drama • Community Impact Report • Institutional Differentiation • Graduate Allocation

  13. Canadian Circumstances: American Models • Recent shifts in Public Policy Frameworks on Research Performance and Output Measures are informed by the US Research Ecosystem • Differences between the Canadian and US Research Ecosystems are Often Overlooked • The Resulting Risks of Thwarted Expectations are High: regarding Global Competitiveness, Faculty Incentives to “Shift Gears”, to Engage with Non-academic Partners, and to Bridge the “Valley of Death” to Commercialization

  14. The Power of Data in Assessing Research Planning at the University of Ottawa: A Pilot Case Study Lorna Jean Edmonds, Ph.D., Ohio University

  15. The Power of ‘Accurate’ Information: The Ability to Tell YOUR STORY • Define actual areas of focus and quality/performance • Understand the impact of government and university incentives/decision-making on actual activity • Forecast and manage the future for decision-making • Produce communication materials

  16. uOttawa Strategic Research Plan: 4 Areas of Focus. The PLAN: a requirement for funding applications since 1998

  17. The Study • Method • Inductive analysis based upon key questions provided by the contractor • Retrospective review of research funding (2002/03 to 2011/12) • Pilot study of the e-Society strategic area • Data sources • Institutional planning and research databases • Faculty CVs • Websites • InCites • Data tools • Creation of a ‘Faculty Member’ database profiling key descriptors and outcome metrics • Key step: categorize faculty members into strategic areas (a sketch follows below)
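
As an illustration of the ‘Faculty Member’ database and categorization step described above, a minimal sketch in pandas follows. Every field name, the sample records and the keyword-to-area mapping are assumptions made for the sketch; the study’s actual schema and categorization rules are not published.

```python
import pandas as pd

# Hypothetical extract of the 'Faculty Member' database; in the study the
# data came from institutional databases, faculty CVs, websites and InCites.
faculty = pd.DataFrame({
    "name": ["A. Researcher", "B. Scholar", "C. Scientist"],
    "faculty": ["Engineering", "Arts", "Law"],
    "keywords": ["machine learning; privacy", "digital media", "family law"],
})

# Assumed keyword-to-strategic-area mapping, invented for this example.
AREA_KEYWORDS = {
    "e-Society": {"privacy", "digital media", "cyberjustice", "machine learning"},
}

def assign_area(keywords: str) -> str:
    """Tag a faculty member with the first strategic area whose keyword
    set overlaps their self-reported research keywords."""
    terms = {k.strip().lower() for k in keywords.split(";")}
    for area, area_terms in AREA_KEYWORDS.items():
        if terms & area_terms:
            return area
    return "Non-strategic"  # anticipates the 'NON-strategic' lesson on slide 38

faculty["strategic_area"] = faculty["keywords"].map(assign_area)
print(faculty[["name", "faculty", "strategic_area"]])
```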

  18. Researchers by Strategic Area of Development in Research (SADR) and Faculty [chart: number of professors in each Faculty]

  19. Questions to be Answered • Return on Investment • Did investments in SADRs lead to a change in research funding? • Capacity • What is the profile in terms of excellence, experience and network? • What is the capacity for training, outreach and impact relative to research input? • Sustainability • What is the capacity to sustain [e-Society] for the long term? • Productivity • How is research contributing to productivity? • Research focus • What are the areas of distinctiveness in [e-Society]?

  20. Data Collection: Descriptors and Outcome Measures • Experience and excellence • Researcher profile: demographics; academic training (including country); previous employment (industry, university, public); awards • Research: grants, contracts, industry, international, other; external and internal; research keywords • Institutional plan and resources • Plan; financial and philanthropy; support services; IT and data analytics • Network • Research and institutional partnerships; visiting fellowships/delegations; alumni • Productivity • Publications and academic contributions; students; outreach; academic impact; community impact; inventions and disclosures • Profile • Events, media activity, rankings, awards
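
The descriptor groups on this slide suggest a per-researcher record structure like the following. This is a sketch only; every field name is an assumption, since the study’s real database schema is not public.

```python
from dataclasses import dataclass, field

@dataclass
class FacultyProfile:
    # Experience and excellence
    degrees: list = field(default_factory=list)               # incl. country
    previous_employment: list = field(default_factory=list)   # industry/university/public
    awards: list = field(default_factory=list)
    # Research funding: source -> dollars (grants, contracts, industry, ...)
    grants: dict = field(default_factory=dict)
    research_keywords: list = field(default_factory=list)
    # Network
    partnerships: list = field(default_factory=list)
    visiting_fellowships: int = 0
    # Productivity and outreach
    publications: int = 0
    students_supervised: int = 0
    community_activities: int = 0
    disclosures: int = 0
```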

  21. Research Funding Profile. Key indicator: Tri-Council Funding • 92% increase in funding since 2002 • 6.7% annualized growth rate • 2012 funding base: $2.815 B
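
The annualized figure is consistent with the cumulative one: a 92% increase over the ten years from 2002 to 2012 implies a compound annual growth rate of (1.92)^(1/10) - 1 ≈ 6.7%. A one-line check:

```python
# CAGR implied by a 92% cumulative increase over 10 years (2002 to 2012)
cagr = 1.92 ** (1 / 10) - 1
print(f"{cagr:.1%}")  # 6.7%
```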

  22. Question: Research Investments in SADRs. Did investments in SADRs lead to a change in research funding? • Yes, but not necessarily in the way expected • SADR and other investments, except health, increased at rates greater than the overall university rate (annualized) • Research investments shifted away from industry toward the Tri-Council • Dollar investments and ROI are tied to the individual faculty members recruited

  23. Growth of Total Funding by SADR [chart]

  24. Change in Total Research Funding [chart: % of total research funding, 2002 to 2011]

  25. Profile of University Research Funding by Agency and Institution

  26. Questions Asked: Capacity • What is the capacity to sustain research in [e-Society] for the long term? • Sustainability is linked to age distribution, number of researchers, alignment with trends, and recruitment

  27. e-Society: Profile of Excellence, Experience and Network • Profile • Education • Collaborations • Chairs • Prizes

  28. Origin of Degrees [charts] • Percentage of degrees from a QS Top 200 international institution • Percentage of degrees from a U15 institution
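
As a sketch of how this descriptor can be computed, assuming placeholder institution names and membership sets (not the real QS Top 200 or U15 lists):

```python
# Hypothetical degree records: (faculty member, degree-granting institution)
degrees = [
    ("A. Researcher", "MIT"),
    ("A. Researcher", "University of Toronto"),
    ("B. Scholar", "Somewhere State University"),
]

# Placeholder membership sets; the study used the actual QS and U15 lists.
QS_TOP_200 = {"MIT", "University of Toronto"}
U15 = {"University of Toronto"}

def share_from(records, members):
    """Percentage of degrees granted by institutions in `members`."""
    return 100 * sum(inst in members for _, inst in records) / len(records)

print(f"QS Top 200: {share_from(degrees, QS_TOP_200):.0f}%")  # 67%
print(f"U15:        {share_from(degrees, U15):.0f}%")         # 33%
```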

  29. Questions Asked: Productivity • How is research contributing to productivity in terms of publications, students, and impact on the community and academia across sectors? • The richness of this data demonstrated the value of faculty far beyond what research funding alone indicates • It also demonstrated the need to track student engagement as well as faculty engagement

  30. e-Society Capacity: Current Faculty
  2011/12: 133 faculty with 454 grants @ $19.4 million; 2002/03: 114 faculty with 295 grants @ $7.9 million
  Output and impact (*not including undergraduate education)
  • Training • Graduate programs: 18 • Graduate students: >821 of a total of 5,810 (14%)
  • Outputs • Conferences: >10,000 conference contributions since 2000 (avg. 4 per professor per year) • Books: 209 authored (86 from Arts and 72 from Engineering) and 1,241 edited • Journal publications: >5,000
  • Outcomes • Academic leadership roles: >2,600 • Community activities: >3,400 • Inventions: 39 professors with >150 disclosures
  Research input
  • Grant funding in 2011/12: $19.4 M
  • Totals (annualized): e-Society, 1.6% growth in researchers led to 6.4% growth in funding; Health, 5.6% growth in faculty led to 3.3% growth in funding; university-wide, 4.1% and 5% respectively
  • Tri-Council: $7 M (+147% since 2002, relative to university-wide $50 M, +58%, and overall Tri-Council growth of 92%)
  • Industry: $0.9 M (-5%, relative to university-wide $11.2 M, -32%)
  • Institutional infrastructure funding: $25.8 million CFI (23%)

  31. Questions Asked: Distinctiveness • What are the areas of distinctiveness in [e-Society]?

  32. e-Society: 12 Areas of Research Focus

  33. Actual focus: p/e-Society

  34. Categorization of Research Areas: Infrastructure and Applications to Implications (~$20 Million / 148 Faculty)

  35. Profile of p&e-Technology and Society

  36. Culture and Governance: p&e-Technology and Society
  Clusters • Critique and Communication through e-Art and Media in transition • Cyber-justice and Information Policy
  Activity/Researchers • 37 faculty • $900,000 • 4 CFIs and 4 Chairs • Major awards: Top 50 most influential in the world • Six disciplines: Arts, Education, Engineering, Law, Social Sciences and Telfer
  Critique and Communication • the Research Group on Media, Identity and Culture • Media, Identity and Community Research Group (Arts)
  Cyberjustice and Information Policy • Centre for Law, Technology and Society • Centre for Science, Society and Policy • Internal Advisory Board, Canadian Internet Policy and Public Interest Clinic • Clinics in Arts (bullying) and Law (bullying)
  Outreach and Impact • >500 social media and media outreach items • >100 technical reports • >50 leadership roles on external boards and committees
  Universal trends • The global digital knowledge economy • Empowerment and globalization of information and decision-making • Safety and security • Universalization

  37. Lessons Learned: RESEARCH CAPACITY and CRC INCENTIVE FUNDING STRATEGY • Incentives drive researcher choices • The CRC (Canada Research Chairs)–Tri-Council incentive, together with CRC and CFI incentives to secure granting-council funding, shifted funding away from industry • There is no incentive for international grants • Recommend: link CRC funding to overall research capacity measures

  38. Lessons Learned: STRATEGIC AREAS OF FOCUS and TREND ANALYSIS • Broad thematic areas are too descriptive • The power lies in being able to tell the story • We cannot predict the future • Recommend: a strategic area referred to as ‘NON-strategic’: recruiting interesting researchers with interesting ideas

  39. Lessons Learned: QUALITY • Research funding ($) alone is a poor measure: many high-performing faculty had small or no research grants but were heavily engaged in academic and community activity • Other outcomes need to be considered: • Profile • Output and impact • AND involvement in EDUCATION

  40. Thank you to co-contributors and supporters at the University of Ottawa: Vivian Liu, Research Analyst; Luc Gauthier, Chief of Staff to the VP Research; Mona Nemer, Vice-President, Research; and colleagues in the research and institutional planning portfolios

  41. Collaborative Approaches to Performance Measurement What information does the Canada Foundation for Innovation (CFI) need, and how is the CFI trying to make that information more relevant and useful to recipient institutions as well as to other stakeholders?

  42. The Canada Foundation for Innovation

  43. Evaluation & Accountability Context • Global question of how to measure and report the impact of R&D expenditures • Need for accountability to the CFI Board of Directors, the government and Canadians • Performance measurement and evaluation activities help demonstrate internal and external accountability for the stewardship of public funds by showing that management is fiscally responsible, that services are being delivered in an efficient and effective manner, and that objectives are being met • The CFI’s Funding Agreement requires that the CFI carry out an overall performance evaluation of its activities and funded projects at least every five years, as well as a value-for-money audit

  44. Measurement Approaches A suite of tools to capture the progress and results of the CFI and CFI-funded projects • Organizational level • Corporate performance metrics, PERAF, OPEA • Project level • Application data, progress reports, financial reports • Thematic or institutional level • Outcome measurement studies (OMS) • Platform outcome measurement studies (POMS) • Special studies and evaluations

  45. Performance Measurement Data Data typically originates at the most granular level and is combined upward to allow for system-level use, as in the sketch below
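
A minimal sketch of that roll-up, assuming hypothetical project-level records (the column names are invented for the example, not CFI’s actual reporting fields):

```python
import pandas as pd

# Hypothetical project-level records, one row per funded project.
projects = pd.DataFrame({
    "institution": ["U1", "U1", "U2", "U2", "U2"],
    "theme": ["health", "e-society", "health", "health", "e-society"],
    "award_dollars": [1.2e6, 0.8e6, 2.5e6, 0.6e6, 1.1e6],
    "hqp_trained": [12, 7, 30, 5, 9],
})

# Granular project data combined upward: institutional totals ...
by_institution = projects.groupby("institution")[["award_dollars", "hqp_trained"]].sum()

# ... and thematic totals, supporting system-level reporting.
by_theme = projects.groupby("theme")[["award_dollars", "hqp_trained"]].sum()

print(by_institution)
print(by_theme)
```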

  46. Project-level data CFI’s Annual Project Progress Report: projects report for a period of 5 years, using an online reporting system
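
For illustration only, one annual submission through such a system might be modelled as the record below; the field names are invented for the sketch and are not CFI’s actual reporting schema.

```python
import json

# Hypothetical shape of a single year's progress-report record.
report = {
    "project_id": "CFI-0000",
    "reporting_year": 3,  # each project reports annually for 5 years
    "infrastructure_status": "operational",
    "users": {"internal": 24, "external": 6},
    "outputs": {"publications": 11, "hqp_trained": 8},
}
print(json.dumps(report, indent=2))
```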

  47. Reporting • Overall annual report • Targeted ‘thematic’ analysis

  48. Thematic & Institutional Data: CFI’s Outcome Measurement Studies (OMS) • 28 Outcome Measurement Studies completed between 2007 and 2012 • Categories assessed • Strategic research planning (SRP) • Research capacity • Highly qualified personnel (HQP) • Research productivity • Innovation / extrinsic benefits

  49. Reporting OMS • Internal reports for CFI senior management and Board of Directors • 2 public summary reports, plus data for other analyses and success stories

  50. Thematic Data: CFI’s NEW Outcome Measurement Studies, Platform Outcome Measurement Studies and Socioeconomic Impact Analyses • Following an evaluation of the OMS, we are adjusting the methodology; 1 ‘NEW’ OMS is planned to begin in 2014-15 • 1 Platform Outcome Measurement Study was completed in 2013, with 2 more currently underway • 1 Socioeconomic Impact Analysis was completed in 2013; we are reflecting on the approach and seeking further candidate themes or technologies for assessment
