
Crossing Boundaries to Assess Academic Research Environment




Presentation Transcript


  1. Crossing Boundaries to Assess Academic Research Environment. Laura L. Haas, Program Evaluator, New Mexico State University

  2. Laura L. Haas, SCORE Program Evaluator, New Mexico State University (NMSU). SCORE Program: Support of Continuous Research Excellence. Presented at the Canadian Evaluation Society / American Evaluation Association Joint Conference, Toronto, Canada, October 28, 2005

  3. Research Environment Study at NMSU: a collaboration • SCORE Program (funded by the National Institutes of Health) • DOE Office of Basic Energy Sciences (Sandia National Laboratories) • University of Maryland (Center for Innovation)

  4. Crossing Which Boundaries? • Within the Institution • stuck with old vertical organizational (command) structures • stuck with horizontally stratified political agendas • Across Research Environments • National Laboratory (focus is entire) • Industry (focus is profit, immediate or eventual) • Academic (focus is more basic, plus teaching & service missions) • special case of the Land Grant University

  5. Context • Institutional • Admin: want evidence of strengths • Researchers: want evidence of weaknesses • Funder (SCORE Program) • wants a “baseline” of environmental-health evidence showing improvements for researchers • Collaborators • want comparative data • want to understand the needs of different research types • want to broaden their survey design experience

  6. Study Development • Determine the necessary attributes (compared with national laboratories & industry) • Measure the extent to which those attributes are present (via survey)

  7. Four Environmental Areas defined in the Jordan Model • Human Resource Development • Internal Support Systems • Set & Achieve Relevant Goals • Innovation & Cross-Fertilization

  8. Attributes important for National Laboratories
  HUMAN RESOURCE DEVELOPMENT • Management Adds Value to Work • Management Integrity • People Treated With Respect • Good Professional Development • Good Career Advancement Opportunities • Optimal Mix of Staff • High Quality Technical Staff • Good Internal Project Communication • Teamwork & Collaboration
  INNOVATION & CROSS-FERTILIZATION • Sense of Challenge & Enthusiasm • Time to Think & Explore • Resources/Freedom to Pursue New Ideas • Commitment to Critical Thinking • Cross-Fertilization of Ideas • Frequent External Collaborations • Authority to Make Decisions • Good Identification of New Opportunities • Integrated/Relevant Research Portfolio
  INTERNAL SUPPORT SYSTEMS • Good Research Competencies • Overhead Rates Don’t Hinder Competing • Lab Systems & Processes Efficient • Lab Services Meet Needs • Good Allocation of Internal Funds • Informed & Decisive Management • Rewards & Recognizes Merit • Good Salaries & Benefits • Good Equipment/Physical Environment
  SET & ACHIEVE RELEVANT GOALS • Sufficient, Stable Project Funding • Good Planning & Execution of Projects • Good Project-Level Measures of Success • Good Relationship With Sponsors • Reputation for Excellence • Management Champions Foundational Research • Good Lab-wide Measures of Success • Clear Research Vision & Strategies • Invests in Future Capabilities

  9. Focus groups defined NMSU researcher needs • responded to the attributes • representatives from 5 of the 6 Colleges at NMSU • College of Agriculture • College of Arts & Sciences • College of Business • College of Engineering • College of Health & Human Services • many levels of the hierarchy represented, up to the VP for Research • two days spent eliciting the needs that follow

  10. Major Needs — summary • Time • Resources • Personnel • Processes & Intangibles

  11. Major Needs — Time! • The most time-consuming activities are not credited in promotion & tenure (P&T) processes • Not enough uninterrupted “development” time • Cluster development (a cross-disciplinary initiative) time is now seen as yet another demand • Conflicting demands of teaching, service, and research (need release time from teaching)

  12. Major Needs — Resources • Access to indirect costs (a Catch-22) • not permitted to use direct costs for “office supplies” • not permitted access to indirect costs • Space (universally a problem) • additional space is located far away • no consistently applied criteria for obtaining it • Equipment • NOT provided by the university; must get grants • capital equipment is supposed to be maintained by the university, so grant $ can’t be used for repairs

  13. Major Needs — Personnel • Personnel needs are varied in the academic setting • students: drive much of the day-to-day work; the ability to attract graduate students is critical • post-docs: very transient; their experience & training leave after 1–3 years • technical staff: a very overlooked group in the hierarchy; not students, not faculty, no voice

  14. Major Needs — the Intangibles • to feel they are more than the cash cow • Admin says teaching $ support the research mission • Research Centers say research $ support everything • to trust that the administration values the mission of research, not just the rewards (respect, $, etc.) • research support processes that are fiscally responsible but not overly burdensome

  15. Differences: Human Resource Development • Management integrity: big trust problems • financial transparency absent at the highest levels • VP for Business makes rules for researchers; a moving target • for example: researchers don’t want the data handled by campus personnel • Real human-resource barriers • time to hire • raises (even if paid from a grant) are stipulated/restricted • contracts (for personnel, especially additional compensation) are hard to get approved

  16. Differences: HR (continued) • Teamwork & collaboration: barriers • the vertical org structure places blinders on researchers to what other competencies exist • the cluster concept was sent down from on high instead of arising from the bottom up (i.e., not driven by the research question) • faculty researchers are “encouraged” (feel forced) to work in the cluster areas, fearing P&T consequences • Manager adds value: no “managers” • Career advancement opportunities: only for tenure-track faculty, not for professional scientific or technical staff; a dead end • Optimal mix of staff: for campus research this includes students

  17. Diffs: Innovation & Cross-Fertilization • Cluster development: new encouragement of interdisciplinary research • barriers are many • incentives are few (e.g., meager $25K mini-grants) • Resources to pursue new ideas • researchers are expected to get their own resources • thinking truly outside the box is tough because there is no “process” for it, and administrators don’t know how to support it (or allow it) • Time to think & explore • a 3+3 teaching load is common • a grant will only buy you out of one course each semester, if you get one • researchers, especially faculty, have no uninterrupted time

  18. Diffs: Innovation & Cross-Fertilization (continued) • Sense of challenge and enthusiasm • the focus group indicated our system rewards mediocrity • BUT most faculty choose what research they want to do, which is IMPORTANT to their sense of independence • Integrated & relevant research portfolio: not really applicable to the way faculty choose projects • Identification of new opportunities: often comes from faculty working with post-docs and graduate students, not “the management”

  19. Diffs: Internal Support Systems • Good research competencies • sponsors evaluate depth • the university needs breadth • Salary & benefits • salary compression is pervasive • seniority counts for very little • health benefits are very controversial; retaining senior staff increases the claims and raises the premiums • Do overhead rates hinder competing? • Social Sciences: yes • Sciences: no

  20. Differences: Set & Achieve Relevant Goals • Sufficient, stable project funding • staff are let go after projects end • Good project-level measures of success • researchers feel agencies do this in “review” of their proposals, at the front end of the work; thus, not an admin issue • Invests in future capabilities • hiring in new areas is done only after others retire

  21. Diffs: Set & Achieve Goals (continued) • Good planning & execution of projects • not seen as an admin responsibility, but needed if Clusters are to work • Institution-wide measures of success • widely variable across Colleges, and even across Departments within Colleges • Admin champions research • lip service is inexpensive • researchers believe it when they see the invested $

  22. Getting around to the Survey...

  23. Survey Development Challenges • Data collection: mistrust of “the system” • Providing everyone who wants information with enough of what they need • President, Provost • VP for Research • Researchers (the target population) • Collaborators • Length!

  24. Acknowledgements • Dr. Gretchen Jordan, Principal Member of Technical Staff, Sandia National Laboratories • Dr. Yuko Kurashina, Post-Doctoral Research Fellow, Center for Innovation, University of Maryland • U.S. DOE, Office of Science: Dr. Patricia Dehmer, Associate Director; Bill Valdez, SC-5 • Dr. Glenn D. Kuehn, Director, SCORE Program at NMSU • Vonnie Reinke, Program Coordinator, SCORE Program at NMSU • NIH Grant #S06 GM08136-30, to G. D. Kuehn

  25. Questions?

  26. Contact Info • Laura Haas, Program Evaluator, SCORE Program • Department of Chemistry & Biochemistry • Lhaas@nmsu.edu • Phone: (505) 646-3110 • Fax: (505) 646-6846 • Presented at the American Evaluation Association / Canadian Evaluation Society Joint Conference, Toronto, Canada, October 28, 2005
