
Assessment & Review of Graduate Programs - Doctoral
Duane K. Larick & Michael P. Carter, North Carolina State University



  1. Assessment & Review of Graduate Programs - Doctoral Duane K. Larick & Michael P. Carter North Carolina State University Council of Graduate Schools Pre-Meeting Workshop, December 2006

  2. Assessment and Review • Outline of Presentation • Why review/assess graduate programs • A review process incorporating periodic external reviews and continuous program assessment

  3. Marilyn J. Baker. Revised and updated by Margaret King, Duane Larick, and Michael Carter, NC State University

  4. Background Information About Our Audience • How many of you are responsible for graduate program review at your institutions? • How many of you have this as a new responsibility? • How many of you have recently (or are considering) changing your procedure?

  5. Why Review/Assess Graduate Programs? • The primary purpose should be to improve the quality of graduate education on our campuses • By creating a structured, scheduled opportunity for a program to be examined, program review provides a strategy for improvement that is well-reasoned, far-seeing, and as apolitical as possible

  6. Why Review/Assess Graduate Programs? • External Considerations • To help satisfy calls for accountability • Especially at the state level • Requirement for regional accreditation, licensure, etc.

  7. SACS Principles of Accreditation • Core requirement #5: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.”

  8. Why Review/Assess Graduate Programs? • Internal Considerations • Meet long-term (strategic) College & Institutional goals • Creation of new degree programs • Elimination of existing programs • Funding allocation/reallocation • Advances understanding of graduate education and the factors influencing it • Aids in identification of “common” programmatic needs

  9. Why Review/Assess Graduate Programs? • Internal Considerations • Creates an opportunity to focus on key issues impacting graduate education • Causes of retention/attrition among students and faculty • Meet short-term (tactical) objectives or targets at the program level • Documents achievements of faculty & students • Indicates the degree to which program outcomes have been achieved • Suggests areas for improvement • Helps chart new programmatic directions

  10. So The Questions We Need To Ask Ourselves Are • What are we currently doing? • Why are we currently doing it? • Is what we are currently doing accomplishing the external goals described above? • Is what we are currently doing accomplishing the internal goals described above? • Is there a better way?

  11. Graduate Program Review – A Two-Phase Process • Periodic formal review of graduate programs (external review) • Outcomes-based assessment (internal review that is a continuous and ongoing process)

  12. Key Features of Formal Reviews • Evaluative, not just descriptive • Forward-looking: focus on improvement of program, not just current status • Based on program’s academic strengths and weaknesses, not just ability to attract funding • Objective • Independent, stands on its own • Action-oriented: clear, concrete recommendations to be implemented

  13. Questions Answered by Formal Review • Is the program advancing the state of the discipline or profession? • Is its teaching and training of students effective? • Does it meet institutional goals? • Does it respond to the profession’s needs? • How is it assessed by experts in the field?

  14. Issues to be Resolved Before Beginning • Locus of control • Graduate-only or comprehensive program review • Counting—and paying—the costs • Master’s and doctoral programs • Coordination with accreditation reviews • Scheduling the reviews • Multidisciplinary and interdisciplinary programs

  15. Key Elements of a Successful Program Review • Clear, Consistent Guidelines • The purpose of graduate program review • The process to be followed • Guidelines for materials to be included in each phase • A generic agenda for the review • The use to which results will be put

  16. Key Elements of a Successful Program Review • Administrative Support • Departmental resources: time, funding, secretarial help, etc. • Central administrative support for larger review process • Adequate and accurate institutional data, consistent across programs

  17. Key Elements of a Successful Program Review • Program Self-Study • Engage the program faculty in a thoughtful evaluation of: • The program’s purpose(s) • The program’s effectiveness in achieving these purposes • The program’s overall quality • The faculty’s vision for the program

  18. Key Elements of a Successful Program Review • Surveys/Questionnaires • Surveys of current students, faculty, alumni, and employers • Factors to be considered: • Time and expense to develop, distribute, and collect responses • Likely response rate • Additional burden on respondents • Uniqueness of information to be gained
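Not part of the original slides: a minimal Python sketch of the back-of-the-envelope yield estimate that the "likely response rate" factor above calls for. Both the contact count and the rate are invented for illustration.

# Hypothetical yield estimate for a planned alumni survey.
# The contact count and response rate are invented for illustration.
alumni_contacted = 120
expected_response_rate = 0.25  # assumed; alumni surveys often see modest return
usable_responses = int(alumni_contacted * expected_response_rate)
print(f"Expected usable responses: {usable_responses}")  # -> 30

If the expected yield is too small to support conclusions, the time and expense of developing and distributing the survey may not be justified.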

  19. Key Elements of a Successful Program Review • Student Participation • Complete confidential questionnaires • Provide input into self-study • Be interviewed collectively and individually by review team • Serve on review teams and standing committees

  20. Key Elements of a Successful Program Review • Review Committee • On-Campus Representation • A representative of the Graduate School • Internal reviewer from a field that gives him/her some understanding of the program(s) being reviewed • External Reviewer(s) • Number of reviewers depends on scope and kind of review • Selection process can vary – programs can have input but should not make the final decision

  21. Key Elements of a Successful Program Review • Final Report by Review Team • Brief overview of program • Strengths of program • Areas for improvement • Recommendations for improvement

  22. Key Elements of a Successful Program Review • Program Faculty’s Response to Report • Clear up errors or misunderstandings • Respond to the recommendations (have implemented, will implement, will consider implementing, cannot implement and why)

  23. Key Elements of a Successful Program Review • Implementation • One or more meetings of key administrators (department, college, graduate school, and university) to discuss recommendations • An action plan or memorandum of understanding drawn up and agreed on by all participants • Discussion of the recommendations with program faculty for implementation • Integration of the action plan into the institution’s long-range planning and budget process

  24. Key Elements of a Successful Program Review • Follow Up • An initial report on progress toward implementation of action plan (1 or 2 years out) • Follow-up reports until action plan is implemented or priorities change • Discussion of recommendations and implementation in self-study for next review

  25. Questions about External Program Review?

  26. What is Outcomes-Based Assessment? • It is a process that engages program faculty in asking three questions about their programs • What are our expectations for the program? • To what extent is our program meeting our expectations? • How can we improve our program to better meet our expectations? • It is a process that provides program faculty the means to answer these questions • By creating objectives and outcomes for their program • By gathering and analyzing data to determine how well the program is meeting the objectives and outcomes • By applying the results of their assessment toward improving their program

  27. What is Outcomes-Based Assessment? continued • It entails a shift in emphasis from inputs to outcomes • It is continuous rather than periodic • It involves regular reports of program assessment to the institution • Its results are used by the program and institution for gauging improvement and for planning

  28. What is Outcomes-Based Assessment? continued • Faculty generate program objectives and outcomes • Faculty decide how outcomes will be assessed • Faculty assess outcomes • Faculty use assessment findings to identify ways of improving their programs

  29. Benefits of Outcomes Assessment • It provides the groundwork for increased responsiveness and agility in meeting program needs • It gives faculty a greater sense of ownership of their programs • It provides stakeholders a clearer picture of the expectations of programs • It helps institutions meet accreditation requirements

  30. SACS Criterion for Accreditation Section 3 – Comprehensive Standards - #16 “The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”

  31. Drive Toward Greater Accountability on Our Campus • Professional accreditation agencies (e.g., engineering, social work, business) • Undergraduate assessment • Assessment of general education

  32. Outcomes Assessment: A Process • Phase I: Identifying Objectives and Outcomes • Phase II: Creating Assessment Plans • Phase III: Implementing Assessment Plans • Phase IV: Reporting Assessment Results

  33. A Procedure for Implementing Outcomes Assessment • Identify pilot programs to create assessment materials for each phase • Use pilot materials as a basis for workshops for Directors of Graduate Programs (DGPs) for each phase • Offer individual support to DGPs as they create materials and assess programs • Create online tools to aid DGPs

  34. Phase I: Identifying Objectives and Outcomes

  35. What Are Objectives? Program objectives are the general goals that define what it means to be an effective program.

  36. Three Common Objectives • Developing students as successful professionals in the field • Developing students as effective researchers in the field • Maintaining/enhancing the overall quality of the program

  37. What Are Outcomes? Program outcomes are specific faculty expectations for each objective that define what the program needs to achieve in order to meet the objectives.

  38. Example for Objective 1: Professional Development 1. To enable students to develop as successful professionals for highly competitive positions in industry, government, and academic departments, the program aims to provide a variety of experiences that help students to: a. achieve the highest level of expertise in XXXX, mastery of the knowledge in their fields and the ability to apply associated technologies to novel and emerging problems b. present research to local, regional, national, and international audiences through publications in professional journals and conference papers given in a range of venues, from graduate seminars to professional meetings c. participate in professional organizations, becoming members and attending meetings d. broaden their professional foundations through activities such as teaching, internships, fellowships, and grant applications

  39. Example for Objective 2: Effective Researchers 2. To prepare students to conduct research effectively in XXXX in a collaborative environment, the program aims to offer a variety of educational experiences that are designed to develop in students the ability to: a. read and review the literature in an area of study in a way that reveals a comprehensive understanding of the literature b. identify research questions/problems that are pertinent to a field of study and provide a focus for making a significant contribution to the field c. gather, organize, analyze, and report data using a conceptual framework appropriate to the research question and the field of study d. interpret research results in a way that adds to the understanding of the field of study and relates the findings to teaching and learning in science Etc.

  40. Example for Objective 3: Quality of Program 3. To maintain and improve the program’s leadership position nationally and internationally, the program aims to: a. continue to be nationally competitive by attracting high-quality students b. provide effective mentoring that encourages students to graduate in a timely manner c. place graduates in positions in industry and academia d. maintain a nationally recognized faculty that is large enough and appropriately distributed across XXXX disciplines to offer students a wide range of fields of expertise
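Not in the original slides: a minimal Python sketch of one way a program might record an objective and its outcomes in machine-readable form, so assessment results can later be tracked against them. The structure and field names are illustrative assumptions; the wording is abridged from the example slides above.

# Hypothetical structure for one objective and its outcomes.
# Field names are assumptions; text is abridged from the example slides.
objective_1 = {
    "objective": "Develop students as successful professionals in the field",
    "outcomes": [
        "Master knowledge in the field and apply it to novel problems",
        "Present research through journal publications and conference papers",
        "Participate in professional organizations",
        "Broaden professional foundations (teaching, internships, grants)",
    ],
}

for outcome in objective_1["outcomes"]:
    print(f"- {outcome}")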

  41. Phase II: Creating Assessment Plans

  42. Four Questions for Creating an Assessment Plan • What types of data should we gather for assessing outcomes? • What are the sources of the data? • How often are the data to be collected? • When do we analyze and report the data?
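Not part of the original slides: a minimal sketch of a single assessment-plan entry that answers the four questions for one outcome. Every field name and value here is a hypothetical illustration, not workshop material.

# Hypothetical assessment-plan entry answering the four questions above.
plan_entry = {
    "outcome": "Students present research to professional audiences",
    "data_types": ["conference papers", "journal publications"],  # what to gather
    "sources": ["student activity reports", "faculty records"],   # where it comes from
    "collection": "annually",                                     # how often collected
    "reporting": "end of each academic year",                     # when analyzed/reported
}

A full plan is then just a list of such entries, one per outcome, which makes gaps (an outcome with no data source, say) easy to spot.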

  43. Types of Data Used 1. Take advantage of what you are already doing • Preliminary exams • Proposals • Theses and dissertations • Defenses • Student progress reports • Student course evaluations • Faculty activity reports • Student exit interviews

  44. Types of Data Used 2. Use resources of the Graduate School and institutional analysis unit • Enrollment statistics • Time-to-degree statistics • Student exit data • Ten-year profile reports • Alumni surveys
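Not part of the original slides: a minimal Python sketch of how a time-to-degree statistic might be computed from institutional records. The record format is an invented assumption; real enrollment data would come from the Graduate School or the institutional analysis unit.

# Hypothetical time-to-degree computation over invented records.
from statistics import mean, median

records = [
    {"student": "A", "entry_year": 2000, "graduation_year": 2005},
    {"student": "B", "entry_year": 2001, "graduation_year": 2007},
    {"student": "C", "entry_year": 1999, "graduation_year": 2004},
]

years_to_degree = [r["graduation_year"] - r["entry_year"] for r in records]
print(f"Mean time to degree:   {mean(years_to_degree):.1f} years")
print(f"Median time to degree: {median(years_to_degree):.1f} years")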

  45. Types of Data Used 3. Use your imagination to find other types of data • Dollar amount of support for faculty • Student activity reports • Faculty surveys

  46. Data: Two Standards to Use in Identifying Data • Meaningful: Data should provide information that is suitable for assessing the outcome • Manageable: Data should be reasonable to attain (time, effort, ability, availability, resources)

  47. Four Questions for Creating an Assessment Plan • What data should we gather for assessing outcomes? • What are the sources of the data? • How often are the data to be collected? • When do we analyze and report the data?

  48. Sources of Data • Students • Faculty • Graduate School • Graduate Program Directors • Department Heads • Registration and Records • Advisory Boards • University Planning and Analysis

  49. Four Questions for Creating an Assessment Plan • What data should we gather for assessing outcomes? • What are the sources of the data? • How often are the data to be collected? • When do we analyze and report the data?

  50. Frequency of Data Collection • Every semester • Annually • Biennially • When available from individual graduate students • At the preliminary exam • At the defense • At graduation
