Assessing the Mission of Doctoral Research Universities



Presentation Transcript


  1. Assessing the Mission of Doctoral Research Universities J. Joseph Hoey, Georgia Tech Lorne Kuffel, College of William and Mary North Carolina State University Workshop October 30-31, 2003

  2. Guidelines for This Presentation • Please turn off or silence your cell phones • Please feel free to raise questions at any time during the presentation; we will also leave time at the end for general discussion • We are very interested in your participation

  3. Agenda • Introduction and Objectives • Reasons for Graduate Assessment • Comparative Data Sources • Developing Faculty Expectations for Graduate Students • Principles of Graduate Assessment • Physics Case Study • Taking Assessment Online • Summary and Discussion

  4. Objectives • Articulate motivations for undertaking graduate assessment • Increase awareness of comparative data sources • Identify program linkages for graduate assessment • Hands-on: develop faculty expectations for student competence; utilize diverse data sources to evaluate a graduate program’s first assessment efforts; etc.

  5. Why Assess Graduate Programs? • We are all interested in the quality and improvement of graduate education • To help satisfy calls for accountability • Accreditation requirements: SACS accreditation imperatives • “To change or improve an invisible system, one must first make it visible” – Schilling and Schilling, 1993, p. 172.

  6. Common Internal Reasons for Graduate Assessment • Program marketing • Meet short-term (tactical) objectives or targets • Meet long-term (strategic) institutional/departmental goals • Funded project evaluation (GAANN, IGERT) • Understand sources of retention/attrition among students and faculty

  7. SACS Principles of Accreditation • Core requirement #5: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.”

  8. SACS Principles of Accreditation • Section 3 – Comprehensive Standards: Institution Mission, Governance, And Institutional Effectiveness • “16. The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”

  9. SACS Principles of Accreditation • Section 3 – Comprehensive Standards: Standards for All Educational Programs • “12. The institution places primary responsibility for the content, quality, and effectiveness of its curriculum with the faculty” • “18. The institution ensures that its graduate instruction and resources foster independent learning, enabling the graduate to contribute to a profession or field of study.”

  10. SACS Accreditation • The intent of the SACS procedures is to stimulate institutions to create an environment of planned change for improving the educational process.

  11. Language • Much of the assessment literature employs a fair amount of industrial or business speak • Feel free to develop and use your own • Keep it consistent across the institution • Produce and maintain a glossary of terms

  12. So What Do We Need to Do? • Do our departments have a clear mission statement? • Do we have departmental plans to evaluate the effectiveness of our degree programs? • Do our degree programs have clearly defined faculty expectations for students? • Are they published and are they measurable or observable? • Do we obtain data to assess the achievement of faculty expectations for students? • Do we document that assessment results are used to change or sustain the excellence of program activities and further student gains in professional and attitudinal skills and experiences?

  13. So What Do We Need to Do? (Cont.) • Based on assessment results, do we reevaluate the appropriateness of departmental missions as well as the expectations we hold for student competence? • The amount of work needed to satisfy accreditation requirements is proportional to the number of ‘No’ answers to these questions.

  14. IE Chart

  15. Needed to Succeed • The department should want to do this process • The department must use the information collected • The institution must use the information collected • Use participation in the process as part of faculty reviews

  16. Focusing Efforts • It is important to achieve a strategic focus for the program: decide what knowledge, skills, abilities, and experiences should characterize students who graduate from our program…

  17. What is Important to Measure? • To decide this, it is first vital to ask: • What are our strong areas? • What are our limitations? • What do we want to accomplish in • Education of students? • Research? • Service?

  18. Purpose Statement (sample) The Anthropology Department serves the institution by offering courses and scholarly experiences that contribute to the liberal education of undergraduates and the scholarly accomplishments of graduate students. Program faculty members offer courses, seminars, directed readings, and directed research studies that promote social scientific understandings of human cultures. The Department offers a bachelor’s degree major and minor, an M.A. degree, and a Ph.D.

  19. Developing a Plan to Evaluate Degree Programs • How to start a departmental plan: top down or bottom up (Palomba and Palomba, 2001) • Top Down – As a group of scholars, decide what the important goals or objectives for the program are. • Bottom Up – Identify the primary faculty expectations for student competence in core courses in the program and use this list to develop overarching expectations for student competence.

  20. Develop an Assessment Plan • Desirable characteristics for assessment plans: (Palomba and Palomba, 1999) • Identify assessment procedures to address faculty expectations for student competence; • Use procedures such as sampling student work and drawing on institutional data where appropriate; • Include multiple measures; • Describe the people, committees, and processes involved; and • Contain plans for using assessment information.

  21. Words to Remember When Starting an Assessment Plan • It may be best to tackle the modest objectives first. • Assessment plans should recognize that students are active participants and share responsibility for their learning experience along with the faculty and administration. • It takes a long time to do assessment well. So be patient and be flexible. • The overriding goal is to improve educational programs, not to fill out reports or demonstrate accountability.

  22. Use a Program Profile to Get Started (Related to Operational Objectives)

  23. Data for Profiles • Admissions: Applications, acceptance rates, and yield rates • Standardized Test Scores • Graduate Record Examination (GRE) http://www.gre.org/edindex.html • Graduate Management Admission Test (GMAT) http://www.gmac.com/ • Law School Admission Test (LSAT) http://www.lsac.org/ • Undergraduate GPA • Headcount or Major Enrollments (Full/Part-Time) • Degrees Awarded
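
The admissions figures above reduce to simple funnel arithmetic: the acceptance rate is acceptances divided by applications, and the yield rate is matriculants divided by acceptances. A minimal sketch in Python (the function name and all numbers are hypothetical, for illustration only):

```python
def admissions_funnel(applications, acceptances, enrollments):
    """Return (acceptance_rate, yield_rate) for a program's admissions funnel.

    acceptance rate = acceptances / applications
    yield rate      = enrollments / acceptances
    """
    acceptance_rate = acceptances / applications
    yield_rate = enrollments / acceptances
    return acceptance_rate, yield_rate

# Hypothetical program: 200 applications, 50 offers, 20 matriculants
accept_rate, yield_rate = admissions_funnel(200, 50, 20)
print(f"acceptance rate: {accept_rate:.0%}, yield: {yield_rate:.0%}")
# acceptance rate: 25%, yield: 40%
```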

  24. Profiles (Cont.) • Formula Funding Elements when appropriate • Time-to-Degree and/or Graduation/Retention Rates • Support for Students (Type of Assistance) • Faculty Headcount (Full/Part, Tenure Status) • Faculty Salaries • Faculty Productivity or Workload Compliance • Research Proposals Submitted/Awarded • Research Award/Expenditure Dollars • Instructional and Research Facility Space
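
Time-to-degree can be summarized several ways; one common elapsed-time measure is the median of (degree year − entry year) over completed degrees. A minimal sketch, assuming per-student (entry_year, degree_year) pairs (the cohort data are hypothetical):

```python
from statistics import median

def median_time_to_degree(records):
    """Median elapsed years from program entry to degree conferral.

    `records` is a list of (entry_year, degree_year) pairs; only completed
    degrees are included, so this is elapsed (not registered) time.
    """
    return median(degree - entry for entry, degree in records)

cohort = [(1996, 2002), (1997, 2002), (1995, 2003)]  # hypothetical cohort
print(median_time_to_degree(cohort))  # 6
```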

  25. Comparative Data • Survey of Earned Doctorates (SED) • National Center for Educational Statistics (NCES) Institutional Postsecondary Educational Data System (IPEDS) • National Research Council (NRC) Reports • Higher Education Data Sharing Consortium (HEDS) Graduate Student Survey (GSS) • American Association of University Professors (AAUP) or College and University Professional Association (CUPA) Faculty Salary Surveys

  26. SED Data • Administered annually, with a very high response rate • Doctoral degrees awarded by broad field and subfield, by gender, racial/ethnic group, and citizenship • Institutional rankings by number of doctorates awarded (top 20) by broad field and by racial/ethnic group • Time-to-Degree (three measures) by broad field, gender, racial/ethnic group, and citizenship

  27. SED Data (Cont.) • Financial resources for student support by broad field, gender, racial/ethnic group, and citizenship • Postgraduate plans, employment, and location by broad field, gender, racial/ethnic group, and citizenship • Reports are available at http://www.norc.uchicago.edu/issues/docdata.htm

  28. IPEDS Data • Fall enrollments by major field of study (2-digit CIP code), race/ethnicity and citizenship, gender, attendance status (full/part-time), and level of student (undergraduate, graduate, and first professional); the discipline-field data are reported in even years only • Annual degrees conferred by program (6-digit CIP code) or major discipline (2-digit CIP code), award level (associate, baccalaureate, master’s, doctoral, and first professional), race/ethnicity and citizenship, and gender; reported annually

  29. IPEDS Data (Cont.) • Useful for identifying peer institutions • Available at the IPEDS Peer Analysis System http://nces.ed.gov/Ipeds/ • These data are also published in the National Center for Education Statistics (NCES), Digest of Education Statistics

  30. National Research Council Research-Doctorate Programs in the United States • This information is dated (1982 and 1993), with a new study scheduled for 2004 (?) • Its benefit is rankings of programs, but some critics suggest that “reputational rankings cannot accurately reflect the quality of graduate programs” (Graham & Diamond, 1999) • The National Survey of Graduate Faculty rates: • Scholarly quality of program faculty • Effectiveness of the program in educating research scholars/scientists • Change in program quality in the last five years

  31. Profile Comparison for History and Physics – NRC Ranking • History department ranked 46.5 • Physics department ranked 63 (Goldberger, Maher, and Flattau, 1995)

  32. Profile Comparison for History and Physics - Faculty

  33. Profile Comparison for History and Physics - Admissions

  34. Profile Comparison for History and Physics - Students

  35. Profile Comparison for History and Physics - Productivity

  36. Describing Faculty Expectations for Students

  37. Why Describe Faculty Expectations for Students? • To sustain program excellence and productivity • To give faculty feedback and the ability to make modifications based on measurable indicators, not anecdotes • To inform and motivate students • To meet external standards for accountability

  38. What Are Our Real Expectations? Read each question thoroughly. Answer all questions. Time limit: four hours. Begin immediately. • MUSIC: Write a piano concerto. Orchestrate it and perform it with flute and drum. You will find a piano under your seat. • MATHEMATICS: Give today's date, in metric. • CHEMISTRY: Transform lead into gold. You will find a beaker and three lead sinkers under your seat. Show all work, including Feynman diagrams and quantum functions for all steps. • ECONOMICS: Develop a realistic plan for refinancing the national debt. Run for Congress. Build a political power base. Successfully pass your plan and implement it.

  39. Steps to Describing Expectations - 1 • Write down the result or desired end state as it relates to the program. • Jot down, in words and phrases, the performances that, if achieved, would cause us to agree that the expectation has been met. • Phrase these in terms of results achieved rather than activities undertaken.

  40. Steps to Describing Expectations - 2 • Sort out the words and phrases. Delete duplications and unwanted items. • Repeat first two steps for any remaining abstractions (unobservable results) considered important. • Write a complete statement for each performance, describing the nature, quality, or amount we consider acceptable. • Consider the point in the program where it would make the most sense for students to demonstrate this performance.

  41. Steps to Describing Expectations - 3 • Again, remember to distinguish results from activities. • Test the statements by asking: If someone achieved or demonstrated each of these performances, would we be willing to say the student has met the expectation? • When we can answer yes, the analysis is finished.

  42. Steps to Describing Expectations - 4 • Decide how to measure the meeting of an expectation: can we measure it directly? Indirectly through indicators? • In general, the more direct the measurement, the more content valid it is. • For more complex, higher order expectations: may need to use indicators of an unobservable result.

  43. Steps to Describing Expectations - 5 • Decide upon a preferred measurement tool or student task. • Describe the expectation in terms that measure student competence and yield useful feedback.

  44. Try it! • Which faculty expectation? Our sample is this: Graduates will be lifelong learners • Decide: Under what condition? When and where will students demonstrate skills? • Decide: How well? What will we use as criteria?

  45. Try it! • Under what condition? • Condition: Students will give evidence of having the ability and the propensity to engage in lifelong learning prior to graduation from the program.

  46. Try it! • How well? Specify performance criteria for the extent to which students: • Display a knowledge of current disciplinary professional journals and can critique them • Are able to access sources of disciplinary knowledge • Seek opportunities to engage in further professional development activities • Other?

  47. Principles of Graduate Assessment • Clearly differentiate master’s and doctoral level expectations • Assessment must be responsive to the more individualized nature of graduate programs • Assessment of actual student work is preferable • Students already create the products we can use for assessment!

  48. Principles of Graduate Assessment (continued) • Use assessment both as a self-reflection tool and an evaluative tool • Build in feedback to the student and checkpoints • Use natural points of contact with administrative processes

  49. Common Faculty Expectations at the Graduate Level • Students will demonstrate professional and attitudinal skills, including: • Oral, written and mathematical communication skills; • Knowledge of concepts in the discipline; • Critical and reflective thinking skills; • Knowledge of the social, cultural, and economic contexts of the discipline; • Ability to apply theory to professional practice; • Ability to conduct independent research;

  50. Common Faculty Expectations at the Graduate Level (continued) • Students will demonstrate professional and attitudinal skills, including: • Ability to use appropriate technologies; • Ability to work with others, especially in teams; • Ability to teach others; and • Demonstration of professional attitudes and values such as workplace ethics and lifelong learning.
