
AY 2013 LBRT Instructional Program Review Process




Presentation Transcript


  1. AY 2013 LBRT Instructional Program Review Process Data Description Process – Timeline Rev. 10-18-13

  2. 55 Data points to choose from… Lockheed SR71

  3. Covered in today’s session… • Quick review of Program and Unit Review website • Navigating the Annual Reports from Program Data (ARPD) website • Overview of our local program and unit review process on campus • What are we doing to improve our process? • What is different this year? • Terminology used in the review • Data definitions • Timeline • Assessment with James Kiley • New Annual Review and Budget Process with Dean of CTE Joyce Hamasaki • New CERC/Comprehensive Review Process with VCAA Joni Onishi

  4. Purpose • The purpose of this presentation is to describe the process we follow for our local Comprehensive Program Reviews and the system-required Annual Program Reviews, and to provide definitions of the data used in the reviews. • We have been asked to produce an annual program review for each of our instructional programs and non-instructional units. These reviews are required of every community college in the system and will be taken to the U of H Board of Regents for their review. • You will need to complete a comprehensive review this year if you are scheduled for one. Every program and unit is required to complete a comprehensive review once within a five-year period. • Click on the link below to find out if you are scheduled for a comprehensive review this year. Comprehensive Program-Unit Review Cycle and Schedule

  5. Reason for Program Review • The review of a program should be an ongoing, year-round, reflective process. • Program review processes assure educational quality and help programs evaluate and improve their services. • Program review is an opportunity for self-study, self-renewal, and identifying where improvement is needed. • Your program review may be one of the few opportunities you have to showcase the accomplishments of your program. Take this opportunity to shine. • A robust program review process is one mechanism available to the college to improve student success.

  6. What are we doing to improve our program review process? At the conclusion of every program/unit review cycle, the IR Office takes extra care to ensure that we are improving our program/unit review process on campus. This is accomplished by sending out questionnaires specific to each group, then meeting with those groups across campus and collecting their feedback. The feedback is reviewed and an action plan is put into place, which is used in the planning phase for our next review cycle. Your suggestions, and the actions taken to improve this process, are then published to the Program Review website and have been linked here for your convenience. 2012 Program-Unit Process Improvement Summary To drive accountability in the organization, the Vice Chancellor for Academic Affairs will ensure that all of the work we commit to as part of this process is completed every year. Based on the feedback we received last year from our program-unit review process improvement focus groups, the following changes have been made and incorporated into the planning of this year’s review:

  7. What else are we doing to improve our program review process? To ensure the best possible communication and attendance for our annual training, the IR Office developed the Program-Unit Review Campus Communication Plan. We found that information about our program and unit review process on campus was not always reaching the people who need it, that the availability of faculty, lecturers, and staff was not taken into consideration when scheduling meetings and training, and that email alone was not sufficient to reach everyone on campus. The new communication plan includes managing scheduling requests for meetings, using the campus-wide email distribution to include faculty, lecturers, and staff, and printing a hardcopy reminder to be placed in division/department mailboxes and posted in break areas. Additionally, the Vice Chancellor for Academic Affairs will work with the Admin Team and the DCs to ensure that we are adequately communicating program and unit review activities across the campus.

  8. What else are we doing to improve our program review process? • Some issues came up last year from folks who were unable to find the assessments they needed in order to write their reviews. It turns out that the assessments had all been moved from the assessment website to the Intranet, but instructions were never sent out to communicate the change. All assessments can now be found in the assessment folder on the Intranet. Click here to log in and view your assessments: Intranet Assessments • Your suggestions for improving our program review process at the system level were taken to the Instructional Program Review Committee (UHCC IPRC) by our UHCC IPRC representatives on campus. Results of that meeting are published here: 2012 UHCC IPRC Responses A special “Mahalo” goes out to our local UHCC IPRC representatives for helping us improve this important process!

  9. What else are we doing to improve our program review process? • The Academic Support Unit Review online tool and glossaries have been greatly improved since last year, based in part on the feedback that was provided. • A step was added to our Comprehensive Program/Unit Review Process document to ensure that VCs and Directors update the schedule for both comprehensive reviews and annual reviews on campus every year. This work is to be completed in June, before we begin the new cycle. • Another step was added to invite the IR Office to the first scheduled Admin meeting in September to communicate changes. This is the improved process we are using this year: Comprehensive Program-Unit Review Process

  10. What is different this year? Remember: Joni will find someone to copy your template information into the online ARPD tool. Just fill out the template you are required to submit this year! The following data elements are all new this year: • Number of Majors Native Hawaiian • Fall Full-Time • Fall Part-Time • Fall Part-Time who are Full-Time in System • Spring Full-Time • Spring Part-Time • Spring Part-Time who are Full-Time in System • Persistence Fall to Fall • Performance Funding: • Number of Degrees and Certificates • Number of Degrees and Certificates Native Hawaiian • Number of Degrees and Certificates STEM • Number of PELL Recipients • Number of Transfers to UH 4-yr

  11. Navigating the ARPD Web Submission Tool • If you attended the training session in person today, you can skip the next few slides (go to slide 20), which deal with navigating the ARPD site. I’ve included the slides here for those who were not able to join us today. • The Annual Reports of Program Data Web Submission Tool (ARPD for short) is a repository for annual program and unit reviews for Instruction, the Academic Support Unit, and Student Support Services. Much of the data you will need to complete your review has been provided by the Office of the Vice President for Community Colleges. • The ARPD is a home-grown tool, developed in-house specifically to meet the needs of the community college system.

  12. Navigating the ARPD Web Submission Tool cont… • Your program’s data table is available online within the tool by August 15th every year. You should have everything you need to begin writing your review within the web submission tool. • Plan to save your work often, especially when switching between screens, and plan to do most of your formatting within the tool if you are copying and pasting in from Word. Also plan to spell-check and save your review in Word before starting. • One of the nicest features of the ARPD is that you can go back and look at reviews from previous years. You can also strengthen your own review by looking at similar reviews from other institutions.

  13. Navigating the ARPD Web Submission Tool continued… UHCC Annual Report of Program Data Web Submission Tool • Begin by clicking on the link above. To enter information for your review, click on the button in the lower part of the screen labeled “2013 Instructional Submission”. You will be asked to log in with your UH username and password in order to reach the web submission site. The default will take you to the “Status” tab, where you can see who last modified the information in your review. • Clicking on the “Users” tab will take you to a screen that shows all of the people who have permission to update your program. For each program there is a list of people with specific roles: Program Coordinator, Div./Dept. Chair, and Dean. The VCAA determines who can update your program. If there is someone you’d like to add to the list, contact the VCAA with your request. • You can enter the information about your program in any order you wish, but moving from left to right across the tabs at the top of the screen follows the same logical path we used when we filled out the templates in the past. Start by clicking on the “Analysis” tab. • On the Analysis screen you can either preview what has already been entered by clicking on the “Preview” button, or go to the edit screen and begin entering or updating information.

  14. Navigating the ARPD Web Submission Tool continued… • On the Analysis screen, begin editing by going to the bottom of the page, under the data sheet. Click the edit button and scroll down. There are 3 sections for you to edit here: Analysis of your Program, Action Plan, and Resource Implications. Simply click on the “Edit” link and you will be taken to a screen very similar to MS Word, where you can begin typing in your analysis. (Note the Save button in the left-hand corner!) • In the Action Plan section you will need to include your action plans for any of your Perkins Core Indicators where the program did not meet the goal. Check your data sheet for this. • Also very important this year: if you are requesting additional people, services, or equipment for your program, you will need to make the justification in the “Resource Implications” section. Resource requests for your program are no longer part of the comprehensive review as they have been in the past. • Now click on the “Description” tab and input the year and web address of your last comprehensive review. You should be able to copy and paste the link right off the program-unit review website for the year of your last review. Finish this tab off by typing in a brief description of your program and its mission. • Now, click on the “P-SLOs” tab and go to the next slide…

  15. What to enter on the “P-SLO” tab in ARPD Web Submission Tool • Indicate which PLOs were assessed during the reporting year’s assessment(s). • Evidence of Industry Validation Provide documentation that the program has submitted evidence and achieved certification or accreditation from an organization granting certification in an industry or profession.  If the program/degree/certificate does not have a certifying body, the recommendations for, approval of, and/or participation in, assessment by the program’s advisory committee/board can be submitted. • Expected Level of Achievement Describe the different levels of achievement for each characteristic of the learning outcome(s) that were assessed.  What represented “excellent,” “good,” “fair,” or “poor” performance using a defined rubric and what percentages were set as goals for student success (for example: “85% of students will achieve good or excellent in the assessed activity.”) • Courses Assessed List the courses assessed during the reporting period. • Assessment Strategy/Instrument Describe what, why, where, when, and from whom assessment artifacts were collected. • Results of Program Assessment The % of students who met the outcome(s) and at what level they met the outcome(s). • Other Comments Include any information that will clarify the assessment process report. • Next Steps Describe what the program will do to improve the results.  "Next Steps" can include revision to syllabi, curriculum, teaching methods, student support, and other options.

  16. What to enter on the “Cost Per SSH” tab in ARPD Web Submission Tool • There are 4 different values that need to be entered on the Cost Per SSH screen in ARPD. All of the fund amounts that are entered are used in the calculation of the cost per SSH for your program. The costs that need to be entered are: • General Funds • Federal Funds • Other Funds • Tuition and Fees • The definitions of what comprises each of these funds are laid out in detail in the data definitions section of this presentation, so they will not be duplicated here. • Joni will be uploading all of the budget information you need this year. Once the values are loaded and saved in the tool, the values for “Total Funds” and “Cost per SSH” will auto-calculate into your datasheet.
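To make the auto-calculation concrete, here is a minimal Python sketch of the arithmetic described above. The function name and the dollar amounts are made up for illustration and are not taken from the ARPD tool itself.

```python
# Minimal sketch of the "Cost per SSH" auto-calculation described above.
# Function name and sample figures are hypothetical; the tool performs this
# arithmetic automatically once the four fund values are saved.

def cost_per_ssh(general_funds, federal_funds, other_funds, tuition_and_fees, total_ssh):
    total_funds = general_funds + federal_funds + other_funds + tuition_and_fees
    return total_funds, round(total_funds / total_ssh, 2)

total, per_ssh = cost_per_ssh(
    general_funds=850_000,
    federal_funds=40_000,
    other_funds=10_000,
    tuition_and_fees=300_000,
    total_ssh=9_500,      # SSH in all program classes (#5)
)
print(total, per_ssh)  # 1200000 126.32
```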

  17. What to enter on the “External” tab in ARPD Web Submission Tool • This slide only applies to the Nursing program; it is included here for additional information. • The “External” screen is intended for programs that utilize external licensures. Currently, the only program at HawCC with external licensures is the Nursing program. • If you are not in the Nursing program you can skip this tab/screen altogether, because the first question is, “Does this program utilize external licensures?” and the radio button defaults to “No”. (Mighty thoughtful programming, I’d say.) • If you are in the Nursing program, please click the “Yes” radio button to answer the question and then enter the Number Sitting for Exam and the Number Passed. Do this for each program where you utilize external licensures. The percent passed will auto-calculate into your datasheet when you click the “Save External Data” button. • The Vice President for Community Colleges’ office collects this data as part of our annual Graduate-Leaver reporting, as well as for the UH System “Measuring Our Progress” report.

  18. What to enter on the “Capacity” tab in ARPD Web Submission Tool • The “Capacity” screen is only intended for programs that have an externally mandated capacity. This slide is included here for additional information. • The following criterion is used as an alternate measure for the Student/Faculty Ratio measure within the Efficiency health call: • “If your program has an externally mandated (e.g. professional accreditation or licensing) capacity of less than 16 students per faculty, the program may be eligible for the alternative efficiency health call calculation.” • If your program fits the criterion listed above, AND your Efficiency health call is other than Healthy, AND your Efficiency health call can be improved by using the alternate method, please contact: Cheryl Chappell-Long, Director, Academic Planning, Assessment, and Policy Analysis, Phone: 808-956-4561, Email: cchappel@hawaii.edu

  19. “Help” tab in ARPD Web Submission Tool • The “Help” screen is a very useful resource for some of the documentation needed to support your program within the review process. It contains all of the glossaries and health call scoring rubrics for each year we have used the online tool.

  20. Process for Completing the Liberal Arts Program Review COMPREHENSIVE REVIEWS If you are on the schedule to complete a comprehensive review this year, follow this simple 2-step process: Step 1: Complete your review in the ARPD online submission tool. Step 2: Move to the Comprehensive Review Process for further instructions. ANNUAL REVIEWS If you are NOT on the schedule to complete a comprehensive review this year, follow this simple 2-step process: Step 1: Complete your review in the ARPD online submission tool. Step 2: Move to the Annual Review & Budget Process for further instructions. The comprehensive review process, the annual review and budget process, and their associated templates are brought to you this year by the VCAA and the Interim Dean of CTE Programs. The templates and instructions should be part of their presentation today.

  21. Process Owners • The graphic on this slide depicts who is responsible for each part of the program review process, from ARPD through CERC, for both the Annual Review and the Comprehensive Review. For more information please use the contact list below. Contact List: Assessment = James Kiley 934-2649; IR = Shawn Flood 934-2648; Dean CTE = Joyce Hamasaki 934-2522; VCAA = Joni Onishi 934-2514

  22. Terminology / Timing • The Census freeze event is the fifth Friday after the first day of instruction. • The End of Semester (EOS) freeze event is 10 weeks after the last day of instruction. • Degrees are conferred in the “Fiscal Year”. The fiscal year value represents the ending year of that fiscal year. For example, a FISCAL_YR_IRO value of 2013 indicates fiscal year 2012-2013 (July 1, 2012 to June 30, 2013), which includes the Summer 2012, Fall 2012, and Spring 2013 semesters. • “Home Institution” is the campus where the student was admitted. • A “Program Year” listed on your data sheet as “12-13” represents your program’s data for the Summer 2012, Fall 2012, and Spring 2013 semesters. • Student information data this year comes exclusively from the Operational Data Store (ODS). Organizationally this means that all community colleges are getting the data from the same place and at the same snapshot in time (this is a good thing).
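As a quick illustration of the fiscal-year convention above (a sketch only, not IR production code), the mapping from a term to its FISCAL_YR_IRO value looks like this:

```python
# Illustrative mapping of a term to its FISCAL_YR_IRO value: the fiscal year is
# named for its ending year, so Summer 2012, Fall 2012, and Spring 2013 all fall
# in fiscal year 2013 (July 1, 2012 to June 30, 2013).

def fiscal_year_iro(term, calendar_year):
    return calendar_year + 1 if term in ("Summer", "Fall") else calendar_year

print(fiscal_year_iro("Summer", 2012))  # 2013
print(fiscal_year_iro("Fall", 2012))    # 2013
print(fiscal_year_iro("Spring", 2013))  # 2013
```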

  23. #1 Number of Majors • Count of program majors who are home‐institution at your college. • Count excludes students that have completely withdrawn from the semester at CENSUS. • This is an annual number. Programs receive a count of .5 for each term within the academic year that the student is a major. A maximum count of 1.0 (one) for each student. #1a Number of Majors Native Hawaiian Count of program majors who are Native Hawaiian and home‐institution at your college. Count excludes students that have completely withdrawn from the semester at CENSUS. This is an annual number. Programs receive a count of .5 for each term (fall and spring) within the academic year that the Native Hawaiian student is a major. A maximum count of 1.0 (one) for each student.
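If it helps to see the 0.5-per-term counting in code, here is a small illustrative Python sketch; the data structure is an assumption made for the example, not the system office's actual query.

```python
# Each student contributes 0.5 for each term (Fall, Spring) in which they are a
# program major, capped at 1.0 per student per academic year.

def annual_major_count(students):
    """students: list of dicts with boolean 'fall_major' and 'spring_major' flags."""
    return sum(min(0.5 * s["fall_major"] + 0.5 * s["spring_major"], 1.0) for s in students)

roster = [
    {"fall_major": True,  "spring_major": True},   # counts 1.0
    {"fall_major": True,  "spring_major": False},  # counts 0.5
    {"fall_major": False, "spring_major": True},   # counts 0.5
]
print(annual_major_count(roster))  # 2.0
```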

  24. #1b Fall Full-Time • Percentage of majors (#1) enrolled in 12 or more credits in the college in the reporting Fall semester. #1c Fall Part-Time Percentage of majors (#1) enrolled in less than 12 credits in the college in the reporting Fall semester. #1d Fall Part-Time who are Full-Time in System Percentage of majors in #1c (enrolled in less than 12 credits in the reporting Fall semester at the institution) who are enrolled in credits at other UH institutions such that their total number of credits enrolled in the UH System is equal to or greater than 12.

  25. #1e Spring Full-Time Percentage of majors enrolled in 12 or more credits at the institution in the reporting Spring semester. #1f Spring Part-Time Percentage of majors enrolled in less than 12 credits at the institution in the reporting Spring semester. #1g Spring Part-Time who are Full-Time in System Percentage of majors in #1f (enrolled in less than 12 credits in the reporting Spring semester at the institution) who are enrolled in credits at other UH institutions such that their total number of credits enrolled in the UH System is equal to or greater than 12.
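Putting #1b through #1g together, the classification logic is roughly as follows. This is a simplified sketch with placeholder inputs, not the official extract.

```python
# A major with 12+ credits at the home college is full-time; a part-time major
# whose combined UH System credits reach 12 or more is counted as
# "part-time, but full-time in system".

def enrollment_status(credits_at_college, credits_in_uh_system):
    if credits_at_college >= 12:
        return "full-time"
    if credits_in_uh_system >= 12:
        return "part-time, full-time in system"
    return "part-time"

print(enrollment_status(15, 15))  # full-time
print(enrollment_status(9, 13))   # part-time, full-time in system
print(enrollment_status(6, 6))    # part-time
```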

  26. #2 Percent Change Majors from Prior Year • In alignment with UHCC Strategic Planning Goals, General and Pre‐Professional Education programs are expected to grow by 3% per year. • Difference between the number of majors (#1) in the current year from the prior year, divided by the number of majors in the prior year. • For example, last program year (1112) there were 1306 majors, this year (1213) there are 1390 majors in the program. • This methodology works for HawCC in the current reporting year as ((1390-1306)/1306)*100 = about 6%
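The same calculation, scripted for convenience; this adds nothing beyond the formula already worked through above.

```python
# Percent change in majors: (current - prior) / prior * 100.

def pct_change_majors(current_majors, prior_majors):
    return (current_majors - prior_majors) / prior_majors * 100

print(round(pct_change_majors(1390, 1306), 1))  # 6.4, i.e. "about 6%"
```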

  27. #3 SSH Program majors in Program Classes • The sum of Fall and Spring Student Semester Hours (SSH) taken by program majors in courses linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. • Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series) as there is a resource cost to the program. • Not sure what your program classes are? Click here to find out: Courses Taught Aligned to Instructional Programs

  28. #4 SSH Non-Majors in Program Classes • The sum of Fall and Spring SSH taken by non‐program majors (not counted in #3) in courses linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. • Excludes Directed Studies (99 series). • Differs from MAPS as UHCC data includes Cooperative Education (93 series) as there is a resource cost to the program.

  29. #5 SSH in All Program Classes • The sum of Fall and Spring SSH taken by all students in classes linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. • Excludes Directed Studies (99 series). • Differs from MAPS as UHCC data includes Cooperative Education (93 series) as there is a resource cost to the program.

  30. #6 FTE Enrollment in Program Classes • Sum of Student Semester Hours (SSH) taken by all students in classes linked to the program (#5) divided by 30. Undergraduate, lower division Full Time Equivalent (FTE) is calculated as 15 credits per term. • Captured at Census and excludes students who have already withdrawn (W) at this point.
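A one-line restatement of the FTE arithmetic, for anyone who wants to check a figure; the SSH value below is illustrative only.

```python
# Annual FTE enrollment = total SSH in program classes (#5) divided by 30
# (15 credits per term over Fall and Spring).

def fte_enrollment(total_ssh):
    return total_ssh / 30

print(round(fte_enrollment(9_500), 1))  # 316.7
```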

  31. #7 Total Number of Classes Taught • Total number of classes taught in Fall and Spring that are linked to the program. • Concurrent and cross-listed classes are only counted once, for the primary class. • Excludes Directed Studies (99 series). • Differs from MAPS as UHCC data includes Cooperative Education (93 series) as there is a resource cost to the program.

  32. Liberal Arts Program Scoring Rubric Definitions Your program health is determined by 3 separate types of measures: Demand, Efficiency, and Effectiveness. This slide explains why these measures were chosen to determine program health. • Demand: A seeking or state of being sought after; i.e., your program’s ability to attract new students every year based on your offerings. • Efficiency: Acting or producing effectively with a minimum of waste, expense, or unnecessary effort; i.e., your program’s ability to use its resources in the best possible way. • Effectiveness: Stresses the actual production of, or the power to produce, an effect; i.e., your program’s ability to produce the desired result.

  33. Determination of Program’s Health • This year the system office will calculate and report health calls for all instructional programs using program year 2013 data. • If you are interested in how the Liberal Arts Health Calls were determined, refer to the rubric link available on the ARPD website.

  34. #8 Average Class Size • Total number of students actively registered in Fall and Spring program classes divided by classes taught (#7). Does not include students who have already withdrawn from the class by Census. • Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series) as there is a resource cost to the program. #9 Fill Rate Total active student registrations in program classes (number of seats filled) at Fall and Spring census divided by the maximum enrollment (number of seats offered). Captured at Census and excludes students who have already withdrawn (W) at this point.
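Here is a small sketch of #8 and #9 using made-up census registrations. The data structure is hypothetical, but the arithmetic follows the definitions above.

```python
# Average class size = active registrations / classes taught (#7);
# fill rate = seats filled / seats offered, both at census.

classes = [
    {"registered": 22, "max_enrollment": 25},
    {"registered": 18, "max_enrollment": 25},
    {"registered": 8,  "max_enrollment": 20},
]

avg_class_size = sum(c["registered"] for c in classes) / len(classes)
fill_rate = sum(c["registered"] for c in classes) / sum(c["max_enrollment"] for c in classes)

print(round(avg_class_size, 1))  # 16.0
print(f"{fill_rate:.0%}")        # 69%
```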

  35. #10 FTE BOR Appointed Faculty • Sum of appointments (1.0, 0.5, etc.) of all BOR-appointed program faculty (excludes lecturers and other non-BOR appointees). • Uses the “hiring status” of the faculty member, not the teaching/work load. • Uses the Employing Agency Code (EAC) recorded in the Human Resources (HR) database to determine the faculty member’s program home. • Data provided by the UH Human Resources Office as of 10/31/2012. • Faculty teaching solely Remedial and/or Developmental Reading, Writing, or Mathematics are excluded from the counts in Liberal Arts and included in the counts for the Remedial and/or Developmental programs. Faculty with “split” assignments among Remedial/Developmental classes and transfer-level classes are reflected proportionally. • If your FTE BOR faculty count is off for your program, contact the VCAA. • Click here for the count of BOR Appointed Program Faculty in your program: 2013 BOR Appointed Program Faculty

  36. #11 Majors to FTE BOR Appointed Faculty • Number of majors (#1) divided by sum appointments (#10) (1.0, 0.5, etc.) of all BOR appointed program faculty. • Data shows the number of student majors in the program for each faculty member (25 majors to 1 faculty shown as “25”)

  37. #12 Majors to Analytic FTE Faculty • Number of majors (#1) divided by number of Analytic FTE faculty (#12a). #12a Analytical FTE Faculty (Workload) Calculated by sum of Semester Hours (not Student Semester Hours) taught in program classes divided by 27. Analytic FTE is useful as a comparison to FTE of BOR appointed faculty (#10). Used for analysis of program offerings covered by lecturers.
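As a rough sketch of the arithmetic for #12 and #12a (the numbers below are illustrative, not program data):

```python
# Analytic FTE = semester hours taught in program classes / 27;
# Majors to Analytic FTE = number of majors (#1) / analytic FTE (#12a).

def analytic_fte(semester_hours_taught):
    return semester_hours_taught / 27

def majors_to_analytic_fte(num_majors, semester_hours_taught):
    return num_majors / analytic_fte(semester_hours_taught)

print(analytic_fte(324))                            # 12.0
print(round(majors_to_analytic_fte(1390, 324), 1))  # 115.8
```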

  38. #13 Overall Program Budget Allocation • The overall program budget allocation = General Funded Budget Allocation (13a) + Special/Federal Budget Allocation (13b) + Tuition and Fees (13c) + Other Funds • The overall program budget allocation is automatically calculated when you enter your general funded budget allocation, special/federal budget allocation, other funds, and/or tuition and fees into the “Cost per SSH” tab of the online tool. Again, Joni will upload the data for you this year. • The overall program budget allocation is to be determined by the College using these guidelines from the VCAA/DOI/ADOI and should include: salaries (general funds, special funds, etc.), overload, lecturers, costs for all faculty and staff assigned to the program, supply and maintenance, amortized equipment, and tuition and fees.

  39. #13a General Funded Budget Allocation • The general funded budget allocation = actual personnel costs + B budget expenditures #13b Special/Federal Budget Allocation The dollars from federal grants. #13c Tuition and Fees The amount collected for tuition and fees in the 2013 academic year.

  40. #14 Cost per SSH Overall program budget allocation (#13) divided by SSH in all program classes (#5). #15 Number of Low-Enrolled (<10) Classes • Classes taught (#7) with 9 or fewer active students at Census. • Excludes students who have already withdrawn (W) at this point. • Excludes Directed Studies (99 series). • Includes Cooperative Education (93 series) as there is a resource cost to the program.
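A quick sketch of the low-enrolled class count in #15, using placeholder census enrollments:

```python
# A class is "low enrolled" when it has 9 or fewer active students at Census.

census_enrollments = [22, 18, 8, 5, 31, 9, 12]   # hypothetical program classes
low_enrolled = sum(1 for n in census_enrollments if n < 10)
print(low_enrolled)  # 3
```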

  41. #16 Successful Completion (Equivalent C or Higher) • Percentage of students actively enrolled in program classes at Fall and Spring census who, at end of semester (EOS), have earned a grade equivalent to C or higher. #17 Withdrawals (grade = W) Number of students actively enrolled (i.e., who had not withdrawn at that point) at Fall and Spring census who at end of semester have a grade of W.

  42. #18 Persistence Fall to Spring • Count of students who are majors in the program at Fall census and who, at the subsequent Spring semester census, are enrolled and are still majors in the program. • Removed from the count (both numerator and denominator) are program-major students to whom a program degree (or CA, if the highest credential awarded) has been conferred in the reporting Fall semester. • Example: 31 majors start in Fall; 21 of the original 31 persist into Spring; 21/31 = .6774, or 67.74%
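The exclusion of fall graduates from both the numerator and the denominator is the part people most often ask about, so here is an illustrative sketch with placeholder records (not the IR query):

```python
# Fall-to-Spring persistence: of the Fall majors who did not earn the program
# degree (or CA) in that Fall term, the share who are still majors at Spring census.

def persistence_fall_to_spring(students):
    cohort = [s for s in students if s["fall_major"] and not s["graduated_fall"]]
    persisted = [s for s in cohort if s["spring_major"]]
    return len(persisted) / len(cohort)

records = (
    [{"fall_major": True, "graduated_fall": False, "spring_major": True}] * 21
    + [{"fall_major": True, "graduated_fall": False, "spring_major": False}] * 10
)
print(f"{persistence_fall_to_spring(records):.2%}")  # 67.74%
```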

  43. #18a Persistence Fall to Fall • Count of students who are majors in the program at Fall census and who, at the subsequent Fall semester census, are enrolled and are still majors in the program. • Removed from the count (both numerator and denominator) are program-major students to whom a program degree (or CA, if the highest credential awarded) has been conferred in the first Fall, Spring, or Summer reporting term. • Example: 31 majors start in Fall; 11 of the original 31 persist into Fall of the next year; 11/31 = .3548, or 35.48%

  44. #19 Unduplicated Degrees/Certificates Awarded • Unduplicated headcount of students in the fiscal year reported to whom a program degree or any certificate has been conferred. (Sum of 19a and 19b). • Uses most recent available freeze of fiscal year data.

  45. #19a Associate Degrees Awarded • Degrees conferred in the FISCAL_YEAR_IRO. • The count is of degrees and may show duplicate degrees received in the program by the same student. • Uses most recent available freeze of fiscal year data. • FISCAL_YEAR_IRO: “Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year 2004‐2005 (July 1, 2004 to June 30, 2005) which includes Summer 2004, Fall 2004, and Spring 2005 semesters…”

  46. #19b Academic Subject Certificates Awarded • The count is of program Academic Subject Certificates and may show multiple Academic Subject Certificates in the same program received by the student. • Uses most recent available freeze of fiscal year data.

  47. #19c Goal • The UH Community Colleges have established increasing the number of Associate Degrees awarded as a priority. • Beginning with the baseline of 2006 (see MAPS Degrees and Certificates Earned 2005‐2006) programs are expected to increase the number of Associate Degrees by 3% compounded annually. • The number in 19c reflects the goal for the appropriate year. • Example: 127 degrees was the goal for the 1112 program year, 130 is the goal for the 1213 program year (about a 3% increase).
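The goal in #19c is simply 3% compounded annually from the 2006 baseline. The sketch below restates that; the baseline value is hypothetical, chosen only so the example lands near the 12-13 goal quoted above.

```python
# Degree goal = 2006 baseline grown by 3% per year, compounded.
# The baseline value of 106 is hypothetical, for illustration only.

def degree_goal(baseline_2006, program_year_end):
    return round(baseline_2006 * 1.03 ** (program_year_end - 2006))

print(degree_goal(106, 2013))  # 130, roughly the 12-13 goal in the example above
```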

  48. #19d Difference Between Unduplicated Awarded and Goal • The percent difference between the number of Associate degrees awarded (#19a) and the goal (#19c). • General Pre Professional Programs are expected to increase Associate Degrees awarded by 3% compounded annually. • Uses most recent available freeze of fiscal year data. • Example: The program’s goal for program year 1213 is 130 (19c). The actual number of Associate degrees awarded (19a) is 231. • ((231-130)/130)*100 = about 78% difference between the number of unduplicated degrees awarded and the goal for associate degrees

  49. #20 Transfers to UH 4-yr • The number of students who are home-institution at a UH System 4-yr institution for the first time in Fall and who were in the reporting program prior to that Fall. • In the event that a student has more than one major at the college, each program/major receives the count. • An individual student may be counted in more than one program. • UH Maui College is included when students transfer from any UHCC program to a UH Maui College four-year program. Number based on Fall semester only.

  50. #20a Transfers with degree from program • Students included in #20 who have received a degree from the community college program prior to transfer. • Does not include any certificates. • Number based on Fall semester only. #20b Transfers without degree from program Students included in #20 who did not receive a degree from the community college program prior to transfer. Number based on Fall semester only.
