AY 2013 CTE Instructional Program Review Process Data Description Process – Timeline Rev. 10-18-13
61 Data points to choose from… [Image: Lockheed SR-71]
Covered in today’s session… • Quick review of Program and Unit Review website • Navigating the Annual Reports from Program Data (ARPD) website • Overview of our local program and unit review process on campus • What are we doing to improve our process? • What is different this year? • Terminology used in the review • Data definitions • Timeline • Assessment with James Kiley • New Annual Review and Budget Process with Dean of CTE Joyce Hamasaki • New CERC/Comprehensive Review Process with VCAA Joni Onishi
Purpose • The purpose of this presentation is to describe the process we follow for our local Comprehensive Program Reviews and the system-required Annual Program Reviews, and to provide definitions of the data used in the reviews. • We have been asked to produce an annual program review for each of our instructional programs and non-instructional units. These reviews are required of each community college in the system and will be taken to the U of H Board of Regents for their review. • You will need to complete a comprehensive review this year if you are scheduled for one. Every program and unit is required to complete a comprehensive review once every five years. • Click on the link below to find out if you are scheduled for a comprehensive review this year. Comprehensive Program-Unit Review Cycle and Schedule
Reason for Program Review • The review of a program should be an ongoing, year-round, reflective process. • Program review processes assure educational quality and help programs evaluate and improve their services. • Program review is an opportunity for self-study, self-renewal, and identifying areas for improvement. • Your program review may be one of the few opportunities you have to showcase the accomplishments of your program. Take this opportunity to shine. • A robust program review process is one mechanism available to the college to improve student success.
What are we doing to improve our program review process? At the conclusion of every program/unit review cycle, the IR Office takes extra care to ensure that we are improving our program/unit review process on campus. This is accomplished by sending out questionnaires specific to the groups, then meeting with those groups across campus and collecting their feedback. The feedback is reviewed and an action plan is put into place, which is used in the planning phase for our next review cycle. Your suggestions, and the actions taken for improving this process, are then published to the Program Review website and have been linked here for your convenience. 2012 Program-Unit Process Improvement Summary To drive accountability in the organization, the Vice Chancellor for Academic Affairs will ensure that all of the work we commit to as part of this process is completed every year. Based on the feedback we received from last year's program-unit review process improvement focus groups, the changes described on the following slides have been made and incorporated into the planning of this year's review.
What else are we doing to improve our program review process? To assure the best possible communication and attendance for our annual training, the IR Office developed the Program-Unit Review Campus Communication Plan. We found that information about our program and unit review process on campus was not always getting to the people who need it, that the availability of faculty, lecturers, and staff was not taken into consideration when scheduling meetings and training, and that email alone was not sufficient to reach everyone on campus. The new communication plan includes managing a scheduling request for meetings, using the campus-wide email distribution to include faculty, lecturers, and staff, and printing hardcopy reminders to be placed in division/department mailboxes and posted in break areas. Additionally, the Vice Chancellor for Academic Affairs will work with the Admin Team and the DC's to ensure that we are adequately communicating program and unit review activities across the campus.
What else are we doing to improve our program review process? • Some issues came up last year from folks who were unable to find the assessments they needed in order to write their reviews. It turns out that the assessments had all been moved from the assessment website to the Intranet, but instructions were never sent out to communicate the change. All assessments can now be found in the assessment folder on the Intranet. Click here to log in and view your assessments: Intranet Assessments • Your suggestions for improving our program review process at the system level were taken to the Instructional Program Review Committee (UHCC IPRC) by our UHCC IPRC Representatives on campus. Results of that meeting are published here: 2012 UHCC IPRC Responses A special “Mahalo” goes out to our local UHCC IPRC representatives for helping us improve this important process!
What else are we doing to improve our program review process? • The Academic Support Unit Review online tool and glossaries have been greatly improved since last year, based in part on the feedback that was provided. • A step was added to our Comprehensive Program/Unit Review Process document to ensure that VC's and Directors update the schedule for both comprehensive and annual reviews on campus every year. This work is to be completed in June, before we begin the new cycle. • Another step was added to invite the IR Office to the first scheduled Admin meeting in September to communicate changes. This is the improved process we are using this year: Comprehensive Program-Unit Review Process
What is different this year? Remember: Joni will find someone to copy your template information into the online ARPD tool. Just fill out the template you are required to submit this year! The following data elements are all new this year: • Number of Majors Native Hawaiian • Fall Full-Time • Fall Part-Time • Fall Part-Time who are Full-Time in System • Spring Full-Time • Spring Part-Time • Spring Part-Time who are Full-Time in System • Persistence Fall to Fall New Performance Funding measures: • Number of Degrees and Certificates • Number of Degrees and Certificates Native Hawaiian • Number of Degrees and Certificates STEM • Number of PELL Recipients • Number of Transfers to UH 4-yr
Navigating the ARPD Web Submission Tool • If you attended the training session in person today, you can skip the next few slides (go to slide #20), which deal with navigating the ARPD site. I've included the slides here for those who were not able to join us today. • The Annual Reports of Program Data Web Submission Tool (ARPD for short) is a repository for annual program and unit reviews for Instruction, the Academic Support Unit, and Student Support Services. Much of the data that you will need to complete your review has been provided by the Office of the Vice President for Community Colleges. • The ARPD is a home-grown tool, developed in-house specifically to meet the needs of the community college system.
Navigating the ARPD Web Submission Tool cont… • Your program's data table is available online within the tool by August 15th every year. You should have everything you need to begin writing your review within the web submission tool. • Plan to save your work often, especially when switching between screens, and plan to do most of your formatting within the tool if you are copying and pasting in from Word. Also plan to spell-check and save your review in Word before starting. • One of the nicest features of the ARPD is that you can go back and look at reviews from previous years. You can also enhance your own review by drawing on similar reviews from other institutions.
Navigating the ARPD Web Submission Tool continued… UHCC Annual Report of Program Data Web Submission Tool • Begin by clicking on the link above. To enter information on your review, click on the button in the lower part of the screen called “2013 Instructional Submission”. You will be asked to log in with your UH username and password in order to get to the web submission site. The default will take you to the “Status” tab, where you can see who last modified the information in your review. • Clicking on the “Users” tab will take you to a screen that shows all of the people who have permission to update your program. For each program there is a list of people with specific roles: Program Coordinator, Div./Dept. Chair, and Dean. The VCAA determines who can update your program. If there is someone you'd like to add to the list, contact the VCAA with your request. • You can enter the information about your program in any order you wish, but moving from left to right across the tabs at the top of the screen follows the same logical path we used when filling out the templates in the past. Start by clicking on the “Analysis” tab. • On the Analysis screen you can either preview what has already been entered by clicking on the “Preview” button, or go to the edit screen and begin entering or updating information.
Navigating the ARPD Web Submission Tool continued… • On the Analysis screen, begin editing by going to the bottom of the page, under the data sheet. Click the edit button and scroll down. There are 3 sections for you to edit here: Analysis of your Program, Action Plan, and Resource Implications. Simply click on the “Edit” link and you will be taken to a screen very similar to MS Word, where you can begin typing in your analysis. (Note the Save button in the left-hand corner!) • Note that in the Action Plan section you will need to include action plans for any of your Perkins Core Indicators where the program did not meet the goal. Check your data sheet for this. • Also very important this year: if you are requesting additional people, services, or equipment for your program, you will need to make the justification in the “Resource Implications” section. Requests for your program are no longer part of the comprehensive review as they have been in the past. • Now click on the “Description” tab and input the year and web address of your last comprehensive review. You should be able to copy and paste the link right off the program-unit review website for the year of your last review. Finish this tab by typing in a brief description of your program and its mission. • Now, click on the “P-SLOs” tab and go to the next slide…
What to enter on the “P-SLO” tab in ARPD Web Submission Tool • Indicate which PLOs were assessed during the reporting year. • Evidence of Industry Validation Provide documentation that the program has submitted evidence and achieved certification or accreditation from an organization granting certification in an industry or profession. If the program/degree/certificate does not have a certifying body, the recommendations for, approval of, and/or participation in assessment by the program's advisory committee/board can be submitted. • Expected Level of Achievement Describe the different levels of achievement for each characteristic of the learning outcome(s) that were assessed: what represented “excellent,” “good,” “fair,” or “poor” performance using a defined rubric, and what percentages were set as goals for student success (for example: “85% of students will achieve good or excellent in the assessed activity.”) • Courses Assessed List the courses assessed during the reporting period. • Assessment Strategy/Instrument Describe what, why, where, when, and from whom assessment artifacts were collected. • Results of Program Assessment Report the percentage of students who met the outcome(s) and the level at which they met them. • Other Comments Include any information that will clarify the assessment process report. • Next Steps Describe what the program will do to improve the results. “Next Steps” can include revisions to syllabi, curriculum, teaching methods, student support, and other options.
What to enter on the “Cost Per SSH” tab in ARPD Web Submission Tool • There are 4 different values that need to be added on the Cost Per SSH screen in ARPD. All of the fund amounts entered are used in the calculation of the cost per SSH for your program. The costs that need to be entered are: • General Funds • Federal Funds • Other Funds • Tuition and Fees • The definitions for what comprises each of these funds are laid out in detail in the data definitions section of this presentation, so they will not be duplicated here. • Joni will be uploading all of the budget information you need this year. Once the values are loaded and saved in the tool, the values for “Total Funds” and “Cost per SSH” will auto-calculate into your datasheet.
What to enter on the “External” tab in ARPD Web Submission Tool • The “External” screen is intended for programs that utilize external licensures. Currently, the only program at HawCC with external licensures is the Nursing program. • If you are not in the Nursing program you can skip this tab/screen altogether, because the first question is, “Does this program utilize external licensures?” and the radio button defaults to “No”. (Mighty thoughtful programming, I'd say.) • If you are in the Nursing program, please click the “Yes” radio button to answer the question and then enter the Number sitting for exam and the Number passed. Do this for each program where you utilize external licensures. The percent passed will auto-calculate into your datasheet when you click the “Save External Data” button. • The Office of the Vice President for Community Colleges collects this data as part of our annual Graduate-Leaver reporting, as well as the UH system “Measuring Our Progress” report.
What to enter on the “Capacity” tab in ARPD Web Submission Tool • The “Capacity” screen is only intended for programs that have an externally mandated capacity. • The following criterion is used as an alternate measure for the Student/Faculty Ratio measure within the Efficiency health call: • “If your program has an externally mandated (e.g. professional accreditation or licensing) capacity of less than 16 students per faculty, the program may be eligible for the alternative efficiency health call calculation.” • If your program fits the criterion listed above, AND your Efficiency health call is other than Healthy, AND your efficiency health call can be improved by using the alternate method, please contact: Cheryl Chappell-Long, Director, Academic Planning, Assessment, and Policy Analysis. Phone: 808-956-4561. Email: cchappel@hawaii.edu
“Help” tab in ARPD Web Submission Tool • The “Help” screen is a very useful resource for some of the documentation needed to support your program within the review process. It contains all of the glossaries and health call scoring rubrics for each year we have used the online tool.
Process for Completing an Instructional Program Review COMPREHENSIVE REVIEWS If you are on the schedule to complete a comprehensive review this year, follow this simple 2-step process: Step 1: Complete your review in the ARPD online submission tool. Step 2: Move to the Comprehensive Review Process for further instructions. ANNUAL REVIEWS If you are NOT on the schedule to complete a comprehensive review this year, follow this simple 2-step process: Step 1: Complete your review in the ARPD online submission tool. Step 2: Move to the Annual Review & Budget Process for further instructions. The comprehensive review process, the annual review and budget process, and their associated templates are brought to you this year by the VCAA and the Interim Dean of CTE Programs. The templates and instructions should be part of their presentation today.
Process Owners • The graphic below depicts who is responsible for each part of the program review process, from ARPD through CERC. For more information please use the contact list below. Contact List Assessment = James Kiley 934-2649 IR = Shawn Flood 934-2648 Dean CTE = Joyce Hamasaki 934-2522 VCAA = Joni Onishi 934-2514 [Graphic: process ownership flow from Annual Review through Comprehensive Review]
Terminology / Timing • The Census freeze event is the fifth Friday after the first day of instruction. • The End of Semester (EOS) freeze event is 10 weeks after the last day of instruction. • Degrees are conferred in the “Fiscal year”. The fiscal year value represents the ending of that fiscal year. For example, a FISCAL_YR_IRO value of 2013 indicates the fiscal year 2012-2013 (July 1, 2012 to June 30, 2013), which includes the Summer 2012, Fall 2012, and Spring 2013 semesters. • “Home Institution” is the campus where the student was admitted. • A “Program Year” of “12-13” listed on your data sheet represents your program's data for the Summer 2012, Fall 2012, and Spring 2013 semesters. • Student information data this year comes exclusively from the Operational Data Store (ODS). Organizationally this means that all community colleges are getting the data from the same place and at the same snapshot in time (this is a good thing).
#1 New and Replacement Positions (State) • Economic Modeling Specialists Inc. (EMSI) annual new and replacement jobs at state level. Compiles data based on Standard Occupational Classification (SOC) codes aligned to the program’s Classification of Instructional Programs (CIP) codes. • Data based on annual new/replacement position projections as of Spring 2013 • State position numbers are not pro‐rated. AY 2013 HawCC CIP Code Listing
#2 New and Replacement Positions (County prorated) • Economic Modeling Specialists Inc. (EMSI) annual new and replacement jobs at the county level. Compiles data based on Standard Occupational Classification (SOC) codes aligned to the program's Classification of Instructional Programs (CIP) codes. • Note: It is possible for the number of new and replacement positions in the county to be higher than the state if the projection in other counties is for a loss of new and replacement positions. • County data is pro‐rated to reflect the number of programs aligned to the SOC code and weighted by the number of majors in each program/institution for programs that share SOC codes. • Data based on annual new/replacement position projections as of Spring 2013.
#3 Number of Majors • Count of program majors whose home institution is your college. The count excludes students who have completely withdrawn from the semester at Census. • This is an annual number. Programs receive a count of .5 for each term (Fall and Spring) within the academic year that the student is a major, up to a maximum count of 1.0 (one) for each student (see the sketch below). #3a Number of Majors Native Hawaiian Count of program majors who are Native Hawaiian and whose home institution is your college. The count excludes students who have completely withdrawn from the semester at Census. This is an annual number, counted the same way: .5 for each term (Fall and Spring) that the Native Hawaiian student is a major, up to a maximum of 1.0 (one) for each student.
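To make the .5-per-term counting rule concrete, here is a minimal Python sketch. It is purely illustrative and is not the official ARPD computation; the record format and function name are hypothetical.

```python
# Hypothetical illustration of the annual "Number of Majors" count (#3).
# Each unique (student, term) census record earns 0.5; a student who is
# a major in both Fall and Spring contributes at most 1.0 for the year.

def annual_major_count(records):
    """records: iterable of (student_id, term) pairs for declared majors
    at census, where term is 'Fall' or 'Spring'."""
    per_student = {}
    for student_id, term in set(records):  # de-duplicate per term
        per_student[student_id] = per_student.get(student_id, 0.0) + 0.5
    return sum(min(count, 1.0) for count in per_student.values())

# A major in both Fall and Spring counts 1.0; a Fall-only major counts 0.5:
print(annual_major_count([("A1", "Fall"), ("A1", "Spring"), ("B2", "Fall")]))  # 1.5
```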
#3b Fall Full-Time • Percentage of majors (#3) enrolled in 12 or more credits in the college in the reporting Fall semester. #3c Fall Part-Time Percentage of majors (#3) enrolled in less than 12 credits in the college in the reporting Fall semester. #3d Fall Part-Time who are Full-Time in System Percentage of majors in #3c (enrolled in less than 12 credits in the reporting Fall semester in the institution) who are enrolled in credits at other UH institutions such that their total number of credits enrolled in the UH System is equal to or greater than 12.
#3e Spring Full-Time Percentage of majors enrolled in 12 or more credits in the reporting Spring semester at the institution. #3f Spring Part-Time Percentage of majors enrolled in less than 12 credits at the institution in the reporting Spring semester. #3g Spring Part-Time who are Full-Time in System Percentage of majors in #3f (enrolled in less than 12 credits in the reporting Spring semester in the institution) who are enrolled in credits at other UH institutions such that their total number of credits enrolled in the UH System is equal to or greater than 12.
#4 SSH Program Majors in Program Classes • The sum of Fall and Spring Student Semester Hours (SSH) taken by program majors in courses linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. • Note: for programs where year‐round attendance is mandatory, Summer SSH are included. • Excludes Directed Studies (99 series). Differs from MAPS in that UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program. • Not sure what your program classes are? Click here to find out: Courses Taught Aligned to Instructional Programs
#5 SSH Non-Majors in Program Classes • The sum of Fall and Spring Student Semester Hours (SSH) taken by non‐program majors (not counted in #4) in courses linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. • Note: for programs where year‐round attendance is mandatory, Summer SSH are included. • Excludes Directed Studies (99 series). Differs from MAPS in that UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.
#6 SSH in All Program Classes • The sum of Fall and Spring Student Semester Hours (SSH) taken by all students in classes linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. • Note: for programs where year‐round attendance is mandatory, Summer SSH are included. • Excludes Directed Studies (99 series). Differs from MAPS in that UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.
#7 FTE Enrollment in Program Classes • Sum of Student Semester Hours (SSH) taken by all students in classes linked to the program (#6) divided by 30. Undergraduate, lower division Full Time Equivalent (FTE) is calculated as 15 credits per term. • Captured at Census and excludes students who have already withdrawn (W) at this point. • Note: for programs where year‐round attendance is mandatory, summer SSH are included.
#8 Total Number of Classes Taught • Total number of classes taught in Fall and Spring that are linked to the program. Includes Summer classes if year‐round attendance is mandatory. • Concurrent and cross-listed classes are counted only once, for the primary class. • Excludes Directed Studies (99 series). Differs from MAPS in that UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.
CTE Program Scoring Rubric Definitions Your program health is determined by 3 separate types of measures: Demand, Efficiency, and Effectiveness. This slide explains why these measures were chosen to determine program health. • Demand: A seeking or state of being sought after, i.e., your program's ability to attract new students every year based on your offering. • Efficiency: Acting or producing effectively with a minimum of waste, expense, or unnecessary effort, i.e., your program's ability to use its resources in the best possible way. • Effectiveness: Stresses the actual production of, or the power to produce, an effect, i.e., your program's ability to produce the desired result.
Determination of program's health based on demand • This year the system office will calculate and report health calls for all instructional programs using academic year 2013 data. The following instructions illustrate how those calls are made. • Program Demand is determined by taking the number of majors (#3) and dividing it by the number of New and Replacement Positions by County (#2); see the sketch below. • The following benchmarks are used to determine demand health: Healthy: 1.5 - 4.0 Cautionary: .5 – 1.49; 4.1 – 5.0 Unhealthy: <.5; >5.0 • Finally, an Overall Category Health Score is assigned where: 2 = Healthy 1 = Cautionary 0 = Unhealthy
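As a worked illustration of the demand call, here is a minimal Python sketch that encodes the benchmarks exactly as published above. The function name and inputs are hypothetical; the official calls are made by the system office.

```python
# Hypothetical sketch of the Demand health call.
# Benchmarks are quoted as published; boundary gaps (e.g., between
# 1.49 and 1.5) are left exactly as stated on the slide.

def demand_health_score(majors, county_positions):
    """majors = data element #3; county_positions = data element #2."""
    ratio = majors / county_positions
    if 1.5 <= ratio <= 4.0:
        return 2  # Healthy
    if 0.5 <= ratio <= 1.49 or 4.1 <= ratio <= 5.0:
        return 1  # Cautionary
    return 0      # Unhealthy: ratio < .5 or > 5.0

print(demand_health_score(majors=45, county_positions=20))  # ratio 2.25 -> 2 (Healthy)
```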
#9 Average Class Size • Total number of students actively registered in Fall and Spring program classes divided by classes taught (#8). Does not include students who have already withdrawn from the class by Census. • Excludes Directed Studies (99 series). Differs from MAPS in that UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program. #10 Fill Rate Total active student registrations in program classes (number of seats filled) at Fall and Spring census divided by the maximum enrollment (number of seats offered). Captured at Census and excludes students who have already withdrawn (W) at this point.
#11 FTE BOR Appointed Faculty • Sum of the appointments (1.0, 0.5, etc.) of all BOR appointed program faculty (excludes lecturers and other non-BOR appointees). • Uses the “hiring status” of the faculty member – not the teaching/work load. • Uses the Employing Agency Code (EAC) recorded in the Human Resources (HR) database to determine the faculty member's program home. • Data provided by UH Human Resources Office as of 10/31/2012. • If the FTE BOR faculty count is off for your program, contact the VCAA. • Click here for the count of BOR Appointed Program Faculty in your program: 2013 BOR Appointed Program Faculty
#12 Majors to FTE BOR Appointed Faculty • Number of majors (#3) divided by the sum of appointments (#11) (1.0, 0.5, etc.) of all BOR appointed program faculty. • Data shows the number of student majors in the program for each faculty member (25 majors to 1 faculty shown as “25”).
#13 Majors to Analytic FTE Faculty • Number of majors (#3) divided by the number of Analytic FTE faculty (#13a). #13a Analytic FTE Faculty (Workload) Calculated as the sum of semester hours (not Student Semester Hours) taught in program classes, divided by 27; see the sketch below. Analytic FTE is useful as a comparison to the FTE of BOR appointed faculty (#11), and is used for analysis of program offerings covered by lecturers.
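A minimal sketch of #13a and #13 under the definitions above (the function names are hypothetical and this is illustrative only, not the official computation):

```python
# Hypothetical sketch of Analytic FTE (#13a) and Majors to Analytic FTE (#13).

def analytic_fte(semester_hours_taught):
    """#13a: semester hours (not student semester hours) taught in
    program classes, divided by 27."""
    return semester_hours_taught / 27

def majors_to_analytic_fte(majors, semester_hours_taught):
    """#13: number of majors (#3) per analytic FTE faculty (#13a)."""
    return majors / analytic_fte(semester_hours_taught)

# 54 semester hours taught -> 2.0 analytic FTE; 40 majors -> ratio of 20.0
print(majors_to_analytic_fte(majors=40, semester_hours_taught=54))  # 20.0
```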
#14 Overall Program Budget Allocation • The overall program budget allocation = General Funded Budget Allocation (14a) + Special/Federal Budget Allocation (14b) + Tuition and Fees (14c) + Other Funds. • The overall program budget allocation is automatically calculated when you enter your general funded budget allocation, special/federal budget allocation, other funds, and/or tuition and fees into the online tool tab called “Cost per SSH.” Again, Joni will upload the data for you this year. • The overall program budget allocation is to be determined by the College using these guidelines from the VCAA/DOI/ADOI, and should include: salaries (general funds, special funds, etc.), overload, lecturers, costs for all faculty and staff assigned to the program, supply and maintenance, amortized equipment, and tuition and fees.
#14a General Funded Budget Allocation • The general funded budget allocation = actual personnel costs + B-budget expenditures. #14b Special/Federal Budget Allocation The dollars from Federal grants. #14c Tuition and Fees The amount collected for tuition and fees in the 2013 academic year.
#15 Cost per SSH • Overall Program Budget Allocation (#14) divided by SSH in all program classes (#6); see the sketch below. #16 Number of Low-Enrolled (<10) Classes • Classes taught (#8) with 9 or fewer active students at Census. • Excludes students who have already withdrawn (W) at this point. • Excludes Directed Studies (99 series). • Includes Cooperative Education (93 series) as there is a resource cost to the program.
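To tie #14 and #15 together, here is a minimal sketch. It is illustrative only; in practice the online tool auto-calculates these values once the fund amounts are loaded, and the function names and figures here are hypothetical.

```python
# Hypothetical sketch of Overall Program Budget Allocation (#14)
# and Cost per SSH (#15).

def overall_allocation(general, special_federal, tuition_fees, other=0.0):
    """#14: sum of 14a, 14b, 14c, and other funds."""
    return general + special_federal + tuition_fees + other

def cost_per_ssh(allocation, ssh_all_program_classes):
    """#15: overall allocation (#14) / SSH in all program classes (#6)."""
    return allocation / ssh_all_program_classes

alloc = overall_allocation(general=250_000, special_federal=40_000, tuition_fees=60_000)
print(cost_per_ssh(alloc, ssh_all_program_classes=1_400))  # 250.0
```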
Determination of program's health based on efficiency • This year the system office will calculate and report health calls for all instructional programs using AY 2013 data. The following instructions illustrate how those calls are made. • Program Efficiency is calculated using 2 separate measures: Fill Rate (#10) and Majors to FTE BOR Appointed Faculty (#12). • The following benchmarks are used to determine health for Fill Rate: Healthy: 75 – 100% Cautionary: 60 – 74% Unhealthy: < 60% • An Overall Category Health Score is assigned where: 2 = Healthy 1 = Cautionary 0 = Unhealthy
Determination of program's health based on efficiency cont… • The following benchmarks are used to determine health for Majors/FTE BOR Appointed Faculty: Healthy: 15 - 35 Cautionary: 30 – 60; 7 - 14 Unhealthy: 61+; 6 or fewer All programs are automatically calculated using the measure above unless they have an externally mandated (e.g. professional accreditation or licensing) capacity of less than 16 students per faculty. • An Overall Category Health Score is assigned where: 2 = Healthy 1 = Cautionary 0 = Unhealthy • Finally, average the 2 overall health scores for Fill Rate and Majors/FTE BOR Appointed Faculty, then use the following rubric: 1.5 - 2 = Healthy 0.5 - 1 = Cautionary 0 = Unhealthy (see the sketch below)
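A minimal sketch of the two-part efficiency call, encoding the benchmarks exactly as published above. Note that the published Healthy (15-35) and Cautionary (30-60) ranges for Majors/FTE overlap between 30 and 35; this sketch checks Healthy first, so the overlap resolves to Healthy. Function names are hypothetical.

```python
# Hypothetical sketch of the Efficiency health call.

def fill_rate_score(fill_rate_pct):
    if fill_rate_pct >= 75: return 2  # Healthy: 75-100%
    if fill_rate_pct >= 60: return 1  # Cautionary: 60-74%
    return 0                          # Unhealthy: < 60%

def majors_per_fte_score(ratio):
    if 15 <= ratio <= 35: return 2                      # Healthy
    if 7 <= ratio <= 14 or 30 <= ratio <= 60: return 1  # Cautionary (as published)
    return 0                                            # Unhealthy: 6 or fewer, or 61+

def efficiency_health(fill_rate_pct, majors_per_fte):
    # Average the two category scores, then apply the final rubric.
    avg = (fill_rate_score(fill_rate_pct) + majors_per_fte_score(majors_per_fte)) / 2
    if avg >= 1.5: return "Healthy"
    if avg >= 0.5: return "Cautionary"
    return "Unhealthy"

print(efficiency_health(fill_rate_pct=80, majors_per_fte=25))  # Healthy
```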
#17 Successful Completion (Equivalent C or Higher) • Percentage of students actively enrolled in program classes at Fall and Spring census who at end of semester (EOS) have earned a grade equivalent to C or higher. #18 Withdrawals (grade = W) Number of students actively enrolled at Fall and Spring census (i.e., who had not withdrawn at that point) who at end of semester have a grade of W.
#19 Persistence Fall to Spring • Count of students who are majors in program at fall census (from Fall semester #3) and at subsequent Spring semester census are enrolled and are still majors in the program. • Removed from the count (both numerator and denominator) are program major students to whom a program degree (or CA if highest credential awarded) has been conferred in the reporting Fall semester. • Example: 31 majors start in Fall 21 majors of the original 31 persist into Spring 21/31 = .6774 or 67.74%
#19a Persistence Fall to Fall • Count of students who are majors in the program at Fall census (from Fall semester #3) and at the subsequent Fall semester census are enrolled and are still majors in the program. • Removed from the count (both numerator and denominator) are program major students to whom a program degree (or CA if highest credential awarded) has been conferred in the first Fall, Spring, or Summer reporting term. • Example: 31 majors start in Fall 11 majors of the original 31 persist into Fall of the next year 11/31 = .3548 or 35.48% (see the sketch below)
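The persistence arithmetic above can be expressed as a short sketch. It is illustrative only; the set inputs and function name are hypothetical, and the same logic applies to Fall-to-Spring persistence (#19) with the Spring census set substituted.

```python
# Hypothetical sketch of Fall-to-Fall persistence (#19a).

def persistence(fall_majors, later_majors, completers):
    """fall_majors / later_majors: sets of student IDs who are program
    majors at the respective censuses. completers: students already
    awarded the program degree (or CA, if the highest credential),
    removed from both numerator and denominator."""
    cohort = fall_majors - completers
    return len(cohort & later_majors) / len(cohort)

fall = {f"S{i}" for i in range(31)}       # 31 majors start in Fall
next_fall = {f"S{i}" for i in range(11)}  # 11 of them are majors next Fall
print(round(persistence(fall, next_fall, completers=set()), 4))  # 0.3548
```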
#20 Unduplicated Degrees/Certificates Awarded • Unduplicated headcount of students in the fiscal year reported to whom a program degree or any certificate has been conferred. (Sum of 20a, 20b, 20c, and 20d). • Uses most recent available freeze of fiscal year data.
#20a Degrees Awarded • Degrees conferred in the FISCAL_YEAR_IRO. • The count is of degrees and may show duplicate degrees received in the program by the same student if the program offers more than one degree. • Uses most recent available freeze of fiscal year data. • FISCAL_YEAR_IRO: “Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2013 indicates the fiscal year 2012‐2013 (July 1, 2012 to June 30, 2013) which includes Summer 2012, Fall 2012, and Spring 2013 semesters…”
#20b Certificates of Achievement Awarded • Certificates of achievement conferred in the FISCAL_YEAR_IRO. • The count is of program certificates of achievement and may show multiple certificates of achievement in the same program received by the same student. • Uses most recent available freeze of fiscal year data. • FISCAL_YEAR_IRO: “Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2013 indicates the fiscal year 2012‐2013 (July 1, 2012 to June 30, 2013) which includes Summer 2012, Fall 2012, and Spring 2013 semesters…”
#20c Advanced Professional Certificates Awarded • The count is of program Advanced Professional Certificates and may show multiple Advanced Professional Certificates in the same program received by the same student. • Uses most recent available freeze of fiscal year data. #20d Other Certificates Awarded The count is of other program certificates and will show multiples received by the same student. Uses most recent available freeze of fiscal year data.