

1. Survey Manager Information Forum 2014, Friday 18th July: UNIVERSITY EXPERIENCE Survey

2. SMIF 2014 Program: FRIDAY – UES
• Welcome
• Implementing the QILT and Q&A – Dr Andrew Taylor, Phil Aungles (Department of Education)
• Morning tea
• Live Reporting System – Sonia Whitely and Eric Skuja, The Social Research Centre
• UES operations
• Lunch
• Inclusion of the Workplace Relevance Scale
• UES Query Demonstration – Sonia Whitely, The Social Research Centre
• Afternoon tea
• Q&A and wrap-up

3. UES 2014
• Project commenced May 2014
• Submission 1 HEIMS file first received June 2014
• Survey Manager Forum mid July 2014
• Fieldwork to begin in August 2014
• 40 Table A and Table B institutions
• Up to 15 NUHEIs
• Interest from NZ
• Draft data files – October 2014
• Draft Institutional & National reports – November 2014
• Final data files – November 2014
• Final Institutional & National reports – December 2014
• UES Query needs analysis completed January 2015

4. Housekeeping
• Cue cards
• Please mute mobile phones / tablets
• Bathrooms
• Morning tea / afternoon tea
• Lunch

5. Welcome to DAY 2 We encourage you all to participate, ask questions, share ideas, make contacts and… enjoy!

6. Quality Indicators for Learning and Teaching
2014 Survey Managers’ Information Forum, 18 July 2014
Dr Andrew Taylor, Branch Manager, Higher Education Data and Analysis

7. 2014-15 Budget Measure
• 2014-15 Budget measure: Upholding Quality – Quality Indicators for Learning and Teaching (QILT)
• QILT: a suite of surveys over the student life cycle, from commencement to employment
• New, purpose-built website to publish results and inform student choice

8. Development of Indicators
• Indicator development for performance funding
• AQHE Reference Group, chaired by Professor Ian O’Connor
• Consultation process:
  – Discussion papers
  – Submissions
  – Roundtables

9. Key issues
• Survey burden on students and institutions
• Centralised administration
• Employer surveys are ad hoc and not well executed
• Need for robust measures of teaching performance

10. AQHE Report
Key Reference Group recommendations:
• Centralised administration of the suite of performance measurement instruments under a single third-party provider
• The suite of instruments to initially comprise the University Experience Survey (UES) and a new Graduate Outcomes Survey (GOS)
• Scoping study on a new instrument to assess employer needs and satisfaction with graduates
Not recommended:
• Composite Teaching Quality Indicator
• Collegiate Learning Assessment

11. MyUniversity website
• Live in April 2012
• Clear, meaningful and transparent information
• Confidence intervals
• Subject by institution level data

12. Data sharing
• Amendments made to the Higher Education Support Act 2003 (HESA) in November 2013
• Personal information can now be provided to a range of bodies, including:
  – TEQSA
  – higher education providers and some peak bodies
  – state governments
• Safeguards remain: limited uses, no on-sharing, provider consent
• Universities Australia data sharing agreement allows access for member institutions

13. QILT Components
• Three surveys:
  – University Experience Survey (UES)
  – Graduate Outcomes Survey (GOS)
  – Employer Satisfaction Survey (ESS)
• Plus a new purpose-built QILT website to publish results from the surveys
• QILT contractor selected through tender to administer all elements

  14. Timing

15. University Experience Survey
• 2014 UES administered by the GCA/SRC consortium
• 2015 UES and 2016 UES comprise part of QILT
• Methodology in line with previous administrations

  16. National results, key items, 2011-2013

  17. National results, scales, 2012-2013

  18. International benchmarking

19. Graduate Outcomes Survey
• GOS will provide timely labour market information
• Review existing AGS instrument to develop the new GOS instrument and methodology
• Reporting methods need review
• Survey administered in two rounds, October and April (following year), though alternative proposals considered
• 2015 AGS administered by GCA
• GOS to replace AGS from the second half of 2015
• 2016 GOS and 2017 GOS comprise part of QILT

20. Employer Satisfaction Survey
• ESS measures employer satisfaction with generic skills, technical skills and work readiness of graduates
• Recent graduates provide contact details of their direct supervisor
• Pilot ESS in 2013-14 across 4 universities and 5 fields of education
• Pilot ESS report available from: www.education.gov.au/employer-satisfaction-survey
• Further development of ESS in October-December 2014:
  – instrument and survey methodology to be reviewed and tested on a larger scale
  – universities participate on an opt-in basis
• 2014 ESS trial, 2015 ESS and 2016 ESS comprise part of QILT

21. ESS – key results
Overall, 92 per cent of employers stated they would be confident recommending someone with the same qualification from the same university for a position similar to the graduate’s role.
Overall satisfaction by key factors (skill clusters):
• Foundation skills – 95 per cent
• Adaptive skills – 89 per cent
• Teamwork and interpersonal skills – 96 per cent
• Disciplinary skills – 93 per cent
• Employability skills – 93 per cent
• Enterprise skills – 79 per cent

22. QILT Website
• QILT website aims to inform student choice through provision of survey results by institution and course
• Two phases:
  – Initial release (August 2014): existing data from the AGS and UES, to be provided by the department; limited scope – undergraduates at Table A and B universities
  – Further development (progressively to August 2015): expanded scope – all institutions and levels of study; enhanced functionality, e.g. improved search and comparison functions; development and release of a mobile app
• Website updated every six months from August 2014 with the latest available QILT survey data

23. Governance
• QILT Working Group will provide advice to the department on implementation of QILT
• QILT Working Group largely comprises former members of the AQHE Reference Group
• QILT Working Group will include representation from non-university higher education institutions (NUHEIs)

24. General methodology
• Independent administration
• Random stratified sample
• Stratification variables are specified for each instrument
• HEIMS data to be made available for constructing sample frames
• Additional information for the construction of sample frames, along with student and graduate contact details, to be collected from institutions

25. Quality assurance
• Biases and unrepresentativeness minimised
• Quality assurance is based on achieving a required confidence interval, on specific items, for a set proportion of strata
• UES (entire educational experience) and GOS (further full-time study):
  – interval of ±7.5 percentage points at the 90% confidence level
  – 75% of strata at the institution by subject area level, excluding strata with a population of fewer than six
• ESS (overall supervisor rating):
  – interval of ±7.5 percentage points at the 90% confidence level
  – 8 out of 10 broad fields of education
  – 25 responses for each field at each institution
• Confidence intervals calculated as per the 2013 UES
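
As a rough illustration of what the ±7.5 percentage point quota implies, the sketch below computes the half-width of a proportion’s confidence interval at the 90% level using the conservative p = 0.5 (see slide 39). The function names are illustrative and no finite population correction is applied here; the actual calculation follows the 2013 UES method.

```python
import math

Z_90 = 1.645  # two-sided z-score for a 90% confidence level

def margin_of_error_pp(n_responses: int, p: float = 0.5) -> float:
    """Half-width of a proportion's confidence interval, in
    percentage points, using the conservative p = 0.5."""
    return 100 * Z_90 * math.sqrt(p * (1 - p) / n_responses)

def stratum_meets_quota(n_responses: int, threshold_pp: float = 7.5) -> bool:
    return margin_of_error_pp(n_responses) <= threshold_pp

print(margin_of_error_pp(121))   # ~7.48, so roughly 121 completes suffice
print(stratum_meets_quota(121))  # True: within the +/-7.5 pp quota
```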

26. NUHEIs
• Currently around 85 Non-University Higher Education Institutions (NUHEIs) accredited under HESA
• Limited participation in the AGS to date; not in scope for the UES
• NUHEIs in scope for QILT:
  – 2015 ESS
  – 2015 UES (trial of NUHEIs in 2014 UES)
  – 2016 GOS (from the October 2015 round)
• Survey results for NUHEIs to be included on the QILT website, subject to data being robust

27. Tender
• Tender opened 13 June
• Tender closed 11 July
• Professor Shirley Alexander, UTS, to assist the Department in assessing tenders
• Expect to announce outcome by the end of July
• Market testing through tender for the 2014-16 triennium

28. Confidence interval paper – ANU
http://unistats.anu.edu.au/surveys/toolkit/Confidence_Intervals_Sheet.pdf

  29. Population file updates

30. Contents of the population file
• Student background variables will be used for reporting purposes and to verify the representativeness of the sample.
• Course-related variables are required to develop the sample frame and to pre-populate sections of the survey.
• Student address details will only be used to send letters to students who do not respond to the email invitation.

31. Derived variables
• Age
• Concurrent/major course indicator
• Cumulative EFTSL since commencement
• Enrolment status
• Exclusions
• Sample frame (set to blank in population file)
• Defining commencing and final year students
• Strata
• Subject area

32. Sample frame exclusions
• Enrolled in a postgraduate or non-award course
• Offshore undergraduates
• Onshore undergraduates in the middle year of a course
• Onshore undergraduates enrolled concurrently in another course
• Onshore undergraduates in strata with six or fewer students
• Onshore undergraduates randomly excluded from very large strata
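
A minimal sketch of how these exclusions might be applied to a population file, assuming a pandas DataFrame with illustrative column names (these are not the HEIMS element names); the random thinning of very large strata is left to the sampling stage:

```python
import pandas as pd

def apply_frame_exclusions(df: pd.DataFrame) -> pd.DataFrame:
    """Drop records excluded from the UES sample frame.
    Column names are illustrative, not HEIMS element names."""
    in_scope = (
        df["course_level"].eq("undergraduate")       # no postgraduate/non-award
        & df["onshore"]                              # offshore undergraduates out
        & df["stage"].isin(["commencing", "final"])  # no middle-year students
        & ~df["concurrent_other_course"]             # no concurrent enrolments
    )
    df = df[in_scope]
    # Drop strata with six or fewer students; random exclusion from
    # very large strata happens later, when the sample is drawn.
    sizes = df.groupby("stratum")["student_id"].transform("size")
    return df[sizes > 6]
```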

33. Defining final year students
• The ratio of ‘EFTSL completed successfully’ (E355) plus ‘EFTSL currently in progress’ (E339) to the total EFTSL for the course (E350) represents a student’s progression to date.
• The standard solution adjusts for attendance mode (E330) and course duration (E350) and requires a greater proportion of cumulative EFTSL for longer courses.
• Final year enrolment estimates for 2014 accord reasonably well with course completions in 2013.
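
A worked sketch of the progression ratio described above. The 0.75 cut-off is a hypothetical illustration only, since the actual rule adjusts for attendance mode (E330) and requires a greater proportion of cumulative EFTSL for longer courses:

```python
def progression_ratio(e355: float, e339: float, e350: float) -> float:
    """(EFTSL completed successfully + EFTSL in progress) / total course EFTSL."""
    return (e355 + e339) / e350

def is_final_year(e355: float, e339: float, e350: float,
                  threshold: float = 0.75) -> bool:
    # The 0.75 threshold is illustrative; the standard solution
    # varies it with course length and attendance mode (E330).
    return progression_ratio(e355, e339, e350) >= threshold

# A student 2.5 EFTSL into a 3.0 EFTSL course:
print(progression_ratio(2.0, 0.5, 3.0))  # ~0.83
print(is_final_year(2.0, 0.5, 3.0))      # True
```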

34. Sample strata
• The sampling strata were built on the 45 Subject Areas reported on the MyUniversity website.
• The code ‘2236_LY_29’ refers to Curtin University of Technology (2236), where final year students (LY) were enrolled in the Subject Area Business Management (29).
• Students in combined/double degrees are allocated to the Subject Area with the fewest students.
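
The stratum label reads as institution code, stage and subject area. A small sketch of how such a code could be assembled; ‘LY’ for final year comes from the slide, while ‘CY’ for commencing students is an assumption:

```python
def stratum_code(institution_id: str, stage: str, subject_area: int) -> str:
    """Build a stratum label such as '2236_LY_29': HEIMS institution
    code, stage ('LY' = final year; 'CY' for commencing is assumed),
    and MyUniversity subject-area code."""
    return f"{institution_id}_{stage}_{subject_area}"

print(stratum_code("2236", "LY", 29))  # '2236_LY_29'
```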

35. What we need you to do
The information required to conduct the survey has already been compiled – well, almost. We need you to do four things:
• Inspect the data file for correctness, but not forensically.
• Update students’ current enrolment status to make sure we don’t contact students who are not currently enrolled.
• Provide email addresses for all students (and mobile phone numbers if you want us to SMS).
• Provide term addresses for onshore international students.

36. What we need you to tell us about
Were there any significant changes in your student profile between 2013 and 2014 that might affect subject areas and stages of enrolment?
• Have different fields of education been allocated to the same courses in 2013 and 2014?
• Are course duration (E350 Course of Study Load) values correct?
• Cumulative EFTSL….
• Is there anything else we need to know about your data?

  37. Sampling

38. (not so) Random terms…
• Confidence level – the percentage of all possible samples that can be expected to include the true population parameter
• Confidence interval – a measure of the reliability of an estimate (margin of error)
• Sample factor – the number of records required to achieve a completed survey
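
Reading “sample factor” as the inverse of the expected response rate (an interpretation of the definition above, not a formula from the slides), the arithmetic looks like this:

```python
def sample_factor(expected_response_rate: float) -> float:
    """Records to select per completed survey, e.g. an expected
    25% response rate implies 4 records per complete."""
    return 1 / expected_response_rate

required_completes = 121  # e.g. for +/-7.5 pp at 90% confidence
print(required_completes * sample_factor(0.25))  # 484.0 records to sample
```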

39. Sampling & percentages
• The accuracy of an estimate also depends on the percentage of a sample that selects a particular answer.
• In general, it is easier to be sure about extreme answers: if 95% of respondents agree with a statement, the chances of making an error are small.
• If 51% of people agree, then the chances are greater that an error will be made, because of the general ambivalence in relation to the statement.
• Sampling theory suggests that percentages be treated conservatively and estimates be made from the worst-case, middle-of-the-road scenario of 50%.
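
The 50% worst case falls directly out of the binomial margin of error, which peaks where p(1 − p) is largest, i.e. at p = 0.5. A short demonstration, assuming 400 responses (the sample size is illustrative):

```python
import math

def margin_pp(p: float, n: int, z: float = 1.645) -> float:
    """Margin of error in percentage points at 90% confidence."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

for p in (0.95, 0.75, 0.51, 0.50):
    print(f"p = {p:.2f}: +/-{margin_pp(p, 400):.1f} pp")
# p = 0.95: +/-1.8 pp  (extreme answers are easier to be sure about)
# p = 0.75: +/-3.6 pp
# p = 0.51: +/-4.1 pp
# p = 0.50: +/-4.1 pp  (the conservative worst case)
```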

40. Previous approach to sampling
• Focus on obtaining a 35% response rate
• Estimates of the sample factor based on the 2012 UES
• Difficult to determine accurately due to sample frame, response rate and mode issues
• Sampling focused at an institutional level, not a strata (subject area) level
• Large strata capped at 1,333 records irrespective of size

41. 2014 approach to sampling
• Focus on a response rate that supports reporting at a 90% confidence level, ±7.5%
• Required response rates differ across strata: the smaller the subject area, the higher the response rate needs to be
• Sample factor estimates informed by the 2013 UES, allowing better estimation of differential response rates across strata
• For large strata, the required number of sample records depends on the number of students and the 2013 response rate
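
A sketch of the per-stratum logic this implies, under stated assumptions: the conservative p = 0.5, a standard finite population correction, and simple inverse scaling by the expected response rate. The SRC’s actual workings (see slide 44) may differ:

```python
import math

Z_90 = 1.645  # z-score for a 90% confidence level

def required_records(stratum_size: int, response_rate: float,
                     target_pp: float = 7.5) -> int:
    """Records to draw so the expected completes support the
    +/-target_pp quota at 90% confidence (worst case p = 0.5)."""
    n0 = (100 * Z_90 * 0.5 / target_pp) ** 2   # ~121 completes needed
    n = n0 / (1 + (n0 - 1) / stratum_size)     # finite population correction
    return min(stratum_size, math.ceil(n / response_rate))

print(required_records(stratum_size=2000, response_rate=0.25))  # 454: a sample
print(required_records(stratum_size=150, response_rate=0.25))   # 150: a census
```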

42. Complexity…
• This approach means that the sample factor is calculated separately for each stratum (i.e. institution by subject area), taking into account whether the student is commencing or in their final year
• Our assumptions are conservative, estimating that responses to key items are 50% rather than the 70% to 80% we actually obtain
• We are also sampling on the basis of a stretch goal, based on a reporting standard of a 90% confidence level ±5% (rather than ±7.5%)

43. What does this mean?
• We have sampled more records than we theoretically need (but we’re still a bit paranoid; it’s early days with the survey)
• Despite this, a larger number of strata are samples rather than censuses in 2014 (due to the retirement of the 1,333 rule)
• Quotas will be set at 90%, ±7.5%, but we are aiming for ±5%

44. These workings are from SRC. If you would like access to the full version of this Excel file, please contact SRC.

  45. Research framework & data collection

46. What’s ‘the same’?
• HEIMS provides the sample frame
• Surveying at the level of the course, not the student
• 100% online data collection
• Letters used as the primary non-response follow-up strategy
• Incentives and prize draw are unchanged
• Live reporting of data collection

47. What’s new (questionnaire, pt 1)?
• Using a standard rather than a rotated presentation of the questionnaire modules
• Removal of the Graduate Qualities Scale (GQS) and the Learning Community Scale (LCS) from the CEQ
• Presentation of the CEQ scales to a sample of final year students across all institutions, rather than a census of all students at a selection of universities (n=400 completed surveys)

48. What’s new (questionnaire, pt 2)?
• Removal of the Student Support focus area item ‘At university during (year x), to what extent have you used university services to support your study?’
• Addition of two items at the conclusion of the UES confirming graduation intentions and collecting a private email address, to facilitate interfacing the UES with the 14/15 AGS

49. What’s new (fieldwork)?
• Option to use SMS instead of, or as a supplement to, the email reminder

  50. Live Reporting Module
