

  1. Access and Participation Review Survey. John Higton, Research Director, CFE Research; Josie Harrison, Associate Director, CFE Research; Irshad Mulla, Senior Research Executive, CFE Research

  2. Aims and objectives

  3. Purpose of the study • This research informs the Office for Students’ (OfS) review of access and participation plans, student premium funding, and the National Collaborative Outreach Programme (NCOP). The aims of the access and participation review are to: • consider how OfS can best regulate access and participation to significantly reduce gaps in access, success and progression; and • deliver regulatory outcomes that are focused, proportionate and risk-based, underpinned by evidence and joined up with other regulatory activities. • This study collects the views of the sector on how access and participation plans can most effectively work with OfS funding and other regulatory levers to improve equality of opportunity in student access, success and progression for underrepresented groups.

  4. Methodology

  5. Online survey methodology • The findings of this report are based on an online survey issued to sector representatives in two ways: • a targeted email invitation to a contact list of OfS access and participation leads in 243 organisations in England; this list is primarily composed of higher and further education providers; • an open survey link available to other interested parties, including sector stakeholders, sector staff and representative bodies. • The survey is designed to understand the views of those with a direct interest in higher education access and participation. • The data is unweighted as no overall population figures are available that define sector representation. The views collected are primarily, but not solely, institutional, i.e. respondents were mostly asked about how organisations design, implement and manage access and participation policy.
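As a purely illustrative aside, the sketch below shows one way the two response sources described above could be combined and tabulated without weighting. The column names, example values and use of pandas are assumptions made for illustration; this is not CFE's actual processing pipeline.

```python
# Illustrative sketch only (assumed approach, not CFE's actual pipeline):
# combine the targeted-invite and open-link responses into one unweighted
# dataset, keeping a source flag so tables can be broken down by respondent type.
import pandas as pd

# Hypothetical records and column names, for illustration only.
targeted = pd.DataFrame({
    "org_type": ["HEP", "HEP", "FE college"],
    "role": ["Leader", "Practitioner", "Practitioner"],
})
open_link = pd.DataFrame({
    "org_type": ["Other organisation", "Other organisation"],
    "role": ["Practitioner", "Leader"],
})

targeted["source"] = "Targeted email invitation"
open_link["source"] = "Open survey link"

responses = pd.concat([targeted, open_link], ignore_index=True)

# Unweighted tabulation: simple counts and column percentages. No weights are
# applied because no population frame defining sector representation exists.
counts = pd.crosstab(responses["role"], responses["source"])
percentages = pd.crosstab(responses["role"], responses["source"], normalize="columns") * 100
print(counts)
print(percentages.round(1))
```

Keeping the source flag also makes it straightforward to report how far open-link respondents differ from the targeted contact list, without implying either group is representative of the sector.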

  6. Profile of survey respondents

  7. The survey audience is an even mix of leaders and practitioners • CEOs, Principals, Senior Directors, etc. are classed as “leaders”. • Members of staff with widening access or participation responsibilities are labelled as “practitioners”.

  8. A mix of roles within HEPs is reflected in the responses. Chart: Job role

  9. Respondents’ backgrounds may influence survey responses • In the main, the raw data in the following slides shows that leaders are less positive than practitioners. • However, some of the differences are driven by practitioners providing equivocal responses; leaders are less likely to give a “don’t know” response to questions. • There are some differences in opinion based on whether respondents work for a Higher Education Provider (HEP) or not. • For half of those not working for HEPs, their role covers only National Collaborative Outreach Programme (NCOP) activities. • However, it is critical to note that no weighting is applied to account for non-response as suitable population data is unavailable: the data are indicative only.

  10. Views on guidelines and targets

  11. The current guidelines are at least “fairly effective” for most. Chart: Effectiveness of the current guidelines in... Base: Have some responsibility or knowledge of access and participation plan (APP) targets; n=160

  12. Issues comparing performance across the sector are the main concern • These questions were posed to those with some responsibility or knowledge of access and participation plan (APP) targets. • Around half say the current guidelines are ineffective in allowing comparability of performance in access and/or participation across the sector. • 57% of those working for HEPs say this compared to 35% of those working for other types of organisations. • The other “significant”* difference is in “helping your organisation to measure the progress you are making in improving access and/or participation”: 84% of those representing HEPs felt current guidelines are effective compared to just under two-thirds (63%) of those working for other organisations.* * The data is unweighted and sub-groups have low base sizes.

  13. When guidelines are effective, it is because they are clear, provide direction and encourage challenge • The most common reason respondents give for the guidelines being effective is that they provide clear direction for the organisation. • Here, direction means the guidelines outline what is to be covered in plans and how plans can be delivered and measured. The “how” encompasses the range of target groups, target setting and measurement, and how to focus the work of staff to improve access and participation. • Clarity covers how easy the guidelines are to read and apply to access and participation activity. • The guidelines also express how important ambitious targets are and, in some cases, what ambitious means (although some who think planning is less effective say they want a clearer definition of ambitious). • The data shows respondents rate guidance as “fairly effective”. The open responses often qualify these positives in some way.

  14. Providing direction and challenge with clarity
“They ask us to review our internal data and compare this with national data to set targets that are meaningful to the university and that represent significant ambition for us in our context.” (Senior Leader, HEP)
“[Guidelines are effective] …by providing guidance on key priority groups and stages of the student cycle, however the way that the targets are set and monitored make it difficult to review and update them meaning that targets can easily become outdated and based on historic data.” (Senior Director, Other organisation)
“[The guidance] provides a strategic and collective national focus that facilitates us re-thinking our targets. Our data collection will become smarter and more granular so we can fine tune the work we do for under-represented groups, and where we can make the biggest impact.” (Senior Director, Other organisation)
“Current guidelines are clear about the type of targets that are required and that they should be ambitious. The problem with targets in a changing market is that they can become more difficult to achieve - for example targets on mature students - they might have seemed stretching but achievable - but since mature numbers are dropping they then become very unlikely to be achieved.” (Academic staff, HEP)

  15. Where guidelines are ineffective, it is because they are unclear or contain inappropriate measures • Criticism of the guidelines concerns aspects of their perceived quality. • This includes views that they are too restrictive in measurement, provide too little guidance (or guidance that is too long) and that they discourage risk-taking. • Many of these criticisms appear subjective and seem to reflect a particular circumstance faced by an organisation. • The targets and measures used also receive some criticism. Access and participation targets can clash with other performance targets, do not change with contextual circumstances, or are not comparable across the sector. • Targets also introduce measurement effects which some feel work against the wider purpose of improving representation in higher education. Some believe targets lead providers to concentrate on activities that are easy to measure. A few also think targets are unrealistic and hence difficult to meet.

  16. Issues with guidelines and targets
“Targets for our institution were set many years ago. We have not been permitted to remove any targets, even though some are now holding us back from changing focus to pursue what we would regard as greater priorities for improving access.” (Senior Director, HEP)
“Current guidance tends to emphasise activity that can be directly or quantitatively reported. This can [lead] institutions to prioritise some activities at the expense of others... We may also be discouraged from reporting successful or effective activities which do not result in outcomes that can be presented quantitatively.” (Senior Director, HEP)
“[The guidance is] designed for a large multi-faculty university. Requirement to have targets across multiple areas isn’t helpful for small institutions which might do better focusing their activity.” (Senior Director, Other organisation)
“[Guidelines] lack of specific definitions (including of issues such as ‘ambitious’, ‘stretching’, ‘progressive’, ‘sustained’), agreed measures and data sources and lack of clarity in terms of rate of expected improvement.” (Senior Director, HEP)

  17. Common targets are also seen as an effective tool by most. Chart: If common targets were used, how effective would they be in... Base: Have some responsibility or knowledge of access and participation plan (APP) targets OR work for a HEP; n=198

  18. Common target setting is broadly viewed as useful • The only sub-group difference of note is for whether common targets effectively “allow you to set ambitious targets for underrepresented and disadvantaged students”. • More than four in five practitioners (84%) felt common targets were at least fairly effective compared with two-thirds of leaders (68%). • When organisations’ representatives value ambitious targets, they say they provide direction and encourage challenge.
“They ask us to review our internal data and compare this with national data to set targets that are meaningful to the university and that represent significant ambition for us in our context.” (Senior Director, HEP)

  19. Respondents tend towards giving OfS responsibility for consistent measures, but… Chart: Who should be responsible for devising each of the following elements of targets so that their measurement is consistent across the sector? Mean scores for the four elements: 6.4, 5.5, 5.5, 5.4 (higher scores indicate greater OfS responsibility). Base: Have some responsibility or knowledge of access and participation plan (APP) targets OR work for a HEP; n=198

  20. … there are differences of opinion between organisation types • HEPs are more likely to place the responsibility for consistent measures with OfS, whereas other organisations suggest responsibility lies with them, or is shared. • Practitioners are more likely than leaders to say OfS should be responsible for the consistency of the units of measurement (48% versus 24%). Leaders are more likely to say responsibility for data source consistency should be shared (40% versus 28%). Base: Have some responsibility or knowledge of APP targets OR work for a HEP; n=198. Scale coding: 0–3 = mainly organisational responsibility; 4–6 = shared; 7–10 = mainly OfS responsibility.
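As a purely illustrative aside, the sketch below shows one way a 0–10 responsibility score could be banded into the three categories listed above and summarised as a mean (the figures marked ◊ on the charts). The example scores and the pandas-based approach are assumptions; this is not the survey's actual analysis code.

```python
# Minimal sketch (assumed approach, not CFE's actual analysis): band a 0-10
# "responsibility" score into the three categories used on the slides and
# report the mean score shown as a diamond marker on the charts.
import pandas as pd

# Hypothetical scores for one target element, for illustration only.
scores = pd.Series([2, 5, 8, 9, 4, 7, 10, 3, 6, 8])

bands = pd.cut(
    scores,
    bins=[-0.5, 3.5, 6.5, 10.5],  # 0-3, 4-6, 7-10
    labels=[
        "Mainly organisational responsibility",
        "Shared",
        "Mainly OfS responsibility",
    ],
)

print(bands.value_counts(normalize=True).mul(100).round(0))  # % in each band
print(round(scores.mean(), 1))  # mean score (the chart's diamond marker)
```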

  21. Responsibility for the comparability of the same targets lies more with OfS. Chart: Who should be responsible for devising ... targets for them to be comparable across the sector? Mean scores for the four elements: 6.6, 6.2, 5.9, 5.7 (higher scores indicate greater OfS responsibility). Base: Have some responsibility or knowledge of access and participation plan (APP) targets OR work for a HEP; n=198

  22. HEPs are more likely to say OfS should be responsible for targets that are comparable • Practitioners are more likely than leaders to say OfS should be responsible for the comparability of the units of measurement (55% versus 40%) and the metrics used (58% versus 43%). • In both cases, leaders are more likely to say these responsibilities should be shared. Base: Have some responsibility or knowledge of APP targets OR work for a HEP; n=198. Scale coding: 0–3 = mainly organisational responsibility; 4–6 = shared; 7–10 = mainly OfS responsibility.

  23. Responsibility for useable targets is a little more shared compared to the previous measures, but… Chart: Who should be responsible for devising ... targets for them to be easy to understand and use across the sector? Mean scores for the four elements: 6.0, 5.5, 5.4, 5.2 (higher scores indicate greater OfS responsibility). Base: Have some responsibility or knowledge of access and participation plan (APP) targets OR work for a HEP; n=198

  24. … there are differences by role as practitioners place more responsibility on OfS than leaders • Practitioners are more likely than leaders to say OfS should be responsible for useable targets, except for the number of targets set. • Leaders are more likely to say their organisation should be mostly responsible for setting useable targets, although overall, leaders’ opinion is split on all measures. • There were no differences found in this question based on the type of organisation for which the respondent worked. Base: Have some responsibility or knowledge of APP targets OR work for a HEP; n=198. Scale coding: 0–3 = mainly organisational responsibility; 4–6 = shared; 7–10 = mainly OfS responsibility.

  25. Most agree that linking student premium funding to targets would support progress. Chart: To what extent do you agree with the following statements about the Office for Students’ student premium funding? Base: Have some responsibility or knowledge of access and participation plan (APP) targets OR work for a HEP; n=198

  26. Leaders disagree more than practitioners on all statements • Half of leaders (49%) disagree that student premium funding should be used to “support the providers with the most ambitious plans to improve access and/or participation”. Less than a third of practitioners (30%) disagree. • A similar difference of opinion is found for the statement that student premium funding “should be reduced where providers do not make sufficient progress towards improving access and/or participation” (45% leaders; 31% practitioners). • Although general agreement was higher for the other two statements, leaders are more likely than practitioners to disagree with them. • Nearly three in ten leaders (28%) disagree that “Linking student premium funding to access / participation targets would support progress in the sector”; one in ten (11%) practitioners disagree. • More than a third of leaders (37%) disagree that linking data “would encourage providers to be ambitious when setting targets”; a quarter of practitioners (24%) disagree. • No differences exist by organisation type.

  27. Views on the cycle of plans

  28. Most providers favour a planning cycle of at least two years. Chart: Frequency of submitting A&P plans

  29. Overall, practitioners favour a shorter cycle compared to leaders • Just 5% of leaders favour the current annual cycle compared to three in ten (29%) practitioners. • Leaders clearly favour a three-year cycle, whereas the views of practitioners are split.
“The current cycle of an annual submission of a plan and annual monitoring returns diverts key resources away from delivery of activity to effect change, therefore targets cannot be as ambitious as we would like as resource availability is a limiting factor.” (Senior Director, HEP)

  30. The rationale for lengthening the reporting cycle is to improve planning and analyses

  31. An annual reporting period makes it harder to measure impact • The most frequent reason given for increasing the period between reporting was to improve the quality of outcome and impact measures. • Some favoured the shift to a two-year reporting cycle in order to balance the need for contemporaneous data with enough time to realise impact. • However, most respondents citing impact measurement felt three years between submissions was needed to effectively measure impact.
“Two years will give sufficient time for implementation and evaluation without the constant churn of writing the APP. The amount of change in government policy means that any longer than two years would mean the document could be out of date quickly.” (Senior Director, HEP)
“It would enable providers to implement and evaluate measures effectively before having to submit a further iteration of the plan. We have had to submit two further plans before we have even implemented our first which means we have no real data to work with.” (Business support staff, Other organisation)

  32. Both a shorter and longer time period can help organisations plan • Short cycles mean that organisations are planning using current data. In turn, this means they can make adjustments to plans regularly and report on those changes regularly. • However, others think that could still be achieved using summary data, or through focussing on specific targets – a full, redrafted plan each year is not deemed necessary. • More frequent reporting could be requested where performance could improve, although some feel this provides an incentive to set limiting targets. • A number of those advocating a longer planning cycle note a connection between better institutional planning, impact analyses and being given time to develop, test and implement new ideas.
“Annual review would allow institutions to reflect on what is working well, make small changes to things that could be improved or develop new approaches in response to emerging needs. Regular review and monitoring … would also ensure risks were well managed and issues did not escalate.” (Academic staff, HEP)
“It would allow institutions to plan, implement and evaluate interventions to meet identified challenges in access and participation. However I would suggest more frequent submissions for institutions with the poorest performance in access and participation.” (Academic staff, HEP)

  33. Time to develop, test and implement activity is a further suggested benefit of a longer cycle • Access and participation activity is designed, tested and implemented over a number of years and, for some, the planning and reporting cycle should take that into account. • Responses of this nature often tie in with others’ wishes to spend more time on delivery than administration, to allow more time for impact analysis and to improve the quality of monitoring metrics. • Some also relate a longer-term reporting cycle to better strategic planning for their organisation and/or partnerships.
“Three years would allow for long-term programmes of work to be established with continuity of funding, and cement relationships with schools. It's also a long enough period of time to establish how effective approaches are, and to refine them for future submissions as required.” (Senior Director, Other organisation)
“Two years will give sufficient time for implementation and evaluation without the constant churn of writing the APP. The amount of change in government policy means that any longer than two years would mean the document could be out of date quickly.” (Senior Director, HEP)

  34. Monitoring arrangements

  35. The current monitoring arrangements are viewed as ineffective for monitoring progress. Chart: Effectiveness of the existing monitoring arrangements for… Base: Have some responsibility or knowledge of access and participation plan (APP) targets OR work for a HEP; n=198

  36. The key driver of difference is knowledge • Respondents working for organisations other than Higher Education Providers are more likely to answer “don’t know” to the questions posed. As a result, the sub-group differences are driven by limited knowledge rather than by a difference in opinion. • Between five and six in ten of those working for HEPs rate all four aspects of monitoring arrangements as ineffective. Of particular note is “assessing your institution’s progress compared to other institutions”; 63% of HEP representatives said existing monitoring arrangements are, at best, not very effective. • Practitioners are more likely than leaders to respond “don’t know” to each statement. • Leaders are also especially critical of monitoring to “assess your institution’s progress compared to other institutions” with 72% rating such monitoring as ineffective. • 64% of leaders are also critical of “identifying how you can improve access and/or participation in a more cost-effective way”; 59% of leaders also say monitoring to “assess your institution’s progress compared to the sector as a whole” is ineffective.

  37. The monitoring process and return receive mixed reviews. Chart: To what extent do you agree with the following statements? Base: Have some responsibility or knowledge of access and participation plan (APP) targets OR work for a HEP; n=198

  38. The reporting cycle is an issue with the monitoring return • Responses to the question “how often do you think access and participation plans should be submitted to the Director of Fair Access and Participation for approval?” help explain why the monitoring return is deemed ineffective by some. • For some, the annual reporting cycle does not allow time to evaluate impact, as more time is required to deliver and enact plans. • For others, the metrics used to measure progress are biased towards those that are easier to measure, so monitoring does not provide a full report of a plan’s impact.
“Annual submissions don't allow for time to be spent developing, delivering and evaluating the true impact of any activity before we have to do the next plan or submit the next return.” (Senior Director, HEP)
“Reviewing the access and participation plan on a two year basis would allow a more realistic assessment of the impact of measures taken by a provider.” (Academic staff, Other organisation)

  39. Although burdensome, annual monitoring is broadly effective for monitoring progress. Chart: To what extent do you agree with the following statements? Base: Have some responsibility or knowledge of access and participation plan (APP) targets OR work for a HEP; n=198

  40. For some, burden relates to the reporting requirements • Qualitative responses are useful in explaining the relationship between burden and reporting. A number of respondents note that annual comprehensive planning is burdensome and could be replaced with a less frequent full plan supported by annual summary reporting. • Leaders are again more critical: two-thirds (68%) agree the monitoring process is burdensome and over a third (37%) disagree that annual reporting is an effective way of reporting outcomes. The comparative figures for practitioners are 49% and 19%.
“Every two years for formal reporting allows for HEIs to develop activities and refine approaches. Every other year an interim report could be required (slimmed down, and perhaps targeted to key areas).” (Senior Director, HEP)
“Annual tracking is fine, but a full strategic re-write every year is pretty meaningless. A three year cycle gives institutions and the OfS the opportunity to let more ambitious ideas fully germinate.” (Senior Director, Other organisation)

  41. Views on funding and investments

  42. Respondents hold mixed views on the effectiveness of student premium funding • There is little difference by sub-group on either measure. • Respondents providing an opinion are more likely to agree than disagree with each statement. • However, around one in five disagree that investment guidance (22%) or student premium funding arrangements (20%) lead to improvements in equality of opportunity. Base: All respondents; n=245

  43. Where an opinion is expressed on student premium funding, it is broadly positive. Chart: How effective is the Office for Students’ student premium funding in … Base: All respondents; n=245

  44. Respondents working for HEPs are more critical on two measures • Respondents working for HEPs are less likely than those working in other organisations to agree that student premium funding is effective in “supporting the collaborative access and participation activities of higher education providers” (40% agree versus 53%). • HEP representatives are much more likely than others to disagree that student premium funding is effective in: • “supporting the development of robust evaluation plans” (32% disagree versus 18%); or • “adding value to the investment providers make through their access and participation plans” (17% disagree versus 5%). • The disagreement found amongst HEP representatives is mirrored by leaders, who are more likely than practitioners to disagree that student premium funding is effective in: • “supporting the development of robust evaluation plans” (33% disagree versus 19%); or • “adding value to the investment providers make through their access and participation plans” (18% disagree versus 6%). • However, apart from the views about evaluation plans, respondents are more likely than not to say student premium funding is effective on the other measures tested.

  45. Supplementary questions on how to improve funding provided varied responses

  46. Targeting or channelling funding • Around a quarter of the comments suggest improvements in targeting funding. • A common view in this category is to ensure those with more eligible students get more funding, because they say they face a greater administrative, management and wider support burden. • A view expressed mostly by non-HEP respondents is to channel some funding directly to individual students rather than via organisations. • Counter to this, there is also a view that FE institutions should be funded directly because they have different requirements to HEPs.
“Funding could be weighted more heavily towards those institutions with a high proportion of WP students, as the challenge of supporting say 40% from WP backgrounds is completely different from the challenge of supporting 5%.” (Business support staff, HEP)
“Offer travel and accommodation bursaries/scholarships for students from NCOP areas. Capture students based on criteria other than location such as parental income and families who have not attended university.” (Academic staff, Other organisation)
“Give more consideration to vocational colleges which are very different to schools and sixth form. Further, more consideration needs to be given to the less academic and more vocational NCOP students.” (Academic staff, Other organisation)

  47. Accountability and measurement are sometimes perceived as imperfect • A number of respondents advocate performance measurements whereby funding is channelled to those performing well. • However, there are divergent views on what constitutes good performance. As noted earlier, some consider this to be about volume, others about harder-to-measure issues such as the quality and effectiveness of support structures. • In terms of measures, a variety are suggested (volumes, rates, progression, diversity of targets), which supports the view from some that targets should reflect the contextual circumstances of a given provider.
“Look at the ‘rate of return’ - institutions that are genuinely committed to access and are successful, should be supported more to achieve more successful participation outcomes for all students.” (Senior Director, Other organisation)
“Currently, a few institutions are leading the way for the sector and bearing the costs involved in piloting, implementing and evaluating innovative approaches in order to identify what works and share this effective practice more widely.” (Senior Director, HEP)
“The OfS needs to maintain a funding commitment to supporting colleges and universities who actively recruit a high proportion of students from low participation groups.” (Senior Director, Other organisation)

  48. Widening targets or allowing more flexible use of funding • A number of respondents ask for more freedom, or recognition that targets can miss important support work that improves equality of opportunity. • A number of comments here consider the target audience and how existing eligibility criteria do not always reach those in most need, especially criteria with a geographical focus (as opposed to the needs of the specific students). • Several respondents want funding to recognise the longer-term purpose of access and participation activity and feel this is missing in the current arrangements. In particular, the narrow measures used to monitor performance do not always take wider support work into account.
“[OfS] could consider introducing more categories of vulnerable student group for explicit consideration under the student premium scheme.” (Academic staff, HEP)
“Instead of targeting students in specific postcode areas with additional funding for separate activities… introduce more funding across the cohort so that all students can benefit from the opportunities.” (Academic staff, Other organisation)
“More focused programme-level funding over longer term horizons for outreach to ensure stability and sustained investment in infrastructure to support delivery… will have longer term benefits and return for the whole sector.” (Senior manager, HEP)

  49. Considering wider factors that affect access and participation • These comments link to the previous page about wider factors. They consider the contextual factors that have an impact on equality of opportunity. • These include geographical factors outside of those that are perceived to govern NCOP, such as how funding relates to the local or regional socio-economic contexts in which providers operate. • Other comments relate to wider management and delivery, in which suggestions are made to link funding to support activity, or to recognise that some collaborative activity requires greater financial support to deliver effectively. • A few also think that interventions need to come earlier in potential students’ lives because, by the time they leave school, many issues limiting access to HE are deeply engrained.
“[Provide] more CPD to school and college staff… to help facilitate participation in outreach activity. One of the biggest barriers to the effectiveness of WP schemes is staff within schools having time to coordinate... activity.” (Business support staff, Other organisation)
“This is a huge challenge due to the cultural norms and behaviours of Universities, schools and colleges. We would argue collaborative work, funded by projects such as NCOP makes a significant difference when we work in partnership.” (Senior manager, Other organisation)
“To truly widen participation, universities and colleges should be working with young people in disadvantaged areas from as early a stage as is possible (KS1 onwards).” (Senior manager, HEP)

  50. Respondents broadly agree that plans and funding work together effectively. Chart: To what extent do you agree with the following statements about the total investment made by higher education providers and the OfS to improve access and participation? Base: All respondents; n=245
