
Portfolio Committee


Presentation Transcript


  1. Portfolio Committee Report on the Investigation into Delays in the Release of Examination Results in 2008, 24th March 2009

  2. Problem Statement • On 30th December 2008, it was announced that the results of 56 351 candidates had not been processed. The Department has apologised to all these candidates, and invited anyone who has been prejudiced in any way to contact us so that we can try to assist • The Minister directed the National Examinations Irregularities Committee (NEIC), including representatives of SAQA, HESA, SADTU, SAOU, and the GITO Council, together with Umalusi, to investigate the causes and make recommendations • The Committee conducted on-site investigations into all key processes leading up to the release of results

  3. Context • Examinations commenced one month later than usual (moved from October to November) to maximise teaching time; this reduced the time available to process results from 47 days to just 27 days • Of the 56 351 results reported to be outstanding, 29 614 candidates were absent from the exam, leaving 26 737 results delayed (4.5% of the total) • Of these, many were genuine “irregularities”: candidates with incorrect exam or ID numbers, candidates who had written a subject other than the one they had registered for, or candidates suspected of cheating • By 16 February (before universities commenced), only 92 results were outstanding
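
  A quick consistency check of these figures (a worked calculation, assuming “the total” refers to all enrolled candidates): 56 351 − 29 614 = 26 737 delayed results, and if 26 737 represents 4.5% of entries, total enrolment was roughly 26 737 ÷ 0.045 ≈ 594 000 candidates, consistent with the “over 500 000” learners later described as well served.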

  4. Context • Tightening up of the system • The previous programme did not specify the components of school-based assessment, which resulted in a casual approach by learners, teachers and schools; the new system has revealed this practice • The IECS only accepted computer-generated mark sheets this year • Problem of candidates changing courses at a late stage

  5. Findings The overall finding of the Ministerial Committee was: “the delay in the release of the results of the 26 737 candidates could not be attributed to a single factor but a collection of variables that impacted on the examination processes and procedures”.

  6. Findings System-level factors that impacted included the following: • First exam based on the NSC policy – lack of compliance with the policy • The decision to delay the start of the exams without adjusting the release date • Technical issues with the new examination system • Financial and human resources not expanded to address the new demands

  7. Findings Detailed factors that impacted included the following: • Incorrect registration of learners (wrong ID or exam numbers; wrong subject registrations) • Problems with the collection, capture and control of SBA and exam mark sheets from schools or examination centres to districts and provinces • Other irregularities, such as incorrect student data

  8. Findings • 95% of the results were processed on time: there was no “system breakdown” • The large provinces (Eastern Cape, KwaZulu-Natal and Limpopo) all processed on time • Based on the information to hand, all tender processes were above board; the Department partnered with SITA throughout. Magna FS provides examination services for 23 countries or organisations, including the University of London, the British Council, Sri Lanka, Zambia and others

  9. Steps taken: Departments • Provinces are conducting compliance audits to determine possible neglect of duty by school principals or officials • Cases currently in progress in the affected provinces: • Mpumalanga: officials relieved of exam duties pending further investigation • Gauteng: one official charged for not ensuring compliance; some principals being investigated

  10. Steps taken: Departments • The report does not identify misconduct by officials; capacity concerns are to be addressed through PMDS processes • A corrective plan has been prepared and presented to the CEM • Caution against “reckless” approaches in regard to scarce skills (especially ICT skills) • All Departments are committed to taking action as required

  11. Advocacy • Copies of the report to be provided to the Public Protector and the Human Rights Commission to decide whether further action should be taken • Work against “game playing” by various forces: • Claims of missing results, with no names provided – made by principals, political formations, parents, unions and others, and played up by the media • Claims that examiners were told to inflate marks (the matter was investigated; nobody could substantiate the claim) • Claims that the maths paper was of a poor standard (investigated by an independent team, which judged it to be of a proper standard)

  12. Corrective action for 2009 • Lessons learnt from 2008 have led to a national plan to correct all the findings of the report. This includes the following: • A further investment in resources, especially skilled human resources, at national and provincial levels • A comprehensive review of the capabilities of the IECS • Policy clarity in regard to procedures and regulations, and tight monitoring of every step • A review of the date of announcement, given the later commencement of exams

  13. Corrective action for 2009 • Tighten all regulations relating to examinations, including dates for course changes and the submission of school-based marks, and ensure compliance with these deadlines at all levels • Increase the capacity of examination units at national and provincial levels • Advocacy among teachers and candidates regarding the importance of SBA (especially for Life Orientation!)

  14. Conclusions • Most learners (over 500 000) were well served by the system • To those not well served, we repeat our apology and offer to assist wherever possible

  15. Conclusions • The report found that there had been: • Policy implementation failures, with people not being sure of the rules and/or not complying with them • Computer problems, with no opportunity for full-scale testing of the system • Capacity constraints, in terms of unfilled posts and insufficiently trained administrators • Steps have been or are being taken to address all of the matters raised in the report

  16. Conclusions • These steps hold the promise of a significant improvement in 2009, while recognising that in any exam system there will always be “irregularities” which arise and which delay the release of some results • To succeed we need compliance at all layers: learners completing their portfolios; teachers handing these in; districts checking and submitting schedules; and provinces loading the marks into the computer system • The Department then commits to collate and report timeously on all results for the 2009 National Senior Certificate Examinations
