
Select Committee on Education and Recreation Report on the Quality Assurance of the NSC

Presented on 19 February 2014 by Dr Mafu S Rakometsi, CEO of Umalusi.



Presentation Transcript


  1. 19 February 2014 Dr Mafu S Rakometsi Select Committee on Education and Recreation Report on the Quality Assurance of the NSC

  2. WHAT CAN WE LEARN FROM THE NSC RESULTS? Role of Umalusi, principles, approaches and processes of QA Dr Mafu Rakometsi - CEO of Umalusi

  3. Role of Umalusi • Umalusi is the quality assurer for general and further education and training (GFET) on the National Qualifications Framework (NQF) • The Council ensures that the qualifications and curricula within GFET are of quality, that providers of education and training have the capacity to deliver and assess qualifications and are doing so to the expected standards of quality, and that assessments are of the required standard

  4. Establishment of Umalusi • Established through the promulgation of the General and Further Education and Training Quality Assurance Act, 2001 (GENFETQA Act, No. 58 of 2001, as amended in 2008) • Two predecessors, namely the Joint Matriculation Board – JMB (1918) and the South African Certification Council – SAFCERT (1986) • Umalusi started work in 2002, having taken over from SAFCERT

  5. Establishment of Umalusi • Umalusi was established as a band education and training quality assurance body under the GENFETQA Act in 2001 (NQF levels 1–4): • Quality assuring exit-point assessments for qualifications in schools (National Senior Certificate), FET colleges (N3, NCV) and adult education (GETC) • Accrediting independent schools, private FET colleges and adult learning centres, as well as private assessment bodies

  6. Umalusi’s brief • In 2007 the review of the implementation of the NQF was completed, and in 2008 the GENFETQA Act was amended, establishing Umalusi as one of three Quality Councils with extended mandates; the other two are the Council on Higher Education and the Quality Council for Trades and Occupations • The object of the amended Act is to enhance the quality of general and further education and training through: • Development and management of a sub-framework of qualifications for GFET

  7. Umalusi’s brief • Quality assurance of: • Qualifications and curricula • Provision, through the accreditation of private providers of education and assessment to provide and assess these qualifications • Exit point assessments of the qualifications

  8. Umalusi’s brief • Certifying learner attainments for these qualifications • Conducting research on matters pertaining to the GFET sub-framework of qualifications • Advising the Minister on matters related to the GFET sub-framework of qualifications

  9. Quality Assurance of the DBE 2013 National Senior Certificate Examination Emmanuel Sibanda – Acting Sen. Manager : Quality Assurance of Assessments

  10. Moderation of question papers PURPOSE: • To ensure that the question papers are of the required standard (the standard captured in the NCS and SAGs) • To ensure that the question papers are: - fair - reliable - representative of an adequate sample of the curriculum - representative of relevant conceptual domains - representative of relevant levels of cognitive challenge

  11. Moderation of the question papers Approach: • Question papers set by panel of examiners – DBE • Internally moderated by DBE • Externally moderated by Umalusi • Subsequent moderations and approval

  12. Moderation of the question papers Criteria: • Technical criteria • Internal moderation • Content coverage • Text selection, types and quality of questions • Predictability • Cognitive skills • Marking memorandum or guidelines • Language and bias

  13. Moderation of the question papers Findings: Areas of Good Practice • Percentage of question papers and memoranda approved after first and second moderation: Nov 2013 – 70%; Mar 2014 – 78% (out of 130 papers) • Simultaneous moderation of final and supplementary question papers.

  14. Moderation of the question papers Findings: Areas of Concern • Adherence to timeframes and its impact on the quality of setting and moderation. • Question papers requiring more than four moderations: two papers for November 2013 (IsiZulu HL P1, IsiXhosa HL P1) and two for March 2014 (Business Studies, IsiZulu HL P1)

  15. Moderation of internal assessment Definition: • Internal assessment refers to any assessment conducted by the provider, the outcome of which counts towards the achievement of the qualification • Umalusi appoints panels of moderators/subject specialists to carry out this mandate

  16. Moderation of internal assessment Purpose of Umalusi’s verification: • To verify the rigour and appropriateness of the DBE moderation process – linked to DBE plans • To ascertain the degree to which assessment bodies/provinces are attempting to ensure standardisation across provinces • To ascertain the standard and quality of the tasks • To determine the extent and quality of internal moderation and feedback • To determine the reliability and validity of the assessment outcomes

  17. Moderation of Internal Assessment • Approach 1 (June/July 2013) – verifying the DBE SBA moderation

  18. Moderation of Internal Assessment • Approach 2 (June/July 2013) – Umalusi independent moderation (own sample)

  19. Moderation of Internal Assessment • Areas of good practice: • DBE conducted very rigorous moderation and provided useful verbal feedback to PDEs at the end of each moderation session • General adherence to policy in terms of the number of tasks done and the presentation of learner evidence/portfolios.

  20. Moderation of Internal Assessment • Areas of concern: • While internal moderation is being done in most schools, much of the focus is on compliance (monitoring) and not on qualitative issues (actual moderation) • Persistent problem of a lack of constructive feedback given to learners after moderation • Teachers are still unable to develop tasks pitched at appropriate cognitive levels: the focus is more on the lower cognitive levels.

  21. Monitoring of Examinations • “State of readiness” • Conduct of examinations • Marking

  22. Monitoring of Examinations • “State of readiness” • Findings: • All provinces have working examination systems in place • Of concern: • Many vacant posts and the use of contract staff • In some PDEs, closer monitoring of the printing of question papers is needed • Inefficient, or lack of, coordination with districts with regard to exam-related processes

  23. Monitoring of the writing phase Findings: • Generally examinations conducted in line with policy • Isolated instances of non-compliance (suitability of venue, identification of learners)

  24. Monitoring of the marking process

  25. Monitoring of the marking process • Findings: • Marking centres were generally well organised and suitable for the task. • Inadequate and inexperienced security personnel in some provinces. • Absence of communication facilities reported at one centre in one province

  26. Verification of marking PURPOSE: • Moderation of marking determines the standard and quality of marking and ensures that marking is conducted in accordance with agreed practices • Umalusi engages in the following during the moderation of marking: • Pre-marking/memorandum discussion: centralised memo discussions recommended – this will ensure consistency across marking centres • Moderation of marking (centralised and on-site)

  27. Marking verification • Memo discussion meetings: • Areas of good practice: • The memo discussions for the approval of final memoranda went relatively well in 2013. Provision of an extra day of training for marking was very welcome.

  28. Memo discussion meetings • Areas of concern: • The time between the examination dates and the memo discussions was generally far too short to allow pre-marking to take place. This was reported in several subjects, and seriously compromised the validity of the process, as meaningful discussion depends on the pre-marking of scripts. • Some provinces sent only one representative or none at all to the memo discussions.

  29. Centralised & on-site marking verification • Areas of good practice: • Many external moderators expressed the opinion that the accuracy of marking had improved slightly. • External moderators were unanimous in their appreciation of the impact that thorough training at the memo discussion meetings had on the quality of marking.

  30. Centralised & on-site marking verification • Areas of concern: • Markers still experience problems with regard to interpreting answers to open-ended and higher-order questions • Use of rubrics continues to be a serious concern, e.g. the inappropriateness of the rubrics used for P3 of HL and FAL: the descriptors do not facilitate good marking. • There are still markers marking literature questions who do not have a thorough knowledge of the stories/dramas/novels/poems they are marking.

  31. Monitoring of the writing phase Areas of good practice: • The DBE exam was generally administered in line with policy. No major concerns were reported. One can see growth in the administration and conduct of the exams by PDEs.

  32. Examination Irregularities • The majority of irregularities were of a technical nature, and these were reported to Umalusi through the established channels. • Some irregularities were the result of registration-related problems, e.g. candidates not appearing on mark sheets, or candidates registered for incorrect subjects. • Umalusi is represented on the NEIC

  33. WHAT CAN WE LEARN FROM THE NSC RESULTS? STANDARDISATION PROCESS

  34. Why Umalusi standardises results, and how • Provision of GENFETQA – the Council may adjust raw marks. • International practice in large-scale assessment systems • Standardisation – a process used to mitigate the effect on learners’ performance of factors other than their knowledge and aptitude. • Sources of variability – difficulty of the question paper, undetected errors, learner interpretation of questions

  35. Objectives for Standardisation • To ensure that a cohort of learners is not advantaged or disadvantaged by extraneous factors other than their knowledge of the subject, abilities and their aptitude. • To achieve comparability and consistency from one year to the next.

  36. Why Umalusi standardises results & how • Assumptions – for large populations, the distribution of aptitude and intelligence does not change appreciably • Process of standardisation: • Moderation of question papers • Review of learner performance against the historical performance of candidates in each subject • Historical average (norm) constructed using the past 3 to 5 years’ data • Pairs analysis provides further comparisons of raw means • Statistical moderation of internal assessment
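The norm-comparison step described above can be sketched as follows. This is an illustrative simplification only, not Umalusi's actual statistical procedure; the function names and sample figures are invented:

```python
def historical_norm(past_year_means):
    """Historical average (norm): mean of the subject's raw-mark
    averages over the past 3 to 5 years, as the slide describes."""
    if not 3 <= len(past_year_means) <= 5:
        raise ValueError("norm is constructed from 3 to 5 years of data")
    return sum(past_year_means) / len(past_year_means)

def deviation_from_norm(current_mean, past_year_means):
    """Signed gap between this cohort's raw mean and the norm;
    a negative gap suggests the paper may have been harder than usual."""
    return current_mean - historical_norm(past_year_means)

# Hypothetical subject: this year's raw mean is 48.0 against
# past-year means of 52.0, 50.0 and 51.0 (norm = 51.0).
gap = deviation_from_norm(48.0, [52.0, 50.0, 51.0])  # -3.0
```

The assumption doing the work here is the one stated on the slide: for large populations, aptitude does not change appreciably from year to year, so a large deviation from the norm points to extraneous factors rather than a weaker cohort.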

  37. Why Umalusi standardises results & how • Qualitative input meetings • Reports (Moderator, Chief Marker and Internal Moderator) • Umalusi research (maintaining standards & post exam analysis) • Responsibility of Assessment Standards Committee • Committee of Council • Responsible for setting and maintaining assessment standards • Observers (SAQA, HESA, Teacher Unions)

  38. Process for Standardisation • Continuation of JMB and SAFCERT model • Assessment Standards Committee • Qualitative Reports • Pre-standardisation and Standardisation meetings • Standardisation booklets (data) – subject raw mark distributions (external written component only) of entire cohort. • Subjects are standardised individually, in a linear and non-iterative manner

  39. Principles applied in the standardisation of examination marks • In general, no adjustment should exceed 10% of the historical average • In the case of an individual candidate, the adjustment effected should not exceed 50% of the raw mark obtained by the candidate • If the distribution of the raw marks is below the historical average, the marks may be adjusted upwards, subject to the limitations
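The two caps above can be sketched in code. This is a minimal illustration under the assumption that the 10% limit is taken relative to the historical average, as the slide suggests; the function names are invented and this is not Umalusi's actual implementation:

```python
def cap_subject_adjustment(proposed_shift, historical_average):
    """Clamp a subject-level adjustment so it does not exceed
    10% of the historical average, in either direction."""
    limit = 0.10 * historical_average
    return max(-limit, min(limit, proposed_shift))

def cap_candidate_adjustment(raw_mark, adjusted_mark):
    """Clamp an individual candidate's adjustment so it does not
    exceed 50% of the raw mark obtained by that candidate."""
    limit = 0.50 * raw_mark
    shift = adjusted_mark - raw_mark
    return raw_mark + max(-limit, min(limit, shift))

# Hypothetical figures: a proposed +8-mark shift against a historical
# average of 50 is capped at +5; a candidate on 10 raw marks whose
# standardised mark would be 20 is capped at 15.
subject_shift = cap_subject_adjustment(8.0, 50.0)       # 5.0
candidate_mark = cap_candidate_adjustment(10.0, 20.0)   # 15.0
```

Clamping symmetrically in both directions matches the slide's note that marks may be moved upwards as well as downwards, always subject to the same limitations.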

  40. Statistical moderation Scope of standardisation 2013 • 59 subjects standardised • Raw marks accepted: 38 subjects • Moderated upward: 5 subjects • Moderated downward: 16 subjects

  41. WHAT CAN WE LEARN FROM THE NSC RESULTS? STANDARDISATION DECISIONS DBE NSC 2013
