This paper explores the development of computer-based assessment (CBA) in UK higher education, focusing on quality assurance (QA) methods and challenges. It covers the use of CBA, the item-based nature of CBA, QA processes in different contexts, inhibitors of quality, and possible solutions. The study examines the growth of computer-assisted assessment (CAA), discrepancies in uptake between disciplines, and the difficulty of writing high-quality MCQs. It also reviews the QA frameworks of UK HE, other educational sectors, and international standards bodies, with a specific focus on CBA. Core QA tasks, inhibitors of those tasks, and training needs for assessment developers are discussed.
Assuring quality Computer-Based Assessment development in UK Higher Education • Andrew Boyle (NFER) • Dave O'Hare (CIAD)
Structure of the paper • Use of CBA • Item-based nature of CBA • QA in different contexts • Core tasks for QA • Inhibitors of quality • Possible solutions to QA problems • Conclusions
Use of CBA • CAA has grown • Survey data (Stephens and Mascia; CAA Centre national survey) • Equivocal quantitative results • Discrepancy in uptake between disciplines • Other evidence of growth • Increase in computer use • Virtual Learning Environments (VLEs)
Item-based nature of CBA • Traditional assessment in UK • ‘essay and problem-type final examinations and similarly constructed coursework’ • CBA based on discrete items • Lecturers must devote time to development • High-quality MCQs difficult to write • QA in pre-administration (development) phase
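The paper does not specify a data model, but since CBA is built from discrete items rather than essays, a minimal sketch may help. The `MCQItem` class and its validation rules below are hypothetical illustrations, not anything prescribed by the paper or by UK HE practice.

```python
# Hypothetical sketch: a discrete MCQ item as the unit of a CBA.
# Class and field names are invented for illustration.
from dataclasses import dataclass

@dataclass
class MCQItem:
    stem: str            # the question text
    options: list[str]   # answer choices: one key plus distractors
    correct_index: int   # position of the key within `options`

    def validate(self) -> None:
        # A workable MCQ needs a key and at least two plausible distractors;
        # weak distractors are one reason high-quality MCQs are hard to write.
        if len(self.options) < 3:
            raise ValueError("need a key and at least two distractors")
        if not 0 <= self.correct_index < len(self.options):
            raise ValueError("correct_index must point at one of the options")
```

Even this toy check hints at why item writing demands development time: every distractor must be plausible yet unambiguously wrong.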
QA in different contexts • UK Higher Education (HE) • Other UK educational sectors • Other countries • Documents that refer specifically to CBA
UK Higher Education 1 • Quality Assurance Agency (QAA) • Oversees quality and responsible for standards • Section 6 of Code of Practice – assessment • Assessment Code consists of precepts • Other documents • Audit Handbook • Subject Benchmark Statements • Framework for Higher Education Qualifications
Precepts • All Higher Education Institutions (HEIs) must demonstrate adherence to precepts • General statements, not specific prescriptions • Clear: HEIs must ensure that ' … assessment policies and practices are responsive and provide for the effective monitoring of the validity, equity and reliability of assessment.'
UK Higher Education 2 • QA documents do not: • mandate universities to carry out specific steps in assessment development • refer specifically to CBA as a particular assessment method whose QA demands are quite different from those of traditional assessment
Other UK educational sectors • Qualifications and Curriculum Authority (QCA) • Mandatory Common Code of Practice
QCA Code of Practice • Specific and rigorous • Detailed prescriptions • Specific tasks • Specific staff • Awarding bodies must accept, and be able to implement, the Code • Implementation monitored annually by QCA • Shortcomings identified and assurances sought
Documents from other countries • Joint Standards (AERA, APA, NCME) • Assessment development methodology • Qualitative and quantitative control stages • Qualitative stages: from assessment specification to item review • Statistical analyses address issues of fairness, item and instrument quality • Reputable developers provide information to users, allowing them to evaluate quality
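The Joint Standards do not mandate particular statistics, but facility (proportion correct) and point-biserial discrimination are standard classical measures of item quality. The sketch below is illustrative: the function name and data layout are invented, and acceptance thresholds are left to the developer.

```python
# Illustrative classical item analysis: facility flags items that are too
# easy or too hard; point-biserial discrimination flags items that fail
# to separate stronger from weaker candidates.
import statistics

def item_statistics(responses: list[list[int]]) -> list[dict]:
    """responses[c][i] is 1 if candidate c answered item i correctly, else 0."""
    totals = [sum(row) for row in responses]      # candidates' total scores
    mean_total = statistics.mean(totals)
    sd_total = statistics.pstdev(totals)
    results = []
    for i in range(len(responses[0])):
        scores = [row[i] for row in responses]
        facility = sum(scores) / len(scores)      # proportion correct
        if sd_total == 0 or facility in (0.0, 1.0):
            r_pb = 0.0                            # undefined: no variance
        else:
            mean_correct = statistics.mean(
                t for t, s in zip(totals, scores) if s == 1)
            # Point-biserial correlation of item score with total score.
            r_pb = ((mean_correct - mean_total) / sd_total) \
                   * (facility / (1 - facility)) ** 0.5
        results.append({"item": i, "facility": facility,
                        "discrimination": r_pb})
    return results
```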
Documents that refer to CBA • Association of Test Publishers (ATP) Guidelines • British Standards Institution - BS 7988: 2002 • Code of Practice for the Use of Information Technology in the Delivery of Assessments • International Test Commission (ITC) • Draft Guidelines on Computer-Based and Internet-Delivered Testing • Scottish Qualifications Authority (SQA) • Guidelines for Online Assessment for Further Education
ATP Guidelines • Planning for a CBA is similar to planning for other kinds of assessments • Purpose must be ascertained • Needs analysis: description of the population • Computer literacy of the population • Avoid construct-irrelevant variance
BS 7988 • Preparation of content is outside the scope of the Standard • A poor assessment, delivered appropriately, would still conform to BS 7988
ITC draft guidelines • Internationally-recognised set of guidelines describing good practice in CBA • Developers should: • Document constructs to be measured • Understand professional and ethical issues • Make appropriate use of psychometric models and theories
SQA Guidelines for FE • Ensure institutional commitment • Define reasons for using online assessment and select appropriate delivery methods • Developers should: • Choose items that are appropriate to the knowledge or skill being assessed • Collect metadata, examine item quality • SQA Guidelines strongly favour use of item banks • Share high-quality materials between institutions
Core tasks for QA • Analysis of need and description of purpose • Detailed documentation • Iterative process to produce content • Field testing prior to live administration • Administration under controlled conditions • Review of outcomes to consider fairness and validity
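None of the reviewed documents prescribe an implementation, but the six core tasks form an ordered pipeline, and one way to make that order enforceable is to track sign-off per stage. A minimal sketch, with invented names throughout; the rule that no stage can be signed off before its predecessors is an assumption, not a mandate from any QA document.

```python
# Hypothetical tracker for the six core QA tasks listed above; the stage
# names paraphrase that list.
from enum import Enum

class QAStage(Enum):
    NEEDS_ANALYSIS = 1        # analysis of need, description of purpose
    DOCUMENTATION = 2         # detailed documentation
    CONTENT_DEVELOPMENT = 3   # iterative production of content
    FIELD_TESTING = 4         # trials prior to live administration
    LIVE_ADMINISTRATION = 5   # administration under controlled conditions
    OUTCOME_REVIEW = 6        # review for fairness and validity

class AssessmentRecord:
    def __init__(self, title: str):
        self.title = title
        self.signed_off: set[QAStage] = set()

    def sign_off(self, stage: QAStage) -> None:
        # Enforce the sequence: every earlier stage must be complete first.
        missing = {s for s in QAStage
                   if s.value < stage.value} - self.signed_off
        if missing:
            names = sorted(s.name for s in missing)
            raise RuntimeError(f"{stage.name} blocked; incomplete: {names}")
        self.signed_off.add(stage)
```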
Inhibitors of the core tasks • Assessment developers’ lack of training • Inexperience at writing ‘objective’ items • CBA has been an ad hoc innovation • Hindrances to field testing
Lack of training • All lecturers are responsible for assessment • Standard of training in assessment is low • New staff are 'thrown in at the deep end' • Lack skill even at writing essay prompts of comparable difficulty • Contrast this with the mandatory training in other sectors
Inexperience of 'objective' items • Shift to new formats 'time consuming initially' • '[Writing MCQs] is so difficult that you almost have to be a professional at creating these questions.' • You do have to be a professional at producing items • Accountability of CBA: possibility of challenge in the courts
CBA has been an ad hoc innovation • Enthusiastic individuals working alone or in small groups • Difficult to embed CBA in wider QA practices • Developments proceed in ‘anarchic fashion’ • Difficult to disseminate innovation effectively
Hindrances to field testing • Inhibitors in UK HE • Security fears • Non-stable programmes and small cohorts of students • Contrast between US and UK HE research • US: ‘massive data sets’ permit sophisticated analysis • UK: ‘small-scale craft activities’
Possible solutions to QA problems • Institutional support • Staff training and accreditation • Sharing materials • Qualitative research to assure quality
Institutional support • Examples of institutional support • CAA Support Unit, Luton; FLI, Loughborough; CIAD, Derby • CBA protocols • Need to include assessment development • Provide for staff training • Staff training only part of the solution • Use of item-based assessment experts • Remove ownership of assessment from subject expert • A major shift in institutional culture
Staff training and accreditation • Developing ‘objective’ questions • Professional skill different to teaching or conducting research • CBA must be accompanied by training • Early writers consider training voluntary or optional • Given impact of assessments, mandatory training essential • More formal systems • University of Derby – CPD module in assessment • SQA – Advanced Certificate in e-assessment • Mandatory certification using national qualification preferred
Sharing materials • ‘What will really make CAA work … is the development of large assessment item banks … where colleges can combine their efforts to create high quality, peer-reviewed questions.’ • Item banking • Large structured database of assessment items • Based on sophisticated software • Used to generate many forms of assessment
Item banking (IB) • Some suggest using items from textbooks • Others feel this approach is inappropriate • Item bank technology embeds QA processes • Items move through the bank only once each QA stage is complete • IB is not a 'quick fix': it emphasises the need for QA • IB is dependent on Item Response Theory (IRT) • IRT permits the reassembly of assessment forms
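The slide names IRT without detail; a common starting point is the two-parameter logistic (2PL) model, in which the bank stores a calibrated discrimination `a` and difficulty `b` per item, which is what lets software assemble forms of known, comparable difficulty. The function and the parameter values below are purely illustrative.

```python
# 2PL item characteristic function: probability that a candidate of
# ability theta answers a calibrated item correctly.
import math

def p_correct(theta: float, a: float, b: float) -> float:
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An average candidate (theta = 0) on a moderately hard item (b = 0.5):
print(round(p_correct(theta=0.0, a=1.2, b=0.5), 3))  # 0.354
```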
Item banking in UK HE • Previous attempts foundered due to pragmatic difficulties • Relative contributions of partner institutions • Lack of IRT use in HE • In short term IB unlikely to guarantee quality
Qualitative research to assure quality • Qualitative techniques • Focus groups of ‘academics-assessment developers’ • ‘CBA expert panel’ • Research on assessments • Student opinions (questionnaires) • Verbal Protocol Analysis
Conclusions 1 • Mandatory training and certification of staff via nationally recognised qualifications • Expert panels to review assessments, members include assessment development experts • Piloting of new items in assessments (with scores not contributing to grades) • Full evaluation of CBA within institutions, using quantitative and qualitative approaches
Conclusions 2 • Subject networks to discuss methods and arrangements for sharing validated items within national assessment banks • Wider dissemination of QA procedures within the CBA community
Contacts • a.boyle@nfer.ac.uk | www.nfer.ac.uk • d.ohare@derby.ac.uk | www.derby.ac.uk/ciad