Progress Testing with SIR
A Case Study Based on the McMaster Undergraduate MD Programme
SIR UK User Group Conference, Aberdeen, UK, 21 June 2002
David Keane, Research Associate (email@example.com)
Programme for Educational Research and Development
Faculty of Health Sciences, McMaster University
Hamilton, Ontario, Canada
www.fhs.mcmaster.ca/perd
Objectives
1. Introduction to progress testing
   • definition
   • purpose
   • method
   • goals
   • special features
   • basic patterns in performance data
2. Using SIR for progress testing
Progress testing \ a definition
• longitudinal testing of knowledge acquisition
• an objective method for assessing the acquisition and retention of knowledge over time relative to curriculum-wide goals
Progress testing \ definition detail
an objective method for assessing the acquisition and retention of knowledge over time relative to curriculum-wide goals
• objective
   • use multiple choice questions
• knowledge
   • test what learners know
• over time
   • test repeatedly at regular intervals
• curriculum-wide
   • address end-of-programme learning objectives
Progress testing \ the purpose
• to determine whether the learner is progressing
   • learning enough?
   • retaining enough?
   • doing so quickly enough?
Progress testing \ the method
• progress is relative
   • compare learner to his/her peer group
      • current class or past n classes
   • standardized scores (z-scores)
• review performance on multiple tests
   • current and past
   • assessed with one measure
      • percentage correct, whole test
      • adjust for guessing?
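The relative-scoring idea above can be sketched as follows. This is an illustrative Python sketch, not the Programme's actual PQL code; the examinee ids and scores are made up.

```python
from statistics import mean, stdev

def z_scores(raw_scores):
    """Standardize each examinee's raw score (% correct, whole test)
    against the peer-group mean and standard deviation, so progress
    is expressed relative to the class rather than in absolute terms."""
    m = mean(raw_scores.values())
    s = stdev(raw_scores.values())
    return {examinee: (score - m) / s for examinee, score in raw_scores.items()}

# Illustrative data: whole-test % correct for a small peer group.
class_scores = {"e001": 62.0, "e002": 55.0, "e003": 71.0, "e004": 48.0}
standardized = z_scores(class_scores)
```

Standardizing against the current class, or against the past n classes pooled, is what makes scores from different test administrations comparable.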
Progress testing \ goals
• help the learner (formative evaluation)
   • constructive feedback
      • about their knowledge base
      • about their ability to self-assess
      • has to be specific/detailed
   • timely feedback
      • reassure those who are progressing
      • alert those who are not
      • do so early enough to facilitate effective remediation
Progress testing \ goals
• help the Programme (summative evaluation)
   • provide defensible evidence to support critical decisions pertaining to individuals
      • mandated remediation
      • pass / fail / conditional advancement
• the emphasis..
   • on formative aspects
   • minimize negative impact on learning behaviours
      • tutorial functioning
      • self/peer-assessment
Progress testing \ special features
• the item bank
   • a sample of the knowledge that a good student will likely encounter by the time that student..
      • graduates?
      • is six months / a year beyond graduation?
   • content encompasses nearly the 'entire' domain of the field in question
      • cf. course/curriculum 'core' knowledge
Progress testing \ special features
• instructions to examinees
   • don't study for this test
      • 180 items, randomly selected from 2,600+
   • don't guess
      • test your ability to self-assess
      • penalty for guessing (optional)
      • attempt only those items for which you have some knowledge and are reasonably confident you know the best/correct answer
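A common way to implement an optional penalty for guessing is formula scoring, where each wrong answer costs 1/(k-1) of a mark on a k-option item, so random guessing has an expected value of zero while omits cost nothing. This is a sketch of that standard rule, not necessarily the MD Programme's exact formula.

```python
def formula_score(n_right, n_wrong, n_options=5):
    """Formula scoring: each wrong answer costs 1/(k-1) of a mark,
    so blind guessing gains nothing on average; omitted items
    neither gain nor lose marks."""
    return n_right - n_wrong / (n_options - 1)

# An examinee who attempts 120 of 180 five-option items,
# getting 90 right and 30 wrong:
score = formula_score(n_right=90, n_wrong=30)  # 90 - 30/4 = 82.5
```

Under this rule an examinee who guessed blindly on five-option items would, on average, score zero on those items, which is why the instructions can honestly say "attempt only items you have some knowledge of."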
Basic patterns in performance data
• Class means on whole test for..
   • % attempted
   • % correct
      • not adjusted for assumed guesses
• look at patterns..
   • across time
      • week in programme (x of 138)
   • across classes at week x
Basic patterns in performance data
• % attempted, % correct
   • patterns are relatively stable across tests and classes
   • means at week x are relatively consistent across tests and classes
   • examinee performance is relatively consistent across tests and classes
• overall test reliability 0.6 - 0.7 (8 tests)
• test-retest correlation 0.6 - 0.8 (2 tests)
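For dichotomously scored MCQs, the usual internal-consistency reliability statistic is KR-20 (a special case of Cronbach's alpha). The sketch below, with made-up response data, shows the computation; it is illustrative only and does not reproduce the Programme's actual analyses.

```python
def kr20(matrix):
    """Kuder-Richardson 20: internal-consistency reliability for a
    0/1 item-response matrix (rows = examinees, columns = items)."""
    k = len(matrix[0])
    n = len(matrix)
    # Sum of p*q over items (p = proportion correct, q = 1 - p).
    pq_sum = 0.0
    for i in range(k):
        p = sum(row[i] for row in matrix) / n
        pq_sum += p * (1 - p)
    # Population variance of examinees' total scores.
    totals = [sum(row) for row in matrix]
    m = sum(totals) / n
    var = sum((t - m) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq_sum / var)

# Perfectly consistent response patterns yield a reliability of 1.0.
responses = [[1, 1, 1], [0, 0, 0], [1, 1, 1], [0, 0, 0]]
alpha = kr20(responses)
```

Reliabilities of 0.6-0.7 on a 180-item sample of an enormous domain are plausible for a progress test, where item content is deliberately broad rather than tightly focused.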
End of Part I
Introduction to Progress Testing
Any questions?
Objectives
1. Introduction to progress testing
2. Using SIR for progress testing
   • what's in an item?
   • data management tasks
   • managing dm tasks
      • software
      • databases and pql
   • SIR \ valued features
   • future enhancements
What's in an item? (1)
• the examinee sees..
   • [Stem] nn. An elderly woman has been showing signs of forgetfulness, poor concentration, and decreased energy and appetite. On examination her cognitive functioning seems quite good and her mini-mental (Folstein) score is 27/30. The most likely diagnosis is:
   • [Options]
      A. Anxiety disorder
      B. Multi-infarct dementia
      C. Alzheimer disease
      D. Personality disorder
      E. Depression
What's in an item? (2)
• the data manager sees..
   • stem and options (text) and
   • unique item identifier
   • correct response code
   • content codes (6 fields, 1, 2 or 3 sets)
   • item performance data
      • stats on usage, power to discriminate
      • by test, class; across tests, classes
   • and more..
      • date last used, don't-use flag
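The fields the data manager sees map naturally onto a record structure. The Python sketch below uses hypothetical field names purely to illustrate the shape of the record; the actual SIR schema will differ.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """One bank item as the data manager sees it.
    Field names are illustrative, not the actual SIR schema."""
    item_id: str                  # unique item identifier
    stem: str                     # question text
    options: dict                 # e.g. {"A": "...", ..., "E": "..."}
    correct: str                  # correct response code, e.g. "E"
    content_codes: list = field(default_factory=list)  # up to 3 sets of 6 fields
    date_last_used: str = ""      # date the item last appeared on a test
    dont_use: bool = False        # flag to exclude from future tests
```

Keeping performance statistics alongside the text in one case-based record is what lets later steps (item analysis, booklet assembly, "don't use" screening) run without joins across systems.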
Data management tasks
1. store, retrieve, manipulate and print large volumes of textual information
   • pre-test: test booklets
      • 180 items, 21–22 pages/booklet
   • post-test: performance reports
      • for examinees: 2 reports x 1–2 pages/report
      • for administrator: re. items, tests, classes and examinees who are not progressing
   • accommodate special needs re.
      • special characters – Greek letters, math symbols
      • page layout, fonts, typeface style
   • merge data into report templates
Data management tasks \ post-test
2. read examinees' responses
   • 100-item optical mark response sheets
   • tab-delimited ascii records
   • Mac: 2 sheets x approx. 280|420 examinees
3. score examinees' responses
   • requires item, test, class, examinee info
   • compute and retain performance stats
      • key measures: % attempted, % correct
      • mean & sd re. whole test (and major subdomains?)
      • for: each examinee, each class|peer group
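Steps 2 and 3 can be sketched as follows. The record layout (examinee id, then one response per item, blank = omitted) and the data are illustrative assumptions, not the actual scanner output format.

```python
import io
from statistics import mean, stdev

def score_responses(tsv_text, answer_key):
    """Parse tab-delimited ascii records and compute each examinee's
    % attempted and % correct on the whole test."""
    results = {}
    n_items = len(answer_key)
    for line in io.StringIO(tsv_text):
        fields = line.rstrip("\n").split("\t")
        examinee, responses = fields[0], fields[1:]
        attempted = sum(1 for r in responses if r)           # blank = omitted
        correct = sum(1 for r, k in zip(responses, answer_key) if r == k)
        results[examinee] = {
            "pct_attempted": 100 * attempted / n_items,
            "pct_correct": 100 * correct / n_items,
        }
    return results

# Illustrative 4-item test: key plus two examinees' records.
key = ["E", "B", "C", "A"]
records = "e001\tE\tB\t\tA\ne002\tE\t\t\t\n"
stats = score_responses(records, key)

# Class-level mean & sd on % correct, for the peer-group comparison.
class_mean = mean(s["pct_correct"] for s in stats.values())
class_sd = stdev(s["pct_correct"] for s in stats.values())
```

The same per-examinee and per-class statistics would then be retained for the standard reports and for computing standardized scores.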
Data management tasks \ post-test
4. compute and retain item performance stats
   • requires item, test, class, examinee info
5. compute/retrieve data needed in standard reports
   • re. examinees, classes, tests, items
6. assemble and print reports
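The item performance stats in step 4 (usage, power to discriminate) are classical item-analysis quantities. The sketch below computes item difficulty and a simple upper-minus-lower discrimination index; it is an illustration of the standard technique, not the Programme's actual statistics.

```python
def item_stats(matrix):
    """Classical item analysis on a 0/1 matrix (rows = examinees,
    columns = items): difficulty (proportion correct) and a simple
    discrimination index (upper-third minus lower-third proportion)."""
    n = len(matrix)
    ranked = sorted(range(n), key=lambda r: sum(matrix[r]))
    third = max(1, n // 3)
    lower, upper = ranked[:third], ranked[-third:]
    stats = []
    for i in range(len(matrix[0])):
        difficulty = sum(row[i] for row in matrix) / n
        disc = (sum(matrix[r][i] for r in upper) -
                sum(matrix[r][i] for r in lower)) / third
        stats.append({"difficulty": difficulty, "discrimination": disc})
    return stats

# Illustrative 2-item, 3-examinee matrix.
s = item_stats([[1, 1], [1, 0], [0, 0]])
```

Retaining these per test and per class, and then across tests and classes, is what supports the "don't use" flag and the item-bank maintenance described earlier.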
Data management tasks
7. enable support staff to do all of the above with relative ease
   • minimal reliance on the application programmer after everything is up and running
What's the best tool for the job?
• SIR is not a word processor
• SIR is a record management and stats-oriented reporting tool
   • allows the user to build powerful custom applications
• the vendor provides exceptional support beyond the installed Help files
   • prompt, relevant and practical replies
Solutions \ the best tool
• the MD Programme's solution
   • for text-intensive tasks.. Corel WordPerfect
   • for numeric data and stats-intensive tasks.. SIR
Solutions \ Corel WordPerfect
• a set of merge data files (database)
   • case-based by item id
   • item stems, options and other fixed info
   • export data via csv or fixed-format records
   • import data via csv-format records
      • into merge data files
• multiple merge forms (report templates)
• extensive use of merge and macro commands to produce pre/post-test reports
   • custom-built merge|macro applications
Solutions \ SIR ver. 3.2 - 2000
• 2 databases, case-based
   • ITEMS re. items
   • TEEX re. tests, examinees, classes
• major reliance on (vb) PQL
   • custom applications
      • csv-format records
      • add/update records/fields (eg, from WP)
      • write records/values (eg, for WP)
   • PQL procedures
      • csv save, tabulate, save table, spss save
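The csv-format exchange that links SIR and WordPerfect can be illustrated in Python with the standard csv module. The field names and values below are hypothetical; in the real system, PQL programs write and read these records on the SIR side.

```python
import csv
import io

# Dump item records to csv for the WordPerfect merge data files
# (field names are illustrative, not the actual SIR schema).
items = [
    {"item_id": "0042", "correct": "E", "pct_correct": "58.3"},
    {"item_id": "0117", "correct": "B", "pct_correct": "71.0"},
]

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["item_id", "correct", "pct_correct"])
writer.writeheader()
writer.writerows(items)

# Read the same csv back, as on the return trip (eg, updates from WP).
back = list(csv.DictReader(io.StringIO(out.getvalue())))
```

Because both applications speak plain csv, each side can stay in the format it is best at: WordPerfect for merge-driven booklets and reports, SIR for records and statistics.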
SIR 2000 \ valued features (1)
• DBMS
   • case-based option for db type
   • system-maintained
      • easy access to any case's records
      • case id is on all dumped records
   • global variables
      • pass user settings to applications
• utilities
   • Data \ File Dump, File Input
• tabfiles and tables
   • create, index, populate, delete tables
SIR 2000 \ valued features (2)
• PQL
   • nested access to cases
   • read csv-format records
   • vb dialog boxes
• PQL Procedures
   • write csv-format records
   • xtab tables, flexibility re. headers (columns)
   • write SPSS system files
Future enhancements
• upgrade to SIR 2002 (from SIR 2000)
   • update custom applications (to vb pql)
• add secondary indexes
   • examinees by name, current class
• web access
   • for examinees: performance reports
   • method?
      • ColdFusion (CF → SQL → ODBC driver → SIR db)
      • CGI scripts
End of Part II
Using SIR for progress testing
Any questions?