
Developing and Implementing Online Test Processing Software



Presentation Transcript


  1. Developing and Implementing Online Test Processing Software E’lise Flood, Testing Center Specialist; Jenna Anderson, Testing Center Specialist; Ryan Brainerd, Lead, Interactive Design Services

  2. Agenda • About Franklin University • About Our Testing Office (SLC) • Previous Testing Process • The Problems • Our Solution • Software Overview • Results • Future Improvements • Will This Work At Your Institution?

  3. About Franklin University • Private, not-for-profit institution • Approx. 11,000 students • Central Ohio • National • International • Undergraduate and Graduate programs • Five physical locations • Online, traditional, hybrid formats

  4. About Our Testing Office (SLC) • There are currently two testing specialists with 3-4 internal test proctors • The highest volume of tests we process is out-of-class tests • In 2009, we processed 13,683 exams • 56% of those were administered by distance proctors

  5. Previous Testing Process • Receive request by email from instructor • Proctor information for all students • Test and related materials • Manual roster pull and verification • Create cover sheets • Individual emails for all proctors • Prepaid envelopes for returning tests • Paperwork, sorting, filing when tests returned • Process averaged 45 minutes per test

  6. The Problems • Manual process was too time-consuming to scale with the University’s growth • Lack of enforceable deadlines • 5-day window • Late submissions were problematic • Proctors were not verified in any meaningful way • Lack of a centralized system made record keeping difficult • Manual process relied on email and spreadsheets

  7. Our Solution • Centralized web application to handle all requests and automate processing as much as possible • Automatic emails, rosters, mail merge • System enforces deadlines • System has searchable proctor database • System enforces some proctor guidelines • Email blacklist • Standard survey
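The slide above lists the automation the system performs but does not show any code. As a minimal sketch of two of those features, deadline enforcement (the 5-day window from slide 6) and the proctor email blacklist, the following Python is purely illustrative: the function names, the blocked domain, and the rule details are assumptions, not Franklin University's actual implementation.

```python
from datetime import date, timedelta

# Hypothetical values for illustration only.
SUBMISSION_WINDOW_DAYS = 5  # slide 6 mentions a 5-day window
BLACKLISTED_PROCTOR_DOMAINS = {"example-blocked.com"}  # placeholder domain


def request_accepted(test_date: date, submitted: date) -> bool:
    """Enforce the deadline: a request must arrive at least
    SUBMISSION_WINDOW_DAYS before the test date."""
    return submitted <= test_date - timedelta(days=SUBMISSION_WINDOW_DAYS)


def proctor_allowed(email: str) -> bool:
    """Reject proctor email addresses whose domain is blacklisted."""
    domain = email.rsplit("@", 1)[-1].lower()
    return domain not in BLACKLISTED_PROCTOR_DOMAINS


# Submitted 7 days ahead: inside the window, accepted.
print(request_accepted(date(2010, 5, 10), date(2010, 5, 3)))   # True
# Blacklisted domain: rejected.
print(proctor_allowed("proctor@example-blocked.com"))          # False
```

The point of centralizing checks like these in one web application, rather than in each specialist's inbox, is that the rules are applied uniformly and automatically for every request.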

  8. STUDENT

  9. PROCTOR

  10. PROCTOR

  11. FACULTY

  12. FACULTY

  13. SLC

  14. SLC

  15. Results: Manual Process vs. Online Test Administration
  • Proctor approval at instructor discretion → Proctor approval centralized by testing professionals
  • Instructor had to email test request → Instructor submits request through software
  • Individually send emails with tests attached to proctors, with student-specific information → Tests emailed automatically with student-specific information automatically populated
  • Log into software to pull roster → Roster automatically populated
  • Manually created coversheet → Coversheet automatically populated
  • Dublin testing center had to keep materials saved in a separate folder → Automated convergence at all locations
  • Manual emails to all other parties → Automated emails to all other parties
  • 45 minutes average per test → 8 minutes average per test

  16. Future Improvements • Additional transparency • More status information for students, faculty, proctors, SLC • ETAs • Integrated online test environment • Automated proctor location • Support for additional physical locations

  17. Will This Work At Your Institution? • Software design and execution were time-consuming • 12 months initial investment • 2 years ongoing support and upgrades • Franklin has an internal software team (AIS/ID) • Business support • Academic Standards • Proctor Guidelines • Faculty Approval • Testing office (SLC) • Information Technology

  18. Get Something Started Franklin University, Columbus, OH • E’lise Flood, Testing Center Specialist, floode@franklin.edu • Jenna Anderson, Testing Center Specialist, andersje@franklin.edu • Ryan Brainerd, Lead, Interactive Design Services, brainerr@franklin.edu
