
Technology-Mediated Assessment


Presentation Transcript


  1. Technology-Mediated Assessment • Jack McGourty, Columbia University • John Merrill, Ohio State University • Mary Besterfield-Sacre & Larry Shuman, University of Pittsburgh • Gateway Engineering Education Coalition

  2. Technology-Mediated Assessment • Introduction • Your Expectations • Applications • Drexel and Columbia’s Course Evaluation • Ohio State’s Activities • Team Evaluator • Your Experiences • Enablers and Barriers (Break-out Groups) • Conclusions

  3. Introduction • Reasons for On-Line Assessment • Common Applications • Design and Development • Things to Think About

  4. Reasons for On-Line Assessment • Customized development • Targeted communication • Ease of distribution/no boundaries • Automatic data collection and analyses • Real time response monitoring • Timely feedback

  5. Common Applications • Attitude Surveys • Multisource assessment and feedback • Course evaluations • Portfolios • Technology-mediated interviews • Tests

  6. Design and Development • Item/Question development • Adaptive testing/expert systems • Multimedia tutorials • Dialogue boxes • Reporting wizards

  7. Things to Think About • Confidentiality/privacy • Response rates • Reliability/validity • Ease of use (administrators, end users) • System growth (can it easily be upgraded? adding modules) • System flexibility (survey/test construction, data flexibility, item databases, reporting wizards) • Data storage (platforms, specific vs. combination) • Reporting (various levels, dissemination mechanisms, real time vs. delayed)

  8. Technology in Education • Technology-Enabled Assessment: The Wave of the Future • Dr. John Merrill, The Ohio State University, Introduction to Engineering Program

  9. Objectives • Explanation of web-based assessment tools • Uses of assessment tools • Virtual run-through of student actions • Lessons learned • Q&A

  10. Web-Based Assessment Tools • Course Sorcerer (through WebCT) • Online Journal Entries • Course Evaluations • Team Evaluator • Peer Evaluations

  11. WebCT • WebCT is a commercial web-based tool used for course management. • IE (Introduction to Engineering) Program uses/capabilities: • Electronic grade book, chat rooms, bulletin boards, calendars • Provides links to • Course Material • Course Sorcerer • Team Evaluations (Team Evaluator)

  12. Course Sorcerer • A simple, web-based evaluation tool created by Scott Cantor at University Technology Services • Technical Specifications: • Written in Cold Fusion • Run on Windows NT with a Netscape Enterprise Web Server • Uses a MS SQL Server database with 15 tables • Server Machine: PII-450 w/ 512M of RAM • Accesses Sybase running on Solaris 2.6 as a warehouse for roster data. • Used for Journal Entries & Course Evaluations
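
Course Sorcerer itself is written in Cold Fusion, so the following is only an illustrative Java/JDBC sketch of the roster step described on this slide: reading enrollment data out of the warehouse database so an evaluation can be opened for each enrolled student. The connection URL, table name, and column names are hypothetical.

```java
import java.sql.*;

/**
 * Illustrative sketch only (Course Sorcerer is Cold Fusion, not Java): pull a
 * course roster from a warehouse database so each enrolled student can be
 * given an evaluation slot. Table and column names are hypothetical, and the
 * JDBC URL is a placeholder whose exact format depends on the driver used.
 */
public class RosterLookup {
    public static void main(String[] args) throws SQLException {
        String warehouseUrl = "jdbc:sybase:Tds:warehouse.example.edu:5000/roster"; // placeholder URL
        try (Connection conn = DriverManager.getConnection(warehouseUrl, "user", "password");
             PreparedStatement stmt = conn.prepareStatement(
                 "SELECT student_id FROM course_roster WHERE course_id = ?")) {
            stmt.setString(1, "ENG181"); // hypothetical course identifier
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    // Each enrolled student gets an evaluation opened in the survey database.
                    System.out.println("Open evaluation for " + rs.getString("student_id"));
                }
            }
        }
    }
}
```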

  13. Team Evaluator (Peer Evaluation) • Used by team members to provide confidential assessment • System Requirements: • Operating System: Windows 2000 with ActivePerl or UNIX with Perl 5.004 or higher • Perl Modules: CGI, DBI (plus SQL drivers), POSIX • SQL Server: MySQL 3.23 or higher • Web Server: IIS (Windows) or Apache 1.3 (UNIX) • CPU: Pentium II 400 or better recommended • Memory: 128 MB or higher recommended • Disk Space: 100 MB for adequate database space
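
Team Evaluator is a Perl/CGI application backed by MySQL; as a language-neutral illustration of the confidentiality point above, the Java sketch below aggregates peer ratings so that only per-member averages, never individual raters' scores, are reported back to the team. All names and the rating scale are assumptions, not taken from the tool.

```java
import java.util.*;

/**
 * Sketch of confidential peer assessment: ratings are keyed by rater
 * internally, but only per-member averages ever leave this method.
 * Hypothetical names and scale; not the Team Evaluator implementation.
 */
public class PeerRatingSummary {
    // ratings: rater -> (ratee -> score)
    public static Map<String, Double> averageByRatee(Map<String, Map<String, Integer>> ratings) {
        Map<String, List<Integer>> byRatee = new HashMap<>();
        for (Map<String, Integer> perRater : ratings.values()) {
            perRater.forEach((ratee, score) ->
                byRatee.computeIfAbsent(ratee, k -> new ArrayList<>()).add(score));
        }
        Map<String, Double> averages = new HashMap<>();
        byRatee.forEach((ratee, scores) -> averages.put(ratee,
            scores.stream().mapToInt(Integer::intValue).average().orElse(0.0)));
        return averages; // rater identities never leave this method
    }

    public static void main(String[] args) {
        Map<String, Map<String, Integer>> ratings = Map.of(
            "raterA", Map.of("memberX", 4, "memberY", 5),
            "raterB", Map.of("memberX", 3, "memberY", 5));
        System.out.println(averageByRatee(ratings));
    }
}
```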

  14. Journal Entries • Students complete journal entries online every two weeks. • Submissions are anonymous. • All entries are read and summarized by a staff member and shared with the instructional team. • Instructional team members share the summaries with their classes.
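
A minimal sketch (not the OSU implementation) of how anonymous journal entries can still earn course credit, as the later "Lessons Learned" slide implies: the completion record keeps the student identity for credit, while the entry text is stored with no link back to its author. Class and field names are invented.

```java
import java.util.*;

/**
 * Hypothetical illustration: completion tracking is kept separate from the
 * anonymous entry text, so instructors can give credit without ever seeing
 * who wrote which entry.
 */
public class JournalCollector {
    private final Set<String> completions = new HashSet<>();        // who submitted (for credit)
    private final List<String> anonymousEntries = new ArrayList<>(); // what was written (no author)

    public void submit(String studentId, String entryText) {
        completions.add(studentId);       // instructor sees only that the student submitted
        anonymousEntries.add(entryText);  // staff summarize these without knowing authors
    }

    public boolean hasCredit(String studentId) { return completions.contains(studentId); }
    public List<String> entriesForSummary()    { return List.copyOf(anonymousEntries); }
}
```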

  15. Course Evaluations • Students in 181 & 182 complete online course evaluations at the end of each quarter. • Questions designed to evaluate courses based on items a-k of Criterion 3, Program Outcomes & Assessment, in the ABET Engineering Criteria, 2000.
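
As a hypothetical illustration of how questions can be tied to items a-k of Criterion 3, the sketch below tags each evaluation question with the outcome it targets so responses can later be rolled up by outcome; the question text is invented, not taken from the OSU instrument.

```java
/**
 * Sketch, not the actual OSU question bank: each question carries the EC2000
 * Criterion 3 outcome (a-k) it is meant to evaluate.
 */
public class OutcomeTaggedQuestion {
    enum Ec2000Outcome { A, B, C, D, E, F, G, H, I, J, K }

    final Ec2000Outcome outcome;
    final String text;

    OutcomeTaggedQuestion(Ec2000Outcome outcome, String text) {
        this.outcome = outcome;
        this.text = text;
    }

    public static void main(String[] args) {
        OutcomeTaggedQuestion q = new OutcomeTaggedQuestion(
            Ec2000Outcome.D, "This course improved my ability to function on a multidisciplinary team.");
        System.out.println(q.outcome + ": " + q.text);
    }
}
```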

  16. Short-Term Uses: Journal Entries & Course Evaluations • Address immediate student concerns/questions about class, labs, or projects. • Inquire about student problems with specific topics and labs. • Discover general information from students regarding interests, influences, and attitudes.

  17. Example: Addressing Immediate Student Concerns • “How are the figures supposed to be done? Strictly isometric or just drawn so you can see everything? What pieces need to be labeled?” • “What are we doing in labs 6 & 7? I know it says in the syllabus that we are incorporating the sorting mechanism, but is that going to take two weeks?”

  18. Long-Term Uses: Journal Entries & Course Evaluations • Improve program content • Improve course materials • Modify teaching styles • Evaluate course based on ABET criteria

  19. Example: Improving Course Content • “Positive: I... - Gained knowledge about circuits in general - Learned how to read schematics - Learned how to use breadboards - Further developed team working skills Negative: - The circuits did not work the first time. - Time ran short for both labs, but we did finish each circuit.”

  20. How It Works • Start at the WebCT site: http://courses2.telr.ohio-state.edu

  21. Completion Tracking

  22. Lessons Learned: Journal Entries & Course Evaluations • Students are more likely to complete if given credit. • Students are extremely responsive to the anonymity of the online survey. • Students respond positively when asked for suggestions/solutions to problems in the class.

  23. Web-Enhanced Course Evaluation at Columbia University • Jack McGourty, Columbia University

  24. Overview • A little history • How does course assessment fit into the “big picture”? • Why use web technology? • How is it being done? • Does it work?

  25. History • Columbia’s Fu Foundation School of Engineering and Applied Science began using the web for course assessment about four years ago, starting with a student-administered web site for results • Designed and developed a state-of-the-art system using student teams • Now building on the current infrastructure to include on-line tutorials and increased flexibility for administration

  26. Student Web Site • Search by course or faculty • Current and past results • No comments

  27. The Big Picture • Why are we assessing courses and programs? • Continuous improvement of the education process • What are we doing right, and what can we do better? • Integral part of our ABET EC2000 Compliance • Develop a process • Collect and evaluate data • Close the loop • Document/Archive results • Course evaluation is one of several outcome assessment measures, such as senior exit surveys, enrolled student surveys, and alumni surveys

  28. How WCES Fits in

  29. Using Technology • Pro: students have the time to consider their responses; timely feedback; responses are easily analyzed, archived, and distributed; less paper; lower cost/efficient administration • Con: you lose the “captive audience”; you can’t guarantee a diversity of opinions (motivated/non-motivated, like course/dislike course); not necessarily less effort

  30. Course Assessment Details • 10 Core Items • Course Quality • Instructor Quality • Relevant ABET EC2000 Items • Pre-selected by faculty member • Customized questions for specific course objectives
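
A hypothetical sketch of how an evaluation like the one above might be assembled from its three parts: the core items, the EC2000 items the faculty member pre-selected, and customized questions for course-specific objectives. None of the item texts below come from the actual WCES item bank.

```java
import java.util.*;

/**
 * Assumed structure only: a course evaluation is built by concatenating the
 * fixed core items, the instructor's chosen EC2000 items, and any custom
 * questions. Item texts are invented examples.
 */
public class EvaluationBuilder {
    public static List<String> buildEvaluation(List<String> coreItems,
                                               List<String> selectedEc2000Items,
                                               List<String> customQuestions) {
        List<String> evaluation = new ArrayList<>(coreItems);  // core items on course/instructor quality
        evaluation.addAll(selectedEc2000Items);                 // outcome items pre-selected by the faculty member
        evaluation.addAll(customQuestions);                     // questions tied to specific course objectives
        return evaluation;
    }

    public static void main(String[] args) {
        List<String> form = buildEvaluation(
            List.of("Overall course quality", "Overall instructor quality"),
            List.of("Ability to design a system to meet desired needs"),
            List.of("Were the weekly design reviews useful?"));
        form.forEach(System.out::println);
    }
}
```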

  31. Selecting EC2000 Questions

  32. Monitoring Faculty Usage • One of our culture-change metrics is the percentage of faculty who are capitalizing on the system and adding custom and EC2000 questions. Currently around 15%.

  33. Course Evaluation Results • Web page access • Current term’s assessment • Limited time window • Limited access • Secure site • Previous terms’ results • Open access to numerical results; not comments • Email Results • Individual faculty • Aggregate Data – Department Chairs
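
The access rules on this slide (a limited, secure window for the current term; open access to previous terms' numerical results but never to written comments) can be expressed as a simple policy check. The sketch below is an illustration under those stated assumptions, not the WCES code; the role names are invented.

```java
/**
 * Hypothetical policy sketch of the access rules described on the slide.
 */
public class ResultsAccessPolicy {
    enum Requester { STUDENT, FACULTY, DEPARTMENT_CHAIR }

    /** Numerical results: current term is gated by a time window and login; past terms are open. */
    static boolean canViewNumericalResults(boolean currentTerm, boolean withinWindow, boolean authenticated) {
        if (currentTerm) {
            return withinWindow && authenticated;  // limited time window, secure site
        }
        return true;                               // previous terms: open numerical access
    }

    /** Written comments are emailed to the individual faculty member, not published on the open site. */
    static boolean canViewComments(Requester who) {
        return who == Requester.FACULTY;
    }
}
```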

  34. Reporting

  35. Promoting Responses • Student-driven results website • Multiple targeted emails to students and faculty from Dean • Announcements in classes • Posters all over the school • Random prize drawing

  36. Closing the Loop

  37. Does it Work? • Student response rates have steadily increased over the past two years, from 72% to 85% • More detail in student written comments in course assessments • Data is available that we have never had before • Faculty use of the ABET EC2000 and customized question features is increasing but still limited (15%)

  38. Cross-Institutional Assessment with a Customized Web-Based Survey System • Mary Besterfield-Sacre & Larry Shuman, University of Pittsburgh • This work is sponsored by two grants: Engineering Information Foundation grant EiF 98-01, Perception versus Performance: The Effects of Gender and Ethnicity Across Engineering Programs, and National Science Foundation Action Agenda grant EEC-9872498, Engineering Education: Assessment Methodologies and Curricula Innovations

  39. Why a Web-Based Survey System for Assessment? • Need for a mechanism to routinely • Elicit student self-assessments and evaluations • Facilitate both tracking and benchmarking • Most engineering schools lack sufficient resources to conduct requisite program assessments • Expertise • Time • Funds • Triangulation of multiple measures

  40. Pitt On-line Student Survey System (Pitt-OS3) • Allows multiple engineering schools to conduct routine program evaluations using EC 2000 related web-based survey instruments. • Assess and track students at appropriate points in their academic careers via questionnaires • Survey students throughout their undergraduate career • Freshman Pre and Post • Sophomore • Junior • Senior • Alumni • Freshman orientation expanded to include • Math Placement Examinations • Mathematics Inventory Self-Assessment
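
As a small illustration of the survey schedule above, the sketch below models the survey points (freshman pre/post through alumni) as an enumeration so each student can be routed to the questionnaire matching their point in the academic career; the routing rule and names are assumptions, not part of Pitt-OS3.

```java
/**
 * Hypothetical routing of students to the survey point listed on the slide.
 */
public class SurveySchedule {
    enum SurveyPoint { FRESHMAN_PRE, FRESHMAN_POST, SOPHOMORE, JUNIOR, SENIOR, ALUMNI }

    static SurveyPoint forStanding(int yearOfStudy, boolean startOfYear, boolean graduated) {
        if (graduated) return SurveyPoint.ALUMNI;
        switch (yearOfStudy) {
            case 1:  return startOfYear ? SurveyPoint.FRESHMAN_PRE : SurveyPoint.FRESHMAN_POST;
            case 2:  return SurveyPoint.SOPHOMORE;
            case 3:  return SurveyPoint.JUNIOR;
            default: return SurveyPoint.SENIOR;
        }
    }
}
```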

  41. Student-Focused Model • Attitudes and valuing can take on complexity • [Diagram labels: Application Area, Opportunity and Application, Knowledge-Based Competence, Synthesize Multiple Areas, Work Experience, Accept Ambiguity, Welcome Environment, Develop Comfort, Preparation, EC Outcomes, Confidence]

  42. System-Focused Model

  43. Pitt OS3 • Conduct routine program evaluation via surveys through the web • Data collection • Report generation (under development) • Web versus paper surveys • Pros • Administration ease • Minimize obtrusiveness • Data is “cleaner” • Cons • Lower response rates than paper-and-pencil surveys • User/technical issues

  44. Pitt OS3: System Components

  45. Pitt OS3: Local Administrator • Individual at the school where the surveys are being conducted • Responsible for administering the surveys through a web interface • Controls the appearance of the survey • Selects school colors • Uploads school emblem/logo • Selects survey beginning and ending dates • Composes initial and reminder email letter(s) to students • Cuts and pastes user login names and email addresses • Manages surveys in progress • Extends surveys beyond original dates
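
The local administrator's tasks listed above amount to a small per-survey configuration. The sketch below is a hypothetical model of that configuration, with invented field names rather than the Pitt-OS3 data model, including the ability to extend a survey beyond its original end date.

```java
import java.time.LocalDate;
import java.util.List;

/**
 * Hypothetical configuration object mirroring the local-administrator tasks
 * on the slide (appearance, dates, email text, participant logins).
 */
public class SurveyAdministration {
    String schoolColorHex;          // selected school colors
    String logoFileName;            // uploaded school emblem/logo
    LocalDate startDate;            // survey beginning date
    LocalDate endDate;              // survey ending date
    String initialEmailText;        // initial invitation letter to students
    String reminderEmailText;       // reminder letter to students
    List<String> participantLogins; // pasted-in login names / email addresses

    /** Extending a survey in progress beyond its original end date. */
    void extend(LocalDate newEndDate) {
        if (newEndDate.isAfter(endDate)) {
            endDate = newEndDate;
        }
    }
}
```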

  46. Pitt OS3: Local Administrator

  47. Pitt OS3: Local Administrator

  48. Pitt OS3: Local Administrator

  49. Pitt OS3: Local Administrator

  50. Pitt OS3: Student • Java applet running in a web browser • One question per screen minimizes scroll-bar confusion • Once the student submits the questionnaire, results are compressed and sent to the OS3 server • Results are stored and the student’s password is invalidated • Confirmation screen thanks the student for taking the survey • Can accommodate users who do not have email accounts
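
The original client was a Java applet; the plain-Java sketch below illustrates only the submission step described above, compressing the completed questionnaire and posting it to the survey server in one request. The server URL and payload format are placeholders, and invalidating the password and showing the confirmation screen happen on the server and client respectively.

```java
import java.io.ByteArrayOutputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

/**
 * Sketch only: compress the student's answers and send them to the survey
 * server in one POST, as the slide describes. URL and format are placeholders.
 */
public class SurveySubmitter {
    static void submit(String answersCsv) throws Exception {
        // Compress the completed questionnaire before sending.
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(buffer)) {
            gzip.write(answersCsv.getBytes(StandardCharsets.UTF_8));
        }
        byte[] payload = buffer.toByteArray();

        HttpURLConnection conn = (HttpURLConnection)
            new URL("https://os3.example.edu/submit").openConnection(); // placeholder server URL
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Encoding", "gzip");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(payload);
        }
        // The server stores the results and invalidates the student's password;
        // a success response here would trigger the confirmation ("thank you") screen.
        System.out.println("Server responded: " + conn.getResponseCode());
    }
}
```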
