
We Teach … But are they Learning?


Presentation Transcript


  1. We Teach … But are they Learning? Assessing Student Learning in D.L. Lab Science Courses Peter Jeschofnig, Ph.D. Colorado Mountain College 12th Sloan-C International Conference Nov 9, 2006

  2. Atlantic Monthly, Nov. 2005: "What Does College Teach? It's time to put an end to 'faith-based' acceptance of higher education's quality." Richard Hersh, former president of Hobart & William Smith Colleges and Trinity College, and co-director of the Collegiate Learning Assessment. Hersh & Merrow, 2005: Declining by Degrees: Higher Education at Risk

  3. Institutional Level Assessment All institutions have some form of assessment, often linked to accreditation. Of 1,393 public and private institutions surveyed in 1999, 82% listed "Excellence in Undergraduate Education" as part of their mission statement. However: direct measures of student learning remain rare!

  4. Secretary of Education's Commission on The Future of Higher Education Chairman Charles Miller believes that colleges must better measure the skills and knowledge they impart to students, and openly share that information with the public. "We need to assure that the American public understand through access to sufficient information, particularly in the area of student learning, what they are getting for their investment in a college education." "No College Left Behind?" Inside Higher Ed, February 15, 2006

  5. Secretary of Education's Commission on The Future of Higher Education Feb. 3, 2006 meeting in San Diego: Higher education institutions must be more "transparent" in collecting and giving the public useful information about their activities and their performance. The topics about which colleges should provide more and better information, the various panels suggested, include their costs and prices, how their graduates fare in the employment market, and their success in imparting knowledge and, more importantly, skills such as critical thinking to their students.

  6. National Forum on College-Level Learning (NFCLL) "I'm disappointed that NASULGC (National Association of State Universities and Land-Grant Colleges) seems to be interested in describing 'the skills and knowledge that students bring to college' but evidently not the ones they leave with. While it's considering publishing 'data on graduation rates, admissions, applications, student demographics and faculty demographics,' THE OBVIOUS OMISSION OF EVIDENCE OF STUDENT LEARNING SUGGESTS THAT THEY HAVEN'T HEARD THE CLEAR MESSAGE THAT'S BEEN COMING FROM THE STATES AND THE FEDS FOR ABOUT 20 YEARS: THAT THEY WANT TO KNOW WHAT OUR GRADUATES KNOW AND CAN DO." Margaret (Peg) Miller, Director of NFCLL, 2006

  7. Threat of Mandated Assessment Government-mandated assessments have many negative ramifications that should be avoided. However, that should not deter academic professionals from designing valid and reliable assessments of learning. A proactive academic approach to assessment is surely better than a government-mandated one!

  8. Assessment Approaches • Actuarial Data and Expert Ratings: graduation rates; research funding; student-teacher ratios; ACT & SAT scores; admissions selectivity • Student Ratings: National Survey of Student Engagement (NSSE), in which students rate their educational experience on the quantity and quality of faculty contact, homework, assignments, etc. • Direct Assessment: via grades & grade point averages • Nearly useless as learning indicators • Grade inflation • Tell nothing about knowledge gained, retained, or whether knowledge can be applied to new situations • Reflect lower-level objectives such as facts and definitions rather than higher objectives & critical thinking skills

  9. A Learning College Approach “The Learning College and its learning facilitators succeed only when improved and expanded learning can be documented for its learners.” O’Banion, 1997 The Primary Questions For Every Learning College Action: • Does this action improve or expand learning? • How do we know?

  10. Assessment ... ... is the systematic, on-going, iterative process of monitoring learning in order to determine what we are doing well and what we must improve.

  11. We Want and Need to Know… … and we want students, employers, peers, policy makers, and the public to know … how well students are able to use the complex knowledge and abilities articulated as important to their learning.

  12. Basic Assessment Types • Formative Assessment: ongoing assessment used to modify instruction and improve learning • Summative Assessment: end-of-class or end-of-program assessment to verify that learning objectives have been met; results are compared with those of other classes and institutions to gauge the effectiveness of courses and instructional programs.

  13. Assessment is Especially Effective When It ... • is student centered • is congruent with instructional objectives • is relevant • is comprehensive • is clear in purpose, directions, and expectations • is objective and fair • simulates "end" behavior/product/performance • incites active responses • shows progress/development over time

  14. Assessment Challenges • Disagreements on what should be taught • Well-conceived programs take time, energy, and money • Avoiding the problems experienced in "No Child Left Behind" assessment • Threat to academic freedom? • Classes looking too much alike?

  15. Distance Learning is Here to Stay! Online enrollments continue to grow at rates faster than for the overall student body… Schools expect the rate of growth to further increase and believe that online learning is critical to their long-term strategy… Three quarters of academic leaders at public colleges and universities believe that online learning quality is equal to or superior to face-to-face instruction, and they expect online offerings to continue to get better. The Sloan Consortium, Nov. 2004: Entering the Mainstream: The Quality and Extent of Online Education in the United States, 2003 and 2004

  16. Specific Assessment Challenges for Distance Education Using Nationally Normed Exams: • Exam content must remain confidential • Require proctored exams • Cost to institution • Inconvenience to student Institution Designed Exams: • Validity • Reliability

  17. What About Lab Sciences? If we are to avoid seeing a continuing decline in science literacy in America, lab sciences MUST be fully included in the increasing mix of online course offerings. However, there are still many instructors and institutions that do not believe lab sciences can effectively be taught at a distance. Valid and reliable assessment data is required to dispel this misconception.

  18. Assessing Online Science There is ample anecdotal evidence of student learning and satisfaction in distance science courses that use home-based lab kits to fulfill laboratory requirements. However, there has been little quantitative data to support this positive conclusion.

  19. Objective of this Study To quantitatively assess and compare the performance of my chemistry students: in a face-to-face (F2F) chemistry course with an on-campus laboratory, and in an online (DL) chemistry course with a home-based laboratory kit

  20. Process of Assessing Outcomes Administer and Compare Results for Campus-Based CHE-111 Students and Online CHE-111 Students: 1. American Chemical Society Standardized Exam Pre-test at the beginning of the semester and Post-test at the end of the course 2. Traditional homework, quiz, and exam grades 3. Laboratory reports graded via a specific rubric
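A minimal sketch (not from the presentation) of how this pre/post comparison could be scripted in Python, assuming per-student ACS pre- and post-test scores for each section are available as lists; the numbers shown are placeholders, not the study's data:

```python
# Sketch: compare ACS pre/post learning gains for the F2F and DL sections.
# Assumes per-student pre- and post-test scores are available as lists;
# the values below are illustrative placeholders, not the study's data.
from scipy import stats

f2f_pre, f2f_post = [22, 25, 19, 28], [30, 33, 27, 35]
dl_pre, dl_post = [21, 27, 24, 26], [31, 36, 30, 33]

# Per-student learning gain = post-test score minus pre-test score.
f2f_gain = [post - pre for pre, post in zip(f2f_pre, f2f_post)]
dl_gain = [post - pre for pre, post in zip(dl_pre, dl_post)]

# Did each section improve? (paired t-test, post vs. pre)
print(stats.ttest_rel(f2f_post, f2f_pre))
print(stats.ttest_rel(dl_post, dl_pre))

# Are the two sections' gains comparable? (independent-samples t-test)
print(stats.ttest_ind(f2f_gain, dl_gain, equal_var=False))
```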

  21. American Chemical Society (ACS) General Chemistry I Exam Scores based on 3007 students in 20 colleges and universities, including: University of Alabama, Albuquerque Technical College, Mercer University, Kennesaw State University, Jamestown CC, Florida Southern University, Miami University, Monroe CC, University of Pittsburgh, University of North Carolina

  22. ACS Exam Results: F2F

  23. ACS Exam Results: DL

  24. ACS Exam Results: DL vs. F2F

  25. ACS Score Comparisons

  26. ACS Post-test Scores: F2F

  27. ACS Post-test Scores: DL

  28. Final Exam vs. ACS Exam – F2F

  29. Final Exam vs. ACS Exam – DL

  30. Lab Report Grading Rubric Title Page – 5 points: Succinct and descriptive title and experiment number, author's name, partners' names, course name and number, date of experiment, date of lab report. Abstract – 10 points: A brief one- to two-paragraph statement of the purpose of the experiment and the results (i.e., relative yield, identification of unknown, etc.). Purpose/Hypothesis – 10 points: A detailed statement of the experiment's purpose and hypothesis (your predictions). Describe what you think the likely outcome of the experiment will be, what scientific principle or law will be tested, and what scientific relationship will be shown. The hypothesis should include any relationship between variables and should identify the independent vs. dependent variable. Procedure – 10 points: If you followed a detailed procedure from the lab manual, a very short procedure summary will be enough. If you used a procedure not detailed in the lab manual, you need to write the procedure in enough detail, using numbered steps, so that someone else could repeat it. This section should include any alterations or errors you made to your operating methodology. Be thorough and specific! Another chemist should be able to read your procedure and reproduce your experiment with precision. This section should not contain any results/data from the exercise.

  31. Data/Observations – 25 points: A detailed presentation of all hard data gathered during the experiment, as well as an organized presentation of all your observations during the lab. Describe not only what you see, but what it signifies in chemical terms. What is really happening? Also, answer any questions the lab poses here. Tables should be used whenever possible. Graphs need to be complete with titles, and axes must be labeled; the independent variable should be on the x-axis, the dependent variable on the y-axis. Any calculations should be in this section as well; they should be presented clearly and explained, and the source of all numbers used in calculations should be included. Results/Analysis – 20 points: A comprehensive, thoughtful discussion of what your data mean and how they proved or disproved your hypothesis, including the relationship of the variables as presented by the data. Explain any trends in the data that will be used, and present the evidence that will support your conclusions. Error analysis, including percent error, should be included here. Are the results consistent within the limitations of equipment and random error? All questions should be answered here. Conclusion – 20 points: A detailed discussion of how the lab results compare with your predictions, and what the results mean in a practical, real-world sense. How can your discoveries be applied? Have you verified your hypothesis? Did you demonstrate the scientific principle of this experiment adequately?

  32. Laboratory Report Grading Rubric (scoring form: Title of Report, Authors' names, and the point allocation for lab reports)
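Purely as an illustration (not part of the slides), the point allocation above can be encoded as a small data structure for totaling lab report scores. The section names and maximum points follow slides 30-31; the grading helper and the example scores are assumptions:

```python
# Sketch: the lab report rubric's point allocation (slides 30-31) as a
# dictionary, plus a helper that totals a graded report. The example
# scores at the bottom are illustrative only.
RUBRIC_MAX = {
    "Title Page": 5,
    "Abstract": 10,
    "Purpose/Hypothesis": 10,
    "Procedure": 10,
    "Data/Observations": 25,
    "Results/Analysis": 20,
    "Conclusion": 20,
}  # maximums sum to 100 points

def total_score(awarded: dict) -> int:
    """Sum awarded points, capping each section at its rubric maximum."""
    return sum(min(awarded.get(section, 0), maximum)
               for section, maximum in RUBRIC_MAX.items())

example_report = {"Title Page": 5, "Abstract": 8, "Purpose/Hypothesis": 9,
                  "Procedure": 10, "Data/Observations": 22,
                  "Results/Analysis": 17, "Conclusion": 18}
print(total_score(example_report))  # prints 89 (out of 100)
```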

  33. Lab Report Scores

  34. Summary of Findings • DL and F2F exam scores were basically equivalent • DL, F2F, and national ACS exam scores were basically equivalent • DL lab grades averaged 5% higher, and course grades 1% higher, than F2F • Institution exams were as effective as ACS exams for assessment Conclusions: Student learning in DL science courses with home-based lab kits is at least equivalent to, and usually a little better than, learning in face-to-face courses with a campus-based lab. Valid assessment can be achieved via institution exams.

  35. Summary of Presentation • Assessment is a Major and Growing Concern of Government, Educators, and Institutions • Assessment, especially DL assessment, presents challenges which must be met • DL Science Lab Courses Require Valid Assessment Methods to be Included in Online Course Offerings • DL and F2F Science Lab Course Learning Can Be Reliably Assessed and Compared • DL Science Lab Course Learning is Equivalent To or Better Than That in F2F Courses.

  36. Future Plans • Design secure online end-of-course assessment tools • Develop a lab skills and safety exam • Correlate ACS scores with ACT/SAT math scores Feedback and Suggestions are Welcome! pjeschofnig@coloradomtn.edu
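As a rough sketch of the planned ACS-to-ACT/SAT correlation (an assumption about how it might be computed, not the author's stated method), a Pearson correlation on paired scores takes only a few lines; the score lists below are placeholders, not collected data:

```python
# Sketch: correlate students' ACS exam scores with their ACT math scores.
# The paired score lists are placeholders, not collected data.
from scipy import stats

acs_scores = [30, 35, 27, 41, 33, 38]
act_math_scores = [22, 26, 20, 30, 24, 27]

r, p_value = stats.pearsonr(acs_scores, act_math_scores)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```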

  37. Abstract of Presentation • This Colorado Mountain College chemistry professor reviews current complaints about college assessment and discusses the importance of, and means of achieving, valid and reliable assessment tools. He utilized standardized pre- and post-semester exams, traditional testing, and a lab report grading rubric to quantitatively assess and compare the learning of students in both his face-to-face and DL online CHE-111 courses. His findings reflect that student learning in his DL online chemistry course with home-based lab kits is at least equivalent to, and actually a little better than, student learning in his face-to-face chemistry course with a traditional campus-based lab.
