
OKAIRP Fall Conference





Presentation Transcript


  1. OKAIRP Fall Conference NSU @ Broken Arrow November 4, 2005

  2. Designing Methods for Determining Educational Gain: Involving Faculty in Assessment Mark L. Giese Northeastern State University Tahlequah, Oklahoma

  3. Measuring Program Dispositions as a Model • Knowledge • Skills • Dispositions

  4. What are dispositions? • What is the big deal about dispositions? • Dispositions are: • Concepts we inherently value as a profession. • Esoteric to the discipline.

  5. What are dispositions in law? • Ethical • Passion for service • Accessible

  6. What are dispositions in Education? In Health and Kinesiology, dispositions may include: • Hard work • Teamwork • A belief that our students or clients can improve with practice

  7. Who is involved? • Use all persons with a stake: • Faculty • Administrators • Let faculty make mistakes

  8. Why measure? • Disposition measurement addresses the affective learning domain. • In the past, typical areas of measurement have included only knowledge and skills. • Accrediting agencies (learned societies) require measurement in this domain.

  9. Steps in developing a measurement tool for your unit • Planning • Review the literature and ask your friends what they are doing. • Decide on which dispositions to measure (committee). • Decide if you intend to use this exercise as an exit competency or if you are looking for a change from entry to exit. • Determine an a priori standard.

  10. Steps in developing a measurement tool for your unit • Implementation • Devise a measurement tool (rubric) • Collect your data • Analyze your data • Determine if an intervention is necessary • REMEASURE!
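
The implementation loop above — devise a rubric, collect data, analyze, flag interventions, remeasure — can be sketched in a few lines of Python. The criterion names, scores, and the 2.5 standard below are hypothetical placeholders, not an actual NSU instrument.

```python
# Hypothetical rubric data: each criterion maps to scores on a 4-point scale.
rubric_scores = {
    "work ethic": [3, 4, 2, 3, 4, 3],
    "teamwork": [2, 2, 3, 2, 1, 2],
    "belief in improvement": [4, 3, 4, 4, 3, 4],
}
A_PRIORI_STANDARD = 2.5  # hypothetical minimum acceptable mean

def needs_intervention(scores, standard=A_PRIORI_STANDARD):
    """Return True when the mean rubric score falls below the a priori standard."""
    return sum(scores) / len(scores) < standard

for criterion, scores in rubric_scores.items():
    mean = sum(scores) / len(scores)
    verdict = "intervene, then REMEASURE" if needs_intervention(scores) else "meets standard"
    print(f"{criterion}: mean={mean:.2f} -> {verdict}")
```

The point of the sketch is the shape of the cycle: the standard is set before the data are collected, and any criterion that falls short triggers an intervention followed by remeasurement.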

  11. Examples • Example of a college rubric for measuring dispositions in pre-service teacher education candidates in the COE at NSU.

  12. Examples • Example of a Sociology department rubric for measuring dispositions: • Attempt to measure “Social Distance”

  13. Importance • Measuring dispositions is important because it gives us another piece of information to determine our effectiveness and quality.

  14. Examples of Presentations on Social Distance • “Social Distance as a Disposition of Teacher Candidates.” Presented at AACTE, Chicago, Illinois, February 8, 2004.

  15. Examples of Presentations on Social Distance • “Attitudes Towards Social Distance of Pre-Service Teacher Education Students.” Presented at Higher Education Research Day, Edmond, Oklahoma, November 2003.

  16. Examples of Presentations on Social Distance • “Attitudes Towards Social Distance of Pre-Service Teacher Education Students.” Presented at the Oklahoma Association of Colleges for Teacher Education Conference, Norman, Oklahoma, November 9, 2000.

  17. Two Types of Measurement • One-time (barrier) • Demonstrate program effectiveness (gain)

  18. Measurement Goals • One-time: a graduation requirement, exit competency, etc. • Program gain: a pre-/post-test design, which may be administered at entry and exit of any major.

  19. Does your data fit the model? • Parametric (t, ANOVA) • Non-parametric
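
One rough way to decide between the parametric and non-parametric branches is to screen the data for symmetry before reaching for a t-test or ANOVA. The cutoffs below are arbitrary illustrative assumptions, not a substitute for a formal normality test such as Shapiro-Wilk (`scipy.stats.shapiro`).

```python
import math

def sample_skewness(xs):
    """Adjusted Fisher-Pearson sample skewness."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    return (n / ((n - 1) * (n - 2))) * sum(((x - mean) / sd) ** 3 for x in xs)

def suggest_test(xs, skew_cutoff=1.0, min_n=30):
    """Crude screen: roughly symmetric or large-n data -> parametric (t, ANOVA);
    otherwise fall back to a non-parametric alternative."""
    if len(xs) >= min_n or abs(sample_skewness(xs)) < skew_cutoff:
        return "parametric"
    return "non-parametric"
```

For example, `suggest_test([1, 2, 3, 4, 5])` points to a parametric test, while a small, heavily skewed sample such as `[1, 1, 1, 1, 10]` points to a non-parametric one.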

  20. Determining Program Gain • Measure at pretest. • Measure at post-test. • Compute the difference. • Use hypothesis testing to determine whether a significant difference exists.
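
For this pre/post design, the usual hypothesis test on the differences is the paired-samples t-test. A minimal sketch with invented scores follows; the resulting t statistic would be compared to a critical value for df degrees of freedom (`scipy.stats.ttest_rel` does the whole job, including the p-value).

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic on pre/post differences (df = n - 1)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample standard deviation of the differences
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d / (sd_d / math.sqrt(n)), n - 1

# Hypothetical disposition scores at program entry and exit
pre = [10, 12, 9, 11, 13, 10, 8, 12]
post = [13, 14, 11, 14, 15, 12, 10, 15]
t, df = paired_t(pre, post)
```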

  21. How ANCOVA works • The variable under study is the dependent variable and the covariate is the “other” data point. Both must be known for each subject (matched).

  22. How ANCOVA works • The influence of the covariate is removed using a straightforward linear regression method. • The ANCOVA produces an adjusted post-test score for each subject and an adjusted mean as well.

  23. ANCOVA Example • Three methods of teaching French: • Pretest → instruction → Post-test • Measures of intelligence available for each student • Remove the effect of intelligence so that the French method, rather than ability, accounts for the change from pre- to post-test score.
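
The adjustment slide 22 describes can be computed by hand: regress the post-test on the covariate within groups, then shift each group mean as if all groups had started from the grand-mean covariate value. The IQ and post-test numbers below are invented for illustration; a real analysis would use a package such as statsmodels for the full F-test.

```python
def pooled_within_slope(groups):
    """Pooled within-group regression slope of post-test (y) on covariate (x)."""
    sxy = sxx = 0.0
    for xs, ys in groups:
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        sxy += sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx += sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

def adjusted_means(groups):
    """Group post-test means adjusted to the grand-mean covariate value."""
    b = pooled_within_slope(groups)
    all_x = [x for xs, _ in groups for x in xs]
    grand_mx = sum(all_x) / len(all_x)
    return [sum(ys) / len(ys) - b * (sum(xs) / len(xs) - grand_mx)
            for xs, ys in groups]

# Hypothetical data: (IQ scores, post-test French scores) per method group
groups = [
    ([100, 105, 110, 95], [70, 75, 80, 65]),    # Method 1
    ([120, 115, 110, 125], [85, 80, 78, 88]),   # Method 2 (high-IQ group)
    ([90, 95, 100, 105], [60, 66, 70, 74]),     # Method 3 (low-IQ group)
]
adj = adjusted_means(groups)
```

With these made-up numbers, the high-IQ group's raw advantage on the post-test shrinks after adjustment, and the low-IQ group's mean is pulled up — which is exactly the point: remaining differences are easier to attribute to the teaching method.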

  24. ANCOVA [Figure: pre- and post-test score distributions for Groups 1, 2, and 3, with each group mean marked X]

  25. Assumptions of ANOVA • Homogeneity of variance • Normal distribution • Random sampling • Additive effects

  26. Extra Assumptions of ANCOVA • The independent variable does not influence the covariate • Equal regression slopes across groups • Linearity • A covariate–DV correlation of about .60 (Shavelson, 1996)
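
The equal-regression-slopes assumption can be eyeballed by fitting the covariate slope separately in each group and checking how far apart the slopes fall. The tolerance below is an arbitrary illustration; the formal check is an F-test on the group-by-covariate interaction.

```python
def group_slope(xs, ys):
    """Least-squares slope of y on x within one group."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def slopes_look_homogeneous(groups, tolerance=0.3):
    """Crude screen: do the within-group slopes stay within `tolerance`?"""
    slopes = [group_slope(xs, ys) for xs, ys in groups]
    return max(slopes) - min(slopes) <= tolerance

# Toy (x, y) groups: ga and gb share slope 2; gc slopes the opposite way
ga = ([1, 2, 3, 4], [2, 4, 6, 8])
gb = ([1, 2, 3, 4], [3, 5, 7, 9])
gc = ([1, 2, 3, 4], [8, 6, 4, 2])
```

Here `slopes_look_homogeneous([ga, gb])` passes the screen, while `[ga, gc]` fails it, signaling that ANCOVA's adjustment would not mean the same thing in both groups.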

  27. Review • Ask faculty if they need/desire to measure dispositions. • Assist with choosing an instrument and measurement model. • Assist in determining gain and the magnitude of the effect.

  28. Questions?
