Assessment Decisions in ABC: Types, Instruments, and Considerations
Chapter 7 Assessment
Three Types of Assessment Decisions Teachers Make
• Identification
• Placement
• Instruction
Definitions
• Assessment: the process teachers use to make decisions
• Instruments: the procedures teachers use to collect information
• Measurement: the type of information collected by the instrument
• Evaluation: comparing pre-assessment data to post-assessment data
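Since evaluation is defined above as comparing pre-assessment data to post-assessment data, a minimal sketch of that comparison might look like the following (the student names and scores are hypothetical examples, not from the chapter):

```python
# Minimal sketch: evaluation as a pre/post comparison.
# Student names and scores are hypothetical.
pre_scores = {"Ana": 2, "Ben": 1, "Cal": 3}
post_scores = {"Ana": 4, "Ben": 3, "Cal": 3}

for student, pre in pre_scores.items():
    post = post_scores[student]
    change = post - pre
    status = "improved" if change > 0 else ("no change" if change == 0 else "declined")
    print(f"{student}: pre={pre}, post={post} ({status})")
```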
Matching the Assessment Instrument to the Assessment Decision
• What do you want to measure?
• What criteria are important to measure the level of proficiency of the skill?
• Where do you look to determine what you will measure?
Norm-Referenced
• Standardized
• Ensures that all students being tested are tested under exactly the same conditions
• Specifies all details of the testing situation
• Almost exclusively measures “product”
• Examples: the PACER test, ACT, BEST, etc.
• Student scores are compared to a set of standards provided with the instrument
• Objective
• Administrators don’t need extensive training
Criterion-Referenced (CRI)
• Judged on performance criteria
• Standardized or non-standardized
• Designed to collect process measures
• Teacher-designed: more adaptable and flexible; the teacher makes more decisions about how to conduct the test and how to score it
What Can the Student Do or Not Do?
• A more accurate measure of what the student knows
• Process
• Administrators need training to record accurately
• The test administrator must have knowledge of the skill
• How, and how well, did the student perform the skill?
Reference Standards
• The data collected are interpreted according to standardized norms
• Usually by gender and age
• NRI scores are usually numerical
• CRIs usually identify criteria for performance
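As a concrete illustration of how a norm-referenced score is interpreted, the sketch below looks up a raw score in a hypothetical age-and-gender norms table. All cutoff values are invented for illustration and are not taken from any real instrument:

```python
# Hypothetical norms table: (gender, age) -> list of (percentile, minimum raw score).
# All cutoff values are invented for illustration only.
NORMS = {
    ("F", 10): [(75, 30), (50, 22), (25, 15)],
    ("M", 10): [(75, 33), (50, 25), (25, 17)],
}

def percentile_rank(gender: str, age: int, raw_score: int) -> int:
    """Return the highest percentile band whose cutoff the raw score meets."""
    for percentile, cutoff in NORMS[(gender, age)]:
        if raw_score >= cutoff:
            return percentile
    return 10  # below the lowest listed band

print(percentile_rank("F", 10, 24))  # -> 50
```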
Valid
• Face validity
• What can confound validity in physical education?
• Complexity of the measure
• Prerequisite skills
• Complexity of the instructions
Reliable
• Intra-rater; inter-rater (see the agreement sketch below)
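One common way to check inter-rater reliability is simple percent agreement between two observers scoring the same trials. A minimal sketch, with hypothetical rubric scores for the two raters:

```python
# Minimal sketch: percent agreement between two raters scoring the same trials.
# Scores are hypothetical rubric levels for illustration.
rater_a = [4, 3, 3, 2, 4, 1, 3]
rater_b = [4, 3, 2, 2, 4, 1, 3]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = 100 * agreements / len(rater_a)
print(f"Inter-rater agreement: {percent_agreement:.1f}%")  # -> 85.7%
```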
Objective vs. Subjective
• The more objective the measure, the easier it is to achieve reliability
• What are some objective measures?
• What are some subjective measures?
Administrative Feasibility
• Cost
• Time to administer
• Setting needed to administer
• Training and preparation of the administrator
General Guidelines for Assessment
• Choose (or develop) the right instrument
• Know how to administer it!
• Set up in advance
• Optimal view
• Organization of the class
• Quick and efficient checklist, recording procedure, etc.
“ACE” Factors of Assessment
• To make sure the data you collect actually represent the content the student has learned and the skill acquired, what must you make sure your students are doing?
ACE
• Attention
• Comprehension
• Effort
• How do you know the students paid attention to the correct cues, understood the request, and performed to the best of their ability?
• Are you measuring ability or motivation to perform?
Basic Rules
• Make sure the students know what is expected of them
• Be aware of audience effects
• Don’t teach during assessment; give general feedback only
• If in doubt, don’t give credit
• Make sure the testing situation doesn’t confound the results
• Be thoroughly familiar with the components of the skill; memorize the skill and the test procedure
What Do We Do with the Data?
• Decide the focus of the next lesson, unit, etc.
• Decide how to group students (a minimal grouping sketch follows this slide)
• Formation and organization for instruction
• Determine whether students are progressing
• Is your teaching effective?
• Are the methods you chose appropriate for the content?
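As one possible way to use assessment data for grouping, the sketch below ranks students by score and splits them into small instructional groups. The names, scores, and group size are hypothetical:

```python
# Minimal sketch: group students into instructional groups by assessment score.
# Names, scores, and group size are hypothetical.
scores = {"Ana": 4, "Ben": 2, "Cal": 3, "Dee": 1, "Eli": 3, "Fay": 2}

ranked = sorted(scores, key=scores.get, reverse=True)
group_size = 2
groups = [ranked[i:i + group_size] for i in range(0, len(ranked), group_size)]
print(groups)  # -> [['Ana', 'Cal'], ['Eli', 'Ben'], ['Fay', 'Dee']]
```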
Creating Assessment Instruments
• How many assessments should an ABC K-12 curriculum have? One per objective
• An instrument can assess more than one objective
• Your project only requires one per goal area (choose one particular objective); that one assessment instrument can assess multiple skills
• Make sure it is clear which objective the assessment relates to!
• When should the assessments be created or chosen? During curriculum planning
Authentic Assessment
• Observing a student playing a game of tennis in a natural setting
• Reduces audience effects
• Actually measures the student’s ability to “play a functional game of tennis”
Rubrics
• Visible signs of agreed-upon items
• Must be built for a specific evaluation
• Can be flexible; may need to change
• Students should be provided with the rubrics
Authentic Tennis Forehand Assessment Rubric
Placed on a checklist with multiple columns for forehand strokes, checked while watching the student actually play a game:
4 – Hips perpendicular to net; level swing; ball contacted waist high just behind front foot; weight transferred from back to front at contact and follow-through
3 – Hips slanted to net; swing exaggerated high to low, or low to high; ball contacted in front of front foot or behind back foot; weight-transfer step after contact; shortened follow-through
2 – Hips open, some angle; swing from elbow rather than shoulder; reaches for the ball rather than moving to the ball; hits off back foot or little weight transfer
1 – Hips square, no angle; choppy, uncoordinated swing; reaches, or reaches and misses; standing flat-footed, no attempt at weight transfer
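To show how a rubric like this can drive a simple recording tool, here is a minimal sketch that stores the four levels (descriptions abridged) and tallies a checklist column of observed forehand strokes. The observation list is hypothetical:

```python
# Minimal sketch: the forehand rubric above as a lookup table plus a stroke tally.
from collections import Counter

# Level descriptions abridged from the rubric; observed strokes are hypothetical.
RUBRIC = {
    4: "Hips perpendicular; level swing; contact waist high; full weight transfer",
    3: "Hips slanted; exaggerated swing; early or late contact; late weight transfer",
    2: "Hips open; elbow swing; reaches for ball; little weight transfer",
    1: "Hips square; choppy swing; reach and miss; flat-footed",
}

# One rubric level recorded per forehand stroke observed during game play.
observed_strokes = [3, 2, 3, 4, 3, 2, 3]

tally = Counter(observed_strokes)
for level in sorted(RUBRIC, reverse=True):
    print(f"Level {level}: {tally.get(level, 0)} strokes")
print(f"Most frequent level: {tally.most_common(1)[0][0]}")  # -> 3
```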
Guidelines for Developing Rubrics
• Determine the learning outcome
• Keep it short and simple!
• Each rubric item should focus on a different skill
• Evaluate only measurable criteria
• Clearly state the criteria
• One sheet of paper if possible: multiple names, multiple skills
Curriculum-Embedded Assessment
• Must continually assess!
• Throughout the instructional unit, not just at the end
• Do we do a good job of this in P.E.?
• Must use instruments that directly align with the instruction
• Must assess authentically when possible
1st: Clearly Define the Objective
• To effectively manage continual authentic assessment, identify the key components of the skill
• By task analysis
• Must be at the appropriate developmental level
2nd: Clearly Define the Conditions for Administering the Assessment
• The criteria should already be embedded from when we wrote the objective
• Under what conditions: the size of the object, how it is thrown, what the catch should look like, how many times it must be completed (e.g., 3/5); see the mastery-check sketch below
• You want to make sure any changes are due to student learning, NOT a change in testing procedure!
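A minimal sketch of checking a “3 of 5 trials” mastery criterion like the one above. The trial outcomes are hypothetical:

```python
# Minimal sketch: check a "3 of 5 trials" mastery criterion.
# Trial outcomes (True = criterion met on that trial) are hypothetical.
def meets_criterion(trials: list, required: int = 3, out_of: int = 5) -> bool:
    """Return True if the student met the criterion on enough trials."""
    assert len(trials) == out_of, "Record exactly the specified number of trials"
    return sum(trials) >= required

catch_trials = [True, False, True, True, False]
print(meets_criterion(catch_trials))  # -> True (3 of 5 successful)
```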
How could you assess throwing in an “authentic” setting?
• What happens when we put students in a contrived testing situation in which they must perform individually?
• What other factors must you consider if you do individual testing?
What do you give up if you try to assess basketball skills in a regular game?
• Would that be standardized and repeatable?
Do the rules of your game elicit the skill or behavior that you want to assess?
• What are some of the problems you might encounter if you are trying to assess dribbling skills, passing skills, shooting skills, offensive tactics (decisions), and defensive stance and positioning in a 5-on-5 game?
• How could you organize the testing situation to make it more feasible and practical to obtain the data you want?
3rd: Determine How to Score the Assessment
• What “number” or “score” will sufficiently describe the level of performance?
• How will that score translate into a “grade”? (A minimal translation sketch follows.)
• How can we design the instrument to be simple enough to use efficiently, but complex enough to detect changes in student performance?
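As one possible way to translate a rubric score into a grade, the sketch below averages a student’s rubric levels and maps the average onto letter grades. The cut points are hypothetical, not prescribed by the chapter:

```python
# Minimal sketch: translate averaged rubric scores into a letter grade.
# Cut points are hypothetical examples, not prescribed values.
def to_grade(rubric_scores: list) -> str:
    average = sum(rubric_scores) / len(rubric_scores)
    if average >= 3.5:
        return "A"
    if average >= 2.5:
        return "B"
    if average >= 1.5:
        return "C"
    return "D"

print(to_grade([3, 4, 3, 3]))  # average 3.25 -> "B"
```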
4th: An Efficient Method of Recording Data
• What are some ways to design an efficient, quick way to record the score on the instrument?
• What items must be included on the actual score sheet?
• How to use technology! (A simple digital score-sheet sketch follows.)
• JOPERD 77(1), January 2006: “Integrating Assessment and Instruction: Easing the Process with PDAs”
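In the spirit of the PDA article cited above, a very small digital score sheet can be as simple as appending rows to a CSV file. A minimal sketch; the file name, column names, and sample rows are hypothetical:

```python
# Minimal sketch: a digital score sheet that appends one row per observation.
# The file name, columns, and sample data are hypothetical.
import csv
import os
from datetime import date

FIELDS = ["date", "student", "objective", "skill", "score"]

def record(path: str, student: str, objective: str, skill: str, score: int) -> None:
    """Append one scored observation to the CSV score sheet."""
    new_file = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:  # write the header once, when the sheet is first created
            writer.writerow(FIELDS)
        writer.writerow([date.today().isoformat(), student, objective, skill, score])

record("scores.csv", "Ana", "Tennis", "forehand", 3)
record("scores.csv", "Ben", "Tennis", "forehand", 2)
```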
Concepts
• Presence vs. absence
• Complete vs. incomplete
• Many, some, none
• Major, minor
• Consistent vs. inconsistent
• Frequency: always, usually, sometimes, rarely, never
Terms and Ranges
• Needs Improvement, Satisfactory, Good, Exemplary
• Beginning, Developing, Accomplished, Exemplary
• Needs Work, Good, Excellent
• Novice, Apprentice, Proficient, Distinguished
• Pee Wee, Club Team, NCAA, Professional
• Bogey, Par, Birdie, Eagle
• Numeric
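Whichever scale you choose, numeric scores and descriptive terms are interchangeable through a simple lookup. A minimal sketch using one of the scales above, with hypothetical student scores:

```python
# Minimal sketch: map numeric rubric levels onto one of the descriptive scales above.
LABELS = {1: "Beginning", 2: "Developing", 3: "Accomplished", 4: "Exemplary"}

scores = [2, 3, 3, 4]  # hypothetical rubric levels for one student
print([LABELS[s] for s in scores])
# -> ['Developing', 'Accomplished', 'Accomplished', 'Exemplary']
```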
Assessment Data Is Only Valuable If . . .
• You use it!
• It is easy enough to record, so you will use it
• It is organized enough to manage, so you will use it!
• Use it to . . .
• Make instructional decisions
• Evaluate instructional effectiveness
• Determine whether students are mastering objectives!
• Determine whether the program/curriculum is working