USING CEM SECONDARY DATA



  1. USING CEM SECONDARY DATA: PRACTICAL APPLICATIONS IN SCHOOL. April 2011 Bristol Conference. Geoff Davies.

  2. CEM Secondary data includes: • Baseline test data (developed ability) • ‘Predictive’ data including chances graphs • Value-added data • Attitudinal data • Curriculum assessments (Insight/Sosca) • PARIS software programmes

  3. The use of this data should allow us to do our best to help every pupil at least achieve, if not exceed, their potential. It may challenge: • The culture of ‘my’ school • Accountability policy • Expectations • Staff training in the use of data, and the ability to cope with data (data overload) • Integration of the data into school procedures: storage, retrieval, distribution and access • Roles and responsibilities

  4. Carol Fitz-Gibbon, 2001, British Psychological Society: ‘It gradually dawned on me that providing the data to schools was the most important outcome of the effort, far more important than writing research papers… The provision of data to practitioners meant that they participated in the research. Indeed, they were the only ones who knew the surrounding circumstances for their classrooms, their department, each pupil, each family, etc. They were the major players: the ones who could interpret and learn from the detailed data.’

  5. ‘…there is a need for teacher-researcher posts on the Senior Management Team with a brief to develop research that is useful. Given time in the timetable, thousands of teachers could become active researchers… Educational research should be a practical reality contacting scientific enlightenment, not a mathematical weight-lifting exercise. The sense and usefulness of what we are doing induces creative thoughtfulness.’

  6. QUESTION: IS THIS ‘creative thoughtfulness’ STIFLED IN THE PRESENT EDUCATIONAL CLIMATE?

  7. PRIORITISING. Can school leaders find the time?
  Important but not urgent: is it more like 65%-80%, or 15%?
  Urgent but not important: is it more like 15%, or 60%?
  ‘I keep six honest serving-men (They taught me all I knew); Their names are What and Why and When and How and Where and Who.’ (Rudyard Kipling, The Elephant’s Child)

  8. QUESTIONS INSPIRED BY CEM CENTRE DATA
  • How do we improve our professional judgement?
  • Do MidYIS scores tell us more than we think?
  • Which ability range of pupils gets the best deal from our school?
  • Which ability range of pupils is served best by a particular subject department?
  • Can we do action research in our school using historical data?
  • Can standardised residuals tell us more about a department than just the + or -?
  • Does learning support work?
  • What do attitudinal surveys over time give us?
  • Can we compare standardised scores with comments made by teachers on reports?
  • Drilling down from SPC (statistical process control) charts.
  • Which pupils are we succeeding with?
  • Which pupils are we not succeeding with?
  • What can be done about it?
  • Is there a pattern?
  • What are the gender issues?
  • What are the prior learning issues?
  • What can we learn from plotting standardised scores over time?
  • Can SOSCA help with boy/girl issues?
  • Is Key Stage 3 teacher assessment sufficiently rigorous?
  • How can ALIS be used to inform teaching and learning?

  9. WHAT information do I want? WHY do I want it? WHEN do I need it? HOW do I collect it? WHERE can I find it? From WHOM do I get it? The CEM Centre ticks many of these boxes for schools.

  10. A small selection of questions we will look at: 1. Do MidYIS scores tell us more than we think? 2. Which ability range of students does best or least well in our school (using YELLIS data)? 3. Can we review post-16 pedagogy (using ALIS data)? 4. What did SOSCA teach us about Key Stage 3 assessment?

  11. Do MidYIS scores tell us more than we think? Using MidYIS baseline data. (Using MidYIS IPRs Booklet.pdf)

  12. What do the sections of the test measure? Vocabulary Score: The Vocabulary component is an important element for most subjects; for English, History and some Foreign Languages it is the best predictor. However, the Vocabulary score is perhaps the most culturally linked of all the scores: those who have not been exposed to vocabulary-rich talk or a wide variety of reading material, or whose first language is not English, are unlikely to have developed as high a vocabulary score as they would have in a different environment. Maths Score: The Maths score correlates well with most subjects but is particularly important when predicting Maths, Statistics, ICT, Design Technology and Economics. The Maths section has been designed with the emphasis on speed and fluency rather than knowledge of Maths. Like the Vocabulary score, the Maths score is a good predictor of later academic performance.

  13. Non-Verbal Score: The Non-verbal score is composed of three sub-tests: Cross-Sections, Block Counting and Pictures. It is important when predicting Maths, Science, Design Technology, Geography, Art and Drama. It provides a measure of the pupil’s ability in 3-D visualisation, spatial aptitude, pattern recognition and logical thinking, and can give an insight into the developed ability of pupils for whom English is a second language.

  14. Skills Score: In the Proof Reading section pupils are asked to spot mistakes in the spelling, punctuation and grammar of a passage of text, e.g. misspellings of words like ‘there’ and ‘their’. The PSA (Perceptual Speed and Accuracy) section asks pupils to look for matches between a sequence of symbols on the left and a number of possible choices on the right. Given enough time most pupils would probably get the answers correct, but we are measuring how quickly pupils can find a correct match; the PSA section allows speed to be demonstrated free from the demands of memory. The Proof Reading and PSA tests are tests for the modern world, designed to measure fluency and speed. They rely on a pupil’s scanning and skimming skills, skills that are desirable in examination situations.

  15. Some pupils will display an IPR pattern with significant differences between one or two components of the MidYIS test and the rest. These can be the most interesting, and possibly the most challenging, pupils for mainstream classroom teachers. Scenarios and anecdotal findings: it is when the IPR is placed in the hands of a teacher who knows the pupil that it becomes a powerful tool.

  16. Confidence Limits. The pupil scored 114 on the Vocabulary section. The error bars range from about 105 to 123, about 9 points either side of the pupil’s score. If this pupil were to take the test afresh 100 times, we would expect the pupil’s score to fall within the range denoted by the error bars on about 95 of those occasions.

  17. Comparing Maths and Vocabulary Scores. Are the scores significant relative to the average performance? The error bars for Vocabulary and Maths do not cross the line at 100 (average performance): performance in Vocabulary is significantly better than average, and performance in Maths is significantly below average. The error bars for the Non-verbal, Skills and Overall MidYIS scores do cross the line at 100, so the pupil cannot be considered to have performed significantly differently from the average pupil on those measures.
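A minimal sketch (Python) of the confidence-limit logic on the last two slides. The SEM of roughly 4.6 points is back-calculated from the ±9-point error bars in the example (9 ≈ 1.96 × SEM), and every score except the Vocabulary 114 is hypothetical:

```python
# Reproduce the error-bar reasoning: a 95% confidence interval of
# score ± 1.96 × SEM, then check whether it crosses the mean of 100.
SEM = 4.6          # assumed standard error of measurement, in score points
Z_95 = 1.96        # multiplier for a 95% confidence interval
NATIONAL_MEAN = 100

scores = {"Vocabulary": 114, "Maths": 88, "Non-verbal": 103,
          "Skills": 97, "Overall": 101}

for component, score in scores.items():
    lower, upper = score - Z_95 * SEM, score + Z_95 * SEM
    if lower > NATIONAL_MEAN:
        verdict = "significantly ABOVE average"
    elif upper < NATIONAL_MEAN:
        verdict = "significantly BELOW average"
    else:
        verdict = "not significantly different from average"
    print(f"{component}: {score} (95% CI {lower:.0f}-{upper:.0f}) -> {verdict}")
```

For the Vocabulary score of 114 this yields a confidence interval of about 105 to 123, matching the figures quoted on slide 16.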

  18. A SELECTION OF MIDYIS SCORES FOR ‘WATERLOO ROAD’!! Why would this be a very challenging class to teach? What do I need to know/do to teach this (difficult) class of twelve pupils? These are real, anonymised scores from a number of schools around the UK.

  19. IPR patterns and anecdotal interpretations:
  • Vocabulary scores significantly lower than other component scores: Second language? Deprived areas? Difficulty accessing the curriculum? Targeted help does work. Seen in nearly all schools. Worth further diagnosis.
  • Vocabulary scores significantly higher than other component scores: Good communicators. Get on. Put Maths problems in words?
  • Mathematics significantly higher than other scores: From the Far East? Done entrance tests? Primary experience?
  • Mathematics significantly lower than other scores: Primary experience. Use words and diagrams? Sometimes difficult to change attitude.
  • Low Mathematics scores with high Non-verbal scores: Use diagrams. Confidence building often needed.
  • Non-verbal scores different from others – high Non-verbal scores: Frustration? Behaviour problems? Don’t do as well as good communicators or numerate pupils?
  • Non-verbal scores different from others – low Non-verbal scores: Peak at GCSE? A level?
  • Pupils with low Skills scores: Exams a difficulty after good coursework?
  • High Skills scores: Do well in exams compared with classwork?
  • The average pupil: They do exist!
  • High scores throughout: a score above 130 puts the pupil in the top 2% nationally.
  • Low scores throughout: a score below 70 puts the pupil in the bottom 2% nationally.

  20. Sharing the MidYIS Information within School. Once you have received your MidYIS feedback you need to decide who will be privy to which information. Some schools keep the data within the senior management team, others share it with Heads of Department and/or Heads of Year, and some share it with all staff; and what about pupils and their parents? Use your MIS systems to put the data where it matters. MidYIS data can be useful:
  • to indicate reasons for student learning difficulties; it may go some way to explain lack of progress, and flag up causes of underachievement and even behaviour problems;
  • for all teachers and support staff, helping to support professional judgement and give a better understanding of the progress students make at school and their potential later performance;
  • to refer to for pupil reviews, writing reports, meeting parents, monitoring progress and interim assessments.

  21. 2. Which ability range of students does best or least well in our school (using YELLIS data)? Which ability range of pupils gets the best deal from our school? Which ability range of pupils is served best by a particular subject department? Can standardised residuals tell us more about a department than just the + or -? Using standardised residuals in a different way.

  22. CONTRAST THIS

  23. WITH THIS

  24. As on the last two slides, CEM provides value-added charts using average standardised residuals for departments. We are going to show how standardised residuals can be used in a different way in your school (see the sketch below and yellis exercise.doc).
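A hedged sketch of that ‘different way’: instead of averaging standardised residuals per department, bin pupils by baseline band and average per band, to see which ability range gets the best deal. The data, column names and band cut-points below are illustrative, not CEM’s published definitions:

```python
# Group pupil-level standardised residuals by baseline (YELLIS) band
# and average per band, rather than per department.
import pandas as pd

df = pd.DataFrame({
    "yellis_score": [85, 92, 98, 101, 104, 110, 118, 124],
    "std_residual": [0.6, 0.2, -0.1, -0.4, -0.5, 0.1, 0.7, 0.9],
})

# YELLIS bands are often labelled D (lowest) to A (highest); the cut
# points used here are assumptions for illustration only.
df["band"] = pd.cut(df["yellis_score"],
                    bins=[0, 90, 100, 110, 200],
                    labels=["D", "C", "B", "A"])

# A consistently negative mean in one band suggests that ability range
# is getting a worse deal than the others.
print(df.groupby("band", observed=True)["std_residual"].agg(["mean", "count"]))
```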

  25. 3. Review of post-16 pedagogy (using ALIS data)

  26. SUBJECT A 2008 Why the improvement?

  27. PEDAGOGY… ALIS surveys

  28. SUBJECT A 2005 SUBJECT A 2007

  29. SUBJECT A 2008 DO NOT GET TOO EXCITED!

  30. We have compared perceived teaching methods, as analysed by ALIS, in 2004-05 with those in 2007-08. Some subject areas appear to have changed their methods radically; others have not. Though the samples are small, it is an interesting exercise to try to correlate the changes with the departments’ statistical process control charts over that period. One would like to say that changes in the variety of teaching methods result in improvement, but the evidence is a little tenuous so far.

  31. 4. What have we learned from SOSCA?

  32. GRAPH 1 AVERAGE STANDARDISED RESIDUALS BOYS/GIRLS MIDYIS TO KEY STAGE THREE TEACHER ASSESSMENT

  33. GRAPH 2 AVERAGE STANDARDISED RESIDUALS BOYS/GIRLS MIDYIS TO SOSCA

  34. GRAPH 1 AVERAGE STANDARDISED RESIDUALS BOYS/GIRLS MIDYIS TO KEY STAGE THREE TEACHER ASSESSMENT GRAPH 2 AVERAGE STANDARDISED RESIDUALS BOYS/GIRLS MIDYIS TO SOSCA

  35. Using MidYIS and SOSCA puts the school in a strong position to improve its professional judgement of teacher assessments at Key Stage 3. Statutory testing disappeared in Wales some five years ago. Comparing the value added from MidYIS to SOSCA with that from MidYIS to KS3 teacher assessment shows up some interesting data (sketched below). Schools that depend on teacher-assessment data alone to measure value added from Key Stage 3 to Key Stage 4 need to be aware of the pitfalls; the use of SOSCA data in this exercise highlights that. See http://www.tes.co.uk/article.aspx?storycode=6062337: subjective views of pupils, as well as pressure from parents, make the model unreliable, warns Professor Tymms.
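The comparison behind Graphs 1 and 2 can be sketched as follows; all figures and column names are invented for illustration:

```python
# Average standardised residuals by gender, computed once against KS3
# teacher assessment (Graph 1) and once against SOSCA (Graph 2).
import pandas as pd

df = pd.DataFrame({
    "gender":       ["M", "M", "M", "F", "F", "F"],
    "resid_ks3_ta": [-0.3, -0.1, -0.4, 0.5, 0.3, 0.6],
    "resid_sosca":  [ 0.0,  0.1, -0.1, 0.1, 0.0, 0.2],
})

# If the gender gap is large against teacher assessment but small
# against SOSCA, that points at the assessment rather than the teaching,
# which is the conclusion drawn on the next slide.
print(df.groupby("gender")[["resid_ks3_ta", "resid_sosca"]].mean())
```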

  36. The differences appear to relate to the types of assessment used in the various subject areas. English and Welsh use extended writing for teacher assessment, which is more likely to involve subjective judgements. What we have learnt is that a moderation process built on portfolios of work for teacher assessment is not sufficient in isolation. Computer-adaptive tests such as SOSCA, and the resulting value-added information from MidYIS, are more informative in a diagnostic sense than levels produced by teachers for statutory assessment. SOSCA has also been used to investigate any changes in reading since baseline testing: a high correlation was found between the London Reading score given to pupils on entry, the MidYIS score, and the SOSCA reading test.
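A small sketch of how that correlation between the three reading-related measures might be checked; the figures are invented and the column names are assumptions:

```python
# Pearson correlation matrix across the three reading-related measures.
import pandas as pd

df = pd.DataFrame({
    "london_reading": [88, 95, 101, 104, 112, 120],
    "midyis":         [85, 97, 100, 106, 115, 122],
    "sosca_reading":  [90, 94, 103, 102, 114, 118],
})

print(df.corr().round(2))
```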

  37. PITFALLS • Tracking developed ability measures over time. • Looking at average standardised residuals for teaching sets. • Effect of one result in a small group of students

  38. REGRESSION TOWARDS THE MEAN. Pupils with high MidYIS scores tend to have high SOSCA scores, but not quite as high; similarly, pupils with low MidYIS scores tend to have low SOSCA scores, but not quite as low. This is a phenomenon seen in any matched dataset of correlated, normally distributed scores; the classic example is a comparison of fathers’ and sons’ heights. Regression lines reflect this phenomenon: if you look at the predictions used in the SOSCA value added, you can see that pupils with high MidYIS scores have predicted SOSCA scores lower than their MidYIS scores, whereas pupils with low MidYIS scores have predicted SOSCA scores higher than their MidYIS scores.
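A worked numerical sketch of this shrinkage, assuming standardised scores with a mean of 100 and an illustrative MidYIS-SOSCA correlation of 0.7 (not CEM’s published figure):

```python
# With standardised scores (mean 100) and correlation r, the regression
# prediction is: predicted_SOSCA = 100 + r * (MidYIS - 100).
r = 0.7  # illustrative correlation, assumed for this sketch

for midyis in (70, 85, 100, 115, 130):
    predicted_sosca = 100 + r * (midyis - 100)
    print(f"MidYIS {midyis:3d} -> predicted SOSCA {predicted_sosca:5.1f}")

# High MidYIS scores predict high-but-lower SOSCA scores; low MidYIS
# scores predict low-but-higher SOSCA scores, exactly as described above.
```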

  39. CLASS REVIEW: BEWARE PITFALLS OF INTERPRETATION

  40. SUBJECT M

  41. PLEA: DON’T LET THE SYSTEM DESTROY THIS ‘creative thoughtfulness’

  42. USING CEM SECONDARY DATA: PRACTICAL APPLICATIONS IN SCHOOL. April 2011 Bristol Conference. Geoff Davies.
