
The Impact of No Child Left Behind (NCLB) on School Achievement and Accountability.






Presentation Transcript


  1. The Impact of No Child Left Behind (NCLB) on School Achievement and Accountability. Glenn Maleyko Wayne State University Detroit, MI October 17, 2011, Dissertation Defense

  2. No Child Left Behind (NCLB) • The No Child Left Behind (NCLB) reform may be the most significant legislation affecting public education enacted by the federal government during the past 35 years (Peterson & West, 2003). • The implementation of public education legislation has historically and constitutionally been the responsibility of individual states; however, NCLB educational policy at the federal level sets mandates for performance standards with required consequences if they are unmet (Hess & Finn, 2004).

  3. Statement of the Problem • The purpose of this study was to evaluate the effectiveness of AYP at measuring school success in order to establish the conditions for reform at the school and classroom levels. • The study included an analysis of both quantitative and qualitative data.

  4. Refer to Handout Packet #1

  5. Sample Population • Quantitative: all schools that took the NAEP assessment in reading and mathematics at the 4th and 8th grade levels during 2005 and 2007 in the sample states: • 1. California • 2. Michigan • 3. North Carolina • 4. Texas • Qualitative: four schools from the quantitative dataset. • Four teachers and one administrator were selected from each school for the semi-structured interview protocol.

  6. National Assessment of Educational Progress (NAEP) Assessment • The NAEP uses a complicated formula (plausible values) • These data were difficult to use, and this is where I spent the majority of my data-analysis time in this study • A formula was created to derive school-level proficiency on the NAEP, the first of its kind • Refer to Handout Packet #2 for variable definitions.
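The slide above does not reproduce the study's actual school-level proficiency formula. As a rough illustration only, the standard plausible-value approach computes the statistic of interest separately with each of NAEP's five plausible values and then averages the results. The sketch below follows that convention; the column names (school_id, pv1–pv5, weight) and the cut score are hypothetical placeholders, not the dissertation's variables.

```python
# Minimal sketch of a school-level NAEP proficiency estimate built from
# plausible values. Column names and the cut score are hypothetical;
# the dissertation's actual formula is not shown on the slide.
import pandas as pd

PV_COLS = ["pv1", "pv2", "pv3", "pv4", "pv5"]   # NAEP reports 5 plausible values per student
CUT_SCORE = 238                                  # illustrative cut score for the basic level

def school_proficiency(students: pd.DataFrame) -> pd.Series:
    """Weighted percent of students at/above the cut score, averaged over plausible values."""
    rates = []
    for pv in PV_COLS:
        at_or_above = (students[pv] >= CUT_SCORE).astype(float)
        # weighted proportion at/above the cut score within each school
        weighted = (
            (at_or_above * students["weight"]).groupby(students["school_id"]).sum()
            / students["weight"].groupby(students["school_id"]).sum()
        )
        rates.append(weighted)
    # average the five per-plausible-value estimates to get a school-level rate
    return sum(rates) / len(PV_COLS) * 100
```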

  7. Pearson Correlation Analysis: r value defined • Minor: (+ or -) .000 to .099 • Moderate: (+ or -) .100 to .399 • Large, Strong, or Sizable: (+ or -) .400 to .549 • Extremely Large or Strong: (+ or -) .550 to 1.0
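For reference, the r bands above can be applied mechanically once a correlation is computed. A small sketch using scipy is shown below; the two data arrays are placeholders for illustration, not values from the study's dataset.

```python
# Sketch: compute a Pearson r and label it with the terminology above.
# The data arrays are placeholders; the band boundaries follow the slide.
from scipy.stats import pearsonr

def label_r(r: float) -> str:
    a = abs(r)
    if a < 0.100:
        return "Minor"
    if a < 0.400:
        return "Moderate"
    if a < 0.550:
        return "Large, Strong, or Sizable"
    return "Extremely Large or Strong"

ayp_rate = [0.2, 0.5, 0.7, 0.9, 0.4]        # e.g., share of AYP targets met per school
naep_basic = [0.3, 0.55, 0.65, 0.85, 0.5]   # e.g., share at/above NAEP basic per school
r, p = pearsonr(ayp_rate, naep_basic)
print(f"r = {r:.3f} ({label_r(r)}), p = {p:.3f}")
```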

  8. Research Question One Findings • California, Michigan, some of the NC datasets, and the 8th grade Texas dataset showed a stronger relationship between AYP and the basic-level NAEP scale than between AYP and the proficient-level NAEP scale • Please refer to Handout Packet #3 and #4 for NAEP basic and proficient level definitions.

  9. Michigan 8th Grade 2007 AYP Please refer to Handout Packet #5

  10. North Carolina and Texas • Most of the North Carolina datasets produced a stronger relationship between AYP and the proficient-level NAEP scale than between AYP and the basic-level NAEP scale • The Texas datasets showed only a very minor relationship between AYP and NAEP proficiency at both the proficient and basic levels

  11. North Carolina 2007 4th Grade Dataset Please refer to Handout Packet #6

  12. Research Question 2 Results • Michigan, California, and some of the NC datasets produced Pearson correlation and logistic regression results showing a stronger relationship between the state accountability assessments and the basic-level NAEP scale than between the state accountability assessments and the proficient-level NAEP scale

  13. Research Question 2 Results • California showed the greatest improvement from the constant model to the predicted model, indicating that the logistic model had a strong influence; thus the logistic model was useful in California. • It is important to emphasize that logistic regression is a more powerful analytical tool for predicting the probability that a school will meet proficiency status.
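As a rough illustration of the constant-vs.-predicted comparison (the classification-table approach reported by packages such as SPSS), the sketch below fits an intercept-only model and a model with demographic covariates, then compares their classification accuracy. The file name and column names (AYPSTATUS, EDPER, HISPANICPER, BLACKPER, ELLPER) are placeholders patterned on the study's variables, not its actual dataset.

```python
# Sketch: compare an intercept-only ("constant") logistic model with a
# predicted model that adds demographic covariates. File and column
# names are hypothetical, not the study's actual data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

schools = pd.read_csv("california_2005_grade8.csv")   # hypothetical file
y = schools["AYPSTATUS"]                              # 1 = met AYP proficiency status, 0 = did not

# Constant model: intercept only, so every school gets the same predicted probability.
ones = np.ones((len(y), 1))
constant = sm.Logit(y, ones).fit(disp=0)

# Predicted model: intercept plus demographic predictors patterned on the study.
X = sm.add_constant(schools[["EDPER", "HISPANICPER", "BLACKPER", "ELLPER"]])
predicted = sm.Logit(y, X).fit(disp=0)

def accuracy(model, exog):
    """Share of schools classified correctly at a 0.5 probability threshold."""
    return ((model.predict(exog) >= 0.5).astype(int) == y).mean()

print("Constant model accuracy: ", accuracy(constant, ones))
print("Predicted model accuracy:", accuracy(predicted, X))
# A large gain from constant to predicted (as reported for California) suggests
# the covariates add real predictive power; little gain (as in Texas, where
# nearly all schools met AYP) leaves the constant model hard to beat.
```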

  14. Constant Model 2005 California 8th Grade Please refer to Handout Packet #7

  15. Predicted Model 2005 California 8th Grade Please refer to Handout Packet #7

  16. Research Question 2 Results • Texas and California had similar sample sizes and population demographics, including higher levels of the HISPANICPER variable • However, the Texas results were not very useful; there was only a very minor relationship between the state accountability assessments and the NAEP as measured by the logistic regression and Pearson correlation results

  17. Research Question 2 Results • Explanation: an extremely high percentage of schools in Texas met proficiency status, resulting in low variability in the dependent variable. • Texas had the weakest relationship with the NAEP at both the basic and proficient levels among the four sample states • This suggests less rigor in the Texas state accountability assessments

  18. Texas 2007 4th Grade Mathematics: Pearson Results Please refer to Handout Packet #8

  19. Texas 2007 4th Grade Math Logistic: Constant Model Please refer to Handout Packet #9

  20. Research Question 2 Findings • The California and NC logistic regression results were the most useful in this study • The Pearson results in Michigan were useful, showing that there was a strong association between the basic-level NAEP scale and the state accountability assessment results • The Texas results were the least useful

  21. Michigan 2005 8th Grade Mathematics Please refer to Handout Packet #10

  22. Research Question 2 Results: NC • North Carolina produced some of the most intriguing results • DUMMYREV had a positive impact in NC • The 2007 4th and 8th grade mathematics results produced the largest r values in relation to the proficient-level NAEP scale • The logistic regression results also showed a strong relationship between the 2007 NC state accountability assessments at the proficient level and the NAEP assessments in mathematics.

  23. 2007 NC 4th Grade Math Please refer to Handout Packet #11

  24. 2007 North Carolina Math Please refer to Handout Packet #12

  25. Research Question 2 • However, NC changed its reading cut scores in 2007, so there was a very weak relationship for the 2007 reading assessments • Cut-score manipulation was one of the findings in the literature review (Guilfoyle, 2006; Harris, 2007; Sunderman et al., 2005) • Other cut-score manipulations have occurred, e.g., Michigan's 2011 college-readiness cut scores

  26. Research Question 3 • The logistic regression results were similar to the research question 2 results • The NC and California logistic results were the most useful • The Texas logistic regression results were less useful, supporting the finding that there was a low level of rigor in the Texas assessments, as a high percentage of schools met AYP in Texas • This was an interesting finding in itself, as almost all schools in Texas met AYP during the 2005 and 2007 school years

  27. Economically Disadvantaged (EDPER) Variable: All 3 Research Questions • The EDPER variable produced a consistent negative association with AYP proficiency; the results were generally moderate to sizable • This was the most consistent finding in the study across both the Pearson correlation and logistic regression results.

  28. California 2007 8th Grade Mathematics Please refer to Handout Packet #13

  29. NC 2005 8th Grade Mathematics Please refer to Handout Packet #14

  30. Demographic Variables: 3 Research Questions • The SPECIALEDPER and ELLPER variables produced inconsistent results; ELLPER had a greater negative impact in California • The NATAMPER, BLACKPER, and HISPANICPER variables were not consistent among the different datasets • However, BLACKPER in Michigan produced a consistent negative influence on the probability that a school met proficiency status

  31. Demographic Variables: 3 Research Questions • TOTALREVENUE produced minor to moderate positive associations with state AYP and the accountability assessments • TOTALREVENUE in NC had a strong positive association with the NAEP proficient-level scale in comparison to state AYP

  32. Demographic Characteristic Findings from 3 Research Questions • The ASIANPER and WHITEPER variables were inconsistent but generally had a positive influence on state accountability results

  33. Research Question 4: School Population • Each school had different demographics: • Blue School: low ED population • Red School: high ED population • Orange School: high ED and ELL populations • Purple School: high ED population, located in an urban setting, with experience of restructuring. Please refer to Handout Packet #15

  34. Research Question 4 Findings • Based on participant responses, AYP identification and sanctions will not lead to improvement • The sanction provisions lack scientific evidence of effectiveness; the literature review aligned with participant responses • Ex. Red School principal: “Dropping down the hammer will not work.” Additional support is needed.

  35. Research Question 4 Findings • Purple School went through the NCLB restructuring process. • Participants did not feel that restructuring led to improvement. • The removal of one grade provided change and actually gave the school a better statistical chance of making AYP, as identified in the literature review.

  36. Research Question 4 Findings • EDPER variable and triangulation: the literature review, the quantitative dataset, and the qualitative data established that this variable had the greatest impact on student achievement as measured by AYP and the standardized assessments. • The Blue School principal was one of the only respondents who felt the ED population did not make a difference with AYP • Blue School had less than 3% ED.

  37. Research Question 4 Findings: Social Capital Influence • Researchers (Elmore, 2002; Harris, 2007; Mathis, 2004; Mathis, 2006) contend that AYP measures school demographics and the social capital that students bring to the school rather than school effectiveness. If this is the case, then the external validity of the AYP measurement formula is questionable. • Orange School staff felt that ELL students had the greatest impact, but their ELLPER population matched their EDPER population

  38. Research Question 4 Findings: School Funding • Most participants felt that school funding was key. • Title One funds and additional resources were important when serving ED students. • However, it is interesting that Blue School did not feel this way; funding in their district was lower than in others, yet they had a greater level of social capital in the school. • It is probable that social capital combined with an effective SIP was key to their success.

  39. Research Question 4: School Funding as a Two-Tiered Variable • TOTALREVENUE had a positive impact or a mild negative impact in the quantitative dataset • The qualitative analysis helped to create a finding listing TOTALREVENUE as a two-tiered variable • Funding is important, but step two is how the funds are used: effectively or not effectively

  40. Research Question 4 Findings: Positive Consequences • Focus on the SIP and increased focus on scientific strategies and research-based best practices (ex. differentiated instruction) • Increased collaboration among faculty • Increased focus on analyzing data and electronic data systems • Increased focus on subgroup populations and individual student data at the classroom level

  41. Research Question 4: Philosophical Intent vs. Negative Impact or Unintended Consequences • Increased pressure on administration (exception: Blue School) • Additional paperwork and tasks for administration • Stress on teachers and administrators • Possibility that some educators would not want to work in schools labeled as failing • Merit pay, etc.

  42. Red School principal response • “Well, I think it’s narrowing the curriculum. And it’s… taking away some of the choices kids have because they have to, you know, have to do all math. It kind of bothers me because I, you know, I think the purpose of education is to also help a child become more well rounded. But they also need to know math.”

  43. Research Question 4: Negative Impact or Unintended Consequences • Low morale (ex. Orange School) • Orange School implemented an effective SIP, based on my triangulation of the data, but was not getting AYP results; one subgroup had a negative impact • Less focus on the higher-level thinking skills needed for success after leaving the public school system

  44. Research Question 4: Internal Capacity • There was no support at the school level for increased internal capacity and development • Many researchers in the field posit that the AYP formula is faulty because it relies on sanctions and punishments and fails to provide schools with the internal capacity to make change (Abelmann & Elmore, 1999; Elmore, 2002; Fullan, 2006; Schoen & Fusarelli, 2008).

  45. Practical Implications • Data analysis is a positive outcome of AYP. • There was an increased focus on subgroups. • More support is needed at the building level to help schools implement effective SIPs. • There is potential to make a great difference via accountability reform.

  46. However: We need to create an effective system to measure school progress • This study shows that the current AYP measurement system is not effective at measuring school success and improvement. • Researchers maintain there is a lack of consistency with AYP that results in reliability issues when using AYP to measure school effectiveness (Elmore, 2002; Guilfoyle, 2006; Kane & Douglas, 2002; Linn & Haug, 2002; Porter et al., 2005; Schoen & Fusarelli, 2008; Wiley et al., 2005).

  47. Theoretical Implications • Single-measure standardized accountability systems result in unintended consequences • “If a single annual test were the only device a teacher used to gauge student performance, it would indeed be inadequate. Effective teachers assess their students in various ways during the school year.” (Guilfoyle, 2006, p. 13)

  48. Theoretical Implications • The current system has a heavy reliance on standardized assessments (a one-size-fits-all measure) • A multiple-measures approach to evaluating school effectiveness would be more effective • Growth data formulas • Use of the NAEP

  49. National Curriculum and Use of NAEP • A move towards a national curriculum and the Common Core Standards is gaining momentum • As mentioned, this was the first study that could be found that developed a school-level proficiency measure using the NAEP

  50. Summary and Questions
