
National Assessment of Educational Progress


Presentation Transcript


  1. National Assessment of Educational Progress Ashley Singer University of Central Florida ARE 6905 April 16, 2013

  2. Purpose • Respond to teachers’ demand for practical applications of the results • Add more value and clarity to existing NAEP testing by addressing the lack of a teacher questionnaire in the visual arts assessment, a questionnaire that is present in nearly all other NAEP subjects.

  3. Why It’s Important • This project exists because of the lack of clarity that accompanies NAEP visual arts assessment data • Nothing currently offers possible explanations for the results • Teachers are left to interpret the numbers without guidance • A questionnaire would add context to the data • Teachers could see what is current or trending in the classroom • What could be lacking in their curriculum • What is proving successful in their practice • Universities and schools could use it to see which educator training programs have been successful in equipping teachers for their fields • And to see what teachers may be lacking in their classrooms and how to provide it

  4. Research Questions • How can NAEP clarify the results of the visual arts assessment by adding a teacher questionnaire with common practice and teacher background similar to existing teacher questionnaires of other subjects? • How can the demographic and background information be applied to understanding knowledge and experience as well as hiring trends? • Do the findings suggest certain training and specialties lead to classroom achievement? • What areas of art education are being concentrated on and what areas are being neglected? • How could we take the results to further develop a NAEP curriculum and understand best practices?

  5. What I’ve Learned • Creating a well-done questionnaire is difficult • The basic structure and style are simple • The scientific approach is tedious and thought-provoking • There are various steps to developing a questionnaire • It is not just writing whatever questions you want answers to and expecting reliable results from them • It was overwhelming to develop questions that would yield the best applications for educators while still answering my research questions

  6. What I’ve Learned • It is critical to review who is writing tests, papers, and surveys • What do you want to know? • Questions are likely based on what the writers know, their experience, or what they want to know • That may not be a true representation of the information • Boards and panels are important • They can still be influenced by a central philosophy or a philanthropist • But they reduce bias by drawing on multiple experiences and perspectives

  7. What I’ve Learned • Objectivity and adjustments • Analyzing previous and test-specific data • Research’s ultimate progress • Minor and major changes are made to improve tests • Discussion of limitations shows what could be better • Changes are not personal – just progress • Adjustments to create another test • Designed around teachers’ training and preparedness • Areas of focus, certification process, work history, etc. • What is making teachers ready for the classroom

  8. What I’ve Learned • NAEP • Obvious need for more clarity • If complaints are lack of application, they have to find ways to make it relevant • Educators must be a part of the process • Either in test development, research, or advocacy • NAEP could find more ways to reach out to teachers • Whether you are a researcher or a teacher, you cannot continue doing things the same way and expect different or better results

  9. Review of Literature • “Finally, the arts assessment reminds us once again that arts education is for all students, not just for the talented. No one has suggested that math or science should be taught only to students with talent in those disciplines. The arts, similarly, provide long-term benefits that are important for every student. Experience has demonstrated to arts educators that all children can learn basic arts skills and knowledge, provided that they begin instruction early enough.” (Lehman, 1999) • “Most NAEP assessments” have a teacher questionnaire (NAEP, 2012) • The common education practitioner often has difficulty gleaning consequence and meaning from the scores – we must ask what we know about these teachers (Eisner, 1999) • “Test performance, like paintings, needs to be ‘read,’ not only seen. Information needed to give test scores a deep reading is very limited” (Eisner, 1999) • A recent study “revealed that untrained people do not simply walk into classrooms and become successful”; prepared and certified teachers are more successful than untrained ones (Hatfield, 2007) • Test results only leave readers with “value without clarity” (Diket & Brewer, 2011)

  10. Review of Literature • “While teachers’ completion of the questionnaire is voluntary, NAEP encourages their participation since their responses make the NAEP assessment more accurate and complete” (Teacher Questionnaire, 2011) • Covers: • “teaching experience, certifications, degrees, major and minor fields of study, coursework in education, course work in specific subject areas, the amount of in-service training, the extent of control over instructional issues, and the availability of resources for the classroom” (Teacher Questionnaire, 2011) • “pre- and in-service training, the ability level of the students in the class, the length of homework assignments, use of particular resources, and how students are assigned to particular classes” (Teacher Questionnaire, 2011)

  11. Methodology Population • Similar to NAEP sample selection • The sample needs to be directly related to the test results • Teacher questionnaires must match up with NAEP participants’ classrooms, schools, districts, etc. • NAEP participation is entirely voluntary • The teacher survey would also be voluntary • There is no way to accurately forecast who will take part in the research or how well they represent the actual population of United States visual arts classrooms • The NAEP visual arts exam covers only eighth-grade students • The survey would only be administered to the corresponding eighth-grade visual arts teachers

  12. Methodology Procedures • Will closely follow NAEP testing to adhere to procedural protocol • Teachers will be given a general background questionnaire and a subject-area-specific questionnaire • The questionnaire consists of a series of select-response questions • Teachers will mark their answers in their booklet or record answers online as accurately as possible • Once the survey is finished, the online answers will be saved, or the booklet can be given to the NAEP school coordinator • Methodology – Descriptive/Quantitative • Used to look for trends and graph opinions, facts, and demographic data • Used to make recommendations for classroom application • Could provide effective information for correlation tests

  13. Instrumentation • Development based on: • Other teacher questionnaires • Reading and writing teacher questionnaire. (2011). National Assessment for Educational Progress. • Writing teacher questionnaire. (2010). National Assessment for Educational Progress. • Teacher data in NAEP Data Explorer • NAEP 1997 national theatre results. (2002). National Assessment of Educational Progress. • Considered questionnaire development resources • Gillham, B. (2000). Developing a questionnaire. New York, NY: Continuum. • NAEP teacher questionnaire overviews • Teacher questionnaire. (2011). National Assessment of Educational Progress.

  14. Data Analysis • Best done by a professional statistician • Per the advice for collaboration within quantitative research (Brewer, 2013) • Descriptive • Analysis will show trends, demographic data, etc. • Correlation • Correlation testing to note potential relationships between student results and teacher questionnaire responses
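As a rough illustration of the correlation step described above, the sketch below computes a Pearson correlation coefficient between one hypothetical questionnaire item (years of visual arts teaching experience) and the mean NAEP score of each teacher's students. All data values here are invented for demonstration only; a real analysis would use the actual survey and assessment data and would be carried out by a statistician.

```python
# Hypothetical data, invented for illustration: years of visual arts
# teaching experience per teacher, and the mean NAEP visual arts score
# of that teacher's students.
experience = [2, 5, 8, 12, 15, 20]
mean_scores = [148, 150, 155, 154, 160, 158]

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance numerator and the two standard-deviation terms.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(experience, mean_scores)
# A positive r suggests a relationship worth deeper study; as the slides
# note, correlation supports seeing relationships, not claiming causes.
print(f"r = {r:.3f}")
```

Note that this only flags an association; deciding whether the association is meaningful would require significance testing and a far larger sample.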

  15. Results and Implications • Speculative in nature • Descriptive and correlation research • Whatever results are reported, they will be limited to: • Making recommendations, not judgments • Seeing relationships, not causes • The questionnaire would add transparency to results • Show which subjects are highly promoted or often neglected in classrooms • See which practices (i.e., writing, production, assessment, presentation, critical analysis) are being done in classrooms and which are not • Reveal the educational background, current practice, and field training of the teachers

With that information, we can compare the educators with the “ideal practices,” see how their classrooms performed on NAEP testing, and determine possible explanations for success or failure by looking for patterns.

  16. Results and Implications • More background information = results will likely be more generalizable and reliable (Brewer, 2013) • Results from NAEP would follow this principle with teacher background • Generalizability → usefulness • A step toward examining the school structure and culture that Eisner calls for in order to make improvements in student achievement (1999) • Could affect the qualifications for hiring and successful preparation programs if Hatfield is correct • Relationships between student success and certain visual arts subjects and practices → individual classroom structures may progress, with a possibility for curriculum improvements • Rationale for teacher adjustments

The call for direct applications may finally be heard and answered.

  17. Limitations • Not having a board or a panel creating the survey • Solely developed by me • Based on what I want to know – no hidden agendas • No other perspectives or experiences • Based on my experience, or lack thereof • May cause assumptions because of what I think I know about the issues (Gillham, 2000) • Quick development • No pre-pilot or pilot stage • This affects wording and understanding (Gillham, 2000) • Development assumed sound based on other questionnaires • Sample population is variable

  18.–24. Instrumentation (slides 18–24 display the draft questionnaire pages as images; not reproduced in this transcript)

  25. References

  • Brewer, T. (Forthcoming, 2013). A primer for today’s quantitative research in art education. In K. Miraglia & C. Similian (Eds.), Inquiry in Action: Research Methodologies in Art Education. Reston, VA: National Art Education Association.

  • Diket, R. M., & Brewer, T. M. (2011). NAEP and policy: Chasing the tail of the assessment tiger. Arts Education Policy Review, 112(1), 35-47. Retrieved from http://www.informaworld.com.ezproxy.lib.ucf.edu/openurl?genre=article&id=doi:10.1080/10632913.2011.518126

  • Eisner, E. W. (1999). The national assessment in the visual arts. Arts Education Policy Review, 100(6), 16-20. Retrieved from http://ezproxy.net.ucf.edu/login?url=http://search.ebscohost.com.ezproxy.net.ucf.edu/login.aspx?direct=true&db=eric&AN=EJ624037&site=ehost-live

  • Gillham, B. (2000). Developing a questionnaire. New York, NY: Continuum.

  • Hatfield, T. A. (2007). The unevenness of arts education policies. Arts Education Policy Review, 108(5), 9-13. Retrieved from http://ezproxy.net.ucf.edu/login?url=http://search.ebscohost.com.ezproxy.net.ucf.edu/login.aspx?direct=true&db=eric&AN=EJ771257&site=ehost-live

  • Lehman, P. R. (1999). Introduction to the symposium on the “NAEP 1997 arts report card.” Arts Education Policy Review, 100(6), 12-15. Retrieved from http://ezproxy.net.ucf.edu/login?url=http://search.ebscohost.com.ezproxy.net.ucf.edu/login.aspx?direct=true&db=eric&AN=EJ624036&site=ehost-live

  • Mathematics teacher questionnaire. (2013). National Assessment for Educational Progress. Retrieved from http://nces.ed.gov/nationsreportcard/bgquest.asp

  • NAEP 1997 national theatre results. (2002). National Assessment of Educational Progress. Retrieved from http://nces.ed.gov/nationsreportcard/tables/art1997/sdt02.asp

  • National Assessment for Educational Progress (NAEP). (2012). Questionnaires for students, teachers, and schools. Retrieved from http://nces.ed.gov/nationsreportcard/bgquest.asp

  • Reading and writing teacher questionnaire. (2011). National Assessment for Educational Progress. Retrieved from http://nces.ed.gov/nationsreportcard/bgquest.asp

  • Teacher questionnaire. (2011). National Assessment of Educational Progress. Retrieved from http://nces.ed.gov/nationsreportcard/tdw/instruments/noncog_teach.asp

  • Writing teacher questionnaire. (2010). National Assessment for Educational Progress. Retrieved from http://nces.ed.gov/nationsreportcard/bgquest.asp
