For this assessment, students in grades four, eight, and twelve take tests in reading, math, writing, science, history, geography, civics, and the arts, although political and media interest in the NAEP has seemed to focus only on math, reading, and science scores. Schools from each of the fifty states are randomly selected for participation in this assessment, and its stated purpose is to provide data that will allow comparisons of states and of the nation's performance over time.

Are NAEP tests really different from the state tests that drive instruction in America's public schools? In fact, even the federal government's education website asserts that they are. Further, educational researchers Nichols, Glass, and Berliner (2005) conducted a study that confirmed these differences. They examined high-stakes testing pressure and NAEP scores in each state to see if more pressure to succeed on state tests led to improvements on the NAEP. Their data indicated that pressures put on fourth grade students to perform on state math tests correlated weakly with score improvements on the fourth grade NAEP math test. In other words, in states where there was a great deal of pressure to perform well on state tests (such as the threat of retention for a failing score), fourth grade students did a little better on the NAEP math assessment. However, these researchers did not find a correlation between pressures to perform on state tests and the eighth and twelfth grade NAEP math scores, or scores on any of the other subjects tested by the NAEP in fourth, eighth, or twelfth grade. As they explained, the math curriculum in the early grades is apparently more standardized across the United States than it is in higher grades.

Since Nichols, Glass, and Berliner (2005) could not establish a correlation on any other subject assessed by the NAEP, or in math at the eighth grade or twelfth grade, we can infer that there is a curriculum mismatch between some, or most, of the state tests and the NAEP in all areas except fourth grade math. In brief, with the exception of that one subject and grade level, the NAEP appears to be substantially different from many state tests.

As Gerald Bracey (2003b) noted, discrepancies between students' scores on state-mandated tests and the NAEP are often huge. For example, 91 percent of the eighth graders in Texas scored proficient on their state's math test in 2003, but only 24 percent of Texas's eighth graders scored proficient on the NAEP math assessment. Scholars and test statisticians agree that it is more difficult to achieve a score of proficient on the NAEP than it is on most state tests. Nonetheless, as Bracey suggested, it is incredibly ironic that the home state of President Bush and former Secretary of Education Rod Paige had the greatest discrepancy between a score on its own state test and the corresponding NAEP test of any of the fifty states.
We can understand this huge variation by considering the high stakes associated with performance on state-mandated tests in Texas. As the model for NCLB, Texas has been noted for exerting especially intense pressure on schools to have all students score proficient on its state assessments. If there is a substantial difference between the content and format of the NAEP and the tests used in Texas, we can logically expect large discrepancies in the number of students scoring proficient on the two sets of tests. Moreover, because NCLB now imposes on all states the same kinds of pressures that have characterized public education in Texas, we can only wonder if other states will soon begin to have lower NAEP scores as they try to achieve higher adequate yearly progress (AYP) goals on their own tests.

If that occurs, schools in those states will be caught in a troubling Catch-22. If they do not improve performance on their state tests in accordance with NCLB requirements, they will be labeled as failing schools. However, if they focus solely on their state tests to achieve compliance with AYP targets, they may perform less capably on the NAEP because their students did not learn the content it assesses. In either case, politicians and journalists will be quick to seize the opportunity to proclaim an educational crisis.

Rod Paige engaged in exactly that type of behavior when he was secretary of education. In a 2003 article, he criticized American students' civics achievement on the NAEP, noting that most fourth graders who took that test could not explain the meaning of the phrase "I pledge allegiance to the flag."

The Tennessee curriculum requires me to teach my fourth graders about the freedoms guaranteed by the Bill of Rights, the responsibilities of U.S. citizens, the roles of the three branches of government, the checks and balances in our Constitution, and numerous other civics concepts. However, as important as it is that children be able to explain the text of the Pledge of Allegiance at some point in time (perhaps when they are old enough to articulate abstract concepts such as allegiance), it simply isn't in the fourth grade curriculum in Tennessee, and that makes me wonder how many other states teach it at that grade level.

When we consider that teachers are evaluated on their children's success on state tests, that NCLB requires steady improvement on these state tests toward an ultimate goal of 100 percent proficiency, that students' scores on these tests are sent home each year, and that school results are published annually, it's obvious that teachers are going to focus on their own states' curricula and not the concepts tested by NAEP. And I suspect Rod Paige knew that when he criticized fourth grade civics achievement on the NAEP.

Especially since the passage of NCLB, state-mandated tests have driven classroom instruction in America's public schools. State-mandated tests are the most frequent topic of discussion at faculty meetings, administrative gatherings, and educator workshops.
