An interesting graphic came across my screen this week. The purpose was to call attention to the hours spent testing elementary students by comparing them to the tests for college or graduate school:
Standardized testing is not new to schools in the State of Connecticut. Many schools will be piloting the Smarter Balanced Assessment Consortium (SBAC) tests this year for state testing. The new testing schedule will be the same as the NY State tests. The SBAC website provides testing times:
Both charts illustrate the number of hours that elementary, middle, and high school students will sit in order to take tests to measure their achievement in meeting the Common Core State Standards (CCSS). The SBAC tests will be given over a period of week(s), and scheduling may depend on the number of available computers that meet the testing software criteria.
Each sitting will match the minimum amount of time an older student sits for college and law school entrance exams. While these entrance exams (the SAT, LSAT, and MCAT) are typically taken only once, the SBACs are taken annually in grades 3-8 and again in grade 11. Consider that an average student's experience taking the SAT is a little under four hours, while a student will take the SBAC repeatedly for a total of 52 hours over the course of one academic career. Yet the hours spent taking a test are not the only hours committed.
Washington Post education reporter Valerie Strauss cited a study by the American Federation of Teachers in her July 25, 2013, article “How much time do school districts spend on standardized testing? This much.” The report compared “two unnamed medium-sized school districts — one in the Midwest and one in the East” and determined that:
The grade-by-grade analysis of time and money invested in standardized testing found that test prep and testing absorbed 19 full school days in one district and a month and a half in the other in heavily tested grades.
The percentage of time for SBAC testing is roughly 0.7% of the school year (based on an average of 1,100 school hours/year), but when test preparation is added (e.g., 19 days), that percentage jumps to 11%. This jump is enough to make the time for test preparation equivalent to a year of physical education classes. Ironically, research is proving that physical education may be the best kind of test preparation.
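The arithmetic behind those percentages can be sketched in a few lines. This is a back-of-envelope check, not figures from the article's charts: it assumes roughly 6 instructional hours per school day and spreads the 52 total SBAC hours across the 7 tested grades (3-8 and 11).

```python
# Back-of-envelope check of the testing-time percentages.
# Assumptions (not from the article's charts):
#   - ~6 instructional hours per school day
#   - 52 total SBAC hours spread over 7 tested grades (3-8 and 11)
school_hours_per_year = 1100          # average from the article
sbac_hours_per_year = 52 / 7          # ~7.4 hours in a tested grade
prep_days = 19                        # low-end district in the AFT study
prep_hours = prep_days * 6            # ~114 hours of test preparation

testing_pct = sbac_hours_per_year / school_hours_per_year * 100
total_pct = (sbac_hours_per_year + prep_hours) / school_hours_per_year * 100

print(f"Testing alone: {testing_pct:.1f}% of the school year")
print(f"Testing plus preparation: {total_pct:.1f}%")
```

Under these assumptions, testing alone comes to about 0.7% of the school year, and testing plus preparation to about 11%, matching the figures above.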
An article by Dr. Catherine L. Davis and Dr. Norman K. Pollock detailed some of the more recent studies on the relationship between physical education and cognition, noting that “benefits have been detected with 20 minutes per day of vigorous physical activity.”
Their paper, “Does Physical Activity Enhance Cognition and Academic Achievement in Children?”, determined that “incorporating 40 minutes per day of vigorous activity to attain greater cognitive benefits would require additional programs available to children of all skill levels.” They concluded that:
In a period when greater emphasis is being placed on preparing children to take standardized tests, these studies should give school administrators reasons to consider investing in quality physical education and vigorous activity programs, even at the expense of time spent in the classroom. Time devoted to physical activity at school does not harm academic performance and may actually improve it.
Schools are motivated to try different strategies in order to improve test scores. The data from standardized tests are used to determine the effectiveness of curriculum as well as individual student performance. Standardized test scores are also increasingly used as a metric in teacher evaluations. In the State of Connecticut, test scores could count for as much as 40% of a teacher's performance review, with the spotlight on those educators who teach in testing grades 3-8 and grade 11.
Paradoxically, the focus on standardized testing as an evaluation tool is a contributing factor to the increasing commitment of time and resources to test preparation. Next generation tests like the SBACs will be taken on computers that will require school systems to invest in computer hardware that meets specific criteria. The cost of the hardware and practice software could be justified by increasing the number of students who will take the tests.
Additionally, those who fund education want tests that run on this hardware to be an effective measure of student achievement, and these tests must be of a substantive duration to make the expense worthwhile. Given the commitment of time and money, students will continue to sit for tests and test preparation, perhaps for even longer periods in the future.
What might students think about sitting for all these standardized tests?
They might borrow the words of their favorite author, Dr. Seuss, “And we did not like it. Not one little bit.”