The burning education issue facing most states at the moment is which tests they should give their K-12 students next year to satisfy the conditions of their waivers from the United States Department of Education (USED) or the commitments they made in their Race to the Top (RttT) applications, whether or not they received an RttT grant or other funds from the USED or the Bill and Melinda Gates Foundation.
The two testing consortia funded by the USED to develop common tests based on Common Core’s standards – the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium (SBAC) – have experienced dwindling state commitments. SBAC is down to fewer than 20 member states, and PARCC is down to possibly 9. Both consortia have been piloting test items across the states this past academic year to acquire the pool of items needed for computer-adaptive testing (by SBAC) and for gauging difficulty levels at all the grade levels participating in the assessments (K-11).
A new twist is the question of whether state boards, commissioners, and/or departments of education committed their states (i.e., the taxpayers) to particular testing companies and future technology costs without going through statute-mandated bidding procedures and cost-benefit analyses. New Mexico and Louisiana seem to be tied up in constitutional issues on contractual matters, while Arizona is trying to ensure it follows its own statutory bidding procedures.
What hasn’t been getting much attention from the mainstream media, possibly because most reporters have no children in Common Core-based classrooms and don’t talk regularly to parents of school-age children, is the set of problems students and teachers are encountering with the tests themselves and the similarities in the problems reported for the PARCC and SBAC pilot tests.
The information on PARCC’s pilot tests comes from school administrators in the Bridgewater-Raynham Regional School District in Massachusetts, as reported on June 18 in Wickedlocal.com. The article was based chiefly on what took place at a school board meeting in June, during which the School Committee voted unanimously to stay with MCAS, the state test, for the next academic year. At the meeting, the school administrators explained why they wanted to stay with MCAS, based on the experiences teachers and students had with the PARCC pilot tests the school district gave in the spring of 2014.
“It’s like telling our teachers, ‘We’ll teach you how to drive.’ But then the test says you won’t be driving cars. You’ll be driving boats,” said Bridgewater-Raynham school Superintendent Jacqueline Forbes of the PARCC exam. “It’s not aligning with our curriculum or instruction.”
Based on pilot testing, school officials said PARCC did not match up with Bridgewater-Raynham’s teaching methods and also contained numerous technological flaws.
“The one word I’d use to sum up our experience is ‘frustration,’” said Brian Lynch, an elementary school principal. “First, there were a lot of problems administering the test, which is taken on a computer – and the snags weren’t on the district’s end.”
“Second, the test requires students to be familiar with software programs the district does not teach,” Lynch continued. “The district uses a lot of technology, but students still take basic math tests on topics such as number lines and graphing using a paper and pencil.”
“Are we testing math or are we testing a child’s ability to drag and type?” asked Forbes. “We don’t teach typing in third grade. It’s not developmentally appropriate.”
According to high school Principal Angela Watson, the district piloted the PARCC Algebra I test to randomly selected ninth graders.
“Unfortunately, what we found is our written, taught and assessed curriculum doesn’t match up exactly with the PARCC exam. … It puts kids in unfamiliar territory,” Watson said. “It would take time and resources to make the switch to a curriculum that matches up with PARCC.”
Forbes, however, said that effort might turn out to be misdirected because other districts have articulated similar concerns about the PARCC test.
Regarding SBAC’s pilot tests, a recent letter by Fairgrounds Middle School Principal John Nelson to Nashua Superintendent Mark Conrad provided a disturbing picture, wrote the Nashua Telegraph in late January.
New Hampshire teachers had been asked by their local superintendent of schools to take an early version of SBAC in December 2013. According to the article, the teachers said the “new computerized test is confusing, doesn’t work well, and leads to frustration.”
In his letter to members of the Nashua Board of Education, Nelson said, “Teachers shared frustrations they had when they were taking the test and disappointment in test format and the difficulties they had trying to use their computer to take this test.”
His teachers agreed the test should not be used on Nashua students.
“The FMS staff collectively believe that the Smarter Balance Test is inappropriate for our students at this time and that the results from this test will not measure the academic achievement of our students; but will be a test of computer skills and students’ abilities to endure through a cumbersome task.”
Despite the teachers’ plea and support from Nashua’s teacher union, Conrad, the state board, and the state Department of Education refused to back down, leaving Nashua’s students with a test their own teachers consider meaningless.
As in Nashua and Bridgewater-Raynham, local reporters all over the country are likely reporting what is happening in their local schools as those schools pilot Common Core-based tests. But Congress, state legislators, governors, and other policymakers at the state and national levels are not getting an accurate picture of what is happening to the curriculum in our public schools or to the children in them.
Sandra Stotsky, Ed.D. is Professor Emerita at the University of Arkansas.