After yet another investigation into alleged cheating on DC Public Schools’ student achievement tests, DCPS officials yesterday announced that they were tossing out the standardized test scores for three classrooms. If one reads between the lines, it appears that the current action was based on allegations that someone altered the beloved bubble tests after the students took the exam.
This follows on the heels of similar allegations in Atlanta last year, which forced the resignation of long-time Atlanta Public Schools Superintendent Beverly Hall. And, of course, this isn’t the first time that DCPS has investigated alleged altering of the bubble sheets on its exams. The same charges were leveled just a few years ago.
For the past few years, we have heard EdSec Arne Duncan rail against the dreaded “bubble test.” And while the good EdSec may be taking issue with such exams for a very different reason, he is correct. The days of No. 2 pencils and scanned bubble sheets should be over.
With a growing chorus of opposition to bubble tests, with allegations of cheating on said tests on the rise, and with those pencil-and-scan-sheet exams viewed as a general enemy of the educational process, it raises some essential questions. Why aren’t we testing through other means? In our 21st century learning environment, why do we still use 19th century testing approaches? Can we build a better testing mousetrap?
Those first two questions are typically answered with the usual responses. Change is more difficult than the status quo. We fear the new. If it isn’t truly broken, why try to fix it? It costs too much, either in dollars or in stakeholder chits. We don’t know enough yet (maybe we can form a committee to explore). It just isn’t a high enough priority.
As for the last question, though, we have already built a better mousetrap. A few states have begun using online adaptive testing, demonstrating promising practice (on its way to best practice). The gold standard, at this point, is Oregon’s OAKS Online, or the Oregon Assessment of Knowledge and Skills. Following on its heels are similar online adaptive assessment systems in Hawaii and Delaware. And with a $176 million grant from the U.S. Department of Education, the SMARTER Balanced Assessment Consortium (led by the State of Washington) is looking to develop a similar assessment framework to measure the K-12 Common Core State Standards.
Why these new systems? To the point, they seem to assess student achievement and learning faster and better than ye olde bubble sheets, at a lower cost to the states. From a practical point of view, they hopefully bring testing up to speed with instruction and learning. If we are serious about a 21st century education for all, it only makes sense that we would couple that with 21st century assessment. And that just isn’t done with a stick of wood and some graphite.
So in looking at alleged issues in DC, Atlanta, and elsewhere, the last question we should be asking is how to avoid erasures on tests or how best to detect systematic changes on bubble sheets. Instead, we should be asking why we aren’t using a more effective testing system in the first place, a system that better aligns with both where we are headed on instruction and how today’s — and tomorrow’s — students actually learn.
* Full disclosure — Eduflack does work related to the assessment efforts in Oregon, Hawaii, and Delaware.