Best of, Worst of for Student Test Data?

Two data sets on student performance are out this week.  But what exactly does the data tell us?  And more importantly, what do we say about the data?

According to TIMSS, math and science scores for U.S. students simply aren’t keeping pace with the performance of students in foreign countries (particularly those in Asia).  We’ve heard this story time and again, but TIMSS provides some pretty clear data that we have a ways to go before our students are truly able to compete on the evolving global economic stage.

And then we have yesterday’s release of the Trial Urban District Assessment (or, as it is affectionately known, the NAEP TUDA).  This data set shows that students in our urban centers are making gains in math and reading.  And the math scores are showing real promise.  Of course, these urban scores are still below the national averages.

So what does it all tell us?  With regard to NAEP TUDA, one has to assume that some of the interventions made possible through NCLB are working.  Districts like Atlanta, Charlotte, Chicago, DC, and LA are prime targets for NCLB and Title I dollars, and these test results demonstrate that students in those schools are posting gains larger than those of the average American student.  That speaks of promise and of possibility.

But juxtaposed with the TIMSS data, it sends up a warning flag.  If we’re making gains at 2X, but our international counterparts are running at 3X, it doesn’t take a NAEP numbers cruncher to see that we are never going to catch up.  How are we supposed to read all this?
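To make that arithmetic concrete, here is a rough back-of-envelope sketch.  The numbers are invented for illustration only, not actual TIMSS or NAEP figures: if our students start lower and gain less per testing cycle than higher-performing systems, the gap widens rather than closes.

# Back-of-envelope illustration with made-up numbers (not real TIMSS/NAEP scores):
# a lower-scoring system that improves more slowly never closes the gap.
us_score, competitor_score = 500, 550   # hypothetical starting averages
us_gain, competitor_gain = 10, 15       # hypothetical gains per testing cycle
for cycle in range(1, 6):
    us_score += us_gain
    competitor_score += competitor_gain
    print(f"Cycle {cycle}: US {us_score}, competitor {competitor_score}, gap {competitor_score - us_score}")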

The communications challenge here is identifying our goals, both in terms of policy and public perception.  Do we seek to be the best in the world, or do we focus on the gains in our backyard?  Does it matter how we are doing against Singapore if our Title I schools are making the gains necessary to put all students on a pathway to a good-paying job?  And when are we going to see the quantitative proof of reading gains that we have witnessed anecdotally for the past two years?

At the end of the day, the message is simple.  Our schools, particularly those in low-income communities, are improving.  Our focus on student achievement, effective assessment, and quality teaching is starting to have an impact.  And by identifying what works in Houston, NYC, and other cities, we can glean what will work in other cities and towns across the country.  We’re gaining the data to move the needle and get beyond the student performance stagnation we’ve experienced for the past few decades.

Yes, the TIMSS scores are disappointing.  But sometimes we need to set the negative aside, and concentrate on the positive.  Let’s look at what works, and use it to fix what doesn’t.  Who knows?  Those NAEP TUDA students may be just the answer we need to right the TIMSS ship in four or eight years.


2 thoughts on “Best of, Worst of for Student Test Data?”

  1. That’s one of those great questions that more people should be asking.  If you look at today’s http://www.educationnews.org, it is a question they ask as well.  NYC has posted data that shows significant gains in math and reading in recent years.  But the NAEP TUDA shows relatively flat results for NYC.  We should be asking why the discrepancy exists and which students have led the overall improvement for NYC schools.  What does the disaggregated data tell us?
