But what really matters is how the Department of Education uses that information. After all, the Department is in the best position to use the criticism constructively to improve our children's education.
So how has the Department responded to the assortment of recent education report cards?
Of the serious criticisms offered by the Fordham Foundation last month, acting Education Commissioner Marge Petit said, "There are lots of things in [the survey] that our system doesn't value and hasn't implemented." She added, "This is somebody else's agenda. It's not our agenda." Others disagree -- an alternative viewpoint was discussed in a recent editorial.
When the American Federation of Teachers evaluated the clarity, specificity, and content of each state's academic standards in November, and Vermont distinguished itself as the only state in the country that failed to meet AFT standards in every single subject and at every grade level, Petit replied, "curriculum is the domain of the local school district..."
Now Education Week has again joined the chorus of disapproval directed at Vermont's school reforms, giving Vermont poor grades for standards, accountability, and efforts to assure teacher quality (a review of last year's report is here). Petit said these criticisms were cause for concern but not panic, adding that many of the criteria used by Education Week are not valued as highly by Vermont education policy makers.
What should parents think of all this? Are we comfortable with the judgments of "Vermont education policy makers"? Should we take the report cards "with a grain of salt," as Governor Howard Dean suggests, because "the ideologies of the people doing the survey" are suspect? Is the Vermont Department of Education simply operating Vermont's schools on a level far beyond what the national critics can comprehend or appreciate?
Well, one way to form an opinion is to look at student outcomes. How well are Vermont's students doing?
How well are our students doing?
Quite well, of course, according to the Vermont Department of Education. Commenting on the low grades Vermont received for academic standards and accountability, Petit told the Associated Press that Education Week "failed to note that Vermont students were among the best performers in math and science in the country in a national assessment of educational progress."
Oh really? Let's have a look at the results from the National Assessment of Educational Progress, the "Nation's Report Card," summarized on the Education Week web site and reported in detail on the National Assessment of Educational Progress web site.
We find that on the NAEP tests:
Acting Commissioner Petit issued another statement last month about the superior performance of Vermont's students. This one was posted on Vermont's public political issues discussion Listserv (VTFORUM) on December 5th. She wrote:
This study used statistical procedures to put 8th grade NAEP scores in math and science in 1996 on a scale with the scores of various nations that participated in the 1995-96 Third International Mathematics and Science Study (TIMSS) of 8th graders. Using statistics in this way is controversial and the authors caution that "the technique used to link the two tests can provide only limited information, since NAEP and TIMSS cover different content and were taken by different groups of students at different times."
Nevertheless, the NEGP report suggests that Vermont is one of 15 U.S. states that would have been expected to score AS WELL AS students in 17 other nations, and better than students in 23 other nations. And yes, Vermont and the other states and nations were statistically inferior to Singapore in 8th grade science in 1995-1996.
These results hardly seem worth getting excited about, and it seems just a bit disingenuous to say, "Vermont was found to score second only to Singapore." It must also be noted that the TIMSS showed U.S. 8th grade students to be below average in math, and that the NAEP report did not place Vermont among the top states in math.
Perhaps more important than the above, however, are the results of the TIMSS study of 12th graders released in 1998. The following is quoted from the official press release (found here):
The issue becomes even more serious when we consider Vermont's current means of measuring student performance. Results of last year's statewide testing have been trickling out, and many parents are anxious to see how we measure up. That's why many were quick to read recent front page stories such as, "Most English test scores show improvement" (Burlington Free Press, 12/23/99).
One couldn't find an actual test score anywhere in the article.
Instead of test scores, the Vermont Department of Education reported percentages of students who "met or exceeded the standard." But wait; what was the standard? Was it eighty percent competence? Forty percent? Even portfolio assessments are corrected using criteria that yield scores. Where are those scores? What considerations went into deriving a standard from those scores? Was the standard the same for all schools? How does the standard relate to performance on various objective tests used nationally, or to college admissions standards?
Based on the pass/fail pseudo-statistics provided, the Vermont Department of Education may be able to claim a slight reduction in the number of students who fall short of a particular measure of adequacy. But by withholding the actual test scores and other comparative data, the state reveals nothing about the performance of students who met the standards, or the effectiveness of our schools in serving them.
Taken together, it does not appear that the Vermont Department of Education has established itself as being above the criticisms of the Fordham Foundation, Education Week, and the American Federation of Teachers. Our students are not doing badly, but sometimes it's hard to tell, and there's certainly room for improvement. So let's cut out the statistical manipulation and exaggeration, and when national groups criticize us, let's listen -- and take action!