fwd: teachers and tests

Peter Smagorinsky (smago who-is-at peachnet.campus.mci.net)
Sun, 20 Sep 1998 10:34:46 -0400

I wrote the following op-ed piece (rejected by the LA Times) last summer in
response to the MA testing situation. It includes some information from
Gerald Bracey's excellent book, as well as data from the NALS on teachers'
literacy levels already discussed/posted on this list. It is also interesting
to note
re: MA that: (a) no teachers actually took the test, only teacher candidates.
Judging the quality of teachers from those who fail the entrance exam is
like gauging the quality of accountants by the percentage who fail the CPA
exam, or lawyers by the percentage who flunk the bar. If you don't pass,
you're not a
teacher/accountant/lawyer; (b) the literacy portion of the test did not
consist of merely defining words like 'abolish,' as has been reported both on
the AP wire and in the LA Times. Part of the test was a dictated 200-word
passage which examinees had to spell and punctuate correctly. According
to the Boston Globe, the passage, chosen by Silber, was from the
(18th-century) Federalist Papers.
Jeff McQuillan, CSU-Fullerton
---------------------------------------------------------

Are Teachers Stupid?
Jeff McQuillan

The release this past week of minimal competency test scores for a group of
Massachusetts teacher candidates has set off the latest round of public school
bashing among eager politicians and "expert" pundits. Nearly 60% of the
prospective Massachusetts teachers who took the test failed it, leading to
calls to rid our schools of, to use the words of Time magazine, "middling
teachers."

The accusation that teachers are stupid is not a new one. The "disaster" of
Sputnik in the late 1950s was blamed in part on allegedly incompetent teachers.
Political appointees in the Reagan Department of Education similarly pointed
the finger at teachers for putting the nation "at risk" in the early 1980s.
But as in the case of these earlier "crises" in U.S. education, there is
currently neither widespread academic failure that needs explaining, nor
evidence that teachers are falling short of the mark.

Stupid Students?
Despite rhetoric to the contrary, American schoolchildren are, on average,
performing as well as, and in some cases better than, their peers of the past
three decades, a point made by David Berliner and Bruce Biddle in their 1995
book, The Manufactured Crisis. Test results in both reading and math from the
U.S. Department of Education's "report card" on schools, the National
Assessment of Educational Progress, have been essentially rock solid since they
were first collected in 1971. The academic performance of our fourth, eighth,
and twelfth graders has been remarkably stable over the years, despite the
often limited funding and the massive demographic changes that have taken place
in U.S. schools. In the last round of international testing, U.S.
nine-year-olds ranked second in the world in reading, and we were near the top
among fourteen-year-olds. In math and science, our students performed at the
same levels as students in other leading industrial nations when compared with
peers of similar educational experience. In short, there is no crisis in U.S.
schools,
and there never has been. For this, we should be thanking our teachers, not
dumping on them.

Stupid Teachers?
The fact that a certain percentage of teachers have failed a minimal competency
test is, in and of itself, evidence of precisely nothing when it comes to
judging how "smart" teachers are. First, the passing grade needed for such a
test is, like almost all such tests, arrived at rather arbitrarily. It
certainly has no necessary connection to how effective a teacher might be when
faced with a group of energetic fourth graders. Second, the results tell us
nothing about how teachers compare to other professions in their reading or
math abilities. Would accountants, biologists, or politicians have done any
better on the Massachusetts test? We don't know; they didn't take it. If the
public is truly interested in measuring the "basic skills" of teachers, then it
makes more sense to compare their abilities to those of their professional
peers on a common measure of knowledge.

In fact, as Gerald Bracey points out in his book, Setting the Record Straight
(1997), the U.S. Department of Education conducted just such a comparison back
in 1984. A study commissioned by the Department found that college freshmen
who indicated that they intended to become teachers had high school GPAs
comparable to those of students who planned on majoring in arts and humanities,
science, or business administration. The same was true for college GPA: at the
end of their sophomore year, those intending to teach had a GPA of 2.88; other
majors
had an average GPA of 2.87. This is hardly an indication of widespread
mediocrity. As Bracey points out, these GPA reports came before these students
would have taken any of the supposedly "soft" courses offered by Schools of
Education for their own majors. An interesting postscript to this story: the
Department of Education "spiked" this commissioned report, declining to publish
it, although it was archived in its public database.

Even more direct evidence on the competency of teachers comes from the National
Adult Literacy Survey (NALS), a series of reading comprehension tests
administered by the Department of Education in 1992. The NALS was based on a
nationally representative sample of American adults from a wide range of
professions, and, more importantly, measured the reading proficiency of "real"
teachers, not just those planning on going into education while in college. So
how did teachers fare? Their overall score on the three tests of literacy was
the same as that of adults from other well-educated professions, including
those working in business and the sciences. This should come as no surprise to
anyone even remotely familiar with real-life teachers, over 50% of whom hold a
Master's degree.

None of this is to deny that there are lousy teachers and poorly performing
students. But the evidence simply does not indicate that we have a crisis on
our hands in either area. Perhaps we need a minimal competency test for
politicians and media commentators, one which requires them to check their
facts before jumping on the latest school bashing bandwagon.