What Does a High Graduation Rate Prove?

A mantra fills the airwaves from the White House to the NCAA and on to the California governor’s mansion: keep graduating students from American colleges and universities. Keep the system of higher education humming. But what precisely does a graduation rate measure other than the completion of thirty, perhaps thirty-two, courses whose quality is unknown and whose instructors have varied talents?

The way graduation rates are measured is deeply flawed. President
Obama may equate these rates with a commitment to excellence, but many
in the Academy would disagree. In the 1980s, two former basketball
stars, Senator Bill Bradley and Representative Tom McMillen, made a
forceful pitch for colleges to publish the graduation rates of their
athletes. Although the NCAA eventually agreed to do so and others soon
followed suit, the law that emerged in the 1990s defined the graduation
rate as the percentage of full-time, first-time students who enrolled in
the fall and completed their degree within “150 percent of normal time”
– six years for students on a bachelor’s degree track.

However, with a third of students transferring from the college
where they started and “gap” years now taken in the middle of a
program, there is a disconnect between actual practice and the accepted
definition of a graduation rate. Yet even if one were to amend the
definition, the key component of graduation – requisite knowledge and
skills – would still be unknown. Do increases in completion rates over
time mean that quality is improving? As you might guess, no one knows.
In some cases where remediation is effective, completion rates may bear
some distant relationship to attainment. But the evidence is
incomplete.

When the United States had unparalleled attainment, this question
never arose. If the U.S. is really interested in improving its
competitive stance vis-à-vis other nations, we must look to other
indicators that measure the value of the education process. Can students
engage in basic math? Do they understand the fundamentals of science?
Can they write and speak clearly and cogently? Have they learned a
foreign language? Are they familiar with the unique civilization in
which they find themselves? Are they conversant with the great works of
Western thought?

These questions, which go to the essence of the college
curriculum, are rarely asked and are never tested. We simply do not know
what is imbibed in the four-year program or even the six years for
degree completion. That explains why many administrators prefer to rely
on the completion rate as a standard for achievement. It is a little
like asking how many oranges a person ate instead of the nutritional
value derived from the fruit.

All university administrators contend that their colleges offer a
quality degree. But in the absence of standards, it is impossible to
know what students have learned. It would be refreshing if a college
dean stood up and said, “My students read Shakespeare, studied calculus,
appreciate American history, and have read the Bible, Plato, and
Aristotle. They are prepared to be tested in the disciplines.”

Now that would be something. Instead we are likely to be bogged
down in the matter of completion rates since this is measurable,
understandable and irrelevant. Whoever said academics want to be
relevant?
