All posts by Peter Wood

Peter Wood is president of the National Association of Scholars and author of “Diversity: The Invention of a Concept.”

The U. of Chicago’s Flawed Support for Freedom of Expression

In January 2015 the University of Chicago Committee on Freedom of Expression issued a brief report that eloquently made the case for the importance of free speech as “an essential element of the University’s culture.”  I commented approvingly at the time.  Over the ensuing months, the Chicago statement has gathered more and more approval.  In April the faculty of Princeton University incorporated much of the Chicago statement into a statement of their own.  On September 28, the Foundation for Individual Rights in Education (FIRE) launched a national campaign asking colleges and universities to adopt the statement.  The American Council of Trustees and Alumni (ACTA) has also urged over 19,000 trustees to embrace it.

In an era when student activists on many campuses are attempting to silence expression of views they disagree with, the University of Chicago statement is a welcome counter-measure.  It is easy to see why principled scholars and organizations concerned about the integrity of the university are drawn to it.

Not the Whole Loaf

But I urge caution.  The Chicago statement is, in effect, half a loaf.  And sometimes half a loaf can be worse than none.  The basic problem with the statement is that it presents a context-free defense of freedom of expression. It does not offer any reason why such freedom is important and, in the absence of such a reason, it amounts to an endorsement of much of what is currently wrong with our colleges and universities.

To be sure, there is much in the statement that is attractive and endorsing it makes sense as a tactical move against “social justice warriors” who want to preempt important debates.  If the Chicago statement were to be understood as mainly a call for the university to respect the rights of outside speakers to have their say, regardless of viewpoint, it would be welcome without any serious reservations. But the statement does not contextualize itself to outside speakers and appears to apply equally to speech within the university.  The differences between outside speakers and speech within the university, however, are profoundly important.  The latter involve considerations that the Chicago committee ought to address but did not.

Four Flaws

In that light the statement has some serious flaws as an enunciation of general principles.  It is easy to imagine new circumstances in which the positions set forth in the Chicago statement would themselves become impediments to good education.  Indeed, some of these circumstances are already here.

There are four flaws. The statement ignores the need for true speech, wrongly elevates free speech over teaching, fails to say why free speech is important on college campuses, and is conducive to the further trivialization of the university.

First, the statement says “freedom of expression,” “freedom of inquiry,” and “freedom to debate” are “fundamental” to the university.  Surely they are.  The trouble is that other principles are no less fundamental.  One might think of the pursuit of truth; the obligation to distinguish the important from the trivial; integrity in research; respect for freedoms besides academic freedom; and genuine care for the welfare and educational prospects of students.

 The Pursuit of Truth

I grant that most of these do not spring readily to mind for faculty members who are not at the moment faced with a conflict, say, between freedom of expression and the pursuit of truth.  But such conflicts are never far off.  People lie, frequently.  Freedom of expression permits lies and misrepresentations and, up to a point, protects the liar in his exercise of the right.  The “fundamental” regard of the university for freedom of expression, however, is in direct tension with the fundamental regard the university must also have for the truth.  How does the Chicago statement handle this?  It is silent on the matter.  The statement does indeed say that “freedom to debate and discuss” is not absolute.  That freedom must bend in some cases:

The University may restrict expression that violates the law, that falsely defames a specific individual, that constitutes a genuine threat or harassment, that unjustifiably invades substantial privacy or confidentiality interests, or that is otherwise directly incompatible with the functioning of the University.

Defamation, threats, harassment, and violations of privacy are out.  But on matters such as fabrication of data, perjury, deliberate historical misrepresentation, suppression of discrepant evidence, false testimony, plagiarism, and the like, the statement says nothing.

I would not infer from this that the University of Chicago Committee on Freedom of Expression regards these as inconsequential matters.  Rather, it was charged with addressing the principle of freedom of expression and it did so, leaving various complications aside.

A Dangerous Mistake

Perhaps the University of Chicago is a community so well ordered that it could trust itself to deal with “freedom of expression” as an isolate—a fundamental that need not be considered in the company of other fundamentals.  Be that as it may, I think it is a dangerous mistake for colleges and universities across the country to adopt this statement as is, without creating a conceptual context in which other fundamentals are given due consideration and weight.

Let me acknowledge that extending the discussion in the direction of other such fundamentals may well prove difficult and frustrating.  A simple statement of principle—that freedom of expression is fundamental for a university—can be pure and inspiring.  Recognition of counterbalancing and sometimes contradictory principles that must somehow be made to mesh is less an occasion for rhetorical triumph.  But it returns us to the real world of higher education where flawed men and women struggle for achievement in a special kind of community.

Where’s the Word ‘Curriculum’?

Second, the Chicago statement treats freedom of expression as something unmoored from the curriculum.  Indeed the words “curriculum” and “course” (in the sense of an academic course) never appear in the statement.  The only example of freedom of expression that is cited is a Communist Party candidate invited to speak on campus by a student organization.  All of the other statements on academic freedom are hortatory declarations of the abstract principle.  But the reality is that colleges and universities must make practical choices to teach this subject, and not that one.

If they have—as the University of Chicago has—a core curriculum, they must decide which disciplines should be represented, and which not.  No university is so large that it can encompass every subject.  It must make choices, just as the individual faculty member must make exclusionary decisions in every syllabus and every time a class meets.  The license to make these choices is part of academic freedom but it is a particularly fraught aspect of academic freedom because it presents the question:  Who decides?

One approach maximizes the autonomy of the individual faculty member to teach what he wants to teach.  But even the university that leans far in this direction reserves the key power to approve a course or not.  And the best colleges and universities devote great care to the work of shaping their academic programs.

In short, the “fundamental” right of free expression is dramatically limited in the single most important context of higher education:  the college or university’s decisions about what should be taught.

What Should Be Taught

The Chicago statement in this regard sounds like a dream of faculty members reveling in the idea that free expression can be upheld as the governing principle of an institution that is in fact ruled by a dramatically contrary principle:  the need to provide students with a coherent education.

Third, the assertion of the “fundamental” value of freedom of expression sidesteps the underlying rationale for free expression.  The statement treats free expression as so integral to the university that no explanation is needed; just assertion.  Perhaps this reflects disagreement among the committee members on what the rationale for free expression should be, but the omission is odd.  In my view what makes free expression fundamental is that it prevents sleepwalking.

It treats every idea as open in principle to challenge, which means even the best ideas must be maintained by alert, intelligent, and informed people who are ready with good arguments and robust evidence, and who are also ready to put in the necessary time and effort to defend them.  Free expression exists as the antidote to intellectual complacency and the slumber of settled propositions.  It does not allow “consensus” or appeal to the authority of either the crowd or the expert to settle a dispute.

But if I am right about this rationale, free expression should lean towards these ends.  Free expression should not itself be a cover for mob rule (“consensus”), mere doctrine, or efforts to shut the door to further inquiry.  The rise of “studies” departments that are little more than ideological satrapies on campus does not jibe with free expression.  To hold a legitimate place in the community of higher education, a field of study must be willing to treat even its most basic ideas as hypotheses that are open in principle to challenge, not as matters of settled belief.

Chicago vs. Yale

The Chicago statement veers away from any such understanding of freedom of expression.  As far as the statement goes, all expressions enjoy the same title to “freedom of expression.”  That’s a view that comports pretty well with the First Amendment, but comports very poorly with the reasons why higher education values freedom of expression.

As it happens, the Chicago statement can be usefully contrasted with an earlier statement of freedom of expression issued in December 1974 by Yale, as the result of the deliberations of a committee appointed by President Kingman Brewster.  The Yale statement is every bit as vigorous in its support for freedom of expression as the Chicago statement, but at 31 pages, it is longer than the Chicago statement (3 pages), more in-depth, and attentive to complications that the Chicago statement ignores.  Perhaps most importantly, the Yale statement explains why freedom of expression should matter to a university.  Its first sentence declares: “The primary function of a university is to discover and disseminate knowledge by means of research and teaching.”  There is nothing comparable to this in the Chicago statement.

Fourth, the Chicago statement rests easily with the post-modern and (ironically) the anti-foundationalist condition of the contemporary university.  In treating free expression as an end in itself and divorcing it from any concern about the processes that establish and dis-establish intellectual authority, the statement gives license to the forces that have brought on the regime of triviality, curricular incoherence, narcissistic teaching, and intellectual aimlessness that have beset so many colleges and universities.  Again, these conditions may not prevail at the University of Chicago, but when other colleges and universities emulate or adopt the Chicago statement, they are also giving their imprimatur to an education that endorses formless exploration over purposeful inquiry.

An Unmoored Freedom

These four flaws in the Chicago statement have a kindred character.  All four have to do with the superficiality of the statement, which leaves out essential things:  other fundamentals; the shaping of the curriculum which is necessarily guided by principles above and beyond freedom of expression; the purpose of the college to which freedom of expression is necessarily subordinate; and the tendency of an unmoored freedom of expression to perpetuate the intellectual weaknesses of the contemporary university.

There are, of course, competing views about the purposes of higher education, an institution that must somehow blend discovering new knowledge, transmitting existing knowledge, sustaining the legacy of civilization, shaping character, preparing students for productive lives, and teaching students how to live responsibly in freedom.   Free expression is a vital component of several of these ends, and none more so than the last.  We uphold freedom of expression in large part to teach students to become citizens who can govern themselves wisely in our representative democracy.  But that requires that we understand this freedom not as an end in itself but as purposeful—which in turn means that we must pay attention to its purposes.

I’d recommend that the University of Chicago continue the work of the committee that wrote the statement by asking and answering the follow-up questions:  Why is freedom of expression important?  How does it advance the education of students?

It is in the spirit of the Chicago statement to welcome debate.  As far as I can tell, there has been little or no debate over the statement itself.  I offer these four points for the consideration of any college or university that is considering FIRE’s invitation to endorse the Chicago statement.  And I offer them as well for the benefit of the University of Chicago, which would seem to welcome, to borrow President Robert M. Hutchins’ aperçu, the kind of “free inquiry [that] is indispensable to the good life.”

College Scorecard: How Much Will You Earn?

Sixty-six percent of the graduates of my alma mater earn more than people who have only a high-school diploma.  This fact comes courtesy of the U.S. Department of Education’s new “College Scorecard.”  I took advantage of the online interactive system to see how well Haverford College alumni stack up in the race to achieve financial stability.

The new College Scorecard has been pretty well received since it debuted on September 12.  It replaces a version that debuted in February 2013 but lacked much of the financial data President Obama promised in his 2013 State of the Union speech.

Hillsdale Excluded

The new, more data-rich version has occasioned reflections ranging from worries about the “more than one out of every three student borrowers nationwide” who fail to “make any progress in repaying their loans,” as Michael Stratford put it in Inside Higher Ed, to complaints that the Obama administration abandoned the rankings it had promised would be part of the new Scorecard because of pressure from college presidents and “organizations,” as NPR put it.  Meanwhile conservatives noted that the Department of Education had simply excluded from the Scorecard colleges such as Hillsdale and Grove City.

The snub to Hillsdale was especially interesting.  Obama had promised the Scorecard would cover “every institution of higher education.”  John Hinderaker on Powerline picked up the story that Assistant Press Secretary for the Department of Education Denise Horn defended Hillsdale’s exclusion on the grounds that the famed liberal arts college primarily awards “certificates” rather than bachelor’s degrees.  This is simply false, and it is a little disconcerting that another federal project aimed at creating greater transparency in an important sector of the economy has been launched trailing clouds of obfuscation.

But it is probably better to take the Scorecard for what it is rather than for what is missing.  It is a scorecard that declines to say who is winning or even what all the teams are, but it does provide vast quantities of data if only we can figure out how to make sense of the numbers.  Here I will try my hand at that, starting with Haverford.

Diving Into the Numbers

That 66 percent of Haverford grads who out-earn their high-school-only counterparts is a number that in pristine isolation doesn’t mean much.  If I had to guess, I would have thought more than two-thirds of the ‘fordians abroad in the big world would be out-earning the kids who decided to live the lifestyles to which a high school diploma alone entitles you.

Now, when I think about it, I see the complications.  Some Haverfordians pursue self-sacrificial career choices.  They spend their lives bearing witness to social justice crusades that Don Quixote himself would have thought lunatic.  They turn conservative and seek careers in higher education, where they are relegated to Flying Dutchman lives as perpetual adjuncts.  You get the picture.  Haverford, with its active Quaker tradition, may be a little deficient in stoking the profit motive in its young charges.

And on the other side of the equation, some high-school grads have the Midas touch.  They get at least a four-year advantage in acquiring marketable skills and seniority. And if they have the knack for earning money by building, repairing, selling, cooking, or renting things, they can thrive in this America.

So maybe 66 percent of my fellow grads out-earning their high-school counterparts is reasonable.  What I really need to do is see how that 66 percent matches up with other colleges.  But maybe first I’d better check the fine print in the College Scorecard.

Look at the Fine Print

The Big Print says “Salary after Attending” Haverford is $55,600.  The fine print explains that this means “The median earnings of former students who received federal financial aid, at 10 years after entering the school.”  The 66 percent figure likewise turns out to have some qualifiers.  It refers to the percentage of former students who earn more than $25,000, “the average earnings of a high school graduate aged 25-34, 6 years after they first enroll.”  Got that?

I am suddenly struck that a third of the graduates under age 34 are earning less than $25,000.  Perhaps they are spending their 20s in graduate programs, writing dissertations, doing post-docs, and making ends meet with odd jobs.  That was pretty much my life.  Or they have enrolled in law schools in the ill-founded expectation that a lucrative career at a major law firm would be waiting three years out, and are now hustling real estate or tending bar.

There is this little consolation, written into every College Scorecard graph.  The national average earnings for the up-to-10-years-out is $34,343.  So ten years after graduation, the average Haverfordian has a premium of $30,600 in annual salary over the average high-school-only graduate, and a $21,257 premium over the average college graduate. That sounds like a pretty good deal.
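Those two premiums follow directly from the Scorecard medians quoted above.  Here is a quick back-of-envelope check (the variable names are mine, for illustration only):

```python
# Rough check of the earnings premiums quoted above.
# All figures are the medians cited in the text, not individual outcomes.
haverford_median = 55_600   # "Salary after Attending" Haverford
hs_benchmark = 25_000       # Scorecard benchmark for high-school graduates
national_average = 34_343   # national average earnings, up to 10 years out

premium_over_hs = haverford_median - hs_benchmark                # $30,600
premium_over_average_grad = haverford_median - national_average  # $21,257

print(premium_over_hs, premium_over_average_grad)
```

Both differences match the figures in the paragraph above, though the comparison is crude: it sets one school’s median against national benchmarks.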

Especially since the average annual cost of attending Haverford is $18,853.  That figure is also from the College Scorecard. It includes only students who take federal financial aid.  The Scorecard also breaks it down by family income.  A family with under $30,000 in annual income pays on average a net Haverford bill of only $5,685.  Oddly the average cost falls for families in the $30,000 to $48,000 range to $5,599.  Then it quickly escalates:  $15,612 for family incomes up to $75,000; $18,476 for family incomes up to $110,000; and $38,323 for family incomes above that.

College Grants

The College Scorecard doesn’t say, but Haverford’s official tuition is $48,656; room and board is $14,888; and the student activity fee is $442, for a grand total of $63,986 per year.  So those net college costs reported by the College Scorecard represent hefty discounts from the sticker price.  In fact, more than half of Haverford students also receive “college grants” and these grants average $40,014.
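The discount implied by those figures is easy to verify.  A minimal sketch, using the sticker-price components and the Scorecard’s average net cost quoted above (again, the variable names are mine):

```python
# Sticker price vs. Scorecard net cost, from the figures in the text.
tuition = 48_656
room_and_board = 14_888
activity_fee = 442
sticker_price = tuition + room_and_board + activity_fee   # $63,986 per year

average_net_cost = 18_853   # Scorecard average annual cost (federal-aid students)
implied_discount = sticker_price - average_net_cost

print(sticker_price, implied_discount)
```

That puts the average discount from the sticker price at roughly $45,000 a year, in the same neighborhood as the $40,014 average “college grant.”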

Putting costs and potential income together, one could conclude that Haverford is a reasonably wise “investment” for a young person who seeks a liberal arts education without undue risk of poor earnings or insupportable debt.  “Typical total debt” for Haverford graduates, according to the Scorecard, is $13,854.  The fine print explains, however, the “total” in “typical total debt” isn’t total at all.  It is just total federal student debt—excluding private debt and debt secured by students’ parents, such as Federal PLUS loans.  Nor does “typical” mean typical.  $13,854 is a median figure, and only 20 percent of Haverford students receive federal loans.

So it is not surprising that a robust 95 percent of Haverford graduates who took federal student loans have paid “at least $1 of the principal balance” within three years of leaving school.  I do wonder about the remaining 5 percent who could not scrounge up even that much.  The national average among college students paying down their debt is 67 percent.


My apologies to readers who have steadfastly walked beside me through those numbers.  The main things to take from them, I would say, are that the Department of Education has assisted a very expensive college in its efforts to look affordable and that the DOE has also advanced the narrative that traditional colleges are still a financial bargain for most of the students who attend them.

To go deeper than this requires that you make comparisons, and the Scorecard certainly lends itself to both consumer shopping for the highest rates of return on “investment” in college expenses and to various sorts of ranking.  NPR’s Planet Money team provided some of the rankings that the Department of Education decided not to.  The Planet Money team came up with several analyses.  Anthony Carnevale’s list offers no great surprises:  his rankings, which blend income and some other factors, put Harvard first, with a median wage for graduates ten years after entry of $87,200.  Next are MIT, Princeton, Stanford, and Babson.  The highest median earnings, however, come not to Harvard grads but to MITers, at $91,600.  Number six on the list is the Georgia Institute of Technology, at $74,000—then Georgetown, the University of Pennsylvania, and “University of the Sciences in Philadelphia” (the new name for the former Philadelphia College of Pharmacy), and so on.

Other Planet Money lists focus on colleges that emphasize upward mobility and colleges that leave students with “little debt and good financial opportunities.”  The lists differ in appreciable ways. Duke is number 13 on Carnevale’s list, absent on the upward mobility list, and number 1 on the best financial sense list.

We will be playing this new game for many years to come.  It is nothing to be especially happy about:  just one more step in the fatal march towards treating higher education as a commodity.


The data plays into the hands of those who are endlessly preoccupied with the forms of “inequality” in our society.  Kevin Carey, writing in the New York Times, observed, “the deeper that you delve into the data, the more clear it becomes how perilous the higher education market can be for students making expensive, important choices that don’t always pay off.”

Yes, the data show that, which one might say is a reason to be a little more cautious in how emphatically we speak of college as an “investment.”  Carey, however, turns his attention to the “earnings gender gap” revealed by the data.  At Duke, the median earnings for women graduates are $93,100—which is pretty nice.  But the median for Duke’s male alumni ten-years-out is $123,000.  What are we going to do about it? Carey doesn’t say but he is broadly on the side of “need-based financial aid to low-income students.”

Carey does give a nod to the danger of “defining higher education in purely economic terms.”  But the risk he sees arises from “corporatization of the modern university,” which scants the need for students to learn to be better citizens, and the need for “dancers and poets” as well as “investment bankers and tech entrepreneurs.”

He does not go nearly far enough.  Higher education is about entrusting to each new generation the legacy of a civilization.  We learn—or we should learn—respect for reason, civil dialogue, the great accomplishments of art and science, the enormity of our failures, the profundity of our ideals, and a great deal more that makes us not just capable of carrying forward a society worth living in but eager to do so.  A college education rightly conceived prepares its graduates for leadership in that society, not just material success—and maybe not even material success, since a high income ten-years-out isn’t necessarily the only or the best mark of leadership.

Ideals Matter

To say these sorts of things, of course, is to risk a derisory smile or two.  The worldly wise know that money counts, and faced with enormous tuition bills and substantial debt, nearly everyone will consult the numbers first and the ineffable ideals maybe later.

But the ideals are, in the end, what matters.  There would be no college education for anyone if Western civilization hadn’t created and sustained the conditions for higher education.  Our colleges and universities now coast on the considerable momentum of that achievement, but they do little to replenish it.  The College Scorecard is one more step downward towards a utilitarian calculus of learning—a calculus promoted far more by the egalitarian left than the freedom-minded right.

I suspect we would have been better off as a nation without having launched this particular invitation to compare paychecks, but there is probably no going back.  Those of us who care about defending liberal learning against the tendency to dissolve everything in the universal solvent of money have one more obstacle.  And no doubt our overpriced and profligate colleges and universities have brought this on themselves.

The Pressure of Group Thought

Academic “consensus” is in the news. Stetson University professor of psychology Christopher Ferguson, writing in the Chronicle of Higher Education, recently gave a run-down on how the American Psychological Association supposedly compromised itself by manipulating a task force into endorsing harsh interrogations of prisoners.  Ferguson says the APA “crafted a corrupted ‘consensus’ by excluding those who might disagree.”

Which is, of course, exactly how the dodgy “climate consensus” works too. “Climate consensus” is the rhetorical club wielded by the proponents of the theory of catastrophic man-made global warming. On campus—and in many other venues—to express the slightest doubt about the theory is to risk a “climate consensus” drubbing.  “Consensus” in this sense is pretty close to what John Adams warned in 1788 could become “the tyranny of the majority.”

The U.S. Constitution was meant to forestall that tyranny; but Americans also found other ways to hold back the eagerness of proud majorities to impose their views on everyone else.  The doctrine of “academic freedom” is one of those majority-busting concepts.  An idea isn’t necessarily right merely because lots of people like it.  Keeping a space open for dissenting views is always a good idea.

And that’s why when someone pulls the “consensus” card out of the deck, it is probably time to demand a new shuffle.


Christopher Ferguson tiptoes around the fake climate consensus.  His essay, after all, was published in the Chronicle of Higher Education, which flinched when the climate thugs came out a few years ago, and these days it never deviates from its consensus crouch.  Ferguson genuflects before the global warming consensus: “of course” it is “based on the quality of the data.”  He fails to mention that the data and the theory have never matched and the discrepancy has grown into a yawning chasm in the last eighteen years of the “pause” in global warming.  But let’s not pause.

Ferguson does have other examples.  He cites the 1986 Seville Statement in which twenty scholars convened by UNESCO serenely declared that human nature has no “genetically programmed” tendency towards violence.  As Ferguson notes, the science since Seville hasn’t been friendly to that particular consensus.

When Consensus Collapses

Ferguson’s list of examples of the “consensus” of one historical moment collapsing in the next is intriguingly brief.  I suspect he or the editors of the Chronicle did not want to remind readers of just how many consensus idols once cherished by academics have crumbled into dust.  Or, being cherished, are held together by the intellectual equivalent of Elmer’s Glue-All.

Once upon a time, Margaret Mead’s view of “culture” enjoyed the status of consensus in anthropology. Culture, to Mead, was an Open Sesame that permits any social arrangement we care to imagine and abolishes the constraints of human nature. Mead-ism still has its supporters, especially among advocates of outré arrangements such as polyamory and transsexual rights, but it has long since dissolved into intractable controversy.  No well-informed observer within anthropology would claim that there is a “consensus” about the quality of Mead’s field observations or the breathtaking generalizations she drew from them about human sexuality.

For many years it was an article of faith—and consensus on the American left—that Alger Hiss had been wrongly accused of spying for the Soviets and lying about it under oath.  Hiss was accused in 1948, convicted of perjury in January 1950, and sentenced to five years in prison. For the rest of his life (he died in 1996) Hiss maintained his innocence and enjoyed fervent support from liberals as well as leftists. But soon afterward, Ron Radosh, Harvey Klehr, John Earl Haynes, and other researchers in the archives of the former Soviet Union turned up irrefutable evidence of Hiss’s guilt.

When a “consensus” dies, die-hard supporters remain.  Believers in Hiss’s innocence remain. Perhaps there is something in human nature, overlooked by Margaret Mead, that causes some people to stick with a sinking ship even after all that is left is a half-submerged deck chair.

Consensus after consensus has sunk beneath the waves, but it is a big ocean.  There was room in it for the Cold War revisionist thesis that blamed Stalin’s aggression on the West, and room too for the War on Poverty and for Affirmative Action.  Those ideas, of course, are not dead, but they are shadows of the “consensus” once claimed on their behalf.

None of these examples peep out from Ferguson’s Chronicle article, but he does offer the telling observation that consensus-style arguments in the social sciences are especially bad.  Ferguson observes that such consensus “has usually been declared despite continuing debate among scholars.”  An example or two might have helped.  What about the question that got Larry Summers defenestrated at Harvard?  Why are there so few women at the highest levels of mathematics and theoretical sciences?  The “consensus” answer—discrimination against women explains everything—was indeed imposed “despite continuing debate.”  We are in much the same territory with the new “consensus” on the percentage of college women who are assaulted.

Ferguson’s own example of phony consensus is from research on the effects of video games on children. I’ll take his word that the field is rife with flimsy claims dressed up as “consensus.” The more interesting question he raises is why the word “consensus” carries such weight.

To answer that, it helps to have a sense of how the word climbed into popular use.

Different Seeds

If you go to Google’s famous Ngram Viewer, which tracks the frequency with which words appear in books year by year for many centuries, you will find there was virtually no “consensus” in the English-speaking world in the 17th century.   An apparent “consensus” boomlet between 1625 and 1630 was caused by the printing of a Latin text comparing the Roman Catholic and Eastern Orthodox views of the teachings of Peter on church doctrines of matrimony.   The word was still Latin for the most part, though a 1675 English dictionary offers “consension,” meaning “agreement,” not to be confused with “conseminate” which is “to sow different Seeds together.”

Consensus didn’t escape Church Latin and some specialized medical meanings until the end of the 19th century.  A sober little essay on “Pitfalls in English” that appeared in the Journal of the Canadian Bankers’ Association in 1897 (why there?) calls out the phrase “consensus of opinion” as a vogue usage that has replaced “the general opinion.”  The Canadian bankers, alert to pitfalls, observed that “consensus of opinion” is redundant: “consensus” all by itself means “agreement in opinions,” and “agreement in opinions of opinion” is not a very neat expression.

After that the term rumbles along for half a century at a low level of usage.  The Adoption of the Consensus System is the Only Remedy for Political Chaos huffs the title of a 1927 tract.  The Boy Scout magazine, Boys’ Life, in 1933 describes its selection of football players for the all-Scout all-America team as “the consensus of five nationally famous all-America selections.”

The point at which the word takes off is 1950.  Duke University Press that year published Toward Consensus for World Law and Order, which seems to have been a pro-UN screed popular with what we now call Peace Studies folk.  In the July 1950 Bulletin of the Atomic Scientists the eminent Michael Polanyi warned that “Scientific consensus can properly emerge only by a discussion based on the mutual grounds of the same scientific convictions.  It cannot be negotiated in the manner of a contract, nor decided by any voting procedure.”  By the late 1950s, “consensus” is clearly in vogue as an academic term cropping up in all manner of topics, from Richard Hofstadter’s history of the progressive era, The Age of Reform, to such oddities as sociologist Gresham Sykes’ study of a maximum security prison in New Jersey, where he writes about “the hard core consensus expressed by the members of the captive population.”

Culture Wars vs. Consensus

After that, “consensus” talk just grew and grew, the upside-down image of consensus reality, which withered and cracked.  All through the 1960s, 70s, and 80s, as the Culture Wars took shape and America split apart into identity groups that wanted nothing to do with the old “melting pot,” American writers turned out books and articles by the thousands extolling “consensus.” I won’t try to follow this ascent in detail.  I don’t think there is any one book or author who stands as the redwood tree in the forest of ordinary consensus celebrants.  But there are curious further developments.  Around 2000—the year George W. Bush was elected—“consensus” began a slow decline in popularity, and an even sharper decline starting in 2007.  But then the Obama years bring a consensus come-back.  Google finds a paltry 876,000 “consensus” documents in 2008.  It zooms to 1.8 million in 2009, then climbs steadily to 9.6 million by 2014.

Permission to speculate?  I see some evidence that “consensus speak” is largely the rhetoric of the left.  Consensus is the agreement or the solidarity that the left seeks to declare when it sees itself as able to dominate.  The left isn’t so interested in consensus when it is out of office or sees its rivals on the rise.  The right seems far less interested in consensus, nationally or internationally.  Probably that’s because the right tends to see politics as the balancing of competing interests rather than a winner-take-all game.

Be that as it may, the American infatuation with consensus does seem to vary inversely with our actual level of social agreement. Consider a divisive issue that marks as well as anything the divisions of the culture war:  abortion.  In the Google Ngram world, the chart of increased use of the word “consensus” is a close match with the use of the word “abortion.”  They rise and fall together.

Distrusting Consensus

The left today is infatuated with “consensus” as a tool that can be used to ostracize views it would rather not have to debate.  If there is a consensus on “Black Lives Matter” or “Climate Change,” the matter is settled.  The herding instinct of the collectivist left is stroked by consensus.  The right has some of these herding impulses too, but it also has a much stronger strand of individualism, and for individualists the pronouncement that something is backed by consensus is a warning label.  That label says, “Probably contains unwarranted assumptions, unfounded factual claims, and an aversion to considering alternative ideas.”

But liberals don’t always like consensus, and there is one particular consensus they heartily dislike: the broad agreement among Americans in favor of America itself.  In her book, Inventing the “American Way”: The Politics of Consensus from the New Deal to the Civil Rights Movement (2008), Stanford University historian Wendy L. Wall, for example, works hard to demolish “the façade of consensus” in the United States, which “concealed an ongoing contest involving many different groups.”  When the opposite of consensus is social diversity, naturally academic sentiment is on the side of diversity.  Diversity trumps in every context except diversity of ideas.

There was a time, however, when liberal sentiment was decisively on the side of national consensus.  In Parties and Politics in America (1960), the historian and political scientist Clinton Rossiter famously described American politics as founded on consensus.  And he seemed to think this was a good thing: “The American consensus is unique in its virility and broad appeal”; America has “an acceptable consensus among the elites in every part of the land”; and “The blessed fact of the American consensus forces the parties to share many of the same ideas. The blessed fact of American diversity forces them to be selective about the ideas they may wish to emphasize at any one time or any one place.”

It bears pointing out that the “diversity” Rossiter was writing about was the political culture of the states.  “Diversity” as a code word for ethnic identity groups had not yet been born. By the end of the 1960s, however, the old consensus was dead—and so was Clinton Rossiter, who committed suicide at Cornell in the aftermath of the seizure of the student union by heavily armed black students.  Rossiter’s attempts to temporize with violent racial grievance had left him few friends and nowhere to stand.  Sic transit consensus.

Consensual Relations

“Consensus” seems to be in decline.  Christopher Ferguson’s Chronicle article is part of the evidence, and so too is the freezing up of campus debate on contested issues.  As quite a few observers have now pointed out, we have moved into a period in which students themselves are attempting to silence dissenting opinions.  Greg Lukianoff and Jonathan Haidt’s thorough account in the Atlantic, “The Coddling of the American Mind,” takes us right to the heart of a movement, “driven largely by students, to scrub campuses of words, ideas, and subjects that might cause discomfort or give offense.”  This is “consensus” thinking in action. If “everyone” agrees, we can just disregard the ignorant or stubborn few who actually don’t agree.

That might seem like “consensus” on the rise rather than in decline, but the logic of cultural change is always full of irony and inversions.  The self-coddling students imposing a new Victorian etiquette on campus and the viciously enforced pseudo-consensus on global warming are two different kinds of admission that “the tyranny of the majority” in higher education is breaking down.  That tyranny works most effectively when it is quietly assumed.  If the proponents have to use sharp elbows and uppercuts, the pretense of “consensus” is demolished.

Should we want consensus to rule in higher education?  At a certain level, yes.  We need consensus on the framing principles:  searching for truth; listening to opposing points of view; demanding evidence for assertions; asking skeptical questions even—or perhaps especially—when skepticism is unwelcome; learning how to respond to substantive arguments; and grasping that ad hominem attack is not the way civilized people respond to those with whom they disagree.  And the individual scholarly disciplines need some practical consensus.  We need to agree on the definition of a kilogram and whether class meets today or tomorrow.  But we can actually get by rather well in the midst of strong disagreements about fundamental questions.  Does God exist?  What is human nature?  Is American exceptionalism valid?  It is not that the answers don’t matter.  It is that getting to the answer by imposing a “consensus” is bound to have bad results for a free people who need to learn how to think for themselves.

Books for Book Virgins and Book-o-phobes

The annual controversy over books assigned to freshmen as summer reading is upon us.  Spoiler alerts.  Odysseus makes it home. Hamlet dies. The Whale wins.

Oh, not those books.  We are talking more about White Girls (by Hilton Als, 2013) and Purple Hibiscus (by Chimamanda Ngozi Adichie, 2003).  White Girls, as one reviewer puts it, is “an inquiry into otherness” by a writer for the New Yorker, who is a black male.  Purple Hibiscus is a novel by a Nigerian woman that depicts the travails of a fifteen-year-old girl who has to cope with her violent and cruel, fanatically Christian father.

In 2014, the topmost assigned book (17 out of 341 colleges that have such programs) was Wes Moore’s account of a convicted murderer who shares his name and his beginning as a fatherless black child in Baltimore, The Other Wes Moore (2010).  Second on the list (eight colleges) was Dave Eggers’ novel about a woman who works for a privacy-destroying internet company, The Circle (2013), and third was Rebecca Skloot’s account of the poor black woman whose cervical cancer cells were the first human cell line to be kept growing in a lab, The Immortal Life of Henrietta Lacks (2010).  In 2015, according to Inside Higher Ed, it appears that Bryan Stevenson’s memoir of his efforts to exonerate wrongly convicted prisoners, Just Mercy (2014), will be the winner.

Books Younger Than Their Readers

For the last six years, the National Association of Scholars has been assiduously tracking all the books selected by all the colleges that do this sort of thing.  We call them “beach books,” but the usual term is “common readings.”  NAS executive director Ashley Thorne has pretty much single-handedly turned a minor campus phenomenon into a matter of widespread controversy—the subject of annual conferences, legislative hearings, and mass media attention.  The latest reverberation was a report on July 23 on NPR.

This year 93 percent of the books assigned to “first-years” (the new PC term for freshmen) are younger than the students who are asked to read them.  There are many threads to the beach books story, but the extreme youth of most of the books is the most revealing.  Why so much emphasis on hot-from-the-presses titles?

We’ve heard three answers. First, the program coordinators insist that the best way to engage students is to bring the author to campus to speak.  That makes for a nice income stream for some contemporary writers, and too bad for Mark Twain.  He had his chance.

Second, the coordinators tell us they have to meet the students where they are. Many are “book virgins” who reach college never having read any book cover to cover.  Such students need to be coaxed by assigning them a book that is “right now.”

And third, the coordinators are convinced that the past is over and done with anyway and, regardless of what the students think, the focus should be on contemporary social issues.

This last one concerns me most, but the other two are lame as well.  College students should get used to reading books by dead people.  If you can’t read Edith Wharton or Mary Wollstonecraft without her in the room, hire an actor.  As for book virgins and book-o-phobic first-years, why not get them started on the real thing?  If Hemingway is too hard, try Aesop’s Fables.  If Aesop’s talking animals are above their level, try Mother Goose.  Am I exaggerating how bereft of literary foundations these students are?  I hope so.

No Dead Writers, Please

But the third point—that education all by itself requires that the beach books be molded from the freshest, most up-to-date progressive sand—deserves a little more attention and, let’s say, a lot more opprobrium.  Among the responses to the recent NPR report on summer readings came this crystalline summation from an undergraduate named Kai:

Good literature teaches students about our world now, about the challenges our society faces and will continue to face. Climate change, inequity, and—this is the big one—discrimination (especially racial). Real world issues start to be acknowledged when college kids read about them in books like “A Long Way Gone,” “White Girls,” or “The New Jim Crow.” And that’s why college reading programs SHOULD NOT contain the classics. College reading should be controversial, inspiring, provocative contemporary literature.

Kai is full of youthful arrogance.  He’s read someone named “Vergil” in the original.  But he sees the need to get beyond “institutionalized, oppressive traditions.”  The literature that “has shaped the predominant modes of interaction in western civilization” may be “fun to read—indulgent, even,” but it is time to move on.

I don’t mean to make too much fun of poor Kai.  He is clearly an eager student who has diligently taken in the premises of his college and enthusiastically made them his own.  But his is the voice of someone imprisoned in “now,” for whom “good literature” is writing by contemporary social activists.  He is oblivious to the need to learn about the past and the deep ways in which great literature from previous eras bears on the present.

We all, of course, live in the present and need to pay attention to its particular demands, which include listening to people prose on about “climate change” as earlier generations prosed on about other supposed menaces.  Inequity and discrimination?  Kai might be on firmer ground if he knew more history and understood how much inequity and discrimination are endemic to the human condition.  Virgil, for example, has something to say on the topic of oppression.

Devaluing the Past

The saturation of college students in what might be called present-tense books should worry us.  Higher education cannot of course erase the past but it can radically devalue it.  Introducing students to college-level reading by feeding them candy bars of social outrage is about the poorest way I can imagine to develop their taste for serious ideas expressed with power, imagination, and intelligence.

The problem is not new.  We noticed this extreme focus on contemporary books in our first study of common readings in 2010, when we found the “vast majority” of assigned books in the 290 colleges we studied to have been published in the preceding decade.  But back then, we did find ten colleges (3.4 percent of the total) that had reached back further.  Thoreau’s Walden made an appearance, as did Marx’s Communist Manifesto, Dashiell Hammett’s Maltese Falcon, and Alan Paton’s Cry, The Beloved Country.  More daringly, two colleges had assigned Mark Twain’s The Adventures of Huckleberry Finn.

We have redone and expanded the study each year since then.  The list of books that are assigned changes dramatically from year to year—as naturally it would if the college is set on chasing the winds of ideological fashion.  But what doesn’t change is the relentless focus on books that are in their dewy youth.  In 2010, it was the world of Steve Lopez’s account of a skid-row violinist, The Soloist (2008); Greg Mortenson’s account of his building schools for girls in Pakistan, Three Cups of Tea (2007); and Sonia Nazario’s account of a child from Central America slipping illegally into the United States, Enrique’s Journey (2006).

The Soloist is now off playing by himself in a different skid row.  Mortenson’s cup ran dry when he was exposed as a fraud; those Pakistani girls’ schools were made up.  Enrique went underground for a while but has resurfaced amid the current debate over illegal immigration.

The relative youth of a book is no knock against it as a book, but it is a knock against making it the one (and usually only) book that a class of college students will read together.  I’ve elsewhere made my own suggestions for better books for the first-year beach babies.  I’m moderate about this.  If Don Quixote is too long and Crime and Punishment too dark, try The Right Stuff or Life on the Mississippi, or perhaps better yet, John Stuart Mill’s On Liberty.

This article was originally published on the National Association of Scholars site.

Metal Fatigue and Campus Pessimism

When I was in college I got a job one summer blasting, scraping, and sanding the corroded sides of dry-docked ships.  It sounded like nasty, if well-paid, work. But before I could don gloves and mask in my war on barnacles, some union called a strike and my job was wiped out.  I ended up in a still less glamorous job on a road crew, scraping hapless raccoons from asphalt.

Even decay, it seems, isn’t an entirely reliable business.

My youthful almost-employment as an agent of maritime tidiness was resting somewhere in my mental scrapheap, long forgotten.  I’ve been busy with more up-to-date concerns, among them the critique of the campus sustainability movement. In March, Rachelle Peterson and I rolled out our stainless new National Association of Scholars’ study, Sustainability: Higher Education’s New Fundamentalism, and we’ve been fashioning hood ornaments for it ever since.

But something stirred that old memory:  I noticed Rust: The Longest War, Jonathan Waldman’s corrosive new book.  It is what the title says, a book about the weathering away of steel girders and tie rods and all our other iron pinions with which we try to hold the present against the inevitable rust of time.  Sustainability? Take that, says rust.  And rust wins.

Steel for Stone

What a perfect time for Waldman’s thoughtful appreciation of this enemy of civilization.  We have built our world on metals.  Copper tools were invented about six thousand years ago, and copper alloyed with tin gave us the harder-edged Bronze Age about 4,500 years ago.  Metal plowshares, metal swords, and metal hand tools created the material conditions for large-scale agriculture and for cities and states, and eventually for art and science.

There is only so much you can do with wood and stone.  Today’s enthusiasts for “paleo diets” and pre-industrial technologies sometimes forget the eagerness for metal among those people who lacked it.  Western sailors had to fight Polynesians who would try to pull the metal nails out of the decks of visiting ships.  The Australian anthropologist Lauriston Sharp wrote a classic essay about an aboriginal tribe, the Yir Yoront, whose contact with whites was sporadic and minimal until the 1940s.  For the Yir Yoront, the stone ax was the principal “piece of capital equipment,” used to produce firewood, to make huts, and to carry out every important act of survival.  Making a good stone ax was arduous skilled labor and the ax itself was, unsurprisingly, an object of deep significance.

Yet given the opportunity to acquire steel axe heads, the Yir Yoront didn’t hesitate.  The steel axes rapidly displaced the old technology.  The anthropologist duly recorded that a kind of cultural collapse ensued “in the realm of traditional ideas, sentiments, and values.”  Women and young men obtained access to the new axes, which undermined Yir Yoront hierarchy and ritual.  Axes have consequences, as Richard Weaver might have said.

In Waldman’s book one can learn about the heroic endeavors of the American Galvanizers Association whose members take the battle to the rusty foe, while fighting rearguard actions against the rival stainless steelers and the paint industry.  It is an entertaining book as well as a handsomely written one:

Every metal is vulnerable to corrosion.  Rust inflicts visible scars, turning calcium white, copper green, scandium pink, strontium yellow, terbium maroon, thallium blue, and thorium gray, then black.  It’s turned Mars red.

But what does this have to do with higher education?


The battle over sustainability on campus and elsewhere can be thought of as a contest between competing ideas of preservation.  Those who favor “sustainability” set themselves up as seeking the preservation of the natural order against the destructive changes to the planet wrought by humanity.  Those who critique the sustainability doctrine generally hold that humanity will thrive only by dint of further development of the earth’s resources and further advances in science and technology.  The critics seek to preserve the cultural legacy of our civilization.

Things, of course, get a lot more complicated than that two-way choice between preserving nature and preserving culture.  Both sides stake some claims to the other’s territory.  Sustainatopians want to preserve some pieces of culture as well as nature.  Uber-sustainatopian Bill McKibben, for example, fancies beekeeping.  And virtually all critics of sustainability favor clean air, clean water, and a healthy environment.  But once each side has gathered in its share of the other’s bounty, the division is robust.  Sustainatopians see nature as essentially benign and the Earth as terribly fragile.  Once the atmospheric level of carbon dioxide exceeded 350 parts per million, the Earth was on an unstoppable slide to catastrophic global warming.  That’s why McKibben named his activist group 350.org.

The enviro-catastrophism has in the view of its college and university advocates a straight line application to what colleges actually do.  How can you sit around reading Plato or Jane Austen when the Arctic icepack has melted?  How can we teach political theory as if nations mattered when the only viable solution to climate change lies in transnational institutions?  How can we teach biology as if the Anthropocene—the age of manmade climate change—hadn’t already begun to produce mass extinctions?

Indeed, every subject in the curriculum can be refashioned to put the issues of sustainability at its center.  That’s exactly what the American College and University Presidents’ Climate Commitment asks of higher education, and exactly what most signatory institutions are doing.  Rachelle and I spent some time documenting this.  Yale classifies more than 400 of its undergraduate courses as “sustainability focused” or “sustainability related.”  At Cornell, 68 percent of the academic departments offer sustainability courses.  At Middlebury it is 72 percent.

But leave aside the details.  The main point is that the sustainability doctrine authorizes—or perhaps more accurately demands—the subordination of all forms of inquiry to the larger goal of preserving the natural order.  Sometimes this is phrased in quasi-mystical language, such as the call from Peggy Bartlett of Emory University for a “re-enchantment” of nature.  But generally it is just assumed into place.  We all know the “climate consensus.” Our world is at grave risk.  Let’s not waste time on superficial things such as the old liberal arts curriculum.

The opposing view—my view—is that, even if the natural world is at risk, what higher education should be most concerned about is the preservation of our culture.  The chances of doing something about global warming are vastly improved if we remain a civilization that commands the power to innovate and the optimism to believe we can address our problems successfully.  Turning our colleges and universities into wheelhouses of apocalyptic fantasy and cultural despair is likely to be a self-fulfilling prophecy.  Sustainatopian belief is corrosive.  It turns the institutions on which we depend for cultural vitality into recruitment centers for hostility to our civilization.

Rust Happens

Civilizations are, in principle, made to last.  But so are ships, bridges, and skyscrapers, and none of them last forever.  Rust happens.

Colleges and universities ought to be our galvanizers.  The effort to preserve is not a matter of resting content that we have true and perfect knowledge that merely needs to be carried forward intact from generation to generation.  Real preservation requires an active commitment: the blasting, scraping, and sanding of the cultural corrosion that inevitably gains ground if we don’t intervene; the replacement of the broken parts; the determination to keep the essential and to improve where possible.

A truly sustainable civilization requires the strength to say no to the idea of going back to nature.  That so many in our society are fatigued by metal and ready to divest from carbon is a bad sign, a diversion of our imagination and energy to a dead-end fantasy at a time when we need robust and creative thinking. Trading up from stone to steel axes may have been traumatic, but trading back down will be a lot more so.

Two Controversial Professors

The AAUP—the American Association of University Professors—held its annual Conference on the State of Higher Education at the Mayflower Hotel in Washington, D.C. June 10-14.  A few subway stops away, the Heartland Institute held its tenth International Conference on Climate Change at the Washington Court Hotel, June 11-12.  I suspect that I am the only person to attend both.

Both events dealt with the issues of academic and intellectual freedom.  Both focused on current threats to such freedoms.  Both pictured a world in which politically-motivated foes of free expression are using their wealth and power to silence legitimate dissent.

But, of course, these events were polar opposites.  The AAUP was gearing up to pass a resolution to censure the University of Illinois at Urbana-Champaign for rescinding its offer of an academic appointment to Steven Salaita.  The Heartland Institute was championing the work of Dr. Willie Soon, the solar physicist at the Harvard-Smithsonian Center for Astrophysics who came under attack by Greenpeace and the New York Times after he published an important article in Science Bulletin.

Both controversies have received ample coverage, though I think it is quite possible, even likely, that people who know a lot about one may not know a lot about the other.  A primer:

Steven Salaita. He was a tenured associate professor of English at Virginia Tech who in October 2013 received an offer for a tenured position in the American Indian Studies Program at the University of Illinois Urbana-Champaign, contingent on the board of trustees’ approval.  On August 1, 2014, the university’s vice president of academic affairs and its chancellor wrote to Salaita informing him that they were not proceeding with the appointment.  Salaita appealed to the trustees, who on September 10, 2014, voted 8 to 1 not to reconsider his appointment.  Salaita soon after filed a lawsuit, which is ongoing.

The reason that the university gave for withdrawing its offer of an academic appointment was that Salaita’s inflammatory public statements about Israel would hamper his ability to teach and the university’s ability to attract students, faculty, and staff.  The president of the University of Illinois, Robert Easter, summarized this view when he asked the board not to approve Salaita’s appointment:

“Professor Salaita’s approach indicates that he would be incapable of fostering a classroom environment where conflicting opinions could be given equal consideration, regardless of the issue being discussed…I am also concerned that his irresponsible public statements would make it more difficult for the university … to attract the best and brightest students, faculty and staff.”

The decision created a furor and quickly drew the attention of the AAUP.

Wei-Hock “Willie” Soon. A solar physicist, he has served as a non-tenured employee of the Harvard-Smithsonian Center for Astrophysics since 1991, where he previously did his post-doctoral work.  In 2003 Soon published a paper in Climate Research in which he argued that the 20th century was not the warmest in the last millennium.  The paper occasioned much controversy, and in 2011 Greenpeace, using documents obtained through Freedom of Information Act requests, attacked Soon for receiving over $1 million in funding from petroleum and coal interests.

In January 2015, Soon co-authored another paper, “Why Models Run Hot: Results from an Irreducibly Simple Climate Model,” which takes exception to the “consensus” climate models that predict significant global warming because of rising levels of carbon dioxide in the earth’s atmosphere.  In February, directly following the publication of this paper, the Guardian and The New York Times, making use of material provided by Greenpeace and “an allied group,” the Climate Investigations Center, ran attacks on Soon for supposedly failing to disclose his sources of funding and for “conflicts of interest.”

Wishing Settlers Get Lost

The speech that gave rise to the University of Illinois’ action against Salaita consisted of his numerous statements on Twitter in 2014 that were, as Inside Higher Ed put it, “deeply critical of Israel” to the point of striking some “as crossing the line into uncivil behavior.”  Perhaps the most famous of these was Salaita’s comment on June 19, after three Israeli teenagers were kidnapped (but before they were found murdered): “You may be too refined to say it, but I’m not:  I wish all the fucking West Bank settlers would go missing.”  Salaita’s rants have struck many readers as anti-Semitic, but he stoutly denies this.  Salaita’s caustic and often extremely uncivil tone is not limited to his tweets.  Many of his reviews and his other academic writings are in a similar vein.

The scholarly paper that landed Soon on the front page of The New York Times and in many follow-up stories in the liberal media contains nothing rhetorical or demeaning.  It is a straightforward scientific argument. The abstract runs in part:

Between the pre-final and published drafts of the Fifth Assessment Report, IPCC cut its near-term warming projection substantially, substituting “expert assessment” for models’ near-term predictions. Yet its long-range predictions remain unaltered. The model indicates that IPCC’s reduction of the feedback sum from 1.9 to 1.5 W m⁻² K⁻¹ mandates a reduction from 3.2 to 2.2 K in its central climate-sensitivity estimate; that, since feedbacks are likely to be net-negative, a better estimate is 1.0 K; that there is no unrealized global warming in the pipeline; that global warming this century will be <1 K; and that combustion of all recoverable fossil fuels will cause <2.2 K global warming to equilibrium.

The text of the article itself continues in this vein.

One might think the effort to drum a senior physicist out of the academy through a campaign of public smears and innuendo would concern the AAUP at least as much as the decision by a university not to proceed with the appointment of an ardent polemicist. But that is not the case.

I was at the AAUP conference for two sessions devoted to the topic of academic freedom. Salaita was a major theme in one of the sessions—on “Social Media, Civility, and Free Expression on Campus”—and a secondary theme in the second session on “Versions of Academic Freedom.” Willie Soon was never mentioned, although at the end of the second session some audience members edged towards the topic.  A professor from Florida State University complained that the Koch Foundation is violating academic freedom by paying for some faculty positions in the economics department there.  And another member of the audience followed up by avowing that the Koch brothers are terrible people whose fossil-fuel riches are used in part to deny climate change!

‘Consensus’ Science

Climate Change conferees likewise had nothing to say about the travails of Steven Salaita, though here the parallel breaks down.  The Climate Change conference was not aimed at an all-embracing view of academic freedom.  It was focused on the specific contentions that a self-interested establishment is impeding the publication of accurate climate data, well-designed scientific research, and scrupulous economic analysis.  It was also focused on the ways in which reasoned debate and criticism of “consensus” science and regulation are being stymied.  Salaita was not relevant.

The Role of Civility

I have tried to strike a non-partisan tone in these descriptions but I don’t mean to imply that I am a neutral party.  I was invited to the AAUP event by my friend John K. Wilson, who has regularly invited me to AAUP events so that I might enrich the conversation with some views that would probably otherwise go unvoiced.  This year my NAS colleague, Executive Director Ashley Thorne, also gave a talk in which she defended the ideal of “civility” as part of what we should expect in academic discourse.  Her fellow panelists and the audience were unpersuaded.

Civility to them is one of the masks that the powerful use to suppress free, creative, dissenting, and unorthodox ideas and speech.  For my part, I urged the idea that academic freedom is to be valued as the means by which the university encourages the pursuit of truth, and that the attempt to deploy the rhetoric of academic freedom as a cover for engaging in political advocacy is a misuse of the concept.  My fellow panelists and the audience also found little attraction in that approach. Pursuit of truth, it seems, is another mask that the powerful wear when they set out to suppress dissent.

At the Conference on Climate Change, the National Association of Scholars was the recipient of several enthusiastic endorsements from speakers who drew attention to our report, Sustainability:  Higher Education’s New Fundamentalism.  Our table full of handouts was emptied of everything we brought on the first morning.  And the Heartland Institute included a 12-page summary of our 260-page report in the bag of materials that all conferees received. Given that our report takes no stand at all on climate change, this was a remarkably warm reception. All we did was call for universities to allow open debate that included skeptics of the climate “consensus.”

I find it hard not to be moved by the plight of Willie Soon and other scientists who have become, in effect, “enemies of the people,” for their determination to pursue research that runs against what the climate consensus establishment prefers. The canard that “97 percent” of climate scientists agree with the so-called consensus has been shown up as an artifact of shameless manipulation of the research record.  But no matter: it is repeated endlessly in an effort to make these non-conforming scientists look ignorant, silly, or corrupt.  They are, to the contrary, serious and seriously smart people who have also shown a certain measure of courage.

Science or Politics?

Whether their dissents are accurate will be determined in time to come.  If they are right, the climate consensus is a house of cards built more on political aspirations than on good science.  But, right or wrong, they deserve to be heard and do not deserve to be subject to the sort of ad hominem attack exemplified by what happened to Willie Soon.

So what are academic and intellectual freedom?  They aren’t quite the same thing.  Academic freedom is germane to the university where the disciplined pursuit of truth by rational inquiry and scrupulous examination of the evidence needs to prevail over all orthodoxies of opinion.  Academic freedom can only persist within a community that enforces on itself some degree of compunction about how things are said, including deference to the reality that no matter how strongly we believe in the validity of our own opinions, we may be mistaken and it behooves us to listen with respect to other views.  Intellectual freedom is broader than academic freedom.

It is germane to a free society where every individual ought to enjoy the right to make up his own mind about important questions and where manifestly false opinion or eccentric belief enjoys a wide zone of toleration.  We need not fall silent when confronted with views with which we disagree.  Neither academic freedom nor intellectual freedom entails indulging folly by saying nothing.  But we should never expect to throw someone in jail for an errant opinion or preempt their right to have their say.  If we choose to answer folly, we should do so with our own speech—which just may turn out to involve even greater folly.

The AAUP is celebrating the hundredth year of its founding declaration, its Statement of Principles, which remains one of the great documents in higher education.  Ironically, the AAUP has long since repudiated most of the Statement of Principles, which said all too much about the responsibilities of professors, the need for a scholarly spirit, temperate language, and staying within the guardrails of the professor’s actual expertise.  But no matter, the founding principles of the AAUP are still alive.  In D.C. last week, they were to be found just a few subway stops away.

Why College Today Is a Mishmash

Kevin Carey is convinced that online learning has created a watershed moment in the history of higher education.  Not since Johannes Gutenberg assembled an ensemble of movable type, meltable alloy, oil-based ink, and a screw press in 1439 has there been such a moment—or so says Carey in his new book, The End of College: Creating the Future of Learning and the University of Everywhere.

It is a strong assertion that rests on the relatively fragile facts of no more than twenty years of shaky experiments with the new technology.  If we stick with the Gutenberg analogy, online learning is still in the era of incunabula, that period before 1500 when artisans were still working out what to do with the printing press.  As often as not the early printers set aside Gutenberg’s movable type in favor of a carved wooden block for each page.  Woodblock printing could retain some of the delicate beauty of medieval ornamented manuscripts, but it couldn’t compete with the speed and economy of production and the ease of correction of movable type.

A Serious Man

Kevin Carey is among the handful of contemporary writers on higher education who merit serious attention.  He is far from alone in his enthusiasm for online learning and his belief that it will transform higher education.  But he is a far better writer than other enthusiasts and his book deserves the attention of even those who view the new technologies as a mere diversion from more important things.

In the second chapter of The End of College, Carey compresses into 25 pages the history of the university from the founding of the University of Bologna in 1088 to the floodtide of degrees from American colleges and universities in 2012.  It is a neat performance, free of ponderous explanation, narrowing swiftly to the matters at hand, and yet touching nearly all the key matters.  The modifier “nearly” is needed because Carey (deliberately I suppose) skirts the topic of how universities have been shaped by and helped to shape broader political and social movements.

A word on this before turning to Carey’s actual subject.  Carey is alert to how higher education has always responded to the changing needs for “intellectual capital.” The medieval university, he writes, arose out of particular circumstances that brought students together in towns where knowledge could be organized and shared.  Universities were from the start the seedbeds of what we would now call transnational elites.  But they also became seedbeds of nationalism, romantic revolutionary ardor, and later Marxism.  In the United States, the history of higher education has been interwoven in complicated ways with religious aspiration and various egalitarian movements, including efforts to advance the rights of women and racial minorities.  It would seem difficult to explain the history of American higher education over the last half century without treating race and racial preferences as a central topic. Yet the topic is entirely missing in The End of College—as are the topics of campus radicalism from SDS to BDS; the sustainability movement; free speech controversies; and the politicization of higher education.

These blind spots are no less evident in Carey’s other writings on American higher education.  Perhaps it is best to say that he knows his audience, which is liberal, self-satisfied, and not perturbed that colleges and universities have become leftist monocultures.

What Charles Eliot Did

What does perturb Carey is that American higher education is a mishmash of efforts to achieve three competing goals:  vocational training, the research enterprise, and the liberal arts.  None of these is accomplished especially well, although the liberal arts come off the worst.  Carey places the blame for the mishmash at the feet of Charles Eliot, the Harvard University president who in 1869 invented the “elective system,” and who also made the bachelor’s degree a prerequisite for admission to Harvard’s graduate and professional schools.  The elective system, soon copied at almost every other college and university, meant the demise of the core curriculum and its replacement by an expensive and expansive collection of courses that led to limited learning and incoherent programs.  In Carey’s assessment, Eliot also opened the door for the faculty to be made up of research specialists who have no training in or necessarily any aptitude for teaching.  The de-emphasis on the core curriculum and the dominance of research over teaching are two sides of the same coin.

But that coin is burnished to a golden gleam with the rhetoric of liberal arts education, endlessly deployed by college presidents who have redefined the “liberal arts” as whatever their institutions happen to be doing at the moment.  Learning to “think critically” covers just about any contingencies short of grunt labor, but maybe that too if the labor is spent sorting recyclables or undertaking other sweaty tasks on behalf of social justice.

Rich in Characters and Ideas

In the 2013 spring semester, Carey enrolled in the MIT online course, The Secret of Life, taught by biology professor Eric Lander.  The course was one of those that MIT made available as a MOOC through the Harvard-MIT online collaboration, edX.  Carey was enthralled by this enormously difficult course, and despite his non-science undergraduate and graduate education, stuck with it, problem sets and all.  The End of College carries The Secret of Life through most of its chapters as Carey weighs its lessons and does the writerly equivalent of turning over proteins and amino acids to see how things fit together.

It is a book rich in characters as well as ideas.  The portrait of Stephen Joel Trachtenberg in chapter 3—the former president of George Washington University and one of the people who unleashed the terrific price spiral that has turned American higher education into a cul-de-sac of campus luxury, student debt, and intellectual mediocrity—is fair-minded and finely etched.  Carey’s conversation with Trachtenberg is one of a dozen or so encounters that he draws on to develop his thesis that the old university—what he calls “the hybrid university”—is on the way out and that the new online thing, “the university of everywhere,” is on the doorstep.

Is it really?  The End of College is the best-argued case I have seen yet that digital learning will transform higher education.  Carey is fully aware of the inertial resistance to that transformation.  Our existing colleges and universities have strong institutional reasons to impede it even as they incorporate some of its technology.  And there are deep sources of social and cultural resistance from a public that is invested in the older forms of credentialing and prestige.  “The hybrid university will not disappear tomorrow,” he writes, “but they (hybrid universities) have been ripping off parents and students for decades by shortchanging undergraduate learning.”  There are sober thinkers on the other side of this, such as Andrew Delbanco, who have argued the crisp opposite:  that online education is the barbarian that threatens to despoil undergraduate learning.

The barbarians, if that is what they are, have now found their most eloquent champion in Kevin Carey. Let the contest begin.  Unleash the broadband of war.  Let MOOCs mix it up with Morrill; Gutenberg grapple with GitHub; and edX close quarters with Eliot.  However this works out, Carey acquits himself well on the topic at hand.

A Setback for BDS

The movement to impose a boycott on Israeli universities, to get colleges to divest from Israeli companies, and to impose other sanctions on Israel—the BDS movement (boycott, divest and sanction)—was launched in 2005 by a collection of Palestinian organizations.  Over the last decade it has gathered significant support in American higher education, but the enthusiasm of some American academics for the cause didn’t attract much attention outside the academy until the vote by the American Studies Association (ASA) in December 2013 to join the boycott.

That vote shocked many who had not yet heard of the US Campaign for the Academic & Cultural Boycott of Israel.  The National Association of Scholars called on the ASA not to join the boycott.  And when the ASA went ahead with it, some colleges and universities responded by dropping their institutional memberships in it.  One consequence of the furor was a series of decisions by other scholarly associations, including the Modern Language Association, to reject proposals that they also join the boycott.

Since then, the BDS movement has been less prominent in American higher education but it has not gone away.  Last week the New School held a two-day conference, “Sanctions and Divestments:  Economic Weapons of Political and Social Change.”  Nimer Sultany, lecturer in public law at the University of London, who is Palestinian and one of the international leaders of the BDS movement, argued that BDS is a promising tool to advance the goal of returning “all Palestinian lands” to Palestinians and to “reverse Israeli colonization.”  Todd Gitlin, the 60s radical who is now the chairman of the Ph.D. program in Communications at Columbia University, opposed BDS on the grounds that its stated goal is “too broad.”

Vagueness generally helps the proponents of BDS.  At the New School event, Sultany refused to be pinned down as to what exactly its goal might be.  The eradication of Israel?  He wasn’t ruling it out, but neither did he own it.

This reticence about goals may help proponents of the movement to draw in supporters who feel sympathy with dispossessed Palestinians but haven’t thought very much about the implications of the movement’s broad claims.  When those claims come into sharper focus, campus support dwindles.

That lesson was displayed on May 2, when students at Bowdoin College in Brunswick, Maine, voted down a proposal to support the boycott. With 1,619 students voting, only 228 voted in favor of the boycott, and 1,144 voted against it. (247 abstained.)  The students showed collective wisdom, and in this case they were influenced by Bowdoin’s out-going president, Barry Mills, who in 2014 issued a strong statement rejecting the boycott movement.

The National Association of Scholars pays special attention to Bowdoin College.  Our 2013 study, What Does Bowdoin Teach?  How a Contemporary Liberal Arts College Shapes Students, picked out Bowdoin to serve as a representative institution—one that was small enough to study in depth, but also one whose strengths and weaknesses are widely shared by other elite liberal arts colleges.  In that vein, we took a critical view of the readiness of the Bowdoin administration and the students to embrace fashionable progressive causes.

In this case, however, Bowdoin has demonstrated a more thoughtful and deliberative side.

The idea of getting Bowdoin to boycott Israel had come up before.  Mills’s 2014 statement was a response to an earlier round of advocacy.  A new round began this spring and eventuated in a petition circulated in April by the Students for Justice in Palestine (SJP).  According to one of the group leaders, the goal was “to isolate parts of the Israeli state apparatus that are normalizing the maltreatment of Palestinians and abuses of their human rights.” The petition got enough student signatures for a student-body-wide referendum.  The success of the petition drive, however, alarmed other students who organized a counter campaign.

The Bowdoin Orient, the student newspaper, quoted students as saying that they didn’t know enough about the conflict in order to vote with a clear conscience.  This is noteworthy in that it means that students did not take the boycott as the default position.  They did not just assume that the case for the BDS movement was right.  Other students voiced more particular objections such as their preference for a two-state option.  Still others complained that the boycott “threatens academic freedom,” especially the “free exchange of ideas” about the conflict itself.

All of this is encouraging—encouraging that a college community that has often fallen into lockstep conformity on political issues and shown very little interest in allowing a diversity of opinions to flourish re-discovered the value of open debate.

It is especially encouraging because we are in a strange moment in American higher education:  a moment in which intellectual freedom seems terribly imperiled.  The rhetoric of “rape crisis”; the insistence that there is a “climate consensus” that obviates the need to hear from skeptics; the post-Ferguson hyping of the idea that America uses violence to maintain a racial hierarchy—these and many more pronouncements have fostered a campus climate across the country in which students congratulate themselves for shutting down discussion, dis-inviting speakers who might disagree with prevailing opinions, and attacking those few students who stray from the new orthodoxies. Intimidation is the hottest campus trend.

Bowdoin is far from immune to these disorders. It is not a place where intellectual freedom generally flourishes. But as we showed in What Does Bowdoin Teach? there is another, older, and better Bowdoin.  It is reassuring to see the college in this instance find its better self.  And if Bowdoin is indeed representative of elite higher education, perhaps the vote on May 2 is a sign of a broader recovery in American higher education. The BDS movement is an ugly retreat from academic and intellectual freedom. It is heartening to see it beaten back so decisively in a place where its proponents might well have expected an easy win.

‘Dignity,’ Another Legal Trojan Horse

“Dignity” has been taken out for another walk around the block.  In February 2014, I wrote an essay on Minding the Campus in which I commented on Attorney General Eric Holder’s speech to the Swedish Parliament, wherein he spoke of his nation’s commitment to the “dignity” of “every human being.”

Over the last couple of decades, “dignity” acquired a secondary meaning something like “the intrinsic integrity of lesbian, gay, bisexual, and related sexual minority lifestyles.”  The word has become an assertion, first, of equality, as in ‘We all have human dignity and therefore a legitimate right to participate fully in society.’  But the word also asserts, rather aggressively, a moral authority to silence and to punish those who in some way endanger an individual’s experience of “dignity.”  Here the word drifts into the sphere of reasons why people need “safe spaces,” “trigger warnings,” and protection against “microaggressions.”

The invocation of dignity, I wrote last year, has become an escalation in the rhetorical arms race.

It appears Jeffrey Rosen, the George Washington University law professor and ubiquitous liberal commenter on all things legal, has stumbled into agreement with me.  Writing in The Atlantic on “The Dangers of a Constitutional ‘Right to Dignity,’” Rosen considers what will follow if the Supreme Court “strikes down same-sex marriage bans […] on the grounds that they violate the dignity of gay couples.”

As I pointed out last year, “dignity” is an extra-Constitutional principle in the United States, and one with only a sketchy presence in case law.  If the high court finds a right to gay marriage on the basis of “dignity,” it will be conjuring a new fundamental principle into our legal system.  Justice Kennedy seems eager to do just that, and it rightly worries Rosen.  Rosen notes that the word has appeared in more than 900 Supreme Court decisions.   But that’s misleading.  “Dignity” isn’t to be found in the Declaration, the Constitution, the Bill of Rights, or anywhere really, until it was invoked in 1944 by Justice Frankfurter.  After Frankfurter’s opinion, it furtively hung around until our legal elites found themselves in need of rhetoric that would justify making up new law to support the overarching claims of sexual autonomy.  Rosen quotes Kennedy’s opinion in the 1992 abortion case, Planned Parenthood v. Casey, where he invoked “choices central to personal dignity and autonomy,” as the ground of the “liberty” to abort.  Kennedy also invoked “dignity” in his 2003 Lawrence decision, sweeping away legal bans on homosexual behavior.

Rosen provides a phrase I had not seen before to describe the matters that Casey and Lawrence now put at the discretion of judges:  curing “dignitary harm.”  If “dignity” were to be further elevated as a principle (“Kennedy’s new synthesis of dignity with liberty and equality”) we will well and truly be launched into a voyage on the high seas of legal improvisation.  Rosen observes that in Kennedy’s mind the dignity that follows an individual’s efforts to gratify his sexual appetites (his “interest in dignity”) trumps “traditional moral values.”

What is to constrain “dignity” as the all-purpose demand that the state protect the sexual adventurer from any and all forms of social disapproval?  The court may yet find a limiting counter principle, but it hasn’t found one yet.  Protecting procreation and preserving “tradition” as such have been “ruled out of bounds.”

So why is Rosen, who is all in favor of gay marriage, worried about the direction of the Court?  He is worried that Constitutionalizing “dignity” would undermine First Amendment rights to free speech.  He also thinks that conservatives could discover their own ways to weaponize the concept, as when Justice Scalia found “a dignitary interest attached to the right to bear arms.”  And Rosen observes, correctly, that the concept is vague and ill-defined.  So vague that conservatives might use it to strike down “progressive legislation.”

Right about that.  If we get stuck with this as a Constitutional principle, I’m not going to waste time fighting it.  I will be lining up to express the profound injury to my dignity caused by campus leftists who say mean things about me and in sundry ways show disrespect for my ideas.  Maybe I can get a class action suit going with the other contributors to Minding the Campus who are similarly situated in “dignitary harm” from left-leaning faculty councils and progressive editorial pages.

In truth, I’d rather the Supreme Court back away from making an overriding principle of dignity, and I’m glad Rosen sees it as folly as well.  Rosen and I disagree on the reasons.  In my view, the core Constitutional principles of liberty and equality are ample, and introducing a Kennedy amendment to our basic law by making “dignity” coordinate with liberty and equality would profoundly undermine our basic freedoms.  At a still deeper level, I don’t believe there is such a thing as a “right to dignity” that can be legislated or promulgated by courts.  Not for lack of trying, of course.  The EU recognizes such a right, but the EU case also demonstrates that, as a legal right, human dignity is reduced to mere superficialities.

Actual human dignity exists, where it does exist, in the composure of the individual regardless of his circumstances.  It cannot be conferred by a legal decision or a law.  The attempt to confer it that way is doomed to failure but it won’t be a clean failure.  It will take the tablecloth, the glassware, and the soup down with it.



Is ‘Get a Job’ the Purpose of College?

Wisconsin Governor Scott Walker just made an unforced error.  He proposed—then backed away from—a change in the mission statement for the University of Wisconsin.  I admire Walker and view him as among the more attractive candidates for the Republican nomination.  And in that spirit, I’d like to offer him some friendly advice on a potentially troublesome issue.

Walker’s misstep came in a budget proposal in which he removed from the state code the words in the University of Wisconsin mission statement that committed the university to “search for truth” and “improve the human condition” and substituted “meet the state’s workforce needs.”  This proposed change instantly set off political reaction in Wisconsin and created a field day for leftist pundits.  Salon headlined the story, “Scott Walker’s Hilarious Screwup.”  Once again, a conservative politician tagged himself as an anti-intellectual ignoramus and a man for whom the idols of the marketplace trump every high-minded ideal.

Walker quickly backed off, and lamely explained the language as a “drafting error.”  But I am ready to assume with the critics that Walker himself signed off on the substitution and that he and some of those around him must have thought it was a good idea at the time.

It plainly wasn’t a good idea.  But the sneering from Walker’s progressive opponents doesn’t explain anything.  Those opponents have simply seized an opportunity to pose as defenders of the search for truth and the improvement of the human condition while casting Walker as a vulgar materialist.  Where have we heard that storyline before?

Reagan in ‘67

As it happens, just a few weeks ago, Dan Berrett, a senior reporter for the Chronicle of Higher Education, declared in “The Day the Purpose of College Changed” that on February 28, 1967, Ronald Reagan singlehandedly reduced college education from a “vehicle for intellectual development, for cultivating a flexible mind,” to a means to “prepare students for jobs.”

Reagan accomplished this neat trick, says Berrett, at a press conference a month into his first term as governor.  Asked about his proposed budget that trimmed some money from the University of California, Reagan explained that there would be “some belt-tightening” in all areas of state government but that he would do nothing “harmful to education.”  (The remarks are on pages six and seven of the transcript of the press conference.)

Then Reagan added, “But we do believe that there are certain intellectual luxuries that perhaps we could do without for a year or two without hurting the cause of education.”

A reporter asked him what he meant by “intellectual luxury,” and Reagan answered with a quip.  He mentioned a course at UC Davis “where they teach you to hang the Governor in effigy.  That in my mind is an intellectual luxury.  Of course, I may be prejudiced.”

The back-and-forth with the reporters continued, with Reagan giving several examples of college courses that seem trivial, and one—on repairing band instruments—that he said was “sort of subsidizing intellectual curiosity.”

That, for Berrett, is the smoking gun.  Governor Reagan, way back in 1967, treated “intellectual curiosity” as secondary.  Berrett recounts that the Los Angeles Times seized the phrase to editorialize against Reagan’s supposed philistine views:  “If a university is not a place where intellectual curiosity is to be encouraged, and subsidized, then it is nothing.”

Berrett, like the LA Times in 1967, sees this as all or nothing.  Reagan was not criticizing a frivolous or excessive expenditure, according to Berrett, but was “staking out a competing vision,” namely, “Learning for learning’s sake might be nice, but the rest of us shouldn’t have to pay for it.  A higher education should prepare students for jobs.”

Berrett’s explanation of what Reagan meant, however, is conjured from thin air.  Reagan himself said nothing like that in the press conference or, as far as I can tell, at any other time.

It is rhetorically convenient to set up a little polarity like this:  the good proponents of education for the sake of pure learning vs. the bad proponents of education for vulgar utilitarian ends.  One or the other.  This is, of course, the same trap into which Governor Walker has stumbled.

Higher Education Has Four Purposes, Not Two

Berrett and others like him who set up this either/or are badly mistaken.  Education is and always has been about both learning for learning’s sake and practical training—and two other things as well.  Higher education, at least since the time of Plato, has always had four basic purposes that tug in different directions: the pursuit of truth for its own sake (“intellectual curiosity”); preparation for practical life (“jobs”); the transmission of culture (“civilization”); and the shaping of good character (“citizenship”).  Sometimes these can be brought into balance, but usually they jar against each other.  Aristophanes, who favored the transmission of traditional culture, mocked Socrates as a charlatan; Cicero criticized the Greek philosophers for distracting students from preparation for public life.  Jefferson extolled education as essential not for the pursuit of truth or the preparation of students for work, but as a foundation of civic life.

Those of us today who defend liberal education—and we are many—often make arguments for it that go well beyond the value of “intellectual curiosity,” though to be sure intellectual curiosity is important.  Take, for example, William Theodore De Bary, the provost emeritus of Columbia University, whose recent book, The Great Civilized Conversation:  Education for a World Community, is a richly elaborated defense of liberal education as “education for the world community.”  De Bary’s book is one of many in arm’s reach on my bookshelf that defy Berrett’s simplistic dichotomy.

Climate Citizens, Rape Culture, ISIS, Employment

Liberal education serves all four purposes at once, but seldom in equal measure.  Because we have limited time and limited resources, and because the world throws up specific challenges at particular historical moments, we do have to choose where to throw the emphasis.

Is this, for example, the moment to emphasize a college education as the tool to turn students into “fully aware, service-oriented climate citizens,” as a dean at The New School explained her university’s new climate commitment?  The New School has just announced its decision to divest from fossil fuels as part of a comprehensive re-orientation of the university to “think differently about climate change.”  The New School is far from alone in American higher education in its enthusiasm for this cause—a cause of which I am a prominent critic and on which my organization is about to issue a major report.  But the point to make here and now is that focusing a college on “climate change” is not about “intellectual curiosity” or “job preparation,” though it can partake a little of both.  Rather, the climate change focus is primarily a version of what I called “shaping good character.”  It aims to produce a particular kind of citizen—what the dean calls a “climate citizen.”  And it does so in the name of a greater good:  saving the planet from the supposed danger of eco-apocalypse.

But other defenders of liberal education offer quite different diagnoses of what should count as the issue that should most inform liberal education today.  If we are, as many feminists and college administrators say, in the midst of a “rape culture” in higher education, what could be more pressing than to stop it?  Would protecting “intellectual curiosity” trump protecting undergraduate women from sexual predators?

Then again, many Americans attentive to the rise of ISIS and the aggressive new barbarianism of radical Islam have come to believe that liberal education today ought to prepare students to stand in defense of Western civilization.  It is a hard sell, since a substantial portion of the American professoriate is ambivalent about whether the West is worth defending.  Still the idea of defending the West against Islamo-fascism is one strong way to uphold the importance of liberal arts education.

Let’s add to this list the concern that many Americans have over our nation’s economic prospects.  Officially, the unemployment rate stands at 5.6 percent, but many observers see only statistical deception in that figure.  According to Gallup, only 44 percent of adults age 18 and older have jobs in which they work 30 or more hours a week and get a steady paycheck.  The employment-population ratio stands at about 58.6 percent—more than 100 million Americans do not have jobs.  Granted, some of those don’t want jobs or cannot work, but add to this picture the fact that recent college graduates are having a very hard time finding work that matches their credentials.  Richard Vedder, Christopher Denhart, and Jonathan Robe have documented that 48 percent of employed college graduates are working in jobs that the U.S. Bureau of Labor Statistics classifies as requiring less than a four-year college education.

With this picture in mind, Governor Walker’s clumsy attempt to change the mission of the University of Wisconsin has at least some warrant.  As a nation, we are spending vast amounts of money on higher education, but achieving rather poor results as far as matching education to “workforce needs.”  Add to this President Obama’s unrelenting insistence on sending greater and greater percentages of students to college, and the $21.8 billion shortfall last year in the federal student loan program, and it is fair to say we have a crisis.  That $21.8 billion, according to Politico, is “apparently the largest ever recorded for any government credit program.”   And President Obama, of course, has just proposed a gigantic increase in federal spending on higher education—in the name of improving the financial prospects of the young and making America more prosperous.

Unlike the global warming crisis and the rape culture crisis, this one is not imaginary.  The gap between what we spend on education and what we achieve in preparing young people for high-skill employment is very large, and the social consequences of the gap are significant.

How we should respond to it, however, is an open question.  Not long ago, Charles Murray in Real Education argued that we should let much of higher education subside into its obsolescence and replace it with a regime of skills-oriented tests.  “Badges” based on such tests are indeed becoming more common.  Others, such as Harvard’s Clayton Christensen in The Innovative University, have called for hybrid models that combine online education with residential experience.  There are plenty of observers who think the time has come to let “creative destruction” rip through the groves of academe like an army of loggers intent on a clear-cut.

And there are plenty of others who shrink in terror at the prospect.  Viewed in that light, Governor Walker’s revision of the University of Wisconsin’s mission statement was the intervention of a moderate.  If we are going to save the contemporary university, some significant and painful changes may be necessary.

Back to Berrett

Berrett isn’t entirely unaware that higher education serves multiple ends.  Though he reduces it to a two-sided battle between advocates of pure learning and advocates of job preparation, he claims, “These two theories had long existed in uneasy equilibrium.”  That sentence is in the past perfect tense because Reagan, he believes, tipped the balance.  “On that day in 1967, Reagan crystalized what has since become conventional wisdom about college.”  The utilitarians have won.

Berrett’s account of what has happened in higher education is profoundly false, but as sometimes happens, it is false in ways that make it interesting to examine.  How in the world does someone get to the view that Ronald Reagan of all people undermined liberal education?  It is not for lack of attention to the various strands of American thinking about higher education.  Part of the answer is that Berrett follows the lead of the president of Wesleyan University, Michael Roth, whose recent book, Beyond the University: Why Liberal Education Matters, cites the views of Thomas Jefferson, Benjamin Franklin, Booker T. Washington, and W.E.B. Du Bois.  Berrett does too.

Roth’s book, which I’ve reviewed elsewhere, is a piece of leftist self-pleading.  He sets up the same either/or as Berrett.  He vindicates the nonutilitarian value of education, viz. “to teach students to liberate, animate, cooperate, and instigate,” and sets this against the notion of mere vocational training.  Roth is pretty happy with the status quo and seems mostly intent on persuading Americans to continue footing the bill.  His reply to those, like the NAS, who criticize the overexpansion of higher education at the expense of academic standards is that we “seem to be very comfortable with the kinds of inequality that were characteristic of [preindustrial] societies.”  That’s a serpentine way of calling us racists, and it is an interesting subtext to the strawman argument that distinguishes progressive support for “intellectual curiosity” (or Roth’s term, “instigation”) from vocational training.  Who is more concerned about the prospects of young people from impoverished backgrounds—those who want to immerse them in sustainability/women’s studies/identity group studies programs, or those who are concerned about preparing students for the workforce?

But let’s not give this strawman matches to play with.  The truth is that higher education has to find a balance between its competing ideals.  Anyone who sets them off as mutually exclusive is on a path to non-creative destruction.  And therein lies both Governor Walker’s and Dan Berrett’s mistake.

But not, as it happens, Ronald Reagan’s.  Dan Berrett himself ends his Chronicle article on what is supposed to be an ironic note, as he quotes Reagan’s address at the inauguration of a new library at Eureka College in 1967—the same year as the press conference.  At the dedication Reagan gave a full-throated defense of the liberal arts tradition, saying that the “answers to all the problems of mankind” could be found in the library.  In fact, there is no irony at all.  Reagan was a civilized as well as a wise man, who could distinguish between the frivolous forms of intellectual curiosity and the profound ends to which intellectual curiosity could be put.  The problem we have today is that so many of those who purport to defend the liberal arts no longer know how to draw that distinction.

Last Thing

The re-emergence of the fatuous distinction between the liberal arts as self-justifying intellectual curiosity and the right’s obsession with bottom-line results deserves just a little bit more attention.  Berrett’s Chronicle essay offers a lengthy chronology of how conservatives supposedly tipped the balance in favor of the utilitarian view.  It includes things like the 1973 oil embargo, which prompted students to “flock to practical and pre-professional majors,” and the rise of the internet, which shifted liberal arts education away from “facts” and toward “habits of mind and skills like critical thinking.”  Berrett also recounts the rise of various higher education organizations in the battle against reducing college to job training.

I take Berrett’s narrative as representative of an argument that I have often heard from left-of-center defenders of contemporary higher education.  They see the peril to the liberal arts as arising almost exclusively from external forces, both material (the oil embargo) and ideological (proponents of free market ideas).

Missing altogether from Berrett’s timeline are things like the 1962 Port Huron Statement, in which the SDS laid out its agenda for using colleges and universities as instruments for radical transformation of American society; the 1964 Berkeley Free Speech Movement, which was the opening phase of the successful effort by the campus left to unseat the curricular authority of the university; or the success of the effort to establish racial preferences in admissions and “diversity” as the regnant campus creed.  The fragmentation and trivialization of the liberal arts curriculum had a lot to do with these developments.  And the disaffection of millions of Americans with higher education today is in very large part a consequence of the self-destruction of the ideals on which the liberal arts were once based.

Berrett is among those defenders of the status quo who are unable or unwilling to look more than momentarily at the bonfire the left has made of the curriculum and of academic standards.  Where will liberal education go next?  I hope it not only survives, but that it thrives in years to come.  That will happen, however, only if we get serious about defending the essentials and return to Reagan’s distinction between the higher uses of liberal learning and the distractions that merely appropriate the name of the liberal arts.  We need more discernment, less distraction.

My advice to Governor Walker:  don’t fall into the simplistic distinction between the “search for truth” and “workplace needs” that your opponents have set up.  It is a net at your feet meant to trap you.  Asking and expecting universities to address workplace needs is legitimate—more than legitimate, it is urgent.  But it is a goal that can be pursued without making yourself a supposed enemy of open-minded inquiry.  It is your foes who, rightly understood, have snapped their minds shut against the danger of new ideas.

The Muslim Call to Prayer at Duke

On January 14—a Wednesday—Duke University announced its decision to broadcast a Muslim call to prayer (the adhan) on campus at 1:00 every Friday afternoon.  An uproar ensued, fueled in part by Franklin Graham (son of Billy Graham) writing about the decision on his Facebook page.  The next day, Duke backed down, canceling its plan to issue the amplified adhan from the bell tower of the campus chapel. 

Duke, of course, is a private university and is free to use its bell tower to announce any religious summons (or other message) it chooses to.  The university traces itself to 1838 when it began as a “subscription school” organized by Methodist and Quaker families in rural North Carolina. The Quakers later split off and the school kept going with some state support but backed primarily by the Methodist Episcopal Church. As Duke’s official history puts it, “The trustees agreed to provide free education for Methodist preachers in return for financial support by the church, and in 1859 the transformation was formalized with a name change to Trinity College.”  

Times change, and so do names and missions. Duke acquired its current name in the 1920s. It acquired its infamy in 2006, when its president, Richard Brodhead; a corrupt district attorney, Michael Nifong; a “Group of 88” faculty members; and the gullible press led by the New York Times elevated into a national scandal a fictitious allegation of rape against some of the university’s lacrosse players. That scandal sits restively in the background of many subsequent developments at Duke. President Brodhead still presides, which among other things means that Duke remains a university inclined towards reading any situation as ripe for a demonstration of proactive social justice. 

Duke Responds to Paris

The most recent situation to be read that way was the January 7 invasion of the Paris headquarters of the French satirical magazine Charlie Hebdo, when Muslim terrorists murdered eleven people for blasphemy against Islam.  On January 11, some two million people, including 40 world leaders, participated in a Paris rally of national unity. The topic of Islamic terror was once again in focus for much of the world. The powers that be at Duke took counsel and decided that their response would consist of announcing that the adhan would henceforth issue from the Duke chapel on Friday afternoons.

The cause-and-effect is not in doubt. The day that Duke announced its decision to do the Friday call to prayer, Christy Lohr Sapp, the associate dean for religious life at Duke, published an essay in the News & Observer explaining that Duke was acting in response to “the recent attacks in Paris and Pakistan and renewed conflict in Nigeria.” These events had focused “negative press” on the Muslim world, and in particular had led to the portrayal of Muslims “as angry aggressors driven by values that are anti-education and anti-Western.” The Muslim community at Duke, wrote Lohr Sapp, “represents a strikingly different face of Islam,” one that is “peaceful and prayerful.” Duke’s decision to do the call to prayer on Fridays was intended to give “more of a voice” to “this face of faith.” 

Lohr Sapp’s essay is important because it exemplifies what Duke officials were thinking before the decision attracted controversy. She captures the university’s self-approbation as it takes advantage of “the opportunity” to show its “commitment to religious pluralism.” She is especially happy that, at Duke, respecting religious pluralism doesn’t mean asking all the different groups to share a “multi-purpose prayer room.” Instead, Duke strives to support each group according to its “particularities and practices,” each in “its own unique way.” Hence it made sense that “the neo-gothic cathedral at the heart of Duke’s campus” be repurposed “as a minaret.”

As astonishing as these declarations sound to many Americans, they are a fairly straightforward expression of the multicultural orthodoxy found on most college campuses.  It is, of course, a troubled orthodoxy: the parts do not cohere. A Gothic tower is not a minaret, except by a strenuous act of cultural appropriation. Dean Lohr Sapp’s commitment to the ideal of each-religion-in-its-own-particular-way runs smack against the reality that the “particularities” of many religions are mutually exclusive.  And few religions are content to subordinate themselves to the supposedly higher principle of “religious pluralism.”

The Multiculturalist View of Religion

A hard problem—and a very old one—lurks beneath Lohr Sapp’s glib formulations. The plurality of religious beliefs in America requires of us all a certain disciplined abstinence. We cannot always be actively minding one another’s religious commitments. Quite often we need to get out of the way.  We have two formulations of this respect for religious diversity that are older than multiculturalism. One approach now out of favor was religious “tolerance.” To tolerate, of course, is to suppress something:  dislike, aggravation, hostility. The tolerant do not refrain from judgment. They judge negatively but keep their judgment to themselves. But, by judging at all, the tolerant are guilty in the eyes of multiculturalists of assuming unwarranted superiority.  Multiculturalists don’t tolerate difference; they celebrate it.

The other old approach to religious plurality was to create a secular public order that legally curtailed the exercise of religious authority by one community of faith over any other. The boundaries of this secular order were never defined once and for all and continue to provide grist for litigation and judicial interpretation. This month, the Supreme Court handed down a unanimous decision in Holt v. Hobbs, granting on religious liberty grounds a Muslim convict the right to wear a beard in prison. Despite the many twists and turns in the law, the broad principle is clear:  no particular religion could be allowed to trump the liberty of adherents of other religions (or no religion). The courts have constantly recalibrated the balance between the rights of people to uphold their beliefs and the rights of others to be left relatively unhindered by those beliefs, and the balance between religious liberty and the rule of law.

Contemporary multiculturalism has some sympathy with secularism, but it is not so much focused on maintaining a public square swept free of religion as it is concerned with the cultural battles over which religions should be encouraged.  Multiculturalists would like to diminish what they regard as the undue privileges of the older, more established traditions in the United States, and they would like to enhance instead the recognition and esteem of less established religions.  Behind the slogan, “commitment to religious pluralism,” lies a thinly disguised distaste for traditional forms of Christianity and Judaism.  Versions of Christianity and Judaism that conform themselves to the prescriptions and anathemas of multiculturalism are welcome. But, ironically, those that stick to their older “particularities and practices” encounter a great deal of static from the guardians of the new dispensation.

Duke’s short-lived effort to amplify the Muslim call to prayer on Friday afternoons was correctly understood by critics as an instance of this underlying animus. 

Varieties of Religious Experience

Islam is of special interest to many in the campus left because it stands in opposition to Western civilization.  It is perfectly true that many Muslims in America are “peaceful and prayerful,” and many others are indifferent to jihad and Islam’s more bloodthirsty versions. But the irenic side of Islam isn’t really what prompts the assiduous efforts of people like Dean Lohr Sapp to draw attention to Muslim worship.  Rather, they are engaged in a game of leapfrog.  Radical Islamists commit an atrocity in the name of Islam; large numbers of Westerners respond with revulsion; and the stage is set for a drama in which the multiculturalists blame the West for “Islamophobia.”  One dramatic way to drive that point home is to showcase Islam from the Gothic tower.

The reactions from evangelical Christians were predictable, though it isn’t quite clear that Duke officials actually predicted them.  The officials may be sufficiently cocooned in their subculture as to have had no real sense of how provocative their innovation would be within days of the Paris massacre.  In any case, once Duke decided to cancel the call to prayer, the door was open to those whose preferred narrative is that the Islamophobic Christians had once again mistreated peaceful Muslims.  “An Old Bias Found a New Target at Duke U.,” headlined an essay in the Chronicle of Higher Education by Eboo Patel, founder of the Interfaith Youth Core.  Patel compared the evangelical response to Duke’s call to prayer to the evangelical prejudice against Catholics when John F. Kennedy was a candidate for president.  Patel also argued that American Muslims are unfairly asked “to answer for Muslim extremists.”  Such questions are “denigrating” and violate the “dignity” of Muslims.

Franklin Graham’s and Eboo Patel’s postings will have to stand in for the 600,000 or so articles and reports that have so far been published about this affair.  Clearly we have had another skirmish in the never-ending culture war.  Duke’s official climb down from the minaret of multiculturalism took the form of the university PR flak, Michael Schoenfeld, reassuring everyone, “Duke remains committed to fostering an inclusive, tolerant and welcoming campus for all of its students. However, it was clear that what was conceived as an effort to unify was not having the intended effect.”

“Conceived as an effort to unify” is an odd way to put it.  Setting up the Muslim call to prayer from the Christian bell tower was an effort to rally support for a particular religious-cum-political view. It is the view that, deep down, all religions, rightly understood, are legitimate expressions of the same peaceful aspirations.  That isn’t really a “unifying” view because it is intended from the outset to emphasize the distinction between those who embrace the reductionist idea and those who stick with the claims to exceptionality for their own faiths.

We need to distinguish between the fact of our pluralist society, which calls for tolerance and the rule of law, and the ideology of multiculturalism that commands us to pay deference to a self-evidently false proposition: that all religions say more or less the same thing. In his comment on the Duke affair, Shio Chong, a campus minister at York University in Toronto, warned us away from the “simplistic” view that this is a matter of fundamentalists facing off against “relativist, postmodern pluralists.”  Chong explained, “Duke University has confused syncretism with hospitality, while Franklin Graham and his ilk have confused hostility with defending the faith.” 

“Immediate Global Repercussions”

Is it too much to expect our universities, with their deep roots in the systematic study of world religions, to approach these matters with some level of cultural sophistication?  Maybe. The dean of the Duke Divinity School, Richard Hays, declared that he had not been consulted and was taken by surprise by the university’s decision to authorize the Muslim call to prayer from the chapel tower.  He wrote, “Any decision to permit the use of a prominent Christian place of worship as a minaret for Muslim proclamation will, in our time, have immediate global repercussions. Any discussion about such a proposal should take into careful account the perspective of millions of Christians living in Islamic societies where their faith is prohibited or persecuted.”

In short, Duke acted without even consulting some of its own scholars who might have tempered the university’s haste towards a clumsy form of hospitality towards campus Muslims. Some sense of how this message would be received by the broader community was clearly lacking.  Duke’s eagerness to line up in solidarity with the small Muslim community on campus and to conform to the multiculturalist orthodoxies of the moment is, unfortunately, pretty much what we have come to expect from universities that no longer take the time to think.

Ferguson and the Decline in Anthropology

As examples of what my academic field, anthropology, has sunk to, here are four responses to the shooting and riots in Ferguson appearing in the current issue of Anthropology News.  Each is a retelling of what might be called the left’s canonical myth of Ferguson: facts submerged in a sea of fiction.

Pem Davidson Buck, a faculty member at Elizabethtown Community and Technical College, writes in “The Violence of the Status Quo” that the importance of Ferguson is that the events there “make it impossible for the rest of the country” to ignore the violence with which white people routinely oppress blacks.  That violence is typically ignored, but it is pervasive because it is “a violence that is critical in maintaining the privilege that accompanies whiteness.”  “Continuous low-level violence [is] required to maintain inequality.”  The people of Ferguson, according to Buck, refused to accept this status quo and in so doing have “torn off the mask that hides these truths.”  Buck has a lot to say about masks, and mask-removal turns out to be what anthropology is good for as well:  “Anthropology can furnish analysis of the state, of the use of force, of whiteness, of structural inequality, segmented labor forces, and structural violence.”

Steven Gregory, professor of anthropology and African-American Studies at Columbia University, writes in “Ferguson and the Right to Black Life” that Michael Brown was “gunned down” in Ferguson for being “a black male walking.”  The restoration of peace in the city means a return to conditions that led to Brown’s death in the first place. “It was this peace and this normalcy that killed Michael Brown.”  Brown had merely been engaged in the “right to assert and defend” his “humanity.”  He was killed because he was among those black people “who had the audacity to comport themselves as if their rights as citizens were inalienable and protected by the full weight of the law.” Gregory’s lessons for anthropology are more specific than Buck’s.  “We must be critical of how discourses of black violence, chaos and criminality are mobilized to delegitimize black resistance while conferring carte blanche to police repression.” But Gregory rises to the larger point as well:  “We must fight this battle with and not against those who agitate for freedom, democracy and human rights.”

Raymond Codrington, anthropologist in residence at the New York Hall of Science, writes in “Ferguson:  An American Story” that, “apparently, a struggle of some kind ensued” after Brown “ignored” Officer Wilson’s order to move out of the street.  But this concession to the facts is immediately followed with the fable that Brown was shot “with his hands and arms raised in the air in surrender” – or so “witnesses state.”  What followed demonstrated “the impact of racism and inequality in this country.” Codrington characterizes Brown’s caught-on-video strong-arm robbery of a convenience store as “shoplifting” that would have been excused as “youthful indiscretion” if done by a white teenager.  Then it is on to the deep analysis:  “the events in Ferguson demonstrate the cumulative impact of structural and individual racism.”  Codrington recommends that anthropologists compare what is happening in the U.S. to treatment of minorities in the UK and Brazil as a step toward “developing strategies and frameworks for dismantling structural disparities.”

Lydia Brassard, a graduate anthropology student at CUNY, and Michael Partis, an instructor in the Center for Ethnic Studies at CUNY, contributed the last of the articles, “Standing Their Ground in #Ferguson.”  They explain “#Ferguson can be used for our anthropology students as a way to analyze the relationship between contemporary power structures and the trajectories of sociopolitical mobilizations over time.”  (“#Ferguson” is the Twitter hashtag used by many of the protesters.)  Brassard and Partis are especially interested in “digital sharing and exchange” as part of the protest movement.  “Digital activism,” they argue, is a way to escape the “hegemonic narratives” of the “hegemonic news outlets.”  They welcome brevity and “sentence fragments,” not least because “rather than nailing down ‘facts,’” they create “the most nuanced landscape of understanding.”

Nuancing the Landscape

I have no reason to think that Buck, Gregory, Codrington, Brassard, and Partis represent fringe views in my discipline.  Their declarations sound entirely within what is now the mainstream.  For instance, I just received an announcement of a meeting in April in New York of the Society for Anthropology of North America.  The topic is “Inequality, Equality and Difference,” and is rooted in the idea that “many have come to doubt the ability of the present social system to produce an equitable, sustainable society.”  North American anthropologists have “a great deal to say about inequality.”  They do, they do.

But is what they say reliable?  Is it true?  And is concern over reliability and truth a matter of indifference to those who treat “facts” as mere encumbrances to deeper truths—“a nuanced landscape of understanding”—or strategies for combating oppression?

The editors of Anthropology News plainly saw no need to present alternative views of what happened at Ferguson, including any views that match with reasonable accuracy the record of events established by the grand jury.  Those matters aren’t even dismissed by the five contributors.  They are simply ignored. The point of all four of the articles is to reaffirm a mythic narrative:  An innocent black teenager was murdered by a white cop in an exercise of the structural violence that is part of America’s system for maintaining racial inequality.  The event stands out in significance only because the people of Ferguson spontaneously rebelled.  We anthropologists can use the murder and the subsequent rebellion to further our own activist agenda aimed at recruiting our students to the larger struggle against inequality.

Every part of this myth deserves to be challenged.  Michael Brown, who had physically attacked Officer Wilson, was neither peaceable nor innocent. Officer Wilson fired his weapon in self-defense.  “Structural violence,” and kindred terms such as “structural racism,” “structural inequality,” and “structural disparities” are intellectually lazy simplifications of complex social circumstances.  The appeal of such phrases is political.  They remove all moral and social responsibility from the actors who are portrayed as the victims of violence, racism, inequality, and disparity.   An anthropology that simply erases the motives of key participants and reduces them to objects acted on by invidious external forces is no anthropology at all.

The only motive attributed to the supposed victims is their heroic decision, at long last, to rebel against their “structural” oppression.   But the five authors seem oblivious to the numerous reports that the protests and the subsequent riots were mainly instigated by activists from outside Ferguson who saw an opportunity to exploit for their own political gain.  (One of the writers, Gregory, alludes to this dismissively by citing a riot in Harlem in 1935 in which police blamed much of the violence on the Young Communist League—which indeed played a major role. But Gregory’s point is ‘don’t blame outsiders.’)

Masks Beneath Masks

The “anthropology” on display in these four articles and in a great many more such declarations is a profound misappropriation of an intellectual discipline.  Anthropology, rightly understood, is an effort to understand human nature through systematic study of those qualities in us that vary in time and place—and those that don’t.  Anthropology looks at how we emerged as a species and how we have diversified into thousands of languages, tribes, and civilizations.  The field became a “discipline” by sternly demanding of itself rigor in how it went about this inquiry.  Mostly that rigor required a steadfast determination to stand outside the myths people tell themselves and, by standing outside, to see things as they really are.

Buck’s references to “tearing away masks” are, in this sense, pertinent.  Anthropology, at its best, does reveal things about human nature that are not easily seen by the people who are busy living their lives.  But therein lies a temptation.  Anthropologists can also adopt an oracular mode in which they present their personal and political preferences as what really lies behind all those masks.  Margaret Mead made a career out of such pseudo-profundity.  She made up stories and grossly misreported ethnography to back up her views—and it all played extremely well with an audience primed to hear what it wanted to hear.

Anthropology, in other words, learned the trick of promoting new myths in the name of demythologizing.  Rip off the mask in order to promote a new mask more in keeping with a different political agenda.  This breeds a great deal of cynicism within the field and a feeling that the masking never stops.  Everything is a mask, and if that is the case, why not devote your effort to the mask you like best?  That’s how devotion to facts and the pursuit of truth withers away.

Today we have anthropologists eager to lend their intellectual authority to the just-so story that America is a nation run by privileged whites determined to maintain their privilege. This is, quite plainly, a myth.  There is nothing in the realm of fact to support it.  But it is, of course, a politically useful myth for those seeking to obtain power and influence by marshalling social resentments.  How much fictionalizing can an academic discipline bear before it altogether loses its credibility?

Marquette’s Reputation at Stake

“Be the difference” is the motto of Marquette University, the generally not-very-newsworthy Jesuit university in Milwaukee.  Marquette is in the news now for reasons that it cannot be very happy about.

First, a teaching assistant at the Catholic institution, Cheryl Abbate, a doctoral student in philosophy, was caught on tape earlier this year giving a very un-Catholic answer to a student who wanted to write about his objections to same-sex marriage in a course titled “Theory of Ethics.”  The student complained to an associate dean and to the chairman of the Philosophy Department, neither of whom saw a cause for concern.  The student then played the recording to a Marquette professor of political science, John McAdams, who, after listening to it, blogged on November 9 about the incident, making some pointed criticisms of Abbate’s refusal to countenance the expression of opinions counter to her own.  The story began to attract significant public attention, including an article in Inside Higher Ed on November 20, which reprised the story and gave links to accounts supporting McAdams’s views and others attacking him.

The story might have ended there, but it was about to take another turn.

Campus Tolerance for Violence

The post-Ferguson and post-Garner racial agitation has led to a wave of violent rhetoric and actual violence in the United States. Street protesters have called for “pigs in blankets,” declared, “Arms up, shoot back,” and chanted, “What do we want? Dead cops.  When do we want it? Now.” This rhetoric has campus amplifiers.

Is the infatuation with violence among some professors connected to the violence in the streets?  At some level, the two are certainly linked.  Let’s start with some of the immediate connections.

Since the 1960s, American higher education has played host to a small segment of faculty members and students who advocate for violent resistance to the rule of law.

Gordon Barnes, editor-in-chief of The Advocate, the CUNY graduate student newspaper, published an editorial, “In Support of Violence,” in fall 2014.  It is accompanied by the newspaper’s logo, a giant “A” in a circle, the symbol of the international anarchist movement.  Barnes attacks “the anti-democratic nature of the grand jury process” and declares, “The time for peace has passed; indeed it never existed in this country.”  Barnes’ prescription is for the protesters to shift from violence against property to violence against the police:  “Violence directed towards state representatives is not only warranted, it is necessary.”

Eric Linsker, an adjunct professor of English composition at CUNY, was arrested on December 13 after he carried a large garbage can onto a walkway on the Brooklyn Bridge, apparently in an effort to drop it on the heads of police officers below.  Ordered by the police to put it down, Linsker fled the scene, dropping his backpack, which contained two hammers and, among other things, his CUNY ID.

Cindy Gorn and Zachary Campbell were among the academics arrested for assaulting police on the Brooklyn Bridge in an effort to help Linsker escape. Gorn is a graduate student at Columbia University, where she is also a lecturer in environmental health sciences in the Mailman School of Public Health. Her “areas of work” are “geography from the perspective of Marxist philosophy, social movements, autonomous labor movements, health, and the environment.” Campbell is a Spanish instructor at Rutgers. In the melee, one police officer had his nose broken. The police posted photographs of the suspects and offered a $12,000 reward.

On December 20, Ismaaiyl Brinsley shot and killed police officers Wenjian Liu and Rafael Ramos while they sat in their marked police car in Brooklyn. Brinsley, as we now know, had declared his intent to murder police as an act of retaliation for the deaths of Eric Garner and Michael Brown. Brinsley was a career criminal: 15 arrests in Georgia ranging from assault to grand larceny; four arrests in Ohio, including robbery; and multiple jail sentences, including two years for criminal possession of a weapon. Brinsley’s Instagram postings declaring his determination to kill police did not require Gordon Barnes or Eric Linsker for inspiration.

“I’m going to put wings on pigs today. They take 1 of ours…Let’s take 2 of theirs #ShootThePolice #RIPErivGardner [sic] #RIPMikeBrown This may be my final post.”

The idea of killing police—any police—as retaliation for the deaths of Michael Brown and Eric Garner was reverberating so loudly in street protests, on the internet, and in conversations that Brinsley would have heard it many times from many people.  How and exactly when Brinsley decided to act on the exhortation we will probably never know.

The Trayvon Martin Organizing Committee (TMOC), however, has played an especially large role in the protests that call for violence. Whether TMOC itself has incited violence against police seems doubtful. The organizers probably know better, and TMOC’s webpages, though adorned with images of street fires, masked men throwing Molotov cocktails, and flaming American flags, steer clear of specific instructions.  The “About” page for TMOC simply declares, “There is no justice here, only us and the future we seize.”   Individuals who post to the TMOC Facebook page are less inhibited and frequently express their relish for violence.  And TMOC has at least some academic support.

George Ciccariello-Maher, a professor of political theory, has called for people to donate to TMOC. He is also the author of an essay, “The Dialectics of Standing One’s Ground,” in which he draws on the work of radical philosopher Frantz Fanon to argue that the proper response to “white supremacy” is “a voluntarism of sorts to push history into motion.” That’s opaque, but in context Ciccariello-Maher clearly means to endorse the use of violence, as in Trayvon Martin’s violent attack on George Zimmerman.

Aaron Samuel Breslow, a doctoral candidate at Teachers College, was an active participant in at least one TMOC event—judging by his on-scene photographs and his tweet asking his friends to join him in the protests.  Breslow has recently been tweeting frequently on “Black Lives Matter” themes and “Justice for Michael Brown, Eric Garner, and All Victims of Killer Cops!” He self-identifies as gay, has spent “ten years” in HIV prevention and care, and professes “a fierce passion for social justice grounded in queer theory, feminism, and anti-racist ally activism.”

His doctoral work at Teachers College has led to his being part of “Dr. Brewster’s Marginalization, Mental Health, and Empowerment Team.” None of this suggests that Breslow is complicit in calls to violence.  Rather, he appears to be the sort of fellow caught up in ideological efflorescence.  As of this writing, he hasn’t posted his thoughts on whether the protests that he previously extolled bear any moral responsibility for the murders of officers Liu and Ramos.

TMOC isn’t alone in staking out extreme, pro-violence positions. Noel Jackson is an MIT professor of literature with no connection to TMOC who unleashed a steady barrage of rage on his Twitter account after the shooting of Michael Brown in August. Among his declarations was his hope to slice the “weak guts of white supremacy” with a “[Expletive deleted] SCALPEL.” The university, concerned for his mental health, “forcibly hospitalized” Jackson for his unhinged response.

This list by no means exhausts the connections between campus supporters of violent protest and the actual protests, but it is enough to show the links don’t have to be guessed at. They are there to be seen.  Some of the connections take the form of forceful declarations, such as Barnes’ front-page “In Support of Violence” editorial.  Others take the form of heedless enthusiasm from individuals who have no sense of where this leads.  The harder questions are what these connections really mean and how deep they run.

Professing Violence

It would be comforting to think of campus figures who profess violence as an inconsequential fringe with no real influence over education. That might account for figures such as Ward Churchill, the University of Colorado Boulder ethnic studies professor who rose to infamy for applauding the 9/11 attack on the World Trade Center and was subsequently fired for academic misconduct. But it probably misses the mark on figures such as Bill Ayers, who have had undeniably successful academic careers.

Which raises questions.  How does it happen that American higher education provides cosseted professional careers to so many who disdain the basic conditions of free inquiry on which colleges and universities depend?  Those conditions surely include settling disputes through reason and peaceful political process.  Why does the academic world so often turn a blind eye to advocates of unlawful violence?

Part of the answer, of course, is that most academics regard it as none of their business.  They are in their labs, writing their articles and teaching their classes, which have nothing to do with the radical politics of a few of their colleagues.  It might also be true that ignoring the antics of some proponents of crazy ideologies is a way of marginalizing those ideologies.

But such comforting thoughts seem increasingly ill-founded. Those who condone or actually encourage violence of various sorts are more common on campus than many realize.

Violence against Property

Instead it appears that the established authorities on the nation’s campuses help to sustain the conditions under which advocates of violence promote their views.  We should distinguish, of course, between advocacy per se—the use of words, ideas, and images—and actual acts of violence. The latter are rare enough that they stand out.  At the University of Virginia, five men and three women incited by the Rolling Stone article about an alleged gang rape, and by the university’s apparent endorsement of the article’s claims, vandalized the Phi Kappa Psi fraternity house.

No one was injured, but the attack, which included throwing bricks through the windows, was sufficiently violent that residents could have been.  The perpetrators are known to many in the community.  One even granted an interview. There were witnesses.  It wouldn’t take Philip Marlowe to track them down.   But so far the University of Virginia has not pressed charges or, apparently, taken any action at all to identify the perpetrators.

Let’s get this into perspective. Teresa Sullivan, the president of the University of Virginia, felt compelled by the publication of an unsubstantiated article in Rolling Stone to suspend all fraternities until January 9.  When it emerged that the magazine story was false and that, in all likelihood, no rape had taken place, President Sullivan left the collective punishment for a fictitious crime in place. But faced with the real crime of serious vandalism against a fraternity that had been falsely accused, and having the opportunity at hand to charge the culprits, President Sullivan decided to take no action.

Passivity in the face of attacks on property that are made in the name of some progressive cause appears all too often at colleges and universities. Oberlin knew the names of the real perpetrators (Dylan Bleier and Matt Alden) who had scrawled racist graffiti on public walls in February and March 2013.  Bleier and Alden had gone on their month-long spree not out of racism but as a consciousness-raising exercise for the college, which they hoped would react forcefully.  Even though Oberlin caught them and understood their motives (and quietly expelled them), the college went ahead with what my colleague Ashley Thorne has called a “staged emergency” involving cancelled classes and a day-long teach-in by the Africana Studies Department.  Only five months later, when the Daily Caller broke the story, did Oberlin admit that the racist provocation was a hoax—though the vandalism was real.

Campus “hoax crimes” can get perpetrators in trouble, but usually not very much. Colleges do sometimes rouse themselves to action in the face of politically motivated vandalism. For example, on September 11, 2013, protesters at Middlebury College vandalized a memorial to victims of 9/11, removing 3,000 small American flags. The protesters said they were protesting “American imperialism.” The president of Middlebury condemned the act, and the one protester identified as a Middlebury student was suspended for a year.

These stories, however, do not always play out so neatly. When students at DePaul University in 2013 put up a flag display on the 40th anniversary of Roe v. Wade, the flags were torn down by vandals and thrown in trash cans. The university investigated and identified the perpetrators. Young Americans for Freedom obtained the report and posted it. DePaul then charged the student head of Young Americans for Freedom with “disorderly, violent, intimidating or dangerous behavior to self or others” for releasing the names of the vandals, and convicted him.

College authorities may decide not to react too strongly to low-level lawlessness on campus for fear of inflaming a situation. Such an approach contrasts rather strongly with the “broken-windows” theory of keeping public order. Instead of heading off escalations of lawlessness by responding swiftly to relatively minor transgressions, it offers a kind of amnesty.


Bill Ayers gave up calling for mayhem when he transitioned from Weather Underground terrorist to University of Illinois professor of education, but in his memoir, Fugitive Days (published September 10, 2001), Ayers offered a generally unapologetic account of his prior commitment to political violence.  His oft-quoted remarks to a New York Times reporter on the book’s publication included “I don’t regret setting bombs,” “we didn’t do enough,” and that he “would do it all again.”

Ayers, unlike Churchill, had a long and substantial career in higher education, and is now a professor emeritus.  He partnered with the young Barack Obama for the Chicago Annenberg Challenge and the Woods Fund of Chicago, where they collaborated to help fund ACORN. We don’t spend much time these days remembering President Obama’s ideological roots, but for those who want to refresh themselves on the details, Stanley Kurtz’s Radical-in-Chief (2010) is the indispensable scholarly source.  Kurtz’s excavation of the archives, however, is only one of many strands of evidence that Ayers has to be reckoned as a fairly influential figure in modern American higher education.

And if we go looking for influential campus figures who saw practical and legitimate uses for political violence, Ayers is far from alone. The list includes Noam Chomsky and the late Edward Said, both of whom have served as apologists for international terrorism.  Generally they reserve the word “violence” for the activities of states attempting to resist the terrorists. Chomsky’s 2013 “Edward Said Lecture” was titled “Violence and Dignity:  Reflections on the Middle East.” Said’s own contributions included essays such as “Identity, Negation and Violence” (1988). Chomsky’s and Said’s writings have spurred a whole industry of academic writing that might be called rationalizations for anti-state violence. The flavor of this scholarship is suggested by Jeffrey Guhin and Jonathan Wyrtzen’s “The Violences of Knowledge:  Edward Said, Sociology, and Post-Orientalist Reflexivity” (2013).  Guhin and Wyrtzen do not position themselves as advocates of violence. They simply call for sociologists to produce work “that does not consolidate state power but rather works within civil society to lessen suffering, increase freedom, and make lives more meaningful.” But the effect of treating the state as the only significant perpetrator of violence is to minimize all the other forms of non-state violence.

For decades schools of education, along with some other parts of the academy, have made Paulo Freire’s Pedagogy of the Oppressed (1970) a central required text.  Freire famously exculpated the violence employed by the left as a proper response to the “institutional” violence of the state: “Violence is initiated by those who oppress, who exploit, who fail to recognize others as persons—not by those who are oppressed, exploited, and unrecognized” (Chapter 1).  Where there is Freire, Fanon—whom Wikipedia describes as “an existentialist humanist concerning the psychopathology of colonization”—is often not far behind.  Frantz Fanon’s Wretched of the Earth (1961) no less famously called for violence in “decolonization,” both as a practical tool and as a form of “rehabilitation” for the oppressed:  “violence alone, violence committed by the people, violence organized and educated by its leaders, makes it possible for the masses to understand social truths and gives the key to them” (p. 96).

“At the level of the individuals, violence is a cleansing force. It rids the colonized of their inferiority complex, of their passive and despairing attitude. It emboldens them, and restores their self-confidence” (p. 51). It should be no surprise that Fanon shows up frequently in the postings of the Trayvon Martin Organizing Committee, or that one of the committee’s original organizers, Taryn Jordan, wrote her Women’s Studies thesis at Georgia State University (“The Politics of Impossibility: CeCe McDonald and Trayvon Martin—The Bursting of Black Rage”) as an application of Fanon’s theories.  “Bursting” in rage is one of Fanon’s key concepts.

The subculture of the university that has been awash in these radical exhortations to violence, and justifications for it, has created a context in which the theme can flourish and in which at least a few can be moved to go beyond theory to practice.

Owning It

We pay a large price as a society for the politicization of our colleges and universities.  One part of that price is that far too many institutions have lost sight of basic principles. Academic freedom permits many things but it is not a legitimate pretext for advocating violent lawlessness.  Colleges and universities exist within a civil order.  They have an obligation to foster the basic conditions of a free society.

These days, colleges seem eager to impose on themselves onerous sets of rules meant to govern in fine detail how people can talk with one another, how sexual interactions should proceed, what kind of container can be used to drink water, and how often a toilet may be flushed.  No function of life in the community is too small to be regulated and no personal freedom is so important that it can be exempt from collective control.  Yet these campuses that are in the midst of embracing total social control over their members are also places where people extol racial violence and other forms of extreme lawlessness to be imposed on the rest of society.  It is one of the great ironies of contemporary academe that the same phrase is employed to justify the regime of petty tyranny on campus and murderous anarchy off-campus.  Both are pursued in the name of “social justice.”

Indulging in murderous fantasies à la Frantz Fanon and Bill Ayers may seem harmless when the killing was long ago or far away, but when we have mobs in the street with professors mingling among them or grad students inciting them, we need to pause. Higher education needs to own its share of responsibility for fostering both the bitter divisions we are witnessing and the resort to violence that has become part of that division.

How Data-Mining Hurts Higher Education

Many colleges and universities have adopted data-mining to improve student retention and to channel students to courses and programs that the institutions judge most appropriate.  The future of higher education is here and it is, in spirit, benignly totalitarian.  Goldie Blumenstyk, writing about it in the New York Times (“Blowing Off Class?  We Know”), emphasizes the benign part.  Colleges, in her view, have found a tool that can “cut the cost of higher education for students and their families, as well as for taxpayers.”  And data-mining, she says, is “showing many of the most tangible results” among the innovations aimed at solving “the higher education crisis in America.”

The totalitarian side of “big data” is, in Blumenstyk’s phrasing, a rhetorical question, “Big Brother-esque?” and a passing shadow, as it “might also be tinged with a little paternalism.”


Libertarians vs. Progressives: The New Campus Divide

American college students these days seem to divide by moral temperament into two very different cohorts.  On one hand, a large number of students prize the freedom to do and say what they want and deeply dislike the constraints of external authority.  On the other hand, a large number of students prize social control as an instrument of justice and are deeply committed to a regime of close regulation of their fellow students’ behavior.  While there are, of course, students who fall into neither camp, they are relatively few and not very conspicuous.  The outliers include adherents to traditionalist religious groups, self-professed conservative students, and students single-mindedly focused on career preparation.

Of the two large and conspicuous cohorts, one may be thought of as “libertarian” and the other as “progressive,” but these labels are only approximations and in some respects inaccurate.  They suggest more ideological clarity than either cohort really possesses.  The libertarian side includes a large contingent of young people who aren’t so much checked in to libertarian ideology as checked out of larger social and political issues.  And the new generation of campus progressives is more radically anti-freedom than its predecessors and a lot more willing to forgo the search for knowledge for the excitements of immediate power.

To say that these cohorts are in stark opposition is also misleading.  They do diverge, but they are rooted in the same soil.  On matters such as the legalization of marijuana, same-sex marriage, acceptance of sexual minority lifestyles, and—more abstractly—personal autonomy, they are almost indistinguishable.  And this makes it difficult for observers to see the real differences between the two cohorts.

A Libertarian Moment?

It has been widely reported that American college students are more libertarian than ever before.  The observation is supported to some degree by polls, which are suggestive, but it seems no one has systematically tackled the question of just how many college students fit into the libertarian-leaning category.  Depending on the diagnostic question (marijuana?  same-sex marriage?), the percentage might be 70 to 80 percent of college students.  But “leaning” doesn’t mean “all in.”  The hard core of the libertarian movement on campus is probably better captured by the students who join libertarian organizations.  The two main ones are Students for Liberty and Young Americans for Liberty, which grew out of Students for Ron Paul.  Both are growing rapidly.  Young Americans for Liberty boasted in February, “Since our founding just five years ago, our network has ballooned to include more than 500 chapters; 7,000 dues-paying members; and 162,000 activists.” Students for Liberty, by contrast, explains that it “does not utilize a strict membership model,” but operates as something “more akin to a coalition.”

It is easy to see why, when it comes to gauging the scale of libertarian attachment among college students, it is hard to come up with solid numbers.  That said, the phenomenon itself is plain.  Step onto almost any college campus and start striking up conversations, and you will soon find yourself among people whose views are strongly flavored with “small ‘l’ libertarian” views and some who are full-on supporters of the movement.

Blue Jays

Most seem to be drawn to the lamp of freedom to make their own lifestyle choices unhindered by laws meant to restrict the options.  That seems to be what “libertarian-ish” and “small ‘l’ libertarian” mean.  These are students who have a strong sense of personal autonomy, but not necessarily an ardent admiration for the magic of free markets, a swelling interest in the ideas of John Locke, a burgeoning enthusiasm for the views of Murray Rothbard or Ayn Rand, or a keen appreciation for the anarchist gurus of the Occupy movement.  The large “L” Libertarians are there too, and are to be seen staging colorful events such as YAL-oween at the University of North Carolina at Chapel Hill and the “graveyard of civil rights” at Elizabethtown College and numerous other campuses.

The small “l” libertarians, however, are larger in number and more indicative of the spirit of the times.  These folks are “libertarian” today in the way that blue jays are bossy. They simply want what they want. Relatively few are prepared to offer a cogent explanation of their views, and fewer still can summon meaningful arguments against contrary positions. Those are indications, of course, of how liberal education has been hollowed out in the colleges.  Students pick up the campus zeitgeist easily enough; attitudes and values come across; but the kinds of reading and instruction that would equip students to make meaningful sense of the larger debates about civic order in our self-governing republic have become rare.

The rise of this blue jay libertarianism cross-cuts and contradicts several other trends.  Our colleges more than ever preach the importance of “critical thinking.”  But the blue jays aren’t really interested in pecking out hidden assumptions or weighing alternative views.  They are comfortable with their certainties.  A student at Williams College recently told me that in the college debate league, everything is open for debate except legalization of marijuana and same-sex marriage.  Those subjects aren’t debatable because “those matters are settled.”

Conformity of Opinion

Where else have we heard that?  “Settled science” is, of course, one of the rhetorical sledgehammers wielded by those who argue that man-made global warming is upon us and that the time for debate is over. We now seem to be hearing that more matters of social policy as well as science are settled, not in the sense that we have arrived at answers that convince all reasonable people, but in the sense of something like, ‘We will make life difficult for you if you express disagreement.’

Such threats are sometimes explicit but often merely implied, and they succeed to a surprising degree in enforcing conformity of opinion.  But they reveal a fracture in the small “l” libertarian view of things.  How can those who elevate the principles of volition and personal autonomy as the highest good of the social order make peace with a regime based on exclusion of dissent and stigmatization of dissenters?

One possible answer is that the blue jays are less “libertarian” than they have been given credit for.  Mark Bauerlein, in his essay following the midterm elections, “Are Democrats Losing the Youth Vote?,” offers such an assessment.  Observing the erosion of support for Democrats among voters under age 30, he notes that young people are “just as liberal as ever on social issues,” such as same-sex marriage and legalization of marijuana, and “more likely than older generations to say they support activist government,” but that they have adopted “a laissez-faire posture in moral and private matters,” which Bauerlein calls “a soft libertarianism that makes individual preference king.”  This attitude “blunts lasting commitments to any political organization” and breeds “mistrust of institutions of any kind.”  He sees in this “a new kind of constituency, fluctuating and unpredictable,” socially liberal but subject to gusts of enthusiasm for other ideas and the candidates who embody them.

The blue jays in this view are more like leaves in the wind than principled supporters of anything in particular. What’s “settled” at one moment might be unsettled the next.  Bauerlein’s strongest points are his emphasis on the elevation of “individual preference” and the broad mistrust of institutions.  These look like some form of libertarianism, except that they veer away from ideas about individual rights, including the rights of ownership, and they fail to touch the ideals of “spontaneous order” that are central to the libertarian vision.  “Preference” is to “right” as sand is to bedrock:  not a very good foundation.  And “mistrust” is to “spontaneous order” as dissolving is to crystallization:  not much of a positive outcome.

Progressive Hostility to Individual Rights

Blue jay libertarianism would appear to be at odds with the attitude of the other great cohort of students today:  the “progressive” students who favor harsh impositions on speech and behavior to achieve greater “social justice.”  The spirit of the censor has been welcomed back to campus.  It has three distinct manifestations:  the impulse to keep representatives of disfavored views off campus; the readiness to label some speech as so intrinsically dangerous that it should be accompanied with “trigger warnings”; and the determination to create a version of lynch law for students accused of sexual assault.

The contemporary college campus looks like a pancake swimming in the syrup of politicized sensitivity.  Intellectual exchange and intelligible debate are flattened and then drenched in self-congratulatory rhetoric about suppressing hate speech, protecting the “victims,” and, all the while, advancing “critical thinking.” The censors are primed to jump on anything that suggests deviation from the approved positions on race, sexual preference, and feminist doctrine.

This side of the picture of campus life today is too well publicized to need much elaboration.  We can pick the incident of the week to capture it.  Recently it was a group of students at Dartmouth assaulting Texas governor Rick Perry with scabrous questions focused on anal sex.  The political point was to dramatize the students’ disagreement with Perry’s views on homosexuality.  The tactic was to destroy the “decorum” of the occasion.  At least that’s what one of the perpetrators, Emily Sellers, explained in her article in the Dartmouth student newspaper.  What’s most striking about Ms. Sellers’ defense is the utter absence of contrition and her explicit rejection of “decorum” as an ideal.  She and her fellow activists set out to destroy the possibility of civil exchange, and having more or less succeeded, retired to bask in their victory.  Offenders against the rules of civilized discourse on campus are generally proud of their offenses, which they conceive of as hammer strikes against patriarchy, heteronormativity, petrol-capitalism, or whatever.

Non-aggression Pact

The differences between the libertarians and the progressives on free speech are stark and look made to order for conflict.  The libertarians on campus treat individual rights as the bedrock of liberty.  The progressives are at war with individual rights as an obstacle to achieving their version of social justice.  This applies to dozens of progressive causes.  Due process and the presumption of innocence for men accused of an ever-expanding definition of “sexual assault”?—gone.  Individual property rights for those who might decide to use their land or their possessions in a way that impedes the sustainability agenda?—doubtful.  The free expression of religious beliefs if these views derogate homosexuality or sex outside marriage?—banished.  The free expression of ideas that cut against progressive views?—obsolete.

All these instances would seem to provide a clarion call for the libertarians to turn out in defense of individual rights.  But the silence is deafening. There is a very large gap between the energy and commitment of the libertarian just-relax cohort and the progressive social justice cohort.  The latter are marching around campus carrying mattresses on their backs to protest “rapes.” The rapes may or may not be instances of coercion, since the favored definition has been expanded by the protesters to include sexual encounters in which both parties eagerly proceeded, but in which the woman failed to provide “affirmative consent,” in the form of explicit, positive, unambiguous words.

One might think the libertarian-ish cohort would be appalled by this rejection of personal responsibility on the part of women who freely participated in these sexual encounters.  A friend suggested to me that the ennui of the libertarian students might reflect their sense that the college would side with the progressives anyway, so why bother?  But that seems less an explanation than a description of how deep their apathy goes.  Faced with the growing power of moral scolds, outraged defenders of psychological “safety,” avengers of heterosexual male lust, and histrionic dramatizers of identity-group grievance, the great cohort of student libertarians can barely rouse themselves to answer any of this.  The burden of the work is carried by outside organizations such as Reason and the Foundation for Individual Rights in Education.

My own political views are better classified as conservative than libertarian, but I have sympathies with libertarians who take a principled stand against the race-class-gender ideologues who now dominate the positions of power on most campuses.  Why, then, has the movement proven so toothless in confrontation with those who are determined to supplant individual liberties with an authoritarian regime based on group identity?

Narrowing Rights

One possibility is that knowledge of civil rights has fallen to the point that many libertarian-ish students simply don’t understand that their “rights” extend beyond the realm of what they might want to ingest and whom they want to have sex with.  The right to confront your accuser; the right to see the evidence against you; the right to legal representation; and the right to a fair trial might warrant a place in the list of individual rights students should care about.

At a deeper level, students ought to demand an education that is as free as possible from political and ideological impositions.  At public universities, this might be phrased as a “right,” but at all colleges and universities it is the foundation of genuine liberal education.  And because it is centered on the ideal of the individual mind in pursuit of truth through critical examination of competing views, it is an understanding of higher education that should have profound appeal to both libertarians and the libertarian-ish fringe.

The passivity of this cohort when faced with a hard-core challenge by those intent on replacing liberal education with illiberal social control is, in that sense, a troubling mystery.  One way to resolve it is to conclude that the “libertarian moment” in higher education is mostly an illusion. Is it possible that the small “l” libertarians are themselves not really libertarian at all?  Could they be simply the crowd that follows where the progressives lead?

I’d like to think that there is more heft to the libertarian moment than that, and it simply hasn’t yet found its leaders or its voice.  The history of confrontations between the party of liberty and the party of social control, however, is marked with many occasions where the champions of liberty came to the field with too little too late. Cromwell wins. The maypole of Merrymount is razed.  Prohibition wins.  Eventually people tire of Puritanical control and restore some ordered liberty.  But the wait is long and very unpleasant.

Common Core–The Elites Did It

At the invitation of the Alabama chapter of Eagle Forum—Phyllis Schlafly’s pro-family conservative organization—I flew to Birmingham last week to give a talk on the Common Core K-12 State Standards.  Alabama was one of only a few states I had never set foot in.  When I mentioned that to an elderly gentleman I met at the event, he returned the volley: I was the first person he had ever met who had not been to Alabama.

I am not entirely lacking in Alabama credentials.  My Wisconsin-born father took his undergraduate degree from the University of Alabama in 1937. America is a big place but we move around and connect with one another, at least if we want to.  Sometimes, of course, we don’t.

Cook’s Tour

The day before I arrived, Apple CEO Tim Cook gave an address in Montgomery on behalf of the new inductees into the Alabama Academy of Honor at the State Capitol.  The state governor, Robert Bentley, greeted him by pulling two iPhones from his pockets and Cook received a rousing welcome.  But then Cook, who grew up in the state, changed the tone of the event.  He lambasted Alabama: “As a state, we took too long to take steps toward equality. We were too slow on equality for African-Americans. We were too slow on interracial marriage, and we are still too slow for the equality for the LGBT community.”

The speech now appears to have been Cook’s warm-up to his announcement later in the week that he is “proud to be gay.”  He made his declaration in an essay published in Bloomberg Businessweek.  It surprised few but the New York Times treated it as major news and ran a follow-up front-page story weighing its cultural significance.

Be that as it may, the folks in Birmingham I talked with felt stepped on.  Alabamians don’t seem to care what Cook’s sexual preferences are, but they clearly didn’t like having their honorific ceremony re-purposed as a demonstration of how far above his local roots Cook has risen.

Standard Gauge

I was in town to talk about the Common Core, not Tim Cook, but several connections became clear.  Both matters entail a stand-off between national elites and inhabitants of Middle America.  Both involve rich and powerful people denigrating the views of others who are generally marginalized. And both center on questions about standards.

The core question of the Common Core is who should set standards.  “Who decides?” is at least as important as the question of what the standards should be. The answer to “Who decides?” in fact determines what we mean by “standards”–a word which has surprisingly diverse meanings.  The “standard” of a standard-gauge railroad, for example, is a fixed convention that has no intrinsic meaning.  It just ensures that the same tracks can carry a freight train from Newark to Los Angeles.  We have a lot of standards of this nature:  time zones, and weights and measures among them.

The “standard” when we speak of a “standard treatment” for an illness is something else.  It means something like the generally approved practice that we follow because we know it has pretty good odds of working.  The standard in this case is something we feel free to depart from depending on circumstances.  Maybe something better than the standard treatment will benefit a particular patient.

The “standard” we have in mind can vary in lots of other ways too.  The “standards” with which we judge performances in the arts are nothing like the “standards” we apply to behavior at the bus stop.  Standards may be intellectual, moral, aesthetic, scientific, legal, or medical; rooted in social convention or derived from deep principle; conceived as universally applicable or locally variable; and so on.

Once you begin to reflect on the breadth of the idea, it makes no real sense to ask, “Are you in favor of standards?”  The question always resolves into “Whose standards and for what purposes?”  When it comes to the Common Core K-12 State Standards, these are the key questions.  Educational standards, at least at first pass, should be something more like “standard treatment” standards than “railroad gauge” standards.  They ought to represent our evolving approximations of best practices, not conventions determined for the sake of uniformity.  But it isn’t clear that the proponents of the Common Core see it that way.  Uniformity ranks very high in the list of desiderata for many of its advocates, who seem at times to view the sheer variety of America as a danger to national prosperity and even dignity.

The Common Core is a top-down affair. It arrived with a veneer of being an initiative led by the states.  It wears that veneer proudly in its name: The Common Core K-12 State Standards.  But scratch the 1/64 inch maple burl surface and it is all federal particleboard underneath. We have the Common Core in 40-some states because the Obama administration devoted $4.35 billion of the stimulus package to the Race to the Top, which offered major financial incentives to the states to sign on to Common Core before it was even written.

A National Elite

Defenders of the Common Core like to look back to the period before President Obama made the cash offer to the states to sign on.  It is indeed true that the Common Core was rooted in developments that preceded the Obama administration’s enthusiasm for it.

In 2008, the National Governors Association (NGA)—a private body, not an open forum for sitting governors—adopted the idea of the Common Core.  It worked with the Council of Chief State School Officers (CCSSO) and its own daughter organization, Achieve, to develop the idea of voluntary state standards.  Before NGA and CCSSO took it up, the Common Core was the 2007 brainchild of three individuals who were the principals of a non-profit called Student Achievement Partners (SAP):  David Coleman, Jason Zimba, and Susan Pimentel.

The names and the acronyms swirl around.  It is hard for anyone who isn’t bound and determined to follow this trail to keep the details straight.  The players also move.  In 2012 David Coleman became president of the College Board, where he pledged to “align” the SATs and the advanced placement tests with the Common Core.  Susan Pimentel is now an “educational consultant” and serves on the National Assessment Governing Board, the body that sets policy for the National Assessment of Educational Progress (NAEP), “the nation’s report card.”   The path to the Common Core was greatly smoothed by the generosity of the Bill and Melinda Gates Foundation, which has spent over $200 million in the quest. ($200 million is The Washington Post’s figure; some estimates are far higher.)

The details do, of course, matter, but so does the larger picture.  The broad view is that the Common Core didn’t grow out of grassroots demand for curricular reform.  It isn’t the work of school boards or an initiative put in motion by dissatisfied parents.  It didn’t come about because good teachers convened and came up with a proposal to improve K-12 education.  It was instead the product of a handful of very well-connected intellectuals who dreamed up their own vision of what American schools should do.

Coleman, a Rhodes Scholar and the son of the then-president of Bennington College, was and is a smart player.  Backed with Gates money, he devised the basic strategy of lining up the states by appealing to the current and former governors in the National Governors Association.  A key recruit was Gene Wilhoit, executive director of the Council of Chief State School Officers.  Wilhoit was instrumental in selling the plan to the NGA members.  Later he became a partner with Jason Zimba and Susan Pimentel in SAP, which greeted his appointment by appropriately acknowledging that Wilhoit “helped spearhead the historic state-led effort to develop the Common Core State Standards.”

Which is to say, wherever you turn, the Common Core is the work of people who comprise a national elite. The kind of standards that the Common Core represents are the kind an intellectual elite prefers:  an orderly regimentation of knowledge and skills marching to a fixed destination.  The Common Core destination is “college and career readiness.”  How did its architects know what it will take to be “college ready” thirteen or more years into the future?  Or what career readiness will mean in 2025?  Be that as it may, one of the most significant flaws in the Common Core is that it is not designed to be revised, updated, or improved.  It is what it is and is frozen into place.

Robots and Lawyers

No one can really know what “college and career readiness” will entail a decade or more from now.  That’s one good reason to avoid over-standardization in K-12 education. Standards that elaborate the concerns of the moment are almost certain to be obsolete in a few years.  The Common Core is top heavy with them.  One of its overriding emphases is to teach students how to process information, and to this end it lays out hundreds of “standards” that focus on teaching students to be efficient performers of various simple routines.  Read the standards through, beginning to end, and what you find is not a humanizing education but one that outfits the student to be a robot.

That’s a bit unfair, but only because the Common Core is also concerned to teach students that everything, in both math and the English language arts, should be approached as “evidence” and grist for argument.  That’s to say that the Common Core also wants to turn students into lawyers, or at least to think like lawyers.  Perhaps its highest ideal is the robot-lawyer, who treats everything in the world as potential evidence to be efficiently processed into argument.

There is a rough fit between the student who finishes high school as a partly assembled robot-lawyer and what that student will encounter in today’s college.  We are, after all, in the age when college has become dominated by ideologies that demand allegiance and that repel independent thought.  On matters such as race, sex, class, and climate change, the student confronts an orthodoxy and is handed the task of making sure that everything is cut to conform with it.  Sandra Korn, at Harvard last spring, gave us the perfected version of this with her call to replace “academic freedom” with “academic justice.”  With legions of Common Core-prepared students headed for campus in years to come, Ms. Korn may well have her wish.  They will be lawyer-robots perfectly suited to prosecuting the curriculum on behalf of “academic justice.”  We are already more than halfway there with the new “affirmative consent” rules and sexual assault codes on campus.

But to say that the Common Core is consonant with the reigning spirit on campus is not to say that it really makes students “college ready.”  It is only to observe that the same misunderstandings are present in both the Common Core and in contemporary colleges.  Both reflect an odd turn away from culture and an embrace of a new kind of utilitarianism.

Leaning Mostly on Utility

Utilitarianism has always had a strong appeal to the American character.  A frontier society inevitably cared more about “what works” than about the life of the mind or the shaping of a person for a well-lived life.  But the utilitarian element has always been met by another side of the American character that is bent on high ideals.  The new kind of utilitarianism offers a fusion of “what works” and “high ideals,” but a fusion that leans mostly on utility.  In the Common Core, students are trained to read “informational texts,” and to extract the pertinent information.  This operation is packaged in lofty rhetoric about “higher order thinking skills” including “critical thinking, communication, collaboration, and creativity.”  It is buttressed with “media awareness” and “systems thinking.”  But it comes down to mining the written word for usable “information” and treating mathematics as problem solving.

To be sure, those are important parts of both reading and math.  But they are far from the only parts, and if they are over-emphasized our children are left with a great hollow in their preparation for life, regardless of whether they go on to college.  What is missing is perhaps those things that the Common Core must regard as “lower order thinking skills.”  These would include furnishing the memory with things worth holding; puzzlement; uncertainty; and even distaste.  Educating someone toward morally-grounded maturity requires openness to some qualities we might regard as negative.  But to be moved by ideals and to have high aspirations requires that we go searching in many directions.

How Many Dogs in London?

Bill Gates a few years ago told the Wall Street Journal’s Jason Riley that “It’s ludicrous to think that multiplication in Alabama and multiplication in New York are really different.”  This, in Gates’ view, justifies “common standards.”  But that begs the question of what standards really are.  Multiplication is the same in Miami and on the Moon too, but the circumstances differ, and it would be wise to approach the task of teaching multiplication and anything else with an eye to local realities.

The Common Core has now famously run into widespread mockery for the elaborate procedures it insists children in the early grades use to solve simple math problems.  YouTube videos display children using Common Core math to turn simple problems into Himalayas of complexity.  Comedians have mined it too, such as Louis C.K.’s example on the Letterman Show, “Bill has three goldfish.  He buys two more.  How many dogs live in London?”  The upshot has been plummeting popular support for the Common Core.  A Gallup poll released on October 28 reported that 58 percent of Republicans who have children in public schools hold a negative view of the Common Core, and only 19 percent have a favorable view.  Parents overall are more evenly divided:  35 percent negative, 33 percent positive, and 32 percent unfamiliar or having no opinion.  Among teachers, the split is 44 percent negative, 41 percent positive. But all these figures have shifted dramatically toward the negative since the last Gallup poll in April.

One of the local realities is that many parents who are actively involved in their children’s education view the Common Core as a state-imposed disaster.  They are immune to the Coleman-Gates appeal for uniformity of teaching for the sake of uniformity itself.  They are suspicious of an approach to math that vaults over tried-and-true approaches for the sake of the unproven benefits of a new “theory.”  They are equally perturbed by an approach that derogates the importance of literature and, when it does introduce literary works, chops them into fragments and short extracts.

I can’t say from my short stay in Alabama whether all the complaints I heard about the Common Core are justified, but I can definitely say that the Common Core in that state has been received as an unwelcome imposition by large numbers of parents and teachers.  As in many other states, the Common Core is treated by educational authorities as a done deal.  The state bureaucracy likes it as does the Chamber of Commerce and a fair number of business leaders.  But they face a large and increasingly passionate opposition.

There are many reasons for this.  Dissatisfaction with the Common Core usually begins with seeing what it looks like in homework assignments and textbooks.  But before long, critics usually find their way to deeper questions about the spirit of Common Core and how that spirit came to have such authority over the whole country.  It seems we woke up one day to find an invading army had taken command of the nation’s schools.  And indeed, something like that happened.   We never asked for “standards” like these.  They were invented elsewhere and with trickery, stealth, and an almost complete lack of transparency, they became law in 45 states.  Trying to win back some reasonable level of control over the schools will require hard political effort.  In Alabama and in many other states that effort is multiplying.  It is the same in Alabama and New York.

Stanley Fish Looks Postmodernly at Academic Freedom

Whatever their ostensible subjects, Stanley Fish’s books tend to be about Stanley Fish. His new one, Versions of Academic Freedom, extends the conceit.

Which is not to say that the book is only a “Version of Stanley Fish.” It is also a succinct, well-informed, and often elegant essay.  Fish’s great talent is compression.  In this case he reduces the overgrown jungle of debate about academic freedom in America’s colleges and universities to a lucid list of five alternative positions:

Continue reading Stanley Fish Looks Postmodernly at Academic Freedom

A Climate March against Capitalism

About 400,000 people assembled yesterday along Central Park West and marched down through Columbus Circle, to Midtown, and then east to the United Nations.  Billed as “The People’s Climate March,” the event was intended to focus the attention of national leaders in town for the United Nations Climate Summit, which starts on Tuesday.

Some 50,000 of the marchers were college students and the event was overripe with student activism.  If there was one theme that dominated, it was hostility to capitalism.  “Capitalism = Ecocide” was one sign that caught my eye, along with “Fracking = Death,” “Consumerism is Killing the Planet,” “Nature Has Condemned Greed,” and “Capitalism is a Crime.” This is the socialist moment for the sustainability movement.  The release of Naomi Klein’s new book, This Changes Everything:  Capitalism vs. the Climate has both captured the hardening tone of sustainatopians and accelerated their descent into blind opposition to modernity.

Continue reading A Climate March against Capitalism

‘Bill, You’re Wrong about Common Core’

For years, Bill Bennett, former U.S. Secretary of Education, has avoided taking a position on the Common Core K-12 State Standards.  But yesterday he declared himself in favor. His essay in The Wall Street Journal, under the headline “The Conservative Case for Common Core,” dwells on the idea that conservatives generally favor good books, shared truths, and education that equips students with basic math and the ability to “read and distill complex sentences.” Bennett drives home the point that “certain abilities” ought to be “common knowledge of all.” Just so. But then he proceeds as though “common knowledge” and “The Common Core™” are one and the same. They’re not.

Continue reading ‘Bill, You’re Wrong about Common Core’

Post Traumatic Slave Syndrome

Here comes PTSS, the latest concoction in the crowded field of group grievance. That would be Post Traumatic Slave Syndrome, the invention of “Dr. Joy,” Joy DeGruy, billed as “the nationally and internationally renowned” researcher and educator.

I will venture a guess that PTSS hasn’t yet caught the attention of many readers of Minding the Campus.  But in view of this summer’s trauma in Ferguson, Missouri, and the escalating rhetoric on racial division in America, it is a good idea to keep up to date on the latest conceits for group grievance.  PTSS is not, or not yet, in the Diagnostic and Statistical Manual of Mental Disorders, the Bible of psychotherapeutics.  But give it a few more years.

Continue reading Post Traumatic Slave Syndrome

Making Jefferson, Madison and Franklin Disappear

History News Network

In 2012, the College Board released a new set of standards for the Advanced Placement United States History (APUSH) course. APUSH vanishes some figures who would seem indispensable to any basic history of the United States. This is American history seemingly without Benjamin Franklin, Thomas Jefferson, and James Madison. “Seemingly” is a key word. If you examine the APUSH standards for the period 1754-1800, these three figures are not mentioned, but there is a reference to “Jeffersonian Republicans” that could imply the existence of someone named Jefferson. One of the members of the APUSH Redesign Committee teaches at a place called the “James Madison School.” And in an appendix that considers possible AP Exam questions aligned to APUSH, we are counseled that a student could demonstrate how the American Revolution failed to change the structure of privilege in society by citing “George Washington, John Adams, and Thomas Jefferson” as “examples.”

Continue reading Making Jefferson, Madison and Franklin Disappear

Pushing American History as a Long Tale of Oppression

The Republican National Committee adopted a resolution on August 8 criticizing the College Board’s new Advanced Placement U.S. History (APUSH) course and exam. The RNC called for the College Board to “delay the implementation” of APUSH for one year and convene a committee to draft a new framework “consistent with” the traditional mission of the course, state history standards, and the United States’ “true history.”

The resolution quickly caught the attention of the left-of-center media.  MSNBC leaned in; The Daily Beast growled; Right Wing Watch glared; TalkingPointsMemo repeated; and Wonkette sassed. A good time was had by all. Doktor Zoom’s report on Wonkette epitomized the spirit of the left’s response. The good Doktor explained that in the eyes of the RNC, the new “exam framework doesn’t even say that America is the Bestest, Freest, Most Wonderfullest Republic that ever existed in the world, and it also completely fails to say that Jesus handed the Constitution to George Washington.”  Newsweek, on the other hand, took the trouble to explain the opposition to APUSH—though its headline, “What’s Driving Conservatives Mad about the New AP History Course,” assumes that the opposition is primarily conservative.

Making sure that the Advanced Placement U.S. History course is reasonably comprehensive, fair-minded, and accurate ought to concern people across the political spectrum, not just conservatives.  However, the decision by the Republican National Committee to weigh in with a resolution “condemning” APUSH (as the headline in Education Week put it) ensures that partisans of all sorts will pile in.

Why We Should Delay the New APUSH

In truth, I would rather see this matter resolved at the level of good historical scholarship.  Is the new APUSH a good history course?  Does it present a thorough and systematic account of the developments that brought our nation into being?  Does it trace the conflicts, recognize the principal persons, grapple with the ideas, and come to terms with the triumphs and failures of the American experiment in self-government?  Does it give a clear picture of the profound economic changes that led us from being a collection of mainly agrarian colonies on the east coast to the world’s most prosperous nation?  Does it teach students to be mindful readers of history—students who are capable of reading original documents with unbiased eyes, and who are likewise capable of catching the ideological purposes to which history is often put?

Those are questions on which the new APUSH, as I read it, doesn’t fare very well.  It seems to be an American history curriculum that views the European settlement of North America as mainly an act of dispossession of the native peoples followed by many further acts of oppression.  It tells a story beginning with pre-contact Native Americans in 1491 that is overwhelmingly materialist.  Food production, human labor, environmental factors, population movement, and so on count heavily; but ideas, beliefs, and aspirations play a secondary role, or in some cases no role at all.  The development of the nation as a community with a genuine common culture, an order based on evolving principles of law, and a life enriched with religious aspiration is left to the margins or pushed off stage.

But I don’t wish to claim a final word on such matters.  History standards and curricula, and the textbooks and documents used to teach courses such as APUSH are complicated things.  And it would be best if we heard from a broad spectrum of American historians who have had the chance to read the APUSH materials carefully.

In that light, the RNC’s call for a year-long delay strikes me as a very good idea.

Some of the other critics of APUSH have done a good job in noticing particular things—especially people—that APUSH leaves out.  John Winthrop, the first governor of Massachusetts, is missing.  Benjamin Franklin, Thomas Jefferson, and James Madison have faded into APUSH obscurity.  Even the Reverend Martin Luther King, Jr. is mysteriously absent.  We can be sure that none of these figures was overlooked by accident.  They are left out because they do not fit the story that the architects of APUSH want to tell.

Related: Look What the College Board Has Done to AP US History

That story is fairly easy to grasp:  the history of the United States is a history of expropriation, imposed suffering, forced labor, exploitation, environmental heedlessness, class oppression, racism, sexism, and the rule of the privileged few over everyone else.  These themes can be magnified in the hands of teachers who are convinced that they amount to the whole story, or they can be diminished a bit by teachers who have reservations, but they are the chief substance of the new APUSH. There’s really no way around them.

The University Scene

This is not to say that APUSH is drastically out of step with what college history professors now teach.  Two years ago the National Association of Scholars published a study of 85 freshman history courses taught at Texas A&M and the University of Texas at Austin.  In Recasting History:  Are Race, Class, and Gender Dominating American History? we reported that 50 percent of the teachers of these American history courses at Texas A&M devoted half or more of the content of their courses to race, gender, and class, and 78 percent of their counterparts at UT Austin did the same.  Moreover, the younger portion of these faculty members were overwhelmingly self-declared specialists in race, gender, class history.  Among those who received their Ph.D.s in the 1990s or later, 90 percent of the Texas A&M historians were race, class, gender specialists; and among the UT Austin cohort, 83 percent were.

Recasting History provoked a furious response from the American Historical Association and many other apologists for the “new” social history.  Few, if any, denied the accuracy of our data or the substance of our analysis.  Rather, we were taken to task for finding fault with something that these historians now regard as a positive good.  Teaching against the old “pieties” of American exceptionalism is regarded in these quarters as a combination of joy and duty, and emphasizing the story of America as a saga of unending oppression is seen as nothing but the plain truth.

In that light, APUSH is likely to find many supporters among academic historians.  And academic historians who are skeptical about the APUSH approach have become a beleaguered minority.  As this battle over APUSH proceeds, the critics of APUSH shouldn’t count on robust support from university history departments.  By and large they are the source of the problem, not the last redoubt of sensible scholarship.

Apprentice Skeptics

APUSH also sets out, quite emphatically, to turn students into “apprentice historians.”  This has a certain cart-before-the-horse quality.  How can a student acquire the sensibility and tools of a historian without first gaining a fairly full grasp of historical narrative?  We generally need a context before we can plunge deeper into analysis and re-consideration.  The APUSH emphasis on making students into “apprentice historians” is, I suspect, a roundabout way of expressing the goal of making students into hardened skeptics toward anything that suggests American exceptionalism.  If the students are “apprenticed” to the preferences of politically progressive historians, they will quickly get the idea that American exceptionalism is nothing but the lies and excuses the rich and the powerful use to cover up their depredations.

Again, I don’t insist this is the last word on what APUSH is up to.  The whole thing is put in such opaque ways that it is hard to tell for sure.  But surely we would do better to wait until we have heard from historians who were not part of the College Board team that created APUSH.

So again, the RNC’s call for a year-long delay seems like a constructive suggestion.

Nash vs. Cheney Revisited

Those whose memories extend back to the 1990s will no doubt recall that we have been through something like this before. In 1992, Congress appointed the National Council on Education Standards (NCES), a project approved by President George H.W. Bush, funded by the National Endowment for the Humanities and the Department of Education.  NCES was co-chaired by Charlotte Crabtree and Gary Nash, though in the ensuing controversy it became mainly associated with Professor Nash.  The Standards were released in November 1994 and immediately set off a furor; Lynne Cheney, who had been head of the National Endowment for the Humanities when the project started, repudiated the results.  She began an op-ed in The Wall Street Journal, “Imagine an outline for the teaching of American history in which George Washington makes only a fleeting appearance and is never described as our first president. Or in which the foundings of the Sierra Club and the National Organization for Women are considered noteworthy events, but the first gathering of the U.S. Congress is not.”

The Nash-led NCES Standards became the subject of anguished national debate, which culminated in the publication in 1996 of a heavily revised version of the standards.  Nash wrote a book, History on Trial: Culture Wars and the Teaching of the Past (1997), recounting with considerable bitterness his showdown with Cheney.  As the historian Sean Wilentz saw it, the whole thing was about the effort of professional historians to incorporate into American schools the emphasis on “social history” that had come to dominate the field since the 1960s.  Wilentz thought Nash and his collaborators “naïve” in thinking they could do this without provoking a “political response.”

To a fair extent, the APUSH controversy is Gary Nash 2.0.  Many historians were disappointed that the 1994 National History Standards survived only in compromised form.  They wanted the new “social history” straight up.  And they understood that the “social history” Nash had fought for wasn’t just a matter of including the stories of ordinary people who were ignored by previous forms of history writing.  “Social history” is code for history that deconstructs the ideals and the “myths” of the nation.  The Founders are put in their place as patrician slave-owners or propertied men who benefited from other oppressive privileges.  “Manifest Destiny” was a slogan used by rapacious colonizers to justify genocide.  The “Wild West” wasn’t “tamed” but plundered.  Every story “valorized” by older historians had to be unwoven and discredited by the new history.

The Zinnification of American History?

In its most unabashed and vulgar form, the new history is what Howard Zinn served up in his ever-popular A People’s History of the United States.  Nash and his collaborators on the National History Standards were several steps up from Zinn, but aboard the same ideological escalator.


The new APUSH is somewhere in this vicinity as well, but the new tactic seems to be to avoid head-on challenges to the mythos of American history—all the people and events that made us foolishly think America was different and special—and focus instead on telling the alternative story of racism, despoliation, and oppression.

Thus, APUSH doesn’t attempt to debunk the American Founding.  It just pushes it aside.

I should add that when I refer to the “mythos” of American history, I don’t mean a collection of falsehoods, noble lies, or one-sided distortions.  When we anthropologists speak of mythos, we mean the essential truths that form the vital core of a people’s shared identity and without which we dissolve into disparate parts.  The American Founding is part of our mythos—maybe the most important part, but definitely not the whole.  And a mythos is by no means a story compounded of self-glorifications, as Doktor Zoom at Wonkette seems to think (“America is the Bestest, Freest, Most Wonderfullest Republic that ever existed in the world”).  It is a story that includes failures, ignominies, and tragedies, as well as hard-won triumphs, and it has plenty of room for ambiguities in between—events like the Civil War that are both triumph and tragedy.

The Challenge

Judging by the responses to the RNC resolution, we are in for a season of slippery misrepresentation.  Critics of APUSH will be caricatured by the left as cultural ignoramuses in search of something to get mad about.  And the College Board’s protests of innocence will be taken at face value.  Already the Daily Beast has blandly reported the College Board’s explanation that the new APUSH exam is meant to be “more flexible” than the old one and that the APUSH framework “has not changed” since 2012.

The idea that the new test will be more “flexible” may be a way of saying that in the new test “facts” matter less than facility in “interpreting” material into the right ideological silos.  But much remains to be seen on that score.  What we have so far are the detailed standards and only a model test.  As for APUSH not having changed since 2012, that’s an impressive bit of stagecraft.  Yes, the current APUSH was finished in 2012, but it was released so quietly that virtually no one outside the charmed circle of APUSH advocates knew about it.  Only in the last few months, with the actual rollout of the course scheduled for this fall, has APUSH come to public notice.

My guess is the College Board was well aware of what happened in 1994 when the Gary Nash version of the National History Standards was released and provoked immediate and harsh pushback.  To avoid that, the College Board attempted a stealth rollout.  That tactic has clearly failed, so it is on to the next tactic, which consists of well-choreographed displays of wounded innocence performed to the tub-thumping music of MSNBC and TalkingPointsMemo.

None of this would matter if APUSH were some obscure academic course, but it is in fact the gateway course on American history for most of America’s most talented high school students.  As the RNC noted, nearly half a million high school students take it each year.  And to that I would add that for many of those students, APUSH will be the last course surveying American history they will ever take.  The partisan view of American history it presents is likely to leave some lasting impressions.  And worse still, the absence of a more abiding vision of American aspiration is likely to leave a lasting emptiness.

Campus Activism: the Fight for Imaginary Victories

This article is the third in a series on “the year that was” in higher education. The first two articles are here and here.

Campus activism is, by and large, the world of make-believe.  Whenever students occupy a president’s office, Tinkerbell is not far away.  Whenever faculty demand a boycott, Professor Dumbledore winks at Professor Snape.
