Category Archives: Essays

A Conservative Argues Some ‘Free Speech’ Is Assault

It has become familiar among conservatives, on and off campus, to cast up warnings about “moral relativism” as they gnash their teeth about the state of the culture. And yet we often find conservatives with a libertarian bent backing into a soft version of relativism. That tendency has been especially pronounced among conservatives who bear the title of judicial office–and with effects far more palpable on our lives. For their judgments have worked to sweep away those local laws that used to uphold codes of decency and moral restraint for people who encounter one another in public places.

Judges vs. Ordinary Folk

There is no more visible arena in which the theories that summon the sentiments of judges run counter to the common sense held by ordinary folk. That common sense was expressed in the classic case of Chaplinsky v. New Hampshire in 1942, where Justice Murphy noted that there were certain well-defined and narrowly focused classes of speech that have never been given protection under the Constitution:

These include the lewd and obscene, the profane, the libelous, and the insulting or “fighting” words — those which by their very utterance inflict injury or tend to incite an immediate breach of the peace (my italics).

That sense of things reflected the understanding long settled in the law that an “assault” did not strictly require the laying on of hands: one could hold an unloaded gun near someone’s head and pull the trigger. There were expressive gestures, accomplished with words and acts, that would be understood instantly as menacing: phone calls or letters meant simply to scare or terrorize, or the burning of a cross outside the home of a black family. That is why a “verbal assault” could be understood as much an assault as any act that involved physical injury.

Ordinary people, used to living with ordinary language, had no trouble recognizing those words and gestures that were established in our conventions as terms of “insult” or attack. Truck drivers and construction workers may have as keen a sense as lawyers about the gestures and words that insult or show disrespect. They know the difference between the N-word, or F***, as opposed to words such as “urologist” or “dentist,” which may also set off tremors for some people.

It was also Justice Murphy’s insight that words and gestures of this kind could be banned without interfering with the freedom of people to pursue all of the legitimate uses of speech. And so, Mr. Rosenfeld, arguing at a PTA meeting in New Jersey, was asked to restrain himself because his only adjective, freely used, was “motherf****ing.” Asking Rosenfeld to restrain himself from the use of that word would not have diminished his freedom to make the most substantive critique of the school board.

“F*** the Draft”

But in 1971, the conservative Justice John Harlan took the telling turn to relativism in Cohen v. California, where he famously declared that “one man’s vulgarity is another’s lyric”—that the meaning of words was so subjective that it was no longer possible to make any principled distinction between the words that were fit or unfit for public places. The case involved a jacket saying “F*** the Draft” worn in a courthouse. But Harlan thought the meaning of words, even these words, was too slippery for judgments to be made. In this respect, the law would declare itself incapable of judging what any man of sense could plainly understand. Several years ago, the Supreme Court struck down the attempt of the FCC to ban the use of the words F*** and sh*** on television, and yet it was telling that the lawyer arguing that case did not speak in the courtroom those words he was seeking to defend. His delicacy confirmed the point he had been trying to resist—that certain words were indeed not respectable to use in public without degrading the very climate of discourse.

Ordinary people, even today, have no trouble in decoding the difference between a burning shoe box and a burning cross. And yet, no less a figure than Justice Scalia wrote the opinion to strike down an ordinance in St. Paul that banned the burning of crosses. But this past spring, in Matal v. Tam, even the conservative judges took the Court to the decisive point of making relativism the default position on matters of speech.

The case involved the question of whether the federal Patent and Trademark Office could deny, to an Asian rock band, the freedom to call itself “The Slants.” The office held that “Slants” was a derogatory term, inspired by the slanted eyes imputed to Asians. It reached that judgment even though the name was chosen by the band itself as a way of removing the sting of the term through self-mockery. By the test of Chaplinsky, the judgment of the office could have been overturned for the simple reason that the term just wasn’t widely known in the country as a term of mockery.

Instead of settling the case in that way, the judges took the doctrine of free speech to another level by drawing on an older case and declaring that the decision “offends a bedrock First Amendment principle: Speech may not be banned on the ground that it expresses ideas that offend.” Forty years ago, when a group of Nazis sought to parade through a Jewish neighborhood, David Hamlin of the ACLU declared that “the First Amendment protects all ideas–popular or despised, good or bad.” To be “despised” meant: to be unpopular. What was ruled out was the possibility that certain kinds of speech could be in principle despicable.

Now the conservatives on the Court have embraced the ACLU position, for they have ruled out the notion that certain kinds of expressions may be “offensive” in point of principle. Since that decision by the Court, we have indeed had applicants seeking to trademark “Nigga” for products as varied as clothing, bags, and drinks. The law has long banned the advertising of discrimination based on race, and yet it is possible now to have a business with the title “For Whites Only.”

David Hamlin went on to say that we must be free to hear the Nazis because we must be free to choose the Nazis. But to choose the Nazis was to choose to install a government that rejected at the root the moral premises of a free regime and the principles that encompassed the freedom of speech. To say that we are free to reject a regime of freedom is to say that there is nothing in principle good or right about a government of that kind, for if there were, we could not be free to reject it.

Step by step, then, the conservatives on the Court have moved down the path of relativism marked by the liberal justices who came before them. And the result: They have proclaimed new, sweeping freedoms of speech, while they insist that there is nothing morally better, nothing more defensible in principle about those rights, or the regime that contains them, than a regime that would dismiss these “rights” with contempt.

A longer version of this piece was published February 6, 2018, in Claremont Review.

Article photo from MGM’s parody, The Starving Games

Why ‘Implicit Bias Training’ Makes No Sense

Does it make sense for Starbucks to put its workforce through “implicit bias training”?

Maybe as a public relations gesture, to apologize for the arrest of two peaceful black men at a Philadelphia Starbucks who were there for a meeting, had bought nothing, and asked to use the restroom. But if the company’s goal is to remove hidden prejudice from its employees’ minds, this “training” makes no sense at all.

The reason: despite many attempts to define it, no one knows what “implicit bias” is, and so no one has any idea what sort of “training” might be useful to combat it.

Can It Be Defined?

I attended a special National Science Foundation (NSF) conference last fall to address controversies surrounding “implicit bias.” In the end, there was a discussion to see if the attendees could agree on what the term meant. They (we) could not do so.

Dr. Anthony Greenwald, a social psychologist at the University of Washington and one of the foremost proponents of implicit bias, presented two different definitions (which is fine; scientists changing their opinions or views of some phenomenon or measure is a natural part of the evolution of scientific understanding). Here is Greenwald’s first definition, based on work done in the 1990s:

Implicit bias is introspectively unidentified (or inaccurately identified) effects of past experience that mediate discriminatory behavior.

Much of the time the term is thrown around, even in scientific articles, without being defined. When it is defined, different people define it differently. Yet training has been instituted at many colleges, universities, and corporations around the country. This training, endorsed by the Association of American Colleges and Universities, is wildly premature because research has only just begun to examine its effectiveness.

The danger comes in the form of a test that supposedly measures a person’s level of implicit bias. It, too, is endorsed by the Association of American Colleges and Universities in its ongoing crusade to diversify the students and faculty in higher education. Some training may have actual adverse consequences.

The most common measure of what some people refer to as implicit bias is the implicit association test (IAT), which was also developed by Dr. Greenwald. It assesses how closely various concepts in memory are linked (or “associated”). For example, I suspect most people would associate vegetables with green, and comets with space, more strongly, say, than they would associate comets with green or vegetables with space.

The IAT assesses this strength of association by comparing how quickly people complete various categorization tasks (for a simple description of how the IAT works see this article). Although the association of comets with space might be harmless, perhaps other associations, such as Jews with banks, women with family, or African Americans with crime are not so harmless. Or perhaps they are.

How does speed of completing this categorization task map onto Dr. Greenwald’s definition of implicit bias? It doesn’t. The IAT might sometimes reflect a person’s past experience or predict discrimination, but whether it does either is an empirical question; it does not measure either one. It does not measure mediation (mediation refers to the idea that A causes C because A first causes B, which then causes C). And because research over the last few years has shown that people are quite good at predicting their IAT scores, one cannot say the associations are introspectively unidentified or inaccurately identified. So if we delete from Dr. Greenwald’s definition everything the IAT does not measure (deletions in brackets), we get this:

Implicit bias is [introspectively unidentified (or inaccurately identified)] [effects of past experience] that [mediate] [discriminatory behavior].

In other words, almost nothing of the definition survives the cut.

Dr. Greenwald also offered this second, updated definition:

Implicit bias is automatic cultural filtering.

I do not recall him explaining what he meant by this in any detail, and, frankly, I have no idea what it means. What about other definitions?

In many areas of academia, the definition is not as important as the meme. In an article in the Brown Political Review, “The Dangerous Mind: Unconscious Bias in Higher Education,” the authors conclude, “As universities attempt to promote equality for people of all races, genders, and ethnicities, they must combat the unconscious bias that plagues relationships between students and professors. To do this, they should adopt unconscious bias training and help assist all professors counter their biases.”

A recent article in Scientific American defined implicit bias as “the tendency for stereotype-confirming thoughts to pass spontaneously through our minds.” This makes no sense either, because, in fact, the evidence so far shows that stereotype accuracy is one of the largest and most replicable findings in social psychology. Why is it reasonable to refer to accurate beliefs as any sort of bias? It isn’t. This CNN article on the Starbucks incident defined implicit bias as: “The automatic associations people have in their minds about groups of people, including stereotype,” a definition that also runs up against the absurdity of defining beliefs that are accurate as “bias.”

Measuring the Wrong Thing

Much of the usage of the term is tautological. The IAT claims to measure implicit bias. But how do we know people’s implicit associations are biased? Because that is what the IAT supposedly measures.

From a purely empirical standpoint, there is considerable scholarship showing that: 1. the IAT measures lots of things besides prejudice; 2. its reliability is low; 3. there seems to be something fishy in the test, such that scores well above zero (usually interpreted as bias) correspond to egalitarian responses on other measures; and 4. its ability to predict individual behavior is quite low. But once there’s a score that could brand someone a racist, it could follow them throughout their lives and affect their futures.

The Scientific American article argued that, even if the IAT is suboptimal, there is other evidence of such biases. But much of what they cite is not “bias” in any recognizable sense. For example, they cite evidence that people are faster to recognize bad words when paired with white faces than with black faces. Whatever theoretical value this might have, linking such findings to discrimination or inequality is rarely, if ever, done.

On the other hand, they did review evidence of ongoing discrimination as if that is evidence of “implicit bias.” Is it? If so, then we now have yet another definition of “implicit bias” – discrimination. And if discrimination = implicit bias, why do we need a new term for an old phenomenon? Regardless, they did not cite any of the research showing no discrimination at all, such as this paper reporting the nonexistence of discrimination in 17 large-sample, nationally representative studies. To be sure, the evidence of discrimination they cited is real enough. But cherry-picking evidence is a sure way to overstate discrimination.

Starbucks’ go-to solution – subjecting the entire company to a day of “implicit bias training” – borders on the ridiculous, at least as a plan to actually reduce discrimination. Admittedly, it may have other purposes – public relations, preventing lawsuits, etc.

As discrimination-reduction, implicit bias training is silly. Think about it – if “implicit bias” really is the automatic and uncontrolled prejudice so often claimed, one cannot “train” people out of it anyway. More important, it is very difficult to change implicit associations about groups, and there is no evidence that doing so produces changes in behavior.

Furthermore, much of the impetus behind the push to bring “implicit bias” from the ivory tower to the real world (faculty training, policy, law) has been the assumption that unconscious prejudices lead to biases against individuals. Although that surely does sometimes happen and might have even happened in the Starbucks incident, we recently published a paper showing how easy it is to eliminate biases against particular individuals.

In our study, people evaluated the intelligence of Luke (white) and Jamal (African American). We found a typical bias score on the IAT (favoring Luke) when people knew only the person’s name. However, when they found out that both Luke and Jamal were weak students (low GPAs and SATs), there was no bias. When they found out that both were strong students, there was no bias. One simple solution to “implicit bias” might involve encouraging people to be sure to judge others on their merits, rather than on their categories. When they do so, biases are often greatly reduced and, sometimes, disappear.

What Should Starbucks Do to Solve an Actual Problem?

First, they need to figure out what the problem really is. With thousands of stores and millions of social interactions a year, some may go bad sometimes. Most likely, the cop-calling employee just needs a good talking-to. Perhaps it was simply bad customer relations, and the cop-caller was also overly aggressive with white noncustomers. Or, perhaps, upon further review, all concerned would conclude there really was a racist element here.

If so, the employee could be retrained or, if needed, fired. The solution would depend on the judgment about the nature of the problem. Personally, I tend to err on the forgiving side; if this was the first time the cop-caller screwed up, I’d lean towards a second chance. If it was not the first, I would be less forgiving.

12 Reasons I Like Jordan Peterson’s 12 Rules

I’m standing in line outside the Beacon Theatre. As the sun goes down, I find myself wondering why I made the drive from Philly to attend a Jordan Peterson (JP) presentation in Manhattan. As a junior in college, I sit through lectures every day. Do I really need another one? Yes, apparently, because I feel lucky to have a ticket. The other people in line probably feel the same way. A sign on the sidewalk says “Sold Out.”

I take my seat and glance around at the 3,000 JP fans crowding into the theater. We’ve all read his international best-seller 12 Rules for Life: An Antidote to Chaos. But it’s strange that a book with a corny title could draw such a diverse and unlikely audience. Why are college kids, seniors, and more than 1,000,000 YouTube subscribers watching and listening to a plain-talking author who tells them that responsibilities are more important than rights?

As JP moves from point to point in his three-hour lecture, I begin to take notes. Specifically, I try to figure out what’s so appealing about this man and his rules for life. And so, here are 12 reasons I like 12 Rules.

Related: Jordan Peterson Provokes the Angry SJWs

1. I like his style. Some critics have called JP a smooth-talking conservative guru who leads people (mostly men) astray with simplistic answers and show-biz sermons. In person, however, he comes across as anti-charismatic: a stern-looking, plain-talking, beady-eyed psychology professor with an Ontario accent.

2. He knows his material. Even though JP talked for nearly three hours, he spoke off the cuff the entire time, no notes or prompter needed. It was amazing to see him focus on potentially boring psychological and philosophical topics and yet manage to keep a huge audience sitting in captivated silence or laughing out loud.

3. He understands the importance of purpose. College students today get plenty of experience in the sterile world created by political correctness, a world of freedoms that are largely without meaning. JP makes it clear that politics are not going to save us, either as individuals or as a society. Finding the kind of meaning in life that illuminates our existence is not a freebie. It has to be earned.

Related: Jordan Peterson and the Lobsters

4. He defends his ideas. The first of JP’s 12 Rules is “Stand up straight with your shoulders back.” He also phrases the same idea somewhat differently: “Adopt a stance of ready engagement with the world.” JP makes it clear that – whether or not the idea is compatible with our desire for a kinder, gentler world – hierarchies exist. Struggle for dominance is the norm, in the animal kingdom and in human society. Others may disguise their aggression with politically correct language, but they are really seeking to dominate. Always be aware that life is a struggle. Stand up straight, keep your shoulders back, and be ready to defend what you believe.

5. He shows that structure can be liberating. Each of his 12 Rules is really a starting point for detailed discussions of effective ways to manage your life (in his three hours at the Beacon, he only got through six of the 12). He explains why you should “Treat yourself like someone you are responsible for helping” (Rule 2) and why you should “not bother children when they are skateboarding” (Rule 11). JP’s simple tips lead to big insights.

6. He offers a refreshing framework for America’s civic life. JP argues our national debate is focused too much on the concept of rights. Instead, he argues, there should be more focus on responsibility. Life is suffering, JP claims, but the best way to deal with this unavoidable suffering is by taking up the burden of struggle. The acceptance of responsible struggle is the pathway to living a meaningful life.

Related: Another Speaker Shut Down By College Students

7. JP may be a secret existentialist. I am a big fan of the 20th-century French existentialists, and one does not have to dig deep to see their influence on JP’s philosophy. Another JP, Jean-Paul Sartre, wrote that because man is ultimately free to define his nature, he bears an incredible responsibility. For Sartre, as for Peterson, our freedom comes at a heavy cost; but, it is a cost we must pay in order to live a meaningful life.

8. JP convinced me that majoring in philosophy was not a mistake. My science-oriented peers often argue that the advent of modern scientific findings makes the existential philosophizing of people like JP obsolete. I have been told many times by friends in the biology and chemistry departments that philosophy is useless. Compared with science, they say, philosophy can have no right answer. But in his book, JP shows that the quest for truth in science is directly linked to the quest for truth in philosophy.

9. JP’s book combats poisonous ideas in postmodernism. Most of my friends on campus have been swept up by the ideas of postmodern thinkers and moral relativists. They may disagree with biology students that philosophizing about meaning is “useless.” But you will be hard-pressed to find any student who is confident in the idea that anyone has the right answer to the question “what makes life meaningful.” For JP, the quest for meaning is helped by rules and responsibility.

10. JP understands that lack of purpose has harmful consequences. For someone on the outside looking in, a lack of purpose among college students might seem like a minor problem. But as someone enveloped in campus culture, I strongly disagree. Many students are paying a heavy price for living in a wonderland of choice. Colleges and universities constantly push the ideas of freedom and student rights. Heavy drinking, casual sexual relationships and other forms of hedonistic behavior are commonplace. Most of it is done in the name of being young and free. But if this is an okay way to live, why do so many of my friends complain of feeling anxious and depressed? Why do they often end up in tears at the end of a night of hearty partying? Why was I having similar problems? I think JP’s focus on responsibility offers students a better path, not only for their four years at college but for the rest of their lives.

11. He cautions us to stand our ground against counterpunches from critics, and he shows his readers how to do it. 12 Rules is being constantly derided, mocked and dismissed by liberals and trendy commenters. JP is often accused of the cultural crime of insensitivity. He is censured for being offensive. But JP sticks it to people who make the mistake of thinking he’s an easy target for PC inquisitors, and he does it without being either passive or belligerent. He tells us to stand strong, be precise in our speech (Rule 10) and say what we really think.

12. He tells us to grow up, to replace empty ideology with responsibility. He advises us to get our own lives in order before telling others what to do (Rule 6). Do what’s meaningful, JP writes (Rule 7), not what’s easy. Those are important messages for college students. We have become accustomed to having our every desire satisfied as soon as we first desire it. We don’t like it when someone tells us we are living meaningless lives, and the solution is hard work. JP reminds us of the fact that, to paraphrase the end of William Ernest Henley’s “Invictus,” we are the masters of our fates. We are the captains of our souls. Yeah, I know, that’s a trite thing to say. But, thanks to Jordan Peterson, I also know it’s true.

The Last-Ditch Battle to Protect Racial Preferences in California

California voters made racial preferences illegal by passing Proposition 209 in 1996, but many university officials have ignored the law, especially at the state’s top law schools. Among such officials, it is a deeply ingrained belief that social justice demands measures to close statistical gaps between “underrepresented” groups (particularly blacks and Hispanics) and “overrepresented” groups (which means whites and Asians).

In law school admissions, that means accepting some applicants from favored groups who would otherwise be rejected and would have to attend a less prestigious law school or pursue some other career. It also means, of course, that some applicants with higher scores on the Law School Admission Test (LSAT) and superior undergraduate records must be rejected to make room for them.

Related: UC Berkeley and UCLA Law Schools: Scofflaws?

Whether it makes sense to admit law students at least partially because of their ancestry has long been a matter of dispute.  Even if you grant the premise that society would be better off if there were more blacks and Hispanics in the legal profession, do we achieve that goal by racial preferences in law school admissions? Economist Thomas Sowell, for one, has argued for decades that such preferences are counterproductive, but Sowell’s views are rejected by mainstream academics because he favors free markets and government minimalism.

Then in 2004, UCLA law professor Richard Sander published an article in the Stanford Law Review, “A Systematic Analysis of Affirmative Action in American Law Schools,” that could not be ignored because Sander was a liberal in good standing and his work appeared in a top-notch professional publication. Sander’s argument was that, based on the data he had been able to get from the Law School Admission Council, affirmative action (i.e., racial preferences for certain minority groups) actually led to there being fewer lawyers from those groups.

The reason was that most of the preferred students were not academically competitive in the law schools to which they’d been admitted. They tended to drop out, rank low in their law school classes, and pass the state bar exam at lower rates than did minority students who had attended less prestigious law schools. Sander wrote, “Most black law applicants end up at schools where they will struggle academically and fail at higher rates than they would in the absence of preferences.”

Sander had touched a raw nerve. Defenders of the affirmative action orthodoxy sprang to attack Sander’s research (e.g., this article by David Chambers and other law professors) and Sander quickly took them on here.

In 2012, Sander continued his apostasy by publishing a book (co-authored with Stuart Taylor, Jr.) entitled Mismatch: How Affirmative Action Hurts Students It’s Intended to Help, and Why Universities Won’t Admit It. The book made a strong case against racial preferences, but Sander was still working on his argument. In particular, he wanted to see the law school and bar passage data held by the State Bar of California.

His request, however, was denied on the grounds that releasing such information could lead to invasions of privacy for individuals who had attempted law school and the bar but not succeeded. Sander replied that he did not want personal data and would accept the information with all personal identification redacted. Still, the bar refused.

Related: Why Was Professor Amy Wax Punished?

So Sander went to court against the State Bar, filing suit in 2008. After extensive litigation, in 2013, the Supreme Court of California ruled in his favor in Sander v. State Bar of California. The court was persuaded that “the public has a legitimate interest in whether different groups of applicants, based on race, sex or ethnicity, perform differently on the bar examination.” Unfortunately, rather than ordering the State Bar simply to release the data to Sander, it chose to let the State Bar decide how to disclose the data while preserving the privacy of all individuals. The State Bar continued to drag things out, using dubious concerns over privacy to delay turning over the data.

At the same time, the State Bar chose to continue its fight to keep the effects of racial preferences hidden in another venue – the state legislature. The State Bar has a lot of clout in the legislature and in 2016 succeeded in getting it to pass a bill providing that the sort of information Sander was seeking “shall be confidential and shall not be disclosed pursuant to any state law.”

Based on that legislative change, the judge in the ongoing case, Mary Wiss, ruled in November 2016 that Sander was not entitled to the bar data, declaring that disclosure would violate state law and would be “an unwarranted invasion of personal privacy.” Moreover, she maintained that the information sought by Sander would have “minimal or no value” – a claim contrary to the finding of the Supreme Court.

But that was not the end of the matter. In October 2017, the California Legislature again modified the law, changing the Business and Professions Code to make access to aggregate statistical data such as bar passage possible. The impetus for that change came from the state Supreme Court itself, which had expressed its view that the Bar needed to do more research into the question of falling pass rates on the exam. With that pressure, enough members of the legislature were persuaded to change the law so that research into bar passage would not be stymied.

With that roadblock removed, Sander has taken his case to the state court of appeals, seeking to overturn Judge Wiss’s ruling and to compel the State Bar to comply.

In that regard, several amicus briefs have been filed.

Pacific Legal Foundation’s brief points out the obvious truth that “race-preference advocates criticize Professor Sander’s research for not having an appropriate data set, while at the same time they try at all costs to restrict his access to that data.” In other words, the State Bar has acted in bad faith, using every trick in the legal book to keep Sander (and the public) from seeing that racial preferences have adverse effects.

The Pacific Legal brief drives that point home further by noting that the trial court was laden with witnesses who said they feared “group stigmatization” if the law school and bar passage data were made public. Is that speculative result at all likely? No — minority lawyers who have passed the bar have nothing to fear from the finding that some of those who didn’t pass were harmed by being mismatched at law school.

Another amicus brief has been submitted by the National Association of Scholars. It counters the over-hyped privacy concerns of the State Bar. “Social science research of the kind proposed by Appellant Sander and the First Amendment Coalition,” it states, “represents standard, indeed commonplace practice furthering the public interest while employing established methodologies that minimize the risk to privacy.”

And, reflecting the California Supreme Court’s concern over law school admissions and bar passage rates, the brief says, “Struggling to maintain financial stability amid falling enrollments, many law schools, controversially, have dug far more deeply into applicant pools. Bar-pass rates declined, at least in part a result of desperate admission standards…. That link raises the ugly specter, no less real, of students being exploited for tuition money when law schools know that their application data portend significant disappointment. Further, with law schools simultaneously under pressure to enhance diversity and access to legal education, the risk of the afore-described exploitation is compounded by its discriminatory impact.”

With those developments, it seems likely that the State Bar has finally run out of time. Its stonewalling tactics bought it ten years to shield from scrutiny its social engineering scheme of admitting marginally qualified minority students.

Members of the California State Bar and officials at top law schools in the state no doubt believe that they’re doing something good – something to advance “social justice” – by (as they see it) helping students from “underrepresented groups” get into the best law schools. But they would be wise to heed the words of Thomas Sowell, who has written, “Before you can be a partisan of the poor, you must first be a partisan of the truth.” If they really want to help students from “underrepresented” racial groups, they should first find out if they are actually doing them harm.

The shame is that State Bar officials have worked so long and hard to prevent inquiry into the consequences of race preferences in law school admissions.

The Decline and Fall of Sociology

As totalitarian modes of rule continue to decline throughout the world, readers of Minding the Campus will recognize the insidious strain of totalitarianism that has emerged on many college campuses—one characterized by the bullying, and sometimes silencing, of faculty and students who deviate even slightly from the prescribed progressive campus politics.

Most recently, after decades of distinguished service, Penn Law Professor Amy Wax was removed from teaching first-year law students because she had the temerity to co-write an op-ed that critically questioned the liberal orthodoxy that “all cultures are created equal,” and then followed the op-ed with an interview in which she suggested that affirmative action in American law schools appears to have imposed unexpected costs on its intended beneficiaries.

A Long List of Special Agendas

There is data to support Professor Wax’s assertions, and not so long ago there were sociologists who were willing to step forward to provide it. Having developed into an ideology, instead of a study of ideology, sociology has become what the late sociologist Irving Louis Horowitz described as: “a gathering of individuals who have special agendas, from gay and lesbian rights to liberation theology.” In his 1994 book, The Decomposition of Sociology, Horowitz writes: “any notion of a common democratic culture has become suspect. Ideologists masked as sociologists attack the very notion of a universal scientific base as a dangerous form of bourgeois objectivism, or worse, as an imperialist pretention. High crime rates were seen only as an expression of capitalist disintegration, and criminal behavior became a covert expression of revolutionary action.”

Related: Why Was Professor Amy Wax Punished?

The study of deviant behavior, once one of the most important sub-disciplines within sociology, became “simply a term of moral opprobrium; all social norms were really bourgeois norms; opposition to such norms represented alternative lifestyles at the least and revolutionary consciousness at the most.” Horowitz recognized more than three decades ago that such a worldview denies an external world of commonly shared experience.

In one of the first sociology texts, Emile Durkheim warned that moral unity could be assured only if all members of a society were anchored to some common assumptions about the world around them. Durkheim knew that without these assumptions, a society was bound to degenerate and decay. The function of society is to constitute a regulative force, setting limits on individual actions—recognizing that deviance is an integral part of all societies because it affirms cultural norms and values. Durkheim realized that it is impossible for any society to be free of deviance—even a “society of saints” will have its sinners—and deviance will always be present in every society at about the same rate.

‘Demands for Correct Politics’

As sociology has become balkanized along political lines, driven by an expanding number of interest groups, Horowitz predicted the discipline itself would become a “series of demands for correct politics rather than a set of studies of social culture.” He was right. There are now 52 sections of the American Sociological Association (ASA), each one devoted to a particular subfield that has almost nothing in common with other subfields. Many have become advocacy groups, like the sections on “Sex and Gender” and on “Race, Gender and Class.” The section on “Economic Sociology” has little in common with the one on “Marxist Sociology” and shares nothing with the section on “Inequality, Poverty, and Mobility.”

Related: Identity Politics and the New ‘Me’ Generation

There is often tension within and between the groups as boundaries are tightly drawn, and new sections continue to emerge to meet new demands. The mission of the section on the “Sociology of Sexualities” is simply stated: “to encourage and enhance and foster research, teaching and other professional activities in the sociology of sexuality.” The section on “Sex and Gender” is devoted to research on “gendered and sexual relations,” and recently awarded its “best paper” designation to “Producing Desirable Bodies: Boundary Work in a Lesbian Niche Dating Site.” Sections rarely dissolve—although sections on Catholic or Christian Sociology or Jewish issues are nowhere to be found. Last year’s prize in the Sociology of Religion section was awarded to a paper entitled “Christians under Covers: Evangelicals and Sexual Pleasure on the Internet.”

Just as Horowitz predicted, the rejection of a common culture with shared values has spread far beyond sociology, throughout all of the social sciences, and has even infected the Humanities. Within the current campus climate, it is incorrect for Professor Wax to question the contributors to the culture of poverty that has emerged to keep people poor. In response, she has been publicly pilloried for her incorrect views, as the public denunciations from the Law School dean and some of her colleagues call to mind the public shaming techniques used centuries ago in Colonial America, described so well in 1966 by the sociologist Kai Erikson in Wayward Puritans: A Study in the Sociology of Deviance. But since the recipients of such public attacks are most likely to be those who have deviated from the progressive ideologies of their colleagues and campus culture, little attention is paid.

‘Making Sense of the Madness’

Irving Louis Horowitz paid attention. In fact, he often told his students—and those of us who were the grateful beneficiaries of his often harsh but always insightful editorial critiques of our work—that the real purpose of sociology was “to make some sense out of the madness.” For Horowitz, the real problem in sociology is that the discipline no longer looks at reality. Replacing the real world with a utopian future of classlessness and equality of outcome for all, sociology has abandoned scientism as an official ideology, moving away from valuing an open economic marketplace and a liberal pluralism to embrace socialism and social welfare models.

Related: The Purge of the Deviants May Go Too Far

Horowitz worried that in some ways, the campus had become fertile ground for the kinds of collectivist ideologies that gave rise to the oppressive practices of the last century. Some data—including data describing the unintended consequences of affirmative action, the culture of poverty, the outcomes for children raised by same-sex couples, or the possibility of post-abortion grief—are off limits for serious social scientists who want to get tenure. This is not unlike the reality for sociologists living under Soviet communism, where social information was kept from social scientists. Horowitz reminded us that: “Among the data that were not published in the Soviet Union were statistics on the distribution of crime, on the frequency of suicide, the level of consumption of alcohol and drugs, the condition of the environment in various cities and areas.” Today, despite a federal mandate, California refuses to release data on the demographics of abortion, claiming that compiling and disseminating abortion rates by race and ethnicity is too politically charged. Most recently, several states have announced lawsuits against the federal government for attempting to compile census data on the rates of undocumented immigration.

Castro and Franco—Two Tyrants

On March 26, 2012, the New York Times published Horowitz’s obituary with the headline: “Sociologist and Ideological Critic Dies at 82.” Horowitz would have been horrified to see himself described as an ideologue because he spent so many of the last years of his life fighting against the ideological thinking that he believed was destroying the discipline of sociology. But, many people did not understand this “man of the left” who came to the realization that those he called “left-wing fascists” and “professional savages” were subverting objective, empirical approaches to the social sciences. While some still think of him as a neo-conservative in the mold of Irving Kristol, even The New York Times had to admit that Horowitz “professed no political allegiance. In a 2007 article, he argued that Fidel Castro, the Cuban Communist leader, and Francisco Franco, the conservative leader of Spain, were equivalent tyrants.”

Horowitz understood that sociology has a future only to the extent that it helps in the betterment of the human condition. He believed that if it remained unrelated to human needs, its future would be not as a science but as “part of the world of esoterica.” His predictions were correct, as much of sociology has retreated into a self-contained world with little connection to reality.

Still, Horowitz offers a way out. Claiming that the root of the problem in sociology was epistemological—the abandonment of reason—Horowitz believed that sociologists became hostage to dichotomies between value judgments and the facts of reality; between morality and technical progress; between artistic imagination and sociological inquiry; and between pure and applied research. He believed it is better to view sociologists as “sensitizing agents in moral discourse,” and he understood—like the founders of the discipline—that the ability to get beyond an assumption that sociology and morality are irrelevant to one another would be a significant step forward.

The Overthrow of the Great Books

Many years ago, in the late ‘90s, three professors and I met with the undergraduate dean at Emory University to discuss a Great Books proposal. Steven Kautz, a political scientist, led the effort, and Elizabeth Fox-Genovese, Harvey Klehr, and I backed him up. The idea was to build a Great Books track within the undergraduate curriculum whereby if a student took enough approved courses, he could add a certificate to his record. Kautz had lined up funding for the program and promises of cooperation from (to that point) three or four departments.

The dean was cautious. He wasn’t really a humanities guy, and none of us knew him well. Soon afterward, in fact, he left Emory to become head of the Bronx Zoo. Without his approval, though, we couldn’t move forward. He listened to Kautz’s pitch without any show of enthusiasm, simply nodding and posing a logistical query now and then. When Kautz finished, he finally made a substantive suggestion: drop the “Great Books” label.

The Sound of Superiority

He didn’t like the term. I recall him saying something about “Great Books” sounding superior or combative. Professor Kautz didn’t waver on the title, but he didn’t make much of an issue out of it, either. The suggestion trailed off, and we left the dean’s office without any determination one way or the other. As I think back on it now, I can’t conceive of anything we might have said after he offered his criticism that would have gotten past it and left our vision of the program intact. Obviously, we were there because we wanted a mini-curriculum of traditional, Western Civ works available for students interested in that kind of material.

Like colleagues everywhere, we had seen the core of liberal education erode as multiculturalism spread through the professorate and as “education” types called for more student choice in the general requirements for the bachelor’s degree. A policy that allowed a student to take a course on contemporary fiction instead of one on The Odyssey disgusted us. To object to “Great Books” was to ask us to drop our basic philosophy of teaching.

Perhaps that was his intent. A clever bureaucrat doesn’t kill an initiative by beheading it. He goes after a big toe, a small-seeming request or inquest that, in fact, disables the whole project or discourages the leaders of it.

But if that was the case, the resistance didn’t make much sense. After Western Civilization had suffered such a resounding defeat over the course of the ’80s and ’90s, it seemed bizarre or paranoid for anyone to worry about a tiny re-institutionalization of it. The modesty of our ambitions, which would most likely touch only 50 or so students per year, made the anxiety over the very name “Great Books” clearly overdone.

Nothing came of the proposal. Professor Kautz left Emory a while later and, I presume, took his idea and funding with him. I never spoke to him about it, but I’m sure the tepid, quibbling response to his initiative spurred him to search the job market for more congenial climes. He should have known better, though, and so should we. As anyone who has attempted such a program has seen, resistance on the part of campus leftists to small and non-competitive projects with a traditionalist flavor is a common occurrence. You must tread carefully to establish something that might smack of reactionary motives, even one that will be beneficial for all and won’t impinge on others’ turf. People are quick to express mistrust and worse.

The dean understood that, and his misgivings should have given us that lesson in resistance. But we could learn it only by not taking his words in earnest. He wasn’t speaking for himself. He was anticipating the reception of the initiative by our colleagues. Administrators, you see, are not the problem (Emory’s leadership has been quite helpful with conservative-oriented programs in recent years). It’s the other professors who don’t want to let it happen, no matter how unambitious the effort. A few weeks later, one of my closest friends at the time, an art historian who was gay and had close ties to Women’s Studies professors (I was still voting Democratic back then), told me that he’d heard Betsy Fox-Genovese was trying to push some conservative program under the radar. “Conservative” here meant “vile.”

The Moral Frame of Victimhood

The professors act this way because they are suffused with ressentiment. Ressentiment is, of course, Nietzsche’s term for a certain state of mind, or rather, a condition of being. He liked the French word because it signified a deeper psychology than the German (and English) equivalent does. Ressentiment is the attitude of slave morality, Nietzsche wrote, the moral formation of one who feels rage and envy but hasn’t the strength or courage to act upon them. A man of ressentiment knows and resents his own weakness and mediocrity, and he hates the sight of greatness, which only reminds the lesser party of his own inferiority. And so he fashions a new moral system whereby victimhood becomes a high badge, suspicion signifies a sensitive eye for justice, and group denunciation of lone dissenters is the surest path to virtue.

I am sure many readers of Minding the Campus have come across these types often in their academic careers. I’ve met them again and again, and a great error of my early academic career was to try to befriend them, or at least to try to lay out some common ground of collegiality. How naïve was that! You don’t ingratiate yourself with people who set their vindictiveness behind an exterior of sympathy for the disadvantaged and hurt ones among us. The effort obligated me to a degree of grubbing. The dynamic is never straightforward. They speak the words diversity and tolerance and inclusion, but they don’t mean them. In their mouths, those make-nice sounds are weapons of reproach. When you first encounter these colleagues, they seem tentative and probing, but not out of friendly curiosity about who you are. It’s a fraught examination of where you stand, for such creatures are acutely conscious that everyone takes a side, and they want to figure out whether you’re with-us-or-against-us. Harold Bloom called them the School of Resentment long ago, and he was absolutely right. It took me a while—far too long—to figure out that the chips on their shoulders had nothing to do with me, only with something they fancied I represented.

Defeating Male Writers

They won’t leave you alone because your very existence troubles them. One of Nietzsche’s best interpreters, Max Scheler, put it this way: “the origin of ressentiment is connected with a tendency to make comparisons between others and oneself.” They haven’t the integrity to be what they are, accept themselves, and affirm their status. They aren’t comfortable in their own skin. Other people keep reminding them of what they are not, and it bothers them. This explains the characteristic impulse to detract, to find flaws in George Washington and belabor sexism in Paradise Lost. Scheler again: “All the seemingly positive valuations and judgments of ressentiment are hidden devaluations and negations.”

When, for instance, during the Canon Wars of the late ’80s a banner was unfurled atop the façade of Butler Library at Columbia showing the names “SAPPHO MARIE de FRANCE CRISTINE de PIZAN SOR JUANA INEZ de la CRUZ BRONTE DICKINSON” above the carved names Herodotus, Sophocles, Plato, etc., the organizers weren’t celebrating a female tradition to go along with the male tradition. If that were the case, then students would know more today about the cultural past than they did before. The feminists would have ensured a curriculum that taught students male greats and female greats both. But, no, the real aim was to tear down the male lineage, to displace it and then to forget it. Students today know less about ancient Greece and Rome and the Middle Ages than they did during the ancien régime of the pre-’60s. Multiculturalism didn’t enrich the streams of thought and creation. It only blocked the dominant one. And that was the point.

The bare presence of a Great Books program in one unobtrusive corner of the campus concerns them. They can’t help but draw comparisons. The organizers, few and humble though they be, presume to call their materials “great.” Who are they to say so? No question embodies the attitude of ressentiment better than that one. Who are we to judge? What licenses you to decide what’s great? And don’t you know that when you call some things great, you call other things not-so-great?

The adjective hits them as a challenge; their ressentiment asks them to take it personally. Great Books organizers don’t seem to realize that History passed them by, that they were routed and should NEVER return. In 1989, multiculturalists could get irritated and impatient with Great Books requirements. The War was still on. Ten years later, we were supposed to be past the whole shebang, as the title of a 2001 story in The New York Times captured so well: “More Ado (Yawn) about Great Books.”

Fewer Canonical Works, More Diversity

In the School of Resentment, it’s personal—it’s very personal. A couple of years ago, a distinguished literary critic and teacher in the New York area told me a story about a curriculum revision in his department way back in the early 90s. Several people on the faculty set about doing the customary thing—fewer requirements of canonical works and fields, more diversity. My friend, a solid liberal who would never, ever vote Republican, leaned over to the chairman before one meeting started and said something about the necessity of preserving the classics. The chairman said to him that in ranking the classics above other things, he was saying that people who teach the classics are better than people who teach the other things. My friend replied, “That’s one of the stupidest statements I have ever heard.” The chairman didn’t speak to him for five years.

People who suffer from this kind of resentment don’t like to lose. It’s not enough for them to win, either. They don’t even like to have any adversaries. They must defeat the other side, again and again, their envy ever unsatisfied by any single victory. This is the problem they have with Great Books programs. They make the eradication process more difficult for professors of the left. The professors have all the institutional power on their side, it seems, yet these old-fashioned, atavistic conceptions of tradition and greatness keep popping up like weeds through the concrete. They thought they won, and they did, but the triumph they needed had to be absolute.

It’s a tragic situation for them. Professors who sneered at the tweedy fellows who gave us standard editions of Dryden and Hawthorne, approached the traditional curriculum as if their moral make-up were so superior to that of the Old Times, and used diversity as a screen for tearing down the monuments . . . well, they chose the wrong battle. They seized humanities departments, altered the syllabus, and set identity politics at the core of disciplinary know-how, all in an effort to displace Great Books and the appreciation of them. But the aspiration to greatness is written in the human heart—as long as that heart hasn’t been warped by ressentiment (which is itself a twisted respect for greatness). Students and readers, young and old, still want them, and long after this generation of academics is gone, Great Books will be there to edify and entertain the next.

Image: The Great Books – Wikipedia

Why Was Professor Amy Wax Punished?

Nearly 10 years ago, Penn law professor Amy Wax wrote an excellent book, Race, Wrongs, and Remedies: Group Justice in the 21st Century. Last summer she co-authored a Philadelphia Inquirer op-ed arguing that all cultures are not equal. It provoked a virtual implosion at Penn and beyond. Now she’s done it again, becoming a larger and stronger voice for sanity simply by stating facts that have caused large numbers of Penn law deans, colleagues, and students to erupt.

In a March 21 video interview with Brown University economist Glenn Loury, Wax stated that she didn’t think she’d ever seen black law students in the top of their class because they had been admitted with lower qualifications than their peers. This time, after the predictable response from progressive sources, Wax was “barred from teaching” her required first-year course. She replied in an incisive Wall Street Journal op-ed, and the controversy, with an emphasis on Penn’s cravenness, was ably summarized by Heather Mac Donald here.

It should no longer be surprising that progressives want to ban or silence views that make them uncomfortable, but it is nevertheless striking that in this case, Professor Wax’s offense consisted of nothing more than stating, in effect, that her personal observations confirmed the findings of Richard Sander’s and Stuart Taylor’s “magisterial” (from my review on Minding The Campus) Mismatch: How Affirmative Action Hurts Students It’s Intended to Help, and Why Universities Won’t Admit It.

Two-thirds of Black Students in Bottom 15%

Sander’s empirical analyses, summarized here, found, for example, that “two-thirds of black [law] students end up in that bottom 15%” of their classes and that “the median black student at all of the schools using substantial racial preferences had an LGPA that placed her within only the sixth percentile of the white students. In other words, 94 percent of whites were getting better grades than the median black. Conversely, only about 10 percent of all black students were making it into the top half of their classes.”

Incidentally, or perhaps not so incidentally, Sander’s initial publication of his findings, “A Systematic Analysis of Affirmative Action in American Law Schools,” 57 Stanford Law Review 367 (2004), had first been submitted to the University of Pennsylvania Law Review. “Within a couple weeks, they got in touch with news that the piece had been accepted for publication,” Sander reported in Mismatch. “But a few days later a very embarrassed editor called back. Word about the article had spread to the entire Law Review membership, and a battle had erupted over whether UPLR should be associated with something so controversial. The membership had eventually voted to rescind the offer.”

Law Review Refuses to Run Article

One suspects that the “battle that erupted” among the membership of the Penn law review may have been related to the fact that the UPLR had adopted an affirmative action appointment procedure. As described by The New York Times, “the key to the new system seems to be keeping secret newly established numerical goals, so that law students will not be able to determine who is an affirmative action appointment.”

Law School Dean Ted Ruger asserted that “the Law Review does not have a diversity mandate,” but, of course, that does not mean, or even claim, that it does not have an affirmative action selection procedure. In a similar fashion, Dean Ruger has claimed that Wax’s statements are false, but he has not released any data to refute her. In any event, nothing seems to have improved at Penn law since its shameful treatment of Sander in 2004.

Although the spectacle of snowflake students, professors, and deans responding with shock and awe to controversial ideas is all too familiar, there are implications of this latest Wax contretemps that should raise troubling new concerns. Since Professor Wax’s offense is that her personal observations tended to confirm Professor Sander’s analysis, does it follow that Sander’s work should not be assigned to any class Penn students are required to take?

“Oh, no, that’s not the same!” Penn’s skittish deans and sensitive students might insist, since work critical of Sander could also be assigned to assure “inclusiveness,” etc. But in that case, wouldn’t a professor suggesting, implying, or — heaven forbid — stating that she agrees with Sander also make those objecting Penn students and their enablers feel uncomfortable and “unwelcome”?

The attack on Wax makes clear that the defense of racial preferences requires not only discrimination on the basis of race — that has long been clear — but protecting students from being exposed to unwelcome evidence and suppressing unpopular expression and, perhaps ultimately, even subversive thoughts.

How ‘White’ Western History Has Become an SNL Skit on Campus

In National Review this week, George Weigel writes a pointed commentary on another example of humanities professors undermining their own field. It’s a curious phenomenon, but one you often see. A scholar-teacher steps forward to condemn or distort the materials of his own field, or to rebuke past and present practitioners of it, not realizing the consequences of his actions. I just opened the Chronicle of Higher Education website and found an essay at the top of the Opinion section entitled, “The Whitesplaining of History Is Over.”

It’s written by a history professor at Stanford who begins by noting how the academy used to be “the exclusive playground of white men [who] produced the theories of race, gender and Western cultural superiority that underwrote imperialism abroad and inequality at home.” Fortunately, we are told, women and people of color are now correcting that vile record of white supremacy and changing the profession for the better.

Many conservatives might judge the Chronicle harshly for publishing an essay so packed with ressentiment and so clichéd in its reasoning. Instead, however, we should thank the editors for revealing the presence of such bilious attitudes among the elite professoriate. It helps explain why history enrollments are plummeting. Would you like to spend an entire semester at Stanford being proselytized by this personality?

Weigel’s exhibit comes from the College of the Holy Cross. There, he reports, the incoming chairman of religious studies wrote some years earlier about how the Gospel of John presents the last days of Jesus:

Oddly, John defines Jesus’ masculinity with a body that is open to penetration. . . . Even more oddly, Jesus’ ability to face his “hour” is repeatedly associated with his acknowledging and communing with his Father (12.27–28; 14.12, 28; 16.10, 17, 28; 17.1–25; 18.11), who is, as Jesus explicitly states, “with me” (16.32) throughout this process, which Jesus describes as one of giving birth (16.21–22). What I am suggesting is that, when Jesus’ body is being penetrated, his thoughts are on his Father. He is, in other words, imagining his passion experience as a (masochistic?) sexual relation with his own Father.

This is beyond parody. We would wonder about the mental state of the author if such readings weren’t so common. As Weigel notes, it is “exceptionally inane,” just another hack rehearsal of queer interpretation, one that says more about “the exegete and his imagination than about St. John and his intentions.” It has a high “yuck factor,” too, but that hasn’t stopped the author from enjoying considerable authority at his Catholic institution.

One thing we can be sure of when we read such things by the Stanford historian and the Holy Cross theologian: they don’t give a thought to the impression they make on outsiders. Their colleagues may nod in approval, but the vast majority of Americans, including students who attend such schools, realize instantly that this is a teacher to avoid.

If you love history and learned it from popular historians such as Bruce Catton and Stephen Ambrose, you will learn from the Stanford prof that those figures were just a pack of “whitesplainers.” And your enjoyment of them is shameful. If you’re a devout Christian who regards the Last Supper as the first step in the “holy hush of ancient sacrifice” (to borrow a line from Wallace Stevens), the Holy Cross theologian’s speculations are sickening. One couldn’t imagine a mathematician “queering” topology in this way. An engineer can’t denounce the white men in his own field and profit from it. Progressives are trying to introduce identity politics into STEM fields, of course, but the empirical element and analytical rigor are steady forces of resistance.

No doubt many humanities professors would praise these and similar figures as brave and edgy intellects, but that’s only comforting pretense. The truth is the opposite. A scholar who shows gratitude for traditionalist projects such as Great Books of the Western World is the one who risks his reputation among those who count—that is, among the people who will review him for hiring and promotion.

There is a greater sin than self-regard here, though. These fools don’t realize that you can’t knock your own field and be an advocate of it. As long as history and English and theology were secure in the undergraduate curriculum, with enrollments high and resources steady, one could play the adversarial game, treating one’s own field and colleagues with a hermeneutics of suspicion. But now that the Golden Age has passed, when Boomers poured onto campus and the humanities fields were a default major, the professors must think more about their image.

And there’s the problem. They don’t want to do it. To inspire students with the greatness of the materials of their field, they would have to believe in that greatness. They don’t. To allow students to savor literature and history and art would be to fall short of the critical thinking professors say is essential to humanistic study. These figures prefer guilt, or risqué irreverence, or demystification. They don’t realize how much of a downer they are. They have presided over the fields during their downfall, but they won’t look in the mirror and acknowledge their responsibility. Their shtick is set, and it’s worked for them for so long that they don’t see any reason to change.

Will Colleges Divest from The Gun Industry?

Bowing to pressure from MoveOn.org, Monsignor Franklyn M. Casale, the President of St. Thomas University in Miami, issued an ultimatum on March 13 to Anita Britt, his Chief Financial and Administrative Officer: choose between your work for this university and membership on the Board of American Outdoor Brands, the former Smith & Wesson Holding Company, cited as the maker of the AR-15 that Parkland shooter Nikolas Cruz used. You can’t do both. Britt chose the gunmaker.

It was an abrupt reversal for Monsignor Casale, who as recently as March 9 had issued a letter of support for Britt, stating that the university has a policy aligning with one adopted by the U.S. Conference of Catholic Bishops, “calling for reasonable approaches to gun violence.” Casale claimed that “American Outdoor Brands provides her the opportunity to participate in helping the company achieve its objectives of making our communities safer. Her role with the company does not conflict with her responsibilities here at St. Thomas. We look forward to her continued participation in our leadership.”

Across-the-Board Divestment

Four days later, in the wake of a well-publicized MoveOn petition that garnered nearly 300 signatures, Casale changed his mind and issued the ultimatum to Britt. Casale tried to explain his reversal to the Miami Herald: “After my statement of this past Friday, it has become clear that many of the sensible and reasonable solutions to this gun epidemic, which have been discussed previously, were becoming less and less clear.”

The MoveOn petition calling for Britt to step down was created by Praveen Kathpal, an Alexandria, Virginia, resident who is a vice president at an energy storage company. Kathpal also serves as Chair of the Board of Directors of the Energy Storage Association—an energy advocacy group. Claiming to have become an anti-gun advocate following the 2017 shooting at a Congressional baseball practice at the YMCA ball field his son had attended, Kathpal sent an email to the Miami Herald warning that “if individual influential people are urged to confront their place in America’s gun violence epidemic through societal pressure, they might make different choices.” Kathpal has also created a second MoveOn petition calling for the resignation of Smith and Wesson Board member Greg Gluchowski, the CEO of a Cincinnati company.

The St. Thomas reversal is just the latest attempt by MoveOn to control campus conversations on a long list of progressive causes including divesting from fossil fuels, increasing access to reproductive rights, expanding transgender rights, and most recently, gun control. Badgering board members is just the start. There are now calls for universities to divest from the gun industry.

Divestment campaigns at universities began in the 1970s when activists demanded divestment from South Africa because of apartheid. Bloomberg reported that by 1988, 155 educational institutions had severed investment ties, pressured by protests at Harvard, Columbia, Michigan State and others. Aligning an institution’s money with its stated values is reasonable, especially when socially responsible investing might have positive impacts. But there are many gray areas.

More recently, divestment has focused on fossil fuels. A few years ago, the University of Dayton—a Catholic university in the middle of the rebounding coal and shale industry—announced its intention to become the first Catholic university in the country to divest coal and fossil fuels from its $670 million investment pool.

Although pressure for universities to divest from the gun industry has been unsuccessful in the past, it is likely to gain traction in the wake of the Parkland shooting. According to Bloomberg, the University of Notre Dame, with $11.8 billion under management, “adheres to guidelines from the U.S. Conference of Catholic Bishops, which asks members to avoid companies that do harm.”

Notre Dame declined to say whether it holds firearms investments, citing a policy of not commenting on its investments. Bloomberg reports that some colleges and universities are prohibited from social activism. The University of Texas, with $40 billion under management, is governed by a policy banning investments that would “advance social or political purposes.”

While some colleges, like Grinnell, have continued to honor agreements with donors who hold the highest positions at the National Rifle Association, it is likely that others will attempt to divest. Inside Higher Ed reports that “interest in divestment has spiked in recent weeks.” Some of the largest university money managers, like TIAA-CREF, have been targets of online petitions.

It is likely that the pressure St. Thomas University faced will be replicated elsewhere, as anti-gun billionaires like George Soros continue to bully colleges and universities with ties to the gun industry. Soros has a long history of this kind of bullying. In 1998, Soros funded an unsuccessful lawsuit against gun manufacturers. At the World Economic Forum in 2016, when Soros was asked by International Business Times whether he believed investors who support gun control should divest from firearms companies, he responded: “I’m very much against guns, and if it can be organized on a large enough scale, I wouldn’t be opposed to it.”

Still, his antipathy to guns has not prevented Soros from benefiting financially from the gun industry. Securities and Exchange Commission filings reviewed by the International Business Times demonstrated that Soros Fund Management was a top institutional shareholder in Vista Outdoor—one of the country’s top ammunition manufacturers—and in Olin Corporation, which makes ammunition under the Winchester name.

The Soros Fund purchased a stake in Vista in early 2015 and held $11.4 million worth of shares in the company at the end of September 2015. Soros is not alone in this. While the American Federation of Teachers has backed gun control proposals, the pension savings of its members have been invested in the firearms industry, and the California State Teachers’ Retirement System owned $7 million worth of shares in Vista and Olin.

Whether the pressure to divest from the gun industry will succeed on additional college campuses remains to be seen. Northwestern University, Texas A&M University, the University of California system and the University of Michigan have already indicated that they do not have investments in firearm manufacturers. Others are likely to follow.

Jordan Peterson Provokes the Angry SJWs

I didn’t really want Jordan Peterson to provide me with 12 Rules for Life. It was enough that Professor Peterson defied the transgender advocates at the University of Toronto who wanted him to adopt nonsense pronouns to address his students. It was heartening to see Professor Peterson stand his ground against that obnoxious guardian of PC verities, Britain’s Channel 4 political correspondent Cathy Newman. But I had no special interest in watching his YouTube lectures and podcasts or plunging into Professor Peterson’s 400-page how-to book (12 Rules) for those suffering the anomie of modern life.

Then along comes Pankaj Mishra in the New York Review of Books to explain the link between “Jordan Peterson & Fascist Mysticism,” and I see no real choice but to pay some attention to the slightly eccentric Canadian defender of English pronouns. Is he the emerging leader of a crypto-fascist cult? Someone leading youth down the path of dangerous lies and illusions? Or is he, as I had supposed, a well-spoken contrarian who has decided to take a personal stand against some of the self-destructive silliness of our age?

I won’t keep you in suspense. Peterson is pretty much who he appears to be to those who have not become unhinged by their hinge-destroying wokeness. That is, Peterson strives to be a gentleman, but one who has honed some sharp opinions about feminism, social justice warriors, and attempts to put progressive ideology in the center of domestic life. These things mark Peterson as an enemy to Pankaj Mishra, an Indian essayist and novelist, who has something of a side-specialty in penning diatribes against Western scholars who do not come up to his standards.

In 2011, Mishra attacked the British historian and Harvard professor Niall Ferguson for his book Civilization: The West and the Rest, accusing Ferguson of racism. Ferguson responded in strong words, quoted in The Guardian, describing Mishra’s critique as “a crude attempt at character assassination” that “mendaciously misrepresents my work but also strongly implies that I am a racist.” He called Mishra’s article “libelous and dishonest.”

Mishra likewise went after the distinguished British journalist Douglas Murray in a New York Times review of Murray’s book, The Strange Death of Europe: Immigration, Identity, Islam. He characterizes Murray’s book as “a handy digest of far-right clichés,” “fundamentally incoherent,” and marked by “retro claims of ethnic-religious community, and fears of contamination”—all without any attention to what Murray truly says. Mishra goes out of his way to continue his attack on Murray in his essay on Peterson, linking the two as members of the “far right sect” that idolizes Solzhenitsyn and deplores “the attraction of the young to Bernie Sanders and Elizabeth Warren.” The “sect” appears to consist of everyone who has doubts about the left’s current conception of “egalitarianism.”

Mishra no doubt pleased his readership at the London Review of Books and The New York Times with his attempted take-downs of Ferguson and Murray, and I expect no less from the readers of the New York Review of Books in the case of Peterson. The left is perpetually hungry for figures it can demonize. It can gorge on its hatred of Trump but still feel an appetite to devour some other prey. Mishra’s article is an attempt to supply a recipe. Peterson is staked out mainly because he has become so popular, or, as Mishra puts it, Peterson’s “intellectual populism has risen with stunning velocity; and it is boosted, like the political populisms of our time, by predominantly male and frenzied followers.”

So, part of the problem with Peterson is that he attracts those frenzied deplorables. What does Peterson really have to offer in 12 Rules for Life: An Antidote to Chaos?

No, Not Mysticism

Peterson is a clinical psychologist who has wide-ranging interests in mythology, literature, religion, and philosophy. When Mishra jabs at him for “mysticism,” he goes wide of the mark. Peterson is a rationalist attempting to find a core of meaning in the world’s diverse myths and religions. When Mishra doubles up with the charge of “fascist mysticism,” he is apparently extrapolating from Peterson’s adoption of Martin Heidegger’s use of the term Being (“with a capital B”). Heidegger was one of the 20th century’s most important philosophers, but he infamously threw in his lot with the Nazis, and he is not everyone’s bottle of schnapps. Not mine at any rate. Still, Being “with a capital B” is a long-established philosophical term, and Peterson provides a layman’s definition at the outset. “Being” is “the totality of human experience,” in contrast to objective reality. It is “what each of us experiences, subjectively, personally and individually, as well as what we each experience jointly with others.”

That’s as technical as Peterson ever gets, and it is neither mystical nor fascist. The rest of the book consists of happily phrased bits of advice (the “12 rules”) that are really occasions for discursive essays that weave together humor, science, and common sense. The literary tradition that Peterson belongs to is that of 18th-century English essayists such as Addison and Steele in The Spectator and Samuel Johnson in The Rambler. Those writers could be framed as “intellectual populists” too since their goal was to uplift the growing middle class through palatable moral instruction. Different times call for different tones, but it would not be far from the mark to say that Jordan Peterson is the Joseph Addison of the 21st century.

No Slouching, Shoulders Back

Rule 1: “Stand up straight with your shoulders back.” I’ve heard this often enough from my gym trainer to wonder whether he should get a cut of Peterson’s royalties. But, no. Along with most people, I heard it through childhood. It is age-old wisdom. I know of no culture where children are taught, “Slouch when you stand, and hunch your shoulders.” So, Peterson starts on firm ground. His first rule isn’t just for desk-bound Americans or slouchy teenagers. It has common humanity written into it, or, if you will, Being.

But it is less the rule than the essay written around the rule that counts. In this case, Peterson begins by comparing the territoriality of lobsters and house wrens, thus establishing that the animals that are wired to defend themselves range from the ocean bottom to the air. We then learn a bit about the dire consequences to a lobster that loses its fight for its territory. (Its brain shrinks.) This leads to some comments on neurochemistry and an observation on the “unequal distribution” of “creative production.” Most scientific papers are published by a handful of scientists. A relative few musicians produce most of the recorded music. And so on. Once dominance is established, the winner usually prevails without a fight. All the victorious lobster need do is “wiggle his antennae in a threatening manner.”

This takes us but a few pages into the world of Rule 1, but the reader by this point can foresee the destination. Standing up straight with your shoulders back is the way human beings signal confidence and mastery of the situation. We are men, not lobsters, but we are all part of a biological order that follows the same basic rules. “Walk tall and gaze forthrightly ahead,” says Peterson, and good things will happen. People will assume “you are competent and able.” You will be “less anxious.” “Your conversations will flow better.” And “you may choose to embrace Being, and work for its furtherance and improvement.” And after that, “you may be able to accept the terrible burden of the World [with a capital W] and find joy.”

The whole essay flows smoothly with Peterson’s lightly-worn erudition until it hits the curb at the very end with that joyful embrace of Being. Some readers no doubt will find it gives a little spring to their intellectual step—a sense that we have transcended the order of Crustacea and are now in the sad but ennobling predicament of humanity. I don’t mind so much Peterson’s efforts to elevate the prospect, but his stepping stool of Heidegger’s jargon is intrusive.

But does it make Peterson a fascist mystic? No, it makes him, like most scholars, someone who indulges some of his whims.

Hierarchies Found in Nature

In his explanation of why Peterson is so bad, Mishra touches on Peterson’s universalism. Peterson, he says, “insists that gender and class hierarchies are ordained by nature and validated by science.” This is a serious distortion of Peterson’s point. Peterson, it can be fairly said, argues that the principle of hierarchy can be found in nature and that humans are not at all exempt from that principle. But that is a very long way from saying that Peterson validates “gender and class hierarchies” in general. He does nothing of the kind.

Mishra speculates that “reactionary white men will surely be thrilled by Peterson’s loathing for ‘social justice warriors.’” And he proposes that “those embattled against political correctness on university campuses will heartily endorse Peterson’s claims” that whole academic disciplines are hostile to men. Well, now that you mention it, Pankaj, the latter statement seems securely grounded in the facts. But isn’t it a bit odd to attack a book and an author by speculating on the sorts of readers the book may attract? For what it is worth, I suspect the core audience for 12 Rules for Life includes plenty of young women as well as young men, launched into adult life from colleges and universities that have given them no serious moral preparation at all—only a basketful of social justice slogans and anti-Western attitudes.

David Brooks extolled Peterson’s book in The New York Times but, like Mishra, took it as directed mainly at young men, for whom it counsels “discipline, courage and self-sacrifice.” Oddly, it is only young women one sees reading it on the subway. Perhaps they are looking for those disciplined, courageous, and self-sacrificing young men, who are now so conspicuous by their rarity.

Peterson is stepping into that space with a non-sectarian message that respects the multicultural sensibilities of the young. He is firmly grounded in Western thought and literature but is ready at any moment to draw on non-Western cultures and traditions. This isn’t always to elevate those cultures. When he writes about Western homicide rates, he compares them with those of the !Kung bushmen, dubbed by anthropologists “the harmless people,” whose annual homicide rate is eight times that of the United States. Of course, the Kalahari !Kung are pretty peaceful compared to other primitive peoples.

At Least Don’t Lie

It seems a bit unfair to Peterson to divulge all twelve of his rules, though they are freely available on the Internet. Moreover, the rules themselves are not the heart of the book. What he builds around the rules is what counts. But for the flavor of the thing, here are a few of Peterson’s aperçus:

Rule 2. “Treat yourself like someone you are responsible for helping.”

Rule 5. “Do not let your children do anything that makes you dislike them.”

Rule 8. “Tell the truth—or, at least, don’t lie.”

Rule 9. “Assume the person you are listening to might know something you don’t.”

They just don’t make fascist mystics the way they used to.

Either that or the vitriol of reviewers for progressive journals is reaching new concentrations. Essays such as Pankaj Mishra’s “Jordan Peterson & Fascist Mysticism” seem designed to give permission to liberals to sneer at writers whom they have never read. An Indian intellectual says that so-and-so is a racist, an ethno-nationalist, a fascist, a mystic. You are therefore on good ground to ignore so-and-so, and if his name comes up in conversation, you know exactly which epithet to apply.

Peterson, as far as I can see, deserves his popular success. He is a morally serious, highly literate writer who has important things to say. He says them rather well in an entertaining manner that doesn’t compromise either his clarity or his essential points. 12 Rules for Life isn’t faultless. Peterson sometimes wanders too far afield, and his forty-some mentions of Being are about thirty-nine too many. But for readers trying to find their way through the “chaos” of contemporary North American cultural decline, these “rules” are a good place to begin. If you don’t like all of them, that’s fine. Peterson will at least make you think about why you don’t like them, and perhaps you will find your way to a better distillation of wisdom. But you probably won’t find a better Virgil to take you safely step by step through today’s Inferno.

Open Season on College Presidents As Faculty ‘Mobs’ Wield Power

The no-confidence season for college presidents got off to an early start this spring with a vote by the Michigan State faculty against the university’s interim president and the entire Board of Trustees in the wake of the Larry Nassar sexual abuse scandal. Starting with the angry rebellion against Harvard president Lawrence Summers in 2005, faculties have been increasingly willing to mobilize to bring down senior-level administrators. Summers was targeted after he suggested at an academic conference that innate male-female differences might provide a partial explanation of why mathematics and engineering faculties remain so heavily male.

While there was no evidence of discrimination in hiring, Harvard’s hastily formed Caucus for Gender Equality charged Summers with failing to hire enough female professors, and Summers retracted his suggestion and issued what The Atlantic’s Stuart Taylor, Jr. called a “groveling Soviet show trial style apology.” A short time later Summers resigned.

Sometimes faculty-led protests, what social scientists call “mobbings,” can have deadly consequences. In 2006, UC Santa Cruz chancellor Denise Denton leaped to her death from a 42-story San Francisco high-rise in the wake of a well-orchestrated attack by the Santa Cruz faculty that included death threats, harassment, vandalism and a hostile media campaign. At the height of the protests, someone threw a large metal pole through a window in Denton’s home, shattering glass throughout her living room. The San Francisco Chronicle noted that Denton, who had received a doctorate in electrical engineering from MIT and won a prestigious national award for encouraging women and girls in science, had “very high standards…she expected people to perform.”

Before it was applied to academia, the term “mobbing” was used almost exclusively in zoology, characterizing the behavior of small birds ganging up aggressively on a larger predator bird. Emboldened by the Summers success, faculty at the New School mobilized in 2008 to remove their president, Bob Kerrey, a Vietnam war hero, winner of the Medal of Honor, and two-term Senator (D-NE). Faculty complaints focused—as they nearly always do—on faculty perceptions of Kerrey’s lack of commitment to shared governance. Professor Jim Miller, the public face of what New York Magazine called “the New School faculty’s discontents,” complained that Kerrey never understood “what was special” about the New School. Miller believed that Kerrey “doesn’t get its special anarchy and founding moments. He just sees it as an economic puzzle to be solved.”

The faculty had a special contempt for Kerrey’s war service—leading the interviewer to write that it was hard not to look around and think: “These are the kinds of people who’ve given Kerrey grief from the moment he came home from Vietnam.” Although Kerrey survived for a short while after the faculty no-confidence vote, he told a New York Magazine interviewer: “I was sitting with my 7-year-old with a bunch of screaming maniacs outside my building, thinking, who needs this?”

No-confidence votes are almost always about shared governance—usually focusing on how university funds are allocated to faculty-favored programs. Similar complaints emerged at Mount St. Mary’s in 2016 when the faculty voted “overwhelmingly” to ask President Simon Newman to resign. That vote drew national media attention when angry faculty members forwarded confidential emails to the campus newspaper. Newman, a Los Angeles private equity manager and strategic planner, was hired in 2014 amid serious budget concerns at Mount St. Mary’s. In 2013, Forbes ranked it one of the “least financially fit schools in America.” In the Forbes financial rankings of 927 colleges, Mount St. Mary’s was one of 107 colleges to receive a D grade—ranking 888th out of 927 in terms of balance sheet and operational strength.

Newman was hired to help turn the ailing university around. A year later he found himself at the center of a faculty-led firestorm over some intemperate remarks he made about retention. According to media sources, Newman was talking privately with some faculty members about retention strategies when he jokingly said: “This is hard for you because you think of the students as cuddly bunnies, but sometimes you just have to drown the bunnies…put a Glock to their heads.” The faculty did not appreciate the joke.

But the truth is that the intemperate remark was not the real cause of faculty anger. Rather, Newman was suggesting curricular changes to reduce the number of required philosophy and humanities courses in the Core. The Core is often “sacred ground” for humanities professors on liberal arts campuses like Mount St. Mary’s—a kind of “third rail” that administrators know can never be touched. It is not a coincidence that most “no-confidence votes” emerge from Arts and Sciences faculty.

Newman paid a high price for even suggesting that it would benefit students to have the flexibility to take more science and math courses by reducing—slightly—the required humanities courses. In an 87 to 3 vote on February 12, 2016, the faculty released an open letter saying, “we appeal to your generosity of spirit and ask that you resign your position for the good of our community by 9 am on February 15, 2016.” The Board eventually demanded that Newman step down.

There is often a critical incident—like Newman’s bunny joke—that provides a rationale for the overt mobbing to begin. Researchers know that these incidents are just the “struck match…the kindling’s been stacking up for years, dry and brittle and some of it drenched in gasoline.” This is the real cause of the conflagration. Issues surrounding shared governance almost always precede the incident—and then the rumors begin. At Mount St. Mary’s there were rumors that President Newman, a former private equity chief executive, was going to “dilute” the college’s Catholic identity.

Jesuit universities have been especially hard hit, as Fordham University, Creighton, and Saint Louis University have all experienced “no-confidence” votes in their presidents in the past few years. On April 7, 2017, faculty members at Fordham voted no confidence in the university’s president, the Rev. Joseph M. McShane, in response to a recent salary and benefits package offered by university administrators. The faculty of the Jesuit Creighton University issued a no-confidence vote in 2015 against the strategic plan that had been crafted under the former Creighton president, the Rev. Timothy R. Lannon, S.J.—claiming that it was drafted and enacted without allowing for thoughtful criticism and feedback from students, faculty and staff and tended to “import aims, goals, reward systems, and methods of social engineering suited more to corporate America than to an institution of higher education.”

According to media reports, Saint Louis University’s Fr. Lawrence Biondi devoted 26 years to leading the Jesuit university to prominence—increasing both the number and the quality of the students, doubling the acreage of the school, stabilizing a huge swath of the city, making the Grand Center arts district, and extending SLU’s influence throughout the city and the world. St. Louis’s mayor Francis Slay called him “one of America’s greatest college presidents.” But that could not save him from a faculty mobbing action apparently focused on a tenure dispute and an unpopular academic vice president.

Of course, some of these no-confidence votes are the result of self-inflicted wounds. Last month, Beverlee J. McClure, the beleaguered former president of Adams State University, drew national media attention because of allegations that she bullied faculty and retaliated against her critics—and wore an unfortunate “overweight plumber” costume to a faculty Halloween party.

Yet another offensive Halloween costume choice appears to have contributed to derailing the career of University of Louisville President James Ramsey, after a 2015 photo surfaced showing Ramsey and his staff dressed as Mexican stereotypes with fake mustaches, sombreros, and maracas. While the costume was offensive to some, the critical incident brought attention to serious financial concerns. A 2017 audit revealed that the university suffered more than $100 million in losses from mismanagement and excessive spending during his tenure. The Louisville Courier-Journal reported on July 14, 2017, that “the university is poised to go after (his) personal assets.”

An increasing number of college presidents are being swept up in the fallout from the #MeToo movement. Bates Technical College President Ron Langrell was placed on paid administrative leave last month over allegations that he intimidated and demeaned employees and engaged in “unwanted hugging.” And last March, the University of Texas at San Antonio’s longtime president, Ricardo Romo, announced that he would retire after he was placed on administrative leave during an investigation of his “improper hugging” of faculty and staff members.

Last year, Briar Cliff University President Hamid Shirvani announced his resignation after only 14 months on the job, citing a “combination of family, personal and professional considerations.” Briar Cliff’s Board dismissed local media reports that he was investigated for sexual harassment, claiming that the coverage “inaccurately and inappropriately cast a cloud over his leadership.” The Sioux City Journal reported that Shirvani had lasted just 11 months as chancellor of the North Dakota University System and received a buyout of $925,000 over concerns about his management style and treatment of staff. In a prior presidency, at California State University, Stanislaus, he received a “no confidence” vote in his leadership from 91 percent of the 264 professors on campus.

In October 2017, a total of 56 faculty members out of 120 full-time faculty at Assumption College voted “no confidence” in Assumption’s President Francesco Cesareo because of declining enrollments and layoffs since 2016. Last year, the faculty of the Catholic University of St. Thomas in Houston voted “no confidence” in President Robert Ivany’s leadership after he sent an email to faculty members in philosophy and English advising them that their contracts had been delayed because the departments were “under review for potential reorganization and/or program elimination.”

Jeanine Stewart, professor of psychology at Washington and Lee University, thinks that the very design of academia fosters these kinds of counterproductive behaviors. She suggests that when large numbers of faculty members report to a single dean or provost, informal pecking orders emerge. She calls these virtual power structures “soft hierarchies” in contrast to the kind of hard hierarchies that you see on an organizational chart. It is within the highest tiers of the soft hierarchies that power is concentrated—often in the humanities. Summers, Newman, Kerrey and other victims of mobbing behavior likely never understood the power of a soft hierarchy. Their lack of understanding of the soft hierarchical power dynamic undermined their ability to work cooperatively with faculty leaders.

Faculty have been empowered by their successes, but their victories may have come at a cost higher than most schools can afford to bear, as financial pressures have already begun to take a toll. An article published on Inside Higher Ed, titled “The Culling of Higher Education Begins,” reveals that the number of colleges and universities eligible to award federal financial aid fell by 5.6% from 2016 to 2017—the fourth straight year of declines. No-confidence votes create instability and uncertainty for everyone, leading potential students and their parents to lose confidence as well. Perhaps it is time to re-think what is becoming a self-defeating strategy.

Why a Penn Professor Was Vilified for Telling the Truth About Race

Professor Amy Wax at the University of Pennsylvania Law School is once again the target of students and faculty members who have ginned up a racial grievance against her. The issue is that she said something apparently true that her critics would rather leave unsaid. The immediate consequence is that Penn Law Dean Ted Ruger has stripped Wax of her teaching assignment in the mandatory First Year curriculum.

What Wax said, essentially, is that black graduate students at Penn Law do less well academically than other students.

Probably what lies behind Wax’s observation is that the Law School admits black students at a lower threshold of academic qualifications than it admits white and Asian students. That’s a guess, based on a lot of circumstantial evidence. The University of Pennsylvania is a private university and does not make available a racial breakdown of its admissions standards. Across the country, battles rage to get even public law schools to acknowledge the extent of the racial preferences they use to bolster the numbers of black enrollees.

Richard Sander and Stuart Taylor, Jr.’s book Mismatch: How Affirmative Action Hurts Students It’s Intended to Help and Why Universities Won’t Admit It (2012) remains the definitive statement of the problem. (The National Association of Scholars is not a bystander on this issue. We just filed an amicus brief in Sander v. State Bar, an appeal currently pending in California.)

To understand why Wax’s simple observation would occasion such heated attacks against her, we must keep in mind the furious effort of will by proponents of racial preferences to deny the realities of the situation.

Cadmus and Company

Racial preferences in college admissions are dragon’s teeth.

In Greek mythology, when the hero Cadmus kills the dragon that guards Ares’ spring, he plants the creature’s teeth, and up spring ferocious and fully-armed warriors. This odd bit of agriculture isn’t a freak occurrence. The hero Jason also plants a set of dragon’s teeth and likewise harvests a bunch of ill-tempered warriors. Dragon’s teeth is a handy image for what happens when we think we solve one big problem—an unfriendly dragon—but end up creating a collection of even worse problems.

When we deny that racial preferences result in classes in which many of the black students are less qualified and less capable than other students, we are sowing dragon’s teeth. The teeth come back as social justice warriors.

The warriors may silence the messenger, but that can’t extinguish the truth. In 2005, The New York Times published the results of a study that appeared in The Stanford Law Review that concluded, “Affirmative action actually depresses the number of black lawyers, because many black students end up attending law schools that are too difficult for them, and perform badly…. Once at law school, the average black student gets lower grades than white students: 52 percent of black students are in the bottom 10th of their first-year law school classes, while only 8 percent are in the top half. And the grades of black students drop slightly in relative terms from the first year of law school to the third.”

Round One

Professor Amy Wax stirred up controversy last August when she co-authored a newspaper op-ed in which she praised “bourgeois values.” She meant things like hard work and getting married before having children.

Some Penn Law School students and faculty members at the time judged Wax’s thoughts to be racially hurtful and demanded that Wax be punished. They lost that round. Wax had done nothing beyond the scope of her academic freedom, and she held her ground.

Round Two

But her enemies are now back with a new plan to punish her—a plan that has been adopted in part by Dean Ruger.

In September, a few weeks after the famous “bourgeois values” op-ed article, Professor Wax mentioned in a lecture to first-year law students that she had never “seen a black student graduate in the top quarter of the [Penn Law School] class and rarely, rarely in the top half.”

Having discovered a video of this lecture, her critics drafted a petition addressed to Dean Ruger. The petitioners call Wax’s remarks “disparaging, false and deeply offensive claims.” They also assert that her broad statement (Wax mentioned no individuals) was a “clear violation” of “Penn Law’s anonymous grading policy.” And they call on Dean Ruger to “dispel the lies” in Wax’s statement; “Permanently remove Professor Wax from teaching 1Ls” (the mandatory first-year law course she has been teaching); “Permanently remove Professor Wax’s appointments to the Clerkship Committee, and any other committees that involve leading and directing the law school”; and take all these actions “publicly.”

Dean Ruger accordingly declared publicly that Wax’s statements are false. He wrote:

It is imperative for me as dean to state that these claims are false: black students have graduated in the top of the class at Penn Law, and the Law Review does not have a diversity mandate. Rather, its editors are selected based on a competitive process. And contrary to any suggestion otherwise, black students at Penn Law are extremely successful, both inside and outside the classroom, in the job market, and in their careers.

On its face, Dean Ruger’s statement seems to mean that Professor Wax got it wrong. But we shouldn’t forget that this is lawyer language, and it has built into it some curiously slippery clauses. Dean Ruger doesn’t actually say that Wax’s claims are false. He just says that “it is imperative” that he say they are false. The imperative is that he has a bunch of angry students demanding that he say so, regardless of accuracy. By golly, Dean Ruger is a man who lives up to the imperatives, which may not include telling the truth.

As of this writing, no one—not the dean, and not the petitioners—has come forward with any evidence that Professor Wax’s comment was inaccurate. It presumably wouldn’t be hard to check whether any black students had graduated in the top quarter of their Penn Law School classes. I don’t suppose Professor Wax to be error-proof. But if there are one or several such graduates to be found, where are they?

Dean Ruger gave the protesters two more of their demands: he took Wax’s first-year course away from her, and he conducted his actions in public by issuing them as a widely distributed “message.”

Frenzy

The accuracy of Wax’s observation has been challenged, but by means of indignation and sheer assertion, not evidence. If it happens that evidence of overlooked students in that top quarter does emerge, it is likely to be the sort of exception that proves the rule. Clearly, no substantial number of black students are in this quartile. If there were, Wax’s statement would be laughed at rather than made the gravamen of an accusation.

Up from the ground in which the dragon’s teeth of racial preferences were buried have sprung the armed warriors desperate to defend racial preferences. These warriors want Professor Wax silenced, ostracized, and exiled. They may seem to have achieved a good portion of what they wanted, but I wouldn’t count on that as a long-term victory for their cause, or as a moment for Dean Ruger to bask in their approbation.

Professor Wax, who serves on the board of the National Association of Scholars, knows how to defend herself. The spectacle of students and faculty driven to a kind of frenzy by the mention of facts deemed unmentionable is not likely to redound to the reputation of Penn’s Law School.

The public at large will understand the main point: Admit lower quality applicants to an institution of higher education, and the individuals so admitted will, on the whole, perform more poorly than those who are admitted according to higher standards. It is a hard truth. We have imposed taboos in higher education against talking about it, but that doesn’t change the reality. The taboo merely fuels the rage of those who have invested themselves in keeping up the illusion.

Image: Hendrick Goltzius, Cadmus fighting the Dragon

Five Realities of Tribal Politics

Chattering classes throughout the world are talking about identity politics, and with good reason. It is propelling the so-called populist movements, and the response to those movements, which are shaking the foundations of almost every society today. Whether a polity is democratic, authoritarian, or anarchic, it is awash with clamorous appeals to relatively narrow allegiances based on race, religion, class, social position, gender, ideology, party — and typically some combination of them.

In the United States, college campuses, where our future voters and leaders are seeded, are the breeding grounds for these sectarian dispositions. There, young people newly emancipated from their families’ supervision are free to define themselves afresh. There, they are pressured by peers and professors alike, as well as social media, to endorse the orthodoxy of the tribe.

The notion that there are but two sides on important questions — right versus wrong, tolerant versus bigoted, progressive versus conservative — is an unfortunate feature of this war of words. The real world, when they finally enter it, will discipline their minds in ways that their campus lives have not, but the residue of ideologies already implanted there may continue to shape them as voters and fellow citizens. So, here’s this professor’s effort to clarify the nature of identitarian rhetoric.

I offer five propositions that may confound partisans on all sides.

First, all politics is identity politics even though the identities that are emphasized constitute but a small part of who we really are. This is neither a new phenomenon — bitter, seemingly unbridgeable divisions have often occurred in American history — nor the exclusive or even predominant preserve of the left or the right. In any democracy, electoral politics means, among other things, dividing people up rather than uniting them, which is much harder. Getting elected entails “rubbing raw the sores of discontent” and “mobilizing bias” (as two analysts have put it).

All human societies are tribal. As Amy Chua argues in a new book, Political Tribes: Group Instinct and the Fate of Nations, we all seek warmth and solidarity from those who we think are like us in some important respects. But beyond a certain point, tribalism can be pathological. Half of Republicans and a third of Democrats say they would be upset if their child married a member of the other party, and these antipathies are steadily deepening. This growing polarization of the parties parallels the clustering of partisans in states, localities, and even neighborhoods, and it is occurring within parties as well.

Second, the appeals to traditional transcendent unifying norms — notably “American values” and “the American Dream” — are debatable; they no longer do the unifying work that they once did. This, even though almost all Americans, including the poor, enjoy a rising standard of living. In truth, these appeals beg fundamental questions of morality and complex policy on which Americans significantly differ, so it is not surprising that we cannot agree about value-laden and empirically contested issues like immigration, the government’s role in healthcare, the integrity of law enforcement, abortion, gun control, and many more.

Third, even the terms and categories that we use to think about and discuss identity issues are over-simplified — in some areas grotesquely so. Occupying center stage is the subject of race. Although science long ago showed it to be a meaningless, misleading concept, both sides deploy it aggressively and simplistically to conceal inconvenient truths. The right contends that race is only a battleground because activist groups like Black Lives Matter, campus protesters, and other “outside agitators” exploit it. Leftist groups divide society into whites, blacks, and other people of color even though a significant share of Americans carry other ancestries, and intermarriage among these groups has greatly increased. Campus activists deem whites to be categorically “privileged,” yet the vast majority of poor people are white or non-black, and over half of “Hispanics,” many of them poor, self-identify as white. Only about a third of black students at Harvard had four grandparents descended from slaves; the great majority were West Indian and African immigrants or their children. The good news is that far more young people socialize and marry interracially, unlike their more restricted grandparents, who in any event are dying out.

Fourth, identity-talk makes no serious effort to engage with the teachings of social science. Yet, empirical facts, careful distinctions, and hard-eyed assessment of policy consequences could complicate the easy moralizing and aggressive guilt-mongering in which identitarians of all stripes wallow. For example, sociologist Orlando Patterson has shown that the life experiences of black men and black women are so different that to treat them as a single “community” is vastly, even tragically misguided. By the same token, “immigrants” are not a single category but rather a congeries of people with sharply different social, cultural, economic, and legal statuses – and hence identities. To speak of immigrants generically, as we all tend to do, obscures their differences and misleads our judgments about them.

Finally, identity-talk is almost always more certain of its own premises — and more ignorant of, or indifferent to, those on the other side of the lines it draws — than it should be. Smugness in the face of contradiction is endemic. Cosmopolitan liberals, for example, feel beleaguered by what they take to be an oppressive conservative hinterland now controlling Washington and the country. (Here, the classic New Yorker cover lampooning this view comes to mind.) Yet as others have observed, liberalism has actually won the culture war, which in the long run is far more consequential for how we think, live, and vote. Conservatives have their own grievances, intensified by their own blind spots. Their bitter attack on Obamacare (which borrowed from Republican ideas) despite their inability to propose a viable alternative while controlling the machinery of government is but one example; another is the ease with which evangelical Christians continue to support a president who flagrantly violates their most fundamental moral commitments.

Yes, we are tribal, and yes, our tribes are blinded by ignorance and self-righteousness. Perhaps this has always been true. But our politicians were once simply better at unifying both the voters they had just tactically divided and the institutions fragmented by our Constitution. In the end, we must reduce the hold that our tribes have over us, and we must elect those who share this goal.

The Real Fallout from High School Walkouts

On February 21, many high school students across the country staged a brief walkout from their classes to protest school shootings. Grieving students at Marjory Stoneman Douglas High School in Parkland, Florida, are also helping to organize even larger national student walkouts—hashtags #Enough and #NeverAgain—on March 14 and 24 to protest lenient gun laws. These actions are a mistake. Disruptive activism violates students’ freedom to learn, an essential ingredient of academic freedom.

The students at the Parkland high school who helped organize their own walkout and who have organized the coming national walkouts have been lionized in the media. David Hogg, Sara Imam, Cameron Kasky, and several other Parkland students have been featured in interviews on television and cited in news stories for their roles in calling on legislatures to adopt more stringent gun control measures and calling on fellow students across the country to walk out of class in protest.

Hogg, Imam, and the others may be perfectly sincere, but the story is a little more complicated than it first seemed. The students have received a great deal of help from a teachers’ union (it bussed the students to a protest in Tallahassee) and various progressive organizations, including the Women’s March and MoveOn.org. Conservative media responded with accounts such as David Hines’ “Why Did It Take Two Weeks to Discover Parkland Students’ Astroturfing?” and Charles Cooke’s “David Hogg Is Fair Game for Critics.”

In the meantime, college admissions offices across the country have been rushing out announcements that they will not penalize any students who walk out of their classes because of the protests. One such announcement came from Ken Anselment, dean of admissions and financial aid at Lawrence University in Wisconsin, who wrote:

For students who have been suspended or who face the threat of suspension, fear not: we at Lawrence University will not change your admission or scholarship decision in light of a suspension related to this kind of peaceful civil action.

Lawrence University is among hundreds of institutions that announced similar policies. Yale, for instance, declared:

Here at @Yale, we are proud to support all students for participating in peaceful walkouts for gun control or other causes, and we will not rescind admissions decisions for students who do so regardless of any school’s disciplinary policy.

Brown, Dartmouth, and MIT are in the same camp. By February 27, at least 117 colleges had said much the same thing, and by March 2, the figure had grown to about 250.

Typically, being suspended from school or significantly disciplined compromises a student’s acceptance at a college, but when it comes to protesting America’s gun laws, colleges and universities are in large numbers willing to make an exception.

The mass murder at Marjory Stoneman Douglas High School is abhorrent, and the students who witnessed it are surely traumatized. Whether stricter gun laws or other sorts of legislative action are a wise response is a matter I will leave aside for now. All these—the shootings, the trauma, and the calls for new legislation—rightly overshadow the question of whether walking out of class is an appropriate response.

Colleges and universities, to say nothing of numerous editorial boards, are saying, in effect, “Yes, it is. Walking out of class is excellent testimony on behalf of a good cause.” But they are wrong.

Walking out elevates a feeling of moral urgency above respect for education and the rights of fellow students. Walking out during regular school hours is meant to dramatize how deeply students are touched by the latest school massacre and how strongly they support legislative remedies. The walkouts, of course, won’t change the surrounding debate over Second Amendment rights. The students may hope to persuade elected officials to “do something” to stop the scourge of mass killings in schools. But what they are really doing is mistaking moral vanity for genuine “participation in democracy.”

Walking out of class to drive a political point emphatically subordinates education to the quest for power. To many students, the protests are righteous and perhaps cathartic. But underlying their experience of “making a difference” is the message that the pursuit of political ends justifies the sacrifice of educational priorities.

Walkouts are not costless. They elevate groupthink. Children are extraordinarily vulnerable to peer opinion, and if the prevailing view favors protest, millions will conform not because they care much about the cause but because not conforming will expose them to ridicule. The students who hold contrary views—and surely there are some—will be bullied and, in any case, denied their right to a day of regular public education.

School students who are passionate about supporting new gun legislation have every right to speak up. But they could do so on their own time, not during school hours.

Where did the idea of a school walkout come from? The tactic is far from new. In 1968, for example, student walkouts in Los Angeles were organized by social studies teacher Salvador Castro to protest bias against Chicano students. Justified as “civil disobedience” by progressive activists, such walkouts were relatively rare until about 2014, when they emerged as a go-to response for all sorts of activists. Sometimes they are merely local events. In March 2014, 200 students in Massachusetts walked out of their classes to call for a “hard line against fossil fuel infrastructure.” In September 2014, students in Jefferson County, Colorado, a Denver suburb, repeatedly walked out of their classes to protest curricular changes approved by the local school board. In December 2014, after a grand jury decided not to indict a police officer in the death of Eric Garner, some high school students walked out in New York City.

But sometimes the walkouts are national. On November 14, 2016, after the election of President Trump, more than 2,000 students walked out of Washington, DC schools to protest. Similar walkouts were staged across the country. On February 7, 2017, Muslim high school students in New York staged a walkout in opposition to Trump’s travel ban—a cause which echoed in many other cities.

The effectiveness of such walkouts in drawing attention to a cause is not in doubt. They get lots of coverage. Disrupting one’s own education is sometimes depicted by activists as noble self-sacrifice, but typically the burden of the disruptions is also borne by those who have no part in the cause and may even strongly disagree with it. Does “civil disobedience” of this sort justify denying educational opportunity to fellow students, particularly when the protesters have non-disruptive alternatives?

A Shameless Title IX Bureaucrat Poses as a Champion of Due Process

During her nearly four years running Barack Obama’s Office for Civil Rights (OCR), Catherine Lhamon was nothing if not consistent. She sought to use the power of her office—chiefly by threatening to withhold federal funds—to force colleges and universities to change their campus sexual assault policies. Every substantive change demanded by the Obama administration made it more likely an accused student would be found guilty.

So it’s been rather startling in recent days to see Lhamon claim that defending the fair treatment of accused students was a cardinal principle of her OCR tenure. On February 17, she tweeted, “The OCR I led insisted on a rigidly fair process for all parties involved in sexual violence investigations. Resolution agreements demonstrate that, notwithstanding baseless claims to the contrary. Fairness to all involved is essential to justice.”


Justice Ruth Bader Ginsburg’s comments that she agreed with some of the complaints about the unfairness of Title IX tribunals prompted Lhamon to tweet, “I agree. That’s one among many reasons why we need aggressive federal enforcement of law to ensure fair process to all parties.” As proof, she cited two (Wesley and Minot State) of the scores of resolution letters issued by OCR during her time in office, but no policy document from OCR.

We live in an era of political shamelessness. But Catherine Lhamon positioning herself as someone who demanded fair treatment of accused students is nonetheless remarkable.

Although it hardly should be necessary to do so, it’s worth reviewing Lhamon’s actual record to see how little it reflects any desire for fair adjudications. Her highest-profile policy document—the 2014 guidance—made clear that due process for the accused took a back seat to the Obama administration’s reinterpretation of Title IX. “Of course,” Lhamon warned, “a school should ensure that steps to accord any due process rights do not restrict or unnecessarily delay the protections provided by Title IX to the complainant.” She offered no explanation (then or later) as to why her interpretation of Title IX could trump the constitutional due process safeguards for students at public institutions.

Nor was there anything in Lhamon’s public comments as OCR head to leave an impression that she was, at heart, a covert campus civil libertarian. With strong support from accusers’ rights activists, Lhamon created what her successor dubbed a “list of shame,” publicizing the names of schools under OCR investigation (without revealing the details of the allegations) in an apparent effort to browbeat them into signing resolution letters with her. The most notorious of these letters required such institutions as SUNY and Southern Methodist to re-open investigations in cases where students already had been cleared.

Insisting That Whites Should ‘Step Back’

In November 2017, a Yale sophomore, Sohum Pal, wrote an op-ed for the student newspaper, the Yale Daily News, titled “White Students, Step Back.” It criticized Yale’s much-promoted “diversity” policies as “focused on a brand of assimilationist politics — the deeply misguided notion that students of color want to be wealthy, that we want to possess the social legitimacy and cultural capital of our white counterparts on terms dictated by white stakeholders.”

Instead of “reaching” out to minority students to ensure their participation in campus life—that’s a dubious “assimilationist model” that assumes “whiteness will always be centered” while “color is constantly peripheral,” Pal wrote—non-whites at Yale should be “seated at the head of the table… because we must dictate our own terms of engagement with white power structures.” In short, whites ought to get out of the way in order to facilitate “a liberation politics that would decenter whiteness.” He summed it up: “I don’t want opportunity: I want power.”

Earlier in 2017, while still a Yale freshman, Pal had described himself in a Yale-funded “Asian and Asian American oral history project” as “queer, disabled, and South Asian.” (Pal suffers from cerebral palsy, as he wrote in an essay for yet another Yale minority-student publication during the fall of his freshman year.) He said that at Yale, as at his high school in San Luis Obispo, California, he had received “microaggressions or actual aggressions everyday [sic].”

Perhaps so, but, his disability aside, Pal doesn’t seem to have suffered unduly. Although his family may not be “wealthy” (to lift a word from his Yale Daily News op-ed), it is undoubtedly quite comfortably off. His father, Nirupam Pal, is a professor of environmental engineering at California Polytechnic State University in San Luis Obispo, with a number of outside consulting gigs, and his mother, Susmita Guptapal, is CEO of Infotech Telecom, a long-distance reseller serving immigrants calling relatives in Southeast Asia and the South Pacific (the 22-year-old company with around five employees pulls in about half a million dollars in revenue each year).

Both parents are immigrants from India but have lived in the U.S. since at least the early 1990s, and both hold advanced degrees from U.S. universities. In May 2016, right after her son was accepted at Yale, Guptapal wrote a letter to IndiaWest, a newspaper for Indian expats, stating that he had been accepted at Harvard and Princeton as well and that he had “received perfect scores in the SAT, all APs and subject tests.” And if Sohum Pal’s LinkedIn profile is any indication, he has been impressively ambitious career-wise during his first two years at Yale, churning through numerous internships and student-job stints related to social-justice causes.

Pal’s call for white people to “step back” so that minorities can be “seated at the head of the table,” while probably shocking to equality-minded readers outside the academy, is actually just part of a trend toward anathematizing whiteness and white people that is ubiquitous on college campuses—all in the name of advancing minority rights.

“Critical whiteness studies” has been a part of college curricula since the 1980s. Unlike black studies, developed during the late 1960s to give academic respectability to examining aspects of African-American culture such as music, literature, and folk traditions, whiteness studies typically pays little attention to actual aspects of historically white culture, whether it be Appalachian dulcimer tunes or Brooklyn-centric “stuff white people like” fads for farmers’ markets and exotic breeds of dogs.

Whiteness studies are instead entirely ideological. Their underlying thesis is that “whiteness” is no more than a social construct dating from the 17th and 18th centuries that has enabled one class of people of European descent to dominate, marginalize, enslave, and even terrorize and murder those it deems to have unacceptably darker skin. At the heart of whiteness studies is the notion that it’s simply illegitimate to be white.

“Decentering whiteness” isn’t a new idea invented by Sohum Pal but a 20-year-old idea invented by Jeff Hitchcock, executive director of the Center for the Study of White American Culture, who began a series of “National Conferences on Whiteness” during the late 1990s. “We must disrupt the historic process of assimilation to whiteness that still continues to this day, and begin a new historic process whereby those who are white begin to assimilate to a multiracial version of America,” Hitchcock wrote in 1998.

By 2003, The Washington Post had counted at least 30 colleges and universities across America teaching “whiteness studies,” or, as is often the case in order to demonstrate by capitalization the presumably arbitrary nature of the designation, “Whiteness studies.” A Stanford University course, “White Identity Politics,” offered during the fall of 2017, discussed the concept of “abolishing whiteness” altogether.

There has been a certain amount of political pushback, especially when taxpayer-funded public universities began offering undergraduate courses titled “The Problem of Whiteness,” as the University of Wisconsin-Madison did during the spring of 2017. That course, as its syllabus stated, aimed to explore how white people “consciously and unconsciously perpetuate institutional racism,” which “devastates communities of color.”

Residents of Wisconsin might have been forgiven for wondering who exactly was trying to “perpetuate institutional racism”—they or the professor who taught the course. But so far their voices seem to have been ignored. So it’s not surprising that a Yale sophomore who has led a comfortable and perhaps even privileged life but happens to be of Indian descent has felt free to sound off on the unbearable whiteness of whiteness.

Identity Politics v. the New ‘Me Generation’

“The single most important intellectual trend of our time is the popular rediscovery of human tribalism,” Jonathan Rauch wrote earlier this month in an influential op-ed in The Washington Post. Now the conversation on tribalism rolls on. In her new book, Political Tribes: Group Instinct and the Fate of Nations, Amy Chua of Yale Law School turns tribalism into an omnipresent transcendental force that purports to explain conflicts both domestic and global.

Writing in a decidedly deterministic vein, Chua contends that:

“Humans are tribal. We need to belong to groups. We crave bonds and attachments, which is why we love clubs, teams, fraternities, family.”

From this standpoint, virtually every association and group mutates into a variant of tribalism. Yet there are groups, and there are groups: the motives that inspire people to join a tennis club should not be interpreted as a variant of those that lead people to become members of a social justice movement or the Ku Klux Klan. Nor is it particularly useful — as Chua does — to equate the ethnic conflict in Iraq with the explosion of identity-politics-inspired tensions on American campuses.

Chua’s account highlights the divisive and destructive consequences of the explosion of suspicion and mistrust between the alt-right, the alt-left, and a growing variety of identity groups in the US and, as it happens, most of the Anglo-American world. Yet, though outwardly the politicization of ethnicity and identity appears to bear all the hallmarks of a tribal struggle, its most distinctive features have little to do with the human “need to belong to groups.”

I identify as…

There is much more to contemporary identity politics than the valorization of a group or a tribe. Arguably, the emphasis on belonging to a distinct group is its least distinctive feature. Since the 19th century, identitarian movements have boasted of the special and distinct cultural characteristics of their group identity. They continue to do so today.

However, identity politics in the current era has seen a fundamental shift in focus from the group to the individual. When a student protestor declares, “I identify as…,” the message is clearly a statement about that individual person. Typically, student protestors draw attention to their fragile identity and flaunt their sensitivity to feeling offended. They frequently adopt a therapeutic language, and, most important of all, they constantly talk about themselves and their feelings. Often what seems to matter is not what you argue but who you are. Take an article in the Columbia Spectator, the newspaper published by students at Columbia University. It begins with the statement: “Let me begin by stating some crucial facts: I am queer, multiracial woman of color. I am survivor of sexual assault and suffer from multiple mental illnesses. I am a low-income, first-generation student.”

The ‘crucial facts’ pertaining to her identity serve to endow the writer of this article with moral authority. In this “it’s all about me” call for her identity to be respected, her actual arguments are secondary to her status as a multiple victim. Moreover, the possession of a multiple victim-identity matters far more to her than an affiliation to any single tribe.

The misguided slogan of the 1970s, “the personal is political,” has given way to the infantilized rhetoric of “it’s all about me.” The words “I” and “me” have become a central feature of the vocabulary of the narcissistic protests that characterize the current era. Protestors chanting “Not in My Name” or flaunting their #MeToo badge are making a statement about themselves.

There is something disturbingly immature about individual protestors signaling their virtue by posting selfies in which they hold up a placard stating, “I am angry, and I demand respect.” The emphasis is not on drawing attention to misdeeds directed at the tribe but on the hurt experienced by the individual. The refrain “I am offended” is not the statement of a tribalist but of an atomized and self-absorbed individual.

In recent years, protest has frequently served as a medium for the affirmation of identity. As the Italian sociologist Alberto Melucci observed, “participation in collective action is seen to have no value for the individual unless it provides a direct response to personal needs.”

What we see on campuses today is far more the politics of “it’s all about me” than old-school tribalism. Of course, the current obsession with self-identity is frequently expressed through a group form. The statement “I identify as…” is followed by a predicate that relates to a particular group. Nevertheless, what really matters to the person making the statement is the “I.” This focus on the personal is echoed by both the alt-right and the alt-left. The hysterical exchanges between the two sides testify to the polarising potential of “the personal is political.”

One of the least noticed but most significant features of the current phase of identity politics is its tendency towards fragmentation and individuation. There is a growing tendency towards the proliferation of identity groups and also towards separatism. For example, on February 23, 2018, Stonewall, the lesbian, gay, bisexual, and transgender rights advocacy group, announced that it had withdrawn from the established London Pride parade. Instead, it will support UK Black Pride because it feels that London Pride is not sufficiently “inclusive.” Outwardly, such disputes between different identity warriors have a tribalistic flavor. But what drives such conflicts is the ethos of “it’s all about me,” or what Freud referred to as the narcissism of small differences.

In her book, Chua draws attention to the proliferation of identities. She notes the long list of more than 50 gender designations on Facebook and the Balkanisation of gender identities. Arguably, the dynamic driving this Balkanisation is the individualistic impulse of owning your own brand. What drives the process is not the desire to share a sense of solidarity or of belonging to a group but the craving to be different from others. Chua’s emphasis on the group and the tribe overlooks the prevailing counter-tendencies working against the consolidation of community — tribal or otherwise.

Given the culturally, racially and ethnically polarized atmosphere in America, it is understandable that observers have sought to interpret these developments through the frame of tribalism. Writing in this vein, Rauch echoes Chua when he argues that the popular rediscovery of tribalism is “the single most important intellectual trend of our time.”

Though Rauch rightly draws attention to the “ever-narrowing group identities,” he does not reflect on the question of what drives this process of fragmentation. The hyper-atomization of campuses, reflected in demands for all-black or all-gay dormitories and other forms of self-ghettoization, highlights the prevailing sensibility of “we can’t live with one another.”

The real problem facing western societies is not so much the flourishing of tribal identities as the corrosive power of atomization that expresses itself in the form of an identity group. The ever-narrowing group identities referred to here should be understood as a process that I describe as the “diminishing scale of loyalties.” As loyalty acquires a diminished focus, forms of solidarity that transcend the individual self lose their appeal. That is the predicament facing 21st-century society.

Yes, the Weird Campus Culture Pollutes the Whole Nation Now

Several correspondents send me links to “must read” articles every few days. High up on the list since February 9 has been Andrew Sullivan’s New York Magazine article, “We All Live on Campus Now.” Like most “must reads,” Sullivan’s article is a blazing reassertion of what most people already know. Its claim, as Pope defined “true wit” in his Essay on Criticism, is to present “What oft was thought, but ne’er so well express’d.”

What Sullivan expresses so well is the diminution of the concept of the individual next to the Colossus of Identity Group. He gets there by puncturing the fantasy that the victim culture on campus begins to disappear as you make your way down main street and over to the business district.

We did already know this, didn’t we? When Google fired James Damore in 2017 for writing a memo in which he commented on psychological differences between men and women, we had a clue. When Mozilla fired its CEO Brendan Eich in 2014 for having once donated $1,000 to Proposition 8, we had an inkling. When Harvard ousted president Larry Summers way back in 2006 for making carefully hedged observations about the distribution across the sexes of Himalayan-level mathematical aptitude, we had a whisper.

Plainly we have all known for a very long time that the quips and cranks, and wanton wiles of political correctness had become the jollity of everyday life in America. Yesterday I interviewed a candidate for a position as an editor of my journal, and when I mentioned that we stick with “he” as the third-person generic pronoun, a look of barely veiled horror spread across her face. By the time we got to my opposition to racial preferences, this poor mortal was ready to flee for her life.

Why? Because all right-thinking people know the new rules. The diversity of victimization is the only diversity that now matters in America. A few days back a reporter called me for comment on whether the new Hollywood blockbuster, Black Panther, could rightly be faulted for not giving adequate attention to the doubling and tripling of victim statuses called “intersectionality.” Apparently, the filmmakers had cut some lesbian love scenes that black activist and scriptwriter Ta-Nehisi Coates had added to the fantasy pic. Intersectionality is where all the injustices, phobias, and –isms come together in the great banquet of identity group suffering, something like the palace of the devils, Pandemonium, in Milton’s Paradise Lost.

The attentive reader cannot have failed to notice my various allusions to dead white male poets and living white male overachievers. They are here as my support group. My own cultural identity, which I’ve long understood to be that of an American with an interest in history, literature, and ideas, has been yanked away by the edict of our Cultural Czars. In its stead, I find that I am to understand myself by the coordinates of race, sex, and privilege. (I refuse the word “gender.” It concedes the falsehood that sexual differences are entirely “socially constructed.”)

I don’t care for this new reductionism, and I find it hard to believe that many other people care for it either, except those who derive their livelihoods by striding the webs of identity group affiliation. To be sure, resentment and anger provide a certain source of gratification.

Sullivan observes how “the imperatives of an identity-based ‘social justice’ movement” are dragging America away from “liberal democracy.” Sullivan should know, as he played his own part in attaching some of the chains to the tow truck. He may regret the zeal with which the next generation of activists continues the work of dismantling the foundations of family and civilized order. As for the “individual,” it is surprising how such a Gibraltar of a concept could crumble into postmodern dust in the space of a generation.

The readiness of students to discard academic freedom for “safe spaces” is a readiness to shrug off their individuality in favor of the supposed comforts of group identity. That this has been carried into popular culture and politics is undeniable. That we can watch it invade the precincts of business and commerce is astonishing. It is as though all the defensive forces have thrown down their weapons and fled.

“The whole concept of an individual who exists apart from group identity is slipping from the discourse,” writes Sullivan, and he is on the money. When he turns to President Trump as the arch-avatar of these sorry developments, however, I am not so sure. Trump, of course, is frequently chastised for having called forth the legions of white identity reactionaries, and his style is often crude, but it is also hard to think of him as anything but an unreformed individual. His bluster is the rodomontade of a self-made man. He mocks the conventions of identity politics, which can be mistaken for indulging those conventions.

But I wouldn’t insist on the point. Sullivan does excellent work surveying the cratered terrain where radical feminists, cultural Marxists, and social justice warriors of all sorts have lobbed their mortar shells and nearly obliterated all traces of civilized culture. Learning how to treat people as individuals again will take a long recuperation. As a misogynist writer once put it, this is our own Farewell to Arms.


A New Book Takes On 500 Years of Modern Liberalism

Why Liberalism Failed, by Patrick J. Deneen, uses “liberalism” in the oldest, broadest sense of the term. Deneen’s sweeping, severe assessment of all that has gone wrong in our time attacks modernity’s entire package-deal: individuals possessing inalienable rights; representative, accountable governments that exist to secure those rights; the separation of church and state; the commitment to progress, prosperity, and self-determination.

Deneen, a University of Notre Dame political scientist, calls liberalism a “political philosophy conceived some 500 years ago,” a project set in motion by Machiavelli, Francis Bacon, and Thomas Hobbes before John Locke, James Madison, and John Stuart Mill elaborated and systematized it. Though launched with lofty aspirations to promote equity, pluralism, dignity, and liberty, it turns out that liberalism “generates titanic inequality, enforces uniformity and homogeneity, fosters material and spiritual degradation, and undermines freedom.” Liberalism failed because it succeeded, Deneen argues.

Its “inner logic,” he argues, has culminated in crippling contradictions that are now manifest. Communism and fascism, the “visibly authoritarian” ideologies liberalism vanquished, were “crueler” but less “insidious.” Liberalism’s power to shape our expectations and standards is so great that only as humanity is “burdened by the miseries of its successes” do we begin to realize that “the vehicles of our liberation have become iron cages of our captivity.”

Our existence within those cages is harrowing and false. Democratic politics has become a “Potemkin drama meant to convey the appearance of popular consent for a figure who will exercise incomparable arbitrary powers over domestic policy, international arrangements, and, especially, warmaking.” Purportedly republican governance really consists of “commands and mandates of an executive whose office is achieved by massive influxes of lucre.”

Our economic lives, based on the assumption that “increased purchasing power of cheap goods will compensate for the absence of economic security and the division of the world into generational winners and losers,” are equally fraudulent. And equally malign: “few civilizations appear to have created such a massive apparatus to winnow those who will succeed from those who will fail.” Because of these forces, we are “increasingly separate, autonomous, nonrelational selves replete with rights and defined by our liberty, but insecure, powerless, afraid, and alone.”

That’s one assessment of life in the 21st century. Here’s another:

Many people around the world feel insecure and oppose the spreading of insecurity and war….

The people are protesting the increasing gap between the haves and the have-nots and the rich and poor countries.

The people are disgusted with increasing corruption.

The people of many countries are angry about the attacks on their cultural foundations and the disintegration of families. They are equally dismayed with the fading of care and compassion….

Liberalism and Western style democracy have not been able to help realize the ideals of humanity. Today these two concepts have failed. Those with insight can already hear the sounds of the shattering and fall of the ideology and thoughts of the Liberal democratic systems.

The latter passage does not come from Why Liberalism Failed but appeared instead in an open letter sent to President George W. Bush in 2006 by Iran’s president, Mahmoud Ahmadinejad. The striking similarity of the two jeremiads is, at the very least, awkward for Deneen. We know that Ahmadinejad belongs to a broad Islamic movement that, loathing and dreading Western liberalism, wants to extirpate the encroachments it has made in Muslim societies. He offers a critique and a remedy, blood-drenched but nevertheless clear.

There’s no evidence that Deneen favors an American counterpart to Iran’s Revolutionary Guard, but also very little evidence about the solution he does endorse. Like most authors of books on politics and social conditions, Deneen is a loquacious pathologist but tongue-tied clinician. Why Liberalism Failed follows the template: half-a-dozen vigorous, detailed chapters that explicate and decry what’s broken, and assign blame for our dilemma, followed by a single concluding chapter—slender, tentative, vague, and unusable—on how to fix the problem.

Given the depths and urgency of the crisis he deplores, Deneen’s reticence about how to find our way out of it is particularly disappointing. At one point he suggests the difficulty of explaining what comes after liberalism is yet another thing to blame on liberalism since its hegemony over our discourse makes it hard to imagine and describe a post-liberal future. At another, he contends that the absence of standards defining that future is a virtue.

Since one of liberalism’s inherent defects is an excessive reliance on political theory, the remedy must be a firm reliance on political practice. More specifically, he endorses “communities of practice,” such as the Amish or those envisioned by Rod Dreher in The Benedict Option. In them, “people of goodwill” can “form distinctive countercultural communities” that create “new and viable cultures, economics grounded in virtuosity within households, and [a] civic polis life.”

Authors can be revealing without being forthcoming, however, and the suggestions Deneen gives about these communities of practice point to larger defects in his argument. His book relates a conversation he had while teaching at Princeton, about the Amish practice of giving young adults a year-long sabbatical from the austere communities where they grew up, so they can sample modern life before deciding whether to eschew it. “Some of my former colleagues took this as a sign that these young people were in fact not ‘choosing’ as free individuals,” he writes. “One said, ‘We will have to consider ways of freeing them.’”

Deneen treats this chilling Rousseauian remark as exposing liberalism’s malevolent essence. It is not one tenured radical, but all of liberalism, that denigrates “family, community, and tradition.” Deneen does not consider the alternative possibility that his colleague was not a representative liberal but a deficient one, severely lacking in the accommodating spirit of live-and-let-live that characterizes liberal societies at their best.

Elsewhere, Deneen anticipates demands for laws to prevent communities of practice from becoming “local autocracies or theocracies.” Such demands, he warns, “have always contributed to the extension of liberal hegemony,” leaving us “more subject to the expansion of both the state and market and less in control of our fate.” This dismissal does not refute a legitimate concern: the people who form distinctive countercultural communities will not necessarily be of goodwill. Nor will the results of their efforts always be “lighthouses and field hospitals” that guide us through the liberal storm and cure us of the liberal sickness. Sometimes they’ll produce Amish communities, but other times they’ll yield Jonestown, Branch Davidians, or the Church of Scientology.

The “most basic and distinctive aspect of liberalism,” Deneen argues, “is to base politics upon the idea of voluntarism—the unfettered and autonomous choice of individuals.” For the time being, while operating in “liberalism’s blighted cultural landscape,” the communities of practice will avail themselves of liberalism’s “choice-based philosophy.” They can invoke voluntarism to resist it, issuing a defiant “Don’t Tread on Me” to liberalism’s encroaching state, market, and “anti-culture.” After liberalism has collapsed under the weight of its contradictions, however, the voluntarist communities of practice might someday produce a “nonvoluntarist cultural landscape.” In it, presumably, individuals will no longer be burdened by the possibility and necessity of making so many choices, including whether to join or leave a community of practice.

These hints that Deneen is something of an anti-anti-theocrat lead us to Why Liberalism Failed’s most serious lacuna: how did a philosophy he portrays as monstrous and anthropologically absurd not only catch on but come to dominate political thought and practice for five centuries? He emphasizes the guile, malevolence, bad faith, and hidden agendas of liberalism’s architects, but doesn’t account for their astounding success in peddling what sounds like a solution in search of a problem.

By way of not explaining what we should do now, Deneen says that we can only go forward, not back to “an idyllic preliberal age” that “never existed.” But an age can be pretty good without being idyllic. Deneen says that none of liberalism’s ideals—liberty, equality, dignity, justice, and constitutionalism—were innovations. All of them were “of ancient pedigree,” carefully elaborated over centuries in classical and Christian philosophy.

Since liberalism brought nothing new to the table, the only reason for its success appears to be that people were fooled into thinking it would hasten the process of making political practice conform more closely to the standards laid out by pre-liberal political theory. Still, why humans made such a big bet on such a bad pony remains a mystery, as does their needing 500 years to start realizing the gamble hasn’t paid off.

One wouldn’t know from Why Liberalism Failed that the dawning of the liberal age coincided with the beginning of savage religious wars that devastated Europe. Over doctrinal differences, Protestants slaughtered Catholics, Catholics slaughtered Protestants, and Protestants slaughtered other Protestants. After two centuries of this madness, people were both exhausted and receptive to the idea that it was more urgent to end than to win the religious warfare.

The liberal philosophy took shape largely in response to these traumas and offered a way out of them. Politics would be about some things but not everything, and especially not about God and how to regard Him. Liberalism created a political space in which people would agree to disagree. When first put forward, this approach struck many people as a good idea, and it continues to appeal today.

Liberalism remains problematic for many reasons, one of them being the difficulty of drawing the boundaries between those things we must agree on, and those where agreement is unnecessary and seeking it dangerous. There are other challenges. Liberalism prevents religion from becoming a threat to civic peace by “privatizing” it, turning it into a kind of hobby. The resulting secularization of the public realm trivializes both public and private life, however, producing what Leo Strauss famously called the “joyless quest for joy.”

Furthermore, and as Deneen makes clear, liberalism draws upon civilizational inventories it does not replenish. Immanuel Kant was wrong: sensible devils cannot sustain a liberal society, no matter how shrewdly ambition is made to counteract ambition. The character of the citizenry is crucial, but the cultural contradiction of liberalism is that the experience of living in a liberal regime turns a great many of its citizens into people lacking the nobility, virtue, and discipline needed to defend and preserve that regime.

It may be, then, that such serious problems mean liberalism is inherently precarious at best and untenable at worst. Nevertheless, liberalism arose in response to the genuine problem of finding a way people of diverse creeds could live together peacefully. Getting rid of liberalism will not get rid of this necessity. Ahmadinejad’s solution is to banish the diversity liberalism presupposes, to hasten the process whereby “the world is gravitating towards faith in the Almighty and justice and the will of God will prevail over all things.”

Deneen’s solution, so far as he has one, sounds like solving the problem of diversity by increasing it through an archipelago of micro-polities, different from one another but each committed to its internally unifying vision of the good life. Neither solution sounds plausible or enticing. If, as Deneen contends, we got into our present difficulties by not asking enough hard questions about liberalism, there’s no reason to believe we’ll get out of them without asking hard questions about what comes next, questions for which Why Liberalism Failed offers no answers.

The College Endowment Tax: A Good Idea, Sort of…

Starting next January, some 35 very wealthy private colleges and universities will start paying an annual 1.4 percent college endowment tax under the new tax reform law. That’s very few of the nation’s institutions of higher learning, and the tax will not apply to assets that directly contribute to an educational purpose. When you hear wisecracks such as “Harvard is a hedge fund with a university attached,” you are listening to one reason for the tax. Other reasons include resentment toward elite universities for allowing left-wing domination of their faculties and for tolerating rising campus disrespect for free speech and intellectual diversity.


The precedent for exempting colleges from taxation emerged during the colonial era, when newly established colleges were subsidized, in part, by exempting them from property taxes. Given their mission of educating young men for civic leadership and the clergy, an infant-industry policy of tax exemption to encourage their growth and sustainability seemed reasonable. Colleges, however, have increasingly strayed from the mission of creating and disseminating knowledge, which serves a useful social function that arguably merits subsidization. They are increasingly engaged in revenue-generating activities that resemble those pursued by taxpaying commercial enterprises.

This includes endowment investment portfolios at some universities that look like highfalutin hedge funds. The commercial interests of universities should be taxed in the same manner as those of taxpaying enterprises and individuals, not granted a special privilege. The endowment tax moves us closer to this ideal.

The endowment tax mainly applies to wealthy universities such as Harvard, MIT, Princeton, and Yale. These four institutions collectively control about a quarter of the $500 billion in assets held by college endowment funds, giving them an unprecedented advantage in attracting top students and faculty. The tax may reduce endowment inequality and improve the competitiveness of higher education. Some donors may redirect their philanthropy from the wealthy institutions to less well-endowed ones, where a gift will have a higher long-term impact because it will grow tax-free. This would improve the financial position of institutions benefiting from such a reallocation of gifts, allowing them to invest in strategic areas to better compete for top students and faculty.
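
How much the tax-free advantage matters depends on returns and time. Here is a minimal Python sketch of the compounding difference; the gift size, the 8 percent annual return, and the 30-year horizon are all hypothetical assumptions, while the 1.4 percent rate comes from the law.

    # Minimal sketch with hypothetical numbers: a gift compounding at an
    # untaxed endowment vs. one whose yearly investment income is taxed 1.4%.
    GIFT = 1_000_000      # hypothetical gift
    RETURN = 0.08         # assumed annual investment return
    EXCISE = 0.014        # the new 1.4% levy on net investment income
    YEARS = 30            # assumed horizon

    untaxed = GIFT * (1 + RETURN) ** YEARS
    taxed = GIFT * (1 + RETURN * (1 - EXCISE)) ** YEARS

    print(f"Untaxed endowment after {YEARS} years: ${untaxed:,.0f}")
    print(f"Taxed endowment after {YEARS} years:   ${taxed:,.0f}")
    print(f"Compounded cost of the levy:           ${untaxed - taxed:,.0f}")

On these assumptions the drag is modest, roughly 3 percent of the final sum, but it compounds steadily in the untaxed school’s favor.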

Finally, the endowment tax may send a symbolic message to colleges that the public is increasingly dissatisfied with their behavior. Lawmakers with the ability to subsidize colleges also have the option of taxing them. This could serve as an impetus for university leaders to control profligate spending, improve affordability, enhance learning, and promote intellectual diversity.

Will It Reduce Financial Aid?

Some college officials have suggested that the endowment tax will reduce access for talented low-income students because a portion of their endowments is earmarked for financial aid. Returns attributable to such funds may end up exempt, however, since an argument could be made that scholarships directly contribute to an institution’s educational mission.

While the endowment tax will nonetheless result in a modest revenue loss for wealthy institutions, most of these schools face what economists refer to as highly inelastic demand curves. This means they could raise tuition without significantly reducing the number of qualified students willing and able to pay sticker price. The revenue lost to the endowment tax could be made up by charging full-price payers more, without adversely affecting access for low-income students. Proponents of redistribution should favor this. But then again, affected colleges might respond instead by reducing the number of low-income students admitted or the aid packages offered to them.
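
To see the arithmetic behind that pass-through claim, consider a back-of-the-envelope Python sketch; the elasticity figure, enrollment, tuition, and size of the hike are assumed numbers for illustration, not estimates of any actual school.

    # Back-of-the-envelope sketch with assumed numbers: with inelastic
    # demand, a tuition hike loses few full-pay students, so revenue rises.
    ELASTICITY = -0.2     # assumed price elasticity of demand
    FULL_PAYERS = 2_000   # hypothetical students paying sticker price
    TUITION = 50_000      # hypothetical sticker price
    HIKE = 0.05           # a 5% tuition increase

    students_after = FULL_PAYERS * (1 + ELASTICITY * HIKE)  # ~1% fewer
    revenue_before = FULL_PAYERS * TUITION
    revenue_after = students_after * TUITION * (1 + HIKE)   # up ~4%

    print(f"Full-pay revenue before the hike: ${revenue_before:,.0f}")
    print(f"Full-pay revenue after the hike:  ${revenue_after:,.0f}")

On these assumed numbers, a hike that loses only one student in a hundred raises net revenue by roughly 4 percent, which is the sense in which the tax can be passed through to full payers.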

The endowment tax of 1.4 percent is lower than the 2 percent rate imposed on the net investment income of private foundations. Meanwhile, individual investment income is taxed at ordinary marginal rates (up to 37 percent post-reform), and long-term capital gains at rates up to 20 percent, plus any state levies. The net investment income and capital gains of corporations are taxed at the corporate rate (21 percent post-reform). Why should a wealthy university such as Harvard, whose $37 billion endowment exceeds the GDP of countries such as Bahrain and Latvia, pay a lower tax rate than a middle-class family or small business for performing the same economic activity?
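
A rough Python comparison makes the disparity concrete. The endowment figure and the statutory rates come from the paragraph above; the 8 percent annual return is an assumption.

    # Illustrative comparison: tax owed on one year's investment income
    # under each regime named above, assuming an 8% return.
    ENDOWMENT = 37_000_000_000   # Harvard's endowment, per the text
    income = ENDOWMENT * 0.08    # assumed annual investment income

    rates = {
        "Endowment excise (1.4%)":      0.014,
        "Private foundation rate (2%)": 0.02,
        "Corporate rate (21%)":         0.21,
        "Top individual rate (37%)":    0.37,
    }

    for regime, rate in rates.items():
        print(f"{regime}: ${income * rate:,.0f}")

On that assumed return, the excise comes to roughly $41 million, against more than $600 million at the corporate rate, which is the gap the rhetorical question points to.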

In addition to the direct revenue loss from the endowment tax, the new policy will impose indirect costs. The higher education community is likely to increase its lobbying efforts to try to shape the final details of the policy in its favor and minimize losses. The policy will likely be complex, imposing new compliance costs. Lobbying and regulatory compliance are costly and will divert resources from more productive uses.

A Small Tax Needn’t Stay Small

Though the new tax is small, we should learn from history. The Revenue Act of 1913 imposed a very modest 1 percent federal income tax, yet that tax has evolved into the federal government’s largest revenue stream, feeding a Leviathan central government.

While the endowment tax is likely to have a modest impact, it is a slippery slope toward further federal meddling in, and politicization of, higher education. Faced with a rapidly expanding national debt and unfunded liabilities, lawmakers may come to view universities’ resources, including their endowments, as a pot of gold at the end of an ivory tower. They may also increasingly use the power of the purse to coerce university conformity to whatever ideology is in vogue, further reducing intellectual diversity.

Federal intrusion into higher education has been a root cause of many of the issues fueling growing public resentment towards it. Calling upon the government to fix problems that it helped create may prove to be foolish and perpetuate them indefinitely. As Milton Friedman once said, “there is nothing so permanent as a temporary government program.” His wisdom suggests that we ought to move in the direction of reducing government involvement in higher education, not increasing it.

What Professors Ought to Tell Students

We professors should transmit to our students three simple but ancient truths: (1) in many important matters in our fields, the ignorance of experts vastly exceeds our knowledge; (2) much of what we think we know is hard to verify and may well be wrong; and (3) we, and the materials that we assign and discuss with students, are their best route to learning.

Our vastly increased understanding of our world and universe over the centuries is wondrous, but it is mostly in the hard sciences and mathematics. Progress in the social sciences, which examine how we feel, behave, and interact with one another, is spotty and will probably always remain so due to the elusive complexities of causation, psychology, will, and the methodological impediments to rigorously studying and analyzing these issues. The humanities greatly enrich our lives, of course, but they mostly deepen the mysteries of life rather than dispel them.

Uncertainties Are Our Companions

Wise teachers, of course, already know this. They communicate it to their students in hopes of arousing their curiosity (at the risk of encouraging a lazy, mindless nihilism). I suspect, however, that many other professors are so eager to thrust their views on their students in a show of brilliance, self-confidence, and subject-matter expertise that they forego this wisdom and the intellectual and personal humility that should go with it. After all, they have earned doctorates, worked hard to master their fields of expertise, and gained faculty positions at fine institutions which in effect certify their own intellectual excellence. Why be humble and confess much ignorance, especially to students who probably don’t know any better?

We podium pundits should not merely acknowledge the considerable uncertainty that surrounds our fields; we should emphasize it from the very first class. Why? First and foremost, it is true — and teachers are obliged to speak the truth both to power and to ignorance. Only if students appreciate the uncertainties in what they are studying can they apply important distinctions. There is what we “know” to be true (or false) with a high degree of confidence, though always subject to refutation. There is what is provisionally true (or false) but not yet firmly established as such. There is what is plausibly true in the limited sense that respectable arguments can be made on various sides of the question. And there is what remains a matter for pure (though hopefully informed) speculation – an invitation to new theories, methodologies, and evidence. Students need to understand and apply these gradations of knowledge in their fields of study.

Holmes’s Famous Dissent

But professors should emphasize our ignorance about important questions for another reason. The students who arrive on elite campuses (where I have mainly taught) come with surprisingly firm, entrenched political identities and views. Their premature certainties exist even though – or more probably, because – few of them have much experience of life and its myriad complexities. Not surprisingly, they know little of the diverse values, perspectives, and methodologies that serious thinkers in their fields of study have brought to these conundrums, or of the weak analytical and evidentiary foundations of many of our firmest commitments. Justice Holmes put this point well in a famous dissent almost a century ago, one that presciently captures a major source of conflict on today’s campuses:

“Persecution for the expression of opinions seems to me perfectly logical. If you have no doubt of your premises or your power, and want a certain result with all your heart, you naturally…sweep away all opposition.… But when men have realized that time has upset many fighting faiths, they may come to believe even more than they believe the very foundations of their own conduct that the ultimate good desired is better reached by free trade in ideas….”

Students’ striking political and intellectual smugness is both predictable and understandable. In this, they ape the certitudes of so many of their elders. Our liberal culture demands little critical thinking from young people and tends to applaud their idealistic bien pensant views. After all, the main reason they came to campus in the first place was to supplant their ignorance and inexperience. (There are also less lofty reasons, of course.) But their greenness only heightens professors’ duty to pierce students’ ideological armor and challenge their preconceptions immediately and constantly.

Most professors would surely acknowledge this duty; the notion of robust debate that challenges preconceptions, including our own, is a hoary academic mantra. The vast majority of faculty on elite campuses espouse political liberalism that they think their years of scholarly work have only confirmed and deepened. For them, and for the small cadre of conservative professors, intellectual humility and self-abnegation are neither congenial nor easy.

All the more reason, then, for faculty to commit ourselves to these academic values and to recruit more young professors with intellectually diverse views – as reflected in their normative commitments, disciplinary methodologies, and empirical interests, not their partisan preferences. This commitment will enrich our students’ lives on campus and beyond.

Why College for All Is A Big Mistake

More and more Americans are going on to post–high school education, encouraged to do so by both governments and nonprofit organizations. According to the U.S. Department of Education, for example, “In today’s world, college is not a luxury that only some Americans can afford to enjoy; it is an economic, civic, and personal necessity for all Americans.”

One of many nonprofit organizations that convey the same message is the Lumina Foundation. Its mission is to expand post-secondary educational attainment, with a goal of having 60 percent of Americans hold a college degree, certificate, or other “high-quality postsecondary credential” by the year 2025. Its “Stronger Nation” initiative, as the foundation declares on its website, “is all about the evidence of that learning—quantifying it, tracking it, pinpointing the places where it is and isn’t happening. . .  Lumina is also working with state policy leaders across the nation to set attainment goals and develop and implement strong state plans to reach them. So far, 26 states have set rigorous and challenging attainment goals—15 in the last year alone. Most of these states are taking concrete steps—such as implementing outcomes-based funding, improving developmental education, and making higher education more affordable—to increase attainment and reach their goals.”

The Lumina Foundation is steeped in metrics and proselytizes on its behalf: its website proclaims, “As an organization focused on results, Lumina Foundation uses a set of national metrics to guide our work, measure our impact and monitor the nation’s progress toward Goal 2025.”

The Lumina Foundation’s mission comports with a widely shared conviction about the role of higher education in American society: the belief that ever more people should go on to college, and that doing so not only increases their own lifetime earnings but also fuels national economic growth.

More Winners Mean Less Value in Winning

That article of faith, and the performance targets to which it gives rise, may simply be mistaken. As Alison Wolf, an educational economist at the University of London, has pointed out, it is true that those who have a B.A. tend to earn more on average than those without one. Thus, on the individual level, the quest for a B.A. degree may make economic sense. But on the national level, the idea that more university graduates means higher productivity is a fallacy.

One reason for that is that to a large extent education is a positional good—at least when it comes to the job market. For potential employers, degrees act as signals: they serve as a shorthand that allows employers to rank initial applicants for a job. Having completed high school signals a certain, modest level of intellectual competence as well as personality traits such as persistence. Finishing college is a signal of a somewhat higher level of each of these.

Once It Signaled Superiority

In a society where a small minority successfully completes college, having a B.A. signals a certain measure of superiority. But the higher the percentage of people with a B.A., the lower its value as a sorting device. What happens is that jobs that once required only a high school diploma now require a B.A.

That is not because the jobs have become more cognitively demanding or require a higher level of skill, but because employers can afford to choose from among the many applicants who hold a B.A. while excluding the rest. The result is both to depress the wages of those who lack a college degree and to place many college graduates in jobs that don’t actually make use of the substance of their college education. That leads to a positional arms race: as word spreads that a college diploma is the entry ticket to even modest jobs, more and more people seek degrees.
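
The dilution of the degree as a sorting device can be made concrete with a toy Python model. It assumes, purely for illustration, that degrees go to the top slice of a single ability distribution; the attainment shares are arbitrary.

    # Toy model with assumed numbers: if the top X% of the ability
    # distribution holds a B.A., the average holder's ability percentile
    # falls as X rises, so the degree tells employers less and less.
    def mean_percentile(share_with_ba: float) -> float:
        # The average holder sits midway through the top `share_with_ba` band.
        return 100 * (1 - share_with_ba / 2)

    for share in (0.10, 0.30, 0.60):
        print(f"{share:.0%} hold a B.A. -> average holder at the "
              f"{mean_percentile(share):.0f}th ability percentile")

When one in ten holds the degree, it marks roughly the 95th ability percentile; when six in ten do, only about the 70th, so employers learn far less from it.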

Thus, there are private incentives for increasing numbers of people to try to obtain a college degree. Meanwhile, governments and private organizations set performance measures aimed at raising college attendance and graduation.

Higher Metrics Through Lower Standards

But the fact that more Americans are entering college does not mean that they are prepared to do so, or that all Americans are capable of actually earning a meaningful college degree.

In fact, there is no indication that more students are leaving high school prepared for college-level work.  One measure of college preparedness is the performance of students on achievement tests, such as the SAT and the ACT, which are used to predict likely success in college (they are, in part, aptitude tests).

For the most part, these tests are taken only by high school students who have some hope of going on to higher education, though in an effort to boost student achievement, some states have taken to mandating that ever more students take such tests. (Probably a case of misplaced causation. Students who took the tests tended to have higher levels of achievement. So, it was mistakenly reasoned, by getting more students to take the test, levels of achievement would be raised. The flaw is that better-performing students were more likely to take the test in the first place. That is, policymakers mistook cause for effect.)

The ACT tests four subject areas: English, math, reading, and science. The company that develops the ACT has developed benchmarks of scores that indicate that the test taker has a “strong readiness for college course work.” Of those who took the ACT test most recently, a third did not meet the benchmark in any of the four categories, and only 38 percent met the benchmarks in at least three of the four areas. In short, most of those who aspire to go on to college do not have the demonstrated ability to do so.

The results are predictable—though few want to acknowledge them. Since more students enter community colleges and four-year colleges inadequately prepared, a large portion require remedial courses. These are courses (now euphemistically rechristened “developmental” courses) that cover what the student ought to have learned in high school. A third of students who enter community colleges are placed in developmental reading classes, and more than 59 percent are placed in developmental mathematics courses. Students who are inadequately prepared for college also make additional demands on the institutions they attend, thus raising the costs of a college education: the growth on campuses of centers of “educational excellence” is a euphemistic response to the need for more extracurricular help in writing and other skills for students inadequately prepared for university-level work.

Graduation Rates Count

Colleges, both public and private, are measured and rewarded based in part on their graduation rates, which are one of the criteria by which colleges are ranked, and in some cases, remunerated. (Recall the Lumina Foundation’s encouragement of state governments to engage in “outcomes-based funding.”) What then happens is that outcomes follow funding. By allowing more students to pass, a college transparently demonstrates its accountability through its excellent metric of performance. What is not so transparent is the lowered standards demanded for graduation.

More courses are offered with requirements that are easily fulfilled. There is pressure on professors—sometimes overt, sometimes tacit—to be generous in awarding grades. An ever-larger portion of the teaching faculty comprises adjunct instructors—and an adjunct who fails a substantial portion of her class (even if their performance merits it) is less likely to have her contract renewed.

Thus, more students are entering colleges and universities. One consequence of students entering college without the ability to do college-level work is the ever-larger number who enroll but do not complete their degrees—a widespread and growing phenomenon that imposes substantial costs on these students in tuition, living expenses, and forgone earnings. High dropout rates suggest that too many students are attempting college, not too few. And those who do obtain degrees find that a generic B.A. is of diminishing economic value because it signals less and less to potential employers about real ability and achievement.

Recognizing this, prospective college students and their parents seek admission not just to any college, but to a highly ranked one. And that, in turn, has led to the arms race of college rankings, a topic to which we will return.

An Air of Unreality

Lowering the standards for obtaining a B.A. means that using the percentage of those who attain a college degree as an indicator of “human capital” becomes a deceptive unit of measurement for public policy analysis. Economists can evaluate only what they can measure, and what they can measure needs to be standardized. Thus, economists who work on “human capital” and its contribution to economic growth (and who almost always conclude that what the economy needs is more college graduates) often use college graduation rates as their measure of “human capital” attainment, ignoring the fact that not all B.A.’s are the same, and that some may not reflect much ability or achievement.

This lends a certain air of unreality to the explorations of what one might call the unworldly economists, who combine hard measures of statistical validity with weak interest in the validity of the units of measurement.

One assumption that lies behind the effort to boost levels of college enrollment and completion is that increases in average educational attainment somehow translate into higher levels of national economic growth. But some distinguished economists on both sides of the Atlantic—Alison Wolf in England, and Daron Acemoglu and David Autor in the United States—have concluded that that is no longer the case, if it ever was.

In an age in which technology is replacing many tasks previously performed by those with low to moderate levels of human capital, national economic growth based on innovation and technological progress depends not so much on the average level of educational attainment as on the attainment of those at the top of the distribution of knowledge, ability, and skill. In recent decades, the percentage of the population with a college degree has gone up, while the rate of economic growth has declined. And though the gap between the earnings of those with and those without a college diploma remains substantial, the falling rate of earnings for college graduates seems to indicate that the economy already has an oversupply of graduates.

By contrast, there is a shortage of workers in the skilled trades, such as plumbers, carpenters, and electricians—occupations in which training occurs through apprenticeship rather than through college education, and whose practitioners often earn more than those with four-year degrees.

To be sure, public policy ought to aim at more than economic growth, and there is more to a college education than its effect on earning capacity. But for now, it is worth underscoring that the metric goal of ever more college graduates is dubious even by the economistic criteria by which higher education is often measured.

This is an excerpt from Jerry Z. Muller’s new book, The Tyranny of Metrics, published by Princeton University Press. Jerry Z. Muller is a professor of history at the Catholic University of America in Washington and the author of many books, including The Mind and the Market: Capitalism in Modern European Thought.

Is Half of College Education Wasted?

Trigger Warning: If you fancy yourself smart enough to understand complex social science, Bryan Caplan’s book, The Case Against Education: Why the Education System Is a Waste of Time and Money, may lower your self-esteem. This is a serious, “academic” effort, six years in the making, and while Caplan, an economist at George Mason University and the Cato Institute, can be witty, this is not the breezy rant so common among today’s alarmist books on education.

The gist of Professor Caplan’s case is that there is way too much education: students waste hundreds of hours and millions of government-supplied dollars learning material that adds nothing of productive value or personal enrichment. Yes, high schools and colleges may occasionally produce a genius who invents Microsoft Word, but such accomplishments are exceedingly rare and cannot justify society’s massive investment in schooling. Learning history, for example, is valuable only for future history teachers, and how many history course enrollees will pursue that vocation? Nor does the college experience broaden students’ cultural horizons. Most students, Caplan claims, are bored by “high culture,” and even those who ace English Literature quickly forget everything.

Is It Just ‘Signaling’?

Wastefulness understood, why do millions embrace the “more education” and “college-for-all” mantras? Is everybody delusional regarding the alleged financial payoff of a high school diploma or a college BA? Caplan explains this oddity with the concept of “signaling.” That is, a student’s educational record tells a potential employer a great deal about that person’s intelligence, conscientiousness, and conformity, so students will invest prodigious (or minimal) effort to demonstrate worthiness largely independent of what they substantively acquire in the classroom.

Thus, a young man who completes a four-year degree at MIT in three years signals a potential employer that he is a great catch even if the acquired learning is, for the most part, vocationally irrelevant. Conversely, an equally talented youngster unable to graduate from a community college will not even be invited to a job interview. Who wants to hire somebody, no matter how smart, who lacks industry and perseverance? Employers cannot determine exactly what you learned, but they will happily pay a premium for those surviving the ordeal necessary to get the degree.

Caplan hardly argues that all schooling is wasteful, only some of it: undoubtedly at least a third, he says, maybe even 50 percent or more. Even the low-side estimate of a third signifies an enormous squandering of personal time and government money.

Might this troubling calculation counsel that students should shun college, learn marketable skills elsewhere, and invest the saved tuition in the stock market? Hardly. The inherent nature of signaling dictates following the mob: not attending college works only if millions likewise forgo the BA, since employers will judge your lack of a sheepskin as proof of unworthiness, regardless of your smarts and industry. The parallel is the futility of standing during a concert to see better, a strategy instantly defeated when everybody else stands.

Fluff Courses Make Sense

Going one step further, because employers look only at the credential as proof of worthiness, it is rational for an MIT student to enroll in as many fluff courses as possible from easy-grading professors: employers cannot tell the difference and, to some extent, don’t care. The MIT degree itself suffices.

To make his case scientifically, Caplan marshals massive quantities of evidence and is totally unafraid of offering personal judgments. For example, he personally classifies both high school and college courses into three categories: high usefulness, medium usefulness, and low usefulness. High school subjects deemed highly useful are English and mathematics (further subdivided, so that Algebra I is “highly useful” while Geometry is of “low usefulness”); the low-usefulness category includes foreign languages and the social studies.

College courses are similarly classified—highly useful are engineering, the health professions, and agricultural majors. Wasted learning, predictably, includes fine arts, psychology, journalism, and the liberal arts more generally. All in all, judged by the distribution of college majors in 2008–9, 40.5% of college students are squandering their time and money, at least according to Professor Caplan’s judgment.

It gets worse: this learning, however modest, evaporates with age. When 18,000 randomly selected American adults were quizzed in 2003 about reading, math, history, civics, science, and foreign languages, they recalled almost nothing despite years of exposure, with many of these subjects having been covered multiple times. Ample data also suggest that today’s college students devote less and less time to learning, so what ultimately remains in the brain will drop yet further. No wonder employers frequently complain about the difficulty of hiring good help!

No Gateway to High Culture

Nor does schooling instill an appetite for high culture, a love of “the best and the brightest.” The market says that this endeavor is largely pointless: only a tiny portion of adults pursue “high culture,” so schools are trying to satisfy minuscule future demand. If Caplan is right, returning to a cheap, bare-bones education that largely ends at 8th grade would not be a national catastrophe.

The bulk of The Case Against Education is spent disentangling the countless factors that contribute to economic success and that, at least partially, rival the signaling explanation. This can get tedious and, alas, often relies on incomplete data and the intricacies of specific analytical techniques. It is conceivable, for example, that attending Harvard is a low-yield learning experience but helps you meet fellow students and alumni able to offer you prestigious, well-paying jobs. Likewise, that Harvard graduates get rich may have less to do with classroom learning than with innate intelligence. Or, as some radical egalitarians insist, rich kids have the inside track to Harvard and join the elite thanks to their families’ pre-existing fortunes. Nevertheless, the signaling explanation holds up rather well against its rivals.

What does Caplan counsel after all this slash-and-burn analysis? His advice seems sensible: more and better vocational instruction, everything from classroom training to apprenticeships. In concrete terms, America employs roughly 900,000 carpenters but only 3,800 historians, so why not teach more carpentry than history? The professor even puts in a word or two for child labor—better than boring fifteen-year-olds with how to diagram a sentence. Alas, the fact that the government (and some private firms) already offers dozens of under-utilized vocational training programs receives scant attention.

The Case Against Education is a tour de force of modern economic analysis, but it skips over the payoff of “wasted” educational spending for society more generally. Even academically marginal schools with half-awake students can generate genuine value—for example, by invigorating rustbelt towns hanging on for dear life. Hundreds, perhaps thousands, of these third-tier schools and their party-animal enrollees exist, and this “wastefulness” might be the most effective way to deliver socially desirable economic uplift.

All Those Unemployed Professors

Similarly, what would Caplan do with all the unemployed professors (and armies of adjuncts and administrators) who would otherwise have taught such “useless” subjects as history, psychology, and foreign languages? It is easy to visualize thousands of unemployed Marxist Ph.D.’s scheming to elect a Bernie Sanders who promises college “for everybody.” In the grand scheme of things, it may be preferable to have all these Ph.D.’s “gainfully employed,” albeit pointlessly, than to have them working part-time at Starbucks. Keep in mind how much better the planet would be if the young Karl Marx had been able to secure a professorial appointment at the University of Jena.

So, if you have the Sitzfleisch and relish complex, clever, occasionally counterintuitive, and long-winded arguments, this is a great book. Even if you cannot fathom a word, carry it around and impress your friends with your erudition. As Oscar Wilde said, only shallow people do not judge by appearances, and Professor Caplan would probably agree, though he would call it signaling.

Sexual Abuse Gets a Free Pass on Campus

Amid the tidal wave of sexual abuse allegations against powerful individuals in politics, sports, the media, the entertainment industry, and in academia, one stands out because it has not inspired the kind of collective outrage that the others have. Ithaca College’s new President, Shirley M. Collado, was accused—and convicted—of sexually abusing a female patient in 2001 while working as a psychologist at the Psychiatric Institute of Washington.

According to court records, and an article in a recent issue of The Ithacan by the student newspaper’s editor-in-chief, Aidan Quigley, President Collado pleaded nolo contendere to sexual abuse in the Superior Court of the District of Columbia. She also admitted to living with the female patient in her home after the patient was discharged from the psychiatric hospital. Collado, who was 28 years old at the time, accepted the conviction and received a 30-day suspended sentence, 18 months of supervised probation, and 80 hours of community service at a site that, the court mandated, “should not directly involve vulnerable people.”

Seven Colleges Let It Go

For the past 17 years, Collado has held teaching and administrative positions in higher education, working with students at New York University, Georgetown, George Mason University, The New School, Middlebury College, Lafayette College, and most recently at Rutgers University, where she was executive vice-chancellor and chief operating officer at the Newark campus. She has also served as Executive Vice-President of the Posse Foundation, a non-profit organization that enables low-income minority students to attend college.

Ithaca College hired Collado last year, and according to The Chronicle of Higher Education, she revealed the “claims” against her in a campus interview shortly after she was hired as president in February 2017. A Chronicle article, “How a Nagging Detail Plays Out in a Presidential Search,” explained that the search was “closed,” meaning that the campus community was not aware of the candidates prior to Collado’s hiring. The daughter of immigrants from the Dominican Republic, Collado seemed the perfect candidate to a search committee convened after the former president abruptly resigned amid campus unrest over allegations of racial injustice.

Collado previously served as dean and Chief Diversity Officer at Middlebury College and as co-chair of a national group called the Liberal Arts Diversity Officers. She told The New York Times on June 7, 2017, “I don’t have to be the chief diversity officer to be doing chief-diversity-officer work.”

A Dream Candidate

James W. Nolan Jr., an Ithaca board member and chair of the search committee, told The New York Times last June that the college was looking for a leader who would “encourage people to be talking, to be heard, that would really seem to be looking to bring the community together.” As the first person of color to lead the beleaguered college, Collado must have seemed like a dream candidate.

In a published statement to the campus, Collado maintained her innocence and said she had made the no-contest plea on the basis of legal advice. She added that her decision to plead no contest came shortly after her husband’s suicide, and that she “fought the claims for a while, but did not have the resources, social capital or wherewithal to keep going.” One member of the search committee told a reporter at The Chronicle that the committee had “spoken with Collado about the case before she was hired. After deliberating, the committee decided the case was a singular incident and not a pattern of behavior.”

Indeed, there have been many such allegations of sexual abuse against powerful people—and against powerless students and employees—in the past year that involved a “singular incident,” but few of the accused have received the kind of understanding and mercy that Collado has. Collado told The Chronicle that she was grateful for the support she received on campus, adding that four bouquets of flowers from supportive members of the community arrived in her office last week. Yet Collado was allowed to enter her nolo contendere plea to a comparatively mild charge—placing one hand on the clothed breast of the patient. The patient, however, said there was more sex involved.

A Brave Student Editor

In contrast, Aidan Quigley, the editor-in-chief of The Ithacan, who broke the story after receiving in the mail a packet of court materials on the 2001 case from an anonymous sender, seems to be receiving few campus accolades for his courageous reporting. Letters to the editor of The Ithacan from some Ithaca faculty members took a harsh position against the student editor. Professor Nick Kowalczyk called Quigley’s story “shoddy reporting at best…. That this story broke quickly on Fox News and within 18 hours was commented upon [by] no less than 922 right-wing trolls, whose comments are rife with misogyny and bigotry and white fragility suggests exactly where the sender of the anonymous package hoped for the story to land.”

Ignoring the 2001 court documents, including the witness statements, Kowalczyk simply assumes Collado’s innocence. Likewise, Harriet Malinowitz, an Ithaca lecturer in Women’s and Gender Studies, wrote, “As a part-time faculty and as LGBTQ faculty, I have had my views explicitly sought by her and discussed with the kind of reflectiveness and care one usually only dreams of from a higher ed administrator. I think that she is a gem. …I am sure she is suffering greatly right now, and I hope others will join me in extending her massive outpourings of support.”

In a radio interview on WRFI, Quigley was asked whether it was “fair” of him to put Collado through “double jeopardy in this way.” Noting that Collado had gone through court, been convicted, and received a sentence, the interviewer suggested that she had already been punished enough.

Lost in all of this is the psychiatric patient who claimed in court documents to have been sexually abused by Collado. There has been no #metoo moment for her. This is unfortunate for two reasons: students are learning that some people are part of a protected class and will be forgiven their transgressions because of their racial identity, and victims of sexual abuse may be wary of naming abusers from a protected group. We have been told that we need to believe the victims. But as this case demonstrates, the excuses offered by alleged offenders and their enablers exhibit similar themes: claims that “it was a singular incident” or that “she is a gem” do little to help the victim.

In response to all of this, Collado has now begun to assume victim status herself, claiming that it was “unsettling” to receive the anonymous attack on her. She has said that she felt “targeted” by the negative attention and told The Chronicle, “I’ve shared things that I think most presidents don’t get up and share about who they are.” But the disturbed patient involved said Collado had sex with her repeatedly, once in a threesome with a male. And Collado picked a very vulnerable victim, one who had already been sexually abused as a child and again as an adult by a doctor convicted of the crime. In an age of “me too,” how much should a college overlook in a search for a diversity-minded president?

Photo by Eugene Kukulka

Jordan Peterson and the Lobsters

Not many academics use lobsters as a stepping stone to fame, but the Canadian psychologist Jordan Peterson just did. Last week he was questioned by a British journalist named Cathy Newman in what may have been one of the most maladroit interviews in the entire history of journalism. Every time Peterson made a point, Newman would aggressively mangle what he said and throw it back at him as an indignant accusation. (Conor Friedersdorf smartly analyzes this bizarre interview in the Atlantic.)

At one point Newman says, “Let me get this straight. You’re saying that we should organize our societies along the lines of the lobsters?” Peterson hadn’t suggested that we should all grow claws, live on the ocean floor, or consult crustacean tradition on political organization. He was denying the popular leftist idea that hierarchical structures are sociological constructs of the Western patriarchy.

Here’s what Peterson said: “That is so untrue that it’s almost unbelievable. I use the lobster as an example: We diverged from lobsters’ evolutionary history about 350 million years ago. And lobsters exist in hierarchies. They have a nervous system attuned to the hierarchy. And that nervous system runs on serotonin just like ours. The nervous system of the lobster and the human being is so similar that anti-depressants work on lobsters. And it’s part of my attempt to demonstrate that the idea of hierarchy has absolutely nothing to do with sociocultural construction, which it doesn’t.”

The Lobsters Live On

A long and laudatory article on Peterson’s rebellion against PC and the leftist cast of higher education ran in the Guardian with a photo of Peterson in a jacket and tie holding a lobster in each hand. Another photo has Peterson sitting on a pile of books with a black cat in front. Floating by are a chair, a skateboard, a book, and two lobsters.

Peterson came to wide attention in Canada and the U.S. last year when he refused to use the made-up pronouns of the transgender movement in his classes, though his employer (the University of Toronto) and his province (Ontario) insisted that he must.

Since his pronoun rebellion, Peterson has been increasingly visible in YouTube videos and has attracted notice from other intellectuals. In fact, he is now regarded as one of the more significant campaigners against the domination of the campuses by the left, and his work in psychology has drawn a good deal of attention. Camille Paglia considers him “the most important Canadian thinker since Marshall McLuhan.”

Tim Lott reported in the Guardian:

“He believes most university humanities courses should be defunded because they have been ‘corrupted by neo-Marxist postmodernists’ – particularly women’s studies and black studies. This has led him to be branded a member of the alt-right – although his support for socialized healthcare, redistribution of wealth towards the poorest and the decriminalization of drugs suggests this is far from the whole story. He defines himself as a ‘classic British liberal.’ But he also says – when challenged for being a reactionary – that ‘being reactionary is the new radicalism.’

Peterson has largely been in the news for his blazing, outspoken opposition to much of the far-left political agenda, which he characterizes as totalitarian, intolerant and a growing threat to the primacy of the individual – which is his core value and, he asserts, the foundation of western culture.”

He has also taken on Google, reporting that it blocked one of his YouTube videos in 28 countries as extreme.

Peterson combines a good sense of humor with a dark view that life is a catastrophe and that the aim of life is not to be happy. He is a gifted and entertaining teacher whose videos have been watched more than 35 million times, and he is a passionate individualist: “Your group identity is not your cardinal feature. That’s the great discovery of the west. That’s why the west is right. And I mean that unconditionally. The west is the only place in the world that has ever figured out that the individual is sovereign. And that’s an impossible thing to figure out. It’s amazing that we managed it. And it’s the key to everything that we’ve ever done right.”

He is also willing to use apparently frivolous chapter headings in his most recent book, 12 Rules for Life: An Antidote to Chaos.

Peterson’s 12 Rules

Rule 1 Stand up straight with your shoulders back

Rule 2 Treat yourself like you would someone you are responsible for helping

Rule 3 Make friends with people who want the best for you

Rule 4 Compare yourself with who you were yesterday, not with who someone else is today

Rule 5 Do not let your children do anything that makes you dislike them

Rule 6 Set your house in perfect order before you criticize the world

Rule 7 Pursue what is meaningful (not what is expedient)

Rule 8 Tell the truth – or, at least, don’t lie

Rule 9 Assume that the person you are listening to might know something you don’t

Rule 10 Be precise in your speech

Rule 11 Do not bother children when they are skateboarding

Rule 12 Pet a cat when you encounter one on the street