
Do Free Speech Students Outnumber the Snowflakes?

As Middlebury initiated what appear to be token punishments (single-term probation) for the students who disrupted the Charles Murray talk, the college’s student government (which has yet to condemn the disruptors in any way) passed a resolution demanding that Middlebury cease all punishment of students under the current college disciplinary code, lest such punishments “contribute to psychological trauma for marginalized students held accountable for disruption.” The vote, which passed 10-3, continued a disturbing pattern of the Middlebury student government seeming to endorse, or at least excuse, the actions of the mob. For a sense of the demonstrators’ hostility to free speech in their own words, listen to this New York Times podcast from Monday.

Countering this news, however, came a recent poll from Yale. Sponsored by the William F. Buckley, Jr. Program, the poll found that by a more than 4-to-1 margin, Yale students opposed speech codes; and by a 16-to-1 margin, students endorsed bringing in intellectually diverse speakers rather than forbidding “people from speaking on campus who have controversial views and opinions on issues like politics, race, religion or gender.” While some caveats exist (the pollster, McLaughlin, has a poor track record, and asking the second question in a different way—stressing the purported harm speakers pose to students—might have yielded a less promising result), the findings are encouraging.

It also matters, from a policy angle. If, in fact, the Middlebury student government represents the majority viewpoint of the student body, then little chance exists for meaningful dialogue on campus, absent very aggressive intervention, likely from trustees and perhaps even from legislators. If, on the other hand, anti-civil liberties activists represent only a minority, then colleges and universities should do more to facilitate events where the more passive (silenced?) majority of students can exchange ideas. Administrators, in particular, could do more, at relatively little cost—perhaps by adopting the University of Chicago principles, perhaps by encouraging faculty to facilitate a broader array of voices speaking on campus.

Along these lines, it might be useful to share a recent experience of mine at Lafayette College. Early in the term, a newly formed campus organization, the Mill Series, asked me to give a talk on due process and campus sexual assault. It quickly became clear things might not go well; the social media response on campus seemed fairly unfavorable, and the date of the talk had to be changed twice to avoid further inflaming campus constituencies. But the talk wound up going very well. (I’ll link to the video on my Twitter feed when it becomes available.) Turnout was robust. Some questions were supportive of my thesis; some were skeptical, a few highly so. But all of the questions were well-informed and responded to the actual content of the talk, rather than to what the students might have expected me to say before the talk began. A couple of students even noted in the Q&A session, which wound up running several hours, that they had anticipated a somewhat different talk, seemingly because of the hostile pre-talk social media content.

So why did this talk not generate a disturbing response, like Charles Murray’s at Middlebury or Heather Mac Donald’s at Claremont McKenna? First, the organizers—Professor Brandon Van Dyck and Lafayette student Abdul Manan—actively engaged with campus critics before the talk. (Because the Mill Series has no sponsorship, they were volunteering their effort.) Obviously, this type of pre-talk engagement placed an unfair burden on their time, and shouldn’t be a requirement for any talk organizer, but their willingness to be proactive clearly defused a good deal of the tension before I arrived.

Second, the Lafayette students themselves had already been engaged with the issue of speech on campus. Earlier this semester, the student government appointed an ad hoc committee to look into whether Lafayette heard from a sufficient variety of speakers. While many of the students who attended my talk (it was an ideologically diverse group) seemed critical of the committee’s work, none questioned the general principle that hearing from people with different views forms an important part of a quality liberal arts education. In a concrete way, the students’ behavior seemed to confirm the findings of the Yale poll.

For understandable reasons, protests like those at Claremont McKenna and Middlebury attract media attention. But to the extent that disruptive students can be isolated rather than accommodated, colleges should do so.

NYU Professor Sides with “Snowflakes” Against Free Speech

Many leftist academics have denounced the recent spate of riots and shouting down of non-progressive speakers on college campuses – and good for them – but you knew that there were others who were glad to see students fighting back against such supposedly dangerous people as Charles Murray. One of them has put his thoughts into an op-ed piece for the New York Times and it is worth reading to understand why this kind of behavior is apt to continue.

Writing on April 24, New York University vice provost and professor of literature Ulrich Baer makes a case for the suppression of some speech in “What ‘Snowflakes’ Get Right About Free Speech.”

In Baer’s opinion, “The idea of freedom of speech does not mean a blanket permission to say anything anybody thinks. It means balancing the inherent value of a given view with the obligation to ensure that other members of a given community can participate in discourse as fully recognized members of that community.”

Let’s stop and take a look at that assertion. Freedom of speech really does mean “blanket permission” for each person to say whatever he thinks, just as free trade means blanket permission for people to enter into trade with anyone they want. Once you take away that complete freedom, you enter a world of selective permission to speak or to trade and that in turn requires having some person or group in authority to decide who receives permission and who does not.

Baer continues, declaring that “the inherent value” of some idea a person might want to express must be “balanced” against something else, namely “the obligation to ensure that other members” can “participate in discourse as fully recognized members of that community.” But how do we (that is, whatever authority gets the power) balance the value of an idea against the notion that each community member must be able to participate in discourse? If we have a regime of free speech, then everyone is able to participate in discourse and no one has to “balance” anything.

What Baer is getting at is the claim that some ideas are so hurtful to some people that those injured individuals cannot participate in discourse because they aren’t “fully recognized.”  The question he never addresses is why we should believe that.

Let’s say that a college allows someone on campus who argues in favor of white supremacy, as Auburn recently did. Everyone was free to ignore the speaker as a fool or to argue against his ideas. No non-white student or other member of the Auburn community felt “unrecognized” by this speaker’s presence or unable to participate.

Baer argues that some ideas should not be debated because they “invalidate the humanity of some people.” On the contrary, even terrible ideas should be debated. Doing so sharpens the case against them, as John Stuart Mill pointed out in On Liberty.

Furthermore, Baer sets up a straw man when he writes, “I am not overly worried that even the shrillest heckler’s vetoes will end free speech in America.” Of course, the sorts of nasty actions we have seen at Berkeley, Middlebury and elsewhere won’t “end free speech in America,” but what they do accomplish is to prevent particular instances of free speech at specific places.

If we excuse those actions, as Baer does, we will get more of them and less free speech. You would think that a college professor would understand that our national commitment to freedom of speech necessarily means defending it each time it is attacked.

Implicit in Baer’s piece is the idea that because certain groups of people are less adept at making rational arguments for themselves, they should be allowed to veto those who are (or at least might be) good at it, by preventing them from speaking. That, obviously, is a dangerous concept. Who then gets to decide when a person or idea is unacceptable and deserves to be censored? History gives us the answer: It will be those who are zealous fanatics for authoritarian programs that undermine civility and our social fabric.

Why Millennials Are So Fragile

I have stopped counting the number of times an academic colleague has reminded me that “undergraduates are not what they used to be.” In private conversations, a significant minority of academic teachers have raised the concern that the age-old distinction between schoolchildren and university students is fast losing its meaning.

Back in 2003, Neil Howe and William Strauss, the authors of the study Millennials Go to College, advanced the thesis that this generation is far less mature and resilient than previous ones. They noted that the millennial generation is far more “closely tied to their parents” than the students who preceded them, and that it insists on a “secure and regulated environment.”

Howe and Strauss concluded that, as a result, students today find it difficult to flourish in the relatively unstructured environment of higher education. The assessment that millennials find the transition to independent living on campus more troublesome than previous generations did is widely held by educators on both sides of the Atlantic.

A report last September from Britain’s Higher Education Policy Institute said that the normal experiences of university life now constitute serious challenges to the well-being of the current cohort of students. It noted that “students are vulnerable” because in most cases they are living away from home for the first time, and it pointed to new challenges they face, such as “a different method of learning” and “living with people they have never met before.”


One of the most significant and yet rarely analyzed developments in campus culture has been its infantilization.  Eric Posner, a leading legal scholar at the University of Chicago, declared that “students today are more like children than adults and need protection.” Posner contends that today’s university students are not ready for independence and require the moral guidance of their institutions.

In England, a group of educators has criticized universities for treating their new students as if they were young adults. Sir Anthony Seldon, now head of Buckingham University, stated that “there is a belief among Vice-Chancellors that young people are adults and can fend for themselves, but 18-year-olds today are a lot less robust and worldly wise.”

Most accounts of the unprecedented emotional fragility of university undergraduates claim that this development is the outcome of the expansion of student numbers. They suggest that many of these students come from diverse, non-traditional backgrounds and lack the confidence and financial security of their more privileged predecessors. Catherine McAteer, the head of University College London’s student psychological services, observed that a growing number of students require mental health support because “students are now coming to university” who previously “would not have come.”

Some argue that first-generation students – undergraduates whose parents did not attend university – face unique problems attempting to fit into an alien, high-pressure environment. It is also asserted that since a significant proportion of first-generation students come from minority and socially deprived backgrounds, they face a unique problem of adjusting to the traditionally white, middle-class campus environment.


The principal problem faced by first-generation students in the past was that their parents had little cultural capital to hand on to them; they were, therefore, less prepared for university life than their more comfortably-off peers. But unlike today, the problems they faced were not portrayed in psychological terms but in the language of culture and socio-economic deprivation.

Unfortunately, when first-generation students arrive on campus today, they are often treated as if they are likely to suffer from emotional deficits. In the U.S., it is common for universities to organize special programs for integrating first-generation students. Diversity officers dealing with first-generation students often operate under the theory that this group faces a unique problem of being torn between family and university. They frequently contend that first-generation students suffer from guilt at leaving their families behind. The upshot of these theories is the belief that first-generation students need dedicated psychological support.

Regrettably, the focus on psychology distracts attention from more constructive ways of preparing students from disadvantaged backgrounds to deal with the pressures of academic learning. Providing academic support that helps students gain intellectual confidence is probably the most useful way of helping them make their way in the university.

Perversely, the provision of psychological support as the default solution for helping first-generation students is likely to intensify their quest for validation. Instead of developing their resilience, it may well heighten their sense of vulnerability. What universities need to do is not cultivate the insecure identity of first-generation students but provide them with the intellectual resources that will help them gain confidence in their ability to achieve.


In any case, it is far from evident that the link between emotional fragility and a student’s non-traditional background explains very much. Students from well-to-do backgrounds are no less likely than their poorer peers to talk the language of trauma and psychological distress. Indeed, some of the most privileged campuses – Oxford, Cambridge, Yale, Berkeley, Oberlin – have been at the forefront of campaigns that focus attention on the emotional harms suffered by students from a variety of alleged causes.

The reason the current generation appears to behave differently from its predecessors has little to do with socio-economic background. Rather, the sense of emotional fragility expressed by some undergraduates is the outcome of a prevailing ethos of socialization that treats young people as children.

The socialization of young people has become increasingly reliant on therapeutic techniques that have the perverse effect of encouraging children and youth to interpret existential problems as psychological ones. The concern with children’s emotions has fostered a climate in which many young people are continually educated to understand the challenges they face through the language of mental health. Not surprisingly, they often find it difficult to acquire the habit of independence and make the transition to forms of behavior associated with the exercise of autonomy.

The complex emotional tensions that are integral to the process of growing up are now discussed as stressful events with which children and young people cannot be expected to cope. Yet it is through dealing with such emotional upheavals that young people learn to manage risks and gain an understanding of their strengths and weaknesses. Instead of being encouraged to acquire an aspiration for independence, many youngsters are subject to influences that promote childish behavior. The infantilization of young people is the unintended outcome of parenting practices that rely on levels of support and supervision more suitable for much younger children.

The relations of dependence that are nurtured through these practices serve to prolong adolescence to the point that many young people in their 20s do not perceive themselves as adults. Whereas in the past infantilization was classically associated with the phenomenon of maternal overprotection, today the prolongation of adolescence is culturally sanctioned. In the case of universities, it is institutionally enforced.

Socialization through validation

The erosion of the line that divides secondary from higher education is a trend that contradicts the ethos of academic teaching and the vocation associated with it. In theory, the ideals associated with the university remain widely affirmed, but in practice, they are often tested by the introduction of conventions that were formerly confined to secondary education. The adoption of paternalistic practices and the wider tendency towards the infantilization of campus life can in part be understood as an outcome of the difficulties that society has encountered in the socialization of young people.

For some time now, it has been evident that parents and schools have been struggling with the transmission of values and rules of behavior to young people. In part, this problem was caused by older generations’ lack of confidence in the values into which they were socialized. More broadly, Western society has become estranged from the values that used to inspire it and has found it difficult to provide its adult members with a compelling narrative for socialization.

The hesitant and defensive manner in which the task of socialization is pursued has created a demand for new ways of influencing children. The growing remit of child protection and the widening of the territory for parenting activities can be interpreted as attempts to develop new methods for guiding children.

Lack of clarity about the transmission of values has led to a search for alternatives. The adoption of behavior-management practices serves as one influential approach to solving the problem of socialization. These psychological techniques of expert-directed behavior management have had an important influence on childrearing. From this standpoint, the role of parents is not so much to transmit values as to validate the feelings, attitudes, and accomplishments of their children.

Though parents still do their best to transmit their beliefs and ideals to their children, there has been a perceptible shift from instilling values to providing validation. Affirming children and raising their self-esteem is a project actively promoted by parents as well as schools. This emphasis on validation has run in tandem with a risk-averse regime of child-rearing. Social psychologist Jonathan Haidt has described this form of childrearing as “fearful parenting.” He claims that since the 1980s, children have been “protected as fragile,” which has the perverse consequence of undermining their capacity for resilience.

As I noted in my study, Paranoid Parenting, the (unintended) consequence of this regime of parenting has been to limit opportunities for the cultivation of independence and to extend the phase of dependence of young people on adult society. The extension of the phase of dependence is reinforced by the considerable difficulties that society has in providing young people with a persuasive account of what it means to be an adult. Instead of encouraging new undergraduates to embark on a life of independent study, universities have adopted a paternalistic ethos that treats them as biologically mature children. In this way, they have helped create a campus culture that discourages young people from embarking on the path to adulthood.

Wear that Batik Dress and You’re a Cultural Appropriator

What follows are excerpts from the keynote speech on “Fiction and Identity Politics” delivered September 8 at the Brisbane Writers Festival in Australia by Lionel Shriver.

Let’s start with a tempest-in-a-teacup at Bowdoin College in Brunswick, Maine. Earlier this year, two students, both members of student government, threw a tequila-themed birthday party for a friend. The hosts provided attendees with miniature sombreros, which—the horror—numerous partygoers wore.

When photos of the party circulated on social media, campus-wide outrage ensued. Administrators sent multiple emails to the “culprits” threatening an investigation into an “act of ethnic stereotyping.” Partygoers were placed on “social probation,” while the two hosts were ejected from their dorm and later impeached. Bowdoin’s student newspaper decried the attendees’ lack of “basic empathy.”

The student government issued a “statement of solidarity” with “all the students who were injured and affected by the incident,” and demanded that administrators “create a safe space for those students who have been or feel specifically targeted.” The tequila party, the statement specified, was just the sort of occasion that “creates an environment where students of color, particularly Latino, and especially Mexican, feel unsafe.” In sum, the party-favor hats constituted – wait for it – “cultural appropriation.”

Curiously, across my country Mexican restaurants, often owned and run by Mexicans, are festooned with sombreros – if perhaps not for long.

But what does this have to do with writing fiction? The moral of the sombrero scandals is clear: you’re not supposed to try on other people’s hats. Yet that’s what we’re paid to do, isn’t it? Step into other people’s shoes, and try on their hats.

Yet were their authors honoring the new rules against helping yourself to what doesn’t belong to you, we would not have Malcolm Lowry’s Under the Volcano. We wouldn’t have most of Graham Greene’s novels, many of which are set in what for the author were foreign countries, and which therefore have Real Foreigners in them, who speak and act like foreigners, too.

In his masterwork English Passengers, Matthew Kneale would have restrained himself from including chapters written in an Aboriginal’s voice – though these are some of the richest, most compelling passages in that novel. If Dalton Trumbo had been scared off of describing being trapped in a body with no arms, legs, or face because he was not personally disabled – because he had not been through a World War I maiming himself and therefore had no right to “appropriate” the isolation of a paraplegic – we wouldn’t have the haunting 1938 classic, Johnny Got His Gun.

Though the book is nonfiction, it’s worth noting that we also wouldn’t have 1961’s Black Like Me, for which John Howard Griffin committed the now unpardonable sin of “blackface.” Having his skin darkened – Michael Jackson in reverse – Griffin found out what it was like to live as a black man in the segregated American South. He’d be excoriated today, yet that book made a powerful social impact at the time.

The author of Who Owns Culture? Appropriation and Authenticity in American Law, Susan Scafidi, a law professor at Fordham University who for the record is white, defines cultural appropriation as “taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission. This can include unauthorized use of another culture’s dance, dress, music, language, folklore, cuisine, traditional medicine, religious symbols, etc.”

What strikes me about that definition is the “without permission” bit. How are we fiction writers to seek “permission” to use a character from another race or culture, or to employ the vernacular of a group to which we don’t belong? Do we set up a stand on the corner and approach passers-by with a clipboard, getting signatures that grant limited rights to employ an Indonesian character in Chapter Twelve, the way political volunteers get a candidate on the ballot?

I am hopeful that the concept of “cultural appropriation” is a passing fad: people with different backgrounds rubbing up against each other and exchanging ideas and practices is self-evidently one of the most productive, fascinating aspects of modern urban life.

But this latest, faintly absurd no-no is part of a larger climate of super-sensitivity, giving rise to proliferating prohibitions, supposedly in the interest of social justice, that constrain fiction writers and prospectively make our work impossible.

So far, the majority of these farcical cases of “appropriation” have concentrated on fashion, dance, and music: at the 2013 American Music Awards, Katy Perry got it in the neck for dressing like a geisha. According to the Arab-American writer Randa Jarrar, for someone like me to practice belly dancing is “white appropriation of Eastern dance,” while according to the Daily Beast, Iggy Azalea committed “cultural crimes” by imitating African rap and speaking in a “blaccent.”

The felony of cultural sticky fingers even extends to exercise: at the University of Ottawa in Canada, a yoga teacher was shamed into suspending her class, “because yoga originally comes from India.” She offered to re-title the course, “Mindful Stretching.”

Seriously, we have people questioning whether it’s appropriate for white people to eat pad Thai. Turnabout, then: I guess that means that as a native of North Carolina, I can ban the Thais from eating barbecue. (I bet they’d swap.)

Writing fiction is a disrespectful vocation by its nature – prying, voyeuristic, kleptomaniacal, and presumptuous. And that is fiction writing at its best. When Truman Capote wrote from the perspective of condemned murderers from a lower economic class than his own, he had some gall. But writing fiction takes gall.

As for the culture police’s obsession with “authenticity,” fiction is inherently inauthentic. It’s fake. It’s self-confessedly fake; that is the nature of the form, which is about people who don’t exist and events that didn’t happen. The name of the game is not whether your novel honors reality; it’s all about what you can get away with.

My most recent novel The Mandibles was taken to task by one reviewer for addressing an America that is “straight and white”. It happens that this is a multigenerational family saga – about a white family. I wasn’t instinctively inclined to insert a transvestite or bisexual, with issues that might distract from my central subject matter of apocalyptic economics. Yet the implication of this criticism is that we novelists need to plug in representatives of a variety of groups in our cast of characters, as if filling out the entering class of freshmen at a university with strict diversity requirements.

You do indeed see just this brand of tokenism in television. There was a point in the latter 1990s at which suddenly every sitcom and drama in sight had to have a gay or lesbian character or couple. That was good news as a voucher of the success of the gay rights movement, but it still grew a bit tiresome: look at us, our show is so hip, one of the characters is homosexual!

We’re now going through the same fashionable exercise in relation to transgender characters in series like Transparent and Orange Is the New Black.

Fine. But I still would like to reserve the right as a novelist to use only the characters that pertain to my story.

Besides: which is it to be? Must we tend our own gardens and write only about ourselves, or people just like us, because we mustn’t pilfer others’ experience? Or must we people our casts like an “I’d like to teach the world to sing” Coca-Cola advert?

For it can be dangerous these days to go the diversity route, especially since there seems to be a consensus around the notion, put forward by one San Francisco reviewer, that “special care should be taken with a story that’s not implicitly yours to tell.”

Thus in the world of identity politics, fiction writers had better be careful. If we do choose to import representatives of protected groups, special rules apply. If a character happens to be black, they have to be treated with kid gloves and never placed in scenes that, taken out of context, might seem disrespectful. But that’s no way to write. The burden is too great, the self-examination paralyzing. The natural result of that kind of criticism in the Post is that next time I don’t use any black characters, lest they do or say anything that is short of perfectly admirable and lovely.

Why, it’s largely in order to keep from losing my fictional mojo that I stay off Facebook and Twitter, which could surely instill an instinctive self-censorship out of fear of attack. Ten years ago, I gave the opening address of this same festival, in which I maintained that fiction writers have a vested interest in protecting everyone’s right to offend others – because if hurting someone else’s feelings, even inadvertently, is sufficient justification for muzzling, there will always be someone out there who is miffed by what you say, and freedom of speech is dead. With the rise of identity politics, which privileges a subjective sense of injury as an actionable basis for prosecution, that is a battle that, in the decade since I last spoke in Brisbane, we have been losing.

Worse: the left’s embrace of gotcha hypersensitivity inevitably invites backlash. Donald Trump appeals to people who have had it up to their eyeballs with being told what they can and cannot say. Pushing back against a mainstream culture of speak-no-evil suppression, they lash out in defiance, and then what they say is pretty appalling.

We should be seeking to push beyond the constraining categories into which we have been arbitrarily dropped by birth.

Writing during the day and reading when I go to bed at night, I find it an enormous relief to escape the confines of my own head. Even if novels and short stories only do so by creating an illusion, fiction helps to fell the exasperating barriers between us, and for a short while allows us to behold the astonishing reality of other people. We fiction writers have to preserve the right to wear many hats – including sombreros.

Lionel Shriver is an American journalist and author who is resident in the United Kingdom. Her novels include We Need to Talk About Kevin and The Mandibles.