Tag Archives: academically adrift

More Bad News about College

What was the most noteworthy finding of the recent Gallup survey of people who have attended college? Half of the 90,000 respondents regretted at least one significant decision they made as undergraduates, such as picking the wrong major. In journalistic terms, this is known as burying the lede — downplaying the major point of a story while elevating some minor point.

The major finding, stunning really, appears under the heading, "Most U.S. Adults Say They Had a High-Quality Postsecondary Education." Gallup asked respondents whether they received a "high-quality" education, and they answered overwhelmingly in the affirmative.

Fifty-eight percent of those who earned a bachelor's degree assigned their school a 5, the highest rating, while another 31 percent gave it a 4. Even among those who never earned a degree, 40 percent gave a 5 and 30 percent a 4.

But most college students in America today do not receive a high-quality education. Academically Adrift, the 2011 book that struck the world of higher education like a bombshell, showed that only a small fraction of students make significant gains in critical thinking and problem-solving from freshman to senior year. Surveys of employers, too, consistently find them complaining about graduates' poor reading and writing skills.

Related: College Students Now, the Good and the Bad

One poll is particularly relevant here. When the Association of American Colleges & Universities commissioned a poll of college students and employers that focused on students' readiness for the workplace, an astonishing gap was revealed. While 65 percent of students said that they were "well prepared" in written communication, only 27 percent of employers agreed. Similarly wide discrepancies showed up in critical thinking and problem-solving.

Coincidentally, a week after the Gallup poll appeared, The Wall Street Journal published an investigative report, "Many Colleges Fail in Teaching How To Think," based on Collegiate Learning Assessment scores at dozens of colleges and universities dating back to 2013. The results are embarrassing, and they reinforce the judgments of Academically Adrift authors Richard Arum and Josipa Roksa.

The Journal reported that at more than half of the 200 schools tested, at least a third of seniors were unable to make cohesive arguments, assess the quality of evidence in a document, or interpret data in a table. This is a devastating finding. International rankings show U.S. college grads in the middle of the pack on numeracy and literacy and near the bottom when it comes to problem-solving.

The key paragraph reads, "At some of the most prestigious flagship universities, test results indicate the average graduate shows little or no improvement in critical thinking over four years."

Related: How to Make College as Bad as High School

The data comes from public records requests, and so the Journal's findings apply to public institutions, not private ones. The biggest point gains didn't come from top research universities; Plymouth State University in New Hampshire led the list. Students at the University of Kentucky and the University of Texas at Austin didn't show much improvement at all, and those schools no longer use the test. The University of Louisiana at Lafayette scored low as well, and it, too, has dropped the test.

The Journal quotes a 2011 Lafayette graduate who recalls, “I wasn’t as focused as I should have been, but in a lot of classes, we just watched videos and documentaries, then we would talk about them. It wasn’t all that challenging.”  He now works in a local coffee shop.

I have no doubt that any other objective measure of actual learning that takes place from matriculation to graduation—except for the competitive areas of pre-med and STEM fields—will replicate these disappointing results. Even if super-selective institutions point to the strong scores that their graduates earn on the CLA, they will not be able to show much value-added impact. That is, their students came in with sound critical thinking skills, and they left with, oh, a bit more.

Those data points force another interpretation of the high ratings people give to the quality of higher education. Instead of proving the actual rigor and excellence of undergraduate instruction in the United States, the sanguine estimates evince the low educational standards of American millennials. They just don't know what actual excellence is. How could they, when grade inflation in high school and college has reached such an absurd level that nearly half of all college grades are in the A range? If their teachers awarded them the top mark, well, then, they learned a lot in the course. If the work required of them during the semester seemed suspiciously light, well, that may be due to the sparkling intelligence of the student, not to a cushy workload.

Or, perhaps, the faith that they received a high-quality education only proves their gullibility. Every college has abundant marketing materials proclaiming the wonderful education it provides, and students trust those pledges of superiority. It soothes their vanity. After all, the more superb the education they received, the more educated they are. The respondents in the Gallup poll, early in their adult lives and searching for jobs and spouses, want to believe in their own special condition. Acknowledging a crummy education hampers their self-confidence. They need the power of positive thinking.

Millennials have been encouraged ever since kindergarten to overestimate their own abilities. They aren't going to stop once they graduate. It takes several years of the realities of the American workplace to temper that self-assessment.

Why Did ‘Academically Adrift’ Strike a Chord?

Academically Adrift: Limited Learning on College Campuses (2011), which Josipa Roksa and I wrote, had 67 pages of statistical tables and was described as "a dense tome that could put Ambien out of business." Yet it was one of those rare social science books that found a readership and influence outside of typical disciplinary boundaries.

Why did Academically Adrift capture more attention than one would expect? We can potentially find an answer to this question by considering a recent study by Pamela Barnhouse Walters and Annette Lareau that focuses on why some educational research ends up being influential. Walters and Lareau argue that "it is the consistency of the research findings with prevailing political concerns, with prevailing understandings of what is wrong with schools and schooling, and with already-formed policy preferences of powerful social groups" that determines its influence on policy development.


According to Chester Finn, educational research shapes policy "when advocates, policy makers and journalists… [are] able to use these studies to devise, justify, or sustain a reform agenda. Thus the research is less a source of change and more an 'arsenal' for those already fighting the policy wars."

Perhaps not surprisingly, the most hostile response to the work came from those associated with promoting the expansion of student-service infrastructure on college campuses. In recent decades, as we noted in Academically Adrift, this is the area of colleges and universities that has grown most rapidly. As full-time faculty positions have decreased in relative terms, quasi-professional staffing on campus has increased and been charged with greater responsibility for ensuring students' general well-being and promoting their social engagement with and attachment to college life. Social-engagement policies and programs were promoted in part to reduce the high attrition rates of college students, since the more individuals are integrated into campus social life, the less likely they are to drop out.

In recent decades, however, advocates of increased social engagement went a step further.  Educational researchers identified an association between self-reported learning and social engagement.  Based on these findings, social engagement was advocated as a strategy to promote both retention and learning.

Academically Adrift was an explicit challenge to this research and programmatic paradigm. In our project, we found no evidence that social engagement was associated with positive learning outcomes. Our work was thus understood as standing in direct opposition both to those promoting such investments in college life and to a research apparatus dependent on survey instruments focused on self-reported learning. Criticism from these quarters sought to discredit the work by challenging the CLA measure. The CLA was argued to be incapable of measuring generic collegiate learning – despite the fact that in our study it was demonstrated to be sensitive to instruction.

Roksa and I, however, were the first to admit the limitations of the CLA. All assessment instruments are by definition limited and imperfect. Clearly, though, the CLA was an improvement over the widespread use of self-reported learning measures that some of our critics had utilized in their own work. In addition, other researchers, relying on a different national sample and an alternative objective multiple-choice instrument designed to track higher-order generic skills such as critical thinking and complex reasoning, generated findings largely similar to those we reported on the CLA measure. Moreover, the low levels of academic engagement we found in our work were consistent with a large number of other studies, including findings from some of our critics' own survey research. We were not exaggerating the limited learning occurring in these settings, but attempting to report descriptive findings on the state of higher education accurately.
 

This is an excerpt from a paper delivered in New York City on October 24 at a conference, "Changes in Higher Education Since the 1960s," sponsored by the Manhattan Institute and the journal Society.

Three Things Colleges Don’t Want Us to Know


Universities are in the knowledge business, and creating and disseminating knowledge is at the very core of what colleges do. Yet some forms of knowledge about higher education itself are either unknown or hidden from the public. Why? Release of the information would prove embarrassing and possibly even costly to the school.

1. What Are the Teaching Loads?

This is prompted by an email I received from Bill Armstrong, president of Colorado Christian University and a former two-term U.S. Senator. He is looking for data on faculty teaching loads and cannot find it. Going to the latest Digest of Education Statistics, I learn that there were 7,500 faculty members teaching agricultural or home economics courses in 2003 between the ages of 35 and 39, or that there were 1,959 full-time-equivalent faculty teaching in Delaware in 2009. But in over 20 tables on staffing, there is not a word on teaching loads.

Why? I suspect the reason is simple: faculty don't teach very much, and far less than they used to. I have been around higher education for over 50 years, and my recollection is that at middling-quality state schools in the early 1960s, most faculty taught around 12 hours a week. At those same schools today, the average load is almost certainly not more than 9 hours. At top-flight universities, faculty taught about six hours a week in the 1960s, and often 3 hours or 4.5 hours (one semester, one course; the second semester, two courses) now. On average, we have seen at least a 25 percent reduction in loads.

Why? We are told it is because of the need to expand research output. And surely the number of academic journals and other outlets has exploded. But what percent of the research gets seriously read or cited? Mark Bauerlein of Emory, a regular contributor to Minding the Campus, has demonstrated that vast amounts of research are seldom even cited, and that the number of articles written in the last 25 years or so about, say, Shakespeare, reaches into the tens of thousands. Do not diminishing returns set in for academic research, as they do for everything else in life?

Continue reading Three Things Colleges Don’t Want Us to Know

What Will They Learn? Maybe Not Much

Academically Adrift, a study by two sociologists – Richard Arum of NYU and Josipa Roksa of the University of Virginia – demonstrated that 36 percent of our college students graduate with little or no measurable gains in their core academic skills – areas like expository writing and analytical reasoning. Their diplomas are literally tickets to nowhere. No, I take that back. With an average student debt of $25,250, they are tickets to long-term financial crises that can curtail their opportunities for decades.

The higher education establishment assures us that this poor showing is due to the underfunding of colleges. Not so. The average per-pupil expenditure on higher education in America is more than twice the average of other industrialized nations. No, the problem is not too little money. It is too little attention to what matters. What do students learn during those expensive college years?

Continue reading What Will They Learn? Maybe Not Much

Grade Inflation All the Way Up

Among the many troubling findings cited by Richard Arum and Josipa Roksa in Academically Adrift is this remarkable note on grade inflation:
 
– 55 percent of college students have a B+ grade average or higher (3.3 and higher)
– 85 percent of college students have a B- grade average or higher (2.7 and higher)
 
Those numbers demonstrate what most everybody has known for a long, long time.  The old grading system running from A to F is virtually meaningless.  Today, except for the small fraction of low outliers and dropouts, the only meaningful discriminations happen at the high end.  Are you a B+ student or an A- student?  The difference may come down to only two-tenths of a point, but generally that’s the best the scale can do.
 
Everybody knows it, especially when you get outside Organic Chemistry, Advanced Calculus, and other science/med school prerequisites, which have a forthright weeding-out purpose. In a humanities class, a "C" counts as a lost cause. I am just as guilty of the practice as most, but at least my colleagues and I usually admit it.

Continue reading Grade Inflation All the Way Up

Let’s Push Trustees to Solve the Adjunct Problem


For years now, a sad, steady flow of articles, books, and studies has documented the rise of the “disposable academic,” the growing underclass of poorly paid, uninsured PhDs who do the bulk of college teaching but have no real chance of ever landing a secure academic job. This is a tragedy, the argument goes, not only for the young scholars who will never become professors, but also for undergraduates (whose educations suffer when they are taught by “disposable” teachers) and for progress itself (adjunct work is not conducive to original research, open debate, and knowledge production).

But despite the large body of work on the subject, the ratio of “securely-employed” to “disposable” has only gotten worse over the years. As Richard Arum and Josipa Roksa report in their recent study Academically Adrift, in 1970, 78 percent of college teachers at degree-granting institutions were full-time faculty; in 2005, only 52 percent were.  Fully three quarters of all faculty appointments today are non-tenure-track. “The professor”–as a job, a vocation, and an academic institution–seems to be disappearing. 
 
Moreover, warnings can only fall on deaf ears so many times before they sound absurd. People are still flocking to grad school. Grad schools are still admitting students–and, despite high attrition, they are still overproducing PhDs. In Higher Education?, Andrew Hacker and Claudia Dreifus report that between 2005 and 2007, American universities awarded over 100,000 doctorates while creating fewer than 16,000 tenure-track assistant professorships.
 
“Don’t make fun of grad students,” Marge Simpson tells Bart after he taunts a group of PhDs working at the local bookstore. “They just made a terrible life choice.” It may be bad behavior, but mocking grad students’ “terrible life choice” is turning out to be a popular pastime for academia’s armchair humorists. In recent months, thanks to the DIY movie site Xtranormal.com, a number of grad students and professors have created short cartoons spoofing the hapless would-be professor. First there was “So You Want to Go To Law School,” which has been viewed over a million times since its launch in October 2010. Parodying the starry-eyed idealism of wannabe lawyers, the film had an obvious applicability to grad school, where starry eyes and poor employment prospects are in even greater supply. Short videos on the career suicide of pursuing a PhD in physics, economics, political science, the humanities, and more followed in quick succession.

Continue reading Let’s Push Trustees to Solve the Adjunct Problem

Cut the Sniping—It’s a Great Book

The sniping has begun about Richard Arum and Josipa Roksa's great new book Academically Adrift. Predictably, people are saying the test instruments used (especially the Collegiate Learning Assessment or CLA but also the National Survey of Student Engagement or NSSE) are imperfect, that the study looked at only a small number of relatively anonymous schools, etc. These complaints have some validity, but the reality is that the higher education community has not collected the data or developed the test instruments that would allow for a broader assessment. Why, for example, don't we have a test of general knowledge, something of an extension of the Adult Civic Literacy Test developed by the Intercollegiate Studies Institute, that is administered widely at the beginning and end of the college careers of students at any institution receiving (or whose students receive) federal grant or loan money? Why aren't the NSSE results published for the hundreds of schools using it? Or, why not at least administer the National Assessment of Educational Progress exam given to 17-year-olds again to 21- or 22-year-olds near the end of their college careers? Higher education has fought transparency and accountability, so researchers have to use the limited information available.

Basically, Arum and Roksa argue that students work little in college and consequently learn little. Most of us who have been in higher education for decades know that this is true, even when we don't want to admit it. But why? You don't have to read very far in Academically Adrift to find the answers. Below is a series of quotes, either from the authors or from sources they cite, one from each of the first 10 pages of the book:

Continue reading Cut the Sniping—It’s a Great Book

The Trouble With Rigor

The big news in higher education last week was the release of findings from Academically Adrift: Limited Learning on College Campuses, a scientific study of how much college students progress intellectually during their four years on campus. Two researchers, Richard Arum, professor of sociology and education at New York University and director of the Education Research Program of the Social Science Research Council, and Josipa Roksa, assistant professor of sociology at the University of Virginia, charted scores on the Collegiate Learning Assessment to determine changes from students' arrival on campus to their senior year. The CLA isn't subject-based, so the study couldn't gauge changes in domain knowledge, but it does aim to measure abstract thinking and analytical "competencies." The results were abysmal. (An excerpt is here.)
For nearly half of the students (45 percent), no significant improvement took place during the first two years of college.
For more than one-third (36 percent), no significant improvement took place over four years’ time. They are, in the authors’ words, “academically adrift.”
Why the poor showing? Several reasons, the authors say.
One, more and more students “report that they spend increasing numbers of hours on nonacademic activities, including working, rather than on studying.” For all too many of them, the classroom is a part-time thing.
Two, more students “enroll in classes that do not require substantial reading or writing assignments.”
Three, they “interact with their professors outside of classrooms rarely, if ever.”
And four, “they define and understand their college experiences as being focused more on social than on academic development.”
The response to Academically Adrift has been voluminous and mixed. One common negative reply is to challenge the methodology (the CLA is limited, etc.), while a common positive reply is to agree and denounce higher ed corruptions. (See here and here and here, and a video of one of the authors here.)
But colleges are in a bind either way. They are under pressure to open access and keep retention rates high. The obvious solution to the low-learning problem—raise standards, assign more reading and writing, increase rigor—might improve test scores, but the other rates will fall. That is, if homework goes up and assignments get more rigorous, dropouts and flunk-outs will rise as well. At the very least, grades will plummet. Reaction will follow. Colleges are under intense pressure to get kids in the door and keep them there. If the retention rate falls, they have a lot of explaining to do in public.
So keep that dilemma in mind. The more you make students work, the fewer students will cross the finish line.

Students Who Learn Little or Nothing

I can't recall a book on higher education that arrived with so much buzz, and drew so much commentary in the first two days after publication. The book is Academically Adrift: Limited Learning on College Campuses, by Richard Arum and Josipa Roksa (University of Chicago Press). Arum is a professor of sociology and education at New York University and Roksa is an assistant professor of sociology at the University of Virginia.
Inside Higher Ed reported on the work yesterday, hailing it, if that’s the right word, as “a damning new book… asserting that many college students graduate without actually learning anything.”
After looking at data from student surveys and transcript analysis of 2,300 students around the country, the authors concluded that 45 percent of students "did not demonstrate any significant improvement in learning" in their first two years of college, and 36 percent showed the same lack of significant progress over four years. Students improved on average only 0.18 standard deviations over the first two years and 0.47 over four years. "What this means," Inside Higher Ed reported, "is that a student who entered college in the 50th percentile of students in his or her cohort would move up to the 68th percentile four years later—but that's the 68th percentile of a new group of freshmen who haven't experienced any college learning."
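For readers curious how a 0.47-standard-deviation gain becomes that 68th-percentile figure, the conversion is just the cumulative normal distribution. Here is a minimal sketch of the arithmetic, assuming the reported gains are treated as effect sizes against a normally distributed freshman cohort (an assumption of this illustration, not a detail spelled out in the study):

    # Sketch of the percentile arithmetic (assumes freshman scores are
    # normally distributed; 0.18 and 0.47 are the gains as reported above).
    from scipy.stats import norm

    gain_two_years = 0.18   # average gain after two years, in standard deviations
    gain_four_years = 0.47  # average gain after four years, in standard deviations

    # A student starting at the 50th percentile (z = 0) who gains 0.47 standard
    # deviations lands at roughly the 68th percentile of the freshman cohort.
    print(round(norm.cdf(gain_two_years) * 100))   # ~57
    print(round(norm.cdf(gain_four_years) * 100))  # ~68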
Since our copy of the book arrived only today, we haven’t finished reading it, but we assume that its huge welcome in educational circles has a lot to do with the many books and articles deploring the lack of study on our campuses, the large number of college grads working at low-level jobs, books arguing that partying is the main activity of a great many collegians, and articles such as Peter Sacks’ here reporting on the all too common disengaged and academically tone deaf college students of today. We will have more to say later about Academically Adrift.