
A traditional four-year education has historically been advertised to encompass two key components: knowledge formation and practical application.
Critical thinking and intellectual debate—values with intrinsic worth that quench one’s thirst for knowledge and foster analytical assessment in daily life—form the base of academia: knowledge formation. On top of this stands practical application—expertise and skills that are used in the workforce and are directly applicable to a student’s chosen major of study. The goal at graduation is for students to walk away with both the wisdom that comes with intellectual study and the tools to contribute to society through their chosen field.
But is higher education truly fulfilling what it advertises?
In February 2025, Hult International Business School released the results of a survey it sponsored, examining the extent to which traditional four-year degrees prepare young adults to enter the workforce.
Surveying a cohort of 800 HR executives and 800 young professionals between the ages of 22 and 27, the report examined how the graduates, and the executives evaluating them, felt their degrees shaped their overall workplace skillset and equipped them for the demands of the workforce.
[RELATED: Degrees Have Value—But Employers Shouldn’t Require Them]
From the student perspective, more than three-quarters felt the first six months of their post-graduation job taught them more than their entire four years at university. Only a quarter reported feeling prepared with the skills necessary to adequately fulfill the demands of their position. Ultimately, the overarching consensus was a desire for better workplace preparation in university.
HR executives shared a similar sentiment in their evaluation of entry-level employees fresh out of college. They noted significant problems with the applicant pool, including a lack of professional experience, or of the right skills, and weak interpersonal abilities such as adaptability, collaboration, and workplace etiquette.
Around 98 percent of these executives blamed the colleges for these gaps.
It is important to note that although the report did not clearly specify the degrees obtained by these recent graduates, it is implied that these students obtained a degree in business or a related field, such as communications.
Despite the study’s limited scope, its findings hold significance for the broader university population, revealing a clear deficiency in how the university system encourages and develops applicable expertise within the talent pool.
From my own experience, having studied science at a liberal arts college, I was fortunate to receive exactly what four-year universities advertise: knowledge formation combined with workforce skill development.
With a foundation in liberal arts education as an undergraduate, I gained a broad base of knowledge through courses spanning the humanities, civics, art, and global studies. As I noted in a previous article, “What Good Are the Liberal Arts? Ask a Scientist,” these classes expanded my understanding of the world and deepened my love of writing—eventually leading me to publish creative work and to my current position at Minding the Campus.
Beyond academics, I was equally immersed in learning practical skills associated with my major.
For science majors, the practical application of scientific principles is already naturally taught in laboratory classes. Besides this, however, I was strongly urged by professors and advisors to seek out institutional work opportunities to advance my skills, leading me to serve in positions ranging from laboratory technician to teaching assistant to peer tutor.
Having chosen to continue with my master’s at the same university, I discovered that serving in these positions—and, therefore, networking throughout the departments—fortuitously opened doors to paths I never thought possible. Though I am only halfway through my graduate degree, I have already had the privilege to teach laboratory class material solo, write and secure my own research grant, and co-coordinate our institution’s Interdisciplinary Research Conference. Most recently, I have been invited to join the editorial board for the conference’s proceedings publication.
I recognize how rare these experiences are, and they have undoubtedly strengthened my career prospects. All of this came from continued opportunities to practice translating my academic knowledge into applied products: scholarly outputs, meticulously prepared laboratory materials, concise scientific lectures, and more.
Not all students are fortunate enough to receive both the knowledge-building and practical-application sides of higher education, not even at my own university.
As both an undergraduate student and now a graduate teaching assistant, I’ve observed students receive little guidance on how to balance academic success with practical experience. I’ve also listened to many express apprehension that work experience might mar their academic record.
Students with 4.0 GPAs have personally told me they hesitate to pursue experiential learning for fear it would tarnish their transcripts. Even those at the top of their class!
Understanding the theory behind the practice, as set forth by intellectual development, is extremely valuable, but it isn’t enough. Experience during college is the missing puzzle piece: it bridges the university experience with fulfilling one’s calling in the world and delivers on the promises of higher education’s core mission.
In my June article on unemployment among computer science majors, I demonstrated that gaps clearly exist in the computer science curricula offered by universities. The essential skills that real-world experiences provide not only fill in these gaps but also help students discover their passion in the broad field of computer science. This develops the students’ skillset towards specialized roles in software engineering or game development before they even graduate. Otherwise, the student is often left with the knowledge provided by a generalized computer science degree but is unable to apply this knowledge to contribute to the world.
This concept extends far beyond computer science. Beneficial opportunities exist for developing applied expertise in nearly every major.
English majors can apply their writing and proofreading skills through hands-on experience in journalism, content writing, assistant editorial positions, and more. History majors can apply their knowledge in archival work or positions in museums. Psychology majors can work a wide variety of different jobs, ranging from counseling internships to behavioral research to transcranial magnetic stimulation (TMS) technician positions.
These examples are only the beginning.
And yes, it’s true: career centers and academic advisors exist to guide students toward these opportunities.
However.
Career center attendance is voluntary, and academic advising is inconsistent. Students still fall through the cracks; I have personally witnessed it. And if this blatantly occurs at a small private school of around 3,000 students, how much more often must it happen at a large state school, where students are reduced to a number and their professors don’t even know their names?
Universities were founded as institutions of learning. University students must engage in academic pursuits—reading, writing, calculations, critical thinking, and logical reasoning—which are essential for navigating and actively engaging in daily life. This is the core of academia.
But universities also offer majors. Majors tailor education to students’ inborn talents and prepare them to contribute to society. And without the addition of experiential learning in these tailored fields, students lack the holistic educational experience that tertiary education markets.
[RELATED: Higher Education Fuels Corporate Profits at the Expense of American Workers]
The little piece of paper called a degree gives fresh graduates a sense of hubris. That is, until they are confronted with the reality that their university has not fulfilled all it promised: it has not prepared them for the workforce.
Experiential learning needs to be better supported throughout the entire academic institution. Career centers serve as the bridge between university students and company partnerships; therefore, they should take a more active role in every student’s academic journey, not simply for the students who voluntarily choose to visit them. One way to accomplish this is to have a career advisor work alongside a faculty advisor, giving each student a mentor for both the experiential and the academic elements of higher education.
Furthermore, all majors should include at least one class centered on an independent project or internship, allowing students to put their field of study into practice. Although many majors already do, I have spoken with students who shared that theirs has no such requirement for graduation.
Ultimately, all students should be helped to put their GPAs into perspective and steered toward balancing them with applied learning. In future pieces, I will explore specific avenues a student can take to obtain these opportunities and round out their college experience.
One thing is clear. In the world of higher education, practice without knowledge risks leaving students unprepared for the world. But knowledge without practice risks leaving students unprepared for the workforce. University graduates must have both.
See more by Hannah Hutchins on Muck Rack.
Image by Zac on Adobe; Asset ID#: 1612758305
Hannah — the concept of a “research university” as you know it was a product of the 50 years war (1941-1991) when academia helped us defeat first the National Socialists and then the Communists.
The concept of the “university” as you know it is largely a product of pre-Hitler Germany, what was developed there at the turn of the 20th Century and what we then adopted, the so-called “German Influence.”
The vocational education you describe was first the Normal Schools (teachers’ colleges) and then the Land Grant Colleges (Scientific Agriculture and Mechanical Arts), which were publicly (taxpayer) subsidized. The liberal arts colleges were something else entirely.
Most, not all, started as divinity schools of sorts — Harvard to train ministers, Yale because the Connecticut ministers didn’t like the ones that Harvard was producing so they decided to train their own. Denomination doesn’t quite mean what it once did (e.g. Mary Dyer) and a lot of liberal arts colleges started as “me-too” efforts — the denomination wanting to train its own clergy and educate its own elite.
Educating its own elite is important here — the primary purpose of the liberal arts college soon became to create “gentlemen” who could then be taught how to run the business. For a good example of this, see the 1983 movie “Trading Places,” where an upper-class commodities broker gets replaced with a street hustler. It also touches on racism, but it is really about how the Ivy League degree is more an issue of class/caste.
That said, I would read Allan Bloom’s _The Closing of the American Mind_ — and remember that the students he discusses were your parents’ generation, and that he was writing in a time (the mid-’80s) when the thought of advertisements for condoms was unthinkable.
And then — while realizing he was expressing 19th-century Catholic theology that I don’t think the Vatican completely agrees with today — read some of the speeches in John Henry Newman’s _The Idea of a University_, particularly his concept of a universal knowledge of humanity that needs to be passed on to the next generation.
And then re-read Bloom.
The real issue is that we have made two mistakes — first the presumption that everyone needs (or wants, or would benefit from) a liberal education, and then that said liberal education will directly transfer into a lucrative job.
If you can get past the sexism of an earlier era and realize that successful marriages were/are partnerships and that the business executive of the 1950s got there in no small part because of his wife, look at the value instilled by the Seven Sisters. Why were their graduates (or even attendees, e.g. Barbara Bush) more valuable as wives?
They knew music — even playing a piano was a valued skill on January nights a century ago when it got dark early and there was no TV (or internet). They’d been to Europe and could talk about what they’d seen. They knew literature, etc, etc, etc.
This is the same thing that the business world (and the military) saw in male liberal arts graduates. The culture, the “small ‘l’” liberal arts. And a skill set that could be adapted to meet new and unforeseen problems.