
In a recent Chronicle of Higher Education essay, philosopher Megan Fritts offered a stark meditation on her experience serving on a university artificial intelligence (AI) committee: “If this robot can write these books better than we can,” she writes, “then what are we doing here?” The question lingers not only for humanities professors defending their syllabi but also for institutions struggling to articulate their purpose in the age of AI.
Fritts’s essay is piercing, elegiac, and deeply human. She argues that AI threatens more than student integrity. It risks alienating us from the very language that forms our inner selves. Her proposed remedy is radical containment: the creation of classrooms where AI tools are not just banned but made socially obsolete, a cultural faux pas in sacred spaces of human expression.
I understand this impulse. But I have come to believe the opposite strategy is more urgent.
If AI is indeed an existential challenge to academia, then retreating from it may only hasten the erosion Fritts fears. What is needed instead is radical collaboration among disciplines, mindsets, and methodologies. AI challenges how we define knowledge and expertise. To preserve what is human in higher education, we must move not inward, but across.
At Babson College in Massachusetts, we’ve taken up that challenge through “The Generator,” our interdisciplinary AI lab. Launched by a cross-disciplinary faculty team, The Generator brings together expertise from entrepreneurship, philosophy, machine learning, writing, theater, strategy, IT, and innovation leadership. It explores AI not as a technical add-on but as a fundamentally integrative force.
Rather than locating AI within a single department, The Generator is organized around eight faculty-led groups, each focused on themes such as ethics, entrepreneurship, experiential learning, and arts and performance. These groups don’t operate in isolation; they collaborate to identify shared questions, co-design teaching innovations, and model experimental, team-based inquiry. Through our AI Teaching Training Program, more than half of Babson’s faculty have participated in discussions and development related to AI in their own disciplines.
New courses have emerged from this work. “AI Pioneers” weaves together machine learning, entrepreneurship, and risk ethics. “Writing with Robots” invites students to explore what it means to write with, against, and around machines. These aren’t just new content offerings—they represent new pedagogies that blur disciplinary lines and emphasize co-creation.
Students, too, are collaborators. They participate in AI buildathons, help design tools like our Prototype-It! Bot for entrepreneurial prototyping, and engage with faculty in co-learning partnerships. Rather than receiving AI knowledge passively, they become leaders and contributors to its development, implementation, and critique.
Unlike earlier technologies that disrupted discrete industries, AI destabilizes boundaries across the board: between disciplines, between creators and tools, and between learning and labor. Destabilization is disorienting, certainly, but also fertile. The most compelling responses to AI emerge when computer scientists work with ethicists, when entrepreneurs co-teach with artists, when students code alongside poets.
This is where we align with Fritts. Higher education’s traditional structures aren’t equipped for the scale and speed of this shift. But where she proposes tighter guardrails around the humanities classroom, we argue for new bridges—linking disciplines in shared inquiry about how AI is reshaping our collective future.
Higher education plays a vital role in shaping how society engages with AI. It models what collaborative, future-focused education can look like. Colleges and universities are uniquely positioned to convene expertise across domains, prioritize long-term thinking over short-term gain, and prepare students to lead in a world shaped by rapid technological change.
But realizing that potential requires a shift in how institutions see themselves. Rather than acting as content providers or career training hubs, we need to view ourselves as ecosystems for inquiry, adaptation, and innovation. The Generator is one such example, but the path forward won’t look the same everywhere. Some institutions may establish cross-departmental AI centers, while others may embed interdisciplinary collaboration into curriculum reform, faculty development, or strategic planning.
What matters most is institutional commitment. AI is too complex, too consequential, and too dynamic to be left to ad hoc efforts or siloed experimentation. It demands coordinated action, shared vision, and structures that enable faculty and students to work across traditional boundaries.
Fritts writes movingly about the honest B or C student who struggles to express themselves but tries anyway. We, too, treasure that student. But we don’t believe they need a tech-free zone to flourish. We believe they need a collaborative space where their awkward, unfinished voice is taken seriously—where they can use, question, and sometimes resist AI in conversation with others, not alone.
To defend the humanities, we don’t need to retreat from the future; we need to help shape it. That means showing up to interdisciplinary conversations not as the moral conscience of the university but as co-designers of its next chapter. It means naming what’s at stake, including how the products we design affect the people we become. And it means building the structures that make collaboration possible across curriculum, research, and leadership.
Fritts asks what we’re doing here. I think we’re doing something essential. We respond to change and model how to live, think, and learn together in uncertain times. That is a human endeavor, and it demands more from us than containment. It demands collaboration.
Image of Babson by Aziz bm on Wikimedia Commons