
Dear Students,
As students, you are constantly evaluated—sometimes explicitly, sometimes implicitly. You are evaluated to decide whether you will be admitted to a university. You are evaluated every time you turn in an assignment, make a comment in class, or interact with group members. Whether one likes this approach to higher education or not, it is the nature of the beast. You are evaluated. Constantly. (And to be fair to colleges and universities, employers and coworkers will constantly evaluate you, too.)
And they don’t just evaluate you by an absolute or independent standard. They evaluate you relative to your classmates. Did you write a better paper than the students around you? Is your presentation better put together? Are you more on the ball than others? This is partly because human evaluation is generally relative—is she prettier or smarter than others around her? Is he funnier and taller? Our social lives are replete with evaluations of people relative to their peers and competitors. Colleges and universities do this, too, sorting stronger and weaker students into different piles: at the level of the university or college itself, the nature of the school and major, whether you are graded on a curve, and so forth.
Since evaluation is relative, and large language models (LLMs) like ChatGPT, Grok, and other artificial intelligence (AI) chatbots exist, college writing has become a poor indicator of talent and hard work—it has become low rent. This is unfortunate but true. Your essay or research paper no longer signals that you are intelligent and hardworking relative to your classmates. Why? Because anybody with access to an LLM and the assignment can produce a decent college essay, with increasing sophistication as AI improves. Even a person who knows little to nothing about the class can do this. So, college essays are now low rent—they are a cheap signal of anything that higher education does or should care about. College essays, admittedly, were never an amazing measure of collegiate potential, but their value in this respect has plummeted to zero over the last couple of years.
So why does that matter? It is partly because the university or college you attend needs a reliable indicator that is hard to fake and shows that you understand the class material. Likewise, employers want to know that the degree you earned reflects your understanding. Copying, pasting, or uploading essay instructions into an LLM, and then turning the output in as your own, is no reflection of your academic ability and virtue. As Michael Spence, the economist who pioneered work on the signaling function of higher education, explains:
The signal, to be effective, must be one which the employer can observe, but which is more easily or less expensively obtained by high-productivity individuals than by low-productivity individuals. Education serves as such a signal—not because it necessarily increases productivity, but because it is a costly activity that high-productivity individuals find easier to complete.
College writing no longer fulfills that role, thanks to LLMs and AI. If anybody with access to an LLM can turn in a passable paper, then it is no longer a hard-to-fake signal—unlike, say, lifting two hundred pounds over one’s head, which is hard, if not impossible, without the needed muscle. So, what is higher education to do? It must adapt or go extinct. It may resort to more in-person testing and exams, though this is harder than many who have never taught a college class appreciate. It may use LLMs as tutors, or as blank slates for students to teach basic concepts. And so forth. Whatever path higher education takes, it must teach students—without the wrong kind of deskilling—how to use LLMs effectively without undercutting their value as employees and citizens.
So we now have a handle on the problem. What can you, the student, do about it?
Do not forget—or fail to learn—how to write. Just because a device or computer can do something better than you hardly means that you should cease to do it, or never learn it to begin with. Such a conclusion would, sometimes, be a mistake. It is true that sometimes technology renders a skill obsolete. Other times, it requires retaining the skill to use the technology better. For example, modern farmers have shed the knowledge of how to operate a horse-drawn plough, but still retain a deep knowledge of soil quality, rain and weather patterns, seasons, and crop yields. In the latter case, the technology amplifies the knowledge of the farmer. Writing skills are more like the latter case than the former—though perhaps only in select cases—and those skills can become even more productive when augmented with LLMs.
Strong writing skills sharpen your thinking and clarity, and give you an edge over those who fully offload their writing to AI and LLMs. These skills also help you identify flawed writing and reasoning—whether from people or AI—and use LLMs effectively to enhance your work. Mastering LLMs is key, as many mistakenly treat them like search engines rather than iterative tools for inquiry. Higher education must urgently integrate AI and LLMs in ways that empower students, not deskill them.
Because of this, I am redesigning my own classes to better leverage LLMs and AI within the classroom to empower you, the student, rather than to undercut and deskill your abilities. The idea will be to chart a course, as philosophers, using LLMs and AI to better understand deep and important philosophical questions. We can use LLMs to help us become better philosophers by using such tools to find research, devise examples and counterexamples to arguments, generate different versions of sentences and paragraphs, reformat in different styles, and proofread drafts. At the end of the day, LLMs and AI are merely tools to use in the exploration of philosophy—they possess neither a life full of experiences nor the existential angst underlying questions of God, ethics, death and the afterlife, society and politics, art, and so forth. Maybe they will someday, but for now, philosophy is an exclusively human enterprise aimed at fundamental and meaningful questions—we must proceed treating AI as a tool, rather than as a replacement.
To accomplish this and other pedagogical goals in the philosophy classroom, though, I need your help. I cannot do this alone. And neither can you.
Take care of yourselves and each other,
Jimmy Alfonso Licon
Follow Jimmy Alfonso Licon on X.
Image: “Journaling Over Coffee” by Hannah Olinger on Unsplash