More Than the Code
The two earlier posts in this series — New Myths for Old and After the Buggy Whip — argued that LLMs are not substitutes for the experienced people whose tacit knowledge keeps systems running and safe, and that the field of computing is being reshaped rather than ended. So, you may ask, if computing is not going away, and the people who do it well remain hard to replace, what is the best preparation for a career in it?
The answer is awkward, given the current direction of higher-education budgeting and state-level academic policy: the right preparation includes a substantial helping of exactly what is being cut.
Picture an undergraduate two weeks into sophomore year, looking at her course plan. She is in computer science, a field she has come to love. She had planned to add a minor in sociology, or a second major in music or history. Now she is reconsidering. Her parents worry about the job market and think anything outside her major is a distraction. Her engineering friends tell her nothing outside the major will help her get hired, and that CS itself will soon be replaced by AI. She no longer knows what to believe. The minor goes.
That scenario will play out on many campuses next term, paralleling an argument older than computing. John Henry Newman's The Idea of a University, drawn from lectures first delivered in 1852 and developed across the following two decades, held that a university's purpose is the formation of intellectual habit — independent judgment, the capacity to weigh evidence and pursue truth — not the production of trained specialists. Half a century later, John Dewey's pragmatism placed a different emphasis: knowledge is inseparable from doing, and education should equip people to work, vote, and live as citizens. The American land-grant tradition that produced Purdue and scores of other public universities split the difference. The 1862 Morrill Act called for "the liberal and practical education of the industrial classes" — agricultural and mechanical instruction paired with broader study. These are not opposing camps. They are different facets of the same education.
Those facets have always been in tension over budget and class hours, but until recently, few seriously proposed cutting one to subsidize the other. The present is different. Amid budget shortfalls, demographic decline, and political pressure to demonstrate immediate "workforce alignment," universities are reducing breadth to fund a narrower depth. Increasingly, languages, liberal arts, and fine arts are being eliminated as majors at one regional public university after another. The justification offered is invariably economic.
Higher education also faces a non-economic pressure. It has long attracted political hostility because it encourages thought and argument that may run counter to the preferences of those in power. That hostility has intensified in the US in recent years: consider state-level prohibitions on teaching specified topics, federal cuts to research on subjects deemed inappropriate, and proposed reductions to the National Endowment for the Humanities. The fields most affected are the same fields being cut for stated economic reasons. The two pressures reinforce each other, and students lose access to the same coursework regardless of which justification is offered.
The most-cited recent case is West Virginia University, which in September 2023 eliminated 28 majors and 143 faculty positions to close a $45 million budget gap, including most of its world-language degrees. WVU was not the only one. Among others:
- Clarkson University announced in late 2023 that it would phase out the nine majors in its School of Arts & Sciences — history, sociology, political science, literature, film, and others — and refocus on STEM.
- In November 2024, Boston University suspended PhD admissions in a dozen humanities and social-science programs, including English, history, philosophy, classical studies, and sociology.
- In April 2026, Syracuse University announced the elimination or suspension of 93 academic programs, most of them in the humanities and arts: classics, fine arts, painting, music composition and performance, several foreign languages, and digital humanities. Syracuse's provost framed the decision as alignment with student demand rather than budget. The fields cut are the same.
The pattern is concentrated at regional public universities and at private institutions serving non-elite students: first-generation, rural, working adults, and others for whom a broad college education has historically been a step up in society. Wealthy private universities continue to staff full humanities and arts faculties; their students will read Rawls and Dostoyevsky, listen to Brahms and Glass, study Rembrandt and Mondrian, and learn Latin and Mandarin. The students who lose access are the ones paying for an education that no longer offers what their wealthier peers still take for granted.
Against this backdrop, one might ask: what does a computing professional actually need from outside computing? The list is shorter than the curriculum, but each item is best learned early, alongside the technical skills, when the habits take hold and last.
- Communication, written and spoken, technical and lay. Most consequential decisions in cybersecurity, and in computing more broadly, are made in writing — in an executive summary, an after-action report, or a board memo — and succeed only when the reader comes away understanding the context, why it matters, and what comes next. Policy review and design review rely on the same skill set. The NACE Job Outlook 2025 survey found problem-solving rated essential by nearly 90 percent of employers, teamwork by more than 80 percent, and written communication by close to 70 percent; employers also consistently report new graduates underperforming on those competencies. The WEF Future of Jobs 2025 report puts analytical thinking first; creative thinking, resilience, leadership, and motivation round out the top five. Those capacities are cultivated more reliably in a seminar that requires you to read a difficult text and argue about it than in any programming course.
- Ethics and the recognition of harm. The ACM Code of Ethics, which I have cited repeatedly, is explicit on the duty to anticipate and avoid harm. Anticipating harm requires moral imagination, historical perspective, and the willingness to consider who is on the other side of the system being built — none of which the Code itself can teach. Those capacities are cultivated by the disciplines under threat.
- Social and historical literacy. Computing systems are deployed into societies, not vacuums. Knowing how a labor market works, why a community distrusts the agency rolling out a new platform, what a free-speech tradition has meant in practice for two centuries, how an artistic movement broke from the one before — these are not decorations. They shape whether a system gets used, whether it is fought, and whether the decisions it automates produce the outcomes its designers intended.
- Argument and interpretation. A senior engineer reads ambiguous evidence — a half-confirmed indicator of compromise, an inconclusive postmortem, a policy proposal with consequences that depend on contested assumptions — and reaches a defensible reading. That is the same skill English majors practice on poems, art historians on paintings, and historians on archives. It is not a coincidence that people who do well in computing tend to have studied widely in other fields.
A defender of the cuts will object that AI now reproduces the humanities. LLMs can produce essays, sermons, and sonnets that read fluently. Image models generate plausible canvases in any style on demand. Audio models compose passable tunes. There is no shortage of seemingly new artistic material being produced by machines.
That output is mechanical novelty, not innovation. An LLM trained on published novels can recombine plot, voice, and image into an arrangement that has never previously existed, but the arrangement will be derivative by definition. Real innovation in any of these fields has historically meant that a person, shaped by long reading, looking, or listening, has arrived at a way of working that breaks with what came before. Atwood did not interpolate from Dickens, Mondrian from Vermeer, Coltrane from Brahms. Each had to live with the prior tradition and then go somewhere it had not been. That is not the operation an AI model performs.
The formative experience that produces such people produces the judgment that is now being hired to backstop AI. Reading complex material longer than a text message, writing under critique, arguing in a seminar, defending a reading against a serious objection — these are how the capacity to weigh and choose under uncertainty is built. The polished surface an AI tool produces is not a substitute; it is exactly the kind of plausible artifact that practitioners with that judgment are needed to evaluate. As I argued in New Myths for Old, an LLM is a statistical interpolator over a frozen corpus. It is good at recombining what has been written and poor at handling the unfamiliar. That is true for cybersecurity decisions; it is no less true for art.
The clearest signal of what AI development actually requires comes from the firms building those systems. The major AI labs have hired philosophers, anthropologists, and ethicists to work alongside their engineers on safety, policy, and societal-impact questions. Anthropic, for example, brought on the philosopher Amanda Askell to help define the values its models express, and the moral philosopher Peter Railton to work on training in ethics. Google DeepMind and OpenAI have parallel programs. The pundit class may claim that AI will replace humanists; the labs themselves are hiring humanists to constrain what their models do.
The strongest computing programs are structured along the same logic. Purdue's undergraduate BA in Artificial Intelligence is built jointly by the Departments of Philosophy and Computer Science — by design, not as an accommodation. The degree pairs twenty-four credit hours of philosophy with fifteen of CS, and Purdue also offers a CS+Philosophy dual-degree program. Stanford's Institute for Human-Centered AI places philosophers and ethicists alongside engineers as co-equal members. MIT's Schwarzman College of Computing has a mandate to integrate computing with the social sciences and humanities. Carnegie Mellon's Department of Philosophy and School of Computer Science offer joint undergraduate paths — including a longstanding Logic and Computation program — that pair the two fields by design. The institutions best positioned to define computing for the next generation are treating philosophy and the humanities as core to the work, not adjacent to it.
The fine arts have made a less-discussed contribution to computing, one that no AI tool yet produces on its own: the difference between the merely usable and the joy to use. Susan Kare, an artist with a BA in art from Mount Holyoke and a Ph.D. in fine art from NYU, drew the original, beloved Macintosh icons. Jim Reekes, an audio designer, composed the Mac startup chord on a Korg Wavestation in his living room, with "A Day in the Life" on his mind. Jony Ive, trained in industrial design and now chancellor of the Royal College of Art, shaped the look of every Apple product from the iMac through the iPhone. Musician Brian Eno composed the Windows 95 startup sound, which was added to the Library of Congress's National Recording Registry in 2025. Engineers built the machines; artists made them into objects people wanted to use.
My own undergraduate degree was a double major in math and computer science with a minor in philosophy. The faculty designed our program to emphasize interdisciplinary study and extensive research and writing, even though both majors were technical. My graduate work was in computing throughout, with psychology as a minor area. Some of what I learned in those non-technical courses I did not fully appreciate at the time. I have drawn on it in every decade since, and I believe it has contributed to my success.
The INSC graduate program in information security at Purdue — the world's first cybersecurity graduate degree program, which I founded in 2000 and directed for twenty-five years — requires coursework in ethics and technology policy. Not as enrichment, but because a security graduate who cannot reason about consequences and policy is not prepared. One reason our graduates have been highly valued in industry, government, and academia is that they arrive with that range. A narrowly trained graduate is at a disadvantage in the rooms where decisions are made.
If you are a student reading this and weighing what to take next, choose your electives deliberately. Take writing-intensive courses in which the writing is critiqued. Take a course that requires you to argue and be argued with. Consider a second major or minor in humanities, social science, or fine arts — not as decoration, but because the half-life of any particular framework or tool is shorter than a career, while the half-life of literacy, judgment, and the capacity to read a room is the rest of your working life. The capacities that make you human are the capacities that will not become obsolete with the next model release.
And there is a part of this argument that the career framing misses. Liberal-arts and fine-arts studies are not only a hedge against technical obsolescence. They are the foundation for the rest of life: for understanding what you read, watch, and listen to, for participating as a citizen, for making sense of grief and joy, for sustaining good conversation across a long life with people who matter. They are, eventually, what fills your time after your career ends. A person trained only for technical work reaches retirement with little left to draw on. The same person trained to read history, hear music, and follow an argument carries that training into a wider life.
The tasks that will still need people twenty years from now are the ones that require people to be people: to judge under uncertainty, to argue with care, to understand who is on the other side of the system. The life that will still be worth living when that work is done is the life one was prepared for outside the office. A university that strips its humanities and fine-arts programs to fund another technical certificate is preparing graduates who will be obsolete in their work and impoverished outside it — and thus failing to deliver on the promises on which higher education was founded.
(A few portions of this text were drafted and structured with the assistance of Anthropic Claude Opus 4.7; the ideas, arguments, and final editorial decisions are the author's.)


