Are the Liberal Arts a perfect background for the age of AI?
For twenty years, parents looked at the staggering cost of college and did what seemed rational. They steered their kids away from the humanities and toward "sure things." Engineering. Computer science. Business. Anything that guarantees a job.
The kids felt it too. Pick a lane. Specialize early. Don't waste tuition money on poetry.
Then I read an interview with Demis Hassabis, the CEO of Google DeepMind, in which he said: "Learning how to learn will be the most important skill of the next generation."
And I sat there thinking: wait, isn't that exactly what a liberal arts education teaches?
Not coding. Not frameworks. Just the capacity to learn, unlearn, and learn again. To stay curious. To connect ideas across disciplines.
I got my B.A. decades ago. I majored in English, but I also dove into the arts, sciences, social sciences, and languages. As I've explored AI this year, something clicked: my training kicked in almost instantly, and a few specific patterns keep showing up.
Adaptability is an advantage. People who studied broadly seem more comfortable in this shifting landscape. Liberal arts programs build that adaptability implicitly. You read philosophy, pivot to statistical analysis, then argue about art. You develop pattern recognition across disciplines. You learn to think laterally.
I'm finding this matters with AI because the value isn't in the output. It's in the question. When you bring context from multiple fields, you can ask the machine to connect dots it can't imagine on its own. Range beats depth when the landscape keeps shifting.
Questioning feels more important than ever. One thing the liberal arts definitely teach: how to poke at confident answers, to ask "Is this actually true?" even when everyone nods along.
That habit of questioning shows up in how I work with AI. It echoes what Dan Rockmore told me in our serendipitous meeting at a winery: the people who'll thrive with AI will be those with the confidence to push back.
Taste and cultural depth make a difference. Ethan Mollick wrote something in Co-Intelligence that stuck with me: "To get AI to do unique things, you need to understand parts of the culture more deeply than everyone else using the same AI systems."
I see this in my own work. When I'm directing AI on creative projects, all those years of studying how humans make meaning can matter more than technical skill.
The market is sensing this shift. When I see OpenAI posting content writer jobs with a $300k salary, I can tell they’ve figured this out. They know what their machine can do. They know what human storytellers can do. They’re investing in the humans.
Humans still need to ask the ethical questions. AI optimizes for whatever you tell it to optimize for. Humans have to supply the "should we?" after the "can we?" The liberal arts teach you to think about fairness, perspective, the long view.
AI can do a lot of things, but it can't be a well-rounded human being. Watching AI companies realize what their machines can't do, and pay handsomely for humans who can, makes me think liberal arts grads are in a pretty good spot.
Other companies will catch up eventually. They'll figure out what the AI builders already know.
Those "impractical" degrees are looking surprisingly practical.