
Is Your University Ready for What AI Is Actually About to Do to It?


Is your university ready for AI's transformative wave? Some institutions are embracing deep integration while others risk falling behind. The future of education is being redefined, and every school needs a plan that goes beyond the syllabus.

The institutions that figure this out first will define the next era of education. The ones that don't may not survive it.

I grew up in education. Not as a metaphor. My father was a high school art teacher. My mother taught English. They worked at the same school, and the rhythms of the academic calendar shaped everything about our family life. I understood instinctively from a young age what it meant to dedicate yourself to helping people learn.

That background never left me. My wife and I are both graduates of Kansas State University. Go Wildcats! We have nine amazing children, ranging from six years old to twenty-two. One of them is in college right now. The youngest ones are growing up in a world where AI is already woven into how they encounter information, how they do homework, and how they think about what school is for. I watch all of this with the particular attention of someone who has education in his DNA and a lot of personal stake in getting this moment right.

So when I sat down recently in a room with university administrators, faculty, business executives, and school superintendents to talk about artificial intelligence and its impact on higher education, I was not just there as an AI consultant. I serve on the advisory board of this Midwest university, which means I have a direct stake in where it goes. And I was there as someone who cares deeply about what happens to the institutions that shaped my family and that are shaping my children right now.

It was one of the most honest conversations I have had in a long time. And it left me with a question I want to put to you directly: does your institution have a plan for AI that goes deeper than adding it to a syllabus?

If the answer is not a confident yes, this article is for you.

The Honest Starting Point

The Midwest university I visited had done something that takes real institutional courage: they told the truth about where they had been.

Like most schools, they started with prevention. When AI tools became publicly accessible, the instinct was to stop students from using them. Strict academic dishonesty policies. Faculty on alert. A posture that treated AI as a threat to be contained.

That phase lasted only a short time, because the reality was obvious to anyone willing to look at it: students were using AI regardless, just as teenagers have always used the tools their parents told them not to use. The question was never going to be whether students would use AI. The question was always what we were going to do about it.

So the university moved. They stopped trying to prevent and started trying to integrate. AI is now a core graduation competency across every applicable course in their College of Management. Their accrediting body has formally endorsed this direction. Faculty are redesigning the curriculum. The conversation in that room was not about whether to embrace AI. It was about how fast and how deep to go.

That is the right conversation. But it is also where the real challenge begins.

The Layer Most Institutions Are Missing

I use an analogy when I talk about AI with organizations. Think of an onion. Most people, most institutions, are operating on the outermost layer: prompt something in, get something out, feel like you are doing AI. It looks impressive. It produces results. And it is barely scratching the surface.

The real work is in moving toward the core.

The schools operating on the surface layer will keep getting surface-level results. The ones building toward the core will have a structural advantage that compounds over time.

The middle layers are where you integrate AI into your actual workflows, use it iteratively, and build genuine institutional fluency. The core is something different entirely: it is a proprietary knowledge system, built around what your institution specifically knows, teaches, and values, that grows continuously and becomes your source of truth.

Most universities do not have this. They have individual faculty members using AI tools in their own ways, disconnected from each other and from any central institutional intelligence. That is surface-layer adoption dressed up as strategy.

I asked the group directly: what operating system does your university have that everyone can access, that reflects what your institution actually knows, and that is continuously growing? The room got quiet. That is the gap worth closing.

The Question AI Is Forcing Education to Finally Answer

Here is the uncomfortable truth that came up in our conversation, and that I find comes up everywhere I go: AI did not create the core problem in education. It just made it impossible to ignore any longer.

I think about my kid in college right now. I think about my six-year-old, who is already comfortable asking a voice assistant questions that I would have had to look up in an encyclopedia as a child. The spectrum of my family, from a first-grader navigating a world fluent in AI to college students who are being asked to figure it out alongside their professors, gives me a front-row seat to every stage of this challenge simultaneously.

What I see across all of them is this: the system was built to measure output, not understanding. Write the paper. Answer the question. Meet the rubric criteria. Students became skilled at producing the right-looking output. My parents spent their careers trying to teach beyond that instinct, trying to get students to genuinely engage with art, with language, with ideas. The system made it hard. AI has now made the gap between the two completely undeniable.

When a student can generate a perfect-scoring paper in thirty seconds, the assessment architecture built around output production collapses entirely. The question my parents wrestled with in their classrooms for decades, what are we actually measuring here, suddenly becomes the most urgent question in higher education.

One professor at this university figured it out the hard way. He noticed that every student in his graduate finance course was scoring 25 out of 25 on discussion posts. Every single one. He knew what was happening. He could not prove it, but he knew. So he redesigned the assignment to require documented AI use, visible prompting, and genuine revision. The next round came back with an average score of 17 out of 25. When he walked into the dean's office to share the news, the dean's reaction was not disappointment. It was relief.

A class where everyone scores perfectly is a class where the assessment is not working. The 17 average was not a failure. It was the first honest signal this professor had gotten about where his students actually stood.

That story matters beyond the classroom. It is a design principle. If your assessment can be fully completed by AI with no student engagement required, you are not measuring learning. You are measuring compliance with a format. The redesign does not have to be dramatic. Require process documentation. Require verbal defense. Require the student to explain their reasoning live. Create the friction that separates real understanding from polished output.

This Is Not the First Time We Have Been Here

I want to offer a frame that I find genuinely helpful when the pace of change feels overwhelming. We have been here before.

When Henry Ford introduced the assembly line and eventually robotics replaced human manufacturing labor, the fear was that workers would become obsolete. They did not. They moved to the next layer. New roles emerged that did not exist before. The economy adapted. Humans adapted with it.

In that same room, someone mentioned they were not allowed to use calculators in their college calculus courses. Calculators existed. Students had them. Bringing one to the exam was prohibited because the concern was that students would stop learning the underlying mathematics. That restriction had a measurable cost: a high dropout rate from courses that created unnecessary barriers.

My wife and I both lived through the early transitions of technology in the classroom and the institutional hesitation that surrounded each one. I watched my parents navigate those same moments from the teacher's side of the desk. The pattern is consistent across every generation: the tool arrives, the instinct is to restrict it, the restriction eventually gives way, and the students who thrive are the ones whose educators help them engage with the tool thoughtfully rather than avoid it entirely.

The question worth sitting with today is: which of the AI restrictions we are imposing right now will the next generation look back on the way we look back at the no-calculator rule? The point is not that every restriction is wrong. The point is that the instinct to restrict needs to be grounded in genuine pedagogical reasoning, not unfamiliarity and fear. And the institutions that move through that fear fastest will define what comes next.

What the University of the Future Actually Looks Like

This is where I want to be both honest and hopeful, because I think the future of universities is genuinely bright for the ones willing to evolve. But the evolution required is more fundamental than most institutions are currently planning for.

The university I support through my advisory board service is already deploying AI agents to run operational functions. A class scheduling optimization that used to take a registrar months was completed by an AI agent in three days. The humans reviewed it and validated it before implementation. A 24/7 service model for military students, powered by AI agents, is moving toward deployment. These are not experiments. They are operational shifts.

The question that follows is one I put directly to the room: if your university is using AI agents to run its operations, are your students being prepared to work in organizations that operate the same way? Because they will. Every significant organization your graduates enter over the next decade will be running some version of this model. The students who understand how agentic AI works at an organizational level, who can direct it, evaluate it, and take ownership of its outputs, will have a material advantage over those who only ever saw AI as a writing tool.

That is the graduate the market is moving toward. Not someone who can use AI. Someone who can supervise it, challenge it, and build with it. The distinction matters more than most curriculum designers currently recognize.

What I Would Tell Every University Administrator Reading This

I am not here to tell you the sky is falling. I have seen enough technology transitions to know that the institutions and individuals who approach change with curiosity rather than fear consistently come out ahead. But I do want to be direct about what I think the path forward requires.

Stop treating AI as a topic to be covered in one course. It is infrastructure. It belongs woven into everything your institution does, from how you teach to how you operate to how you measure success.

Build toward the core. The surface layer of AI adoption is accessible to everyone, which means it is a competitive advantage for no one. The institutions that build proprietary knowledge systems, that develop genuine AI fluency at every level of the organization, are the ones that will lead.

Redesign what you measure. If your assessments can be completed by AI without student engagement, they are no longer measuring what you think they are. The shift from measuring output to measuring understanding is not easy, but it is necessary, and it is also an opportunity to build something more meaningful than what existed before.

Make your own AI use visible to students. If your institution is using AI to optimize operations, say so. Show students what it looks like in practice. The best case study you have access to is the one happening inside your own walls.

Where This Is All Headed

Sitting in that room, I left with something I did not fully expect: genuine optimism. Not because the path is easy, but because the people in that conversation were asking the right questions. They were not pretending the problem was smaller than it is. They were not clinging to the way things were. They were doing what the best educators have always done: trying to figure out, honestly and collaboratively, what it means to prepare people for the world they are actually going to live in.

I think about my parents standing in front of classrooms for decades, trying to help teenagers find themselves in a painting or a paragraph. I think about my kid in college right now, navigating institutions that are genuinely trying to figure this out in real time alongside them. I think about my six-year-old, who will walk into a higher education system in twelve years that we are building the foundation for today.

That is what makes this moment feel personal to me. It is not abstract. It is my family. It is probably your family too.

We have been through transitions like this before. Every generation has a moment where the rules change fundamentally. The people who figure out what the new rules are and help others navigate them with clarity and compassion are the ones who matter most in what comes next. My parents did that work for their students. The administrators and faculty I sat with are trying to do it now.

The institutions that figure this out first will not just survive this moment. They will define what education looks like on the other side of it. And the students whose schools got it right will carry that advantage for the rest of their lives.

That work is already underway. The question is whether your institution is leading it or waiting to follow.

Created with ❤️ by humans + AI assistance 🤖