We talk about the AI revolution in higher education as a curriculum problem. Faculty committees form to discuss generative AI literacy. Provosts convene task forces on academic integrity in the age of ChatGPT. Trustees ask uncomfortable questions about whether the liberal arts degree still holds value. These conversations matter. But they miss the most urgent workforce reality sitting in front of university leadership right now: AI is already reshaping what universities need from their own workforce, and most institutions are not strategically positioned to manage that shift.
I say this not as speculation but from direct practice. Over the past eighteen months, I've worked with five major research universities on talent strategy. In every single case, the leadership was thinking about AI as a curricular imperative while their own workforce was fragmenting around AI adoption. Faculty members in computer science departments were building research partnerships that required entirely new classes of technical staff. Administrative offices were quietly piloting AI-augmented tools for advising, admissions processing, and budget analysis without integrated workforce planning for what that displacement meant. Facilities and IT were running parallel initiatives with zero coordination. And all the while, the strategic talent conversations happened in silos that never spoke to each other.
This is not malice or ignorance. This is structural. Universities operate on budget cycles that don't align with technology cycles. Tenure systems preserve expertise and institutional memory but resist rapid redeployment. Shared governance means change moves at the speed of consensus. These are features that have served higher education well for two centuries. They are now becoming liabilities at the moment when speed matters most.
The Workforce Shifts Already Underway
AI is not coming to university workforces. It is already here. What's changing is the scale and the visibility. Let me be specific about where the disruption is already measurable.
Administrative automation is the most obvious domain. Admissions offices at major universities are using AI-augmented screening tools to process applications. Student records systems are automating degree audit functions that historically required human advisors. Benefits administration, payroll reconciliation, and facilities scheduling are moving to intelligent systems. These aren't hypothetical pilots anymore. They're production workflows. The workforce consequence is clear: the number of administrative FTEs required for a given student population is shrinking, and the roles that remain require entirely different capabilities — not data entry and form management, but interpretation and exception handling.
Research support is the second domain, and it's moving faster than most universities acknowledge. Principal investigators are using AI tools to generate literature reviews, identify research gaps, and even draft sections of grant proposals. The implications for graduate research assistants and postdocs are profound. The work that historically trained the next generation of researchers is being automated. Universities are not yet seeing massive displacement because research productivity is still increasing, absorbing the efficiency gains. But the trajectory is clear: the balance between human effort and AI-assisted work will shift in ways that change staffing models across graduate programs.
Student advising is the third domain, and this one is politically charged because it touches on an institution's core mission of mentoring. Universities are deploying AI-powered chatbots that can answer the 80 percent of advising questions that are routine — prerequisite requirements, graduation planning, policy clarification, major selection guidance. The human advisors who handled those questions are being repositioned toward higher-value work: helping students navigate identity questions, mediating between academic aspiration and realistic capacity, building mentorship relationships. This is arguably an improvement. But it requires intentional repositioning of people and work, not accidental displacement through technology creep.
Facilities management and campus operations represent a less visible but equally important shift. HVAC systems, energy management, physical security, and maintenance scheduling are increasingly managed through predictive AI systems. Universities with large campuses and aging infrastructure are using these tools to optimize cost and performance. The facilities workforce is not disappearing, but the skills and knowledge required are changing from reactive maintenance to system operation and interpretation.
The pattern across all these domains is identical: the roles are not vanishing, but they are fundamentally transforming. The workforce challenge is not "how do we manage displacement" (though some of that will be necessary). The workforce challenge is "how do we prepare people for roles that look nothing like the ones they currently hold, in a timeframe that doesn't allow for a decade of gradual transition."
Three Scenarios Every University Leader Should Be Planning For
I encourage my university clients to think about workforce planning not as prediction but as scenario planning. What are the plausible futures, and what capabilities would we need in each one? For higher education and AI, I see three scenarios that should be driving workforce strategy right now.
The first scenario is what I call the Efficiency Extraction model. University boards, facing margin pressure from declining enrollment and rising cost structures, become fixated on using AI to do more with less. Administrative functions consolidate. Support staff reductions accelerate. The workforce strategy becomes extraction: squeeze out every efficiency gain that AI can deliver. The risk is real: universities pursuing this model aggressively will see rapid attrition, loss of institutional knowledge, and deterioration of employee engagement. But the financial pressure is equally real, and several flagship institutions are already moving in this direction. If your institution is vulnerable to this scenario, your workforce strategy needs to prepare people for uncertainty and rapid change while simultaneously protecting the talent you cannot afford to lose.
The second scenario is Stratification. Universities use AI effectively to augment and amplify the work of highly skilled staff — faculty, researchers, clinical professionals — while using efficiency tools to reduce lower-wage support functions. The result is a bifurcated workforce: a small core of highly paid, highly skilled knowledge workers with powerful AI tools, and a larger group of lower-wage, lower-security contingent workers handling the work that isn't worth automating. This scenario preserves institutional capability while externalizing cost. It's also the default outcome if universities don't make conscious choices to manage AI adoption as a talent strategy problem rather than a technology problem.
The third scenario is what I call Integrated Augmentation. Universities make explicit workforce strategy decisions about where AI augments human capability, where it handles routine tasks, and where it frees up people to focus on mission-critical work that only humans can do. This requires intentional redesign of work processes, targeted reskilling investments, and real partnerships between technology and human resource functions. It's more complex and more expensive upfront. It's also the only path to building a sustainable, engaged, future-ready workforce. And almost no universities are positioned to execute it right now.
Why Universities Are Structurally Slow to Adapt
Before I propose solutions, let me be honest about the structural constraints. Universities are not failing to address this because leadership is incompetent or shortsighted. Universities are failing because they were designed for a different kind of challenge.
Shared governance means that significant decisions require buy-in from faculty, staff, and administration. This is appropriate when decisions are reversible and can be made deliberately. It is catastrophic when decisions need to happen faster than consensus can be built. Retraining programs, organizational restructuring, and talent strategy changes require rapid decisions and rapid iteration. The shared governance model makes rapid decisions nearly impossible.
Tenure systems preserve expertise and provide security. They also calcify roles and make repositioning difficult. A tenured faculty member or a long-serving administrator cannot easily be moved into an emerging role, even when the institution desperately needs that person there. Tenure serves important purposes, but it is fundamentally misaligned with the flexibility that workforce transformation requires.
Annual budget cycles create planning horizons that are too short for multi-year workforce transformation and too rigid for fast-moving technology decisions. Technology strategy runs on roughly 18-month cycles; university budgets are set on 12-month cycles. This mismatch means that technology decisions and workforce decisions are constantly fighting each other for resources and attention.
Finally, universities are optimized for stability and incremental improvement, not for fundamental transformation. This is appropriate when the external environment is stable. It is inappropriate when the environment is shifting as rapidly as it is shifting now.
A 90-Day Action Plan for University Leadership
So what can a provost or president do right now, in the next 90 days, to position their institution? I recommend five concrete steps.
First, create an integrated AI governance structure. This means one person (an AVP for Workforce and Technology Integration, ideally reporting to the Provost) who has explicit authority to see both the technology decisions and the workforce implications. Right now, IT is deploying tools, HR is managing people systems, and nobody has authority to insist these are connected. They need to be. This might seem like a bureaucratic solution to a strategic problem, but governance is how universities make things stick.
Second, commission a detailed inventory of where AI is already being deployed and what its workforce implications are. This is not a policy conversation. It's a fact-gathering exercise. Where are we using AI tools today? What work is being displaced or transformed? What new skills do we need? What gaps do we have? This inventory should be completed in 60 days and reported to the provost and the cabinet with explicit workforce implications. Most universities will be shocked by how much AI deployment is already happening outside their formal technology planning process.
Third, identify the three to five workforce roles most at risk of transformation in the next 24 months and design reskilling pathways for people in those roles. This is not "how do we eliminate positions." It's "how do we help people transition into roles where we need capability." This means partnering with HR to create education programs, mentorship, and clear communication. It means moving quickly but not frantically. And it means being honest about which transitions are realistic and which ones require managing attrition.
Fourth, build a small task force with faculty, staff, and administration focused on building the business case for the Integrated Augmentation scenario. What would it cost? What would it require? How would we staff it? What would the workforce look like? This becomes your strategic alternative to the default Efficiency Extraction model that budget pressure will push you toward.
Fifth, communicate. Be transparent with your workforce that AI is changing what your institution needs. Be specific about what you're doing to manage that change responsibly. Do not hope that people won't notice. They will. The question is whether you shape the narrative or let fear and uncertainty do it for you.
These five steps are not a solution. They are the beginning of a solution. But they are steps that any university president or provost can take immediately, with existing authority and existing resources. The time to start is now. The institutions that move first will have an enormous advantage. The institutions that wait for perfect clarity, perfect processes, and perfect consensus will find that AI has made their decision for them.
