AI in Higher Ed: Beyond Cheating to a Future of Learning

by Emma Walker – News Editor

The University of Michigan launched UM-GPT in August 2023, becoming the first higher education institution to offer AI services at scale. The move signals a broader shift in how universities are approaching artificial intelligence. Whereas initial concerns centered on academic dishonesty, institutions are now actively building and deploying AI tools across campus, from admissions to research, prompting a fundamental question: what role will the university serve in an age where machines can increasingly perform the work of learning and discovery?

For years, the debate around AI in higher education revolved around preventing cheating. But universities are now embracing the technology, developing chatbots and AI-powered systems aimed at keeping students and faculty on the cutting edge. According to a recent EDUCAUSE report, 37% of colleges and universities provide institution-wide licenses for chatbots, and 14% have developed their own homegrown bots. As EdTech Magazine reported in May 2025, the approaches vary: some institutions train chatbots on all of their institutional information, while others experiment with smaller applications.

The University of Michigan’s UM-GPT provides secure access to commercially available large language models through the university’s private Microsoft Azure environment, illustrating a proactive approach to AI adoption while maintaining data control. “There are going to be both opportunities and challenges,” says Ravi Pendse, vice president for IT and CIO at Michigan. “But on balance, this will be a force for positive disruption.” Similar initiatives are underway at Harvard University, Washington University, the University of California, Irvine, and UC San Diego. Inside Higher Ed reported in March 2024 that these efforts are driven by concerns over security, equity, and intellectual property rights.

The push to develop in-house AI tools is, in part, a response to the cost of access. While older versions of ChatGPT remain free, the newest version costs $20 per month, creating a potential disadvantage for students who cannot afford it. “We demand to talk about ‘AI for good’ of course, but let’s talk about not creating the next version of the digital divide,” said Tom Andriola, UC Irvine’s chief digital officer, according to Inside Higher Ed. Michigan’s Pendse echoed this sentiment, noting that UM-GPT is free to all students, an effort to level the playing field.

The integration of AI, however, extends far beyond simply providing access. AI-powered software is being used in admissions, purchasing, academic advising, and institutional risk assessment. These “nonautonomous” systems automate tasks while still requiring human oversight, but they raise concerns about data privacy, bias, and transparency. Questions remain about who has access to student data and how “risk scores” are generated.

More complex ethical questions arise with “hybrid” systems – AI-assisted tutoring chatbots, personalized feedback tools, and automated writing support. Students and faculty are increasingly using these tools for writing, research, and course design. This raises concerns about the impact on the student-professor relationship and the potential for cognitive offloading, where users rely on AI to perform the very tasks that would otherwise build competence. University of Pittsburgh researchers have found that interacting with AI chatbots can lead to feelings of uncertainty, anxiety, and distrust among students.

The most significant changes may come with the development of “autonomous agents” – AI systems capable of performing research independently. While still largely aspirational, these systems could automate much of the research cycle, potentially reducing opportunities for graduate students and early-career academics to gain essential skills. If these systems absorb the “routine” responsibilities that traditionally serve as on-ramps into academic life, universities risk thinning the pipeline of expertise.

The increasing automation of knowledge work forces universities to confront a fundamental question: what is their purpose? One view sees the university as an engine for producing credentials and knowledge, prioritizing output. Another sees it as an ecosystem for cultivating expertise, mentorship, and critical thinking. In a world where AI can deliver outputs more efficiently, universities must decide what they owe their students and society. For now, the answer remains unclear, and institutions continue to grapple with how to navigate this evolving landscape.
