TL;DR: This article explores how generative artificial intelligence tools like ChatGPT, Claude AI, and Google Gemini are transforming higher education, affecting how students learn, professors teach, and institutions approach academic integrity.
Watch the Full Video Report
Title: How artificial intelligence is reshaping college for students and professors
Duration: 9:57
Channel: PBS NewsHour
Topic: How artificial intelligence is reshaping college
This comprehensive article is based on the report above. Watch the video for visual demonstrations and detailed explanations.
The class of 2024 marks a historic milestone: it's the first senior cohort at universities nationwide to have spent nearly their entire college experience in the age of generative artificial intelligence. This transformative technology, capable of creating human-like text, images, and even code, is no longer a futuristic concept. It's embedded in daily academic life, reshaping how students learn, how professors teach, and how institutions define academic integrity. From detection dilemmas to innovative classroom integration, generative AI is forcing a complete rethinking of higher education.
In this comprehensive guide, we unpack every insight, real-world example, policy shift, and student-professor perspective from PBS NewsHour's in-depth report on how artificial intelligence is reshaping college campuses across America. No detail is left out, from detection software struggles to Ohio State's bold AI fluency initiative.
The Turning Point: When Professors First Noticed AI in Student Work
About two years ago, Megan Fritts, a philosophy professor at the University of Arkansas at Little Rock, began spotting something unusual in her students' assignments. Suddenly, essays and test answers from students whose writing she knew well started sounding like "official business documents" or "technical writing": highly polished but deeply impersonal.
Fritts realized: this wasn't her students' authentic voice. It was likely AI-generated content. This moment marked the beginning of a seismic shift across higher education, as generative AI swept through campuses nationwide.
Why Generative AI Spread So Rapidly in Academia
The appeal is obvious: tasks that once required hours or even days of writing and revision can now be completed in mere minutes. For example, a student can prompt ChatGPT: "Write me a 1000-word essay on the topic of, 'Is it OK to lie?'" Using vast training data, the AI instantly predicts and generates coherent, structured prose.
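To make concrete how little effort this takes, here is a minimal Python sketch of the same request made programmatically. The openai package, the API key setup, and the model name are assumptions for illustration only; the report itself only shows a student typing the prompt into the chat interface.

```python
# Minimal sketch: the essay request described above, issued via the openai
# Python package (assumed installed, with OPENAI_API_KEY set in the environment).
# The model name is illustrative, not taken from the report.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-capable model works for this kind of prompt
    messages=[
        {"role": "user",
         "content": "Write me a 1000-word essay on the topic of, 'Is it OK to lie?'"},
    ],
)

print(response.choices[0].message.content)
```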
But for educators like Fritts, this convenience comes at a steep cost. "If I'm reading the writings of ChatGPT instead of my students," she says, "I have lost the very best tool that I have to see if I am being effective in my capacity as an instructor."
Academic Integrity in Crisis: The Detection Dilemma
Universities are scrambling to respond. Brian Berry, Vice Provost of Research at UA Little Rock, leads a campus committee tasked with crafting AI policies. He acknowledges a sobering reality: "The technology is outpacing our ability to detect it."
How One Professor Detects AI Use
Professor Fritts enforces a strict no-AI policy in her classroom. When she suspects AI use, her process is rigorous:
- She runs the submission through Phrasely, one of eight different AI detection tools she uses.
- If multiple tools flag the text as AI-generated, she meets with the student.
- During the meeting, she asks the student to explain or discuss the content of their assignment.
"If they can talk about the thing that they wrote about, then great," Fritts says. "But a lot of times, they can't."
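To make the workflow above concrete, here is a minimal Python sketch of a "multiple detectors must agree" check. The detector functions are hypothetical placeholders; Phrasely and the other tools Fritts uses have their own interfaces, which the report does not describe.

```python
# Hypothetical sketch of the "only act when several detectors agree" rule.
# The detector callables are stand-ins for real services, which are not
# documented in the report.
from typing import Callable

def flag_submission(text: str,
                    detectors: list[Callable[[str], float]],
                    threshold: float = 0.8,
                    min_agreement: int = 2) -> bool:
    """Return True only if at least `min_agreement` detectors score the text
    above `threshold`; a single flag is never treated as proof."""
    flags = sum(1 for detect in detectors if detect(text) >= threshold)
    return flags >= min_agreement

# Dummy detectors standing in for real tools:
detectors = [lambda t: 0.91, lambda t: 0.65, lambda t: 0.88]
if flag_submission("student essay text...", detectors):
    print("Flagged: schedule a conversation with the student before any penalty.")
```

Even then, as the article notes below, a flag is only the start of a conversation, not a verdict.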
The Flaws in AI Detection Tools
Detection software is far from foolproof. False positives are common, and students report being penalized for writing styles that happen to resemble AI output.
Ashley Dunn, a former senior at Louisiana State University, was accused of using AI to write a short essay for a British literature class after a detection tool flagged her work. "I was like, am I gonna fail this class? Am I gonna get a zero?" she recalls, noting that colleges treat plagiarism very seriously.
Though Dunn was eventually given an A for the assignment after clarifying with her professor, her TikTok video about the incident went viral, with countless students sharing similar stories of being falsely accused.
"A lot of people ended up making responses to my video... saying that they had gone through the same thing, but that they didn't really get as lucky and they ended up either getting zeros or failing the class. Some people recently have been making videos about, 'Oh, my professor said that my essay was AI because I used an em dash,' but that's just a regular way of writing, especially for a college level."
Two Campus Responses: Restriction vs. Integration
Not all universities are taking a hardline stance against AI. In fact, institutions are diverging sharply in their approaches: some banning it, others embracing it as an educational tool.
Policy Approach at UA Little Rock
UA Little Rock is finalizing a campus-wide policy that gives individual professors the authority to decide what AI use is acceptable in their courses, as long as it's clearly outlined in the syllabus. This decentralized model empowers faculty while promoting transparency.
Ohio Stateâs Proactive Integration Strategy
Meanwhile, Ohio State University has taken a radically different path. Rather than resist AI, it's embedding it into the core curriculum.
Ravi Bellamkonda, Executive Vice President and Provost at Ohio State, was struck by a student AI violation case that made him rethink the technology's potential. "What if there existed technology that indeed lets our students produce work of very high quality? Shouldn't we investigate this a little further?"
His answer: launch the AI Fluency Initiative, a university-wide program requiring all undergraduate students across all disciplines to learn and use AI tools responsibly.
"The trick is to figure out, like any human interaction with technology, what can we offload to technology, and what do we need to add value to? Ohio State wants to be at the front of that creation of those rules."
Real-World Classroom Applications of Generative AI
At Ohio State, AI isn't just permitted; it's encouraged as a learning aid. Professors and students are experimenting with innovative, discipline-specific uses.
Entrepreneurship: AI as a Critical Thinking Partner
Lori Kendall, who teaches entrepreneurship at Ohio State's Fisher College of Business, initially asked, "Now what? Do we allow AI? Do we not allow AI?" But she quickly realized: "They're going to use it anyway."
Now, she encourages students to use AI to critically examine their original work. As student Rachel Gervais (majoring in air transportation) explains: "I oftentimes use AI to create questions regarding this topic. So I not only get a better understanding of the actual material, but I also can test and see what I need to maybe focus on even more."
Kendall emphasizes: "A lot of people might use AI just to get assignments done or [commit] plagiarism, but I like to use AI for deeper understanding."
Music: AI as a Research and Teaching Accelerator
In the College of Arts and Sciences, Professor Tina Tallon teaches a course titled "AI and Music." Her pedagogical approach starts not with technology, but with problems.
"I always start the class by asking them to think about a challenge in their field. At that point, we're not even talking about AI. I just want them to identify something that either they've run up against or that their students or their colleagues have."
Case Study: Tuba Performance Optimization
Doctoral student and tuba instructor Will Roesch is using AI to analyze airflow into his instrument across thousands of repetitions. The resulting data helps him guide students toward playing the "perfect note," a task previously impossible due to the sheer volume of manual analysis required.
Case Study: Infant Musical Development Research
Graduate student Natalia Moreno Buitrago studies how babies acquire musical knowledge. Before AI, she spent hours combing through home audio recordings, manually identifying moments when caregivers sang or hummed near infants.
Now, AI automates this process, freeing her to focus on analysis and interpretation rather than data collection.
"If we critically examine the tools that we're engaging with and are actively involved in the development of them, I think we can do some pretty incredible things."
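For a sense of what this kind of automation can look like, here is a minimal Python sketch that flags clips containing singing or humming, assuming the Hugging Face transformers library and an AudioSet-trained audio classifier. The model choice and file paths are illustrative assumptions; the report does not say which tools Moreno Buitrago's lab actually uses.

```python
# Hedged sketch: automatically find "caregiver singing/humming" moments in a
# batch of recordings. Assumes transformers plus its audio dependencies are
# installed; the model and paths are illustrative, not from the report.
from transformers import pipeline

# AudioSet-trained classifiers include labels such as "Singing" and "Humming".
classifier = pipeline("audio-classification",
                      model="MIT/ast-finetuned-audioset-10-10-0.4593")

def vocal_moments(clip_paths, labels=("Singing", "Humming"), min_score=0.3):
    """Return clips whose top predictions include singing or humming."""
    hits = []
    for path in clip_paths:
        for pred in classifier(path, top_k=5):
            if pred["label"] in labels and pred["score"] >= min_score:
                hits.append((path, pred["label"], pred["score"]))
                break
    return hits

print(vocal_moments(["recordings/home_visit_01.wav"]))  # paths are placeholders
```

The point is not the specific model but the shift it enables: hours of manual listening collapse into a screening pass that a researcher then verifies and interprets.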
The Broader Implications: Beyond the Classroom
The disruption caused by generative AI extends far beyond academic assignments. It's reshaping the very skills students need to succeed in a rapidly evolving job market.
"If you don't use AI or the next technology that comes along to be effective, you're not going to be competitive in the job market. The job market's changing right under your feet."
This reality forces a fundamental question: how can institutions navigate this transformative moment in a way that ultimately improves human potential rather than diminishes it?
"How do we go through a transformative moment like this with the disruptions that it is going to cause and yet do this in a way that ultimately is additive to us as a society? That it improves our lot as human beings?"
Student Perspectives: Caught in the Middle
Students are navigating a confusing, inconsistent landscape. While some professors ban AI outright, others require its use. Detection tools are unreliable, and penalties can be severe, even for honest mistakes.
Ashley Dunn's experience highlights the emotional toll: anxiety, fear of academic penalties, and frustration with opaque systems. Yet others, like Rachel Gervais, see AI as a legitimate study partner that enhances, rather than replaces, learning.
Faculty Challenges: From Teaching to Policing
Professors like Megan Fritts are bearing the brunt of enforcement. Verifying AI use is time-consuming, technically complex, and emotionally draining.
Using eight different detection tools, including Phrasely, is not sustainable long-term. And even when AI use is confirmed, confronting students feels less like education and more like surveillance.
Policy Frameworks: What Universities Are Doing Now
Higher education institutions are developing varied policy responses:
| Institution | AI Policy Approach | Key Features |
|---|---|---|
| University of Arkansas at Little Rock | Decentralized, faculty-driven | Professors set AI rules per course; must be stated in syllabus. Detection and enforcement left to individual instructors. |
| Ohio State University | University-wide integration | AI Fluency Initiative mandates AI literacy for all undergraduates. Encourages critical, creative use across disciplines. |
| Many other institutions | Reactive or undefined | No clear policy; reliance on honor codes and existing plagiarism frameworks, which weren't designed for AI. |
The Ethical Core: Revisiting Kant in the Age of AI
Philosophy professor Megan Fritts opens her classes with Immanuel Kant's principle: "Treat all people as ends in themselves, never merely as means." This ethical foundation takes on new meaning in the AI era.
When students outsource thinking to AI without engagement, they risk becoming mere conduits for machine output, violating the very purpose of education: human development, critical thought, and authentic expression.
AI Tools Mentioned in Campus Use
Students and faculty are actively using a range of generative AI platforms:
- ChatGPT – Used for essay drafting, question generation, and idea exploration.
- Claude AI – Known for nuanced, long-form responses.
- Google Gemini – Integrated into Google Workspace; accessible via common student tools.
Detection Software Landscape
Despite their limitations, detection tools are widely used. Professor Fritts alone employs eight different tools, including:
- Phrasely – Specifically named as a first-line detection tool.
- Other unnamed AI detectors – Reflecting the fragmented, competitive market of detection technology.
Steps for Students: Using AI Responsibly
Based on insights from Ohio State and forward-thinking educators, hereâs how students can ethically leverage AI:
- Use AI for understanding, not submission – Generate practice questions, clarify concepts, or brainstorm ideas (see the sketch after this list).
- Always engage critically – Don't accept AI output at face value; interrogate its logic and sources.
- Disclose usage when required – Follow your professor's syllabus guidelines.
- Test your own knowledge – Use AI-generated quizzes to identify gaps in your learning.
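As a concrete example of the first and last steps, here is a short Python sketch that asks a model for practice questions rather than a finished answer, again using the openai package. The model name, prompt wording, and helper function are assumptions for illustration, not tools named in the report.

```python
# Hedged sketch: request open-ended study questions instead of a completed
# assignment. Assumes the openai package and an API key; model name is illustrative.
from openai import OpenAI

client = OpenAI()

def practice_questions(topic: str, n: int = 5) -> str:
    """Ask the model for open-ended practice questions on `topic`, no answers."""
    prompt = (
        f"Write {n} open-ended practice questions about {topic}. "
        "Do not include answers; I want to test my own understanding."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(practice_questions("Kant's principle of treating people as ends in themselves"))
```

Used this way, the tool mirrors how Rachel Gervais describes studying: the model surfaces gaps, and the student still does the thinking.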
Steps for Professors: Designing AI-Resilient Assignments
Educators can reduce AI misuse by redesigning assessments:
- Incorporate personal reflection or in-class writing.
- Require oral defenses or process documentation.
- Focus on local, current, or unique prompts that AI hasn't been trained on.
- Clarify AI policies in writing on the syllabus.
The Future: Co-Creating the Rules with Students
As Ravi Bellamkonda suggests, the path forward isn't top-down prohibition; it's collaborative innovation. Students must be active participants in shaping how AI is used in education.
After all, they are the generation entering a workforce where AI fluency is no longer optional. Higher education's role isn't to shield them from this reality, but to prepare them to harness it ethically and effectively.
Conclusion: Navigating the AI Transformation Together
Generative AI is not a passing trend. It's a foundational shift in how knowledge is created, shared, and evaluated. The class of 2024 stands at the epicenter of this transformation, learning in real time how to balance convenience with integrity, innovation with ethics.
From Megan Fritts' detection struggles to Natalia Moreno Buitrago's research breakthroughs, the message is clear: AI's reshaping of higher education isn't about banning or blindly adopting tools. It's about cultivating critical engagement, transparent policies, and human-centered learning.
As Fred de Sam Lazaro reports from Columbus, Ohio: this is a question without a clear answer yet. But with thoughtful collaboration between students, faculty, and institutions, the disruption of AI can become a catalyst for a more dynamic, relevant, and human-centered higher education.

