Why Your AI Tutor Might Be Widening the Achievement Gap

November 25, 2025

Intelligent Tutoring Systems and AI are often heralded as the great equalizers in education, promising personalized learning for every student. The data shows promise: one study at Los Angeles Pacific University (LAPU) found that students who used an AI assistant at least three times saw an average 7.5% increase in GPA (LAPU, 2024).

However, a closer look at voluntary AI use reveals a critical, hidden risk: the benefits may accrue only to a small, already privileged group of learners. If not deployed strategically, your AI tutor could actually be widening, not closing, the achievement gap.

Slightly caffeine-deprived and in that weird “twilighty” afternoon space following a productive day, I found myself unable to come up with a snappy name for this hidden risk to student success around AI use. So, I turned to ChatGPT. Its response? This is the Five Percent Problem. It is not very imaginative, but it does get to the shocking heart of the issue: only about five percent of students show learning gains by using AI companions. 

The Uncomfortable Truth About Unguided AI Use

Research in educational technology consistently reveals a significant flaw: only a small, highly motivated subset of students uses digital tools as intended, and that subset reaps most of the benefits.

The Five Percent Problem. In many studies, this active user group, roughly five percent of the total student population, consists largely of already high-performing individuals who are also more likely to come from higher-income backgrounds. This dynamic was documented by Laurence Holt in Education Next (2020) and echoed by other researchers examining online and AI-supported learning.

Meanwhile, the other 95 percent of students, those who primarily aim to finish assignments rather than master content, show little to no measurable gains.

The result? Instead of democratizing learning, AI tools can become amplifiers of inequality, turbocharging the already motivated and resource-rich, while leaving others further behind.

The Cognitive Cost: Trading Struggle for Shortcuts

Even for those who do engage, over-reliance on AI tutors introduces a subtle cognitive risk: the erosion of “productive struggle.”

Productive struggle—the mental wrestling that leads to deep understanding and lasting skill—is what transforms effort into mastery. Yet many AI systems provide immediate hints or solutions, which can short-circuit that process.

The Risks:

  • Cognitive Offloading: When students offload problem-solving to AI, they may lose opportunities to build independent reasoning skills. Studies and position papers in 2024–2025 warn that unreflective AI use can cause “cognitive atrophy” and reduce long-term retention and creativity.
  • Reduced Critical Thinking: The U.S. Department of Education’s 2023 AI in Education report cautions that tools offering instant answers risk undermining deep reasoning if they aren’t designed with human oversight and reflection prompts.

If students begin to view AI as a shortcut rather than a collaborator, the learning journey—the “eureka” process itself—gets devalued. We risk training efficient task-completers instead of resilient problem-solvers.

Strategy Is Everything: Designing for Independence

AI can absolutely close equity gaps—but only when implementation is proactive and institution-led.

Consider the Georgia State University model. Through predictive analytics and proactive advising, GSU has achieved dramatic gains in graduation rates and closed equity gaps for Black and Hispanic students. Their success didn’t come from voluntary AI use—it came from a systematic, data-informed outreach model that ensured every student received timely support.

To avoid the “Five Percent Problem,” institutions should:

  1. Prioritize Proactive Over Voluntary
    Use predictive analytics to identify disengaged or at-risk students and trigger proactive outreach—rather than waiting for them to self-select into help.
  2. Design for Independence
    Incorporate “graduated disclosure,” where AI tools reveal help only after the student shows genuine effort. Pair this with reflection prompts or “worked-example fading” to keep learners cognitively active.
  3. Mandate Critical Engagement
    Train faculty and students to treat AI as a thinking partner, not an answer key. Encourage students to question, critique, and verify AI outputs to sustain intellectual agency.
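To make the second strategy concrete, here is a minimal sketch of what "graduated disclosure" might look like in code. This is an illustrative model, not Gray DI's implementation: the class name, help levels, and effort threshold are all assumptions. The idea is simply that stronger help unlocks only after repeated genuine attempts, and a correct answer resets the ladder so productive struggle stays in the loop.

```python
from dataclasses import dataclass

# Help levels, weakest to strongest (illustrative).
HELP_LEVELS = ["nudge", "hint", "worked_example", "full_solution"]

@dataclass
class GraduatedTutor:
    """Hypothetical tutor that escalates help only after genuine effort."""
    min_attempts_per_level: int = 2   # attempts required before escalating
    attempts: int = 0                 # failed attempts since last escalation
    level: int = -1                   # -1 means no help unlocked yet

    def record_attempt(self, correct: bool) -> str:
        """Log one student attempt and return the help now available."""
        if correct:
            # Success resets the ladder: the struggle paid off.
            self.attempts = 0
            self.level = -1
            return "none"
        self.attempts += 1
        # Unlock the next level only after repeated genuine attempts.
        if self.attempts >= self.min_attempts_per_level:
            self.attempts = 0
            self.level = min(self.level + 1, len(HELP_LEVELS) - 1)
        return HELP_LEVELS[self.level] if self.level >= 0 else "none"

tutor = GraduatedTutor()
print(tutor.record_attempt(correct=False))  # "none" -- keep struggling
print(tutor.record_attempt(correct=False))  # "nudge" -- weakest help unlocks
print(tutor.record_attempt(correct=False))  # "nudge" -- no instant escalation
print(tutor.record_attempt(correct=False))  # "hint" -- stronger help, earned
```

A production system would base "genuine effort" on richer signals than attempt counts, but even this toy version shows the contrast with tools that hand out full solutions on the first click.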

The Bottom Line

AI tutoring can transform learning—but only when equity and independence are baked into the design.

Voluntary, unguided use risks creating a “Five Percent Problem” where motivated learners soar and everyone else stalls. The antidote is intentional design: AI that challenges before it helps, nudges before it rescues, and partners rather than replaces.

That’s how we ensure AI empowers every student—not just the top five percent.

Want to see an AI companion that avoids the “Five Percent Problem” and actually supports every student? Gray DI’s College Companions are designed from the ground up to enhance learning—not replace it. Each Companion is powered by your institution’s own course materials and guided by faculty oversight, ensuring accuracy, transparency, and academic integrity. Built-in Socratic questioning, adaptive feedback, and graduated assistance help students think critically before receiving answers—promoting genuine learning over shortcuts. And with 24/7 multilingual access and customizable instructor controls, every student, regardless of background or schedule, gets equitable support.

If your institution is ready to empower students rather than enable dependency, schedule a demo of Gray DI’s College Companions and see how strategic design turns AI into an ally for true academic growth.

Mary Ann Romans

Associate Vice President, Marketing

Mary Ann creates, defines, and executes marketing strategy at Gray Decision Intelligence.

About Gray DI

Gray DI provides data, software and facilitated processes that power higher-education decisions. Our data and AI insights inform program choices, optimize finances, and fuel growth in a challenging market – one data-informed decision at a time.
