ChatGPT. Google Gemini. Snapchat AI.
Once unfamiliar terms, these technologies have become fixtures of everyday life, reshaping how we think, create, and make decisions. More importantly, they have become personal assistants as our reliance on them has grown from sporadic use to dependency. We no longer turn to AI merely for help; we turn to it to do our thinking for us.
Early Forms of AI
Early forms of generative AI (GenAI) primarily simulated human conversation and produced generic responses to simple questions. Now, AI systems can be personalized upon user request. For example, when studying a difficult subject, you could tell ChatGPT to explain the material as if it were a specific character or person (e.g., a teenage girl, a professor, or even SpongeBob); ChatGPT would process the request and immediately format its responses to reflect the requested tone.
The immediate uptake of ChatGPT is reflected in the traffic it receives every day. As of March 2025, ChatGPT gets approximately 5.2 billion views per month. That’s approximately 173 million views a day, 7.2 million views an hour, and 120,000 views a minute. By the time you have finished reading this paragraph, thousands of people will have opened ChatGPT and asked it questions. So, are ChatGPT and other GenAI systems really making us smarter? Or are they a spreading dependency that will erode our ability to process information critically?
Applications to Education
The reality of the education system is as follows: 86% of students report using AI to do their schoolwork. What may have started as an easy way to answer a quick question has now turned into a personal assistant. Don’t have time to do your math homework? ChatGPT will do it for you. Don’t understand the reading? ChatGPT will do it for you. Can’t think of the right phrasing for your essay? ChatGPT. Will. Do. It. For. You.
How long until we lose the ability to do these things ourselves? In an investigation of the growth of ChatGPT use, I asked students who use ChatGPT to pose it a question: ‘How many questions have I asked you?’
As pictured above, one student has asked ChatGPT almost 3,000 questions in the past two years. That’s roughly 29 questions a week, or about four a day. This isn’t reflective of all students, but it is reflective of many. I then asked the students to ask ChatGPT how their usage had changed over time. The results were as follows: in the first year, they ‘lightly used’ the AI system, asking occasional questions driven mainly by curiosity. In the past six to eight months, however, their usage of ChatGPT has skyrocketed. They were frequently asking 80-100+ questions a week and engaging it in discussions of complex topics such as their A.P. classes, their writing, and more. ChatGPT also noted that the student had started asking layered questions: one on top of another, with many follow-ups within a single session. What does this suggest? The student has evolved from using AI periodically to using it as a substitute for their thinking, a clear sign of dependency.
As these systems increasingly develop in place of our own thinking skills, their convenience puts us at risk of cognitive atrophy. In fact, according to a study reported by psych.org, there was a recorded negative correlation between AI tool usage and critical thinking scores. In other words, frequent AI users tended to have weaker critical thinking skills. The study also found that cognitive offloading, the practice of delegating mental tasks to an external tool rather than thinking through them independently, was more pronounced in people who reported higher AI usage. This does not necessarily mean that those who use ChatGPT more will experience cognitive atrophy. What matters is how you use it, not simply that you use it.
I asked ChatGPT what frequent usage of AI systems leads to. “Convenience is neutral,” it said. “AI tools like ChatGPT are built to make tasks faster and easier. Using them doesn’t automatically make someone lazy—it can make them more efficient, especially when time is tight. But there’s a tipping point. When people start using convenience as a substitute for learning rather than a support, that’s when it can slide into laziness or dependence.”
ChatGPT even produced a sample progression of events:
- Convenience: “I’ll use ChatGPT to brainstorm ideas faster.”
- Dependency: “I’ll just let it write my outline so I can get this done.”
- Laziness: “I’ll copy the whole essay—it sounds good enough.”
This demonstrates how repeated shortcuts, used unreflectively, can dull original thinking and effort. However, “there’s a key difference,” ChatGPT says. “Intention and awareness. If students use AI to learn more deeply, check understanding, or refine ideas, it’s a powerful tool. If they outsource thinking, it stunts growth. In that case, convenience can absolutely be the first step to laziness.”
Let’s break this down. When it comes to AI use, there are two main groups of people: active users and passive users. Active users use AI to supplement their thinking: to get feedback, to break down information, or to research ideas. Passive users use AI to bypass their own thinking, by uploading homework assignments and copying directly from what ChatGPT wrote, asking AI to generate answers without considering them themselves, or relying on it repeatedly whenever they don’t feel like learning the content for themselves. The key difference is cognitive offloading. Active users use AI to improve their own thinking; passive users hand their mental effort off to an AI tool. This habitual offloading erodes thinking and problem-solving skills, and it is what leads to cognitive atrophy.

Forming a Habit
According to psychological research, a habit takes anywhere from 18 to 254 days to fully form, with an average of about 66 days. However, due to the ease and accessibility of ChatGPT, the routine of checking and consulting it tends to form quickly. In a survey that I conducted in person with Bronx Science students who use ChatGPT, I found that dependency developed only a few months after they began using it.
One student told me, “Half the time, I use AI because I didn’t make time to do my homework.” This is a sign of passive learning developing. However, the same student also said, “On the other hand, I am able to give it all of the information I have learned in a curriculum and tell it to curate a study guide for me. And if terms remain unclear, it is so much faster to ask it follow-up questions than to spend hours searching online for the answer to a niche question.” This is a good example of active learning: the student is using the information from ChatGPT to further their understanding of a topic.
Before the rise of AI use, people had to rely on active learning to complete assignments and process information. With AI systems, that immersive active learning is giving way to passive learning over time. There is less and less need to understand a topic yourself, and more and more of a habit of letting ChatGPT figure it out for you.
But does everyone fall into the trap of dependency?
A study covered by PsyPost explored how students are engaging with generative AI, and how their personality traits and character influence their use of the technology. The researchers surveyed 326 undergraduate students from three universities in Pakistan. The students were asked to report their perceptions of the fairness of their universities’ grading systems and to fill out a self-report questionnaire grounded in the Big Five model, a framework in psychology that organizes personality into five main traits: Openness, Conscientiousness, Extraversion, Agreeableness, and Neuroticism. Participants rated their perceived levels of these traits, which were selected for their relevance to academic performance; for instance, “openness reflects intellectual curiosity and creativity, while neuroticism is associated with anxiety and emotional instability.” In the second part of the study, participants were asked how frequently they use GenAI tools for schoolwork. In the last stage of the study, students reported their GPA, assessed their academic self-efficacy, and recounted their experiences with learned helplessness.
The results revealed that the use of generative AI “mediated the link between conscientiousness and academic outcomes. Specifically, students who were more conscientious were less likely to use AI, and this lower use was associated with better academic performance, greater self-efficacy, and less helplessness.”
The author of this study, Sunday’s Azeem, an assistant professor of management and organizational behavior at SZABIST University, said, “Our findings that generative AI use is associated with reduced academic self-efficacy and higher learned helplessness are concerning as students may start believing that their own efforts do not matter. This may lead to reduced agency where they believe that academic success is dependent on external tools rather than internal competence. As the overuse of generative AI erodes self-efficacy, students may doubt their ability to complete assignments or challenging problems without the help of AI. This may make students passive learners, hesitating to attempt tasks without support.”
Whether AI is beneficial depends on whether one uses it as a substitute or as a support. What starts as convenience can easily become dependency, as the overuse of AI leads to the underuse of original thinking. To outsource your thinking is to reduce your self-efficacy, and potentially your ability to tackle harder academic challenges when AI is not readily available.
“Be smart.” – ChatGPT