“But it makes everything so efficient!”
The most convincing argument for using artificial intelligence as an everyday tool at work, at school, and at home is that completing tasks with AI saves time and effort. This free machine that sorts, collects, and organizes data seems like magic at the fingertips of any burnt-out student. The prospect of cutting out tedious busywork to hone one’s complex thinking skills looks like an obvious opportunity to capitalize on. Despite moral qualms about plagiarism and individual integrity, people with work to do increasingly justify using AI by pointing to the sheer jump in efficiency.
More broadly, AI symbolizes progress in the 21st century: an ever-improving, learning consciousness with the potential to fix all of humanity’s modern problems. Beyond everyday convenience, the prospect of a web of almost limitless, instantly available information working side by side with the smartest humans makes the future of medicine, rocket science, and technology look bright. The best part is that AI is free. Instead of paying a human being to do work for us, we can ask an AI program to do the same work at no cost.
In my experience using AI for high school work, and in conversations with other students who do the same, the idea of an almost-sentient being doing free labor even makes some feel guilty. Sonali Campbell ’26 told me, “I feel a little bad because it seems like a person.” ChatGPT’s human-like responses and constant readiness to help with any task create an illusion of exploitation. However, although there are layers and layers of exploitation involved in these human-AI interactions, the AI system is not the victim.
An exploration of how an AI system is produced and maintained reveals the deep harm, far beyond questions of educational integrity, that these systems propagate. The abuses intertwined with the creation and use of AI fall into two categories: physical and intellectual. Physical exploitation includes the mistreatment of the human workers who staff mines and factories and the extraction of incredible quantities of non-renewable resources. Intellectual exploitation covers the nonconsensual harvesting of human data and information, and the corporate profits built on that data.

Physical
Before reaching Silicon Valley, the journey of AI hardware is dark and dangerous. At the very core of an AI system lies a semiconductor: a chip built from materials whose electrical conductivity falls between that of a conductor and an insulator, which allows it to switch and carry electrical current at high speed. The semiconductor market has grown explosively over the past decade, with global powers like the U.S., China, and Russia competing to control the mines, factories, and distribution centers behind these chips. Each semiconductor contains over 300 materials, from copper to yttrium. Some of these materials are classified as rare earth elements (REEs): critical minerals that are uncommon and sought after.
At the very beginning of the semiconductor production line, adults and children toil for little to no pay in dark mines in the Global South. In countries like Zimbabwe, Mozambique, and the Democratic Republic of Congo, non-governmental organizations have documented grievous human rights abuses committed by mining operations that face almost no oversight. In one instance in the DRC, a mining company forcibly evicted people from their homes, threatening and intimidating them into resettlement. These largely informal operations often breed unchecked labor violations, hiring young children and neglecting to enforce safety regulations or minimum wages. The global rush in demand for critical minerals has made government officials and company owners frantic to squeeze as much profit as possible out of the land, and justice has slipped through the cracks.
Even when stretched this thin, these African companies are often unable to stay afloat. The profit disparity is global: Africa holds 30% of the world’s critical mineral reserves but earns only 10% of the revenue. Silicon Valley tech companies post consistently high growth rates, while the mining companies at the other end of the supply chain sink deeper into debt.
Another stakeholder in the mining industry is indigenous peoples whose ancestral lands are rich in minerals. These groups occupy a political position unlike any other, since they are often unaffiliated with any national government or for-profit organization. The OECD has released guidelines for the treatment of indigenous peoples, but many modern mining corporations have violated them. Companies that interact with local communities should have clear policies for doing so, and often neglect to. In Indonesia, a mining company called Freeport signed a contract in 1967 without any legal protections for indigenous communities. Armed clashes followed between the company’s security forces and the local population, many of them in violation of UN human rights guidelines. Although such incidents have become less frequent, they are not unheard of.
After resource extraction, refineries process the minerals before sending them to manufacturers. These refineries, or smelters, are predominantly located across Eastern Europe and West Asia, and they are a Pandora’s box of corruption and abuse. The Business & Human Rights Resource Centre (BHRRC), which tracks allegations of corporate human rights abuse, identified 421 allegations at refineries across 16 countries from 2019 to 2023. Most involved workplace deaths and occupational health and safety concerns.

Even more subtle than the mining industry’s violations of labor laws is the environmental impact of AI production. Here lies the paradoxical nature of this new wave of technological advancement: Artificial intelligence is expected to help humanity solve the climate problem (among many others), but creating the hardware itself generates incredible amounts of pollution. As we look to the shiny new AI models churned out by Big Tech companies to fix our dying planet, we neglect to acknowledge the underlying hypocrisy that stains the industry.
The process of extracting critical minerals has already put a strain on the climate. Critical minerals, including rare earth elements, are non-renewable materials used in semiconductors. Mining for these resources has irreparably damaged the ecosystems where mines are located. Furthermore, the raw minerals must be refined to strip out unwanted elements, and refineries are responsible for enormous carbon emissions.
But the fault does not lie solely with the governments overseeing these facilities. Global importers like the U.S. and China put immense pressure on countries like Chile, South Africa, and Zimbabwe to export as many critical minerals as possible. In an attempt to maximize exports, many of these countries have turned to producer cartels, modeled on OPEC, to exploit their resources more efficiently. Cartelization brings harms of its own, with uneven production and distribution causing unexpected price spikes and crashes. Cartels rarely check for human rights abuses, and the social consequences are deadly.
Data centers, the facilities that allow AI systems to run, are also responsible for vast amounts of energy consumption and pollution. Generative AI, like ChatGPT, requires enormous energy to sort through huge data sets and respond to prompts quickly. Indeed, generating a single image can use about as much energy as fully charging a smartphone. The more complex a model’s machine learning and deep learning algorithms become, the more energy each use requires. One study suggests that by 2027 the AI sector could consume as much energy annually as an entire country.
These centers require constant fuel and power to stay in service, driving large-scale coal and oil extraction. The areas surrounding data centers are subject to light, sound, and air pollution. Many communities have protested government plans to build data facilities near their towns because of the steep environmental costs. In Chile and Uruguay, for example, local families have protested proposed data center projects that would consume drinking water reserves.
Intellectual
When asked about her moral qualms about using AI for simple tasks, Sonali Campbell said, “I don’t have any. If AI is able to do it arguably better and with less effort, then I shouldn’t be asked to.” This seems to be the general consensus among high school students. Although it feels mildly disingenuous to complete a task with outside help, the clear gain in efficiency outweighs that discomfort. Jonah Bonin ’26 explained, “I try to stray away from anything that would use AI to replace my thinking and learning, like having AI generate writing for me or analyze a text.” He uses programs like ChatGPT to explain difficult problems as a supplement to his learning, but draws the line at anything that feels like it might hinder his own critical thinking.
Generally, students understand the long-term effects of using AI for schoolwork, such as shortened attention spans and a diminished ability to analyze. After researching the topic, however, I learned that there is a more ominous side to what feels like a reasonable tradeoff between efficiency and integrity. ChatGPT is not a free source of labor, ready to follow our every command. The vast data centers from which AI draws its intelligence do not create information out of thin air. Every day, with everything we do, humans create data, and thus do work for AI machines, not the other way around.
Very few researchers have examined the growth of AI through the lens of labor rights, but those who have offer a clear way to think about it. Simply put, the way AI companies make their profit violates labor rights worldwide, because their revenue is generated from nonconsensual human labor. Every time we open our phones, type into our search bars, or even say something in the vicinity of a recording device, tools like cookies and tracking pixels monitor, collect, and store pieces of data. Even after we click away from a website, these tools continue to track our activity online. Some websites offer options to accept or reject cookies, but rarely, if ever, do they explain to the user exactly what the cookies do.
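To make that mechanism concrete, here is a minimal sketch of how a tracking pixel can work. The domain and field names are hypothetical, and real trackers are far more sophisticated, but the basic move is the same: loading an invisible “image” quietly ships your data to a third-party server.

```typescript
// Hypothetical sketch of a tracking pixel (the domain and fields are invented).
function reportVisit(): void {
  // Bundle up details about this visit.
  const details = new URLSearchParams({
    page: window.location.href,    // the page being read
    referrer: document.referrer,   // the page the visitor came from
    cookie: document.cookie,       // the identifier that follows the visitor around
    time: Date.now().toString(),   // when the visit happened
  });

  // Request a 1x1 "image" from the tracker. The picture itself is invisible;
  // the real payload is the query string, which delivers the data.
  const pixel = new Image(1, 1);
  pixel.src = `https://tracker.example.com/pixel.gif?${details.toString()}`;
}

reportVisit();
```

A few lines like these, embedded across thousands of pages, are enough to stitch together a detailed profile of a person who never agreed to be profiled.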
Because of this lack of information about how data is collected and used, the system of interacting with online platforms and giving up personal information is almost completely nonconsensual. One of the first rules of consent taught in middle school is that the party giving consent must be consciously aware of what they are consenting to. In almost every case, data companies violate this rule.
The only way that AI companies generate revenue is through collecting, sorting, and distributing data. The data itself is the resource, or commodity, that is traded to make profit.
In an examination of the modern commodification of data, an important distinction between data and other commodities is that none of the costs of crowdsourcing the data are paid by the capitalist; all of them are borne by society. That means that, excluding the comparatively minor costs of data center maintenance and management, AI corporations pocket essentially 100% of the value as profit. Through a Marxist lens, the rate of exploitation, the ratio of surplus value to the wages paid for the labor that produced it, effectively climbs toward infinity: for every $100 the capitalist makes, the worker is paid nothing.
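For readers who want the arithmetic behind that claim, a rough sketch using the standard Marxist definition (a formula I am supplying for illustration, not one cited in this piece) looks like this:

$$ e = \frac{s}{v}, \qquad \lim_{v \to 0^{+}} e = \infty $$

where $s$ is the surplus value the company captures and $v$ is the wages it pays for the labor that produced it. When data is harvested for free, $v$ is effectively zero, and the rate of exploitation has no upper bound.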
One could read this cycle as a method of generating infinite profit: consumers feel no burden when interacting with technology, and in turn companies make profit without paying any workers. But this data production is far from burden-free. In fact, the average high schooler spends at least 4 hours a day on technology, while the average adult spends 6 hours and 40 minutes. At 4 hours a day, that works out to roughly 1,460 hours a year, the equivalent of about two full months spent in front of a screen. These hours disappear, eating away at our lives and building into days and years spent interacting solely with a screen. If time is money, we are being robbed.
Social media algorithms make the violation worse by constantly adapting “for you” pages to entice users to spend as much time as possible in their applications. Each scroll triggers a hit of dopamine, deceiving us into thinking we are interacting with other humans and keeping us glued to the screen. The time tradeoff for the company is simple: keeping consumers active longer generates more data and more profit. For our lives, however, that time is far more valuable, and the slope is slippery. Without making a single dollar from our own data, we scroll away our lives and allow corporations to exploit our labor.
Where humans once fed, nurtured, and exploited cattle for milk and meat, huge corporations with complex AI systems now exploit sentient human beings for the same incentive: profit. Unless we pick our heads up and reevaluate our situation, each and every one of us is a data cow, grazing our days away without any real awareness of it.
So when a worker decides to use an AI machine to boost their efficiency, or even goes so far as to feel bad for the system they are using, they are operating under many layers of illusion. The labor of unpaid workers, from miners to refinery workers to each and every consumer, was harnessed to generate the very data set that helps that worker finish their task. The AI is not doing free labor. We are.