Journalism offers an environment where people can experiment with their voices, share their unique thoughts on the world, and inform their audiences. It is meant to be an outlet for discovering and sharing truths about the world, and thus, it is vital to a thriving society. It is a constant source of knowledge, coming directly from the minds and research of brilliant individuals who dedicate their lives to informing the public. Writing techniques vary among different reporters, making journalistic writing that much more fascinating and unique. At its core, it is authentically human.
Recently, Artificial Intelligence has permeated every sphere of life. It is quick, accessible, and easy to use for a multitude of tasks. While it can be helpful for completing tasks in everyday life, it does raise ethical concerns, specifically in the field of journalism.
With the introduction of AI, even well-known newspapers such as The New York Times have begun to incorporate AI into their work. While this is generally done to improve efficiency, it is undoubtedly a topic of controversy as we watch our world become dominated by AI. Unfortunately, much of the authenticity that is so prominent in journalism is jeopardized when AI comes into the picture.
Ms. Nicole Scotto, an A.P. European History teacher at Bronx Science, shared her thoughts on the subject. “The authentic human voice is key. I think that for journalism, or any writing, whether it’s an article or a work of fiction, the human voice just isn’t going to be replicated by AI. I think that it’s important to have that human voice in your writing, especially in journalism,” Ms. Scotto said.
AI-generated writing is generic, formulaic, and mundane – the exact opposite of what quality journalism entails. AI often follows a rigid structure that leaves very little room for creativity. Slowly, we could begin to see a decline in the quality of journalistic pieces as AI becomes more integrated into our daily lives.
While AI has its drawbacks in the journalistic environment, it can also provide immense benefits if used in the correct manner. The New York Times released a brief statement transparently outlining the ways in which it has used AI. The technology has been used to sift through data for investigative reporting, curate individualized article recommendations for readers, draft headlines and summaries of Times articles, and power automated voice technology so that people can access the news in many different languages.
Furthermore, the University of North Carolina at Chapel Hill reported that AI has been used to generate news articles on routine topics such as sports scores, financial reports, and weather updates, giving journalists the time and space to focus on more complex and investigative stories. This development is controversial, as it could set a precedent for ever-greater AI intervention in daily news coverage. However, it is also beneficial because it allows journalists to focus their energy on producing more interesting pieces on current events and front-page stories that garner more attention.
“You could spend your time drafting a more engaging story, rather than writing a small paragraph about the weather. It might even allow people to be more creative because it gives one the space and time to do so,” Ms. Scotto said.
Overall, AI has the potential to enhance the efficiency, accuracy, and reach of journalism, while also presenting new challenges and ethical considerations for the industry.
Despite these benefits, another issue that the implementation of AI raises is the lack of accountability. The Center for News, Technology, and Innovation (CNTI) details some of the key issues with this technology. AI is notorious for producing inaccurate information and presenting it as the truth. If we continue to allow for the integration of AI in journalism, these discrepancies will inevitably undermine public trust.
Similarly, AI could produce offensive or insensitive content that does not sit well with the public, leading to backlash. This could include the reinforcement of stereotypes that marginalize certain groups of people, which is clearly problematic.
AI also opens the door to infringement of journalists' copyright in their original work. After all, AI essentially scans the internet at a rapid pace and compiles information based on what already exists; clearly, there is a risk of copyright infringement. If this goes unchecked, it could lead to major legal repercussions for the parties involved. Large sums of money and jobs would be at stake.
For example, in December of 2023, The New York Times sued OpenAI and Microsoft for copyright infringement. Millions of articles published by The Times were used to train automated chatbots, which then reproduced full portions of these articles verbatim for users. OpenAI has now become a threat to well-established news sources, as people are beginning to turn to generative AI technology for information on current events. The Times was furious to find that its very own content was being relayed to the public by its new competition.
This raises a question: who is to be held responsible for AI's mistakes? In the case that AI infringes on a journalist's copyright, who should be held responsible – the people behind the AI itself or the industry that used the AI?
A couple of Bronx Science students shared their thoughts on this question. “I think that the industry that uses AI should be held responsible because when you’re using an AI generator, you should be aware that it’s taking information from other sources and only producing what it’s fed,” said Emily Appelbaum ’27.
“I think the industry that used the AI should be held accountable, because if we held every innovator accountable for their work, then people would be disincentivized from creating anything. AI has harms and benefits, and if a company chooses to use an AI system in a harmful way, that responsibility should not fall on the creator,” said Frances Auth ’26.
“Overall, I think it’s a slippery slope to blame the creator of something once they introduce it to the world. I would say that the industry is responsible because once something as volatile as AI is introduced into the world, an event that was bound to happen, whether the particular creators were the ones who did it or not, it is the world’s responsibility to determine what to make of it,” said Ruby Lahana ’27.
“I would say that it’s the fault of the people behind the AI itself because it’s their job to code that the AI will use a summary of the information available on the internet and not just directly copy from one source. When a user uses AI, they are assuming that the AI isn’t plagiarizing and that it’s just summarizing online information. However, copying and pasting from any form of AI is its own form of plagiarism, so while it’s not technically the user’s fault, they still partook in plagiarism and cannot claim that work as their own. Also, the user shouldn’t just assume that the AI isn’t plagiarizing so technically they are both at fault and should both be held liable,” said Alena Egan ’27.
At Bronx Science, every article is written and edited entirely by students. As the use of AI is highly discouraged, we develop our researching, analyzing, and writing skills without the help of generative AI technology. Each day, we have 40 minutes of dedicated class time to work on our articles, peer-edit, and produce high-quality journalism. In a high school newspaper setting, it is especially important for students to develop writing skills and practice applying them, which is why the use of AI is highly frowned upon.
It is clear that the use of Artificial Intelligence in journalism has its benefits and drawbacks. While AI can free up time for journalists to dedicate to producing substantive pieces, it can also take away from the authenticity of journalistic writing, while creating issues with accuracy and accountability. Its effect ultimately depends on how it is used and the extent to which it is relied on. "Again, it really comes down to how people are using it," said Ms. Scotto.