In an increasingly digital world, artificial intelligence (AI) has been integrated into everything from health care to education, prompting organizations to examine how it affects access, opportunity, and the environment.
The Washington Informer hosted a virtual panel Jan. 14, moderated by the CEO of Black Meta Agency, Howard Jean, focusing on ethical practices and the benefits of AI. The afternoon conversation featured leaders, innovators and advocates, including: Liz Courquet-Lesaulnier, managing editor of Word In Black; Marlon Avery, vice president of applied AI at JP Morgan; Taylor Frazier McCollum, an activist with the Party for Socialism and Liberation (PSL) who has led the public opposition against the proposed Landover data center; Eric Brown Jr., a senior solutions engineer working with generative AI at Microsoft; and Sydney Goitia-Dora, the editor-in-chief at Howard University’s newspaper The Hilltop.
“I think that as a community, it’s important to be able to hold two truths,” Brown Jr. said during the conversation. “This is definitely a tool that can change generations, but at the same time, if not intentional about it, it can widen the gaps that already exist.”
The term “artificial intelligence” was coined by John McCarthy in 1955 in a proposal for a workshop exploring the potential development of machines that could mimic human intelligence. While advancements followed in the medical, robotics and automotive sectors, technologies such as AI assistants went mainstream in the 2010s, especially after Apple launched Siri with the iPhone 4S in 2011.

As the tool became more normalized in day-to-day operations, text-to-image technologies erupted, catalyzing a boom in generative AI: large language models (LLMs), which power chatbots like ChatGPT, and text-to-video systems like Sora. LLMs are trained on enormous datasets and designed to understand and generate human-like text. Because training them requires thousands of graphics processing units (GPUs), they consume vast amounts of energy.
Although AI is capable of driving positive change in science, economics, entertainment and more, such extensive innovation necessitates the construction of hyperscale data centers, which guzzle resources and come with concerning environmental implications if not properly regulated or thoroughly studied.
“Studies haven’t been done for the long-term impacts of data centers being in our communities, so it’s [teetering] on how you ethically expand AI,” Frazier McCollum said. “I just think we need to wait for all sides of the coin. I think we can’t just say, ‘let’s go forward with these hyperscale data centers because it’s gonna make us trillions of dollars.’”
AI’s High Energy Demand
According to Stanford University’s 2025 AI Index Report, generative AI garnered $33.9 billion in private investment worldwide in 2024, an 18.7% increase over 2023. The Brookings Institution reported that in 2023, data centers consumed approximately 4.4% of the United States’ electricity, and AI is expected to account for nearly 21% of the world’s electricity demand by 2030.
Energy production and consumption are major drivers of climate change, responsible for 75% of greenhouse gas emissions, and AI systems alone may have produced between 32.6 million and 79.7 million tons of carbon dioxide in 2025, according to research published in December 2025.
Such heavy resource demand, which in turn drives up utility bills, has made many people wary of AI and the data centers that come with it.
“It’s consistently a split of people wanting to learn it and integrate it, and you have people who are… standoffish with everything,” Avery said.
He went on to liken these systems to a treadmill for the AI-centric workforce he predicts the world will soon live and work in: the beauty of a treadmill, and by extension of artificial intelligence, is that users can adjust the speed and pace themselves.
While consumers can use AI at the speed and complexity of their choosing, experts note that system developers, and the officials who approve and construct the data centers these systems require, should pace themselves as well, weighing all the implications such technology brings to society and the planet.
“AI on its own is not going to close the equity gap,” Brown Jr. said. “It really takes intentional design, and without that intentional design in people, human intelligence being able to advocate, AI can scale inequities that already exist.”
Ensuring Responsible AI Usage and Data Center Development
Since July 2024, when the company xAI installed a data center in South Memphis, cases of asthma and respiratory illness have increased due to excessive pollution.
Boxtown, where the facility sits, was also once home to the Allen Fossil Plant, which left behind large quantities of toxic coal ash after its 2018 demolition, continuing to degrade the region’s already vulnerable air quality.
If data centers continue to be built in vulnerable communities without regulations safeguarding the environment, the consequences could be detrimental to public health. According to a review of LLMs’ energy demand, a single ChatGPT query requires 25 times more energy than a Google search; if every Google search used generative AI, the resulting annual electricity consumption would rival Ireland’s.
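The Ireland comparison can be sanity-checked with back-of-envelope arithmetic. The sketch below uses the 25x figure from the review cited above, plus two outside assumptions that are rough round numbers, not figures from the article: roughly 9 billion Google searches per day, and about 0.3 watt-hours per conventional search.

```python
# Back-of-envelope check of the "Ireland-scale" claim.
# Assumptions (NOT from the article, hypothetical round numbers):
#   - ~9 billion Google searches per day
#   - ~0.3 Wh consumed by a conventional Google search
# From the review cited in the article:
#   - a generative-AI query uses ~25x the energy of a standard search

searches_per_day = 9e9           # assumed global daily search volume
wh_per_google_search = 0.3       # assumed energy per standard search, in Wh
ai_multiplier = 25               # figure reported in the article

wh_per_ai_search = wh_per_google_search * ai_multiplier   # 7.5 Wh per query
annual_twh = searches_per_day * wh_per_ai_search * 365 / 1e12

# ~24.6 TWh/year, the same order of magnitude as Ireland's
# annual electricity consumption (roughly 30 TWh)
print(f"{annual_twh:.1f} TWh per year")
```

Under these assumptions the total lands in the mid-20s of terawatt-hours per year, which is indeed comparable to a small country’s entire electricity use.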
“I think there’s ways that we can use it and to make our jobs easier, but also, if I know that this could potentially kill my people because the data centers are being put in Black communities or close by, do I really need to use this to write an email to somebody?” Courquet-Lesaulnier said during the discussion. “Whatever it is, we have to remember that people have to be involved.”
Goitia-Dora believes that if people are thoroughly educated on how to ethically use AI, then there’s a benefit in taking advantage of it as a tool. She was once skeptical of using it, but after accepting that artificial intelligence is unavoidable in this day and age, she became determined to learn more to try and understand how to use it responsibly.
The student journalist believes that people’s reliance on AI to think critically for them is one of the most dangerous aspects of the technology, and she would like certifications or tests put in place to determine whether a person can access artificial intelligence systems.
“So it is kind of an individual choice, but as a community, we need to be doing our part in terms of educating people, researching people who are at the forefront of AI so that no one is left in the dark,” Goitia-Dora said. “I think just ignoring it and saying this is something bad isn’t going to be the solution.”
A handful of leaders across the nation believe that more extensive research looking into the impacts of AI and data centers is crucial before moving forward with development.
States including California, Maryland and New Jersey have passed legislation requiring studies of data centers’ impacts on energy costs, the environment and the economy. Access to that information could help foster more responsible AI use and data center development.
Frazier McCollum intends to continue the fight alongside other residents in Prince George’s County against the proposed Landover data center, pushing for equitable and effective development throughout the region.
“We only have one planet,” she told The Informer, “so we really have to cherish what we have.”

