AI’s climate cost is rising. Glass could help cut it.
New hardware innovations may hold the key to powering AI more efficiently — and more sustainably.
AI’s rapid growth is colliding with the realities of our climate crisis, but advances in chip materials like glass-core substrates offer a promising path forward.
- AI demands are driving up carbon emissions and straining power grids
- Traditional chip materials are reaching their physical limits
- Glass substrates can enable more efficient, powerful, and thermally resilient chips
- Cutting-edge material science provides the path to reducing AI’s energy footprint at scale
On one hand, an unstoppable force. If you haven’t yet felt the push to weave artificial intelligence into the fabric of your daily life, you will soon. According to the Pew Research Center, 34% of U.S. adults say they’ve used ChatGPT, nearly double the share who said so in 2023. And as the world’s biggest tech companies invest billions into AI technologies, infrastructure, and advertising every year, that number will only climb.
On the other hand, an immovable object: planet Earth. “AI has created an unexpected surge in energy demand, and with it, climate‑warming greenhouse gas emissions,” writes Aaron Krol in the Massachusetts Institute of Technology’s Climate Portal. “Addressing this will take more computationally efficient AI models, more energy-efficient data centers, and new clean energy to power it all.”
That is to say, we currently find ourselves caught between the physical limits of our planet and the inevitable rise of AI. Without serious, material intervention, where else does this trajectory lead but to catastrophic collision?
The climate toll of AI
We already know the carbon cost of generative AI is steep. On our current trajectory, the International Energy Agency forecasts that electricity demand from data centers worldwide is “set to more than double by 2030 to around 945 terawatt-hours, slightly more than the entire electricity consumption of Japan today.”
Meanwhile, Microsoft and Google each reported double-digit increases in emissions over the past few years, largely attributing the recent spike to AI and the infrastructure expansion needed to support it.
But why does AI demand so much power in the first place? It comes down to the sheer computational power, and therefore the hardware, required to process enormous volumes of information simultaneously.
“Training these models involves processing massive datasets using power-hungry data centers equipped with GPUs or TPUs, often over weeks or months,” says Dr. Kai Two Feathers Orton, Head of Data Science at AnitaB.org. “It’s like having thousands of powerful computers running 24/7, storing and processing data on a huge scale for months on end.”
Meet Kai Two Feathers Orton, Ph.D.
Kai Two Feathers Orton, Ph.D., is a First Nations (Innuinait, Tłı̨chǫ, Niimíipuu, Nēhiyawak) Canadian-American cybersecurity, data, and AI governance executive, scientist, ethicist, and visual artist. With 20+ years across nonprofits, academia, and industry, she builds ethical, resilient, and inclusive information systems. A trained biophysicist and AI scientist, her work reflects a deep belief that technology, land, and tradition are interconnected – and that sustainability requires honoring all three.

Beyond training, simply using programs like AI-assisted search tools requires massive amounts of computing power too. “Each time you ask generative AI a question, the program is essentially starting from scratch,” explains Charles Yeomans, founder and CEO of Atombeam, a company centered on optimizing the efficiency of AI. “It essentially runs all that data over again and bases new answers on probability – all the while using the same amount of power every single time.”
The result, Charles says, “is a flat energy consumption curve, not growing more efficient over time, regardless of how frequently or repetitively the system is used.”
When unstoppable converges on immovable
AI’s environmental impacts are already being felt. In Santa Clara, California, data centers now consume 60% of the city’s power, pushing household energy bills even higher. Reports from Oregon show data centers consuming more than 25% of one city’s water supply, straining community and agricultural utilities. And in May, diesel backup generators tied to data centers in Virginia caused an estimated $150 million in public health damages due to pollution.
Globally, the United Nations Environment Program predicts that AI-related infrastructure will soon “consume six times more water than Denmark, a country of 6 million residents.”
“There is still much we don’t know about the environmental impact of AI, but some of the data we do have is concerning,” Golestan (Sally) Radwan, Chief Digital Officer of the United Nations Environment Program, wrote in the report. “We need to make sure the net effect of AI on the planet is positive before we deploy the technology at scale.”
Charles echoes the sentiment. “If we don’t change course, if we don’t find better ways to power or run AI, it could soon rival the energy consumption of the world’s largest nations,” he says. “Our ability to generate that much clean, renewable energy just isn’t there yet, so we’ll be forced to rely more on energy sources that pollute: more fossil fuels, more greenhouse gases, and possibly a push to build out nuclear power infrastructure quickly.”
As an Indigenous scholar, Kai brings a unique lens to this conversation: “In healthcare, AI can absolutely save lives – but if it runs on systems that pollute the air or extract finite resources, we’re creating imbalance,” she says. “And that imbalance often hits frontline communities first.”
Fortunately, solutions are emerging.
One that’s already in development is “small modular reactors, or SMRs,” Charles explains. “These are compact nuclear power units that can provide energy to data centers more safely and cleanly than traditional nuclear plants — they’re a big part of the conversation because the scale of AI’s energy use is just so massive.”
Beyond new energy sources, he says, the biggest opportunity lies in making AI itself more efficient. “Because large language models don’t actually learn from the environment or interactions, we can engineer models that learn from their environment, and therefore become more efficient over time,” he says.
In addition to building more efficient models and finding cleaner sources of renewable energy, Kai adds that tech companies should be held accountable for their energy use. “Think of it like a nutrition label,” she says, “but for the tool’s carbon impact.”
But even with more efficient algorithms, AI still requires a massive amount of processing power that pushes traditional computer chips to their physical limits.
For decades, performance gains in computer chips have been driven by silicon technology. It has gotten us to this point, providing the computational power that gave rise to AI in the first place. But as demands increase, these technologies and materials are hitting their physical limits – in thermal management, power efficiency, and interconnect density.
To meet the energy demands of AI without further burdening the climate, more efficient hardware can make a significant contribution. And increasingly, that means moving beyond traditional materials.
“When people hear the word ‘semiconductor,’ they picture a small silicon chip that’s made from a larger silicon wafer,” explains Colin Schmucker, Business Development Manager for SCHOTT Semicon Glass Solutions. To work in a consumer electronic device, those silicon chips need to be packaged.
A semiconductor package, then, protects the silicon chips and provides the electronic circuitry to route signals to and from the device. “The packages include a substrate which, as of today, is typically made of a mixture of dielectric polymer and copper,” Colin says. “The devices used in high performance computing to support AI are starting to require much larger, more complex substrates and packages.”
One leading solution for these demanding applications is glass. As a substrate core material, it offers promising advantages: flatter surfaces, improved thermal performance, and greater dimensional and mechanical stability.
“By inserting a layer of glass – called a glass core – into the normal dielectric and copper layers, substrates can be made larger, flatter, and with more complex structures,” Colin continues. This means glass-core substrates allow for tighter packing of chip components and miniaturization of structures.
In other words, glass packs more computing power into less material, resulting in fewer chips, less material waste, and lower total energy consumption. That kind of efficiency gain doesn’t just make AI faster and cheaper; it makes it more sustainable.
A material path forward
“The power needed to run AI applications is growing exponentially,” explains Colin. “So you're looking at massive investment in energy infrastructure just to maintain computing power – but if glass can save even a fraction of a kilowatt, it will significantly contribute to overall energy reduction for these AI applications, which aren't going anywhere.”
Glass substrates “could make a real difference in how efficient AI systems become,” says Vipul Jain, AI expert and founder of Sprout Innovate. “Better thermal management alone means less heat – and therefore less energy needed for cooling, which is a big part of data center power use.”
Even modest cuts in chip-level power consumption could lead to massive, global energy savings. “If chips were just twice as efficient, the savings could match the electricity consumption of an entire country like Japan — that gives you a sense of the scale we’re talking about,” Vipul continues. “So if this technology scales, it’s not just a hardware win – it’s a step toward real decarbonization at the infrastructure level.”
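For a rough sense of the scale Vipul describes, the IEA figure quoted earlier can be plugged into a back-of-envelope calculation. This is an illustrative sketch, not a rigorous model: the assumption that efficiency gains apply uniformly across the full projected data-center load, and the function name, are ours.

```python
# Back-of-envelope scale check using only figures quoted in this article.
# Assumption (ours, for illustration): chip-level efficiency gains apply
# uniformly across the entire projected data-center electricity load.

PROJECTED_DEMAND_TWH = 945.0  # IEA projection for data centers by 2030
JAPAN_DEMAND_TWH = 945.0      # "slightly more than ... Japan today" (approx.)

def savings_twh(demand_twh: float, efficiency_factor: float) -> float:
    """TWh saved if the same computing work is done at `efficiency_factor`
    times the baseline efficiency (2.0 = chips twice as efficient)."""
    return demand_twh * (1.0 - 1.0 / efficiency_factor)

saved = savings_twh(PROJECTED_DEMAND_TWH, 2.0)
print(f"Doubling chip efficiency saves roughly {saved:.0f} TWh per year, "
      f"about {saved / JAPAN_DEMAND_TWH:.0%} of Japan's annual electricity use")
```

Under these simplifying assumptions, a 2× efficiency gain frees up on the order of half of Japan’s annual electricity consumption – enough to show that the country-scale comparison is at least the right order of magnitude.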
Stepping back, Kai sees the impact of lessening AI’s energy consumption from a more holistic point of view.
“Everything is connected – land, water, people, and technology – but if we keep building AI without considering its energy demands, we risk damaging the very systems that sustain life,” she says. “That's why I believe that in building sustainable AI, we can build a technology that helps people, and the planet, thrive.”
Glass won’t solve the climate crisis alone. But as AI becomes more embedded in daily life, innovations like glass-based packaging offer a tangible path to cut emissions – without curbing progress. And that could make the difference in keeping the unstoppable force from shattering the immovable object.