Using AI to Assess Veterans’ Exposure to Harmful Forever Chemicals
By: Carrie McDonough and Olexandr Isayev
Service members and first responders are disproportionately exposed to low levels of harmful chemicals, such as per- and polyfluoroalkyl substances (PFASs), throughout their careers. CMU has developed new techniques to rapidly identify these chemicals and predict their long-term consequences, with enough time to mitigate the harm.
Why it matters: More than 700 U.S. military bases have confirmed or suspected PFAS contamination in their groundwater or drinking water. PFASs are known as “forever chemicals” because of their persistence, and they can accumulate in the human body over time. Coming investments in energy infrastructure, such as lithium-ion batteries, may exacerbate PFAS exposure.
- Long-term exposure can have significant health consequences later in life, including higher risk of testicular and kidney cancer. There are thousands of “forever chemicals,” and most aren’t screened for by available commercial and clinical blood tests, so they go undetected until the harm is irreversible.
- PFASs are very useful for many applications in energy technologies, including their use as electrolytes in lithium-ion batteries, so we expect their production and usage to increase as energy demands grow.
Catch up quick: Thousands of chemical substances are synthesized for many different applications, including food additives, pesticides, dyes, plastic additives, cosmetics and fragrances, industrial products, and medicines. But the pace of invention and production far outpaces safety assessment for these chemicals.
- This is particularly worrisome for military service members and veterans, who have been exposed to various chemicals at low levels continuously throughout their careers, including hundreds of forever chemicals in aqueous film-forming foams (AFFFs), turnout gear, and other equipment.
- Veterans and military service members are at an increased risk for certain cancers, likely due to chemical exposures during their service. We know very little about how to assess the risks of these exposures with respect to long-term health outcomes.
- Rapid, high-throughput, and predictive methods are urgently needed to identify PFASs that persist in the blood of service members for years after exposure.
What we’re doing: We are pairing AI techniques with laboratory experimentation and advanced mass spectrometry instrumentation to better understand how these chemicals persist and accumulate in the human body over time, and to predict the risks of chemicals based only on their structure (a minimal sketch of such structure-based screening follows the list below).
- This work will lead to tools that will inform improved protective measures going forward.
- These tools will spur the development of more sustainable chemicals and inform the next generation of synthetic chemists, helping to integrate considerations of sustainability into everything we do in the CMU Department of Chemistry.
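To make the structure-based screening idea concrete, here is a minimal sketch of how likely-accumulative PFASs might be flagged from structure alone. It assumes a small labeled training set and uses off-the-shelf molecular fingerprints with a random-forest classifier; the molecules, labels, descriptor choice, and model are illustrative placeholders, not the actual CMU pipeline.

```python
# Minimal, illustrative sketch: flagging likely-accumulative PFASs from structure alone.
# Everything here (training molecules, labels, descriptor and model choices) is a
# placeholder assumption, not the CMU group's actual method.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem, DataStructs
from sklearn.ensemble import RandomForestClassifier

def fingerprint(smiles: str) -> np.ndarray:
    """Encode a molecule as a 2048-bit Morgan (circular) fingerprint."""
    mol = Chem.MolFromSmiles(smiles)
    bits = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=2048)
    arr = np.zeros(2048)
    DataStructs.ConvertToNumpyArray(bits, arr)
    return arr

# Hypothetical training set: 1 = persists in serum, 0 = cleared relatively quickly.
train = [
    ("OC(=O)C(F)(F)C(F)(F)C(F)(F)C(F)(F)C(F)(F)C(F)(F)C(F)(F)F", 1),  # long-chain acid
    ("OC(=O)C(F)(F)C(F)(F)F", 0),                                      # short-chain analog
]
X = np.vstack([fingerprint(s) for s, _ in train])
y = [label for _, label in train]
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Score a newly detected, unstudied structure before any blood data exist.
candidate = "OC(=O)C(F)(F)C(F)(F)C(F)(F)C(F)(F)C(F)(F)F"
print(model.predict_proba(fingerprint(candidate).reshape(1, -1)))
```

In practice, the labels for such a model would come from the kinds of laboratory and animal-model measurements described below, which is what makes the experimental and AI work complementary.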
Key insight: Traditional models used for chemical risk assessment don’t work for “forever chemicals.” In our laboratory, we use chromatographic techniques to separate PFASs from complex mixtures based on their affinities for biological molecules like phospholipids and serum albumin, which lets us rapidly learn which of these chemicals are likely to remain in the body long term. Thus far, our work with animal models has demonstrated that in some cases, PFASs that go undetected in the original complex mixture only become discoverable after they accumulate in the body; these chemicals wouldn’t be detected or studied if we didn’t have ways to pick them out and identify them as accumulative.
Policy takeaways: Our work could help pave the way for more rapid risk assessment so that chemical regulations can be shaped on a shorter timescale. Building on this study, the potential for accumulation and persistence in the body could in the future be assessed for a much wider range of new, poorly understood “forever chemicals.”
What’s next: There are many other jobs where chemical exposures are of concern. In addition to assessing chemical exposures among military service members, we are particularly concerned with understanding chemical usage and potential exposures in semiconductor and lithium-ion battery production.
Building Public Trust: Developing a Framework for Measuring and Reporting the Impacts of AI
By: Emma Strubell and Tamara Kneese
AI is changing our lives — from our health care to the way we work. But as AI becomes more capable and integrated into the U.S. economy, its growing demand for resources, such as energy, water, land, and raw materials, is driving significant economic and environmental costs.
Why it matters: Resource consumption associated with AI infrastructure is expanding quickly, with negative impacts that include asthma from air pollution associated with diesel backup generators, noise pollution, light pollution, excessive water and land use, and financial burdens on ratepayers. Unless policymakers have full visibility into the range of AI’s impacts and the public can participate in minimizing these risks, we risk losing the public’s trust and slowing the advancement of AI.
The big issue: To create effective, sustainable resource policies, policymakers need access to clear measurements of AI’s computing-related, application-level, and system-level impacts and their negative outcomes. There have been regulatory and technical attempts to develop scientific consensus and international standards around measuring AI’s environmental impacts, but a more holistic picture also needs to include impacts from AI’s applications (such as applying AI to oil and gas drilling) and AI’s system-level impacts: broader social and economic effects such as health impacts on local communities from increased air pollution. These are challenging to measure and predict.
The way forward: Effective policy recommendations require more standardized measurement practices, including corporate transparency and innovation around technical ways to collect and report data, and engagement from multiple stakeholders to meet the needs of states, local government offices and communities. There is also a need to balance the potential costs and benefits of AI data centers and related energy infrastructures, to reduce state and local opposition.
Carnegie Mellon and Data & Society researchers have developed a plan of action with a series of recommendations to address these issues:
- Develop a database of AI uses and a framework for reporting AI’s immediate applications in order to understand the drivers of environmental impacts.
- The National Institute of Standards and Technology (NIST) should create an independent consortium to develop a system-level evaluation framework for AI’s environmental impacts, while embedding robust public participation in every stage of the work.
- Mandate regular measurement and reporting on relevant metrics by data center operators.
- Incorporate measurements of social cost into AI energy and infrastructure forecasting and planning.
- Transparently use federal, state, and local incentive programs to reward data-center projects that deliver concrete community benefits.
The bottom line: Data centers have an undeniable impact on energy infrastructures and the communities living close to them. This impact will continue to grow alongside AI infrastructure investment, which is expected to skyrocket. It is possible to shape a future where AI infrastructure can be developed sustainably, and in a way that responds to the needs of local communities. But more work is needed to collect the necessary data to inform government decision-making. The U.S. needs to adopt a framework for holistically evaluating the potential costs and benefits of AI data centers and shaping AI infrastructure buildout based on those tradeoffs. That framework includes establishing standards for measuring and reporting AI’s impacts, eliciting public participation from impacted communities, and putting gathered data into action to enable sustainable AI development.
Go deeper: Read the full memo published by the Federation of American Scientists.
Powering Environmentally Sustainable AI
By: Nicholas Z. Muller and Valerie J. Karplus
Without policy, sharp increases in electricity generation driven by AI could have detrimental effects on public health.
Why it matters: The air pollution produced by fossil fuel-powered electricity generation exacerbates a range of adverse health outcomes, including premature mortality. These effects are most acutely borne by the elderly, the very young, and people with pre-existing health stressors.
Catch up quick: Over the past 10 to 15 years, numerous coal-fired power plants have retired in the U.S. Research shows these plant closures have produced major reductions in both air pollution and greenhouse gas (GHG) emissions.
- Financial and insurance markets reveal additional resilience costs associated with GHG emissions and, by extension, continued reliance on fossil fuels to produce electricity.
- For example, rising costs from disasters including floods, wildfires, and hurricanes have resulted in dramatic increases in property insurance premiums in high-risk markets. Corporate insurers have also cancelled policies due to changes in the risk distribution. The result of these conditions has been a sharp increase in the risks faced by insured consumers and firms.
The main issue: Given the societal costs associated with high-emitting generation, expanding clean electricity supply to meet data center electricity needs while increasing system efficiency is a central priority.
- Renewable energy, paired with a modest amount of clean firm electric power and energy storage, is likely to play an important role here.
- Incentives that encourage electrification in a manner coordinated with regional generation and transmission investments, so as to avoid imbalances and system instabilities, will be important to steady progress.
The jobs factor: Renewable electricity has also been a significant source of employment in the U.S.
- Recent data clearly suggest that the vast majority of new jobs in the electric power generation industries were due to deployment of clean energy technologies.
- And the total health benefits that come from retiring coal-fired power plants are many times greater than the lost salaries and wages resulting from the retirements.
Yes, but: Although national trends are encouraging, it is nevertheless essential to recognize and account for regional impacts.
- Renewable energy may not always emerge in places where fossil energy is in decline, and those who reap benefits may be different from those who face costs.
- In regions where the jobs impacts are most acute, providing targeted support for workers to transition to occupations with adjacent skills and commensurate pay is especially important.
Policy takeaways: Nuclear power, geothermal and electricity produced with renewables like wind and solar coupled with energy storage impose far fewer environmental and social costs than either natural gas or coal. And these sources of clean energy may provide important local economic stimulus and job creation, although targeted efforts to address impacts by region will be essential.
The bottom line: Carnegie Mellon University research shows that how the U.S. meets new load growth for AI will determine whether air quality and the climate are affected. Relying primarily on new natural gas (or even more so, reviving shuttered coal plants) will degrade local and regional air quality and public health.
Building the Robust Transmission Capacity Necessary to Power America
The U.S. must upgrade its transmission system to meet America’s surging demand for more energy.
Why it matters: Transmission moves electric power from the locations where it is produced to the locations where it is needed — and the U.S. urgently needs more transmission capacity.
- The Department of Energy predicts the country will need to more than double high-voltage transmission capacity over the next several decades.
- At the same time, the construction of new long-distance transmission has stalled.
- Breaking the logjam that often makes it impossible to build new transmission capacity — without placing an unacceptable burden on consumers’ electric bills — is a major challenge.
Demand for electricity is growing again: Electricity consumption in the U.S. stopped growing in about 2007, but data centers, AI, heat pumps, EVs, the electrification of industry and other factors are driving new growth in electricity demand and the need for additional transmission capacity.
While building new conventional high-voltage lines is important, it is not the only way to expand the capacity of the U.S. transmission system.
- “Low sag conductors,” which allow lines to carry more power, can be used to expand the capacity of existing transmission corridors.
- The use of nontraditional corridors — such as building overhead or underground lines along highways and railroads — would help avoid the many problems of creating new rights of way.
- Recent developments in high-voltage direct current (DC) technology are making long-distance buried cables possible, in both traditional and nontraditional corridors, including cables buried under lakes and rivers. Systems like these will be more expensive, but the extra cost may be worthwhile if such lines can be installed with far less public opposition.
Who will pay: Building any kind of new transmission will be expensive. As the U.S. meets the growing need for new transmission, it is important to make sure that those who benefit cover the costs. Otherwise, there is a risk that regular homeowners and other traditional electricity customers will face higher electric bills.
- All electricity customers should pay their fair share of additions to transmission that serve the electricity and reliability needs of everyone. However, the cost of large new infrastructure that will serve the needs of just a few should not be spread across ordinary rate payers.
- Most high-voltage transmission lines are owned by traditional, regulated investor-owned utilities (IOUs). The incentives these companies face from state Public Utility Commissions (PUCs) often encourage them to invest in local transmission upgrades rather than expanding the capacity of needed regional and interregional transmission.
- For some urgently needed interconnections, independent transmission developers may be able to avoid some regulatory obstacles and arguments about cost allocation. Such companies arrange power purchase agreements for those who need the power to use their lines.
- In some cases, transmission costs can be avoided if new loads like data centers can be located next to generation, such as Microsoft’s plans to restart the undamaged reactor at Three Mile Island near Harrisburg, Pennsylvania.
What we’re doing at CMU: Carnegie Mellon University is at the forefront of a range of research on the future of the power grid.
- CMU’s Electricity Industry Center provides insights for industry, including Professor Jay Apt’s work on how very hot or cold weather can affect the reliability of the power system and Professor Ramteen Sioshansi’s studies on how to use storage to make the grid more reliable.
- A team led by Larry Pileggi, head of CMU’s Department of Electrical and Computer Engineering, used their expertise in design automation of very large integrated circuits to build a platform called SUGAR that has become a leading tool for the analysis and optimization of power systems.
- FERC Commissioner David Rosner praised SUGAR in a letter highlighting interconnection automation software platforms, writing “One application reproduced the manual study of a large interconnection cluster — which took nearly two years to complete — in just 10 days and arrived at largely similar results.”
- SUGAR led to the creation of the award-winning startup Pearl Street Technologies.
- Granger Morgan, former head of CMU’s Department of Engineering and Public Policy and a member of the U.S. National Academy of Sciences (NAS), recently led a NAS session on the need for expanded transmission. He has also chaired three major consensus studies on the power system: Terrorism and the Electric Power Delivery System, Enhancing the Resilience of the Nation’s Electricity System, and The Future of Electric Power in the United States.
What’s next: CMU faculty are leading a new multi-institutional, interdisciplinary consortium that is focused on addressing the challenges of expanding transmission capacity, including:
- Assessing legal, regulatory, institutional and political obstacles to expanding capacity.
- Examining the sources of public resistance to new transmission and improving public understanding of the need for more capacity.
- Proposing the policy, legal and regulatory changes necessary for expanded capacity.
The group has received initial support from the Alfred P. Sloan Foundation and is actively seeking additional resources. Beyond Carnegie Mellon, other members of the consortium include investigators at Carleton University, Pacific Northwest National Laboratory, Pennsylvania State University, the University of California San Diego (UCSD), and the University of Southern California (USC).
The bottom line: The U.S. urgently needs to expand transmission capacity to meet our growing demand for affordable and reliable power. Work being done at Carnegie Mellon University is charting a course for expanded transmission to benefit all.
‘AI Fast Lanes’ for an Electricity System to Meet the AI Moment
By: Costa Samaras
Targeted policies to create “AI Fast Lanes” will ensure that the development of new AI and data center infrastructure does not increase costs for consumers, harm the environment, or exacerbate existing energy burdens.
Why it matters: Absent policy action, consumers and the environment could bear the brunt of adding new power generation to the nation’s electricity grid.
Catch up quick: In many regions of the U.S., one of the biggest bottlenecks in adding capacity to the electricity grid is the number of studies that must be completed before a new resource can connect. But the Electric Reliability Council of Texas (ERCOT) uses a “connect and manage” interconnection process that results in faster interconnections. A similar nationwide fast lane for connecting to the grid can address the bottleneck felt across the U.S.
What we’re doing: A new “Smart AI Fast Lanes” framework would expand the spirit of the Texas “connect and manage” approach nationwide for data centers and clean energy. This framework also includes investment and innovation prizes to speed up the interconnection process, ensure grid reliability, and lower costs for communities.
Data center providers would work with the Department of Energy, the Foundation for Energy Security and Innovation (FESI), the Department of Commerce, National Laboratories, state energy offices, utilities, and the Department of Defense to speed up interconnection queues, spur innovation in efficiency, and reinvest in infrastructure to increase energy security and lower costs.
- FESI is an independent, nonprofit, agency-related foundation that was created by Congress to help the Department of Energy achieve its mission and accelerate the development and commercialization of critical energy technologies.
- FESI leading a Smart AI Fast Lanes initiative could be a force multiplier to enable rapid deployment of clean AI compute capabilities that are good for communities, companies, and national security.
How it works: For any proposed data center investment of more than 250 megawatts, companies could apply to work with FESI.
- Successful applications would leverage public, private, and philanthropic funds and technical assistance.
- Projects would be required to increase clean energy supplies, achieve world-leading data center energy efficiency, invest in transmission and distribution infrastructure, and/or deploy virtual power plants for grid flexibility.
Policy takeaways: Recommendations to implement the Smart AI Fast Lane initiative include:
- Fees to speed connection: New large AI data center loads would pay a fee to connect to the grid without first completing lengthy pre-connection cost studies. Those payments would go into a fund, managed and overseen by FESI, that would be used to cover any infrastructure costs incurred by regional grids for the first three years after project completion.
- ‘Bring your own power’ prize: FESI could award prize funding for locally generated clean electricity that covers twice the data center’s annual consumption.
- Efficiency innovation awards: FESI could award prizes for meeting targets for how much AI computing work is delivered per unit of energy and water used. This would create incentives for both innovation and transparency in efficiency.
- Award transmission upgrades: Create prizes for deployment of reconductoring, transmission, or grid-enhancing technologies to increase capacity, and prizes for upgrading distribution infrastructure beyond the project’s own needs. This will reduce future electricity rate cases, which will help keep electricity costs affordable.
- Flexibility prizes: Prizes that reward flexibility and end-use efficiency investments, including for:
- Data centers that demonstrate best-in-class flexibility through smart controls and operational improvements.
- Utilities hosting data centers that reduce summer and winter peak loads in the local service territory.
- Utilities that meet home weatherization targets and deploy virtual power plants.
The bottom line: The U.S. is facing the risk of electricity demand outstripping supplies in many parts of the country, which would be severely detrimental to people’s lives, to the economy, to the environment, and to national security. “Smart AI Fast Lanes” and FESI-run prizes will enable U.S. competitiveness in AI, keep energy costs affordable, reduce pollution, and prepare the country for new opportunities.
Go deeper: Read the policy brief “Speed Grid Connection Using ‘Smart AI Fast Lanes’ and Competitive Prizes” and the Guest Essay “AI’s energy impact is still small—but how we handle it is huge”.
Sustaining AI Growth Requires Energy- and Carbon-Efficient Computing Infrastructure
AI's growing energy consumption could destabilize the grid and undermine climate goals unless we fundamentally shift from optimizing only for performance to also optimizing for energy and carbon efficiency.
By: Yuvraj Agarwal
America stands at a crossroads. Artificial intelligence, particularly generative AI models, is driving unprecedented economic growth, reshaping industries from health care to manufacturing. Yet this digital revolution comes with a hidden cost: massive energy consumption that threatens to stress our energy grid and also undermine our climate goals.
Why it matters: With AI model training consuming as much electricity as small countries and data centers projected to potentially account for 8% of global energy use by 2030, we face a fundamental challenge — how do we sustain AI's transformative benefits while reducing its impact on the grid and its operational and embodied carbon emissions?
Catch up quick: AI's carbon impact comes from three sources:
- Embodied carbon from manufacturing computing hardware.
- Energy use and carbon emissions for developing and training AI models.
- Energy use and carbon emissions during inference, when running those models for AI tasks.
For example, a smartphone's embodied carbon represents 80% of its lifecycle emissions, while data center servers balance operational and embodied carbon more evenly. Yet the AI industry largely operates in the dark — companies rarely disclose the energy consumed training their latest models or the energy usage and the carbon footprint (which depends on the source of the energy) of running them millions of times daily for inference tasks.
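To make those three buckets concrete, here is an illustrative back-of-the-envelope accounting in the spirit of the breakdown above. Every number is an assumed placeholder, not a measurement.

```python
# Illustrative lifecycle accounting for a hypothetical AI model: embodied + training + inference.
# All values below are assumptions chosen for illustration only.
embodied_kgco2 = 1_500_000           # manufacturing the servers/accelerators used (assumed)
training_mwh = 1_300                 # electricity for one training run (assumed)
inference_kwh_per_query = 0.003      # energy per served request (assumed)
queries_per_day = 50_000_000         # daily inference volume (assumed)
grid_kgco2_per_kwh = 0.4             # carbon intensity of the supplying grid (assumed)

training_kgco2 = training_mwh * 1_000 * grid_kgco2_per_kwh
inference_kgco2_per_year = (inference_kwh_per_query * queries_per_day * 365
                            * grid_kgco2_per_kwh)

print(f"training:  {training_kgco2:,.0f} kg CO2 (one-time)")
print(f"inference: {inference_kgco2_per_year:,.0f} kg CO2 per year")
print(f"first-year total incl. embodied: "
      f"{embodied_kgco2 + training_kgco2 + inference_kgco2_per_year:,.0f} kg CO2")
```

Even with made-up numbers, the exercise shows why inference at scale, and the grid mix behind it, can dominate the ledger; that is exactly the kind of visibility the transparency measures discussed below would provide.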
The big question: This opacity prevents both competition on energy efficiency and informed decision making. Without carbon and energy transparency, organizations can't choose between a highly accurate but energy-hungry model versus a slightly less accurate but dramatically more efficient alternative.
- The same goes for choosing different types of hardware, with different levels of energy efficiency, to run these workloads.
- Furthermore, there is currently a lack of data on the source of the energy (e.g., renewable or otherwise) used by each data center operator at fine time granularities, which prevents carbon-aware scheduling of these AI workloads across different geographic locations.
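If that kind of fine-grained data were available, carbon-aware scheduling is straightforward to sketch. The example below assumes hypothetical hourly carbon-intensity forecasts per region; a real scheduler would also weigh latency, data residency, and available capacity.

```python
# Sketch of carbon-aware placement for a deferrable AI job, assuming per-region hourly
# carbon-intensity forecasts (hypothetical numbers, in kg CO2 per kWh).
job_energy_kwh = 800  # energy the job is expected to draw (assumed)

forecasts = {
    "us-east":  [(9, 0.42), (10, 0.40), (11, 0.38)],
    "us-west":  [(9, 0.21), (10, 0.19), (11, 0.24)],
    "eu-north": [(9, 0.05), (10, 0.06), (11, 0.07)],
}

# Pick the (region, hour) with the lowest forecast intensity.
region, hour, intensity = min(
    ((r, h, ci) for r, series in forecasts.items() for h, ci in series),
    key=lambda option: option[2],
)
print(f"run in {region} at hour {hour}: ~{job_energy_kwh * intensity:.0f} kg CO2")
```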
Can we create transparency and optimization across the entire AI lifecycle — from manufacturing chips to training models to running inference — while driving innovation in both software and AI model efficiency and hardware design? The critical question isn't just whether we can shift computing to clean energy, but whether we can fundamentally reduce the energy and carbon footprint of AI development and deployment through better visibility, more efficient models, and more efficient hardware.
Policy takeaways: Policymakers have several immediate opportunities to accelerate the development and the use of energy and carbon-efficient AI infrastructure:
- Transparency mandates: Require disclosure of energy consumption and carbon emissions for training large AI models, similar to automotive fuel economy standards. Mandate embodied carbon reporting for computing hardware, from chips to servers. Make energy usage and carbon-usage metrics available to customers of cloud infrastructure for their workloads for decision making.
- Efficiency standards: Establish energy and carbon efficiency benchmarks for AI models, moving beyond accuracy-only metrics. Create energy efficiency requirements for data centers and cloud AI services.
- Research investment: Fund development of energy-efficient AI hardware, architectures, models and algorithms. Support research into carbon-aware computing systems that automatically optimize for clean energy availability jointly with performance metrics (e.g., accuracy) and energy usage. While investments in advanced manufacturing facilities are important, significant sustained investment is also needed to train graduate students and support research labs that work on semiconductors and AI systems to drive U.S.-led innovation.
- Procurement leadership: Federal agencies should prioritize energy and carbon-efficient AI services in government contracts, creating market demand for transparent, energy and carbon efficient alternatives.
- Innovation incentives: R&D investments for energy-efficient AI hardware and model development. Support creation of standardized carbon accounting tools for the AI industry.
The bottom line: True AI sustainability requires visibility and optimization across the entire lifecycle — from chip manufacturing to model training to daily inference. By mandating transparency and creating incentives for efficiency, policymakers can drive a virtuous cycle where competition on carbon performance spurs innovation in both AI algorithms and computing hardware.
Go deeper: CMU faculty are collaborating with several other universities to develop new technologies that better show and track carbon footprints, use AI to further energy flexibility and security, and ultimately create software and systems to reduce energy use and emissions. They are focusing especially on coupled societal infrastructures, including computing (e.g., data centers), transportation, buildings and the energy grid.
Accelerating Safe Microreactor Deployment with AI-Powered Knowledge
By: Pingbo Tang
Deploying nuclear microreactors to increase energy infrastructure resilience — especially as electricity demand from AI is surging — demands more than safe engineering. It requires the ability to reuse proven designs, procedures, and training systems across national borders and regulatory regimes.
At Carnegie Mellon University, researchers are developing AI-powered frameworks that help designers, engineers, training developers, and regulators communicate clearly and collaborate efficiently across domains.
Why it matters: Today’s microreactor innovations are often stalled not by technical limitations, but by semantic mismatches in regulation, communication silos among stakeholders, and duplicative licensing efforts across jurisdictions.
- For example, a single term — like “Important to Safety” — can have dramatically different interpretations in U.S. versus Canadian regulatory systems.
- Without intelligent tools to bridge these gaps, organizations struggle to align designs, training programs, and safety justifications, wasting valuable time and resources.
The opportunity: By modeling the knowledge ecosystem surrounding microreactors, we can validate designs and engineered solutions so they can be reused and adapted more quickly across missions, facilities, and national boundaries.
- Training procedures approved in one context can be reviewed for compatibility in another.
- Regulatory reviewers can trace design decisions back to their justifications and precedents, while engineers can proactively identify and resolve potential compliance gaps.
What we’re doing: Carnegie Mellon researchers are seizing this opportunity with a combination of large language models (LLMs), semantic mapping, and graph-based reasoning. Their approach includes:
- Cross-regulatory semantic comparison: AI models extract, align, and disambiguate terminology from regulatory documents issued by the U.S. Nuclear Regulatory Commission (NRC) and the Canadian Nuclear Safety Commission (CNSC) to enable meaningful, clause-level comparison and reuse of design justifications. A minimal sketch of this kind of term alignment follows this list.
- Multistakeholder knowledge graphs: The team builds domain-specific graphs that integrate technical requirements, operational data, training procedures, and regulatory rules — supporting transparent collaboration across engineering, operations, training, and oversight.
- Simulator-integrated human-system interfaces (HSIs): These interfaces connect real-time operator behavior with contextual regulatory and procedural expectations, enabling adaptive training and safety assurance even in remote or autonomous deployments.
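As one illustration of how the cross-regulatory semantic comparison might be prototyped, the sketch below embeds short glossary definitions with an off-the-shelf sentence-embedding model and surfaces the closest cross-regulator matches for human review. The model name and the glossary snippets are assumed placeholders, not the project’s actual data or tooling.

```python
# Illustrative cross-regulatory term alignment with sentence embeddings.
# Glossary snippets and the model name are placeholders, not project data or tooling.
from sentence_transformers import SentenceTransformer, util

nrc_terms = {
    "Important to Safety": "Items whose failure could adversely affect a safety function.",
    "Safety-Related": "Equipment relied upon to remain functional during design-basis events.",
}
cnsc_terms = {
    "Important to Safety": "Items whose malfunction could lead to undue radiological risk.",
    "Systems Important to Safety": "Systems credited in the safety case for accident prevention or mitigation.",
}

model = SentenceTransformer("all-MiniLM-L6-v2")
nrc_emb = model.encode(list(nrc_terms.values()), convert_to_tensor=True)
cnsc_emb = model.encode(list(cnsc_terms.values()), convert_to_tensor=True)

scores = util.cos_sim(nrc_emb, cnsc_emb)  # pairwise cosine similarities
cnsc_names = list(cnsc_terms)
for i, nrc_name in enumerate(nrc_terms):
    j = int(scores[i].argmax())
    print(f"NRC {nrc_name!r} ~ CNSC {cnsc_names[j]!r} (similarity {scores[i][j].item():.2f})")
```

Low-similarity matches, or terms whose nearest neighbor carries a different regulatory meaning (as with “Important to Safety”), are exactly the cases that would be flagged for expert adjudication rather than automatic reuse.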
The research team is developing and validating these capabilities through:
- Graph-driven design and review workflows, where engineers, training developers, and regulators can query system components, failure modes, and precedent cases with shared context.
- Behavior-procedure alignment tools, comparing simulator logs with prescribed procedures to detect divergence, improve training, and refine interfaces for clarity and consistency.
- AI-assisted documentation tools, helping teams translate between design rationales, training objectives, and regulatory language in a consistent, machine-interpretable format.
What’s next: Prototypes are in development, with simulation data and expert input guiding iterative design. These tools will inform future reports, stakeholder briefings, and decision-making frameworks for nuclear oversight and workforce training.
Go deeper: Project updates, documentation, and relevant foundational research will be published at Human-Machine Harmony for Infrastructure.
AI-Driven Discoveries to Catalyze Energy Storage
By: John Kitchin
A collaboration between Carnegie Mellon University researchers and Meta AI is powering new solutions to convert renewable energy into climate-friendly fuels to power transportation and industry.
Why it matters: The transition from fossil fuels to renewable energy sources such as wind and solar power will help lower pollution and combat climate change. But we need to improve the process for converting energy from these sources to provide scalable and sustainable clean fuels for use in aircraft, long-haul trucking, and shipping.
Catch up quick: Wind and solar energy produce intermittent power, but with new cost-effective ways to store the power for later use, we can use renewable energy to make climate-friendly fuels for transportation and industry. One way to do this is by converting that stored energy into other fuels, like hydrogen or ethanol, through chemical means. But current methods for converting renewable energy into other fuels require catalysts that are often rare and incredibly expensive, such as platinum.
What we’re doing: The Open Catalyst project aims to solve this by finding low-cost catalysts to drive these reactions at high rates.
- Discovering new catalysts is an arduous and costly undertaking. Catalytic surfaces are made using a combination of several elements that work together to speed up the reactions.
- There are dozens of elements in the periodic table that are potential catalysts with numerous possible combinations. Add to that the fact that different ratios and configurations of these elements also have an effect, and the possibilities become uncountable.
- The high cost of running simulations and experiments limits the number of structures that may be tested.
The Open Catalyst solution is to develop AI models that accurately predict atomic interactions far faster than existing compute-heavy simulations.
- This approach means calculations that take modern laboratories days could instead take seconds, and will enable researchers to screen millions and maybe even billions of possible catalysts per year.
- The key is open datasets that offer catalogues of data on molecules and materials known to be important for renewable energy applications. This allows machine learning algorithms to quickly test millions of possible combinations, and eventually discover more efficient and inexpensive electrocatalysts.
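For a sense of what that screening loop looks like in code, here is a minimal sketch using the Atomic Simulation Environment (ASE). The toy EMT calculator stands in for a machine-learned potential trained on the Open Catalyst datasets, and the surfaces and adsorbate are illustrative choices, not a real screening campaign.

```python
# Sketch of ML-accelerated adsorption screening with ASE. EMT is a toy stand-in for an
# Open Catalyst-trained ML potential; metals and adsorbate are illustrative only.
from ase.build import fcc111, add_adsorbate
from ase.calculators.emt import EMT
from ase.optimize import BFGS

def relaxed_energy(metal: str, with_oxygen: bool) -> float:
    """Build a small (111) slab, optionally add an O adsorbate, relax, return energy."""
    slab = fcc111(metal, size=(2, 2, 3), vacuum=10.0)
    if with_oxygen:
        add_adsorbate(slab, "O", height=1.5, position="fcc")
    slab.calc = EMT()  # swap in an ML interatomic potential here for real screening
    BFGS(slab, logfile=None).run(fmax=0.05, steps=200)
    return slab.get_potential_energy()

# Rank candidate surfaces by how strongly they bind oxygen (free-O reference omitted).
for metal in ("Cu", "Pt", "Pd"):
    d_e = relaxed_energy(metal, True) - relaxed_energy(metal, False)
    print(f"{metal}: dE = {d_e:+.2f} eV")
```

Replacing the per-structure quantum calculation with a trained model is what turns a loop like this from days per candidate into a screen over millions of compositions and configurations.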
What’s next: To enable the broader research community to participate in this important project, we have released a family of Universal Models for Atoms spanning molecules, materials, and catalysts.
- These models are trained on half a billion calculations performed with density functional theory (DFT), a quantum mechanical simulation method.
- In addition to the data, baseline models and code are open-sourced on our GitHub page. There is also a leaderboard showing the latest results, where researchers can submit their own results to the evaluation server.
- We are expanding the capabilities of the Open Catalyst Project to magnetic materials, enabling research in materials for electrification.
- Open Catalyst Experiments 2024 aims to bridge experiments and computational models in the search for low-cost, durable, and effective catalysts that are essential for green hydrogen production and carbon dioxide upcycling to help mitigate climate change.
The big picture: This project represents a powerful model of collaboration linking AI and machine learning specialists from one of the leading AI companies with thought leaders from Carnegie Mellon’s Department of Chemical Engineering. This work illuminates broader strategies that the U.S. will need to pursue and support.
The bottom line: The Open Catalyst datasets are accelerating efforts to improve renewable energy storage and climate-friendly fuels that were previously hindered by a lack of compute. They enable collaboration between the machine learning community and catalysis researchers for electrocatalyst and energy-related materials discovery across a much broader set of new materials and chemistries.
Data Center Growth Could Increase Electricity Bills 8% Nationally and as Much as 25% in Some Regional Markets
By: Michael Blackhurst, Cameron Wade, Joe DeCarolis, Anderson de Queiroz, Jeremiah Johnson, Paulina Jaramillo
New modeling from the Open Energy Outlook Initiative shows that data center and cryptocurrency mining growth through 2030 could increase average U.S. electricity generation costs by 8% and greenhouse gas emissions from power generation by 30%.
Why it matters: Absent policy action, this increase in demand for electricity generation could lead to dramatically higher electricity bills for consumers and undermine the nation’s clean energy goals.
For example, capacity market prices in the nation's largest grid operator (PJM) exploded in December 2024, from $30 to $270 per megawatt-day (MW-day), a ninefold rise that will increase bills for 67 million customers across 13 states.
Catch up quick: Traditional utility planning assumes predictable 1%-2% annual demand growth over decades, but data centers are driving regional growth rates of 20%-30% annually. This mismatch between conventional planning timelines and demand growth has exposed limitations in capacity planning practices and increased short-run electricity generation costs, with some markets heavily utilizing older and more costly fossil-fuel generators in the short run.
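A quick, illustrative calculation shows why this mismatch is so disruptive; the growth rates below are placeholder values taken from the ends of the ranges above.

```python
# Compounding illustration: 5 years of demand growth under planning-era vs data-center-era rates.
for label, rate in [("traditional planning assumption", 0.02), ("data-center-driven region", 0.25)]:
    print(f"{label}: demand x{(1 + rate) ** 5:.2f} after 5 years at {rate:.0%}/yr")
# traditional planning assumption: demand x1.10 after 5 years at 2%/yr
# data-center-driven region: demand x3.05 after 5 years at 25%/yr
```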
What we did: The Open Energy Outlook (OEO) Initiative, a collaboration between Carnegie Mellon University and North Carolina State University, modeled the energy and emissions implications of expected data center and cryptocurrency mine growth. The analysis models the infrastructure, economic, and environmental implications under current policies and recommends policy tools that could ensure the grid accommodates this expansion while protecting consumers and climate goals.
What we found: Recent model results show that rapid data center demand growth increases generation from aging and expensive coal-fired power plants in some regions, while Texas meets more of its growth with wind generation through targeted transmission investments. These contrasting outcomes underscore that data center growth will drive different regional results.
Other key findings from the modeling include:
- Regional cost surge: Central and Northern Virginia face projected 2030 electricity cost increases exceeding 25%, the highest regional increase in the model.
- Coal gets lifeline: More than 25 GW of aging coal plants otherwise scheduled for retirement would continue operating primarily to serve data center demand.
- Emissions spike: Power sector emissions could increase 30% compared to scenarios without data center growth, reaching 275 million metric tonnes of CO2 annually by 2030. That matches the entire annual carbon output of France.
- Carbon leakage: Virginia's data center growth drives increased fossil fuel use in nearby states like Ohio, Pennsylvania, and West Virginia, potentially undermining state and regional climate goals.
Policy takeaways: There are several state and federal policy strategies to consider that may mitigate the effects of this expected growth, such as:
- Fair cost allocation: Create new customer classes and revenue sharing mechanisms to ensure large users rather than families pay for elevated infrastructure costs.
- Strategic siting incentives: Incentivize data center expansions in renewable-rich regions and away from areas dependent on fossil fuel generation.
- Transmission acceleration: Improve permitting and cost allocation for transmission lines connecting renewable resources to demand centers.
- Demand flexibility requirements: Incentivize or require energy efficiency or load management during peak periods or emergencies.
What's next: Data center and crypto mining electricity demand is projected to grow 350% by 2030. Without intervention, the pattern seen in PJM — massive price spikes followed by political backlash — will become more frequent. Continued investment in energy systems modeling resources like those maintained by the Open Energy Outlook Initiative can help identify policy interventions that balance the benefits and costs of increased data server activity.
The bottom line: The digital infrastructure boom is outpacing our electricity system’s ability to respond. Data centers offer potential benefits but risk locking in higher emissions and driving up prices for households without proactive and coordinated planning. Policymakers must act now to align infrastructure investment and regulation with this demand surge.