Summary

Social risks of an AI-driven energy transition

Anna Triponel

February 13, 2026

Forum for the Future published AI and Social Justice in the Energy Transition (January 2026), a taxonomy of social justice issues impacting and impacted by the use of AI in the transition to renewable energy. The taxonomy was developed via an AI-assisted literature review and tested against real-world case studies and with experts.  

Human Level’s Take:
  • The transition to renewable energy increasingly depends on AI technology, used in everything from energy demand forecasting to siting projects to pricing. But what’s the potential human cost?
  • Forum for the Future notes that there has been limited research to date on the ways in which using AI in the energy transition could have unintended impacts on people, yet the risk is there.
  • For example, AI-enabled systems can make decisions based on biased data that exacerbate existing inequalities in access to reliable, affordable and clean energy. They could also lead to siting projects in locations that disproportionately impact certain communities and ecosystems, or cause job loss or new physical and mental health risks for energy sector workers. In short, the impacts could be widespread.
  • Forum for the Future’s 12-issue taxonomy outlines the key social justice issues at stake in a rapid, AI-enabled energy transition, ranging from questions of access and participation, to fairness and transparency, to work and benefits, to environmental costs.
  • So what does it take to combat these risks in an energy system dominated by AI? One critical action for business is strong human rights and environmental due diligence to help predict, prevent and mitigate risks. Due diligence will need to take place along the full project or product lifecycle, from design to development to deployment to closure.
  • In addition, having transparent or even open-source systems could help ensure equal access to the benefits of AI-enabled energy, and help root out entrenched inequalities that lead to unfair outcomes.
  • Governments will need to take the lead in setting out governance frameworks and regulations to prevent social and environmental impacts, but companies can also play their part by advocating for fair rules and avoiding undermining them through lobbying.

Some key takeaways:

  • The challenges posed by AI in the energy transition: The report identifies a few ways in which the rapid development and deployment of AI could negatively impact people, and where additional safeguards will be needed. AI is being used to make important decisions about who gets renewable energy, at what cost, from what sources and under what conditions. The speed with which companies and governments are embedding AI into our energy infrastructure and relying on it to make policy decisions makes it more important than ever that we create strong AI governance systems now. In addition, AI-enabled systems could make decisions that exacerbate existing inequalities in the energy system, for example affordability for low-income households and lack of access to new products for some communities and demographics. In order to reap the benefits of the transition — cleaner air, secure and affordable energy, climate stability and resilience — we will need to make conscious design choices, including on how to manage tradeoffs. Forum for the Future has identified a gap in current research exploring these dynamics and challenges with respect to AI and the energy transition. The taxonomy aims to bridge this gap so that decision-makers can mitigate the social and justice risks of AI deployment and identify opportunities for positive social outcomes.
  • A taxonomy of social justice issues: The paper outlines 12 issues, clustered in four thematic areas:
    • Thematic cluster 1: Access and participation — who has access to AI-enabled energy systems and who gets a say in making decisions about them?
      • Issue: Access and affordability — AI-enabled energy systems could exclude low-income users through high costs and technology barriers, worsening existing inequalities in access to reliable, affordable energy.
      • Issue: Digital divide and inclusion — Communities without digital literacy or infrastructure (especially in remote or rural communities) risk exclusion from new energy markets.
      • Issue: Data rights and consent — Data collection without adequate consent could increase the risks of surveillance, profiling and commercial exploitation.
      • Issue: Community participation and consent — Without mechanisms for community consultation, consent and ongoing engagement, AI technologies could reproduce top-down decision-making models.
    • Thematic cluster 2: Fairness and governance — are algorithms fair, are markets competitive, and are accountability mechanisms robust?
      • Issue: Algorithmic fairness and explainability — If training data is biased and decision-making systems are opaque, they can reinforce existing social inequalities in energy access and services.
      • Issue: Procurement and market power — Dependence on a small number of powerful vendors could increase the risk of oligopolies and limit local innovation.
      • Issue: Policy, standards and redress — A lack of strong rules and governance frameworks means that communities impacted by AI can struggle to access remedy.
    • Thematic cluster 3: Work and benefits — How are efficiency gains distributed, and what happens to workers and jobs within the system?
      • Issue: Labour and just transition — Energy sector jobs could be reshaped without workers being offered training and re-skilling opportunities, social protection, or a voice in decisions to adopt new technology.
      • Issue: Distributional impacts and benefit sharing — The benefits of the transition, like lower costs, increased reliability and access to new services, could mostly flow to wealthy or urban communities, with the costs and risks impacting marginalised groups.
      • Issue: Reliability and safety in critical infrastructure — AI malfunctions, cyberattacks, or extreme weather can trigger cascading grid failures, disproportionately harming vulnerable populations who lack backup systems and resources to cope with outages.
    • Thematic cluster 4: Ecology and system — What are the environmental costs, and who bears them?
      • Issue: Environmental footprint of AI — AI data centers require large amounts of electricity and water, stressing ecosystems and undermining sustainability.
      • Issue: Critical minerals and supply chains — AI hardware and energy infrastructure rely on minerals which can be extracted under poor human rights and environmental conditions, often in developing countries.
  • Suggested actions: The paper offers recommendations for different actors in the energy system, including energy utilities, AI developers and companies using AI in their operations and processes. One key action is to ensure that the social justice issues identified in the taxonomy are prevented and mitigated, for example via strong governance frameworks and due diligence along the full lifecycle of a product or service. Another recommendation is to consider shifting to open, interoperable platforms, supporting open-source access to technologies, and providing tools and resources to clearly explain technologies and their decision-making processes. In parallel, policymakers and regulators will need to create the enabling environment for just use of AI in the energy transition, for example by ensuring robust regulation of AI; considering the secondary and tertiary costs of deploying AI; moving towards policies that encourage innovation and ingenuity; and fostering international cooperation on AI governance to tackle risks across borders.
