Eco-Business: AI access may not always be unlimited as ESG risks mount – are businesses ready?


Companies need to stress-test their plans to ensure business continuity in a disrupted world where access to Artificial Intelligence could be scarce or expensive.


By Steven Okun, Megan Willis, Noemie Viterale



Businesses are placing massive bets on Artificial Intelligence (AI) on the assumption that the investment will surely pay off. Their leaders' decision-making conflates AI with clean air – always available, in whatever quantity and quality their business needs.

Businesses and investors should not focus only on the great investment opportunity AI adoption brings. They should also consider the obvious risks in doing so.


Chief sustainability officers and heads of government affairs have much to offer in this regard. Their experience in addressing climate risk, managing supply chain challenges and understanding how the downsides of globalisation impact their business offers models for future-proofing AI adoption.


Today, leaders build strategies, restructure operations, and make workforce decisions under the implicit assumption that AI will remain just as accessible and affordable in the coming years.


Companies undertake risk scenario planning to ensure business continuity across many contingencies, from natural disasters to cyberattacks. This exercise allows them to test and improve their strategy and guide planning.


Yet few, if any, apply the same rigour to AI.


Ignoring the Responsible AI trilemma – environmental harm, job loss and rising inequality – will lead to material constraints on access to AI.


These could include rising costs for electricity and/or water from environmental impact, growing public pressure from job losses or increasing income inequality, and explicit government regulations limiting access.


Boards and investors need to ensure management teams plan for a world without the AI access they have today.

