Technology - August 21, 2025

Sustainable AI: How Global Corporations Embrace Small Language Models for Energy-Efficient Operations

In today's rapidly evolving business landscape, integrating artificial intelligence (AI) is no longer a novelty but a necessity. This growing dependence has driven significant advances in computing power, infrastructure, and workforce skills.

The limelight is currently on large language models (LLMs), spearheaded by tech titans like OpenAI, Anthropic, and Google. These models demonstrate exceptional capabilities in handling and generating natural language, finding applications ranging from corporate chat assistance to complex analytics solutions.

However, their prowess comes at a price. These models consume substantial resources, particularly energy and water. A single query on ChatGPT, for instance, may consume as much electricity as ten typical Google searches. The data centers powering these models can even consume millions of gallons of water for cooling purposes.

The scale is staggering: training GPT-3 consumed approximately 1,287 MWh of electricity, enough to power around 120 US homes for a year. Microsoft's water consumption surged by 34% in 2022, largely due to its AI activities. Against this backdrop, small language models (SLMs) have gained prominence.
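The homes-powered figure follows from simple division. A hypothetical back-of-envelope check, assuming an average US household consumption of roughly 10,700 kWh per year (the approximate EIA figure; the exact value varies by state and year):

```python
# Sanity check of the article's figures (illustrative assumption:
# average US home uses ~10,700 kWh of electricity per year).
TRAINING_ENERGY_MWH = 1_287          # reported GPT-3 training energy
AVG_US_HOME_KWH_PER_YEAR = 10_700    # assumed average annual consumption

# Convert MWh to kWh, then divide by per-home annual usage.
homes_powered_for_a_year = (TRAINING_ENERGY_MWH * 1_000) / AVG_US_HOME_KWH_PER_YEAR
print(round(homes_powered_for_a_year))  # → 120
```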

These SLMs offer robust capabilities with a significantly smaller environmental footprint, making them an attractive alternative. Typically ranging from a few million to roughly 10 billion parameters, they mark a significant shift in corporate sustainability and operational strategies.
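Parameter count translates directly into hardware and energy requirements. A rough rule of thumb, sketched below, is that weight storage alone costs parameters × bytes per parameter (2 bytes at fp16/bf16 precision); this ignores activations and serving overhead, and the model sizes are illustrative:

```python
def model_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate weight-storage footprint in GB.

    bytes_per_param=2 assumes fp16/bf16 weights; quantized
    models (e.g. 4-bit) need correspondingly less.
    """
    return num_params * bytes_per_param / 1e9

# A ~7B-parameter SLM fits in the memory of a single commodity GPU...
print(f"{model_memory_gb(7e9):.0f} GB")    # → 14 GB
# ...while a ~175B-parameter LLM such as GPT-3 requires a multi-GPU server.
print(f"{model_memory_gb(175e9):.0f} GB")  # → 350 GB
```

This gap in hardware footprint is a large part of why SLMs draw far less power per query than their larger counterparts.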

But how are major global corporations incorporating these SLMs into their sustainability and operational frameworks?