
March 12, 2026 by admin

Power Consumption in AI: Why Data Centres Use So Much Electricity

You’ve probably heard that AI uses a lot of energy. But how much is “a lot”? And why? When you ask ChatGPT to write a birthday card or tell Perplexity to find the best pizza in Cork, what’s actually happening behind the screen that costs so much electricity?

The answers are more interesting — and more concerning — than most people realise. This isn’t a niche tech issue. In Ireland, data centres already use more electricity than all rural homes combined. Understanding why helps you understand one of the biggest infrastructure challenges of the next decade.

How Much Power Does AI Actually Consume?

The numbers are growing fast. The International Energy Agency estimated in 2024 that global data centre electricity consumption could exceed 1,000 TWh by 2026. For context, that’s roughly equivalent to Japan’s entire electricity demand. AI workloads are the primary driver of that growth.

A few specific numbers help make this real:

  • A single ChatGPT query uses approximately 10 times more energy than a standard Google search, according to Goldman Sachs analysis
  • Training GPT-4 is estimated to have consumed around 50 GWh of electricity — roughly what 4,600 average Irish homes use in a year
  • NVIDIA’s latest AI chips (the B200 series) can each draw over 1,000 watts. A rack full of them pulls 60-120 kW — enough to power 30-60 average houses simultaneously
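As a quick sanity check on the training figure, here is the arithmetic in a few lines of Python. The household consumption value is an assumption (roughly 11 MWh per year, an illustrative average rather than an official statistic):

```python
# Sanity check: 50 GWh of training energy expressed as Irish home-years.
TRAINING_ENERGY_GWH = 50
HOUSEHOLD_MWH_PER_YEAR = 11  # assumed average annual consumption per home

training_mwh = TRAINING_ENERGY_GWH * 1_000       # 1 GWh = 1,000 MWh
equivalent_homes = training_mwh / HOUSEHOLD_MWH_PER_YEAR
print(f"≈ {equivalent_homes:,.0f} home-years")   # ≈ 4,545, close to the 4,600 quoted
```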

In Ireland, the CSO reported in 2023 that data centres consumed 21% of all metered electricity. That figure has been climbing roughly 2-3 percentage points per year. Some projections suggest it could reach 30% by 2028 if current trends continue.
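A straight-line extrapolation shows where that 2028 figure comes from; the growth rates are the ones just quoted, and linear growth is an assumption:

```python
# Extrapolating the data centre share of metered electricity to 2028.
base_share, base_year = 21, 2023   # CSO figure for 2023
for points_per_year in (2, 3):
    share_2028 = base_share + points_per_year * (2028 - base_year)
    print(f"+{points_per_year} pts/year -> {share_2028}% by 2028")
# Prints 31% and 36%, so the 30% projection is, if anything, on the cautious side.
```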

Why Does AI Need So Much Power?

To understand this, you need to know a bit about what happens when an AI model works. Don’t worry — no computer science degree required.

The Processors Run Hot

AI workloads run on specialised chips called GPUs (Graphics Processing Units) and increasingly on purpose-built AI accelerators. These aren’t like the processor in your laptop. They’re designed to perform trillions of calculations per second, and that level of computation generates enormous heat.

Think of it this way. Your laptop might use 15-45 watts when you’re browsing the web. A single NVIDIA H100 GPU — the workhorse of most AI training today — uses around 700 watts at full load. A data centre with thousands of these GPUs running simultaneously draws power on an industrial scale.
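To put those wattages side by side, here is a rough comparison in Python. The cluster size is purely illustrative, and the total counts GPUs only, before cooling and other overheads:

```python
# Laptop vs one H100 vs an assumed 10,000-GPU training cluster.
LAPTOP_W = 30           # mid-range of the 15-45 W browsing figure
H100_W = 700            # approximate full-load draw of one H100
CLUSTER_GPUS = 10_000   # assumed cluster size, for illustration only

print(f"One H100 ≈ {H100_W / LAPTOP_W:.0f} laptops")              # ≈ 23
cluster_mw = CLUSTER_GPUS * H100_W / 1e6
print(f"{CLUSTER_GPUS:,} GPUs ≈ {cluster_mw:.0f} MW of IT load")  # 7 MW
```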

Training vs Inference: Two Different Kinds of Demand

There are two main types of AI workload, and they consume power differently:

  • Training: teaching the AI model by processing massive datasets, adjusting billions of parameters over weeks or months. Power profile: extremely high, sustained draw, with thousands of GPUs running 24/7 for weeks. This is where the eye-watering electricity figures come from.
  • Inference: using the trained model to answer your questions, generate images, or make predictions. Power profile: lower per query, but multiplied by millions of users, and growing rapidly as AI tools become mainstream.

Training is more energy-intensive per job, but inference is catching up in total consumption because of sheer volume. Every time you use an AI assistant, you’re running inference. Multiply that by hundreds of millions of daily interactions worldwide.
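A rough illustration of that volume effect, with assumed per-query energy and traffic figures (published estimates vary widely):

```python
# One training run vs a year of inference, under assumed usage figures.
WH_PER_QUERY = 3.0              # assumed energy per chatbot query (Wh)
QUERIES_PER_DAY = 200_000_000   # assumed global daily query volume
TRAINING_GWH = 50               # one-off training figure quoted earlier

inference_gwh = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e9
print(f"Inference: ≈ {inference_gwh:.0f} GWh/year")   # ≈ 219 GWh, every year
print(f"Training:    {TRAINING_GWH} GWh, once")
```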

Cooling Is the Hidden Energy Cost

Here’s something most people don’t consider. The electricity powering the AI chips is only part of the story. Every watt of electricity used by a processor eventually becomes heat. That heat needs to be removed, or the equipment fails.

Cooling a data centre can consume 30-50% of its total energy in a poorly designed facility. Even well-designed modern facilities typically spend 10-20% of total energy on cooling. When your racks are drawing 60+ kW each, that cooling demand is massive.

A metric called PUE (Power Usage Effectiveness) measures this overhead. A PUE of 1.5 means for every watt of computing, half a watt goes to cooling and other facility systems. The best modern facilities achieve PUE around 1.1. The worst legacy facilities run at 2.0 or higher — literally doubling their electricity consumption through inefficiency.
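The relationship is simple enough to express directly: total facility power is the IT load multiplied by PUE. A minimal sketch:

```python
# PUE = total facility energy / IT equipment energy, so
# total facility power = IT load x PUE.
def facility_power_kw(it_load_kw: float, pue: float) -> float:
    return it_load_kw * pue

it_load = 1_000  # 1 MW of IT load, for illustration
for pue in (1.1, 1.5, 2.0):
    total = facility_power_kw(it_load, pue)
    print(f"PUE {pue}: {total:,.0f} kW total, {total - it_load:,.0f} kW overhead")
```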

What Makes AI Data Centres Different from Regular Ones?

Not all data centres are created equal. The facility hosting your email and storing your Dropbox files looks very different from one training the next generation of AI models.

  • Power density — a standard cloud computing rack draws 5-15 kW. An AI training rack draws 40-100+ kW. That’s not an incremental difference. It fundamentally changes the building design.
  • Cooling requirements — air cooling, which works fine for traditional servers, physically cannot remove enough heat from high-density AI racks. Liquid cooling or immersion cooling becomes necessary.
  • Grid impact — a single large AI training cluster can draw as much power as a small town. Connecting these facilities to the electricity grid requires significant infrastructure upgrades.
  • Continuous operation — AI training runs 24/7 for weeks or months. There’s no off-peak. The power demand is relentless and constant.
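The density difference in the first two bullets can be made concrete with a couple of assumed figures: a 2 kW average draw per home (the ratio implied earlier) and a rule-of-thumb air-cooling ceiling of about 25 kW per rack:

```python
# Comparing a typical cloud rack with an AI training rack.
AIR_COOLING_LIMIT_KW = 25  # assumed practical ceiling for air cooling
HOME_AVG_KW = 2.0          # assumed average continuous draw per home

for name, rack_kw in (("Cloud rack", 10), ("AI training rack", 80)):
    cooling = "air" if rack_kw <= AIR_COOLING_LIMIT_KW else "liquid or immersion"
    homes = rack_kw / HOME_AVG_KW
    print(f"{name}: {rack_kw} kW ≈ {homes:.0f} homes, needs {cooling} cooling")
```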

Ireland’s Specific Challenge

Ireland has become one of Europe’s most important data centre markets. Companies like Microsoft, Google, Amazon, Meta, and dozens of smaller operators run major facilities here. The mild climate (good for cooling), political stability, English-speaking workforce, and EU membership made Ireland an attractive location.

But that success has created a real tension.

EirGrid, which manages Ireland’s electricity grid, has flagged that data centre demand is growing faster than new renewable generation can be built. In parts of Dublin, there’s been a moratorium on new data centre grid connections. The concern isn’t that the grid will collapse tomorrow — it’s that continuing to add large power consumers without matching supply growth creates reliability risks for everyone.

The numbers tell the story. Irish data centres consumed approximately 5,300 GWh in 2023, against total Irish electricity demand of around 31,500 GWh. That means data centres accounted for roughly one-sixth of all electricity used in the country (the CSO's 21% figure cited earlier refers to metered consumption, a narrower base than total demand). No other country in Europe has that kind of concentration.
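The share is easy to verify from the figures above:

```python
# The one-sixth figure, checked directly.
DATA_CENTRE_GWH = 5_300    # data centre consumption, 2023
TOTAL_DEMAND_GWH = 31_500  # total Irish electricity demand, 2023

print(f"Share: {DATA_CENTRE_GWH / TOTAL_DEMAND_GWH:.1%}")  # 16.8%, about one-sixth
```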

What’s Being Done About It?

The situation is serious but not hopeless. Several approaches are making a real difference.

Better Design

Modern data centres are dramatically more efficient than those built even five years ago. Free-air cooling (using Ireland’s naturally cool air), liquid cooling for high-density racks, and modular designs that scale with demand are all reducing waste. Companies like Standard Control Systems provide the building energy management systems that tie these elements together, monitoring and optimising energy use in real time.

Renewable Energy

Major data centre operators are investing heavily in renewable power. Microsoft, Google, and Amazon have all signed large power purchase agreements (PPAs) for Irish wind energy. In theory, this means new renewable generation being built to match data centre demand. In practice, the timeline for building wind farms doesn’t always align with the timeline for building data centres.

Waste Heat Recovery

Every watt consumed by a data centre eventually becomes heat. Most of that heat currently gets dumped into the atmosphere through cooling towers. That’s an enormous missed opportunity.

In other European countries — particularly Finland, Sweden, and Denmark — data centre waste heat is piped into district heating systems that warm nearby homes and offices. Ireland is exploring similar approaches. South Dublin County Council has investigated using waste heat from data centres in the Grange Castle area for local heating networks.
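The potential scale is easy to estimate. Both the recovery fraction and the per-home heating demand below are assumptions for the sketch, not measured values:

```python
# Waste-heat potential of an assumed 50 MW facility.
FACILITY_IT_MW = 50         # assumed IT load
RECOVERABLE_FRACTION = 0.7  # assumed share of heat capturable for district heating
HOME_HEAT_KW = 4.0          # assumed average heating demand per home

recoverable_kw = FACILITY_IT_MW * 1_000 * RECOVERABLE_FRACTION
print(f"≈ {recoverable_kw / HOME_HEAT_KW:,.0f} homes' heating demand")  # ≈ 8,750
```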

AI Efficiency Improvements

The AI industry itself is working to reduce energy per computation. Newer chip architectures are more energy-efficient per operation. Techniques like model distillation (creating smaller, more efficient versions of large models), quantisation (reducing the precision of calculations where full precision isn’t needed), and sparse computing (only activating the parts of a model needed for each query) all reduce energy consumption.
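To make one of those techniques concrete, here is a minimal sketch of post-training quantisation in Python with NumPy: mapping float32 weights to int8 shrinks the data fourfold, and moving less data is a large part of saving energy. Real quantisation schemes are considerably more sophisticated than this toy version:

```python
import numpy as np

# Toy post-training quantisation: float32 weights -> int8 plus one scale factor.
rng = np.random.default_rng(0)
weights = rng.normal(0, 0.02, size=1_000).astype(np.float32)

scale = np.abs(weights).max() / 127                   # symmetric, per-tensor scale
quantised = np.round(weights / scale).astype(np.int8)
restored = quantised.astype(np.float32) * scale       # dequantise for use

print(f"Memory: {weights.nbytes} B -> {quantised.nbytes} B")          # 4,000 -> 1,000
print(f"Max rounding error: {np.abs(weights - restored).max():.6f}")  # small
```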

Google’s DeepMind famously used AI to reduce cooling energy in Google’s own data centres by 40%. There’s something satisfying about AI being used to make AI’s own infrastructure more efficient.

Should We Be Worried?

Yes and no. The energy consumption of AI is real and growing. Pretending otherwise doesn’t help. But it’s also not a runaway train with no brakes.

The efficiency of AI computing improves significantly with each hardware generation. The amount of computation you get per watt roughly doubles every two to three years. The problem is that demand for AI computation is growing even faster than efficiency improvements — so total energy consumption keeps rising.
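That race between efficiency and demand compounds year on year. A toy projection, with both growth rates as illustrative assumptions (the efficiency figure roughly matches the "doubles every two to three years" pace):

```python
# Toy projection: demand growth vs efficiency gains, compounding yearly.
EFFICIENCY_GAIN = 0.30  # assumed ops-per-watt improvement/year (~2x per 2.5 years)
DEMAND_GROWTH = 0.50    # assumed growth in AI computation/year

energy_index = 100.0  # total AI energy use, indexed to 100 in year 0
for year in range(1, 6):
    energy_index *= (1 + DEMAND_GROWTH) / (1 + EFFICIENCY_GAIN)
    print(f"Year {year}: {energy_index:.0f}")
# The index climbs about 15% a year: demand outruns efficiency, so totals rise.
```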

The honest assessment is this: AI’s energy footprint is manageable if three things happen simultaneously:

  1. Renewable energy generation expands fast enough to meet data centre demand growth
  2. Data centre design continues to improve, pushing PUE closer to 1.0
  3. AI models become more computationally efficient through better algorithms and hardware

If any of those three stalls, the maths stops working. And that’s worth paying attention to, whether you’re in the tech industry or not.

What This Means for Ordinary People and Businesses

You don’t need to stop using AI. That’s not the takeaway here. But being an informed user matters.

  • If you’re a business adopting AI tools, ask your providers about their energy sources and efficiency. Cloud providers that invest in renewables and efficient facilities are genuinely better choices than those that don’t.
  • If you’re evaluating AI for your organisation, consider whether you need the largest, most powerful model for every task. Smaller, more efficient models often work just as well for routine tasks and use a fraction of the energy.
  • If you’re concerned about Ireland’s energy future, support policies that require data centre operators to meet efficiency standards, invest in renewables, and explore waste heat reuse. The EU’s Energy Efficiency Directive now requires large data centres to report their energy performance annually — that transparency is a start.

The electricity that powers AI is real. The heat it generates is real. The strain it places on grids — including Ireland’s — is real. But so are the solutions. The question isn’t whether we can afford to use AI. It’s whether we can be smart enough to power it responsibly. Given that we’re using AI to solve increasingly complex problems, it would be ironic if we couldn’t solve this one.
