Just What Exactly Is AI? How Does It Work, Where Is It Going, and Why Does It Use So Much Damn Energy?

Artificial Intelligence. (Credit: Steve Johnson)

AI has seemingly taken over the world, mostly in just the last few years. Sure, it existed before that, but it wasn’t a major factor in the lives of most ordinary people until recently. Spell check has morphed into a full-blown writing machine, image generation has evolved from a cute diversion into a tool that can produce stunning, incredibly realistic videos and images in mere seconds, and chatbots can instantly answer almost any question you can throw at them.

As the world adjusts, we’re still trying to understand what it can do and where it will take us.

In the meantime, though, many wonder: what even IS it? Plenty of folks would say they know, or at least have a general understanding of it, but a large proportion would be hard-pressed to explain exactly how it works, why it takes so much energy, and why it’s far more than a toy that can summarize a document or quickly recap an email.

I remember back in the day, as a wee lad, when electric typewriters were the talk of the town, thinking, “Wouldn’t it be cool if I could ask the Selectric anything I want, and it would answer?” 

No, I wasn’t a great tech visionary, or I’d be following Mark Zuckerberg and the Google Bros. to Miami’s “Billionaire Bunker” about now, but I admit, I did have a vivid imagination.

But now, you actually can ask your keyboard any question you have, and it will answer — mostly correctly, though certainly not always.

I thought I’d ask AI itself the question — in this instance, Microsoft’s offering, Copilot ("Your AI companion") — just what the hell are you? Its answer:

  • AI — artificial intelligence — is a broad term, but the core idea is simple: it’s technology designed to perform tasks that normally require human thinking. From there, the details get more interesting.

🧠 What AI actually is

AI is a set of computer systems that can:

  • Recognize patterns
  • Learn from data
  • Make predictions
  • Generate language, images, or decisions
  • Solve problems

It’s not magic, and it’s not consciousness. It’s math, statistics, and pattern‑learning at a massive scale.

For many, the creepiest thing about it is that it talks to you like it’s your friend. That has led to some tragic consequences, and in some cases, it's sympathized with its questioners and agreed with their desire to do some very bad things.

And is it always right? Hell, no.

But how does it work? We’ll consult Elon Musk’s Grok AI this time. A lot of artificial intelligence is powered by Large Language Models (LLMs), which suck up massive troves of data and human writing to learn how we communicate:

AI (Artificial Intelligence) works by teaching computers to recognize patterns in data and use those patterns to make decisions, predictions, or create new things — without needing someone to program every single rule by hand. Here's a clear, step-by-step explanation from the very basics to how modern AI (like me!) actually functions:

1. The Core Idea — Learning from Examples (Machine Learning)

Traditional computer programs follow strict if-then rules written by humans. Modern AI mostly uses machine learning instead:

  • You give the system huge amounts of examples (data).
  • The AI finds patterns in those examples all by itself.
  • It gets better by adjusting itself when it makes mistakes.
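
If that still sounds abstract, here’s a toy sketch of that last bullet — the “adjusts itself when it makes mistakes” part — in Python. This is my own illustration, not something Grok supplied, and every number in it is made up for the example: a tiny “model” with just two adjustable knobs guesses an answer, checks how wrong it was, and nudges the knobs in the direction that shrinks the error. The real systems do essentially the same thing with billions of knobs.

```python
# A toy version of "learning from examples": the hidden rule in the data is
# y = 2x + 1, but the program is never told that. It only sees examples and
# adjusts two numbers (its "parameters") whenever it guesses wrong.

examples = [(x, 2 * x + 1) for x in range(10)]   # inputs paired with correct answers

w, b = 0.0, 0.0        # the model's two adjustable parameters, starting as blind guesses
learning_rate = 0.01   # how big each correction step is

for _ in range(1000):                    # repeat the pass over the data many times
    for x, target in examples:
        guess = w * x + b                # the model makes a prediction
        error = guess - target           # how wrong was it?
        w -= learning_rate * error * x   # nudge the parameters to shrink the error
        b -= learning_rate * error

print(f"learned rule: y = {w:.2f}x + {b:.2f}")   # lands very close to y = 2.00x + 1.00
```

Run it and the two parameters settle at roughly 2 and 1, the rule that was hiding in the examples, even though nobody wrote that rule into the code. Scale that idea up to billions of parameters and trillions of examples and you have a rough mental model of an LLM.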

Ok, sure. Now I understand it. Sort of. A little bit. Maybe about as much as I grasp calculus, Bitcoin, or Stephen Hawking's universe. Meaning, at the time I’m reading about such things, I can comprehend them, but if you ask me at a party a week later to explain in detail, I might struggle a bit.

All right, that’s all cool, and it’s changing human reality in real time. AI can be both inspiring and terrifying in the same moment. But hey, can someone tell me why it uses so much power, and why it's driving up energy costs in my state of California and elsewhere, even as the progressives try to kneecap our capacity? Oh, it turns out there is someone, or, let’s say, some thing, that can give us an answer. 

We’ll give ChatGPT the opportunity this time:

Here’s the explanation:

1. Training Requires Massive Computation

Training a state-of-the-art AI model means:

  • Feeding it trillions of words or images
  • Adjusting billions (or hundreds of billions) of parameters
  • Repeating this process over and over for weeks or months

This requires thousands of specialized chips (GPUs/TPUs) running 24/7 in data centers. Each chip consumes significant electricity, and collectively they draw megawatts of power.

Training is the most energy-intensive phase.
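
Here’s a rough back-of-the-envelope version of that, again my own illustration rather than anything ChatGPT supplied, with ballpark figures assumed purely for the sake of the math: suppose one high-end AI chip draws around 700 watts, a big training run keeps about 10,000 of them busy for three months, and the data center burns roughly 40 percent extra on cooling and overhead.

```python
# Back-of-the-envelope only. Every figure below is an assumption for illustration,
# not a measurement of any particular model or data center.
watts_per_chip = 700      # rough draw of one high-end AI accelerator
num_chips = 10_000        # assumed size of a large training cluster
overhead_factor = 1.4     # assumed extra power for cooling, networking, etc.
hours = 24 * 90           # an assumed three-month training run

compute_mw = watts_per_chip * num_chips / 1_000_000   # chips alone, in megawatts
total_mw = compute_mw * overhead_factor               # with cooling and overhead
energy_mwh = total_mw * hours                         # total energy over the run

print(f"Continuous draw: about {total_mw:.1f} MW")
print(f"Energy over ~90 days: about {energy_mwh:,.0f} MWh")
# Prints roughly 9.8 MW and about 21,000 MWh with these assumptions.
```

The exact numbers vary from model to model, but the shape of the problem doesn’t: the draw is continuous, industrial-scale, and multiplied across many data centers at once.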

Mr. or Ms. Chat went on to explain that the models are huge and demand high-performance processors, that cooling the hardware takes enormous power, and that when millions of people query the bots all at once, the combined load draws on vast resources.

So there you have it. The explosion of AI may be the biggest technological development in most of our lifetimes, and anyone who’s been alive over the past few decades has already seen spectacular advancement. Where this goes is anybody’s guess.

I love it because I can find a fact almost instantly (although I always double-check it, for reasons explained above), but people are finding all sorts of uses for it far beyond that. Could it replace me as a writer? It could certainly churn out an article similar to this one, albeit without my customary charm.

I’ll just have to take comfort in the fact that, at least for now, it cannot replace me as a husband and father. Let’s hope that lasts!
