
Artificial Intelligence and Electricity

Andrew Ng, the famous AI scientist (and the guy who never seems to age), once said "AI is the new electricity," and he was absolutely right about the analogy. This article, however, is not about the analogy. It is about actual electricity, the kind measured in amperes, volts, and kilowatt-hours, and why it is a huge barrier to AI growth for businesses and model-training giants like OpenAI.

In 2023, NVIDIA alone sold 3.76 million data center GPUs. Powering that many GPUs takes roughly as much energy as powering the city of Phoenix, Arizona, which has about 620,000 homes: about 75.1 terawatt-hours in a year, or roughly 18,132 kWh per home. In dollar terms, that is about $1.6 billion worth of electricity per year just to run them.
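For a rough sense of where such figures come from, here is a back-of-envelope sketch. The per-GPU wattage, utilization, and electricity price below are illustrative assumptions for the sketch, not numbers from the article.

```python
# Back-of-envelope estimate of annual energy and cost for a fleet of data center GPUs.
# Wattage, utilization, and price are illustrative assumptions, not official figures.

NUM_GPUS = 3_760_000          # data center GPUs shipped (figure from the article)
WATTS_PER_GPU = 700           # assumed board power of a high-end data center GPU
UTILIZATION = 0.7             # assumed average utilization (0..1)
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10          # assumed electricity price, USD per kWh

energy_kwh = NUM_GPUS * (WATTS_PER_GPU / 1000) * UTILIZATION * HOURS_PER_YEAR
cost_usd = energy_kwh * PRICE_PER_KWH

print(f"Estimated energy: {energy_kwh / 1e9:.1f} TWh per year")
print(f"Estimated cost:   ${cost_usd / 1e9:.2f} billion per year")
```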


So what does this mean for AI? When you add the roughly 3.85 million data center GPUs shipped by AMD and Intel in 2023, you need another power grid the size of Phoenix, and the electricity demand of just three companies grows by the equivalent of two major cities every year. Mind you, this discussion is limited to data center GPUs, not PC or SoC GPUs, and beyond the three US companies in the data center GPU business there are at least seven others not far behind. The US government may have imposed sanctions, but they do have some great GPUs.

The human brain has about 100 billion neurons and runs on roughly 20 watts, while NVIDIA's most powerful GPU has 80 billion transistors, which translates to roughly 0.2 billion neuron-equivalents, and needs 700 watts of power to run. And people talk about Artificial General Intelligence or the Singularity competing with nature. But that is a topic for another article.
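Taking those figures at face value, here is a quick worked comparison of energy efficiency; the script only restates the numbers quoted above.

```python
# Energy-efficiency comparison using only the figures quoted above.

BRAIN_NEURONS = 100e9        # ~100 billion neurons
BRAIN_WATTS = 20             # ~20 W

GPU_NEURON_EQUIV = 0.2e9     # ~0.2 billion "neuron-equivalents" from 80B transistors
GPU_WATTS = 700              # ~700 W

brain_per_watt = BRAIN_NEURONS / BRAIN_WATTS
gpu_per_watt = GPU_NEURON_EQUIV / GPU_WATTS

print(f"Brain: {brain_per_watt:,.0f} neurons per watt")
print(f"GPU:   {gpu_per_watt:,.0f} neuron-equivalents per watt")
print(f"On this crude measure the brain is ~{brain_per_watt / gpu_per_watt:,.0f}x more efficient")
```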

So startups can afford neither the GPUs nor the electricity to power them unless they are backed by deep-pocketed companies like Microsoft and Google, or by VCs. Whether some survive and some die, the matter is bigger than that for AI: the growth of AI itself is at stake, and the question is how to solve this problem.

Nuclear Fusion

There are many different approaches being tested, the details of which I am not at liberty to divulge right now, but here is how the big giants are trying to solve it. For example, Sam Altman, CEO of OpenAI, is personally involved and invested in a clean-energy startup called Helion, which hopes to build a controllable hydrogen fusion reactor producing a virtually infinite amount of energy from the nuclear fusion of hydrogen atoms. The principle has already been proven in the hydrogen bomb, but unlike nuclear fission, from which both the atomic bomb and sustained nuclear electricity have been produced for years, fusion power has yet to be converted into a sustained source of electricity. It is a feat that 30-plus startups are working on globally, and we are still at least 10 years away from any commercial fusion reactor powering the grid. And please do not get me started on solar and wind; they cannot meet the demand, and after over $10 trillion of global investment in those technologies we should have learned that much at least.

GPUs like the human brain

Other than nuclear fusion, low-power GPU technology may be an answer, using integer operations instead of floating point. But investment in integer-operation-based GPUs is small, and the industry does not yet see enough potential to invest more, partly because mathematically it looks like a very daunting task. More research is needed to build low-power GPUs based on integer arithmetic, or otherwise to drastically reduce GPU power consumption toward that of the human brain.
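As a toy illustration of the integer-arithmetic idea, here is a minimal sketch of quantizing a floating-point matrix multiply to 8-bit integers, one common low-power technique in the same spirit; the sizes, data, and quantization scheme are assumptions for the sketch, not anyone's production method.

```python
import numpy as np

# Toy illustration: replace a float32 matrix multiply with an int8 one.
# Symmetric per-tensor quantization; sizes and data are arbitrary.

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 128)).astype(np.float32)
B = rng.standard_normal((128, 32)).astype(np.float32)

def quantize(x):
    """Map a float tensor to int8 with a single scale factor."""
    scale = np.abs(x).max() / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

A_q, a_scale = quantize(A)
B_q, b_scale = quantize(B)

# Integer matmul (accumulate in int32), then rescale back to float.
C_int = A_q.astype(np.int32) @ B_q.astype(np.int32)
C_approx = C_int.astype(np.float32) * (a_scale * b_scale)

C_exact = A @ B
rel_err = np.abs(C_approx - C_exact).mean() / np.abs(C_exact).mean()
print(f"Mean relative error from int8 arithmetic: {rel_err:.3%}")
```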

So what is next? Definitely crippled growth for AI if your technology depends on high-power data center GPUs for training large AI models on huge datasets and you cannot compete until today's cutting-edge models become commoditized. In about two years at most, the most advanced models become cheap and, in many cases, trainable on older, commercially available data center GPUs in the cloud, which cost a lot less and train in a lot less time, so you no longer need the amount of power that cutting-edge large models demand on the latest GPUs, a race from which 99% of startups have already been priced out. A single cutting-edge large model can easily cost $100 million over a 3-to-6-month training period.
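For context on that $100 million figure, here is a minimal back-of-envelope sketch of cloud training cost; the cluster size and hourly rental rate are purely hypothetical assumptions.

```python
# Back-of-envelope compute cost for training a large model on rented cloud GPUs.
# Cluster size and rental rate are hypothetical assumptions, not article figures.

NUM_GPUS = 10_000            # assumed size of the training cluster
HOURLY_RATE_USD = 2.50       # assumed cloud rental price per GPU-hour
TRAINING_MONTHS = 4          # within the 3-6 month window from the article
HOURS = TRAINING_MONTHS * 30 * 24

total_cost = NUM_GPUS * HOURLY_RATE_USD * HOURS
print(f"Estimated compute cost: ${total_cost / 1e6:.0f} million")
```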

The good thing is that IRVINEi has made sure it runs into neither the large-model-training problem nor the electricity-bill problem. Its technology does not depend on large model training, and power consumption is distributed to consumers and businesses through IRVINEi's AI & IoT Hub, since the GPU lives in the hub itself. The hub provides a Single Interface Dashboard, SID for short, to bring you your own AI personal assistant, bodyguard, and business manager, using IRVINEi's technology that combines Large Language Models (LLMs) and Computer Vision (CV) to make you feel like Iron Man with your own Jarvis.
