Andrew Ng, the famous AI scientist (and the guy who never seems to age), once said, "AI is the new electricity," and he was absolutely right about the analogy. This article, however, is not about the analogy. Actual electricity is measured in amperes, volts, and kilowatt-hours, and it has become a huge barrier to the growth of Artificial Intelligence, both for businesses and for model-training giants like OpenAI.
Last year, in 2023, NVIDIA alone sold 3.76 million data center GPUs. Powering that many data center GPUs requires about as much energy as powering the city of Phoenix, Arizona, with its roughly 620,000 homes: about 75.1 terawatt-hours, or roughly 18,132 kWh per home per year. In dollar terms, that is about $1.6 billion worth of electricity per year just to run them.
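The fleet-level arithmetic is easy to sketch. Below is a minimal back-of-the-envelope calculation in Python; the 700 W draw and $0.07/kWh price are my own assumptions for illustration, not vendor or utility figures, and published estimates run higher once cooling, networking, and other data center overhead are included:

```python
# Back-of-the-envelope: annual energy and cost for a GPU fleet.
# The wattage and electricity price are assumptions for illustration.

GPUS_SOLD = 3_760_000        # NVIDIA data center GPUs sold in 2023 (from the article)
WATTS_PER_GPU = 700          # assumed continuous draw per GPU
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.07         # assumed average electricity price, USD

kwh_per_year = GPUS_SOLD * (WATTS_PER_GPU / 1000) * HOURS_PER_YEAR
twh_per_year = kwh_per_year / 1e9
cost_per_year = kwh_per_year * PRICE_PER_KWH

print(f"{twh_per_year:.1f} TWh/year")        # ~23.1 TWh under these assumptions
print(f"${cost_per_year / 1e9:.1f}B/year")   # ~$1.6B under these assumptions
```

Even this bare-chip estimate, ignoring cooling and overhead, lands in the billions of dollars per year for a single vendor's single year of shipments.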
So what does this mean for AI? When you add the roughly 3.85 million data center GPUs shipped by AMD and Intel in 2023, you need another power grid as big as Phoenix's, and electricity demand grows by the size of two major cities per year from just three companies. Mind you, this discussion is limited to data center GPUs, not PC or SoC GPUs, and beyond the three US companies in the data center GPU business there are at least seven others not far behind. The US government may have imposed sanctions, but those competitors do have some great GPUs.
The human brain has about 100 billion neurons and is powered by roughly 20 watts, while NVIDIA's most powerful GPU has 80 billion transistors, which translates very roughly to about 0.2 billion neurons, and needs 700 watts to run. And yet people talk about Artificial General Intelligence, or the Singularity, competing with nature. But that is a topic for another article.
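To put the gap in perspective, here is a rough watts-per-neuron comparison using the figures above. The 0.2-billion figure is the article's loose transistor-to-neuron equivalence, not a biological claim, so treat the ratio as an order-of-magnitude illustration only:

```python
# Rough power-per-"neuron" comparison, brain vs. GPU.
# The GPU "neuron" count is a loose transistor-based equivalence.

BRAIN_WATTS = 20
BRAIN_NEURONS = 100e9

GPU_WATTS = 700
GPU_NEURON_EQUIV = 0.2e9     # rough equivalence from 80B transistors

brain_w_per_neuron = BRAIN_WATTS / BRAIN_NEURONS   # ~0.2 nanowatts
gpu_w_per_neuron = GPU_WATTS / GPU_NEURON_EQUIV    # ~3.5 microwatts

ratio = gpu_w_per_neuron / brain_w_per_neuron
print(f"GPU uses ~{ratio:,.0f}x more power per 'neuron'")  # ~17,500x
```

Four orders of magnitude is the size of the efficiency gap silicon would have to close to approach biology on these rough numbers.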
So startups can afford neither the GPUs nor the electricity to power them unless they are backed by deep-pocketed companies like Microsoft or Google, or by VCs. Whether some survive and some die, the matter is bigger than any one company: the growth of AI itself is at stake, and the question is how to solve this problem.
Many different approaches are being tested, the details of which I am not at liberty to divulge right now, but consider how the giants are trying to solve it. For example, Sam Altman, CEO of OpenAI, is personally involved and invested in a clean-energy startup called Helion, which hopes to build a controllable hydrogen fusion reactor producing a virtually infinite amount of energy from the nuclear fusion of hydrogen atoms. This principle has already been demonstrated in hydrogen bombs; it differs from nuclear fission, which has been used for years to produce both nuclear weapons and sustained nuclear electricity. Fusion power, by contrast, still needs to be turned into a reliable, sustained source of electricity, a feat on which thirty-plus startups are working globally, and no one is less than ten years away from a commercial fusion reactor powering the grid. And don't even get me started on solar and wind: despite over $10 trillion in global investment, these technologies still cannot meet demand, and we should at least recognize that.
Apart from nuclear fusion, low-power GPU technology that uses integer operations instead of floating-point calculations could be a solution. However, investment in integer-based GPU technology remains limited, as the industry has yet to realize its full potential. Developing these GPUs is mathematically daunting, and researchers need to explore ways to create low-power GPUs, whether integer-based or otherwise, that drastically reduce power consumption, approaching the efficiency of the human brain.
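To illustrate the integer idea, here is a minimal sketch of symmetric int8 quantization, the standard technique behind today's integer inference: values are mapped to 8-bit integers, the multiply-accumulate work happens entirely in integer arithmetic (which costs far less energy per operation in silicon than floating point), and scale factors recover an approximate float result at the end. This is a simplified illustration of the principle, not a description of any specific GPU design:

```python
# Minimal symmetric int8 quantization sketch (illustration only).
# Integer multiply-accumulate is far cheaper in energy than floating point,
# which is the premise behind integer-based low-power hardware.

def quantize(values, bits=8):
    """Map floats to signed integers sharing one scale factor."""
    qmax = 2 ** (bits - 1) - 1                      # 127 for int8
    scale = max(abs(v) for v in values) / qmax
    return [round(v / scale) for v in values], scale

def int_dot(q_a, q_b):
    """Dot product done entirely in integer arithmetic."""
    return sum(a * b for a, b in zip(q_a, q_b))

a = [0.12, -0.53, 0.91, 0.04]
b = [1.5, -0.2, 0.7, -1.1]

q_a, s_a = quantize(a)
q_b, s_b = quantize(b)

approx = int_dot(q_a, q_b) * s_a * s_b              # rescale once at the end
exact = sum(x * y for x, y in zip(a, b))

print(f"exact={exact:.4f} approx={approx:.4f}")
```

The approximate result lands within a fraction of a percent of the exact float answer here, which is why reduced-precision integer arithmetic is attractive: most of the accuracy is kept while the expensive floating-point units sit idle.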
So what is next? Definitely crippled growth for AI, if your technology depends on high-power data center GPUs for training large models on huge datasets and you cannot compete until today's cutting-edge models become commoditized. Within about two years at most, the most advanced models become cheap and, in many cases, trainable on older, commercially available data center GPUs in the cloud, which cost far less and train in far less time, so you no longer need the power that cutting-edge large models demand on the latest GPUs. That is a race from which 99% of startups have already been priced out: a single cutting-edge large model can easily cost $100 million over a three-to-six-month training period.
The good news is that IRVINEi avoids large-model training issues and high electricity costs by using technology that doesn't rely on massive model training. Consumers and businesses manage power consumption through IRVINEi's OVAL AI & IoT Hub, which includes a GPU and features a Single Interface Dashboard (SID). This hub brings you your own AI Personal Assistant, AI Bodyguard, and Business Manager. IRVINEi's proprietary technology combines Large Language Models (LLMs) and Computer Vision (CV) to deliver a personalized AI experience, making you feel like Iron Man with your own Jarvis.