
Remember when OpenAI CEO Sam Altman was branded a "podcasting bro" by TSMC executives for his audacious AI plan, one that would "take $7 trillion and many years to build 36 semiconductor plants and additional data centers" to bring the dream to reality?
It's becoming increasingly apparent that generative AI is a resource-hungry technology, demanding enormous amounts of computing power and cooling water as it scales to greater heights.
OpenAI has reportedly complained that Microsoft cannot meet its cloud computing demands under their multi-billion-dollar agreement. The ChatGPT maker has indicated that the software giant would be to blame if a rival AI firm achieves artificial general intelligence (AGI) first.
Earlier this year, OpenAI unveiled its $500 billion Stargate project, designed to fund the construction of data centers across the United States for its AI advances, a move that cost Microsoft its status as the company's exclusive cloud provider.
Interestingly, a new report by The Wall Street Journal claims that the multi-billion-dollar project is struggling to get off the ground, with OpenAI reportedly scaling back its near-term ambitions to "building a small data center by the end of this year."
Sam Altman recently indicated that OpenAI is well on its way to bringing "well over 1 million GPUs online" by the end of this year. The news comes after the executive revealed that a GPU shortage, prompted by the viral Studio Ghibli memes generated with its GPT-4o image generator, forced the AI firm into doing a lot of "unnatural things," including borrowing compute capacity from research and slowing down some features.
"we will cross well over 1 million GPUs brought online by the end of this year! very proud of the team but now they better get to work figuring out how to 100x that lol" (Sam Altman on X, July 20, 2025)
When ChatGPT gained over one million new users in under an hour, the executive joked that the demand surge almost melted the company's GPUs.
Sam Altman indicated that he is "very proud of the team," but stressed the need to figure out how to increase OpenAI's projected 1 million GPUs a hundredfold, to 100 million. Tom's Hardware estimates that this feat could set the AI firm back by up to $3 trillion. For context, that is a trillion dollars short of Microsoft's recent market capitalization, which hit the $4 trillion threshold.
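For a rough sense of where that $3 trillion figure comes from, here is a minimal back-of-envelope sketch, assuming an average all-in cost of roughly $30,000 per data-center GPU; the per-unit price is an assumption for illustration, as the article only cites Tom's Hardware's total.

```python
# Back-of-envelope estimate of the cost to 100x OpenAI's GPU fleet.
# Assumption: ~$30,000 average all-in cost per data-center GPU; this
# per-unit figure is illustrative, not taken from the report.

current_gpus = 1_000_000          # "well over 1 million GPUs" by end of year
target_gpus = current_gpus * 100  # Altman's "100x that" goal
cost_per_gpu = 30_000             # assumed average cost in USD

total_cost = target_gpus * cost_per_gpu
print(f"{target_gpus:,} GPUs x ${cost_per_gpu:,} each = ${total_cost / 1e12:.1f} trillion")
# 100,000,000 GPUs x $30,000 each = $3.0 trillion
```

At those rough numbers, 100 million GPUs lands in the same ballpark as the Tom's Hardware estimate.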
This underscores how AI's ever-evolving nature will continue to demand more computing power for sophisticated advances. It also lines up with a claim by Roman Yampolskiy, AI safety researcher and director of the Cyber Security Laboratory at the University of Louisville, that AGI will be achieved by whichever AI firm has enough money to buy enough computing power, and that it could be achieved today if both of those requirements were readily available.
It will be interesting to see whether OpenAI can raise over $3 trillion to buy more GPUs to support its AI advances, and whether that would be enough to achieve feats like AGI or even superintelligence.