Tom’s Hardware
Technology
Anton Shilov

'Everyone and Their Dog is Buying GPUs,' Musk Says as AI Startup Details Emerge

[Image: brain over a circuit board]

Elon Musk has confirmed that his companies Tesla and Twitter are buying large numbers of GPUs, responding to a question about whether he is building up Twitter's compute capabilities to develop a generative artificial intelligence project. Meanwhile, the Financial Times reports that Musk's AI venture will be a separate entity from his other companies, though it could use Twitter content for training.

Elon Musk's AI project, which he began exploring earlier this year, is reportedly separate from his other companies but could potentially use Twitter content as training data for its language model and tap into Tesla's computing resources, according to the Financial Times. This somewhat contradicts an earlier report, which claimed that the AI project would be part of Twitter.

To staff the new project, Musk is recruiting engineers from top AI companies, including DeepMind, and has already brought on Igor Babuschkin from DeepMind along with approximately half a dozen other AI specialists.

Musk is also reportedly negotiating with various SpaceX and Tesla investors about funding his latest AI endeavor, according to an individual with firsthand knowledge of the talks, which further suggests that the project is not set to be part of Twitter.

In a recent Twitter Spaces interview, Musk was asked about a report claiming that Twitter had procured approximately 10,000 Nvidia compute GPUs. Musk acknowledged this, stating that everyone, including Tesla and Twitter, is buying GPUs for compute and AI these days. Indeed, both Microsoft and Oracle have acquired tens of thousands of Nvidia's A100 and H100 GPUs in recent quarters for their AI and cloud services.

"It seems like everyone and their dog is buying GPUs at this point," Musk said. "Twitter and Tesla are certainly buying GPUs."

Nvidia's latest H100 GPUs for AI and high-performance computing (HPC) are quite expensive. CDW sells Nvidia's H100 PCIe card with 80GB of HBM2e memory for as much as $30,603 per unit. On eBay, the cards sell for over $40,000 apiece for buyers who want them quickly.
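For a sense of scale, here is a rough back-of-the-envelope sketch (not part of the original reporting) that combines the roughly 10,000-GPU figure cited above with the retail prices quoted; actual volume pricing is negotiated and would likely be lower.

```python
# Rough cost estimate for ~10,000 H100 PCIe cards at the retail prices quoted
# in the article. Purely illustrative; bulk deals are negotiated separately.

CARDS = 10_000              # GPUs Twitter reportedly procured
CDW_LIST_PRICE = 30_603     # USD per H100 PCIe 80GB at CDW
EBAY_PRICE = 40_000         # approximate USD per card on eBay

for label, unit_price in (("CDW list", CDW_LIST_PRICE), ("eBay", EBAY_PRICE)):
    total = CARDS * unit_price
    print(f"{label}: ~${total / 1e6:,.0f} million for {CARDS:,} cards")
    # CDW list: ~$306 million; eBay: ~$400 million
```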

Nvidia recently launched its even more powerful H100 NVL product, which bridges two H100 PCIe cards, each with 96GB of HBM3 memory, into a dual-GPU 188GB solution designed specifically for training large language models. This product will certainly cost well above $30,000 per unit, though it is unclear at what price Nvidia sells such units to customers buying tens of thousands of boards for their LLM projects.

Meanwhile, the exact position of the AI team in Musk's corporate empire remains unclear. The entrepreneur established a company called X.AI on March 9, the Financial Times reported, citing business records from Nevada. He also recently changed Twitter's name in the company's records to X Corp., which may be part of his plan to build an 'everything app' under the 'X' brand. Musk is currently the sole director of X.AI, while Jared Birchall, who manages Musk's wealth, is listed as its secretary.

The rapid progress of ChatGPT from OpenAI, which Elon Musk co-founded in 2015 but is no longer involved with, reportedly inspired him to explore the idea of a rival company. The new AI venture is expected to be a separate entity from his other companies, possibly to ensure that the project will not be constrained by Tesla's or Twitter's corporate structures.
