Tom’s Guide
Jeff Parsons

ChatGPT is more useful to society than cryptocurrency, says Nvidia


The world’s processing power is better served building artificial intelligence apps like ChatGPT and Google Bard than mining cryptocurrency. That’s according to Nvidia, which would rather see its GPUs used for gaming and AI.

Michael Kagan, Nvidia’s chief technology officer, said the company never embraced cryptocurrencies and deliberately limited the mining performance of its RTX 30-series graphics cards to discourage their use for that purpose.

“All this crypto stuff, it needed parallel processing, and [Nvidia] is the best, so people just programmed it to use for this purpose,” he told the Guardian. “They bought a lot of stuff, and then eventually it collapsed, because it doesn’t bring anything useful for society. AI does.”

Despite Nvidia’s reluctance to court crypto enthusiasts, the company undoubtedly did well out of the Bitcoin and Ethereum bull market. In all likelihood, it sold a fair number of GPUs to people wanting to mine digital currencies.

“I never believed that [crypto] is something that will do something good for humanity,” Kagan added. “You know, people do crazy things, but they buy your stuff, you sell them stuff. But you don’t redirect the company to support whatever it is.”

Powering the revolution


While cryptocurrency may be struggling (Ethereum can no longer be mined, and the time and power requirements for Bitcoin are prohibitive), the AI revolution is in full swing. That’s been good for Nvidia. According to UBS analyst Timothy Arcuri, OpenAI, the company behind ChatGPT, used 10,000 of Nvidia’s GPUs to train the model behind the chatbot.

When asked directly how many Nvidia GPUs were used in its development, ChatGPT returned the following answer: “The exact number and type of GPUs used during my training process are not publicly disclosed, but it is known that the OpenAI team used a large-scale transformer-based language model architecture and trained me on a massive dataset of text.”

Currently, tens of thousands of Nvidia’s A100 and H100 Tensor Core GPUs handle training and inference for AI models like ChatGPT, running on Microsoft’s Azure cloud service.

Meanwhile, at the company’s annual conference last week, Nvidia CEO Jensen Huang described its hardware as the engine powering “the iPhone moment of AI.”
