Investor's Business Daily
Technology
By Reinhardt Krause

AI Stocks: Why The 'Inferencing' Market Will Be Bigger Than Training Models

As the boom in generative artificial intelligence fuels a rally in AI stocks, analysts are forecasting a shift in sales of the underlying technology: away from AI systems built for "training" and toward those built for "inferencing."

The shift could have major implications for Nvidia, the current leader in AI chips, and others in the market.

For AI stocks tied to data center hardware, most revenue today comes from training. But analysts expect that to shift to inferencing.

Training is the phase of machine learning in which powerful computers are taught to understand the kind of data they will analyze. Inferencing is the phase in which the trained model makes predictions on live data to produce actionable results.

AI Stocks: Training Drives Investments

"AI training refers to teaching an AI engine to properly understand a data set — for instance, all the historical customer service data ever gathered within a telecom company," a Bank of America report said.

Inferencing boils down to running AI applications or workloads after models have been trained.
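
To make the training-versus-inferencing distinction concrete, here is a minimal sketch of the two phases. It is illustrative only, not drawn from the article: the open-source scikit-learn library and the customer-service-style numbers below are assumptions chosen for the example.

```python
# Illustrative sketch (not from the article): one model in its "training"
# phase and then its "inferencing" phase, using scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Training: fit the model on historical, labeled data -- for example,
# past customer-service records and whether each customer churned.
historical_features = np.array([[5.0, 1], [3.0, 0], [8.0, 1], [1.0, 0]])
historical_labels = np.array([1, 0, 1, 0])
model = LogisticRegression().fit(historical_features, historical_labels)

# Inferencing: the trained model makes a prediction on new, live data
# to produce an actionable result, such as flagging a likely churn.
live_record = np.array([[6.5, 1]])
print("Prediction for live record:", model.predict(live_record)[0])
```

Training is a heavy, largely one-time job; the inferencing step at the end is what runs over and over once the model is deployed.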

"Training is where the opportunity is currently," a report from investment bank Raymond James said. "Inferencing presents potentially bigger long-term opportunity."

For investors in AI stocks, the point is this: Nvidia dominates in AI training, but inferencing is up for grabs. That's why Nvidia rival Advanced Micro Devices has focused on inferencing in developing new AI chips.

Nvidia has posted huge sales growth this year from selling graphics processors for generative AI. Customers use its hardware to train large language models, or LLMs.

Generative AI can create content — including written articles, images, videos and music — from simple descriptive phrases. Generative AI also can write computer programming code. Artificial intelligence systems analyze and digest vast amounts of data to create these new works.
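
As a rough illustration of that prompt-in, content-out workflow, the sketch below uses the open-source Hugging Face transformers library and the small GPT-2 model. Both are assumptions for the example; the article does not name any particular model or toolkit, and GPT-2 is far smaller than the large language models described here, but the workflow has the same shape.

```python
# Minimal sketch (illustrative assumption, not from the article): generate
# text from a simple descriptive phrase with an open-source language model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator(
    "A short news blurb about data centers:",  # the descriptive phrase
    max_new_tokens=40,                         # cap the length of the output
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```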

"Training AI models requires an order of magnitude more compute power than inferencing," Raymond James said. "The inferencing opportunity for silicon suppliers could be bigger than that of training longer term."

AMD Well Positioned In Inferencing

Investment bank TD Cowen said Nvidia isn't the only semiconductor company that will benefit from generative AI.

"AMD is well positioned to grow meaningfully with the AI total addressable market, especially initially on the inference side," TD Cowen analysts said in a report. Other potential beneficiaries for increased inference workloads include chipmakers Arteris, Broadcom, Intel, Lattice Semiconductor, Marvell Technology and Qualcomm.

Oppenheimer has a similar take on the AI inference market. "We are now entering an era in which inferencing infrastructure for generative AI has to be scaled to support every workload across centralized cloud, edge compute, and IoT (Internet of Things) devices," Oppenheimer analysts said in a report.

Investing In AI Cloud Infrastructure

Cloud computing giants, including Amazon.com's Amazon Web Services, Microsoft and Alphabet's Google, have been investing heavily in data center infrastructure to run AI apps and workloads.

Companies like Dell Technologies and Super Micro Computer expect a boom in computer server purchases.

Meanwhile, Arista Networks stock has surged on views that internet data centers will need more network bandwidth to whisk AI apps to customers.

AI Stocks: Inferencing Takes Over

The "inferencing" market should boom as Fortune 500 companies now testing generative AI move into commercial deployment. They'll deploy products using cloud computing from AWS, Microsoft, Google, Oracle, IBM and others.

"AI inferencing presents a larger monetization opportunity than AI training, we believe, as AI models are put into action, making predictions based on new inputs," Oppenheimer said. "This happens repeatedly across myriad devices and applications, including from voice assistants on smartphones to autonomous vehicles, and health care systems."

Meanwhile, data center operators stand to benefit from AI investments by tech companies and large corporations.

Bank of America sees opportunities for data center operators Digital Realty and Equinix.

"The two core elements of generative AI models are training and inference," a BofA report said. "DLR and EQIX are positioning themselves to address the incremental demand from both training and inference. While it is not a meaningful contributor today to either, they both view AI as a revenue growth accelerant in the future."

Follow Reinhardt Krause on Twitter @reinhardtk_tech for updates on 5G wireless, artificial intelligence, cybersecurity and cloud computing.
