Creative Bloq
Paul Hatton

Does the rise of the NPU spell the beginning of the end for discrete GPUs?

I was recently writing my MSI Prestige A16 AI+ review, and it got me thinking about the future of laptops. My drifting off had absolutely nothing to do with the quality of MSI's latest offering (I did give it a solid 4 stars, after all) but was rather a result of the huge upsurge we've seen in AI-branded laptops.

In the last 18 months, I've lost track of the number of AI laptops we've tested, and the trend shows no signs of slowing down. As a result, I've been questioning whether high-end creatives are going to need discrete GPUs in the future, or whether CPUs and Neural Processing Units (NPUs) will begin to dominate the landscape.

To make sure we're all on the same page, an NPU is a specialised microprocessor designed to dramatically accelerate artificial intelligence workloads. From a creative perspective, an NPU takes on the AI inference work that was previously handled, inefficiently, by a GPU or, for lighter tasks, the CPU. As a result, laptops can make fast, informed decisions without needing to run extensive calculations on the GPU.
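
To make that concrete, here's a minimal sketch in Python using ONNX Runtime (just one common inference runtime; other software stacks route work differently) of how an application might prefer an NPU-backed execution provider and fall back to graphics hardware, then the CPU. The model file name is hypothetical, and which providers actually show up depends on the machine and the onnxruntime build installed.

import onnxruntime as ort

# Preference order: power-efficient NPU paths first, CPU as the guaranteed fallback.
preferred = [
    "QNNExecutionProvider",       # Qualcomm Hexagon NPU (Snapdragon-based laptops)
    "OpenVINOExecutionProvider",  # Intel NPUs and integrated graphics via OpenVINO
    "DmlExecutionProvider",       # DirectML: integrated or discrete GPU on Windows
    "CPUExecutionProvider",       # always present
]

available = ort.get_available_providers()
providers = [p for p in preferred if p in available]

# Inference runs on the first provider in the list that accepts the work.
session = ort.InferenceSession("denoise_model.onnx", providers=providers)
print("Running inference on:", session.get_providers()[0])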

(Image credit: Intel)

What we're finding is that the type of creative tasks we're performing is changing, and as a result, the gravitational pull of processing is moving away from discrete GPUs and towards NPUs. That may be overstating the point, but at the very least, entry-level discrete GPUs are becoming increasingly redundant.

For example, why pay for a separate, small discrete GPU when the NPU and integrated graphics can handle all but the most intensive tasks while saving battery and space?

At this point in time, we've not yet witnessed a reduction in the requirement for high-end dedicated GPUs in specialised fields. These powerful standalone cards, with their massive parallel processing cores and dedicated high-speed VRAM, remain indispensable for workloads where power efficiency is secondary to raw performance.

Discrete GPUs are still required for high-fidelity 4K video editing, graphics-heavy gaming, and professional 3D rendering, but it's not beyond the realms of possibility for AI to become so powerful that discrete GPUs become entirely unnecessary. One example from my own world of 3D visualisation: rather than needing a GPU to calculate physically accurate results, an AI model could draw on what it has learned and generate the same results.

(Image credit: AMD)

I appreciate we're many years away from this reality, but given the AI progress we've witnessed in the past two years, I wouldn't be surprised if we see it a lot sooner than most would think.

For now, the CPU with an NPU and integrated GPU dominates for energy-efficient, everyday computing and basic on-device AI, while the discrete GPU maintains its stronghold by offering the unmatched computational horsepower required for the most demanding visual and AI workloads.

The discrete GPU market doesn't look like it's being eliminated any time soon, but I'll be interested to see how AI-branded laptops evolve in 2026 and whether they begin to erode the need for high-end and very expensive GPUs.
