TechRadar
Wayne Williams

Korean startup backed by LG unveils AI server that matches Nvidia's H100 performance at much lower power consumption

FuriosaAI's RNGD.
  • FuriosaAI's new RNGD Server delivers 4 petaFLOPS of FP8 compute at 3kW for efficient AI inference
  • Enterprises will be able to scale AI workloads without costly infrastructure changes
  • RNGD Server provides compatibility with OpenAI’s API alongside a growing SDK feature set

South Korean chip startup FuriosaAI, which famously walked away from Meta’s $800 million acquisition bid, is continuing to push forward with new products as demand for efficient AI infrastructure soars.

The startup is seeking to provide enterprises with hardware which can run LLMs without the costly data center upgrades and heavy energy costs often associated with GPUs.

Its latest product, the RNGD Server, is an enterprise-ready AI appliance powered by FuriosaAI’s RNGD (pronounced “Renegade”) AI inference chips.

Scaling more efficiently

Each system delivers 4 petaFLOPS of FP8 compute and 384GB of HBM3 memory, while operating at just 3kW.

In comparison, Nvidia’s DGX H100 servers can draw more than 10kW. This means a standard 15kW data center rack can hold five RNGD Servers, while the same rack would fit only one DGX H100.

FuriosaAI says that since most data centers are limited to 8kW per rack or less, its design addresses a key barrier for businesses.

Running advanced AI models in such environments would typically require new cooling and power systems.
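
As a rough illustration of that rack math, the sketch below uses only the figures quoted in this article (a 3kW draw per RNGD Server, a DGX H100 that can exceed 10kW, and 15kW and 8kW rack budgets) to show how many of each server fit within a given power budget.

    # Back-of-the-envelope rack density from the figures above.
    def servers_per_rack(rack_budget_kw: float, server_draw_kw: float) -> int:
        return int(rack_budget_kw // server_draw_kw)

    for budget_kw in (15, 8):
        rngd = servers_per_rack(budget_kw, 3)    # RNGD Server draws ~3kW
        dgx = servers_per_rack(budget_kw, 10)    # DGX H100 can draw more than 10kW
        print(f"{budget_kw}kW rack: {rngd}x RNGD Server vs {dgx}x DGX H100")
    # 15kW rack: 5x RNGD Server vs 1x DGX H100
    # 8kW rack: 2x RNGD Server vs 0x DGX H100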

The company says that by adopting the RNGD Server, enterprises will be able to scale more efficiently, while maintaining compatibility with OpenAI’s API.
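
Taken at face value, that compatibility claim suggests existing OpenAI-client code could be pointed at an RNGD Server with little more than a changed base URL. The snippet below is a minimal sketch using the standard openai Python client; the endpoint address and model name are illustrative placeholders, not documented FuriosaAI values.

    from openai import OpenAI

    # Hypothetical OpenAI-compatible endpoint exposed by an RNGD Server.
    client = OpenAI(
        base_url="http://rngd-server.local:8000/v1",  # placeholder address
        api_key="EMPTY",  # many local OpenAI-compatible servers ignore the key
    )

    response = client.chat.completions.create(
        model="exaone-3.5",  # placeholder model name
        messages=[{"role": "user", "content": "Summarize this quarter's results."}],
    )
    print(response.choices[0].message.content)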

The startup recently closed a $125 million Series C bridge round and expanded its partnership with LG AI Research.

LG uses RNGD hardware to run its EXAONE models, and says it gets more than twice the inference performance per watt compared to GPUs.

FuriosaAI also recently collaborated with OpenAI: the two companies demonstrated the open-weight gpt-oss 120B model running as a real-time chatbot on just two of FuriosaAI's RNGD accelerators.

The new RNGD Server will receive continuous updates to FuriosaAI’s SDK, which recently introduced inter-chip tensor parallelism, new compiler optimizations, and expanded quantization formats.
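
Of those SDK features, inter-chip tensor parallelism is the one that matters most for large models: a layer's weights are sharded across accelerators so each chip stores and computes only a slice. The NumPy sketch below illustrates the idea conceptually; it is not the Furiosa SDK API, just column-wise sharding of a single matrix multiply across two hypothetical chips.

    import numpy as np

    x = np.random.randn(1, 4096)           # activations for one token
    W = np.random.randn(4096, 8192)        # full layer weight matrix

    W0, W1 = np.split(W, 2, axis=1)        # shard columns across "chip 0" and "chip 1"
    y0 = x @ W0                            # partial result computed on chip 0
    y1 = x @ W1                            # partial result computed on chip 1
    y = np.concatenate([y0, y1], axis=1)   # gather the shards into the full output

    assert np.allclose(y, x @ W)           # matches the single-chip computation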

RNGD Server is currently sampling with global customers and is expected to be available for order in early 2026.
