
FuriosaAI, South Korea’s newest AI unicorn with $246 million in total funding, partnered with OpenAI at the grand opening of the AI giant’s Seoul office on Sept. 11 to deliver a live demonstration of enterprise-ready sustainable AI.
The semiconductor startup showcased OpenAI's gpt-oss 120B model running in real time on just two RNGD (pronounced "Renegade") cards using MXFP4 precision, demonstrating that large-scale AI models can operate within the power budgets of typical enterprise data centers, the company said.
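As a rough, back-of-envelope illustration (not figures from the announcement): MXFP4 stores weights as 4-bit values with a shared 8-bit scale per 32-element block, so a 120-billion-parameter model's weights come to roughly 64 GB, which suggests how such a model could fit across two accelerator cards. The per-card memory figure below is an assumption used only for illustration.

```python
# Back-of-envelope memory estimate for a 120B-parameter model in MXFP4.
# Assumptions (illustrative only, not from the announcement):
#   - MXFP4 uses 4-bit elements plus one shared 8-bit scale per 32-element block,
#     i.e. roughly 4 + 8/32 = 4.25 bits per weight.
#   - 48 GB of on-card memory per accelerator (assumed spec).

params = 120e9                      # 120 billion parameters
bits_per_weight = 4 + 8 / 32        # MXFP4: 4-bit values + shared 8-bit block scale
weight_gb = params * bits_per_weight / 8 / 1e9

cards = 2
card_memory_gb = 48                 # assumed memory per card

print(f"Weights alone: ~{weight_gb:.0f} GB")                       # ~64 GB
print(f"Available across {cards} cards: {cards * card_memory_gb} GB")  # 96 GB (assumed)
```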
FuriosaAI closed a $125 million Series C bridge round in July, lifting its valuation to $735 million. The company also drew attention after turning down an $800 million acquisition offer from Meta Platforms (NASDAQ:META).
Breaking the GPU Stranglehold on Enterprise AI
FuriosaAI described its RNGD accelerator as a compelling alternative to graphics processing unit (GPU)-dependent AI infrastructure, backed by validation from LG AI Research and global investors.
The demonstration at OpenAI's new Seoul headquarters ran on FuriosaAI's RNGD cards, which can be installed in either a standard workstation or a dedicated RNGD server. FuriosaAI said this shows advanced AI models can be deployed within the power budgets of typical enterprise data centers rather than relying on expensive, high-power GPU infrastructure.
The company described this approach as removing the "prohibitive energy costs and complex infrastructure requirements of GPUs," a challenge that has limited many enterprises from adopting advanced AI at scale.
The real-time chatbot demonstration highlighted how RNGD enables enterprises to run state-of-the-art open-weight models without vendor lock-in or unsustainable total cost of ownership. According to FuriosaAI, businesses can now deploy advanced AI efficiently within existing infrastructure and power limitations, whether through on-premises servers or cloud data centers.
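As an illustration of what avoiding vendor lock-in typically looks like in practice, many on-premises inference servers expose an OpenAI-compatible HTTP API, so existing client code only needs to point at a local endpoint. The sketch below assumes such a server is already running locally and serving the open-weight gpt-oss 120B model; the endpoint URL, model name, and API key are placeholders, not details from FuriosaAI's announcement.

```python
# Minimal sketch: querying a locally hosted open-weight model through an
# OpenAI-compatible endpoint. Assumes an inference server (whatever stack
# drives the accelerator cards) is already running on-premises.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local endpoint
    api_key="not-needed-for-local",       # placeholder; local servers often ignore it
)

response = client.chat.completions.create(
    model="gpt-oss-120b",                 # open-weight model served in-house
    messages=[{"role": "user", "content": "Summarize this quarter's energy usage report."}],
)
print(response.choices[0].message.content)
```

Because the interface matches the hosted API, an enterprise could switch between cloud and on-premises serving without rewriting application code.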
Why the FuriosaAI-OpenAI Collaboration Matters for Enterprises
FuriosaAI said the collaboration with OpenAI brings three benefits: lower ownership costs, broad deployment options, and greater data control. By supporting open-weight models, RNGD removes vendor lock-in and reduces the total cost of ownership tied to power-hungry GPUs, FuriosaAI said.
The hardware allows enterprises to scale within existing infrastructure, whether on-premises servers or cloud data centers, according to the statement. Companies dealing with sensitive workloads also gain full control over model weights and data, a feature designed to help meet compliance requirements while enhancing security and privacy.
How RNGD Delivers Breakthrough Efficiency
At the center of FuriosaAI's technology is its Tensor Contraction Processor, a custom architecture designed to address inefficiencies of GPUs in AI workloads. The company said the processor maximizes data reuse and parallelism, creating both strong performance and high energy efficiency.
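For readers unfamiliar with the term, a tensor contraction generalizes matrix multiplication by summing over shared indices of multi-dimensional arrays, and most transformer layers reduce to such contractions. The NumPy sketch below illustrates the operation itself only; it says nothing about FuriosaAI's hardware implementation, and the shapes are arbitrary.

```python
# Illustration of a tensor contraction (not FuriosaAI's implementation).
# An attention-style score computation is one contraction: sum over the shared
# head-dimension index d of queries and keys.
import numpy as np

batch, heads, seq, dim = 2, 8, 128, 64
q = np.random.randn(batch, heads, seq, dim).astype(np.float32)
k = np.random.randn(batch, heads, seq, dim).astype(np.float32)

# scores[b, h, i, j] = sum_d q[b, h, i, d] * k[b, h, j, d]
scores = np.einsum("bhid,bhjd->bhij", q, k)
print(scores.shape)  # (2, 8, 128, 128)
```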
FuriosaAI cited LG AI Research as an early customer that achieved 2.25 times better inference performance than GPUs while staying within strict energy and cost limits. According to FuriosaAI, RNGD has been proven reliable in real-world production environments, meeting enterprise standards for performance and efficiency.
The Seoul demonstration illustrated how enterprises can deploy advanced large language models on RNGD within current budgets and infrastructure, avoiding disruptive data center upgrades, according to FuriosaAI.
For global businesses, the ability to scale AI while controlling costs and energy use is becoming a competitive advantage. FuriosaAI framed RNGD as the "foundational compute layer of the AI stack," built to support more sustainable growth of enterprise AI worldwide.
Image: Shutterstock