
Every year, the State of Unreal manages to surprise, but perhaps not since the reveal of Unreal Engine 5 back in 2020 has it mattered this much. In 2025, Epic Games didn't just deliver; it reshaped the landscape. I'm lucky enough to be sitting in the crowd at this year's Unreal Fest, watching it happen.
As I sat watching the keynote, notepad in hand, it quickly became clear this wasn't just about new tech. It was about changing how games and digital experiences are, and will be, made; who gets to make them; and what's even possible in real-time 3D.
Read about the new Unreal Engine 5.6 features that were announced, then let me break down the biggest news, straight from the show floor.

The Witcher 4 tech demo stole the show
Let’s get this out of the way: no, this isn’t The Witcher 4. But honestly, I don't care because what CD Projekt Red and Epic showed off was breathtaking.
The tech demo stars Ciri and is set in a lush, lifelike, never-before-seen part of the Continent – the snowy, richly atmospheric region of Kovir. What's really jaw-dropping is how seamlessly it runs on a PS5: 60fps, ray tracing turned on, dense environments buzzing with life.
The highlight has to be Unreal Engine 5.6’s Fast Geometry Streaming Plugin, which keeps open worlds snappy without sacrificing fidelity. Combine that with the new Nanite Foliage (arriving in UE 5.7), ML Deformer, and the kind of cinematic detail we used to dream about, and suddenly I'm not just watching a demo, but peeking into the next generation of interactive storytelling.
The demo also showcased Epic's latest photogrammetry tech: RealityScan 2.0, which is just around the corner. It unifies desktop and mobile workflows and adds powerful features such as AI-powered masking, smarter alignment and even airborne laser scan support – and CD Projekt Red put it to work on this very Witcher demo.

Unreal Engine 5.6 promises bigger, smoother, smarter
Unreal Engine 5.6 is out now – you can download it for yourself – and it's a beast. The headline here is performance. Open-world support has been supercharged to the point where massive, detailed landscapes don't just work on current-gen consoles; they run in a way that eases many of the gripes people have had with UE5.
What really caught my eye was the Unreal Engine-first animation and rigging. Forget round-tripping to your DCC tools; now you can do the heavy lifting right inside Unreal. Add in native MetaHuman authoring, and you're suddenly working in a pipeline that feels frictionless, immediate, and artist-first. That hasn't always been the case – read the official blog for a full rundown.


MetaHuman 5.6: digital humans level up
This was one of those 'finally!' moments from the show. MetaHuman is officially out of Early Access and now fully baked into the engine, and it's more versatile than ever.
You can now build, animate, and deploy MetaHumans not just in Unreal Engine 5.6 but in Unity, Godot, Maya, Houdini, Blender… you name it – the best 3D modelling software and the best game development software will be supporting MetaHuman. The improved fidelity is stunning, especially when paired with MetaHuman Animator, which can now generate real-time animation from almost any camera or even audio input. I saw this in action the night before the keynote, and it's impressive (and fun).
New digital content creation (DCC) plugins and Fab integration round out what feels like the most powerful digital human toolkit we've ever had.

UEFN: LEGO, Squid Game and AI NPCs
If you haven't been paying attention to Unreal Editor for Fortnite (UEFN), now's the time to tune in. In just over two years, it's become a juggernaut, with creators clocking 11.2 billion hours across hundreds of thousands of islands, and over $722 million paid out. What started out as a bit of fun, a gateway to game development, has become big business.
This summer, UEFN levels up again. Starting 17 June, you’ll be able to build LEGO experiences brick-by-brick using a dedicated LEGO Brick Editor.
The IP hook-ups keep coming, with Squid Game arriving 27 June, followed by Avatar: The Last Airbender and Star Wars. Speaking of a galaxy far, far away, the Vader NPC in Fortnite: Galactic Battle can now talk back using conversational AI, which is both creepy and impressive. Later this year, you'll be able to build your own AI-powered characters using the Persona Device – you can see it demonstrated on the State of Unreal YouTube channel.
The implications are massive – I've tried something similar before with Nvidia's AI NPCs – and an AI that drives not just NPC dialogue but entire gameplay logic is, well… game-changing. Combine this with the new Epic Developer Assistant (which writes Verse code with you in real time) and the upcoming Scene Graph (launching 7 June), and UEFN isn't just a level editor anymore. It's a full-blown game engine, now powered by AI.

Unreal feels bigger than ever
The State of Unreal 2025 didn't just drop new features; it unveiled an entire vision, and the crowd's reaction here in Orlando suggests it has landed well. Sure, The Witcher 4 tech demo stole the show and points to a future where open-world development doesn't require trade-offs, but the news of improved digital human modelling and animation, and of AI that doesn't just support creators but empowers them, feels like a line in the sand we're only just stepping over.
If you're building the future of games, film, architecture, or anything in between, it's hard to ignore where the tools are headed. Right now, they're all pointing squarely in Unreal's direction.