The Art Director's Liberation
Creating Without Waiting

GENESIS
There's a moment every art director knows too well. You've meticulously placed each light, positioned the camera just so, and given direction to bring your vision to life. Then comes the dreaded phrase: "Let me render that and get back to you tomorrow." That moment is now vanishing from TwentythirdC's creative process. And holy shit, what a difference it makes.

The Real in Realtime
As an art director, the ability to move a light, adjust its intensity, and immediately see how it affects your scene isn't just convenient—it's transformative. It changes how you think, how you create, and how boldly you explore possibilities.
Before Unreal, I'd have to imagine how a lighting change might look, then wait hours to see if I was right. Now I can try five approaches in five minutes and make decisions based on what I actually see, not what I think I might see.
This WYSIWYG (What You See Is What You Get) approach isn't just about speed—it's about creative accuracy. The gap between intention and execution shrinks to almost nothing, allowing us to work with a precision that was previously impossible.
The Sphere: Impossible Made Possible
When we landed the project for The Sphere in Las Vegas last year, the scope was intimidating: create immersive visual content for the world's largest LED screen on a compressed timeline. Using traditional pipelines, it would have been a massive headache. [spoiler alert: it was]

What made this Sphere project fascinating was the hybrid approach. We leveraged both Unreal Engine and Cinema 4D with Octane—each for what it did best. The contrast was enlightening.
Working in C4D with Octane gave us the photorealistic, accurate quality we needed for certain elements, but the render times were brutal. Even with optimization techniques, we'd make a small adjustment to a light and then... play the waiting game.
The 24K resolution renders in C4D became a genuine production challenge. Each frame demanded significant render time, and without access to a render farm, it would have been flatly impossible to deliver.
"We ended up spending a couple thousand on render farm services just to make the C4D pipeline viable," Nik recalls. "Even then, the bottleneck of waiting for renders meant creative decisions had to be more conservative. Using Unreal, we could actually preview content at scale, with accurate lighting and perspective, making adjustments that accounted for the curved surface and massive scale."
Starset: Breaking Creative Barriers
Our music video for Starset's "Brave New World" demonstrates how realtime tools shattered previous limitations. The project demanded dozens of CG environments, complex character animations, and atmospheric effects—all on a timeline and budget that would have been laughable using traditional methods.

We were able to build, light, and film virtual environments as if we were a physical camera crew. When something wasn't working, we could change it on the spot. No waiting, no compromise, just pure creative flow.
This approach allowed us to create shots that would have each required days of rendering in just hours of actual work. More importantly, it let us focus on what mattered—the emotional impact of each scene—rather than technical limitations.

The Unreal Difference
The contrast with our Unreal Engine pipeline was stark and revelatory.
With Unreal, I spent almost all my time actually directing—moving lights, adjusting materials, refining camera movements—instead of waiting and managing render queues. The immediate feedback loop completely changed how I approached the creative process.
The numbers tell the story: a 600-frame sequence at full 24K resolution rendered on a single local machine in about 3 hours, roughly 18 seconds per frame. The same sequence through traditional rendering would have taken days, even with expensive render farm support; we proved this firsthand with the other scenes we rendered in C4D. The difference wasn't just practical. It was mental.
The only real bottleneck with Unreal was converting the EXR sequences to 24-bit PNG, a file-handling issue rather than a creative one.
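That conversion step is easy to script. Here's a minimal sketch that builds an ffmpeg command for the job, assuming an ffmpeg build with EXR support is on the PATH; the filename patterns and the sRGB transfer choice are illustrative assumptions, not TwentythirdC's actual pipeline:

```python
import subprocess  # used by the commented-out run() call below


def exr_to_png_cmd(exr_pattern: str, png_pattern: str, start: int = 0) -> list[str]:
    """Build an ffmpeg command that decodes a linear EXR sequence with an
    sRGB transfer curve applied and writes 24-bit (rgb24) PNG frames."""
    return [
        "ffmpeg",
        "-apply_trc", "iec61966_2_1",   # apply sRGB transfer while decoding EXR
        "-start_number", str(start),
        "-i", exr_pattern,              # hypothetical pattern, e.g. "shot010.%04d.exr"
        "-pix_fmt", "rgb24",            # 8 bits per channel = PNG 24
        png_pattern,                    # e.g. "shot010.%04d.png"
    ]


cmd = exr_to_png_cmd("shot010.%04d.exr", "shot010.%04d.png")
# subprocess.run(cmd, check=True)  # uncomment once ffmpeg with EXR support is installed
```

Wrapping the command in a small function like this makes it trivial to batch-convert many sequences, which is exactly where the file-handling time goes on a project of this size.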
From Frustration to Flow
The psychological impact on our creative process can't be overstated. Traditional rendering pipelines introduced enormous friction between ideation and execution, creating frustration and hesitation.
"The mental cost of thinking 'if I try this and it doesn't work, I've wasted a day of rendering' was huge," Nik reflects. "It made us more conservative, less willing to experiment."
With realtime tools, that friction vanishes. Creative energy that was once spent managing technical limitations is now focused entirely on making scenes look their best. The conversation shifts from "Can we afford to try this?" to "What would make this stronger?"
Keeping Perspective: The Right Tool for the Job
While we've embraced realtime engines, we maintain perspective about their place in the broader ecosystem. Path tracing renderers like Octane, Redshift, and Cycles still have their place for projects where photorealistic light transport is non-negotiable. It's not about replacing all our tools; it's about using the right one for each job. Sometimes we need the absolute physical accuracy of path tracing. But often, the visual quality of realtime engines is more than sufficient, and the creative advantages are overwhelming.
What's exciting is how quickly this gap is closing. Each update to Unreal Engine brings it closer to the visual quality of offline renderers, while maintaining its realtime advantage.
The Immersive Future
As we look toward a future where content increasingly shifts toward AR experiences, immersive venues like The Sphere, and devices like Vision Pro and AR glasses, realtime creation becomes not just advantageous but essential.
"Future content won't just be viewed—it'll be experienced, often with viewers controlling their perspective," Nik explains. "Creating for these media demands tools that let us simulate and test these experiences during development."
For art directors, this represents both a challenge and an opportunity. The skills developed in realtime creation—thinking spatially, considering multiple viewpoints, and designing for interactive experiences—position forward-thinking studios to lead in this emerging landscape.
The Ecosystem Explosion
What makes this moment so exciting isn't just Unreal Engine—it's the broader explosion of real-time creative tools transforming every aspect of motion design and visual creation.
EmberGen has revolutionized fluid simulations, allowing artists to create and modify complex smoke, fire, and liquid effects with immediate feedback. What once required overnight renders can now be tweaked in real-time, opening up experimental approaches that were previously impractical.
Rive has transformed UI animation and interactive graphics, bringing real-time principles to interface design. The ability to create complex, responsive animations that can be directly implemented in products rather than approximated has eliminated the gap between design and implementation.
Beyond these, we're seeing tools like:
- Notch for real-time visual effects and interactive content
- TouchDesigner for procedural generative content
- Unity's Art Tools for certain specialized visualization needs
- Blender's EEVEE renderer bridging the gap between speed and quality (which helped FLOW win an Oscar, by the way)
"What's significant isn't just individual tools," Nik observes. "It's the philosophical shift toward immediacy across the entire creative process. Every part of the pipeline is moving toward this real-time paradigm."
The Creative Impact
The practical impact of this shift goes beyond production efficiency. It fundamentally changes how we make creative decisions.
When rendering isn't a bottleneck, you take more risks. You try the wild camera move, the complex lighting setup, the unusual material combination—because why not? You'll see immediately if it works.
For clients, this translates to more innovative work. Projects that once would have been deemed too experimental due to production constraints become viable when the exploration cost drops dramatically.
As we look to future projects, this real-time capability is becoming central to our creative advantage. The ability to produce high-quality, high-resolution content efficiently doesn't just save money—it expands what's creatively possible within client timeframes and budgets.
I’ve gone from spending maybe 20% of my time on actual creative direction to well over 80%. The rest used to be technical management and waiting. That shift alone has transformed the quality of our output.
For upcoming immersive projects—whether for venues like the Sphere, AR experiences, or interactive installations—this real-time foundation positions TwentythirdC to take on challenges that would have been daunting just a few years ago.
What a Time to Be Alive
There's a particular satisfaction in witnessing a technological revolution that directly serves creative expression. As Nik puts it: "Throughout history, artistic revolutions often followed technological ones. Oil paint, photography, digital tools—each expanded what artists could express."
Today's real-time revolution feels similar in scale. It's not just about efficiency; it's about removing barriers between imagination and realization. When the technical friction diminishes, pure creative thinking flourishes.
At TwentythirdC, we're not just using these tools—we're riding the wave of a fundamental transformation in how visual storytelling happens. The future belongs to creators who can think beyond the constraints of traditional pipelines and embrace the immediacy of real-time creation.
And yes—what a time to be alive indeed.