Motion, Unfiltered
Where digital dreams meet real-life canvases.


GENESIS
The digital realm has always been our playground. For years, we've crafted motion design and art direction that lives in the ephemeral world of screens—social feeds that get scrolled past, game sequences that blend into gameplay, films that transport viewers temporarily. But lately, something's shifted. Our work has been breaking out of the 2D confines and materializing in physical spaces at a rate we never anticipated.

When Unit9 approached us to create content for Nike's latest Air Max generation, to be displayed on Outernet London's massive LED screens, we realized this wasn't just another digital project. The scale alone was mind-blowing—creating motion that would surround viewers completely, visible from multiple angles simultaneously. This wasn't content designed to be consumed; it was content designed to be experienced.
Matching Outernet's unique architectural configuration proved especially challenging. Synchronizing content across multiple non-standard walls while ensuring seamless transitions between surfaces required precision we hadn't needed before. What looks perfectly aligned in a preview might reveal jarring disconnects when projected onto actual physical spaces.
The Sphere in Vegas presented similar challenges but cranked to eleven. Creating immersive visuals for what's essentially a giant dome isn't like designing for a flat surface. Suddenly we were thinking about sightlines, physical positioning, and how motion feels when it surrounds you rather than just plays in front of you.
Roots in Live Experiences

Looking back, we realize we've been unconsciously preparing for this shift for years. Back in 2016, we created visuals for Illenium's "Take You Down" performance—an ambitious project where we explored a narrative of an ambiguous character falling through a multidimensional, architectural world. The visuals symbolized the journey to find the source of pain, creating an immersive backdrop to Illenium's emotionally charged music.
Similarly, when Lorenzo De Pascalis brought us on to create visuals for Fedez's Flight Tour, we produced content for three tracks that were projected onto a massive cube structure that Fedez performed inside. The scale and immersive nature of that project pushed our creative boundaries in ways we couldn't have anticipated.



Those projects were almost seven years ago. Little did we know they would foreshadow the majority of our studio's output: we've since progressed from occasional live experiences to making them a cornerstone of our creative offering.
A significant milestone in our live experience journey was the Xbox Series X and S launch project "Power Your Dreams." Unit9 approached us to create content for what was originally planned as a live event but evolved into a livestream due to restrictions. We collaborated with director Sean Pruen to create a motion sequence projected onto a 12-foot monolith in central London, timed perfectly so the last frame coincided with the console going on sale at midnight. We also created a boot-up sequence triggered on set to introduce the monolith, which was ultimately streamed to hundreds of thousands of eager gamers.
Monster Specs, Monster Problems
The technical specifications for The Sphere project nearly broke us. We were dealing with 16K resolution full CG scenes that generated EXR sequence files of staggering size. Our storage systems, which had always been more than adequate, suddenly seemed laughably insufficient.
Coordinating between rendering capabilities, storage limitations, and client feedback loops became a project unto itself. We created new pipeline protocols just to manage the data flow. When a single rendered frame takes up more space than an entire previous project, you quickly learn to rethink everything about your workflow.
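Some back-of-envelope math shows why the data flow became a project unto itself. The sketch below assumes uncompressed 32-bit float RGBA frames at a square 16K resolution; real EXR sequences are usually compressed (ZIP, PIZ, or DWAA) and channel counts vary per shot, so treat these figures as an illustrative worst case rather than our actual deliverable specs:

```python
# Back-of-envelope storage estimate for a 16K EXR sequence.
# Assumes uncompressed 32-bit float RGBA (worst case); production
# EXRs are typically compressed, so real sizes land lower.

WIDTH = HEIGHT = 16_384        # "16K" square canvas (assumption)
CHANNELS = 4                   # RGBA
BYTES_PER_CHANNEL = 4          # 32-bit float
FPS = 24                       # illustrative frame rate

bytes_per_frame = WIDTH * HEIGHT * CHANNELS * BYTES_PER_CHANNEL
gib_per_frame = bytes_per_frame / 2**30
tib_per_minute = bytes_per_frame * FPS * 60 / 2**40

print(f"{gib_per_frame:.0f} GiB per frame")    # 4 GiB
print(f"{tib_per_minute:.2f} TiB per minute")  # 5.62 TiB
```

At roughly 4 GiB a frame and over 5 TiB a minute before compression, storage that comfortably held entire previous projects fills up within seconds of footage.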
The Scale Disconnect
One of the trickiest aspects of transitioning to live experiences has been developing an intuition for scale. Something that looks perfectly balanced on a 27-inch monitor might feel completely overwhelming—or conversely, embarrassingly tiny—when projected onto a building-sized display.
We've had to develop new previsualization techniques just to approximate how motion will feel when experienced at actual scale. It's not just about size either—the way people physically move through a space changes how they perceive motion. Animation timing that feels perfect on a screen might feel sluggish or frantic in a walk-through environment.
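One simple way to reason about this disconnect (a simplified sketch, not our full previs pipeline; the speeds and viewing distances below are hypothetical) is to convert physical motion into the angular speed a viewer actually perceives. Scale an element up fifty times without moving the audience fifty times further back, and the same animation sweeps across far more of their field of view:

```python
import math

def angular_speed_deg(physical_speed_m_s: float, viewing_distance_m: float) -> float:
    """Approximate degrees per second an object sweeps across a viewer's
    field of view (angle subtended in the first second of motion)."""
    return math.degrees(math.atan2(physical_speed_m_s, viewing_distance_m))

# Hypothetical numbers: the same move previewed on a desktop monitor
# versus scaled 50x onto a building-sized LED wall.
monitor = angular_speed_deg(0.1, 0.6)   # 10 cm/s element, 60 cm away
led_wall = angular_speed_deg(5.0, 10.0) # 5 m/s element, viewer 10 m away

print(f"monitor:  {monitor:.1f} deg/s")   # ~9.5 deg/s
print(f"LED wall: {led_wall:.1f} deg/s")  # ~26.6 deg/s
```

Timing that reads as calm on the monitor nearly triples in perceived speed at venue scale, which is exactly why motion that felt perfect on screen can turn frantic in a walk-through environment.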
Realtime Revolution
The excessive render times for these massive projects pushed us toward a solution that's revolutionizing our workflow: realtime tools. Unreal Engine has become central to our process for large-scale projects, allowing us to preview complex environments at actual scale before committing to final renders.
Embergen for fluid simulations and Rive for interactive animations have similarly transformed our capabilities. The ability to art direct and receive feedback in realtime has been game-changing—not just for efficiency but for creative exploration. When client and creator can iterate together in the moment, the work reaches places it otherwise might not.
The Logistics Headache
Live experiences bring logistical challenges that digital simply doesn't. For the F1 2022 live presentation, we weren't just delivering files—we were part of a complex operation involving physical setup, tech checks, and real-time playback systems.
Suddenly we were sending team members to venues days in advance, coordinating with lighting techs, sound engineers, and stage managers. Our delivery deadlines became inflexible—no "we'll send an update tomorrow" when thousands of people are showing up at a specific time.
The Immersive Payoff
Despite the technical challenges and workflow overhauls, the payoff has been immense. There's something profoundly different about experiencing creative work in physical space—something that simply can't be replicated through a screen.
When our work exists in the real world, people don't just watch it through the filter of an ad or a social post—they feel it, sense it, immerse themselves within it. The impact is exponentially greater. A subtle animation that might go unnoticed online becomes visceral and powerful when it surrounds viewers in physical space.
Even when creating similar content, the live experience version hits 10 times harder. Colors feel more vibrant, motion feels more dynamic, and emotional responses are more immediate and powerful. We're not just creating content to be viewed; we're creating environments to be experienced.
How We've Adapted
The shift hasn't been seamless, but it's been exhilarating. Our core team—Nik Hill steering the creative vision, Andy Dominique Rak handling art direction and lead motion, and Owais Javid executing motion and supporting art direction—has had to stretch in new directions.
We've developed new workflows that factor in physical space from the start. Rather than designing and then adapting to physical contexts, we now begin by mapping out the physical experience first.
Why This Matters
The rise in requests for live experiences isn't random. People are craving physical connection after years of digital saturation. Brands understand this and are investing in moments that can't be scrolled past or skipped.
For us at TwentythirdC, this evolution feels natural. We've always been about creating visual experiences that resonate—the medium is secondary to the impact. Whether it's on a phone screen or projected onto a 366-foot spherical display, our goal remains the same: create work that stops people in their tracks.
The challenge now is maintaining our distinctive visual approach while mastering the unique demands of physical space. But if there's one thing we've learned, it's that constraints often lead to our most innovative work.