Transmedia Storytelling and the Multiplatform Worlds of HaZ Dulull

Imagine watching a sci-fi film, then playing a video game that continues the story, and even exploring that universe in an interactive online experience. This is the essence of transmedia storytelling – building one cohesive story world across multiple media platforms. In this explainer, we’ll break down what transmedia storytelling means, where it came from, and how creators use it today. We’ll focus on the work of British filmmaker and creative technologist Hasraf “HaZ” Dulull, whose career exemplifies modern transmedia production. From his feature-film-and-video-game project RIFT/Max Beyond to a Fortnite rap battle experience for the BBC, HaZ Dulull shows how a story can leap across film, games, and virtual platforms. Along the way, we’ll see how new tools like real-time game engines and motion capture are empowering storytellers to create rich, cost-effective transmedia experiences.

What is Transmedia Storytelling?

Transmedia storytelling means telling a single story or building a fictional universe across multiple platforms – such as films, television, games, comics, or virtual experiences – so that each medium contributes uniquely to the overall narrative. Media scholar Henry Jenkins famously defined transmedia storytelling as a process where “integral elements of a fiction get dispersed across multiple delivery channels for the purpose of creating a unified and coordinated entertainment experience. Ideally, each medium makes its own unique contribution to the unfolding of the story” (henryjenkins.org). In other words, the different pieces (movie, game, novel, etc.) aren’t just retellings of one another – they expand the story. A classic example is The Matrix franchise: key story information was spread across three live-action films, an animated anthology (The Animatrix), comics, and video games, so no single piece gave you everything – you had to experience multiple media to get the full picture of the Matrix universe (henryjenkins.org). This strategy engages audiences more deeply and encourages them to follow the story across forms.

Transmedia storytelling emerged from both academic theory and industry practice. Jenkins and others observed how modern entertainment franchises were increasingly “horizontally integrated” – large media companies spread their stories across film, TV, books, games and more, both to maximize revenue and to build richer story worlds (henryjenkins.org). This often grew out of marketing synergy (for example, a comic book prelude released to drum up excitement for a movie, while also adding backstory) (henryjenkins.org). Over time, creators learned to “surf” these pressures and use transmedia not just as marketing, but as an art form to deliver an “expansive and immersive story” that wouldn’t be possible in a single medium (henryjenkins.org). Early pioneers included expansive franchises like Star Wars (with its films, novels, games, etc.), and experimental projects like alternate reality games (ARGs) that blended fiction with real-world media.

A key aspect of transmedia storytelling is world-building. Rather than focusing only on one plot or character, transmedia projects often construct a complex fictional world that can support many stories across media (henryjenkins.org). Each extension (a game, a web series, a set of character diaries on social media, etc.) explores different corners of that world, or different perspectives, enriching the audience’s understanding. For example, a television show might spawn an official podcast with in-universe “archives” that reveal character backstories, or a video game might let fans play through events only hinted at in a film. Ideally, each platform’s content is additive – contributing new insights (“additive comprehension”, as game designer Neil Young puts it) – while still standing on its own so newcomers aren’t confused (henryjenkins.org). Achieving this balance can be challenging, and it requires coordination and creative vision. In fact, Jenkins notes that transmedia works best when one creator or a tightly collaborative team plans the story across all media from the outset, rather than as a loosely licensed afterthought (henryjenkins.org).

The Evolution from Theory to Practice

In the 2000s, transmedia storytelling moved from theory into common industry practice. Hollywood studios saw the success of comic-book universes and began to conceive big franchises (like the Marvel Cinematic Universe) with multi-platform expansion in mind. At the same time, technological advances lowered the barriers for independent creators to execute transmedia ideas. It’s no longer something only mega-franchises can do – thanks to digital tools, even a small studio or an individual can create film-quality and game-quality content simultaneously. This democratization is crucial to understanding modern transmedia: today’s creators can use game engines, online distribution, and social media to build a story world that lives on multiple platforms without needing a huge corporate machine behind them (mudstack.com).

Enter HaZ Dulull – a filmmaker who has taken transmedia storytelling into the independent production space. HaZ started his career in video games and visual effects before directing indie sci-fi films (The Beyond (2018), 2036: Origin Unknown) that gained attention on streaming platforms (magazine.reallusion.com). He then co-founded a production company (HaZimation) and later Beyond the Pixels, focusing on creating original intellectual property (IP) that spans animated films, series, and video games (magazine.reallusion.com; collisionawards.com). In short, HaZ Dulull has positioned himself at the intersection of film, gaming, and virtual production, deliberately developing projects as transmedia from day one. “We produce animated feature films, series and video games based on a proprietary pipeline using Unreal Engine,” he explains (creativebloq.com). By leveraging a real-time 3D engine (Unreal) and other tools, Dulull’s team blurs the line between making a movie and making a game. This lets them carry one story across two mediums with a unified creative vision. What makes HaZ Dulull particularly interesting for media students is how he exemplifies modern transmedia in practice – not as a huge franchise with separate teams, but as a cohesive indie production using new tech and clever workflows to tell stories in multiple forms.

Next, we’ll dive into some of HaZ Dulull’s projects that show transmedia storytelling in action. Each case study demonstrates different facets of building a narrative universe across platforms – from a feature film paired with a video game, to a reality TV show transformed into a Fortnite game experience, to cinematic storytelling inside a big video game world. Through these examples, we’ll see how transmedia concepts translate into real-world production strategies.

Case Study: RIFT and Max Beyond – One Story, Two Mediums

Figure: A frame from Max Beyond (2024), the animated sci-fi feature film directed by HaZ Dulull. Created entirely using the Unreal Engine (a real-time 3D game engine), the film achieves an anime-inspired visual style. The Max Beyond story – about two brothers trying to reunite across parallel universes – was developed not only as a movie but also as a video game, with both using the same 3D assets and world (magazine.reallusion.com; televisual.com).

Perhaps the most ambitious example of HaZ Dulull’s transmedia approach is his project originally known as RIFT, later released as the animated feature film Max Beyond (2024). This project was conceived simultaneously as a feature-length movie and a video game, sharing the same story universe, characters, and even much of the same digital production assets. The story follows a former marine named Leon who is “continuously trying to rescue his little brother Max” – a boy with the power to tear rifts in spacetime – from a secret research facility that is exploiting his abilities (magazine.reallusion.com). Each time Leon attempts the rescue, it ends tragically, as Max’s power rewinds time and splinters them into parallel universes. Both brothers desperately search for the one reality where they can change fate and reunite. HaZ Dulull has described the concept as “‘Edge of Tomorrow’ meets ‘Akira’” – blending a looping, action-packed sci-fi thriller with anime-inspired style (magazine.reallusion.com). It’s a bold, universe-spanning narrative that would have been too expensive in live action, which is why Dulull pivoted to animation, using Unreal Engine to bring it to life during the pandemic (community.wacom.com).

From the start, RIFT/Max Beyond was designed as a transmedia production. Dulull’s studio built a unified pipeline for creating the film in Unreal Engine, so every 3D character model, environment and animation could be repurposed for an interactive game. “I don’t think we would have been able to have this creative flexibility and high-quality output for our feature film and video game simultaneously, without the integration of [real-time tools] in our pipeline,” Dulull noted of this process (magazine.reallusion.com). Essentially, they were making a movie and a game at the same time with a small team – a feat made possible by real-time rendering (allowing them to see finished animation instantly) and asset-sharing (using the same digital assets in both mediums). The animated film portion, Max Beyond, runs ~93 minutes and tells a contained narrative about Leon and Max’s multiverse ordeal. Meanwhile, the video game (currently in development for PC and consoles) is designed to let players dive deeper into that multiverse. The game “plays in the same universe as the animated feature film but strongly leans into the multi-verse aspect by embracing the branch narrative approach you get in video games, something we couldn’t really do in a 90-minute animated film” (magazine.reallusion.com). In other words, the film gives you one dramatic storyline, whereas the game will allow exploration of alternate outcomes and side-stories within the same universe – a perfect example of each medium contributing differently to the whole story.
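To make the film-versus-game contrast concrete, here is a minimal, purely illustrative sketch of how a branching, multiverse-style narrative can be represented as a graph of story nodes that a player traverses by choice, in contrast to the single fixed path a film follows. The node names and story beats are hypothetical examples loosely inspired by the premise described above, not code from the actual Max Beyond game.

    # Illustrative sketch only: a branching "multiverse" story modelled as a graph.
    # Node names and beats are hypothetical, not taken from the Max Beyond game.
    from dataclasses import dataclass, field

    @dataclass
    class StoryNode:
        node_id: str
        description: str
        choices: dict = field(default_factory=dict)  # maps choice text -> next node id

    STORY = {
        "rescue_attempt": StoryNode("rescue_attempt",
            "Leon breaches the facility to reach Max.",
            {"Fight through the guards": "timeline_a", "Slip past unseen": "timeline_b"}),
        "timeline_a": StoryNode("timeline_a",
            "The rescue fails and Max's power rewinds time.",
            {"Try again in a new reality": "rescue_attempt"}),
        "timeline_b": StoryNode("timeline_b",
            "The brothers find the one reality where they can reunite.", {}),
    }

    def play(node_id="rescue_attempt"):
        """Walk the graph interactively; a linear film can follow only one such path."""
        node = STORY[node_id]
        print(node.description)
        while node.choices:
            options = list(node.choices.items())
            for i, (text, _) in enumerate(options, 1):
                print(f"  {i}. {text}")
            _, next_id = options[int(input("> ")) - 1]
            node = STORY[next_id]
            print(node.description)

    if __name__ == "__main__":
        play()

Each playthrough of a graph like this traces a different branch of the same universe, which is the kind of exploration the game adds on top of the film’s single fixed storyline.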

Interestingly, the idea for making a game grew organically out of the filmmaking process. While developing RIFT, Dulull’s team realized that they had created a wealth of 3D content and backstory that went beyond what could fit in the movie. In a weekend “game jam” experiment, they “took the existing characters and assets, migrated them to a game project… and soon did immersive things in gameplay which was a ton of fun” (mudstack.com). They discovered that the story’s branching multiverse narrative was ideal for an episodic game format. Instead of letting unused ideas go to waste, they structured a plan to release the RIFT game in chapters (comparable to Telltale Games’ episodic approach) so that each episode could focus on a different reality or storyline thread (mudstack.com). This not only made the game project feasible for a small indie team, but also meant fans of the film could continue the adventure interactively. The synergy paid off: a demo of RIFT’s game chapter was released on Steam Early Access and attracted so much interest that it got Dulull’s studio accepted into Microsoft’s ID@Xbox program (for independent developers) (mudstack.com). They even retitled the game (and eventually the film) to Max Beyond to emphasize Max’s central role and the story’s continuation beyond the movie (medium.com). As of 2024, Max Beyond the film has premiered (debuting at Vancouver’s SPARK Animation Festival and releasing on digital platforms), and the spin-off video game is in active development with a planned 2025 release (televisual.com). This transmedia approach – developing a film and game in parallel – showcases how an independent creator can build a story universe across media. The film establishes the characters and emotional core, while the game expands the world, lets the audience participate, and even potentially adds new chapters to the narrative. Crucially, both were made with largely the same assets and engine, meaning the team didn’t need twice the budget to achieve a two-platform storytelling experience.

From a learning perspective, RIFT/Max Beyond teaches that transmedia storytelling isn’t just for Hollywood franchises. With creativity and the right tools, even an indie project can be transmedia. It also demonstrates a principle from Jenkins’ theory: transmedia works well when one creator or team controls the story across media (henryjenkins.org). In this case, HaZ Dulull and his team at HaZimation were the singular creative force behind both the film and game, ensuring coherence. The result is a unified multiverse story told through both cinematic and interactive language. Media students can draw inspiration from this model – a single vision unfolding in two formats – and see how each medium can highlight different facets (linear emotional storytelling in film, versus player-driven exploration in games) while maintaining a consistent world.

Other Transmedia Ventures: From Rap Battles in Fortnite to Epic Sci-Fi Worlds

HaZ Dulull’s forays into transmedia don’t stop at his own original sci-fi IP. He has also worked on projects that extend existing properties or blend formats in innovative ways. Two notable examples are The Rap Game UK Fortnite experience and the cinematics for Dune: Awakening – very different projects that each bridge media forms.

The Rap Game UK x Fortnite: In 2023, Dulull collaborated with the BBC and Fremantle to bring a popular reality TV show – The Rap Game UK – into the world of the video game Fortnite. The result was a unique interactive experience often billed as the “first ever rap clash in Fortnite” (fortnite.com). The Rap Game UK is a televised competition where emerging rappers battle and compete. Dulull’s task was to translate that concept into Fortnite’s universe, effectively creating a transmedia extension of a TV series into a game platform. The team built a custom Fortnite island (using Fortnite’s Creative mode and the Unreal Editor for Fortnite) that serves as an immersive rap battle game. Players can step into an “electrifying world” featuring iconic contestants from the show and even a Grammy-winning grime artist (Flowdan) as a character (fortnite.com). Gameplay-wise, it’s described as a blend of “explosive rap clashes with thrilling gameplay”, taking players through locations from an urban warehouse to a city rooftop and even a rap battle in outer space (fortnite.com).

What makes this project groundbreaking is the use of motion capture and real-time performance to bring the rappers’ performances into the game. Traditional Fortnite emotes wouldn’t suffice for authentic rap battles, so Dulull’s team partnered with a motion-capture studio to capture the actual movements and gestures of multiple rappers performing simultaneously (studiot3d.com). This was challenging – rappers are not going to wear spandex suits and perform like typical actors. The solution involved using a markerless motion capture system (like The Captury) with multiple performers, allowing artists to rap naturally while their movements were recorded in 3D (studiot3d.com). The data then had to be optimized to run inside Fortnite’s engine (Unreal) without slowing down the game (studiot3d.com). The end result: the in-game avatars of the real rap contestants perform with surprisingly realistic moves and lip-sync, making the Rap Game Fortnite experience feel like a true extension of the show. This project even earned recognition for its innovative blending of live performance and interactive media – essentially gamifying a reality TV format (henrystewartconferences.com). For media students, it’s a striking example of transmedia in a non-fiction or unscripted entertainment context. A television show’s content (rap battles) was extended into a new medium (an online game world), allowing fans to engage by playing and watching battles in Fortnite. Dulull himself described this as “the gamification of Fremantle & BBC’s The Rap Game into Fortnite — pioneering a new transmedia model for interactive entertainment” (henrystewartconferences.com). In other words, it opened the door for using game platforms as an extension of TV content, something we may see more of in the future. It’s also a showcase of real-time technology: using the Unreal Engine and creative tools, a small team can produce what is essentially an interactive episode of a show inside a game, fast and relatively cheaply, compared to making a standalone video game from scratch.

Dune: Awakening Cinematics: On the other end of the spectrum is Dulull’s work on Dune: Awakening, an upcoming open-world survival MMO (massively multiplayer online game) set in the famous Dune science fiction universe. Here, HaZ Dulull was brought on by game developer Funcom to direct in-game cinematic sequences for Dune: Awakening (collisionawards.com). This means he worked on the storytelling scenes within the game – for example, the epic trailer and story cutscenes that establish the world and characters for players. While this might seem like a more traditional role (many film/VFX directors move into game cinematics), it underscores how film and game production are converging. Dulull spent three years crafting cinematic content for this game, applying his filmmaker’s eye to an interactive medium (collisionawards.com). The Dune franchise itself is a transmedia giant (originating from novels, and spawning films, TV, and games over decades). By directing the game’s cinematics, Dulull contributed to how that story is told in yet another medium – ensuring that the Dune lore and drama come through coherently in the video game format. For instance, the story trailer (revealed at Gamescom 2023) showcases a vision of Paul Atreides and the harsh world of Arrakis; these narrative elements were presented with the same care as a film scene, but rendered in-engine in real time (youtube.com). Dulull’s involvement highlights a trend: game engines like Unreal are so advanced that film directors are now creating content inside games, and conversely, game cinematics are virtually animated short films. This blurring of roles is exactly the environment where transmedia thrives – creators who can operate in both film language and game language can unify a story across those forms. Dulull’s work on Dune: Awakening also likely fed back into his skill set for his own projects, since it involved high-end real-time graphics and large-scale world-building in a game engine.

Through the Rap Game UK and Dune examples, we see HaZ Dulull’s versatility in transmedia production. One project turned a TV show into a game experience, while the other brought film-style storytelling into a video game franchise. Both required understanding the strengths and limitations of each medium: What makes a rap battle exciting on TV and how can we translate that into an interactive format? What cinematic techniques make a game’s story resonate without interrupting gameplay? In each case, real-time technology (Unreal Engine, Fortnite’s tools) and techniques like motion capture were key to bridging the gap. Dulull’s success with these projects further cements him as a creator who moves fluidly between mediums – essentially a transmedia storyteller for the digital age.

Tech Toolkit: Real-Time Engines, Motion Capture & Asset Reuse

A recurring theme in HaZ Dulull’s work – and in contemporary transmedia creation at large – is the use of real-time 3D engines, motion capture, and shared digital assets to enable cost-effective production across media. Let’s unpack why these tools are so game-changing for transmedia storytelling, especially for independent creators or small teams.

  • Real-Time Game Engines (Unreal Engine): Traditionally, producing a CGI animated film is very different from making a game – different software, workflows, and huge render times for films. Real-time engines like Unreal Engine (originally developed for games) have upended that paradigm. HaZ Dulull leveraged Unreal Engine extensively: Max Beyond was created entirely in Unreal Engine, meaning every frame of the film was rendered in real-time rather than through slow offline rendering (community.wacom.com). This allowed his team to adopt a “final pixels” approach – what you see in the engine is the final image, no complex compositing needed (community.wacom.com). The benefit for transmedia is huge: if a film is made in a game engine, then the exact same 3D assets and animations can be used in an actual game built on that engine. In RIFT/Max Beyond, once they had high-quality characters and scenes working in Unreal for the movie, turning those into a playable game was relatively fast – they “migrated them to a game project setup very quickly” (mudstack.com). Unreal Engine also supports deploying to different outputs (high-res video or interactive applications) from the same project. This unity of pipeline cuts down on duplication of effort. It’s a key reason Dulull’s team could make a film and game together on an indie budget. As Epic Games (makers of Unreal) themselves have championed, real-time tech helps “empower small teams” to accomplish projects that used to require far larger crews (mudstack.com). The visual quality possible in engines now rivals pre-rendered graphics, especially with features like advanced lighting, physics, and even cartoon-style shaders for the anime look Dulull wanted (mudstack.com). In short, game engines have become all-purpose creation platforms, enabling transmedia content (films, games, VR experiences) to be developed side by side. Media students today are wise to familiarize themselves with engines like Unreal or Unity, as they are increasingly used in film production, not just games.

  • Motion Capture & Performance: Bringing human performances into digital media has been revolutionized by affordable motion capture (mocap) and even AI-driven animation. HaZ Dulull’s projects make heavy use of mocap to inject life into characters across film and game. For Max Beyond, actors performed scenes in mocap suits (sometimes remotely directed via Zoom during lockdown) to drive the animation of the 3D characters (televisual.com). This means a single mocap session can serve both the film and the game – the captured animations (running, fighting, emoting) can be reused for interactive gameplay sequences. Dulull also utilized Xsens suits, Manus gloves (for hand/finger capture), and even experimented with machine-learning mocap from video footage for some shots (magazine.reallusion.com). By mixing bespoke mocap with libraries of pre-made animations, his small team covered a lot of ground efficiently. In the Rap Game UK Fortnite project, as discussed, they employed markerless motion capture to record rappers’ performances for use in-game (studiot3d.com). The ability to translate real human movement onto a digital avatar in real time makes the transmedia link more seamless – it preserves the authenticity of a performance across mediums. Also, because mocap can be done relatively quickly and iteratively, Dulull’s team could improve scenes on the fly. For example, if a certain action didn’t look right in the film edit, they could capture a new movement and immediately drop it into Unreal Engine, seeing the result in context without a lengthy render wait (mudstack.com). This agility – enabled by real-time visualization and mocap – is a major enabler for transmedia projects, which often have to prototype and adjust content for different platforms.

  • Asset Reusability and Shared Worlds: One of the biggest cost-saving factors in transmedia production is reusing assets across media. In the past, if you had a film and a game, each would be made by separate teams building everything from scratch (which is why movie-based games were so expensive). In Dulull’s approach, the film essentially produces a library of game-ready assets. Characters in Max Beyond were modeled and rigged in software (like Reallusion’s Character Creator and iClone) with the game pipeline in mind (magazine.reallusion.com). The environments, effects, even the style of the film were chosen such that they could run in real-time. When it came time to make the game, the team already had high-quality models of Leon, Max, and other sci-fi elements – they didn’t need to remodel them. They could focus on gameplay mechanics and level design, populating the game with the existing art. This reusability extended to story assets too: the script’s unused ideas became game missions. It’s essentially economy of scope – by developing a story world thoroughly for one medium, they gathered enough material to launch another. Moreover, having a consistent asset base ensures aesthetic continuity across the film and game. Players who watch Max Beyond and then play the game will notice it looks the same – same character designs, same art style – which strengthens the feeling of one coherent universe. Dulull’s team even managed the game project in an episodic way to stay within their resources (mudstack.com), showing that transmedia doesn’t have to mean biting off everything at once; you can stagger content in chapters.
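To make the asset-reuse idea tangible, here is a small, hedged sketch that uses Unreal Engine’s built-in Python editor scripting (the unreal module and its EditorAssetLibrary) to duplicate assets from a “film” content folder into a “game” content folder inside one project. The folder paths and the overall workflow are illustrative assumptions for teaching purposes, not HaZimation’s actual pipeline.

    # Hedged sketch: copy "film" assets into a "game" folder inside one Unreal project,
    # using Unreal's built-in Python editor scripting. The folder paths are hypothetical.
    import unreal

    FILM_ROOT = "/Game/Film/Characters"        # assumed source folder for film assets
    GAME_ROOT = "/Game/GameProto/Characters"   # assumed destination folder for the game

    def migrate_assets(source_root, dest_root):
        # List every asset under the film folder (returned as object paths).
        asset_paths = unreal.EditorAssetLibrary.list_assets(source_root, recursive=True)
        for path in asset_paths:
            package_path = path.split(".")[0]                 # strip the trailing ".ObjectName"
            dest_path = package_path.replace(source_root, dest_root, 1)
            if unreal.EditorAssetLibrary.does_asset_exist(dest_path):
                continue                                      # skip assets already migrated
            unreal.EditorAssetLibrary.duplicate_asset(package_path, dest_path)
            unreal.log("Copied {} -> {}".format(package_path, dest_path))

    migrate_assets(FILM_ROOT, GAME_ROOT)

Run from the Unreal editor’s Python console, a script along these lines would give a game project its own copies of the characters and environments built for the film – the kind of shared-pipeline shortcut described in the bullet above.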

Overall, the technology toolkit used by HaZ Dulull exemplifies how 21st-century creators can punch above their weight. Real-time engines and mocap significantly lower the cost and time needed to produce high-quality content, which in turn makes transmedia storytelling feasible without a Hollywood budget. For aspiring media creators, it’s a signal that learning these tools – and thinking creatively about cross-platform storytelling – will be invaluable. As Dulull’s work shows, it’s now possible to be a filmmaker-game-developer hybrid, crafting immersive stories that fluidly cross from screen to screen.

Implications for Future Storytellers

HaZ Dulull’s transmedia projects offer a glimpse into the future of media creation. The boundaries between film, games, and interactive experiences are dissolving. For media students and up-and-coming storytellers, there are several key takeaways and implications:

  • Think in Universes, Not Silos: Tomorrow’s hit stories might not be “just a movie” or “just a game.” Instead, they could be story universes that span multiple outlets. When developing an idea, consider how different platforms could each explore different dimensions of your narrative (as Dulull’s game does for Max Beyond’s multiverse; magazine.reallusion.com). This doesn’t mean every project must be transmedia, but having that mindset opens up new creative possibilities. You might, for example, write a short film but also imagine an accompanying webcomic or interactive app that expands on it. Modern audiences often engage with story worlds across many channels – meet them where they are.

  • Leverage New Tools to Level Up: The democratization of technology means you don’t need a huge team to create transmedia content. Learn tools like Unreal Engine, Unity, Blender, or motion-capture techniques. These can allow you to prototype a story in one medium and quickly test it in another. As we saw, Dulull’s small team used a game engine to make a festival-quality feature film (community.wacom.com). That same engine then gave them a playable game demo essentially “for free” from the film assets (mudstack.com). The cost savings and flexibility here are game-changers. As a student, getting comfortable with real-time 3D, interactive narrative design, and multi-platform production workflows will put you at the forefront of where the industry is headed.

  • Collaborate and Blur Roles: Transmedia storytelling often requires a mix of skills – writing, filmmaking, game design, UX design, etc. Embrace a collaborative mindset and be willing to wear multiple hats. Dulull’s background in both games and VFX gave him a unique advantage to tackle a hybrid project. If you’re a filmmaker, try partnering with game designers or learning some interactive design yourself; if you’re a game developer, study cinematic storytelling. The future likely belongs to creators who are cross-disciplinary. Studios are already looking for people who understand virtual production, interactive narrative, and cinematic techniques combined. Dulull’s Rap Game UK project, for instance, succeeded because of close collaboration between TV producers, a game development team, and mocap technicians – a blend of very different expertise working toward one goal (studiot3d.com).

  • Audience Engagement and Community: Transmedia experiences can deepen audience engagement by giving fans more ways to interact with the story. This can also build community around the content (think of how fans of a franchise might discuss clues dropped in a video game that tie into a movie, fostering online discussion). As a creator, consider how each platform can not only tell story content but also invite audience participation. For example, a transmedia project might include ARG elements or social media role-play that make the experience feel immersive and personal to fans. However, it’s important to ensure each part is satisfying on its own – not everyone will consume everything. Dulull made sure Max Beyond the film works as a standalone story, even as the game will provide bonus adventures for those who want more. Strive for that balance: reward the super-fans who seek out every piece, but don’t alienate those who only try one medium.

  • New Business Models and Opportunities: Transmedia storytelling can also open up new funding and distribution avenues. Dulull funded Max Beyond partly through an Epic MegaGrant and by attracting sponsors interested in the tech-forward approach (mudstack.com). The project’s transmedia nature (film and game) made it appealing to different stakeholders – film festivals, game platforms, even tech companies showcasing what their tools can do. As a future creator, you might find that framing your project as transmedia makes it more attractive to investors or grants (since it can have multiple revenue streams and innovation angles). Additionally, distribution can be multi-pronged: a project could release on Netflix, Steam, mobile app stores, etc., increasing its reach and longevity. Be open to non-traditional paths; for instance, Dulull’s use of Fortnite’s Creative mode as a distribution platform for content (essentially using a game as a “channel” to reach players) is an inventive way to get exposure without a traditional TV network.

In conclusion, transmedia storytelling is an exciting, ever-evolving field at the cutting edge of media. HaZ Dulull’s work – from RIFT/Max Beyond to the Fortnite rap battles – serves as an inspiring blueprint for how to do it successfully. He shows that with creativity, technical savvy, and a holistic vision, a story can transcend any single medium and become a universe that audiences can watch, play, and live in. For media students, now is the time to experiment with these ideas. As you develop your own projects, ask yourself: How might this story unfold across different platforms? What tools can I use to make that happen? The next generation of great storytellers will likely be those who not only tell compelling stories, but also engineer rich multimedia experiences around those stories. And who knows – the next transmedia universe that captivates the world could come from one of you, armed with a laptop, a game engine, and an imagination unbound by format.

Transmedia | Francesca Tabor