Subtitle: The Portfolio of [Your Name]

I. The Visualiser's Thesis

In the contemporary landscape of live events, branded content, and immersive experiences, the role of the Visualiser has transcended mere technical execution. We are no longer just the person who "hits the spacebar" or routes cables. We are the bridge between the abstract dream of a Creative Director and the physical reality of LED panels, lasers, and projectors.

Case Study A

The brief called for projecting onto a 360-degree concave dome with a 10K-resolution seam blend; traditional 16:9 assets would have failed immediately. The result was a 15-minute journey through a black hole during which the audience forgot the ceiling existed. The visualisation predicted a 3% pixel-overlap error; we achieved 0.5% on site.

Case Study B: The Global Livestream (Corporate)
Client: [Tech Giant]
Venue: Virtual Stage (XR)
Role: Real-Time Visualiser

The stage design featured kinetic LED blades that moved vertically during the set; static mapping would have broken the illusion. I created a dynamic wireframe visualiser that ingested DMX position data from the kinetic motors, so content mapped to the blades relative to their current height, not their resting height. I designed a suite of reactive clips in Notch that stretched and compressed in real time.
I built the environment in Unreal Engine 5, using nDisplay. I visualised the camera-tracking data live, rendering the background parallax that made the 10-foot-deep stage look infinite. I created a "Director's View" dashboard that let the producer see exactly what the render engine saw, 1:1.
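The depth illusion described above rests on a simple geometric fact: when the tracked camera moves, a virtual background plane must shift on the LED wall in proportion to its intended distance. A minimal sketch of that relationship, assuming a basic pinhole model (the names `parallax_shift_px`, `focal_px`, etc. are illustrative, not the actual show pipeline):

```python
def parallax_shift_px(cam_offset_m: float, plane_depth_m: float,
                      focal_px: float = 1400.0) -> float:
    """Horizontal shift (in pixels) a virtual background plane must move
    on the wall, so it reads as sitting plane_depth_m behind the stage
    when the camera moves cam_offset_m sideways.

    A plane at infinite depth would not shift at all; nearer planes
    shift more, which is what sells the depth on a shallow stage.
    """
    return focal_px * cam_offset_m / plane_depth_m

# Camera slides 0.5 m: a distant layer (50 m) barely moves, while a
# near layer (5 m) moves ten times as far.
far_px = parallax_shift_px(0.5, 50.0)   # 14.0 px
near_px = parallax_shift_px(0.5, 5.0)   # 140.0 px
```

In production this per-layer offset is what the render engine computes per frame from the tracking feed; the sketch only shows why a 10-foot-deep stage can read as infinite.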
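The kinetic-blade remapping from the same case study can be sketched in a few lines. Moving-head-style fixtures typically report position as a 16-bit coarse/fine DMX channel pair; the channel layout, 3 m travel, and helper names below are assumptions for illustration, not the actual rig:

```python
def dmx16_to_height_m(coarse: int, fine: int,
                      travel_m: float = 3.0) -> float:
    """Decode a 16-bit DMX position (coarse/fine channel pair) into the
    blade's current height in metres along its assumed travel."""
    raw = (coarse << 8) | fine          # combine to 0..65535
    return travel_m * raw / 65535.0

def content_v_offset(height_m: float, travel_m: float = 3.0) -> float:
    """Map blade height to a vertical UV offset so the texture stays
    locked to the blade's current position, not its resting position."""
    return height_m / travel_m          # 0.0 (bottom) .. 1.0 (top)

# Coarse byte 128, fine byte 0: the blade sits roughly halfway up,
# so the content scrolls roughly half its vertical range.
h = dmx16_to_height_m(128, 0)           # ~1.5 m
v = content_v_offset(h)                 # ~0.5
```

The point of the remap is the second function: content samples are driven by live motor position rather than a static UV map, so the texture appears glued to the blade as it travels.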