Frequently Asked Questions
Resources

Input Media
What can be supplied, such as animations, pre-shot plates, and interactive assets. Key for creating dynamic and engaging visual experiences.

Profitability
Benefits of 3D production, such as reduced location costs, streamlined workflows, and efficient resource management strategies.

The X Factor
Breaking down the Volume’s “special something”. Key for visual impact and immersive storytelling, using both Set and Volume.

Working 3D
Tools for 3D production, including software, specialised cameras, and tracking systems. Key for seamless virtual production.

In-Camera VFX concepts
Essential production concepts: the viewing frustum, parallax, and camera behaviour. Key for optimising accuracy in virtual environments.

Artifacts
Covering common issues like motion blur, aliasing, and texture stretching. Key for ensuring visual quality and realism.
Popular Topics
What types of media can I use?
We can input various types of media, including high-resolution images, video clips, CGI elements, animations, and real-time rendered content created with game engines like Unreal Engine.
What is the advantage of using a Volume / 3D space?
The ability to create immersive environments with realistic lighting and reflections, enhancing visual quality and actor interaction with the virtual set.
LED walls and Volume specs
110m² LED wall
2.6mm pixel pitch LED screen
NovaStar MX40 Pro LED processors
Four 4K solid-state playback units
Unreal render nodes and media server
Mo-Sys StarTracker system
Are Lighting and Equipment provided?
This depends on your production’s needs. Our team is available if required.
Some essentials of 3D production
Technology: LED walls, Unreal Engine, Mo-Sys StarTracker.
Cost: Daily and weekly rental rates for LED setups.
Pre-Production Needs: Storyboarding, asset creation.
Team: Skilled professionals in 3D modelling and animation.
Timeframe: Adequate time for pre-production to post-production.
Flexibility: Be prepared for iterative adjustments.

Input Media
Still images
High-resolution stitched photographs are the simplest and most cost-effective content for the wall. These can also be imported into Unreal, combined with various animated elements, and used to create a more immersive environment.
What is a moving ‘plate’?
A ‘plate’ is a series of shots stitched together to form a final scene. Moving or driving plates (360° plates) are typically shot with a multi-camera array. These can be pre-shot, or licensed from various libraries.
2.5D and Photogrammetry
We can also achieve 2.5D tracked environments by using LIDAR scanning and a composite of thousands of photogrammetry images. This approach can be a perfect middle ground depending on your production’s needs, as it is cost-effective compared to real-time 3D content. It also has the benefit of reproducing an existing location that is difficult, or impossible, to access (such as historic or religious buildings).
Real-time 3D production
The holy grail: all photorealistic content is generated in 3D through Unreal Engine or Unity, allowing complete control over environment, set, and lighting. This requires significant prep and planning.

Profitability

How does this affect the budget plan?
While the initial fee for a VR studio shoot looks high, productions can set aside traditional budgeting tools like Movie Magic and Hot Budget and completely rethink how the production’s budget is allocated.
So how does VR production affect the actual budget?
There are potentially large savings in production costs from the art department and travel, and much of the budget traditionally allocated to post VFX can be transferred into asset creation and the VR studio shoot. Set construction is significantly reduced when building in Unreal.
What is ‘pre-vis’?
Previsualisation is the visualising of scenes or sequences in a movie before filming. It now forms a vital part of the production process and is incorporated into the production’s asset output, instead of being just a visualisation tool. The traditional pre-production, production, post-production timeline is redundant in VR production.
Does this fall within the VFX budget?
Be aware: many productions currently expect VR production costs to come entirely out of the allocated VFX budget; however, this is not sustainable. Virtual production requires its own budget allocation.
When are the Assets created?
Producers find that by bringing asset creation forward into pre-production, instead of leaving it to post-production, they experience less budget slippage and more timely delivery.
The X Factor
What is the advantage of using a Volume?
Working in a Volume first and foremost allows the actors to interact with the environment through the LED walls, enhancing the performance and enabling cast members and assets to interact and collaborate. Volumetric light reflecting and refracting off mirrors, glass, and chrome serves to strengthen the scene and the world it plays out in. Finally, the ability to create and control scalable backgrounds allows for more efficient lens selection and framing.
How different is this to Green Screen?
All obstacles and challenges associated with green screen production are removed when using VR production. Additionally, there is no risk of green fringing on surfaces, thus removing any need to remedy this in post-production.

Working 3D

What is the advantage of using a 3D space?
The primary advantage of using a 3D space is that the tracked camera can move around within the Volume, with the perspective shown on the LED wall updating to match.
What tracking and camera systems are used?
We use the Mo-Sys StarTracker system, which utilises an IR (infrared) camera to collect data about the position of the primary camera. Mo-Sys is an award-winning, world-renowned manufacturer of virtual production systems and camera robotics for film, high-end TV, and broadcast. Comparable tracking systems include Vicon, TrackMen, OptiTrack, and stYpe. These allow you to track both inside and outside of a studio.
How do the camera and Volume interact?
Data collected by the tracking system is fed into Unreal Engine, which reacts to the position, height, and angle of the camera in real time, similar to how a scene responds to a person walking through it with a VR headset. The camera also sends real-time FIZ data (focus, iris, and zoom) directly into Unreal Engine. This enables direct control over f-stop and focus, with the effect seen immediately on the screen.
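
To make the data flow concrete, here is a minimal Python sketch of the kind of per-frame packet a tracking system streams and how it might be mirrored onto a virtual camera. All names here (TrackingFrame, apply_to_virtual_camera, the camera methods) are illustrative assumptions, not the actual Mo-Sys protocol or Unreal Engine API.

```python
from dataclasses import dataclass

# Hypothetical packet layout; real tracking/FIZ protocols differ.
@dataclass
class TrackingFrame:
    timecode: str                         # e.g. "01:02:03:04", syncs with the camera
    position: tuple[float, float, float]  # camera (x, y, z) in studio space, metres
    rotation: tuple[float, float, float]  # pan, tilt, roll in degrees
    focus: float                          # focus distance from the lens encoder, metres
    iris: float                           # f-stop from the lens encoder
    zoom: float                           # focal length from the lens encoder, mm

def apply_to_virtual_camera(frame: TrackingFrame, virtual_camera) -> None:
    """Mirror the physical camera's state onto its virtual twin, once per frame."""
    virtual_camera.set_transform(frame.position, frame.rotation)
    virtual_camera.set_focal_length(frame.zoom)     # zoom reshapes the inner frustum
    virtual_camera.set_aperture(frame.iris)         # f-stop drives virtual depth of field
    virtual_camera.set_focus_distance(frame.focus)  # rack focus carries into the scene
```
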
In-Camera VFX concepts
What is a ‘frustum’?
Generally, a frustum is the three-dimensional region covered by a camera’s field of view (FOV). In ICVFX, the viewing frustum is the region of space in the generated environment that appears on the screen.
What is the inner frustum?
The inner frustum is the exact area of the screen (and digital content) that is on-camera and being recorded. It represents the FOV from the camera’s perspective based on the current focal length. The image shown in the inner frustum tracks with the physical camera as the camera moves within the scene, always displaying what the virtual equivalent of the camera will see.
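
Because the inner frustum’s width is driven by focal length, the standard pinhole-camera relation makes the behaviour concrete. A minimal sketch, assuming for illustration a Super 35 sensor roughly 24.9mm wide:

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Pinhole relation: FOV = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Assuming a ~24.9mm-wide Super 35 sensor:
print(round(horizontal_fov_deg(24.9, 24), 1))  # ~54.8 deg: a 24mm lens needs a wide inner frustum
print(round(horizontal_fov_deg(24.9, 85), 1))  # ~16.7 deg: an 85mm lens narrows it considerably
```
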
How does all this frustum stuff affect what I see?
When viewing through the real-world camera, the system produces a parallax effect that creates the immersive experience of shooting in a real-world location, as opposed to a flat background or plate.
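
Parallax itself is simple geometry: as the tracked camera translates, near virtual objects shift across frame more than distant ones, just as on location. A toy calculation with illustrative numbers:

```python
import math

def parallax_shift_deg(camera_move_m: float, object_distance_m: float) -> float:
    """Angular shift of an object when the camera translates sideways past it."""
    return math.degrees(math.atan2(camera_move_m, object_distance_m))

# The camera dollies 0.5m sideways:
print(round(parallax_shift_deg(0.5, 3), 1))    # ~9.5 deg: a nearby prop swings noticeably
print(round(parallax_shift_deg(0.5, 100), 1))  # ~0.3 deg: a distant mountain barely moves
```
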
What is the outer frustum?
In short, the outer frustum is everything outside the camera’s view. The outer frustum allows the LED wall to function as a dynamic light and reflection source for the physical Set. The digital content in the outer frustum envelops the Set in the virtual world and illuminates it as if it were a real-world location. The outer frustum remains static when the camera moves, mimicking how lights and reflections do not move with the camera in the real world.
Can I choose what falls on the inner and outer frustums?
Yes, light intensity, resolution, and focus can all be manipulated independently between the inner and outer frustum. This gives you more creative control. Note that only content contained within the inner frustum is recorded.
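
As a rough illustration of what that independence means in practice, the sketch below lays out hypothetical per-frustum settings; the keys and values are invented for illustration and do not reflect Unreal’s actual nDisplay configuration.

```python
# Hypothetical configuration sketch; real ICVFX tooling exposes different options.
frustum_settings = {
    "inner": {                         # on-camera region: this is what gets recorded
        "resolution_scale": 1.0,       # full quality where the lens is looking
        "exposure_offset_stops": 0.0,
        "depth_of_field": True,        # match the physical lens's focus falloff
    },
    "outer": {                         # lighting/reflection region: never recorded
        "resolution_scale": 0.5,       # cheaper rendering is acceptable here
        "exposure_offset_stops": 1.0,  # e.g. brighten to throw more light onto the Set
        "depth_of_field": False,
    },
}
```
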

Artifacts

Sync Artifacts
Sync artifacts affect framerate and timing when working with devices on the cutting edge of technology. Check whether the LED wall’s signal processor supports the framerates your production needs; some processor implementations can also cause trouble in low-light environments.
Colour Artifacts
These affect the accuracy with which the LED wall’s colour is perceived by your eye. We are currently working on a calibration solution for the most accurate colour reproduction in-camera.
Moiré Artifacts
These are disturbances caused by patterns overlapping each other, i.e. the photosites on the sensor and the LEDs of the wall. We urge you to perform your own tests to get a sense for this new style of shooting. Many variables, including camera and screen type, low-pass filter, lens characteristics, aperture, and focus, can influence the severity of the moiré when it emerges.
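
As a rough back-of-envelope check (no substitute for an actual camera test), moiré risk grows as the image of one LED on the sensor shrinks toward the size of a photosite, letting the two grids beat against each other. A sketch using the thin-lens magnification m = f / (d - f); all numbers are illustrative:

```python
def led_image_size_um(pixel_pitch_mm: float, focal_length_mm: float,
                      distance_mm: float) -> float:
    """Approximate on-sensor size of one LED, via thin-lens magnification m = f / (d - f)."""
    magnification = focal_length_mm / (distance_mm - focal_length_mm)
    return pixel_pitch_mm * magnification * 1000.0  # mm -> micrometres

# A 2.6mm pitch wall, 35mm lens, camera 4m from the wall:
print(round(led_image_size_um(2.6, 35, 4000), 1))  # ~23.0 um per LED on the sensor
# Many cinema sensors have photosites in the ~4-8 um range; as the projected
# LED pitch drops toward that scale (longer distance, shorter lens), the grids
# approach the sensor's sampling frequency and moiré becomes more likely.
# Defocus, diffusion, or a stronger optical low-pass filter soften the pattern.
```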