As a new type of film and television production technology, digital virtual shooting builds computer-generated virtual scenes and digital effects on top of live-action footage, merging real and virtual scenes seamlessly. It reproduces the feel of a live shoot while enabling scenes that could not be captured before.
The rapid development of digital virtual shooting technology has opened up new possibilities for virtual production. It is now widely used in film and television dramas and across the broader entertainment industry, though it still faces technical bottlenecks and creative limitations. The Unreal-based LED digital virtual shooting mobile control system adopts a WYSIWYG real-time visualization approach, taking another major step forward in the digitalization and simulation of film and television production.
Technical value and application challenges
Technical value
As the current cutting-edge method for shooting film and television dramas, LED digital virtual shooting represents revolutionary progress over early digital virtual shooting, and its application in several areas of production has matured considerably.
Early digital virtual shooting relied on green-screen stages, which causes problems for objects that reflect their surroundings. The surface of a metal cup, for example, mirrors the environment around it: shot on a green stage (for, say, an office scene), the cup picks up a green cast, while the reflections a real office would produce are unpredictable. Such shots are difficult for post-production staff to fix, leaving details of the finished work looking unreal.
In LED virtual production, the LED screen can display virtual imagery rendered by a 3D real-time engine, or HDRI panoramas captured in advance. Light matching the position, size, shape, brightness and color of the virtual environment falls on the stage from every direction, simulating a far more realistic base lighting.
LED digital virtual shooting therefore benefits from genuine global illumination: the imagery mapped onto the physical LED screen, and the shooting stage as a whole, inherit the lighting and reflection information of the virtual scene. Real subjects are lit correctly and produce true reflections and refractions, and the lighting changes in real time as the on-screen content changes.
For example, when shooting an office scene, the LED screen displays the office, and a real metal cup on set reflects that office environment, eliminating the mismatch between subject illumination and the displayed background that plagued earlier techniques. LED background-wall shooting is thus particularly well suited to transparent media and mirror-like reflective subjects.
LED digital virtual shooting therefore offers dual savings in time and production cost. For short-cycle productions such as commercials in particular, footage can often be delivered directly from the stage without entering post-production.
Application challenges
By arranging and controlling lighting on set, LED digital virtual shooting lets actors and the virtual environment match reality much more closely, narrowing the boundary between virtual and live-action shooting almost to nothing; this is the foundation for research on the realism of LED digital virtual shooting. It can relieve visual problems common in current film and television production, such as mismatched lighting effects, latency in real-time compositing, and poor image quality.
In practice, however, the light emitted by the LED background wall alone cannot meet shooting needs, since the subject stands some distance away from the screen. Traditional lighting fixtures must therefore supplement the wall's illumination, the two matched against each other to obtain the final effect. By controlling the distance between light source and subject, a specific light ratio between subject and background can be achieved, giving different shots their own tonality.
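The interplay of fixture distance and light ratio described above follows the inverse-square law. The sketch below illustrates the arithmetic with made-up fixture values; it is not part of any production tool.

```python
# Sketch: how moving a supplementary fixture changes the subject/background
# light ratio, via the inverse-square law E = I / d^2. Numbers are illustrative.

def illuminance(intensity_cd: float, distance_m: float) -> float:
    """Illuminance (lux) on the subject from a point source at distance_m."""
    return intensity_cd / (distance_m ** 2)

def light_ratio(key_lux: float, fill_lux: float) -> float:
    """Key-to-fill ratio used to set subject-to-background contrast."""
    return key_lux / fill_lux

# A 2000 cd fixture moved from 2 m to 4 m drops to a quarter of the level,
# so the same fixture yields a 4:1 ratio between the two positions.
near = illuminance(2000, 2.0)   # 500 lux
far = illuminance(2000, 4.0)    # 125 lux
print(light_ratio(near, far))   # 4.0
```

Doubling the distance quarters the illuminance, which is why gaffers set the subject/background ratio largely by repositioning fixtures rather than re-dimming them.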
In addition, the optical properties of the LED background wall itself produce an unusually high red emission peak, so human skin reflects more red and the overall picture skews red. Because the screen is built on the three RGB primaries, its spectrum is incomplete and its color rendering poor, producing metamerism: the colors the camera records can differ greatly from the real colors.
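One common mitigation for this red skew is a 3x3 correction matrix applied to the camera signal. The sketch below shows the mechanics only; the matrix values are hypothetical, whereas a real production derives them from color-chart measurements under the actual wall.

```python
# Sketch: applying a 3x3 camera-correction matrix to tame the elevated red
# response an RGB LED wall can induce. CORRECTION is illustrative, not a
# calibrated profile.

def apply_matrix(rgb, m):
    """Multiply a normalised RGB triple by matrix m, clamping to [0, 1]."""
    r, g, b = rgb
    return tuple(
        max(0.0, min(1.0, m[i][0] * r + m[i][1] * g + m[i][2] * b))
        for i in range(3)
    )

# Hypothetical matrix that pulls red down and redistributes it slightly.
CORRECTION = [
    [0.85, 0.10, 0.05],
    [0.02, 0.96, 0.02],
    [0.02, 0.03, 0.95],
]

skin = (0.80, 0.55, 0.45)   # a skin tone pushed red by the wall's spectrum
print(apply_matrix(skin, CORRECTION))
```

The same mechanism underlies the camera-matching LUTs used on LED stages; the matrix simply lives earlier in the signal chain.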
In response to the above problems, the Unreal-based LED digital virtual shooting mobile control system can, to a degree, liberate film and television creators. Actors can immerse themselves in near-real virtual scenes and concentrate on performing; directors can see the final picture of the live shot intuitively, and switch virtual scenes quickly to suit their artistic intent, with no need to move between locations.
At the same time, prefabricating the digital assets needed for shooting improves the matching of lighting and color, bringing the straight-out-of-camera result closer to the director's final intent. This front-loaded approach to post-production makes the production process of film and television dramas more iterative, more collaborative and less linear.
Innovative design of the mobile control system
The system is designed for autonomous control of the digital virtual shooting stage and for stronger control over the lighting and color of the shooting scene. It gives film and television crew members a tool for manually controlling and adjusting the lighting atmosphere and color of an LED virtual shooting scene.
The design work covers the research background of virtual shooting technology, requirements analysis, case analysis, design concepts, implementation steps, and the reference data and effect feedback from project rehearsals used to verify the stability and reliability of the result.
The real-time virtual-scene LED shooting system centres on the LED digital virtual shooting framework used in actual drama production. It comprises an LED screen, a digital asset module, a real-time camera, a tracking module, a game-engine module (Unreal), an external lighting system, a virtual light console module, and an XR compositing module (which takes the engine's real-time render with its depth channel plus the footage from the real camera and composites the final picture).
The system can also call digital assets to build a virtual scene according to what the shot requires, reconstructing a virtual LED screen and a virtual camera inside the engine module. Lighting information collected from the real studio environment is synchronized into the engine in real time; the virtual scene is rendered with distributed real-time rendering and presented on the virtual LED screen. The engine then overlays on the virtual LED screen a real-time render with a depth channel, driven by the live camera's position and lens-distortion data. The physical LED screen displays the virtual LED screen's content, and once the real-time camera has captured the shot, the XR module takes the depth-channel render and the camera footage and composites the final picture.
Distributed implementation and application cases of the mobile control system
Distributed implementation of the system
Following the system's design and the realities of the available technical equipment, implementation proceeds as a distributed pipeline.
First, the engine module reconstructs a virtual LED screen matching the shape and characteristics of the physical screen; the real-time camera tracking module captures the physical camera's position and motion data and streams them into the engine over the VRPN protocol, where a virtual camera is built whose motion mirrors the real camera's in real time.
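The tracker-to-virtual-camera step above can be sketched as a simple update loop. The `Pose`, `VirtualCamera` and callback names below are illustrative stand-ins; a real setup would use a VRPN client library feeding Unreal (e.g. via Live Link) rather than this simulated tracker.

```python
# Sketch: mirroring a tracked physical camera with a virtual camera.
# The tracker sample is hand-made here; in production it would arrive
# as VRPN tracker reports at the capture frame rate.

from dataclasses import dataclass, field

@dataclass
class Pose:
    position: tuple   # (x, y, z) in metres
    rotation: tuple   # (pitch, yaw, roll) in degrees

@dataclass
class VirtualCamera:
    pose: Pose = field(default_factory=lambda: Pose((0, 0, 0), (0, 0, 0)))

    def apply(self, tracked: Pose) -> None:
        # Mirror the physical camera; a real system would also smooth
        # and latency-compensate the incoming samples.
        self.pose = tracked

def on_tracker_sample(cam: VirtualCamera, sample: Pose) -> None:
    """Callback a VRPN-style client would invoke for each tracker report."""
    cam.apply(sample)

cam = VirtualCamera()
on_tracker_sample(cam, Pose((1.2, 0.0, 1.6), (0.0, 15.0, 0.0)))
print(cam.pose.position)   # the virtual camera now matches the rig
```

Keeping the virtual camera a pure mirror of the tracked pose is what makes the engine's render line up with what the physical lens sees.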
Second, according to the content the current shot requires, the engine module retrieves 3D models and panoramic video from the digital asset library to build the virtual scene, and controls the brightness and color of the external lighting system through the virtual light console. The ambient-light collection system gathers the studio's real lighting conditions for the engine module, synchronizes them into the engine in real time, and uses them to simulate the physical optical effects and characteristics of the models in the virtual scene.
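The virtual light console's job of driving external fixtures can be sketched as a mapping from a sampled colour to DMX channel values. The 4-channel (dimmer, R, G, B) fixture profile below is a common but hypothetical layout, not taken from the system described here.

```python
# Sketch: a virtual-light-console step turning a sampled screen/ambient
# colour into 8-bit DMX values for an external RGB fixture.
# Assumed channel layout: [dimmer, red, green, blue].

def to_dmx(rgb, dimmer=1.0):
    """Map normalised RGB in [0, 1] and a dimmer level to 4 DMX bytes."""
    clamp = lambda v: max(0, min(255, round(v * 255)))
    return [clamp(dimmer)] + [clamp(c) for c in rgb]

# Warm ambient colour sampled from the LED wall content, at 80% intensity.
print(to_dmx((1.0, 0.6, 0.25), dimmer=0.8))   # [204, 255, 153, 64]
```

Because the input colour is sampled from the wall content, re-running this mapping every frame is what keeps the external fixtures in step with the screen, as the fifth step below requires.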
Third, the engine performs distributed real-time rendering and projects the result onto the virtual LED screen. Outside the virtual LED screen, the engine also overlays a real-time render with a depth channel, driven by the camera position and motion data from the tracking module and the lens-distortion data from the lens data sensor; this render shifts with the camera's real-time position and carries depth of field and parallax.
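The parallax behaviour in this step follows from simple similar-triangle geometry: a virtual object "behind" the wall must slide across the wall as the tracked camera moves. The function below is an illustrative reduction to one lateral axis, not the engine's actual projection code.

```python
# Sketch: why the wall content must follow the tracked camera. An object
# rendered object_depth metres behind the wall appears to shift on the wall
# plane when the camera moves sideways; the engine re-renders each frame so
# the shift matches real parallax.

def parallax_shift(camera_dx: float, wall_dist: float, object_depth: float) -> float:
    """Lateral shift on the wall plane (same units as camera_dx) of a virtual
    object object_depth metres behind the wall, for a camera wall_dist metres
    in front of it that moves camera_dx metres sideways.

    Similar triangles: shift / camera_dx = object_depth / (wall_dist + object_depth)
    """
    return camera_dx * object_depth / (wall_dist + object_depth)

# Camera slides 0.5 m; an object 6 m behind a wall 3 m away shifts ~0.33 m.
print(parallax_shift(0.5, 3.0, 6.0))
```

An object at the wall plane (depth 0) never shifts, while a very distant object shifts almost as much as the camera does, which is exactly the depth cue the re-rendered background restores.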
Fourth, the render on the virtual LED screen is converted into a video stream the physical LED screen can accept and is displayed on it in real time; the depth-channel render from the third step is output to the XR module.
Fifth, the physical LED screen and the external lighting together provide the required illumination for the real subject, and the lighting changes in real time with the content of the physical LED screen.
Finally, the real-time camera captures the scene and passes the footage to the XR module, which combines it with the engine's real-time depth-channel render to composite the final picture.
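The role of the depth channel in this last step can be shown with a toy per-pixel depth test. Real XR compositors add keying, soft edges and colour matching; this sketch keeps only the core idea, with pixels modelled as hypothetical (value, depth) pairs.

```python
# Sketch: depth-based compositing of the camera image with the engine render.
# Each "pixel" is a (value, depth_m) tuple; per pixel, the source nearer the
# lens wins. Purely illustrative data structures.

def composite(camera_px, engine_px):
    """Pick, per pixel, the source closer to the lens (smaller depth)."""
    return [
        cam if cam[1] <= eng[1] else eng
        for cam, eng in zip(camera_px, engine_px)
    ]

camera = [("actor", 2.0), ("wall", 5.0)]       # live footage with depth
engine = [("fx", 3.0), ("set-ext", 4.0)]       # engine render with depth
print(composite(camera, engine))   # [('actor', 2.0), ('set-ext', 4.0)]
```

The actor survives because she is nearer than the engine's effect layer, while the engine's set extension replaces the wall behind her; this is why the engine must deliver a depth channel rather than a flat image.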
Practical application cases of the system
Following the system's design and implementation process, its actual effect in representative real-world scenes is analysed through demonstration cases.