Art-directable cloud animation
Thesis, posted on 06.05.2021, 14:29 by Yiyun Wang
Volumetric cloud generation and rendering algorithms are well developed and can meet the demand for realistic skies in animation and games. However, creating a stylized or designed animation for volumetric clouds in real time is challenging when using only physics-based generation and simulation methods.
The problem addressed by this research is that current methods for controlling volumetric cloud animation are not art-directable: making a volumetric cloud move in a specific way is difficult with a physics-based simulation method alone. The purpose of the study is to implement an animation method for volumetric clouds with art-directable controls. Using this method, a designer can control the cloud's motion easily and reliably. The program achieves interactive performance through parallel processing with CUDA, and users can animate the cloud by inputting a few vectors inside the cloud volume.
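The thesis computes a divergence-minimizing interpolation of the user's input vectors (described below in terms of the full method). As a much simpler illustration of the general idea, turning a handful of user-placed vectors into a dense field, here is a 2D inverse-distance-weighting sketch in NumPy; the thesis works on 3D volumes with CUDA, and all names here are hypothetical:

```python
import numpy as np

def interpolate_field(points, vectors, grid_shape, power=2.0, eps=1e-6):
    """Build a dense 2D vector field from sparse user vectors.

    Uses inverse-distance weighting as a stand-in for the thesis's
    divergence-minimizing interpolation. `points` are (x, y) positions,
    `vectors` the (vx, vy) directions the user assigned at those points.
    """
    h, w = grid_shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    grid = np.stack([xs, ys], axis=-1).astype(float)        # (h, w, 2) positions
    field = np.zeros((h, w, 2))
    weights = np.zeros((h, w))
    for p, v in zip(points, vectors):
        d = np.linalg.norm(grid - np.asarray(p, float), axis=-1)
        wgt = 1.0 / (d ** power + eps)                      # nearer vectors dominate
        field += wgt[..., None] * np.asarray(v, float)
        weights += wgt
    return field / weights[..., None]

# Two user vectors blended over a 32x32 grid:
field = interpolate_field([(8, 8), (24, 24)], [(1.0, 0.0), (0.0, 1.0)], (32, 32))
```

Near each input point the field matches the user's vector, and it blends smoothly in between; the divergence-minimizing variant in the thesis additionally suppresses sources and sinks so the advected cloud density is not artificially created or destroyed.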
After reviewing the literature on real-time cloud simulation, texture advection algorithms, fluid simulation, and related techniques, the thesis offers a feasible algorithm design and experiments to test the hypotheses. The study uses noise textures and fractional Brownian motion (fBm) to generate volumetric clouds and renders them with the ray-marching technique. The program renders the user's input vectors and a three-dimensional interpolated vector field with OpenGL. By adding or changing input vectors, the user obtains a divergence-minimizing interpolated field, and the cloud volume is animated in real time by texture advection along that field. By inputting only a few vectors, the user can plausibly animate the volumetric cloud in an art-directable way.
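Two of the building blocks named above, fBm noise for cloud density and texture advection along a vector field, can be sketched in 2D NumPy as follows. This is an illustration only: the thesis operates on 3D volumes with CUDA and OpenGL, and all function names and parameters here are assumptions, not the thesis's implementation:

```python
import numpy as np

def value_noise(shape, freq, rng):
    """One octave of value noise: bilinearly interpolate a coarse random lattice."""
    lattice = rng.random((freq + 1, freq + 1))
    ys = np.linspace(0, freq, shape[0], endpoint=False)
    xs = np.linspace(0, freq, shape[1], endpoint=False)
    y0, x0 = ys.astype(int), xs.astype(int)
    # smoothstep fade on the fractional parts for C1-continuous interpolation
    fy = (ys - y0) ** 2 * (3 - 2 * (ys - y0))
    fx = (xs - x0) ** 2 * (3 - 2 * (xs - x0))
    a = lattice[np.ix_(y0, x0)]
    b = lattice[np.ix_(y0, x0 + 1)]
    c = lattice[np.ix_(y0 + 1, x0)]
    d = lattice[np.ix_(y0 + 1, x0 + 1)]
    top = a + fx[None, :] * (b - a)
    bot = c + fx[None, :] * (d - c)
    return top + fy[:, None] * (bot - top)

def fbm(shape, octaves=5, base_freq=4, gain=0.5, seed=0):
    """fBm: sum octaves of noise at doubling frequency and decaying amplitude."""
    rng = np.random.default_rng(seed)
    total, norm, amp, freq = np.zeros(shape), 0.0, 1.0, base_freq
    for _ in range(octaves):
        total += amp * value_noise(shape, freq, rng)
        norm += amp
        amp *= gain
        freq *= 2
    return total / norm  # cloud density field in [0, 1)

def advect(density, vel, dt):
    """Semi-Lagrangian texture advection: sample density at back-traced positions."""
    h, w = density.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    sy = np.clip(ys - dt * vel[..., 1], 0, h - 1)   # trace backwards along velocity
    sx = np.clip(xs - dt * vel[..., 0], 0, w - 1)
    y0, x0 = np.floor(sy).astype(int), np.floor(sx).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    ty, tx = sy - y0, sx - x0
    return (density[y0, x0] * (1 - ty) * (1 - tx)
            + density[y0, x1] * (1 - ty) * tx
            + density[y1, x0] * ty * (1 - tx)
            + density[y1, x1] * ty * tx)

# Generate a density texture and advect it one step along a uniform wind:
density = fbm((64, 64))
wind = np.full((64, 64, 2), 0.5)
density = advect(density, wind, dt=1.0)
```

In the thesis pipeline the density would be a 3D texture sampled by the ray marcher, and the velocity would come from the divergence-minimizing interpolation of the user's vectors rather than a uniform wind.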