Controllable Neural Style Transfer for Dynamic Meshes
In recent years, animated movies have been shifting from realistic representations to more stylized depictions that support unique design languages. To support this, recent works implemented a Neural Style Transfer (NST) pipeline that stylizes 3D assets from 2D images. In this paper we propose a novel mesh stylization technique that improves on previous NST works in several ways. First, we replace the standard Gram-matrix style loss with a Neural Neighbor formulation that yields sharper, artifact-free results. To support large mesh deformations, we reparametrize the optimized mesh positions through an implicit formulation based on the Laplace-Beltrami operator, which better captures the silhouette gradients that are common in inverse differentiable rendering setups. This reparametrization is coupled with a coarse-to-fine stylization scheme that enables deformations affecting large structures of the mesh. We provide artistic control through a novel method that gives directional and temporal control over the synthesized styles via a guiding vector field. Lastly, we improve previous time-coherency schemes and develop an efficient regularization that controls volume changes during stylization. Together, these improvements enable high-quality mesh stylizations that create unique looks for both simulations and 3D assets.
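The Neural Neighbor formulation mentioned above replaces Gram-matrix statistics with explicit nearest-neighbor feature matching: each feature extracted from the rendered mesh is matched to its closest feature from the style image, and the loss penalizes the distance to those matches. A minimal sketch of that idea (this is an illustration, not the authors' implementation; the function name and the use of NumPy with cosine distance are our own assumptions):

```python
import numpy as np

def neural_neighbor_loss(render_feats: np.ndarray, style_feats: np.ndarray) -> float:
    """Nearest-neighbor style loss sketch.

    render_feats: (N_r, D) feature vectors from the rendered views.
    style_feats:  (N_s, D) feature vectors from the style image.
    Each render feature is matched to its most similar style feature
    under cosine similarity; the loss is the mean cosine distance
    to those matches.
    """
    # Normalize rows to unit length so the dot product is cosine similarity.
    r = render_feats / np.linalg.norm(render_feats, axis=1, keepdims=True)
    s = style_feats / np.linalg.norm(style_feats, axis=1, keepdims=True)
    sim = r @ s.T                      # (N_r, N_s) cosine similarities
    best = sim.max(axis=1)             # best style match per render feature
    return float(np.mean(1.0 - best))  # mean cosine distance
```

Matching features individually rather than through global Gram statistics is what allows sharp, local style details to survive; in practice the features would come from a pretrained CNN and the loss would be minimized with a differentiable renderer in the loop.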