StructuredField: Unifying Structured Geometry and Radiance Field

Kaiwen Song     Jinkai Cui     Zherui Qiu     Juyong Zhang
University of Science and Technology of China

We propose a novel structured representation that unifies
Radiance Field (🎯High-quality rendering) and Structured Geometry (⚙️Physics-based simulation).

TL;DR: Structured tetrahedral mesh representation → differentiable mesh rendering via reparametrization → high-quality structured geometry maintained by an orientation-preserving homeomorphism

Abstract

Recent point-based differentiable rendering techniques have achieved significant success in high-fidelity reconstruction and fast rendering. However, due to the unstructured nature of point-based representations, they are difficult to apply to modern graphics pipelines designed for structured meshes, as well as to the many simulation and editing algorithms that work best with structured mesh representations. To this end, we propose StructuredField, a novel representation that achieves both a structured geometric representation of the reconstructed object and high-fidelity rendering of it. We employ structured tetrahedral meshes to represent the reconstructed object. We reparameterize the geometric parameters of the tetrahedral mesh into the shape parameters of 3D Gaussians, thereby achieving differentiable high-fidelity rendering of the tetrahedral mesh. We propose a novel inversion-free homeomorphism to constrain the optimization of the tetrahedral mesh, which strictly guarantees that the mesh remains both inversion-free and self-intersection-free throughout the optimization process and in the final result. Based on our proposed StructuredField, we achieve high-quality structured meshes and high-fidelity reconstruction. We also demonstrate the applicability of our representation to various applications such as physical simulation and deformation.

Motivation

Point-based representations (e.g., 3D Gaussian Splatting, 2D Gaussian Splatting) are fundamentally unstructured, hindering physics-sensitive tasks such as simulation and deformation. Our structured representation enables direct compatibility with modern graphics pipelines.

Methods

Given multi-view images as input, we reconstruct the 3D scene using a structured tetrahedral mesh representation. We use an orientation-preserving homeomorphism to keep the structured geometry of the tetrahedral mesh inversion-free and self-intersection-free. Additionally, the parameters of the tetrahedra are reparameterized into the corresponding 3D Gaussian parameters, enabling differentiable rendering. Based on our structured mesh representation, the model can further be applied to physical simulation, deformation, and other applications.

Representation of StructuredField

We reparametrize the parameters of the 3D Gaussians in terms of the parameters of the tetrahedra to

  • Enable differentiable rendering of a tetrahedral mesh
  • Make Gaussians better follow the deformation of the tetrahedral mesh
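The paper's exact mapping from a tetrahedron to Gaussian shape parameters is not reproduced here; as a minimal illustrative sketch, assuming the Gaussian center is the tetrahedron's centroid and its covariance is derived from the vertex scatter, the reparametrization could look like:

```python
import numpy as np

def tet_to_gaussian(verts):
    """Map one tetrahedron (4x3 vertex array) to 3D Gaussian parameters.

    Hypothetical parametrization for illustration only: mean = centroid,
    covariance = scatter of the four vertices about the centroid. Because
    both quantities are differentiable functions of the vertices, gradients
    from a rendering loss flow back to the mesh geometry.
    """
    mu = verts.mean(axis=0)            # Gaussian center: tet centroid
    d = verts - mu                     # vertex offsets from the centroid
    cov = d.T @ d / verts.shape[0]     # 3x3 symmetric PSD covariance
    return mu, cov
```

Since the Gaussian parameters are pure functions of the tetrahedron's vertices, deforming the mesh automatically moves and reshapes the attached Gaussians, which is exactly the "Gaussians follow the deformation" property listed above.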

High-quality Structured Geometry

We analyze the causes of low-quality geometric structures in mesh optimization and use an orientation-preserving homeomorphism, implemented by a novel invertible neural network H, to improve the quality of the geometric structure during optimization.

  • Self-intersection: the homeomorphism guarantees that no self-intersections occur.
  • Element inversion: the orientation-preserving property prevents inverted elements.
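An element is inverted when its signed volume becomes non-positive, i.e., its orientation flips during optimization. A short check of this condition (standard tetrahedral-mesh bookkeeping, not code from the paper) makes the failure mode concrete:

```python
import numpy as np

def signed_volume(v0, v1, v2, v3):
    """Signed volume of a tetrahedron; positive iff positively oriented."""
    return np.linalg.det(np.stack([v1 - v0, v2 - v0, v3 - v0])) / 6.0

def count_inverted(verts, tets):
    """Count inverted (flipped or degenerate) elements.

    verts: (N, 3) float array of vertex positions.
    tets:  (M, 4) integer array of vertex indices per tetrahedron.
    """
    return sum(signed_volume(*verts[t]) <= 0.0 for t in tets)
```

An orientation-preserving map keeps every signed volume positive by construction, so this count stays at zero throughout optimization rather than being pushed down by a soft penalty.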

During optimization, we fix the initial tetrahedral mesh vertices V and topology T, and use the vertices mapped by H as the actual vertices of the tetrahedra.
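The architecture of the invertible network H is not detailed here; as an illustrative stand-in, an additive coupling layer (in the RealNVP style) is bijective with Jacobian determinant +1, hence orientation-preserving by construction, which sketches why composing such layers yields the kind of map described above:

```python
import numpy as np

def coupling_forward(x, shift_fn):
    """Additive coupling layer: z' = z + shift(x, y), with (x, y) fixed.

    The Jacobian is lower-triangular with unit diagonal, so its determinant
    is +1 everywhere: the map is invertible and orientation-preserving.
    shift_fn here is an arbitrary placeholder for a learned network.
    """
    out = x.copy()
    out[:, 2] = x[:, 2] + shift_fn(x[:, :2])
    return out

def coupling_inverse(y, shift_fn):
    """Exact inverse: subtract the same shift, computed from the
    untouched (x, y) coordinates."""
    out = y.copy()
    out[:, 2] = y[:, 2] - shift_fn(y[:, :2])
    return out
```

Applying such a map to the fixed initial vertices V (with topology T unchanged) deforms the mesh freely while guaranteeing, rather than merely penalizing, the absence of inversions and self-intersections.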

High-quality Rendering

We compare StructuredField with recent SOTA novel view synthesis methods.

Applications

Physical Simulation

We are able to simulate the object's behavior under different parameter settings.

Deformation

BibTeX

If you find StructuredField useful for your work, please cite:


      @article{song2025structuredfield,
        title={StructuredField: Unifying Structured Geometry and Radiance Field},
        author={Song, Kaiwen and Cui, Jinkai and Qiu, Zherui and Zhang, Juyong},
        journal={arXiv preprint arXiv:2501.18152},
        year={2025}
      }