


We propose HotSpot, a method for optimizing neural signed distance functions using the solution of a screened Poisson equation, which provides an asymptotically sufficient condition ensuring that the output converges to a true distance function (made precise below).
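To make this concrete (our restatement of the standard Varadhan-type relation, not a derivation from the paper): for a surface \( S \) in a domain \( \Omega \), the screened Poisson equation and its small-\( t \) asymptotics read

\[
  t^{2}\,\Delta v_t = v_t \ \text{in } \Omega \setminus S, \qquad v_t = 1 \ \text{on } S, \qquad -t \log v_t(x) \to d(x) \ \text{as } t \to 0,
\]

so the solution \( v_t \approx e^{-d/t} \) encodes the true distance \( d \) in the limit.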
In contrast, existing losses such as the eikonal loss act as necessary but insufficient constraints and cannot guarantee that the recovered implicit function is a true distance function, even if the output minimizes these losses almost everywhere (see the example below). Furthermore, the eikonal loss suffers from stability issues during optimization. Finally, in conventional optimization, an area loss is indispensable for regularization, yet it distorts the output.
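As a one-dimensional illustration (our example, not taken from the paper): let the surface be \( S = \{0\} \), so the true unsigned distance is \( |x| \). The sawtooth

\[
  u(x) = \min_{k \in \mathbb{Z}} |x - 2k|
\]

satisfies \( |u'(x)| = 1 \) almost everywhere and \( u(0) = 0 \), so it attains zero eikonal and surface loss, yet \( u \neq |x| \): nothing in these losses penalizes its spurious extra zeros.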
We address these challenges by designing a loss function that, when minimized, converges to the true distance function, is stable to optimize, and naturally penalizes large surface area. We present theoretical analysis and experiments on challenging 2D and 3D datasets, showing that our method achieves better surface reconstruction and more accurate distance approximation.
Here \( L \) denotes the total loss for optimizing the neural signed distance function, and \( L_{\text{heat}} \) is the new heat loss term we introduce.
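Below is a minimal PyTorch sketch of a heat-style loss built from the substitution \( v = e^{-|u|/t} \) above: for a true distance function, \( t^2\|\nabla v\|^2 = v^2 \), and the residual equals \( e^{-2|u|/t}(\|\nabla u\|^2 - 1) \), which vanishes exactly where the eikonal equation holds while the exponential weight concentrates the penalty near the surface. The model interface, the scale t, and the exact residual form are our illustrative assumptions, not the paper's released implementation.

import torch

def heat_loss(model, x, t=0.05):
    # Sketch (assumed form, not the official HotSpot code): penalize the
    # screened-Poisson residual t^2 |grad v|^2 - v^2 with v = exp(-|u|/t).
    # By the chain rule this equals exp(-2|u|/t) * (|grad u|^2 - 1), i.e. an
    # eikonal residual downweighted away from the surface.
    x = x.detach().requires_grad_(True)
    u = model(x)                                    # (N, 1) predicted signed distance
    grad_u = torch.autograd.grad(
        u, x, grad_outputs=torch.ones_like(u), create_graph=True
    )[0]                                            # (N, dim)
    v_sq = torch.exp(-2.0 * u.abs() / t)            # v^2 = exp(-2|u|/t)
    residual = v_sq * (grad_u.pow(2).sum(dim=-1, keepdim=True) - 1.0)
    return residual.abs().mean()

In a full pipeline this term would be combined with the usual surface (manifold) loss on sampled surface points; the exponential weight is what distinguishes it from a plain eikonal penalty.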
@inproceedings{zimo2025hotspot,
  title     = {HotSpot: Signed Distance Function Optimization with an Asymptotically Sufficient Condition},
  author    = {Zimo Wang and Cheng Wang and Taiki Yoshino and Sirui Tao and Ziyang Fu and Tzu-Mao Li},
  booktitle = {CVPR},
  year      = {2025},
}