%0 Journal Article
%A Ran, Yi
%A Guo, Zhichang
%A Li, Jia
%A Li, Yao
%A Burger, Martin
%A Wu, Boying
%T A tunable despeckling neural network stabilized via diffusion equation
%J Signal Processing
%V 239
%@ 0165-1684
%C Amsterdam [et al.]
%I Elsevier
%M PUBDB-2026-00507
%P 110324
%D 2026
%Z Funded by the German Research Foundation (DFG), Germany, under grant BU 2327/20-1
%X The removal of multiplicative Gamma noise is a critical research area in synthetic aperture radar (SAR) imaging, where neural networks serve as a powerful tool. However, real-world data often diverge from theoretical models and exhibit various disturbances, which can render neural networks less effective. Adversarial attacks provide a criterion for judging how well neural networks adapt to real data, since they find the most extreme perturbations that render a network ineffective. In this work, we propose a tunable, regularized neural network framework that unrolls a shallow neural denoising block and a diffusion regularization block into a single network for end-to-end training. The linear heat equation, known for its inherent smoothing and low-pass filtering properties, is adopted as the diffusion regularization block. The smoothness of our outputs is controlled by a single time-step hyperparameter that can be adjusted dynamically. The stability and convergence of our model are proven theoretically. Experimental results demonstrate that the proposed model effectively eliminates the high-frequency oscillations induced by adversarial attacks. Finally, the proposed model is benchmarked against several state-of-the-art denoising methods on simulated images, adversarial samples, and real SAR images, achieving superior performance in both quantitative and visual evaluations.
%F PUB:(DE-HGF)16
%9 Journal Article
%R 10.1016/j.sigpro.2025.110324
%U https://bib-pubdb1.desy.de/record/644964