AI/ML in Engineering, Physics, Aerodynamics
DINOZAUR: a diffusion-based neural operator parametrization with uncertainty


Enjoy this synthetic podcast on a brand-new publication.

Here is the publication on which this synthetic podcast is based:

Matveev, Albert, et al. "Light-Weight Diffusion Multiplier and Uncertainty Quantification for Fourier Neural Operators." arXiv preprint arXiv:2508.00643 (2025).

Abstract:

Operator learning is a powerful paradigm for solving partial differential equations, with Fourier Neural Operators serving as a widely adopted foundation. However, FNOs face significant scalability challenges due to overparameterization and offer no native uncertainty quantification – a key requirement for reliable scientific and engineering applications. Instead, neural operators rely on post hoc UQ methods that ignore geometric inductive biases. In this work, we introduce DINOZAUR: a diffusion-based neural operator parametrization with uncertainty quantification. Inspired by the structure of the heat kernel, DINOZAUR replaces the dense tensor multiplier in FNOs with a dimensionality-independent diffusion multiplier that has a single learnable time parameter per channel, drastically reducing parameter count and memory footprint without compromising predictive performance. By defining priors over those time parameters, we cast DINOZAUR as a Bayesian neural operator to yield spatially correlated outputs and calibrated uncertainty estimates. Our method achieves competitive or superior performance across several PDE benchmarks while providing efficient uncertainty quantification.
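The core idea in the abstract — replacing the FNO's dense spectral tensor multiplier with a heat-kernel-inspired diffusion multiplier that has a single learnable time parameter per channel — can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, 1D setup, and fixed (non-learned) time parameters are illustrative assumptions. Each Fourier mode k is scaled by exp(-t_c · |k|²), the Fourier symbol of the heat kernel, so only one scalar per channel parametrizes the whole spectral filter:

```python
import numpy as np

def diffusion_multiplier(u, t):
    """Heat-kernel-style spectral multiplier (illustrative sketch).

    u : (channels, n) real-valued signal on a uniform periodic grid
    t : (channels,) per-channel diffusion times (learnable in DINOZAUR)

    Each Fourier mode k is damped by exp(-t_c * |k|^2), the heat
    kernel's Fourier symbol, giving one scalar parameter per channel
    instead of a dense per-mode weight tensor.
    """
    n = u.shape[-1]
    u_hat = np.fft.rfft(u, axis=-1)                    # forward FFT
    k = 2 * np.pi * np.fft.rfftfreq(n)                 # angular wavenumbers
    decay = np.exp(-t[:, None] * k[None, :] ** 2)      # heat-kernel symbol
    return np.fft.irfft(u_hat * decay, n=n, axis=-1)   # back to physical space

# Usage: smooth a noisy two-channel signal with different diffusion times.
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = np.stack([np.sin(x), np.sin(3 * x)]) + 0.1 * np.random.randn(2, 128)
t = np.array([0.5, 0.1])   # fixed here; trained per channel in the paper
out = diffusion_multiplier(u, t)
```

Because the multiplier depends only on |k|², the same parametrization applies unchanged in any spatial dimension, which is the dimensionality independence the abstract highlights; placing priors over the time parameters is what turns this into the Bayesian operator with calibrated uncertainty.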
