Project Details

Generative Models for Bayesian Inverse Problems in Image Processing

Subject Area: Mathematics
Term: since 2021
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 463409137
 
This is a follow-up proposal to continue our work on generalized normalizing flows and neural gradient flows from the first funding period of the SPP. Based on our previous results, we want to investigate how well conditional normalizing flows approximate posteriors of well-posed Bayesian inverse problems. Here we intend to combine approximation results for a smoother class of functions with knowledge about optimal transport maps.

Concerning gradient flows, we will pursue three directions. First, we are interested in the convergence of Wasserstein gradient flows of maximum mean discrepancies (MMDs) with Riesz kernels, which have several advantageous properties but make the task challenging, since the corresponding functionals are not geodesically λ-convex. To this end, we will determine the critical points of such MMD functionals. Further, we will examine whether recent results on Wasserstein gradient flows of Coulomb potentials can be adapted to our setting.

Second, we will consider MMD-regularized f-divergences, which relax the absolute continuity assumption on the measures, and their Wasserstein gradient flows. Based on the kernel mean embedding of measures into reproducing kernel Hilbert spaces (RKHS), we intend to analyze such regularized functionals with the help of Moreau envelopes in the RKHS in order to exploit their known properties.

Third, we will deal with uni- and multivariate kernels that are related by a slicing procedure. Sliced kernels can be used to speed up learning procedures in the neural network context. While we proved a relation between these kernels in the Riesz setting, we are also interested in integrally positive definite kernels appearing, e.g., in the Stein gradient flow of the Kullback-Leibler divergence. Once this relation is clarified, we will use sliced kernels in such gradient flows.

Finally, we will design and analyze algorithms for constrained optimal transport, where we mainly have moment constraints in mind. Here we will examine a class of generalized iterative scaling algorithms, also known as the block-iterative simultaneous multiplicative algebraic reconstruction technique (SMART) or as a special form of mirror descent. In particular, we will tackle some long-standing open convergence questions on block iterations in our optimal transport setting.
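To make the central MMD functional concrete: for two empirical measures, the squared MMD with the Riesz (negative distance) kernel K(x, y) = -||x - y||^r reduces to pairwise distance averages. The following is a minimal NumPy sketch, not code from the project; function names are illustrative.

```python
import numpy as np

def riesz_kernel(X, Y, r=1.0):
    """Riesz kernel K(x, y) = -||x - y||^r, conditionally positive definite for r in (0, 2)."""
    diff = X[:, None, :] - Y[None, :, :]
    return -np.linalg.norm(diff, axis=-1) ** r

def mmd_squared(X, Y, r=1.0):
    """Biased (V-statistic) estimate of MMD^2 between the empirical measures of X and Y."""
    return (riesz_kernel(X, X, r).mean()
            + riesz_kernel(Y, Y, r).mean()
            - 2.0 * riesz_kernel(X, Y, r).mean())
```

For r = 1 this is the energy distance: it vanishes when both samples coincide and is positive when the measures differ, which is what a particle discretization of the MMD Wasserstein gradient flow would drive toward zero.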
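The slicing relation for Riesz kernels mentioned above can be illustrated numerically in the simplest case r = 1: averaging the one-dimensional kernel |⟨θ, x - y⟩| over uniform directions θ on the sphere reproduces the multivariate distance ||x - y|| up to a dimensional constant c_d (for d = 3, c_3 = 1/2). This is a Monte Carlo sketch under these stated assumptions, not the project's proof or implementation.

```python
import numpy as np

def sliced_distance(x, y, n_proj=20000, seed=0):
    """Monte Carlo estimate of E_theta |<theta, x - y>| over uniform directions theta on the sphere."""
    rng = np.random.default_rng(seed)
    theta = rng.standard_normal((n_proj, x.shape[0]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # project to the unit sphere
    return np.abs(theta @ (x - y)).mean()

# For x, y in R^3 the estimate approaches c_3 * ||x - y|| = 0.5 * ||x - y||.
```

Replacing the d-dimensional kernel by averages of one-dimensional projections is what makes sliced kernels attractive for speeding up learning procedures: the univariate sums can be evaluated with fast sorting-based algorithms.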
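The prototype of the iterative scaling algorithms in question is the classical two-block case with marginal constraints (Sinkhorn iteration), where each block is satisfied in turn by a multiplicative update; the project's moment-constrained and block-iterative SMART variants generalize this scheme. The sketch below shows only this classical case, with an illustrative function name.

```python
import numpy as np

def iterative_scaling_ot(C, mu, nu, eps=0.2, iters=2000):
    """Entropic OT, min <C, P> + eps * KL(P || mu nu^T) s.t. P 1 = mu, P^T 1 = nu,
    solved by alternating multiplicative scaling of the two marginal constraint blocks."""
    K = np.exp(-C / eps)          # Gibbs kernel
    v = np.ones_like(nu)
    for _ in range(iters):
        u = mu / (K @ v)          # enforce the row-marginal block
        v = nu / (K.T @ u)        # enforce the column-marginal block
    return u[:, None] * K * v[None, :]
```

Each update is a mirror-descent / Bregman-projection step onto one constraint block; the open convergence questions mentioned above concern such block iterations when the two marginal blocks are replaced by more general (e.g., moment) constraints.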
DFG Programme: Priority Programmes
 
 
