A recent research trend is the optimization of Fourier sampling schemes for specific datasets of signals, such as the fastMRI database. Many works consider subset selection methods, which restrict the sample locations to a grid, while a few recent papers have begun exploring the optimization of off-the-grid sampling schemes using continuous approaches, with applications to Magnetic Resonance Imaging. Despite promising results in terms of image reconstruction quality, the optimization routines seem to suffer from a strong dependence on the initialization.
In a recent work, we explain why choosing optimal non-Cartesian Fourier sampling patterns is a difficult nonconvex problem by bringing two optimization issues to light. The first is the existence of a combinatorial number of spurious minimizers for a generic class of signals. Spectral theory applied to Vandermonde matrices allows us to show that the cost function is highly oscillatory. This results in a number of minimizers larger than $\binom{M}{K} M!$, where $M$ is the number of measurements and $K$ scales as the number of maximizers of the modulus of the signal's Fourier transform. The second issue is a vanishing gradient effect at high frequencies. In practice, this makes optimization routines very slow or traps them in local minimizers: the final sampling schemes remain close to the initialization.
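Both effects can be reproduced on a toy problem. The following self-contained Python/NumPy sketch uses a rectangular test signal and a simple energy-based surrogate cost $F(\xi) = -\sum_m |\hat{x}(\xi_m)|^2$; the signal, the cost, and the frequency range are illustrative assumptions of ours, not the cost function studied in the paper. Restricted to a single sample location, $F$ exhibits one local minimum per sidelobe of $|\hat{x}|^2$, and its gradient is orders of magnitude smaller at high frequencies.

```python
import numpy as np

n = 128
t = np.arange(n)
x = (np.abs(t - n / 2) < 8).astype(float)       # rectangular pulse: |x_hat| is a
                                                # Dirichlet-like kernel with sidelobes

def xhat(xi):
    """Non-uniform DTFT of x at (possibly off-grid) frequencies xi."""
    return np.exp(-2j * np.pi * np.outer(np.atleast_1d(xi), t) / n) @ x

def F(xi):
    """Toy cost: negative Fourier energy captured by the sample locations."""
    return -np.sum(np.abs(xhat(xi)) ** 2)

def gradF(xi):
    """Analytic gradient of F with respect to each sample location."""
    xi = np.atleast_1d(xi)
    E = np.exp(-2j * np.pi * np.outer(xi, t) / n)   # (M, n) Fourier matrix
    v = E @ x                                       # x_hat(xi_m)
    dv = E @ (-2j * np.pi * t / n * x)              # d x_hat / d xi_m
    return -2.0 * np.real(np.conj(v) * dv)

# (1) Oscillatory landscape: along one coordinate, F has a local minimum per
#     sidelobe; with M coordinates the count grows combinatorially.
grid = np.linspace(0.0, n / 2, 4000)
vals = np.array([F(f) for f in grid])
is_min = (vals[1:-1] < vals[:-2]) & (vals[1:-1] < vals[2:])
print("local minima along one coordinate:", int(is_min.sum()))

# (2) Vanishing gradient: the gradient is far smaller at high frequencies,
#     so gradient descent barely moves those sample locations.
print("|grad| at xi = 1 :", float(np.abs(gradF(1.0))[0]))
print("|grad| at xi = 50:", float(np.abs(gradF(50.0))[0]))
```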
We show how using large datasets can mitigate the first effect, and we experimentally illustrate the benefits of stochastic gradient algorithms with a variable metric. We also suggest globalization approaches to attack this challenging problem.
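A minimal sketch of such a stochastic step, reusing the toy cost above, is given below; the RMSProp-style diagonal preconditioner is one plausible choice of variable metric, not necessarily the one used in the paper. Drawing a fresh signal at each iteration averages out signal-specific spurious minimizers, while the metric rescales each coordinate's step to compensate for the vanishing gradient at high frequencies.

```python
import numpy as np

n, M = 128, 16
rng = np.random.default_rng(0)
t = np.arange(n)
signals = [rng.standard_normal(n) for _ in range(200)]   # toy "dataset"

def grad_sample(xi, x):
    """Gradient of the toy per-signal cost -sum_m |x_hat(xi_m)|^2 (see above)."""
    E = np.exp(-2j * np.pi * np.outer(xi, t) / n)
    v, dv = E @ x, E @ (-2j * np.pi * t / n * x)
    return -2.0 * np.real(np.conj(v) * dv)

xi = rng.uniform(0, n / 2, size=M)          # random initial sampling scheme
v2 = np.zeros(M)                            # running squared-gradient estimate
lr, beta, eps = 1e-2, 0.9, 1e-8
for _ in range(2000):
    x = signals[rng.integers(len(signals))] # one signal per stochastic step
    g = grad_sample(xi, x)
    v2 = beta * v2 + (1 - beta) * g ** 2
    xi -= lr * g / (np.sqrt(v2) + eps)      # diagonal metric rescales each step,
                                            # counteracting vanishing gradients
```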