Abstract: Incorporating group symmetries via equivariance into neural networks has emerged as a robust approach for mitigating the efficiency and data demands of modern deep learning. While most existing approaches, such as group convolutions and averaging-based methods, focus on compact, finite, or low-dimensional groups with linear actions, this work explores how equivariance can be extended to infinite-dimensional groups. We propose a strategy for inducing diffeomorphism equivariance in pre-trained neural networks via energy-based canonicalisation. Formulating equivariance as an optimisation problem gives us access to the rich toolbox of established differentiable image registration methods. Empirical results on segmentation and classification tasks confirm that our approach achieves approximate equivariance and generalises to unseen transformations without relying on extensive data augmentation or retraining.
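The canonicalisation idea above can be illustrated on a toy group. The sketch below is not the authors' method: it uses the cyclic shift group on 1D signals and a hypothetical energy (negative value at the first index), but it shows the general recipe — minimise an energy over group elements, map the input to its canonical representative, and feed that to a frozen predictor, which thereby becomes invariant; equivariance follows by transporting the output back with the inverse group element.

```python
import numpy as np

def canonicalise(x):
    """Energy-based canonicalisation over cyclic shifts.

    We pick the shift minimising a (hypothetical) energy,
    here -x[0] after shifting, so the minimiser rolls the
    signal's maximum to the front. Any frozen predictor
    applied to the canonical form is then shift-invariant.
    """
    shifts = [np.roll(x, -k) for k in range(len(x))]
    energies = [-s[0] for s in shifts]
    return shifts[int(np.argmin(energies))]

def frozen_net(x):
    # Stand-in for a pre-trained network (not itself invariant).
    return float(np.dot(x, np.arange(len(x))))

x = np.array([0.1, 0.9, 0.3, 0.2])
# The composed model frozen_net(canonicalise(.)) gives the same
# output for every cyclic shift of x.
outputs = {round(frozen_net(canonicalise(np.roll(x, k))), 8)
           for k in range(len(x))}
assert len(outputs) == 1
```

In the paper's setting the group is the (infinite-dimensional) diffeomorphism group and the energy is a registration objective minimised by differentiable image registration, but the invariance-by-canonicalisation structure is the same.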
Abstract: The stationary velocity field (SVF) approach makes it possible to build parametrizations of invertible deformation fields, which is often a desirable property in image registration. Its expressiveness is particularly attractive when used as a block following a machine-learning-inspired network. However, it can struggle with large deformations. We extend the SVF approach to matrix groups, in particular $\SE(3)$. This moves Euclidean transformations into the low-frequency part, towards which network architectures are often naturally biased, so that larger motions can be recovered more easily. This requires an extension of the flow equation, for which we provide sufficient conditions for existence. We further prove a decomposition condition that allows us to apply a scaling-and-squaring approach for efficient numerical integration of the flow equation. We numerically validate the approach on inter-patient registration of 3D MRI images of the human brain.
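For readers unfamiliar with scaling and squaring, the sketch below shows the idea in its simplest matrix-group form, using a 2D rotation as an illustrative example (this is a standard textbook construction, not the paper's extended flow integration): since $\exp(A) = \bigl(\exp(A/2^n)\bigr)^{2^n}$, one approximates the exponential of the small scaled matrix to first order and then recovers the full exponential by $n$ repeated squarings.

```python
import numpy as np

def exp_scaling_squaring(A, n=20):
    """Approximate the matrix exponential by scaling and squaring.

    exp(A) = (exp(A / 2^n))^(2^n); for small A / 2^n the
    first-order approximation I + A / 2^n suffices, and n
    squarings recover the full exponential.
    """
    B = np.eye(len(A)) + A / (2.0 ** n)
    for _ in range(n):
        B = B @ B
    return B

# Illustrative check: exponentiating an so(2) generator gives a rotation.
theta = 0.5
A = np.array([[0.0, -theta],
              [theta, 0.0]])
exact = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
assert np.allclose(exp_scaling_squaring(A), exact, atol=1e-5)
```

In the SVF setting the same principle is applied to velocity fields, with matrix squaring replaced by composition of small deformations; the paper's decomposition condition is what justifies this recursion for the extended $\SE(3)$-valued flow.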