
Marc Lambert

DGA, SIERRA

Variational Gaussian approximation of the Kushner optimal filter

Oct 03, 2023
Marc Lambert, Silvère Bonnabel, Francis Bach

In estimation theory, the Kushner equation describes the evolution of the probability density of the state of a dynamical system given continuous-time observations. Building upon our recent work, we propose a new way to approximate the solution of the Kushner equation through tractable variational Gaussian approximations of two proximal losses associated with the propagation and the Bayesian update of the probability density. The first proximal loss is based on the Wasserstein metric; the second is based on the Fisher metric. The solution to the latter is given by implicit updates on the mean and covariance that we proposed earlier. These two variational updates can be fused and shown to satisfy a set of stochastic differential equations on the Gaussian's mean and covariance matrix. This Gaussian flow is consistent with the Kalman-Bucy and Riccati flows in the linear case and generalizes them in the nonlinear one.

* Lecture Notes in Computer Science, 2023 
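The linear case the abstract refers to can be illustrated with a minimal sketch (a toy scalar model under simplifying assumptions, not the paper's implementation): for a scalar linear-Gaussian system, the Kalman-Bucy filter variance obeys the Riccati flow, whose steady state can be checked against the closed-form root.

```python
import numpy as np

# Scalar linear-Gaussian model (illustrative assumptions, not from the paper):
#   dx = a x dt + sqrt(q) dw,   dy = c x dt + sqrt(r) dv
# The Kalman-Bucy filter propagates a Gaussian N(m, p) with
#   dm = a m dt + (p c / r) (dy - c m dt)    (mean SDE)
#   dp/dt = 2 a p + q - c^2 p^2 / r          (Riccati flow)

def riccati_flow(a, c, q, r, p0, dt=1e-3, steps=200_000):
    """Euler-integrate the Riccati flow for the filter variance."""
    p = p0
    for _ in range(steps):
        p += dt * (2 * a * p + q - (c * p) ** 2 / r)
    return p

a, c, q, r = -0.5, 1.0, 1.0, 1.0
p_inf = riccati_flow(a, c, q, r, p0=0.1)

# Closed-form steady state: positive root of (c^2 / r) p^2 - 2 a p - q = 0.
p_star = (a * r + np.sqrt(a**2 * r**2 + q * r * c**2)) / c**2
```

The paper's Gaussian flow reduces to exactly this kind of mean/covariance evolution in the linear setting, and extends it to nonlinear dynamics.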

Multiple description video coding for real-time applications using HEVC

Mar 10, 2023
Trung Hieu Le, Marc Antonini, Marc Lambert, Karima Alioua

Remote-controlled vehicles require the transmission of large amounts of data, and video is one of the most important sources for the driver. To ensure reliable video transmission, the encoded video stream is typically transmitted simultaneously over multiple channels. However, this redundancy incurs a high transmission cost, and the wireless channel's random bit losses make it unreliable. To address this issue, more efficient video encoding methods are needed that make the video stream robust to noise. In this paper, we propose a low-complexity, low-latency 2-channel Multiple Description Coding (MDC) solution with an adaptive Instantaneous Decoder Refresh (IDR) frame period, compatible with the HEVC standard. The proposed method shows better resistance to high packet loss rates at lower complexity.

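The core MDC idea in the abstract, two descriptions with graceful degradation when one channel fails, can be sketched with a toy frame-splitting scheme (purely illustrative; the paper's method is an HEVC-compatible codec with an adaptive IDR period, not this):

```python
# Toy 2-channel Multiple Description Coding sketch (illustrative only):
# send even-indexed frames on channel 0 and odd-indexed frames on channel 1.
# With both channels the sequence is recovered exactly; if one channel is
# lost, the decoder conceals missing frames by repeating the last received one.

def encode(frames):
    """Split a frame sequence into two descriptions."""
    return frames[0::2], frames[1::2]

def decode(desc0, desc1, n_frames):
    """Reassemble frames; a lost description (None) is concealed."""
    out = [None] * n_frames
    if desc0 is not None:
        out[0::2] = desc0
    if desc1 is not None:
        out[1::2] = desc1
    last = out[0] if out[0] is not None else out[1]
    for i, f in enumerate(out):  # repeat-last-frame error concealment
        if f is None:
            out[i] = last
        else:
            last = f
    return out

frames = list(range(8))
d0, d1 = encode(frames)
both = decode(d0, d1, 8)      # both channels received: lossless
lossy = decode(d0, None, 8)   # channel 1 lost: degraded but decodable
```

The design choice MDC makes is exactly this trade-off: each description is independently decodable, so a single channel loss degrades quality instead of interrupting the stream.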
Variational inference via Wasserstein gradient flows

May 31, 2022
Marc Lambert, Sinho Chewi, Francis Bach, Silvère Bonnabel, Philippe Rigollet

Along with Markov chain Monte Carlo (MCMC) methods, variational inference (VI) has emerged as a central computational approach to large-scale Bayesian inference. Rather than sampling from the true posterior $\pi$, VI aims at producing a simple but effective approximation $\hat \pi$ to $\pi$ for which summary statistics are easy to compute. However, unlike the well-studied MCMC methodology, VI is still poorly understood and dominated by heuristics. In this work, we propose principled methods for VI, in which $\hat \pi$ is taken to be a Gaussian or a mixture of Gaussians, which rest upon the theory of gradient flows on the Bures-Wasserstein space of Gaussian measures. Akin to MCMC, our approach comes with strong theoretical guarantees when $\pi$ is log-concave.

* 52 pages, 15 figures 
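A minimal sketch of a Bures-Wasserstein gradient flow for Gaussian VI, specialized to a Gaussian target so the required expectations are available in closed form (an illustration of the idea under these assumptions, not the paper's algorithm; for a general log-concave target the expectations would be estimated by sampling or quadrature):

```python
import numpy as np

# For a Gaussian target pi = N(mu, S), the potential is
#   V(x) = (1/2)(x - mu)^T S^{-1} (x - mu),  so grad V(x) = S^{-1}(x - mu).
# The flow on the mean m and covariance Sigma of the Gaussian approximation,
#   dm/dt     = -E[grad V(X)]               = -S^{-1}(m - mu)
#   dSigma/dt = 2I - E[grad V (X - m)^T]
#                  - E[(X - m) grad V^T]    = 2I - S^{-1} Sigma - Sigma S^{-1},
# has fixed point (mu, S): the approximation recovers the target exactly.

def bw_flow(mu, S, m0, Sigma0, step=0.05, iters=2000):
    """Forward-Euler discretization of the Gaussian flow (illustrative)."""
    S_inv = np.linalg.inv(S)
    I = np.eye(len(mu))
    m, Sigma = m0.copy(), Sigma0.copy()
    for _ in range(iters):
        m = m - step * S_inv @ (m - mu)
        Sigma = Sigma + step * (2 * I - S_inv @ Sigma - Sigma @ S_inv)
    return m, Sigma

mu = np.array([1.0, -2.0])
S = np.array([[2.0, 0.5], [0.5, 1.0]])
m, Sigma = bw_flow(mu, S, m0=np.zeros(2), Sigma0=np.eye(2))
```

Since the target here is itself Gaussian, the flow converges to it exactly; the interesting (and harder) regime studied in the paper is a general log-concave $\pi$, where only a best Gaussian (or Gaussian-mixture) approximation exists.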