Best paper award! (highlight talk)

Our paper “From sparse recovery to plug-and-play priors, understanding trade-offs for stable recovery with generalized projected gradient descent” (A. Joundi, Y. Traonmilin, J.-F. Aujol) won one of the best paper awards (highlight talk) at the Conference on Parsimony and Learning, 2026. Congrats everyone!

3 papers accepted!

We just had the following papers accepted! Congrats to everyone, especially the PhD students (S. Houache and A. Joundi).

“From sparse recovery to plug-and-play priors, understanding trade-offs for stable recovery with generalized projected gradient descent”, A. Joundi, Y. Traonmilin, J.-F. Aujol, accepted to the Conference on Parsimony and Learning, 2026.

“A Recovery Theory for Diffusion Priors: Deterministic Analysis of the Implicit Prior Algorithm”, O. Leong, Y. Traonmilin, accepted to AISTATS 2026.

“On the impact of the parametrization of deep convolutional neural networks on post-training quantization”, S. Houache, J.-F. Aujol, Y. Traonmilin, to appear in Transactions on Machine Learning Research, 2026.

Paper accepted

Our paper “Joint structure-texture low dimensional modeling for image decomposition with a plug and play framework”, A. Guennec, J.-F. Aujol, Y. Traonmilin, has been accepted for publication in the SIAM Journal on Imaging Sciences.

New preprint

We uploaded our new preprint: “Towards optimal algorithms for the recovery of low-dimensional models with linear rates”, Y. Traonmilin, J.-F. Aujol, A. Guennec.

Abstract: We consider the problem of recovering elements of a low-dimensional model from linear measurements. From signal and image processing to inverse problems in data science, this question has been at the center of many applications. Lately, with the success of models and methods relying on deep neural networks leading to non-convex formulations, traditional convex variational approaches have shown their limits. Furthermore, the multiplication of algorithms and recovery results makes identifying the best methods a complex task. In this article, we study recovery with a class of widely used algorithms without considering any underlying functional. This analysis leads to a class of projected gradient descent algorithms that recover a given low-dimensional model with linear rates. The obtained rates decouple the impact of the quality of the measurements with respect to the model from its intrinsic complexity. As a consequence, we can directly measure the performance of this class of projected gradient descents through a restricted Lipschitz constant of the projection. By optimizing this constant, we define optimal algorithms. Our general approach provides an optimality result in the case of sparse recovery. Moreover, we uncover underlying linear rates of convergence for some “plug and play” imaging methods relying on deep priors by interpreting our results in this context, thus linking low-dimensional recovery and recovery with deep priors under a unified theory, validated by experiments on synthetic and real data.
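To make the setting concrete, here is a minimal Python sketch of the simplest instance of this class of algorithms: sparsity as the low-dimensional model and hard thresholding as the projection, i.e. classical iterative hard thresholding. The dimensions, step size, and rate estimate below are illustrative assumptions, not the optimized algorithms studied in the paper; the printed empirical contraction factor plays the role of the linear rate discussed above.

```
import numpy as np

rng = np.random.default_rng(0)

# Toy instance: recover a k-sparse x0 from y = A x0
# (sparsity is the low-dimensional model).
n, m, k = 200, 80, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian measurement operator
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x0

def hard_threshold(x, k):
    """Projection onto the set of k-sparse vectors: keep the k largest magnitudes."""
    z = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]
    z[keep] = x[keep]
    return z

# Projected gradient descent: x <- P_k(x - tau * A^T (A x - y)).
x = np.zeros(n)
tau = 1.0  # step size; A^T A acts close to the identity on sparse vectors here
errors = []
for _ in range(50):
    x = hard_threshold(x - tau * A.T @ (A @ x - y), k)
    errors.append(np.linalg.norm(x - x0))

# Linear rate: the error should contract by a roughly constant factor per step.
ratios = [b / a for a, b in zip(errors, errors[1:]) if a > 1e-12]
print(f"final error: {errors[-1]:.2e}, empirical rate: {np.median(ratios):.2f}")
```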

Paper accepted

Our paper “Sketched over-parametrized projected gradient descent for sparse spike estimation” (https://hal.science/hal-04584951v1) has been accepted to IEEE Signal Processing Letters.

This is the last work of PJ Bénard's PhD; his defense is next week! A nice application of compressed sensing in spaces of measures.
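For readers curious about what estimation in a space of measures looks like, here is a toy Python sketch: gradient descent on spike amplitudes and positions from random Fourier measurements, with more spikes than the ground truth (over-parametrization) and a trivial clipping projection onto the domain. All dimensions, frequencies, and step sizes are hypothetical choices for illustration; the paper's actual sketching operator and projection step are more involved.

```
import numpy as np

rng = np.random.default_rng(1)

# Ground truth: a sparse measure mu0 = sum_i a_i * delta_{t_i} on [0, 1],
# observed through m random Fourier measurements (a simple "sketch").
k_true, k_over, m = 3, 6, 50  # over-parametrization: 6 spikes to recover 3
t0 = np.array([0.2, 0.5, 0.8])
a0 = np.array([1.0, 1.2, 0.8])
w = rng.uniform(-10, 10, m)   # random frequencies (hypothetical choice)

def forward(a, t):
    """Fourier measurements of the spike train with amplitudes a, positions t."""
    return np.exp(-2j * np.pi * np.outer(w, t)) @ a

y = forward(a0, t0)

# Gradient descent on all spike parameters, with a clipping projection on positions.
a = np.full(k_over, 0.3)
t = rng.uniform(0, 1, k_over)
step_a, step_t = 5e-2, 1e-4
for _ in range(5000):
    E = np.exp(-2j * np.pi * np.outer(w, t))  # m x k_over forward matrix
    r = (E @ a - y) / m                       # scaled residual
    grad_a = np.real(E.conj().T @ r)          # gradient of the loss in amplitudes
    grad_t = a * np.real(((-2j * np.pi * w[:, None]) * E).conj().T @ r)
    a -= step_a * grad_a
    t = np.clip(t - step_t * grad_t, 0.0, 1.0)  # projection onto the domain

print("true positions:     ", np.round(t0, 3))
print("significant spikes: ", np.round(np.sort(t[np.abs(a) > 0.2]), 3))
```

In this over-parametrized regime, the extra spikes typically either merge with true positions or see their amplitudes shrink, which is why the final print keeps only spikes above a small amplitude threshold.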