Abstract: We propose a proximal variable smoothing algorithm for a nonsmooth optimization problem whose cost function is the sum of three functions, including a weakly convex composite function. The proposed algorithm has a single-loop structure inspired by proximal gradient-type methods. More precisely, each iteration consists of two steps: (i) a gradient descent step on a time-varying smoothed surrogate function, designed in part with the Moreau envelope of the weakly convex function; and (ii) an application of the proximity operator of the remaining function not covered by the smoothed surrogate. We also present a convergence analysis of the proposed algorithm by exploiting a novel asymptotic approximation of a gradient mapping-type stationarity measure. Numerical experiments demonstrate the effectiveness of the proposed algorithm in two scenarios: (i) the max-min dispersion problem and (ii) multiple-input multiple-output (MIMO) signal detection.
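As a rough illustration of the two-step iteration described above, the following sketch assumes the cost splits as F(x) = f(x) + g(Ax) + h(x), with f smooth, g weakly convex and prox-friendly, and h prox-friendly. The helper names (grad_f, prox_g, prox_h) and the step-size/smoothing sequences gamma, mu are hypothetical placeholders, not the paper's notation; this is a sketch of the general idea, not the authors' exact method.

```python
import numpy as np

def moreau_grad(z, prox_g, mu):
    # Gradient of the Moreau envelope of g with parameter mu:
    # (z - prox_{mu*g}(z)) / mu.
    return (z - prox_g(z, mu)) / mu

def prox_variable_smoothing(x0, grad_f, A, prox_g, prox_h,
                            gamma, mu, n_iter=500):
    # Single-loop sketch for F(x) = f(x) + g(A x) + h(x):
    # (i) gradient step on f plus the Moreau-envelope smoothing of g(A .),
    # (ii) proximity operator of the remaining function h.
    x = x0.copy()
    for k in range(n_iter):
        smooth_grad = grad_f(x) + A.T @ moreau_grad(A @ x, prox_g, mu[k])
        x = prox_h(x - gamma[k] * smooth_grad, gamma[k])
    return x
```

Here mu[k] plays the role of the time-varying smoothing parameter, so the surrogate being differentiated changes from iteration to iteration.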
Abstract: For regularized least squares estimation of discrete-valued signals, we propose an LiGME regularizer, a nonconvex regularizer with designated isolated local minimizers. The proposed regularizer is designed as a Generalized Moreau Enhancement (GME) of the so-called SOAV convex regularizer, so that every candidate vector in the discrete-valued set is assigned to an isolated local minimizer of the proposed regularizer while the overall convexity of the regularized least squares model is maintained. Moreover, a global minimizer of the proposed model can be approximated iteratively by a variant of the cLiGME algorithm. To enhance the accuracy of the proposed estimation, we also propose a pair of simple modifications, called respectively iterative reweighting and generalized superiorization. Numerical experiments demonstrate the effectiveness of the proposed model and algorithms in a MIMO signal detection scenario.
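For concreteness, the following is a sketch of the type of model described above, written in notation commonly used in the SOAV/GME literature rather than necessarily the paper's exact formulation: the SOAV regularizer for a discrete alphabet {a_1, ..., a_L} with weights w_l, its generalized Moreau enhancement with a tuning matrix B, and the resulting regularized least squares model.

```latex
% Sketch (assumed notation): SOAV regularizer, its GME with matrix B,
% and the regularized least squares model for observation y = Hx + noise.
\Psi(\mathbf{x}) = \sum_{l=1}^{L} w_l \,\|\mathbf{x} - a_l \mathbf{1}\|_1,
\qquad
\Psi_B(\mathbf{x}) = \Psi(\mathbf{x})
  - \min_{\mathbf{v}} \Big[ \Psi(\mathbf{v})
      + \tfrac{1}{2}\|B(\mathbf{x}-\mathbf{v})\|_2^2 \Big],
\qquad
\underset{\mathbf{x}}{\operatorname{minimize}}\;
  \tfrac{1}{2}\|\mathbf{y} - H\mathbf{x}\|_2^2 + \lambda\, \Psi_B(\mathbf{x}).
```

In the LiGME framework, overall convexity of such a model is typically ensured by choosing B so that a condition of the form H^T H - \lambda B^T B \succeq O holds; the specific design of B and the weights is part of the proposal summarized above.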