Dual Space Preconditioning for Gradient Descent
The conditions of relative convexity and smoothness were recently introduced by Bauschke, Bolte, and Teboulle and by Lu, Freund, and Nesterov for the analysis of first-order methods optimizing a convex function. Those papers considered conditions over the primal space. We introduce a fully explicit descent scheme based on relative smoothness in the dual space, between the convex conjugate of the objective function and a designed dual reference function. For convex functions of Legendre type satisfying this dual relative smoothness, our scheme naturally remains in the interior of the domain, despite being fully explicit. We obtain linear convergence under dual relative strong convexity, with rates that are invariant under horizontal translations. We show that our scheme is not equivalent to the standard primal scheme by exhibiting a separating class of functions. Our method is a form of non-linear preconditioning of gradient descent that extends the class of convex functions on which explicit first-order methods can be well-conditioned.
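To illustrate the idea of non-linear preconditioning described above, here is a minimal sketch. It assumes the update takes the form x_{k+1} = x_k - λ ∇h*(∇f(x_k)), i.e. the gradient is passed through the gradient map of a dual reference function h* before the step; the function names `dual_preconditioned_gd`, `grad_f`, and `grad_h_star` are illustrative, not from the paper. The toy objective f(x) = x⁴/4 has gradient x³, which is not globally Lipschitz, so plain gradient descent is poorly conditioned far from the minimizer; a cube-root dual map restores linear convergence in this one-dimensional example.

```python
import math


def dual_preconditioned_gd(grad_f, grad_h_star, x0, step=0.5, iters=50):
    """Sketch of dual-space preconditioned gradient descent (assumed form):
    the gradient of f is mapped through grad_h_star, the gradient of a
    designed dual reference function, before taking an explicit step."""
    x = x0
    for _ in range(iters):
        x = x - step * grad_h_star(grad_f(x))
    return x


# Toy objective f(x) = x**4 / 4, whose gradient x**3 grows faster than
# linearly, so standard gradient descent needs vanishing step sizes.
grad_f = lambda x: x ** 3

# Illustrative dual map: the cube root undoes the cubic growth of the
# gradient, so the preconditioned iteration contracts linearly toward 0.
grad_h_star = lambda y: math.copysign(abs(y) ** (1.0 / 3.0), y)

x = dual_preconditioned_gd(grad_f, grad_h_star, x0=10.0)
```

With this choice the update reduces to x_{k+1} = (1 - λ) x_k, a linear rate independent of the starting point, which is the kind of well-conditioned behavior the abstract refers to.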