Quoc Tran Dinh
Laboratory for Information and Inference Systems (LIONS), EPFL, Switzerland
Tuesday, May 26, 2015, 15:00 - 17:00
Room 01-012, Georges-Köhler-Allee 102, 79110 Freiburg, Germany
Solving large-scale constrained convex optimization problems in a parallel and distributed manner often requires algorithms that are free of parameter tuning and come with rigorous convergence guarantees and numerical robustness. One of the most convenient ways to handle the constraints in such problems is through primal-dual frameworks. Unfortunately, current state-of-the-art primal-dual algorithms, such as (fast) dual (sub)gradient and alternating direction optimization methods, are sensitive to the choice of smoothness/penalty parameters. In this talk, I will discuss some recent developments in primal-dual methods for solving general constrained convex optimization problems that can be implemented in a parallel and distributed manner. Our approach relies on three ideas: smoothing methods, gap reduction techniques, and dual decomposition frameworks. I will focus on first-order methods and present several concrete algorithms under different structural assumptions. Numerical experiments are presented to illustrate the performance of these algorithms.
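For orientation, the following is a minimal illustrative sketch in standard notation, not taken from the talk: a linearly constrained convex problem with a separable objective, its dual function splitting into independent subproblems (dual decomposition), and a smoothed dual obtained by adding a strongly convex proximity term. All symbols (f, A, b, X, b_X, gamma) are assumptions chosen for this illustration.

\begin{align*}
  &\min_{x \in \mathcal{X}} \; f(x) \quad \text{s.t.} \quad Ax = b,
  \qquad f(x) = \sum_{i=1}^{n} f_i(x_i), \quad A = [A_1, \dots, A_n],\\
  &d(\lambda) = \min_{x \in \mathcal{X}} \big\{ f(x) + \langle \lambda, Ax - b \rangle \big\}
   = \sum_{i=1}^{n} \min_{x_i \in \mathcal{X}_i} \big\{ f_i(x_i) + \langle \lambda, A_i x_i \rangle \big\} - \langle \lambda, b \rangle,\\
  &d_{\gamma}(\lambda) = \min_{x \in \mathcal{X}} \big\{ f(x) + \langle \lambda, Ax - b \rangle + \gamma\, b_{\mathcal{X}}(x) \big\},
   \qquad \gamma > 0 \;\; \text{(smoothed dual)}.
\end{align*}

The subproblems defining d(lambda) can be solved independently (hence in parallel), and the proximity term b_X makes d_gamma smooth so that fast gradient schemes apply; how the smoothness parameter gamma is controlled without manual tuning is the kind of question the talk addresses.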