
On Constraints in First-Order Optimization: A View from Non-Smooth Dynamical Systems

We introduce a class of first-order methods for smooth constrained optimization that are based on an analogy to non-smooth dynamical systems. Two distinctive features of our approach are that (i) projections or optimizations over the entire feasible set are avoided, in stark contrast to projected gradient methods or the Frank-Wolfe method, and (ii) iterates are allowed to become infeasible, which differs from active set or feasible direction methods, where the descent motion stops as soon as a new constraint is encountered. The resulting algorithmic procedure is simple to implement even when constraints are nonlinear, and is suitable for large-scale constrained optimization problems in which the feasible set fails to have a simple structure. The key underlying idea is that constraints are expressed in terms of velocities instead of positions, which has the algorithmic consequence that optimizations over feasible sets at each iteration are replaced with optimizations over local, sparse convex approximations. The result is a simplified suite of algorithms and an expanded range of possible applications in machine learning.
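
As a rough illustration of the velocity-based idea described in the abstract, the following sketch imposes the constraint on the velocity v rather than on the position x. For a single smooth inequality g(x) >= 0, a local condition of the form grad_g(x)^T v >= -alpha * g(x) is a halfspace, so projecting the desired velocity -grad_f(x) onto it has a closed form and no projection onto the full feasible set is required. This is a minimal sketch, not the paper's exact algorithm: the objective, the constraint, the parameter alpha, and the step size are all illustrative choices.

import numpy as np

def grad_f(x):                 # gradient of f(x) = 0.5 * ||x - (2, 1)||^2
    return x - np.array([2.0, 1.0])

def g(x):                      # feasible set: unit disk, g(x) = 1 - ||x||^2 >= 0
    return 1.0 - x @ x

def grad_g(x):
    return -2.0 * x

def velocity(x, alpha=1.0):
    v = -grad_f(x)             # unconstrained descent velocity
    if g(x) <= 0.0:            # constraint active or violated: the iterate may
        n = grad_g(x)          # be infeasible, but v must restore feasibility
        slack = n @ v + alpha * g(x)
        if slack < 0.0:        # halfspace constraint binds: project v onto it
            v -= (slack / (n @ n)) * n
    return v

x = np.array([1.5, 1.5])       # infeasible starting point
for _ in range(400):
    x = x + 0.05 * velocity(x)
print(x, g(x))                 # x tends to (2, 1)/sqrt(5), on the boundary

Note how the iterate is allowed to leave the feasible set: the halfspace condition only forces the velocity to point back toward feasibility at a rate proportional to the constraint violation, rather than stopping the descent motion at the boundary.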

Author(s): Michael Muehlebach and M. I. Jordan
Journal: Journal of Machine Learning Research
Volume: 23
Year: 2022

Department(s): Learning and Dynamical Systems
BibTeX Type: Article (article)
Paper Type: Journal

Article Number: 256
State: Published
URL: https://www.jmlr.org/papers/v23/21-0798.html

BibTeX

@article{Constraints_in_First-Order_Optimization,
  title = {On Constraints in First-Order Optimization: A View from Non-Smooth Dynamical Systems},
  author = {Muehlebach, Michael and Jordan, M. I.},
  journal = {Journal of Machine Learning Research},
  volume = {23},
  year = {2022},
  number = {256},
  url = {https://www.jmlr.org/papers/v23/21-0798.html}
}