Intelligent Systems


2024


Deep Backtracking Counterfactuals for Causally Compliant Explanations

Kladny, K., von Kügelgen, J., Schölkopf, B., Muehlebach, M.

Transactions on Machine Learning Research, July 2024 (article)

arXiv link (url) [BibTex]


Balancing a 3D Inverted Pendulum using Remote Magnetic Manipulation

Zughaibi, J., Nelson, B. J., Muehlebach, M.

IEEE Robotics and Automation Letters, 2024 (article) In revision

link (url) [BibTex]


Towards a systems theory of algorithms

Dörfler, F., He, Z., Belgioioso, G., Bolognani, S., Lygeros, J., Muehlebach, M.

IEEE Control Systems Letters, 2024 (article)

link (url) [BibTex]


Gray-box nonlinear feedback optimization

He, Z., Bolognani, S., Muehlebach, M., Dörfler, F.

IEEE Transactions on Automatic Control, 2024 (article) Submitted

link (url) [BibTex]


Accelerated First-Order Optimization under Nonlinear Constraints

Muehlebach, M., Jordan, M. I.

Mathematical Programming, 2024 (article) Submitted

link (url) [BibTex]


Primal Methods for Variational Inequality Problems with Functional Constraints

Zhang, L., He, N., Muehlebach, M.

Mathematical Programming, 2024 (article) Submitted

link (url) [BibTex]

2022


On Constraints in First-Order Optimization: A View from Non-Smooth Dynamical Systems

Muehlebach, M., Jordan, M. I.

Journal of Machine Learning Research, 23, 2022 (article)

Abstract
We introduce a class of first-order methods for smooth constrained optimization that are based on an analogy to non-smooth dynamical systems. Two distinctive features of our approach are that (i) projections or optimizations over the entire feasible set are avoided, in stark contrast to projected gradient methods or the Frank-Wolfe method, and (ii) iterates are allowed to become infeasible, which differs from active set or feasible direction methods, where the descent motion stops as soon as a new constraint is encountered. The resulting algorithmic procedure is simple to implement even when constraints are nonlinear, and is suitable for large-scale constrained optimization problems in which the feasible set fails to have a simple structure. The key underlying idea is that constraints are expressed in terms of velocities instead of positions, which has the algorithmic consequence that optimizations over feasible sets at each iteration are replaced with optimizations over local, sparse convex approximations. The result is a simplified suite of algorithms and an expanded range of possible applications in machine learning.

link (url) [BibTex]
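
The key idea stated in the abstract, expressing constraints in terms of velocities rather than positions, can be illustrated with a short sketch. The Python code below is not the authors' implementation; it is a minimal illustration, assuming a feasible set of the form {x : g_i(x) >= 0}, of a step that projects the negative gradient onto a local polyhedral velocity set built only from the active or violated constraints, solved here as a small quadratic program via SciPy. The names velocity_step, alpha, and eps are choices made for this sketch.

import numpy as np
from scipy.optimize import minimize


def velocity_step(x, grad_f, g, grad_g, alpha=1.0, eps=1e-6):
    """Project -grad_f(x) onto {v : grad_g_i(x)^T v >= -alpha * g_i(x)},
    using only the constraints with g_i(x) <= eps (active or violated)."""
    d = -grad_f(x)
    gx, Gx = g(x), grad_g(x)
    active = gx <= eps
    if not active.any():
        return d  # interior point: plain negative-gradient direction
    A, b = Gx[active], -alpha * gx[active]
    # Small local QP: min ||v - d||^2  s.t.  A v >= b.
    # Infeasible iterates are allowed; alpha * g_i(x) < 0 then pulls
    # the velocity back toward the feasible set.
    res = minimize(lambda v: 0.5 * np.dot(v - d, v - d), d,
                   jac=lambda v: v - d,
                   constraints={"type": "ineq", "fun": lambda v: A @ v - b})
    return res.x


# Toy problem: minimize ||x - (2, 2)||^2 subject to x1 + x2 <= 1,
# written as g(x) = 1 - x1 - x2 >= 0.
f_grad = lambda x: 2.0 * (x - np.array([2.0, 2.0]))
g = lambda x: np.array([1.0 - x[0] - x[1]])
g_grad = lambda x: np.array([[-1.0, -1.0]])

x = np.zeros(2)
for _ in range(200):
    x = x + 0.05 * velocity_step(x, f_grad, g, g_grad)
print(x)  # approaches (0.5, 0.5), the constrained optimum

In this toy run the iterates briefly overshoot the halfspace x1 + x2 <= 1 and are then steered back, which mirrors the abstract's point (ii): descent does not stop when a constraint is encountered, and no projection onto the full feasible set is ever computed.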