A support function based algorithm for optimization with eigenvalue constraints

Journal Article, Department of Mathematics, 2017
ISSN: 1052-6234; eISSN: 1095-7189
DOI: 10.1137/140966551 (http://dx.doi.org/10.1137/140966551)
Scopus ID: 2-s2.0-85021054871
Handle: https://hdl.handle.net/20.500.14288/17601
Subjects: Mathematics; Applied mathematics
Date: 2024-11-10

Abstract: Optimization of convex functions subject to eigenvalue constraints is intriguing because of the peculiar analytical properties of eigenvalue functions, and is of practical interest because of a wide range of applications in fields such as structural design and control theory. Here we focus on the optimization of a linear objective subject to a constraint on the smallest eigenvalue of an analytic and Hermitian matrix-valued function. We propose a numerical approach based on quadratic support functions that overestimate the smallest eigenvalue function globally. The quadratic support functions are derived by employing variational properties of the smallest eigenvalue function over a set of Hermitian matrices. We establish the local convergence of the algorithm under mild assumptions and deduce a precise rate-of-convergence result by viewing the algorithm as a fixed-point iteration. The convergence analysis reveals that the algorithm is immune to the nonsmooth nature of the smallest eigenvalue function. We illustrate the practical applicability of the algorithm on pseudospectral functions.
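
To make the overestimation property concrete, here is a minimal numerical sketch (an illustration under stated assumptions, not the paper's implementation). It takes a one-parameter analytic Hermitian family A(w) = A0 + sin(w) A1, for which gamma = ||A1||_2 bounds the second derivative of w -> v* A(w) v for every unit vector v, builds the quadratic support function about a point w_k from an eigenvector there, and checks numerically that it lies above lambda_min everywhere on a grid. All names (lam_min, q_k, and so on) are illustrative.

    import numpy as np

    # Minimal sketch (assumption: a one-parameter analytic Hermitian family
    # A(w) = A0 + sin(w)*A1; not the paper's code). The variational
    # characterization lambda_min(A(w)) = min_{||v||=1} v* A(w) v gives, for
    # the unit eigenvector v_k at w_k, lambda_min(A(w)) <= v_k* A(w) v_k, and
    # Taylor expansion with the curvature bound gamma turns the right-hand
    # side into a quadratic that overestimates lambda_min globally.

    rng = np.random.default_rng(0)
    n = 8

    def hermitian(M):
        return (M + M.conj().T) / 2

    A0 = hermitian(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    A1 = hermitian(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    # Shift A0 so lambda_min(A(0)) = 1 > 0; keeps the toy problem below feasible.
    A0 = A0 + (1.0 - np.linalg.eigvalsh(A0)[0]) * np.eye(n)

    def A(w):
        return A0 + np.sin(w) * A1

    def lam_min(w):
        """Smallest eigenvalue of A(w) and an associated unit eigenvector."""
        vals, vecs = np.linalg.eigh(A(w))
        return vals[0], vecs[:, 0]

    w_k = 0.4
    lam_k, v_k = lam_min(w_k)
    # Derivative of w -> v_k* A(w) v_k; equals d/dw lambda_min at w_k when
    # the smallest eigenvalue is simple there.
    g_k = np.cos(w_k) * np.real(v_k.conj() @ A1 @ v_k)
    gamma = np.linalg.norm(A1, 2)   # here |d^2/dw^2 v* A(w) v| <= ||A1||_2

    def q_k(w):
        return lam_k + g_k * (w - w_k) + 0.5 * gamma * (w - w_k) ** 2

    # Verify the global overestimation property on a grid.
    grid = np.linspace(-4.0, 4.0, 801)
    gap = np.array([q_k(w) - lam_min(w)[0] for w in grid])
    assert gap.min() >= -1e-10, "q_k must lie above lambda_min everywhere"
    print("smallest gap q_k - lambda_min on the grid:", gap.min())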
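
Because each support function q lies above lambda_min pointwise, the set {w : q(w) >= 0} contains the true feasible set {w : lambda_min(A(w)) >= 0}, so minimizing the linear objective subject to the accumulated overestimators yields a nondecreasing sequence of lower bounds on the constrained optimum. The toy loop below continues the sketch above under the same assumptions; it is our assumed cutting-surface reading of the approach, with the subproblems solved crudely on a grid, and is again not the paper's implementation.

    # Toy iteration continuing the sketch above (assumed illustration):
    # minimize c*w subject to lambda_min(A(w)) >= 0 over the box [-4, 4],
    # with the eigenvalue constraint replaced by the accumulated quadratic
    # overestimators; each subproblem is solved by brute force on a grid.

    def support_at(w_c):
        """Quadratic support function of lambda_min about w_c."""
        lam_c, v_c = lam_min(w_c)
        g_c = np.cos(w_c) * np.real(v_c.conj() @ A1 @ v_c)
        return lambda w: lam_c + g_c * (w - w_c) + 0.5 * gamma * (w - w_c) ** 2

    c = 1.0                                   # linear objective: minimize c*w
    box = np.linspace(-4.0, 4.0, 4001)
    w_cur = box[np.argmin(c * box)]           # minimizer ignoring the constraint
    supports = [support_at(w_cur)]
    for _ in range(40):
        # Relaxed feasible set: all accumulated support functions nonnegative.
        feasible = np.all([q(box) >= 0.0 for q in supports], axis=0)
        w_next = box[feasible][np.argmin(c * box[feasible])]
        if abs(w_next - w_cur) < 1e-6:
            break                             # iterate is (near-)feasible: stop
        w_cur = w_next
        supports.append(support_at(w_cur))    # cuts an infeasible iterate out
    print("candidate minimizer:", w_cur, "lambda_min there:", lam_min(w_cur)[0])

An infeasible iterate w_cur satisfies q(w_cur) = lambda_min(A(w_cur)) < 0 for the support function built at w_cur, so it is excluded from the next relaxation; when the iterate stops moving, the same identity shows it is feasible up to the grid resolution.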