Department of Mathematics
Date: 2024-11-09
Publication year: 2020
ISSN: 0272-4979
e-ISSN: 1464-3642
DOI: 10.1093/imanum/drz041 (https://doi.org/10.1093/imanum/drz041)
Scopus ID: 2-s2.0-85102125011
Handle: https://hdl.handle.net/20.500.14288/1692
Subject: Mathematics
Type: Journal Article
Title: Nonsmooth algorithms for minimizing the largest eigenvalue with applications to inner numerical radius

Abstract: Nonsmoothness at optimal points is a common phenomenon in many eigenvalue optimization problems. We consider two recent algorithms to minimize the largest eigenvalue of a Hermitian matrix that depends on one parameter, both proven to be globally convergent in the presence of nonsmoothness. One of these algorithms models the eigenvalue function with a piecewise quadratic function and is effective in dealing with nonconvex problems. The other algorithm projects the Hermitian matrix onto subspaces formed of eigenvectors and is effective in dealing with large-scale problems; we generalize the latter slightly to cope with nonsmoothness. For both algorithms we analyze the rate of convergence in the nonsmooth setting, when the largest eigenvalue is multiple at the minimizer and zero lies strictly in the interior of the generalized Clarke derivative, and prove that both algorithms converge rapidly. The algorithms are applied to, and the deduced results are illustrated on, the computation of the inner numerical radius, the modulus of the point on the boundary of the field of values closest to the origin, which is significant, for instance, for the numerical solution of a symmetric definite generalized eigenvalue problem and the iterative solution of a saddle point linear system.
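To make the inner numerical radius computation concrete, below is a minimal brute-force sketch, not the paper's two algorithms. It relies on the standard fact that the largest eigenvalue of the rotated Hermitian part H(theta) = (e^{i theta} A + e^{-i theta} A*)/2 is the support function of the field of values in direction theta; assuming 0 lies inside the field of values of A, the distance from the origin to the boundary equals the minimum of this eigenvalue over theta in [0, 2*pi). The matrix A, the grid size, and the function names (lambda_max, inner_numerical_radius) are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lambda_max(A, theta):
    """Largest eigenvalue of the rotated Hermitian part H(theta) = (e^{i theta} A + e^{-i theta} A^*) / 2."""
    H = (np.exp(1j * theta) * A + np.exp(-1j * theta) * A.conj().T) / 2
    return np.linalg.eigvalsh(H)[-1]  # eigvalsh returns eigenvalues in ascending order

def inner_numerical_radius(A, n_grid=360):
    """Brute-force estimate of min_theta lambda_max(H(theta)).

    Assumes 0 lies in the field of values of A, in which case this minimum
    equals the distance from the origin to the boundary of the field of values
    (the inner numerical radius); otherwise the minimum comes out negative.
    """
    thetas = np.linspace(0.0, 2 * np.pi, n_grid, endpoint=False)
    values = np.array([lambda_max(A, t) for t in thetas])
    k = values.argmin()
    # Local polish around the best grid point; the eigenvalue function may be
    # nonsmooth at the minimizer, so this refinement is only heuristic.
    res = minimize_scalar(lambda t: lambda_max(A, t),
                          bounds=(thetas[k] - 2 * np.pi / n_grid,
                                  thetas[k] + 2 * np.pi / n_grid),
                          method="bounded")
    return min(values[k], res.fun)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))
    print("estimated inner numerical radius:", inner_numerical_radius(A))
```

The grid-plus-polish approach above is only a baseline; the algorithms studied in the article replace it with globally convergent schemes (a piecewise quadratic model of the eigenvalue function, and projection onto eigenvector subspaces), which is what makes the nonsmooth and large-scale cases tractable.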